Hello everybody!
This is my first mail and I have a big problem!
I can't use MediaWiki to restore a Wikipedia dump into my
MySQL database.
So I use mwdumper, but it doesn't work very well. It inserts the first
280,000 pages and none of the others. During the first 280,000 pages
the process runs slowly, which, in my opinion, is when it is actually
working. Then it starts going very fast and finishes after a short
time...
The problem is that sometimes it inserts 280,000 pages, other times
49,000, and every time I retry the restoration this number
changes.
I think the problem is with MySQL, with the management of the queues
and the input buffer, but I don't know what I could do.
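For what it's worth, this is roughly how I run the import, following the standard mwdumper pipeline from the MediaWiki documentation (the file names, database name, and user are placeholders for my actual setup):

```shell
# Stream the XML dump through mwdumper as SQL INSERT statements
# and pipe them straight into mysql:
java -jar mwdumper.jar --format=sql:1.5 pages_articles.xml.bz2 \
  | mysql -u wikiuser -p wikidb

# Since mwdumper batches many pages into one large INSERT, I also
# wondered whether MySQL's packet limit matters; it can be checked
# and raised like this:
mysql -u root -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"
mysql -u root -p -e "SET GLOBAL max_allowed_packet = 128*1024*1024;"
```

I'm not sure the packet limit is really the cause, it's just my guess about the input buffer.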
If someone can help me, please reply to this mail. Thanks a lot.
Mario Scriminaci