Waleed Meligy wrote:
Thanks for your help. One more question:
We have started importing the Wikipedia dump using importDump.php; the issue
is that we are using PuTTY and an SSH session to do it. We have kept a PC and
the session open for 30+ hours now, and we have currently imported 3,637,700
pages.
a) You can make a small program like

#include <stdlib.h>
#include <unistd.h>

int main() {
    if (fork()) return 0;               /* parent exits immediately */
    setsid();                           /* detach the child from the terminal session */
    system("php importDump.php ....."); /* the command you use */
    return 0;
}
Compile it with gcc and run it; the command will keep running even if you
close the SSH client (or the network drops the connection).
(There must be an easier way; any takers among the Unix gurus on the list?)
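For what it's worth, the usual easier way is nohup (or running the import
inside a screen session), which avoids the helper program entirely. A minimal
sketch, assuming a standard shell and substituting your own dump filename and
paths:

# the job survives logout; output and errors go to import.log
nohup php importDump.php < pages-articles.xml > import.log 2>&1 &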
How long do we have to wait? And I am 80% sure there has to be a quicker
way. Is there any?
The English Wikipedia takes a long time to import. And importDump.php is
always much slower because it not only puts the data into the database but
also parses each page.
You should use mwdumper instead:
http://meta.wikimedia.org/wiki/Data_dumps#mwdumper
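In case it helps, a typical way to run it (a sketch assuming the standard
pages-articles dump, a MySQL backend, and placeholder user/database names;
check the page above for the current options) is to pipe its SQL output
straight into MySQL:

# convert the XML dump to SQL inserts and load it in one pass
java -jar mwdumper.jar --format=sql:1.5 pages-articles.xml.bz2 | mysql -u wikiuser -p wikidb

Because this skips the parsing step it is much faster than importDump.php,
though you will probably still want to rebuild the link tables afterwards
(maintenance/rebuildall.php).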