Hi,
I want to download the English Wikipedia database and process the data further. The SQL dump is roughly 30 GB in compressed form. I have already set up MySQL and imported the smaller dumps for other languages. What is the best and quickest approach to import the English database into MySQL? My understanding is that I will have to download all the compressed files from http://download.wikimedia.org/wikipedia/en/, cat them together, and gunzip them so I can import them into MySQL. Can I import subsets of the database without downloading all the dumps? If so, where can I find these files?
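To be concrete, this is roughly what I have in mind (the file pattern and the database name "enwiki" are just placeholders for my local setup):

  # gzip files concatenated together decompress as one stream,
  # so the pieces can be piped straight into mysql without
  # unpacking them on disk first.
  # "enwiki" is just what I named my local database.
  cat *.sql.gz | gunzip | mysql -u root -p enwiki

Is that the right approach, or is there a faster way?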
Is there any other utility to convert the SQL dumps to HTML apart from the wiki2static.pl script?
I am a new user, and any help or advice would be very useful.
Thanks, -Hemali