[Foundation-l] Wikipedia dumps downloader
emijrp
emijrp at gmail.com
Sun Jun 26 12:53:15 UTC 2011
Hi all;
Can you imagine a day when Wikipedia is added to this list?[1]
WikiTeam has developed a script[2] to download all the Wikipedia dumps (and
those of its sister projects) from dumps.wikimedia.org. It sorts the files into
folders and checks their md5sums. It currently only works on Linux (it uses wget).
You will need about 100GB to download all the 7z files.
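For illustration only, here is a minimal sketch of the same idea in plain
Python (the real wikipediadownloader.py shells out to wget, which is why it
is Linux-only): fetch a file from dumps.wikimedia.org and verify an md5sum.
The URL and destination folder below are just examples, not a full mirror
of the dumps.

import hashlib
import os
import urllib.request

def download(url, dest_dir):
    # Download url into dest_dir and return the local path.
    os.makedirs(dest_dir, exist_ok=True)
    local_path = os.path.join(dest_dir, url.rsplit('/', 1)[-1])
    urllib.request.urlretrieve(url, local_path)
    return local_path

def md5sum(path, chunk_size=1024 * 1024):
    # Compute the MD5 hex digest of a file, reading it in chunks.
    digest = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

# Example: fetch one file for one wiki and print its computed hash, which
# you would then compare against the published md5sums list.
path = download('https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-md5sums.txt',
                'enwiki/latest')
print(path, md5sum(path))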
Save our memory.
Regards,
emijrp
[1] http://en.wikipedia.org/wiki/Destruction_of_libraries
[2] http://code.google.com/p/wikiteam/source/browse/trunk/wikipediadownloader.py