Thank you, Emijrp!
What about the dump of Commons images? [for those with 10TB to spare]
SJ
On Sun, Jun 26, 2011 at 8:53 AM, emijrp <emijrp@gmail.com> wrote:
Hi all;
Can you imagine a day when Wikipedia is added to this list?[1]
WikiTeam has developed a script[2] to download all the dumps of Wikipedia (and its sister projects) from dumps.wikimedia.org. It sorts them into folders and checks their md5sums. It only works on Linux (it uses wget).
You will need about 100GB to download all the 7z files.
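In case it helps to see the idea, here is a minimal sketch of the approach (not the actual wikipediadownloader.py; the URL and md5 in the usage comment are placeholders): fetch each dump with wget into a per-wiki folder and verify it against the published md5sum.

import hashlib
import os
import subprocess

def download_and_verify(url, expected_md5, dest_dir):
    """Fetch one dump file with wget into dest_dir and check its md5sum."""
    if not os.path.isdir(dest_dir):
        os.makedirs(dest_dir)
    # -c resumes partial downloads, -P saves the file under its original name in dest_dir
    subprocess.check_call(["wget", "-c", "-P", dest_dir, url])
    filename = os.path.join(dest_dir, url.rsplit("/", 1)[-1])
    md5 = hashlib.md5()
    with open(filename, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            md5.update(chunk)
    if md5.hexdigest() != expected_md5:
        raise ValueError("md5 mismatch for %s" % filename)

# Hypothetical usage; the real script walks dumps.wikimedia.org and reads the
# md5sums file published next to each dump.
# download_and_verify(
#     "http://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2",
#     "<md5 from the dump's md5sums file>",
#     "enwiki")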
Save our memory.
Regards, emijrp
[1] http://en.wikipedia.org/wiki/Destruction_of_libraries
[2] http://code.google.com/p/wikiteam/source/browse/trunk/wikipediadownloader.py