Hello.
No, I don't mind giving a quick answer.
There is valuable advice on the download page for every language. It warns you about the decompression factor of 7z files, which can be up to 100 times the original file size.
If you got the complete meta-history dump from enwiki, you should expect a decompressed file size far beyond 600-700 GB.
I'm afraid 200 GB is not enough for the complete enwiki DB. At least, I hope this advice was useful for you.
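To make the arithmetic above concrete, here is a minimal Python sketch of the space check, assuming the ~100x worst-case expansion factor mentioned on the download page (actual ratios vary by dump; the function names are my own, purely illustrative):

```python
# Rough disk-space check before unpacking a 7z dump.
# The 100x expansion factor is the worst case warned about on the
# Wikimedia download pages; real dumps may expand by less.

def estimated_decompressed_gb(compressed_gb, expansion_factor=100):
    """Upper-bound estimate of the unpacked size, in GB."""
    return compressed_gb * expansion_factor

def enough_space(compressed_gb, free_gb, expansion_factor=100):
    """True if free disk space covers the worst-case unpacked size."""
    return free_gb >= estimated_decompressed_gb(compressed_gb, expansion_factor)

# An 8 GB enwiki full-history dump could expand to roughly 800 GB,
# so 200 GB of free space would not be sufficient.
print(estimated_decompressed_gb(8))  # 800
print(enough_space(8, 200))          # False
```

This is only an upper bound: streaming the decompressed output through a parser instead of unpacking it to disk avoids needing the full space at once.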
Regards.
Felipe.
Spanner 1234 spanneritwks@gmail.com wrote:
On 22/01/07, Felipe Ortega glimmer_phoenix@yahoo.es wrote:
Hi all.
Finally, the first upload of images created with WikiXRay is available:
http://en.wikipedia.org/wiki/Wikipedia:WikiXRay
For the next updates I'll try to use Commonist for faster bulk uploads.
Any comments about the graphics are welcome.
Regards.
Felipe Ortega.
---------------------------------
_______________________________________________
Wiki-research-l mailing list
Wiki-research-l@lists.wikimedia.org
http://lists.wikimedia.org/mailman/listinfo/wiki-research-l
Hello.
Do you mind a question from a newbie? I just downloaded the 7z file, and it is about 8 GB. How much space do I need to unpack it? I currently have only 200 GB left; is that sufficient?
Regards,
Spanner