On Mon, 2013-03-11 at 05:35 -0500, wiki wrote:
Thank you for the response.
I think those sizes refer to the exported XML, e.g. 41.5 GB is the English xml.bz2 expanded.
I was curious how much extra disk space is needed (and consumed) after importing this English XML dump into MySQL, i.e. how much the database tables use.
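For reference, once the import finishes I suppose the actual usage could be checked with a query along these lines (assuming the dump was loaded into a schema called 'wikidb'; for InnoDB the figures are approximations):

    -- Approximate on-disk size per table, largest first.
    -- 'wikidb' is a placeholder for the schema the dump was imported into.
    SELECT table_name,
           ROUND((data_length + index_length) / 1024 / 1024 / 1024, 2) AS size_gb
    FROM information_schema.TABLES
    WHERE table_schema = 'wikidb'
    ORDER BY (data_length + index_length) DESC;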
- sam -
Andre Klapper writes:
Hi,
On Sat, 2013-03-09 at 02:16 -0600, wiki wrote:
Sorry, I forgot to mention that I have in mind the English wikipedia dump.
http://en.wikipedia.org/wiki/Wikipedia:Database_download says "The size of the 3 January 2013 dump is approximately 9.0 GB compressed, 41.5 GB uncompressed."
andre
Check the page at Meta which I recently updated, in particular the newly minted FAQ: http://meta.wikimedia.org/wiki/Data_dumps/FAQ#How_big_are_the_en_wikipedia_d...
I can't give a good estimate of the space needed for MySQL, though. Enabling compressed tables for the text table might help, and compressing some of the text revisions and storing them with the MediaWiki compressed flag would surely make a difference.
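To sketch the first idea (assuming InnoDB and the usual MySQL 5.5-era prerequisites; the block size here is only an example):

    -- Needs innodb_file_per_table=1 and innodb_file_format=Barracuda.
    -- KEY_BLOCK_SIZE=8 is just an example value; measure before settling on one.
    ALTER TABLE text ROW_FORMAT=COMPRESSED KEY_BLOCK_SIZE=8;

The second idea is roughly what maintenance/storage/compressOld.php does: it gzips revision text and records the gzip flag in old_flags.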
We're only talking about the data here, not the media files.
Ariel