Hi Platonides
First let me thank you very much!
99 GByte, how could that be?
On https://dumps.wikimedia.org/commonswiki/20160305/
there are dumps like
commonswiki-20160305-pages-articles-multistream.xml.bz2
is only 6.7 GByte, and that file is described as containing
"Articles, templates, media/file descriptions, and primary meta-pages, in multiple bz2 streams, 100 pages per stream".
Doesn't this include the SVG files, too?
(Though I haven't found out how to see what it actually contains, because my GUI programs on Win 7 Premium 64-bit regularly crash when trying to open it. And trying to download the version split into 4 files now crashes my computer.)
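One way around the crashing GUI tools is to inspect the archive from the command line instead, streaming it rather than loading it all at once. A minimal sketch, assuming the bzip2 command-line tools are available (the filename is the one from the dump page above):

```shell
# Stream only the first lines of the dump without fully unpacking it
# (GUI tools choke because they try to decompress the whole file at once).
bzcat commonswiki-20160305-pages-articles-multistream.xml.bz2 | head -n 40

# Test the archive's integrity without writing the decompressed output anywhere.
bzip2 -tv commonswiki-20160305-pages-articles-multistream.xml.bz2
```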
If you say 99 GByte is correct, I will start downloading it.
(I have downloaded such big files before with the openstreetmap data, and it worked.)
Best regards
Dieter
On 18.03.2016 00:42, Platonides wrote:
Hello Dieter
That would be just as hard for me as for you.
I have run the equivalent of:
mkdir dieter-svgs
cd dieter-svgs
time wget http://tools.wmflabs.org/heritage/commonswiki_svg_list-2016-03-02.txt.xz
xz -d commonswiki_svg_list-2016-03-02.txt.xz
time wget --force-directories -i commonswiki_svg_list-2016-03-02.txt
cd ..
tar -cjf dieter-svgs-2016-03-02.tar.bz2 dieter-svgs/
I have kept the directory structure in place (by using the
--force-directories parameter) so that it isn't too stressful for the
filesystem. Still, a GUI app will probably choke on it.
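Since a GUI archiver will likely choke on it, the tarball can be handled from the command line instead. A minimal sketch, assuming GNU tar (the filename is the one published below):

```shell
# List the contents first, without extracting ~99 GB to disk;
# -t lists, -j selects bzip2, -f names the archive.
tar -tjf dieter-svgs-2016-03-02.tar.bz2 | head

# Extract the whole tree; this recreates the dieter-svgs/ directory
# with the layout that wget --force-directories produced.
tar -xjf dieter-svgs-2016-03-02.tar.bz2
```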
And after a quite long wait, here is the result:
https://archivos.wikimedia.es/dieter-svgs-2016-03-02.tar.bz2 (99GB)
SHA256 checksum:
48966777cc5f5d733b2a1eaf4d11f86853e9545a56da329929c996e330b38e28  dieter-svgs-2016-03-02.tar.bz2
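After the download finishes, the checksum above can be verified before unpacking. A minimal sketch, assuming GNU coreutils' sha256sum and that the tarball sits in the current directory:

```shell
# Save the published hash in sha256sum's check format ("<hash>  <filename>")
# and let sha256sum compare it against the downloaded file.
echo "48966777cc5f5d733b2a1eaf4d11f86853e9545a56da329929c996e330b38e28  dieter-svgs-2016-03-02.tar.bz2" > dieter-svgs.sha256
sha256sum -c dieter-svgs.sha256
```

If the download is intact, this should report the filename followed by "OK"; any other output means the file should be re-downloaded before extraction.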
Best regards
_______________________________________________
Xmldatadumps-l mailing list
Xmldatadumps-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l