[Xmldatadumps-l] [Wikitech-l] Fwd: Old English Wikipedia image dump from 2005
Platonides
platonides at gmail.com
Thu Nov 17 16:06:03 UTC 2011
Erik Zachte wrote:
> Ariel:
>> Providing multiple terabyte sized files for download doesn't make any kind of sense to me.
>> However, if we get concrete proposals for categories of Commons images people really want
>> and would use, we can put those together. I think this has been said before on wikitech-l if not here.
>
> There is another way to cut down on download size, which would serve a whole class of content re-users, e.g. offline readers.
> For offline readers it is not so important to have 20 MB pictures; what matters is having pictures at all, preferably tens of KB in size.
> A download of all images, scaled down to say 600x600 max, would be quite appropriate (...)
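A side note: scaled-down copies can already be fetched one by one through the API, so this can be experimented with today without a separate dump. A rough Python sketch (the 600px width and the file title are only illustrative examples):

import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"

# Ask the API for a thumbnail URL at most 600px wide instead of the original.
params = {
    "action": "query",
    "titles": "File:Example.jpg",   # illustrative file title
    "prop": "imageinfo",
    "iiprop": "url",
    "iiurlwidth": "600",            # server-side scaling to <=600px width
    "format": "json",
}
req = urllib.request.Request(
    API + "?" + urllib.parse.urlencode(params),
    headers={"User-Agent": "thumb-sketch/0.1 (example)"},
)
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)
for page in data["query"]["pages"].values():
    for info in page.get("imageinfo", []):
        print(info["thumburl"])     # e.g. .../thumb/.../600px-Example.jpg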
I made this tool last month, precisely to allow easy downloading of all the
images in a given category (inspired by WLM needs):
http://toolserver.org/~platonides/catdown/catdown.php
The download it gives you is just a tiny script with the list of URLs to
fetch, but that is enough to complete the job without further manual
intervention. There's also an estimate of how much space you will need to
finish the download.
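For the curious, the idea behind it is roughly the following. This is a Python sketch, not the tool's actual code (the tool is a PHP script on the Toolserver); the category name and output filename are placeholders:

import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"

def category_files(category):
    """Yield (url, size_in_bytes) for every file in the given category."""
    params = {
        "action": "query",
        "generator": "categorymembers",
        "gcmtitle": category,
        "gcmtype": "file",
        "gcmlimit": "500",
        "prop": "imageinfo",
        "iiprop": "url|size",
        "format": "json",
    }
    while True:
        req = urllib.request.Request(
            API + "?" + urllib.parse.urlencode(params),
            headers={"User-Agent": "catdown-sketch/0.1 (example)"},
        )
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        for page in data.get("query", {}).get("pages", {}).values():
            for info in page.get("imageinfo", []):
                yield info["url"], info["size"]
        if "continue" not in data:
            break
        params.update(data["continue"])   # follow API continuation

if __name__ == "__main__":
    total = 0
    # Emit a tiny shell script with one wget line per file,
    # and report how much disk space the originals will need.
    with open("download.sh", "w") as script:
        script.write("#!/bin/sh\n")
        for url, size in category_files("Category:Images from Wiki Loves Monuments 2011"):
            total += size
            script.write("wget -nc '%s'\n" % url)
    print("Estimated space needed: %.1f MB" % (total / 1e6))

Running the emitted download.sh then fetches the files with wget; -nc skips anything already present, so an interrupted download can simply be resumed.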