[Xmldatadumps-l] [Wikitech-l] Fwd: Old English Wikipedia image dump from 2005

Erik Zachte erikzachte at infodisiac.com
Thu Nov 17 00:46:27 UTC 2011


Ariel:
> Providing multiple terabyte sized files for download doesn't make any kind of sense to me. However, if we get concrete proposals for categories of Commons images people really want and would use, we can put those together. I think this has been said before on wikitech-l if not here.  

There is another way to cut down on download size, one that would serve a whole class of content re-users, e.g. offline readers. 
For offline readers it is not so important to have pictures of 20 MB each; what matters is having pictures at all, preferably tens of KB in size. 
A download of all images, scaled down to say 600x600 max, would be quite appropriate for many uses. 
Maps and diagrams would not survive this scaling down (their text becomes illegible), but they are very compact already. 
In fact, the compression ratio of each image is a very reliable predictor of the type of content.
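
As a rough sketch (not actual dump tooling), both ideas fit in a few
lines of Python with Pillow; the paths and the 0.05 threshold below are
illustrative assumptions, not tuned values:

  import os
  from PIL import Image

  MAX_SIZE = (600, 600)

  def make_thumbnail(src, dst):
      # Scale in place so neither side exceeds 600 px, keeping aspect ratio.
      with Image.open(src) as im:
          im.thumbnail(MAX_SIZE)
          im.save(dst)

  def compression_ratio(path):
      # Bytes on disk divided by raw 24-bit pixel bytes (assumes RGB).
      with Image.open(path) as im:
          raw = im.width * im.height * 3
      return os.path.getsize(path) / raw

  def looks_like_diagram(path, threshold=0.05):
      # Line art and maps compress far better than photos, so a very
      # low ratio suggests a diagram; the cutoff is a guess to calibrate.
      return compression_ratio(path) < threshold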

In 2005 I distributed a DVD [1] with all unabridged English Wikipedia texts and all 320,000 images, to be loaded onto a 4 GB CF card for handhelds. 
Now we have 10 million images on Commons, so even scaled-down images would need some filtering, but any such collection would still be 100-1000 times smaller in size.
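
Some back-of-envelope arithmetic, with illustrative rather than measured
averages, makes the point:

  # Rough estimate; the averages are assumptions, not Commons statistics.
  n_images = 10_000_000        # Commons, circa 2011
  avg_original = 2 * 1024**2   # assume ~2 MB per original image
  avg_thumb = 20 * 1024        # assume ~20 KB per 600x600 thumbnail

  full = n_images * avg_original   # ~19 TB
  small = n_images * avg_thumb     # ~190 GB
  print(f"full: {full / 1024**4:.1f} TB, thumbs: {small / 1024**3:.0f} GB, "
        f"factor: {full / small:.0f}x")

With multi-MB originals and thumbnails of tens of KB, the per-image
factor lands in the 100-1000x range claimed above.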

Erik Zachte

[1] http://www.infodisiac.com/Wikipedia/ 
