On 18/04/07, Platonides <Platonides(a)gmail.com> wrote:
> Interesting. Copying one subfolder (1/256 of files) is much more affordable.
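For context on the 1/256 figure: MediaWiki's hashed upload layout stores each file under a two-level directory derived from the MD5 of its name, giving 256 leaf subfolders. A rough sketch of that path scheme (the helper name is mine, not from MediaWiki itself):

```python
import hashlib

def hashed_upload_path(filename: str) -> str:
    """Return a MediaWiki-style hashed subdirectory path for an upload.

    Files live under <hash[0]>/<hash[0:2]>/<filename>, where hash is the
    MD5 hex digest of the filename (spaces stored as underscores). The
    two-hex-digit level gives 16 * 16 = 256 leaf directories, so one
    subfolder holds roughly 1/256 of the files.
    """
    name = filename.replace(" ", "_")
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    return f"{digest[0]}/{digest[:2]}/{name}"
```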
But how would you /copy/ them? The server admin probably doesn't want users
downloading files of that size... which goes back, imho, to the original
problem of not having image dumps.
I suppose the devs could come up with something suitable - dumping one
subfolder at a time and torrenting the lot.
I'll buy a couple of 400GB outboard disks and keep the files up for
torrenting! Anyone else?
- d.
I have a replicating program that will pull ALL images from Wikimedia. It
reads the dumps, spawns parallel processes, and can suck down Commons
completely in 36 hours.

Want a copy?
Jeff
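Jeff's program isn't shown here, but the approach he describes - reading the dump for a file list, then fetching in parallel - can be sketched roughly like this (the function names, thread-based workers, and worker count are my own assumptions, not details of his tool):

```python
import concurrent.futures
import urllib.request
from pathlib import Path

def fetch(url: str, dest_dir: Path) -> str:
    """Download one file into dest_dir; return the saved filename."""
    dest = dest_dir / url.rsplit("/", 1)[-1]
    urllib.request.urlretrieve(url, dest)
    return dest.name

def mirror(urls: list[str], dest_dir: Path, workers: int = 16) -> list[str]:
    """Fetch many image URLs concurrently with a pool of worker threads."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda u: fetch(u, dest_dir), urls))
```

In practice the URL list would come from parsing the image dump, and a polite mirror would throttle its request rate rather than hammer the servers.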
_______________________________________________
foundation-l mailing list
foundation-l(a)lists.wikimedia.org
http://lists.wikimedia.org/mailman/listinfo/foundation-l