On 8/10/07, Brianna Laugher brianna.laugher@gmail.com wrote:
On 11/08/07, David A. Desrosiers desrod@gnu-designs.com wrote:
On Sat, 2007-08-11 at 02:02 +1000, Brianna Laugher wrote:
- Backup.
No image dumps, still? This is quite worrying. To say the least.
It would be faster to drive to the datacenter with a stack of hard drives (or at least a single 500 GB drive) and have someone copy the images onto it directly than it would be to tar them up and have people retrieve them over HTTP.
I don't know how long it would take to fetch a 300+ gigabyte file over HTTP, but I certainly wouldn't want to try it... not without some distributed way of sharing the packets and bandwidth.
OK, my bad wording. The important thing is the backup. I would really like someone to do that manual backup. I don't care whether it exists on the web; I just care that it exists for sure, somewhere (or in a few places). So, you know, if all that's needed is for someone to rent a car, then let's do that. :)
A tool called Wikix has been created that downloads all the images from a wiki. I believe it has been run once or twice on Commons, but that is not often enough. Direct image dumps would be great, but in the meantime this could serve as a stop-gap measure. I would run it regularly and distribute the files myself, but 1) I can't get it to compile for some reason, and 2) my hard drive isn't big enough.
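For anyone who wants a quick stop-gap without compiling Wikix, here is a rough, untested sketch of the idea in Python: pull the full file list through the MediaWiki API's list=allimages query and hand the URLs to any downloader. The endpoint, limit, and continuation handling below are assumptions based on the standard api.php interface, not on how Wikix actually works.

# Rough sketch (not Wikix itself): enumerate every file's URL via the
# MediaWiki API (list=allimages), so the files can then be mirrored
# with any HTTP client. Adjust the API URL and limit as needed.
import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"
HEADERS = {"User-Agent": "image-backup-sketch/0.1 (mailing-list example)"}

def iter_image_urls():
    """Yield (name, url) for every file on the wiki, 500 at a time."""
    base = "action=query&list=allimages&aiprop=url&ailimit=500&format=json"
    cont = ""
    while True:
        req = urllib.request.Request(API + "?" + base + cont, headers=HEADERS)
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        for img in data["query"]["allimages"]:
            yield img["name"], img["url"]
        if "continue" not in data:   # last batch reached
            return
        cont = "&aicontinue=" + urllib.parse.quote(data["continue"]["aicontinue"])

if __name__ == "__main__":
    for name, url in iter_image_urls():
        # Feed this list to wget/curl, rsync the result somewhere safe.
        print(name, url)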
cheers
Brianna
-- They've just been waiting in a mountain for the right moment: http://modernthings.org/
-UH.