I suggested a similar idea in another thread in this mailing list.
Seriously, I don't know why, after 10 years (since Wikipedia's creation), we
haven't adopted a mirror system like the one used for GNU/Linux ISOs.
Some weeks ago, I wrote a script (I can share it with interested people) to
download every 7z pages-meta-history file from
download.wikimedia.org, and
it took only about 100 GB. In just 100 GB we have the full texts and
histories of all languages in all projects. Very cheap. In the future we
can talk about backing up images, etc., but I don't really understand why
Wiki[mp]edia texts are not replicated to *every* country in the world, to
avoid being vulnerable to disasters, human errors, censorship, etc.
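For illustration, a minimal sketch of what such a download script might look like, assuming the filename pattern visible in the public dump listings (`{wiki}-{date}-pages-meta-history.xml.7z`); the wiki names and date below are placeholders, not a definitive list:

```python
# Sketch: build the URLs of the 7z full-history dumps for a set of wikis.
# The host and filename pattern are assumed from the public dump listings;
# actually fetching them (e.g. with wget or urllib) is left out.
BASE = "http://download.wikimedia.org"

def dump_url(wiki, date):
    """Return the URL of the 7z pages-meta-history dump for one wiki/date."""
    name = f"{wiki}-{date}-pages-meta-history.xml.7z"
    return f"{BASE}/{wiki}/{date}/{name}"

if __name__ == "__main__":
    # Placeholder wiki database names and dump date.
    for wiki in ("enwiki", "itwiki", "ptwiki"):
        print(dump_url(wiki, "20100916"))
```

Feeding the resulting URLs to a downloader in a loop is all the script needs to do; the hard part is keeping the date current and verifying the files against the published checksums.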
2010/9/16 Federico Leva (Nemo) <nemowiki(a)gmail.com>
John Vandenberg, 16/09/2010 03:00:
English, French, German, Italian, Polish,
Portuguese, Swedish and
Chinese Wikipedia all appear to have some mirrors, but are any of them
reliable enough to be used for disaster recovery?
Obviously not, at least the Italian ones.
The smaller projects are easier to back up, as
they are smaller. I am
sure that with a little effort and coordination, chapters,
universities and similar organisations would be willing to routinely
back up a subset of projects, and combined we would have multiple
current backups of all projects.
I agree. Now we have only this:
http://www.balkaninsight.com/en/main/news/21606/
How many TB are needed? I don't know what the average is, but e.g. right
now my university should have about 50 TB of free disk space (which is
not so much, after all).
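To put those two figures side by side, a quick back-of-the-envelope check, using the ~100 GB estimate for all 7z text dumps from earlier in this thread:

```python
# Back-of-the-envelope: how many full text-dump copies fit in 50 TB?
dump_size_gb = 100    # all 7z pages-meta-history files, from the earlier message
free_space_tb = 50    # free disk space at one university, per this message
copies = (free_space_tb * 1000) // dump_size_gb
print(copies)  # → 500
```

So a single mid-sized institution could in principle hold hundreds of complete text backups; even one copy per mirror would be a tiny commitment.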
Nemo
_______________________________________________
foundation-l mailing list
foundation-l(a)lists.wikimedia.org
Unsubscribe:
https://lists.wikimedia.org/mailman/listinfo/foundation-l