On 27.02.2008, 19:46 Andrew wrote:
I run the "Wikipedia for Schools" static version and am looking at size and download options for the 2008 edition. Clearly we will keep offering this for free, but I wonder if anyone has a better "how to" solution.
The 2007 thumbnails version is 700 MB, and even eight months after launch people are still pulling 500 GB a month of download bandwidth. The larger version (2.3 GB) was on BitTorrent, but that seems to be broken, so we are mailing out DVDs at a high rate too.
The current list of rated articles is nearing 30,000 (8x last year's count), so either we have to think about a very different product (it won't run directly from a single DVD; it would need multiple discs) or we have to select much more on relevance as well as just accepting anything of adequate quality. Probably we will go for a size where the thumbnail or mid-size-image version still fits on one DVD, with 8,000-10,000 articles, plus a larger version at 5 GB or so offered as some sort of compressed download. However, when we put up a newer, larger version, we are worried about just how big the bandwidth demand might be. We are also wondering whether to retry BitTorrent, whether there is something better, and whether to use a remote download service like Amazon.
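A quick back-of-envelope check on the figures quoted above (the 4.7 GB single-layer DVD capacity is my assumption; everything else comes from the numbers in this message):

```python
# Back-of-envelope sizing, using the numbers quoted in this thread.
articles_2008 = 30_000
growth_factor = 8                               # "8x last year"
articles_2007 = articles_2008 / growth_factor   # ~3,750 rated articles in 2007

full_2007_gb = 2.3                              # 2007 "larger" version
naive_2008_gb = full_2007_gb * growth_factor    # if size scales with article count

dvd_gb = 4.7                                    # assumed single-layer DVD capacity
print(f"2007 article count: ~{articles_2007:.0f}")
print(f"Naively scaled 2008 full version: {naive_2008_gb:.1f} GB "
      f"(~{naive_2008_gb / dvd_gb:.1f} DVDs)")

# Bandwidth: 500 GB/month of downloads of a 700 MB file is roughly
# this many complete downloads per month.
downloads_per_month = 500_000 / 700
print(f"Approx. downloads/month of the thumbnails version: {downloads_per_month:.0f}")
```

That naive scaling works out to roughly 18.4 GB, about four single-layer DVDs, and on the order of 700 full downloads a month, which is why selecting harder on relevance (or moving bandwidth onto peers) matters.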
So: you must have some really big downloads and bandwidth demands to handle, and you understand them much better than I do. What should we go for to provide this free service to offline static users at minimum cost?
BitTorrent. It's distributed, fast, and has clients available for every OS on the market.
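For context on what retrying BitTorrent involves on the publisher's side: a .torrent file is just a small bencoded dictionary holding a tracker URL and SHA-1 hashes of fixed-size pieces of the payload, so peers can verify and exchange chunks independently. A minimal sketch of building one for a single file (the tracker URL and piece size in any real use would be your own choices, not these placeholders):

```python
import hashlib
import os

def bencode(value):
    """Bencode an int, str/bytes, list, or dict (BitTorrent's metainfo encoding)."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode("utf-8")
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        # Dictionary keys must be byte strings, sorted lexicographically.
        items = sorted((k.encode("utf-8") if isinstance(k, str) else k, v)
                       for k, v in value.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(f"cannot bencode {type(value)!r}")

def make_torrent(path, tracker, piece_length=256 * 1024):
    """Build single-file .torrent metainfo: SHA-1 the file in fixed-size pieces."""
    pieces = b""
    with open(path, "rb") as f:
        while chunk := f.read(piece_length):
            pieces += hashlib.sha1(chunk).digest()
    meta = {
        "announce": tracker,
        "info": {
            "name": os.path.basename(path),
            "length": os.path.getsize(path),
            "piece length": piece_length,
            "pieces": pieces,
        },
    }
    return bencode(meta)
```

In practice you would use an existing client's torrent-creation feature rather than rolling your own; the point is that the metainfo file is tiny, so hosting it (and a tracker) costs almost nothing while the peers carry the payload bandwidth.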
Max Semenik ([[User:MaxSem]])