Hi guys.
I run the "wikipedia for schools" static version and am looking at size/ download possibilities for the 2008 version. Clearly we will keep offering this for free but I wonder if anyone has a better "how to" solution.
The 2007 thumbnails version is 700 Mb and even 8 months after launch people are still taking 500Gb a month bandwidth on downloads. The larger version ( 2.3Gb) was on Torrent but that seems to be broken so we are mailing out DVDs at a high rate too.
The current list of rated articles is nearing 30,000 (8x last year's), so either we have to think about a very different product (it won't run directly from a single DVD; it would need multiple disks) or we have to select much more on relevance as well as just acceptable quality. Probably we will go for a size where a thumbnail or mid-image-size version still fits on a DVD, with 8-10,000 articles, plus a larger version at 5 GB or so as some sort of compressed download. However, when we put a newer, larger version up we are worried about just how big the bandwidth might be, whether to re-try BitTorrent, whether there is something better, and whether to use a remote download service like Amazon.
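If we did try Amazon, I imagine it would boil down to pushing the archive to S3 and publishing a public URL. A rough Python sketch with the boto library, purely to show the shape of it (the credentials, bucket and filenames below are all made up):

    import boto

    # Placeholder credentials and names -- just a sketch, not our real setup.
    conn = boto.connect_s3('ACCESS_KEY', 'SECRET_KEY')
    bucket = conn.create_bucket('schools-wikipedia-downloads')

    key = bucket.new_key('wikipedia-schools-2008.tar.bz2')
    key.set_contents_from_filename('/data/wikipedia-schools-2008.tar.bz2')
    key.make_public()  # anyone can then fetch it over plain HTTP

    # Unsigned, permanent public URL to put on the download page.
    print(key.generate_url(expires_in=0, query_auth=False))

S3 charges per GB transferred, so the question is whether paying per download works out cheaper than paying for our own bandwidth. If I remember right, S3 will also hand out a .torrent for a public object if you append ?torrent to its URL, which might help with the seeding question too.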
So: you guys must handle some really big downloads and bandwidth, and understand this much better than I do. What should we go for to provide this free service to offline static users at minimum cost?
BozMo
I was going to suggest usenet. Upload it once, and it would get replicated around the world, where people can download it from.
But there are a bunch of drawbacks.
* Retention times: how long it stays on servers before it's purged.
* Assumes people have usenet access and know how to use it.
But it wouldn't incur any download costs at your end.
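For what it's worth, the mechanics are roughly this (a rough Python 2-era sketch using the standard nntplib; the news server, group and address are made up, and a real binary post would use yEnc plus .par2 recovery files rather than base64):

    import base64
    import nntplib
    from StringIO import StringIO

    CHUNK = 500 * 1024  # news servers want smallish articles, so split the archive
    GROUP = 'alt.binaries.e-books.technical'      # placeholder group

    server = nntplib.NNTP('news.example.com')     # placeholder server
    f = open('wikipedia-schools-2008.tar.bz2', 'rb')
    part = 0
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        part += 1
        article = ('From: downloads@example.org\n'
                   'Newsgroups: %s\n'
                   'Subject: wikipedia-schools-2008.tar.bz2 [part %d]\n'
                   '\n'
                   '%s\n') % (GROUP, part, base64.encodestring(data))
        server.post(StringIO(article))            # one article per chunk
    f.close()
    server.quit()

Downloaders would then need a newsreader that can find and reassemble the parts, which is really the "know how to use it" problem above.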
Jared
I wonder if any USB flash drive manufacturer would be interested in putting it on their drives...
On 27.02.2008, 19:46 Andrew wrote:
So: you guys must handle some really big downloads and bandwidth, and understand this much better than I do. What should we go for to provide this free service to offline static users at minimum cost?
BitTorrent. It's distributed, fast and has clients available for every OS on the market.
The torrent that has been provided on the site has 0-1 seeds and 0-1 peers, which is dismal. At the very least it should be officially seeded continuously; you can limit the upload bandwidth. It is better to have a seed that gives slow but persistent speeds than no seeds at all, which makes the torrent look dead.
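A small always-on seed box is cheap to run. A rough sketch of the idea in Python, assuming the python-libtorrent bindings are installed (the torrent filename, save path and 50 KB/s cap are just placeholders):

    import time
    import libtorrent as lt

    ses = lt.session()
    ses.listen_on(6881, 6891)

    info = lt.torrent_info('wikipedia-schools-2008.torrent')
    # The finished files already sit in save_path, so this just checks them and seeds.
    handle = ses.add_torrent({'ti': info, 'save_path': '/data/schools'})
    handle.set_upload_limit(50 * 1024)   # cap the official seed at ~50 KB/s

    while True:
        s = handle.status()
        print('up %.1f kB/s to %d peers' % (s.upload_rate / 1000.0, s.num_peers))
        time.sleep(60)

Even a capped seed like that keeps the swarm alive between bursts of downloaders, which is the point.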
On Thu, Feb 28, 2008 at 5:23 PM, Max Semenik maxsem.wiki@gmail.com wrote:
BitTorrent. It's distributed, fast and has clients available for every OS on the market.
-- Best regards, Max Semenik ([[User:MaxSem]])