On Wed, Apr 16, 2008 at 11:06 AM, Simetrical Simetrical+wikilist@gmail.com wrote:
On Wed, Apr 16, 2008 at 11:01 AM, Earle Martin earle@downlode.org wrote:
I'm not worried about reliability, but I thought the upside of distributing Big Files through BitTorrent was that it took the stress off the originating server, which otherwise has to handle everything (and possibly multiple times, if the download breaks in the middle for whatever reason). I guess this isn't a problem if you have a hefty enough pipe, though.
Or few enough downloaders. Wikimedia's bandwidth use is what, a few Gbps? Is the number of people downloading these dumps actually going to be large enough to make a noticeable difference? I haven't heard that stated as a concern by Brion or anyone, so I assume not.
The obvious solution to get the best of both worlds would be BitTorrent with HTTP seeding. IOW, if there are other seeds or downloaders available, you use them, and if not you fall back to HTTP range requests. I have no idea why that idea never caught on. I guess because the use of BitTorrent to transfer legal files is such a small percentage of total BitTorrent use.
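(The HTTP-fallback half of that idea is simple enough to sketch. This is a minimal illustration, not any real client's code: `resume_download` and `fake_fetch` are hypothetical names, and the "server" is simulated in-memory. The only real protocol piece is the `Range: bytes=N-` header, which is how a client asks the origin server for just the bytes it's still missing instead of restarting from zero.)

```python
def range_header(offset: int) -> str:
    """Build a Range header value asking for everything from `offset` on."""
    return f"bytes={offset}-"

def resume_download(fetch, total: int) -> bytes:
    """Fetch a file of `total` bytes, re-requesting only the missing
    tail after each partial response. `fetch(range_value)` stands in
    for an HTTP GET carrying that Range header."""
    buf = bytearray()
    while len(buf) < total:
        chunk = fetch(range_header(len(buf)))
        if not chunk:  # server stopped responding; give up for now
            break
        buf.extend(chunk)
    return bytes(buf)

# Simulated origin server that only ever returns 10 bytes per request,
# mimicking a connection that keeps dropping mid-download.
DATA = b"wikipedia-dump-" * 2  # stands in for a big dump file

def fake_fetch(range_value: str) -> bytes:
    start = int(range_value.split("=")[1].rstrip("-"))
    return DATA[start:start + 10]

if __name__ == "__main__":
    assert resume_download(fake_fetch, len(DATA)) == DATA
```

A BitTorrent client with HTTP seeding would do essentially this whenever the swarm has no useful peers, which is why the fallback costs the origin server no more than a plain HTTP download would have.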