[Foundation-l] [Wikitech-l] BitTorrent Downloads
Jeff V. Merkey
jmerkey at wolfmountaingroup.com
Wed Apr 25 16:36:38 UTC 2007
Jeff V. Merkey wrote:
>Steve Sanbeg wrote:
>
>>On Tue, 24 Apr 2007 17:22:57 -0600, Jeff V. Merkey wrote:
>>
>>>Since I have open sourced the wikix program, now anyone wanting the
>>>images can download them directly from Wikipedia.
>>>I am in process of restructuring the WikiGadugi site, so anyone wanting
>>>the bittorrent downloads need to finish up this week,
>>>as I will discontinue them shortly since folks now have the ability to
>>>download them directly. The wikix program
>>>is not very intensive on the main Wikimedia servers. The program is
>>>setup to behave as several workstations, and it really
>>>does not take that long to get the images.
>>>
>>I was under the impression that bulk downloads needed to be throttled, and
>>that it would take a lot longer than that to download everything. Does
>>this just grab the images as fast as it can get them? Is that allowed?
>>
>It's faster to get them from Wikipedia. The bittorrent downloads take
>about 1 1/2 weeks to pull down the archive, while using wikix
>directly takes only about 1 1/2 days given the current size of the image set
>for Commons.
>
>Getting them from Wikipedia is faster because of the Squid caching, both
>locally and Internet-wide. My analysis of the
>data sets from Wikipedia indicates that 60% of the images are cached
>either locally on Squid or at other remote
>cache servers.
>
>Since they are cached in a distributed manner, the program only
>accesses Wikipedia intermittently. Copyvio is a bigger
>issue than performance. My image mirroring with wikix has had almost no
>noticeable impact on Wikipedia. The program
>behaves like 16 workstations, so Wikipedia seems able to handle it
>with little additional overhead. Given the number
>of Squid servers Brion has active, I think the impact is minimal in
>comparison to the massive amount of traffic the site gets
>daily.
>
>Jeff
>
Also, the number of people who actually need to do this seems small. To
date, only 9 folks have downloaded
the image archive over a one-month period. That's a small number. I will leave
the bittorrent active at wikigadugi,
and throttled, since it has little impact on my bandwidth at present, for
the next month or so if folks still
want to get at it. Wikix is a better method, and given all the "gloom and
doom" talk about creating
backup sites for Wikimedia (which I think is probably not as big a
concern as people think), the wikix tool's time has come,
and folks should get access to it if they feel a need to mirror
Wikipedia sites elsewhere. At least it gives the
community the tools to do this.
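
For anyone who wants a feel for the approach, here is a minimal sketch of a
polite bulk image fetcher along the same lines. It assumes you already have a
plain-text list of image URLs (for example, extracted from a dump); the worker
count, per-request delay, file names, and user agent are illustrative
assumptions, not the wikix code itself.

    #!/usr/bin/env python3
    # Sketch of a polite parallel image fetcher -- illustrative only,
    # not the actual wikix implementation.
    import os
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    WORKERS = 16          # "behaves like 16 workstations"
    DELAY_SECONDS = 1.0   # per-worker pause between requests, to stay polite
    OUT_DIR = "images"
    URL_LIST = "image_urls.txt"   # hypothetical input file, one URL per line

    def fetch(url: str) -> None:
        """Download one image, skipping files we already have."""
        name = os.path.join(OUT_DIR, url.rsplit("/", 1)[-1])
        if os.path.exists(name):
            return
        req = urllib.request.Request(
            url, headers={"User-Agent": "image-mirror-sketch/0.1"})
        with urllib.request.urlopen(req) as resp, open(name, "wb") as out:
            out.write(resp.read())
        time.sleep(DELAY_SECONDS)

    if __name__ == "__main__":
        os.makedirs(OUT_DIR, exist_ok=True)
        with open(URL_LIST) as f:
            urls = [line.strip() for line in f if line.strip()]
        with ThreadPoolExecutor(max_workers=WORKERS) as pool:
            # Consume the iterator so any download errors surface here.
            list(pool.map(fetch, urls))

Dialing the load up or down is just a matter of changing the two constants at
the top; the skip-if-present check also means an interrupted run can simply be
restarted.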
Jeff