It's not high priority, that's right, but you should also look at it from the user's point of view. If a large download breaks at 85%, you have to redownload the whole file. I know there are download managers, but with BitTorrent you get extra certainty about the validity of your file: each piece (typically up to a few MB) carries a SHA-1 hash, so corrupted pieces are detected and can be re-fetched rather than going unnoticed. Lowering the bandwidth load on the download server is just an extra advantage.
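To illustrate the per-piece integrity check described above, here is a small sketch of how a client might verify a downloaded file against a list of expected piece hashes. The function names and the 1 MiB piece size are illustrative choices, not part of any real client; BitTorrent itself stores the concatenated SHA-1 piece hashes in the .torrent metainfo.

```python
import hashlib

PIECE_SIZE = 1 << 20  # 1 MiB; an illustrative, typical piece size

def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list[bytes]:
    """Compute the SHA-1 digest of each fixed-size piece of the file."""
    return [hashlib.sha1(data[i:i + piece_size]).digest()
            for i in range(0, len(data), piece_size)]

def corrupted_pieces(data: bytes, expected: list[bytes],
                     piece_size: int = PIECE_SIZE) -> list[int]:
    """Return indices of pieces whose hash does not match the expected
    list, so only those pieces need to be fetched again instead of
    restarting the whole download."""
    actual = piece_hashes(data, piece_size)
    return [i for i, h in enumerate(actual)
            if i >= len(expected) or h != expected[i]]
```

A download that breaks or corrupts mid-file would then show up as a short list of bad piece indices, and only those pieces are re-requested.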
Annabel
Delirium wrote:
Annabel wrote:
Bot users are able to download database dumps of the Wikimedia projects at http://download.wikimedia.org/backup-index.html. These bzipped dumps can be fairly large (especially for the English language). Wouldn't it be handy to have a BitTorrent link to lower the load on the download server?
This was recently discussed briefly on wikitech-l, and the tentative consensus seemed to be that, while that might be useful, it's not a high priority because downloads of the database dumps don't currently account for a large percentage of Wikimedia's total bandwidth consumption, and so aren't the first place to look for bandwidth savings.
-Mark