It's not high priority, that's right, but you should also look at it from the user's point of view. If a large download breaks at 85%, you have to redownload the whole file. I know there are download managers, but with BitTorrent you get extra certainty about the validity of your file: every 1 MB or so there is a SHA-1 piece hash, so a corrupt download cannot go undetected. Lowering the bandwidth load on the download server is just an extra advantage.
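The per-piece verification can be sketched roughly like this (a minimal illustration, not real BitTorrent client code; the 1 MiB piece size and function names are just assumptions for the example):

```python
import hashlib

PIECE_SIZE = 1024 * 1024  # assume 1 MiB pieces, roughly as described above

def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list:
    """Hash each fixed-size piece, the way a .torrent file stores
    a SHA-1 digest per piece of the payload."""
    return [hashlib.sha1(data[i:i + piece_size]).digest()
            for i in range(0, len(data), piece_size)]

def corrupt_pieces(data: bytes, expected: list,
                   piece_size: int = PIECE_SIZE) -> list:
    """Return the indices of pieces whose hash no longer matches.
    Only these pieces need to be re-fetched, not the whole file."""
    return [i for i, h in enumerate(piece_hashes(data, piece_size))
            if i >= len(expected) or h != expected[i]]

# Simulate a download that goes bad near 85%: flip one byte and
# see that only the single affected piece is flagged for re-download.
data = bytes(4 * PIECE_SIZE)
expected = piece_hashes(data)
damaged = bytearray(data)
damaged[3_500_000] ^= 0xFF
print(corrupt_pieces(bytes(damaged), expected))
```

So instead of restarting from zero after a broken transfer, a client only re-requests the pieces that fail their hash check.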
A download manager is a better way to increase your chances of a clean download. BitTorrent only works well if lots of people are downloading at the same time (or at least at similar enough times that those who have finished haven't given up seeding). I'm not sure that's the case with Wikipedia DB dumps.