I just thought everyone would like to know:
With Avar's help, I wrote a Unix command-line script to upload files to Wikipedia (specifically, it's designed for uploading to Commons, but it's trivial to point it at a different project).
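At bottom, the idea is just an HTTP POST to the Special:Upload form. Here's a rough sketch of it (this is not the actual script, and the wp* field names are from memory, so check them against the live form before trusting this):

    #!/bin/sh
    # upload.sh -- sketch: push one file to commons via the Special:Upload form.
    # Usage: ./upload.sh file.ogg "Destination name.ogg" "description text"
    # Assumes cookies.txt already holds a logged-in session.
    FILE="$1"; DEST="$2"; DESC="$3"
    curl -b cookies.txt \
         -F "wpUploadFile=@${FILE}" \
         -F "wpDestFile=${DEST}" \
         -F "wpUploadDescription=${DESC}" \
         -F "wpIgnoreWarning=1" \
         -F "wpUpload=Upload file" \
         "http://commons.wikimedia.org/wiki/Special:Upload"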
Using wget to fetch whole websites, oggasm to transcode the audio into Oggs, and the upload script to send them off, I've been using a bot to throw large numbers of files onto Commons.
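The harvesting loop is nothing fancy. Something like this, in shape (example.org is a placeholder, and I'm glossing over oggasm's exact invocation):

    #!/bin/sh
    # sketch: mirror a site, transcode the mp3s to ogg, upload each result
    wget --mirror --no-parent -A mp3 http://example.org/audio/
    find example.org -name '*.mp3' | while read f; do
        oggasm "$f"                        # mp3 -> ogg (exact flags may vary)
        ogg="${f%.mp3}.ogg"
        ./upload.sh "$ogg" "$(basename "$ogg")" "From http://example.org/ (CC-by-SA)"
    done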
You can see the results from the first run here: http://en.wikipedia.org/wiki/Wikipedia:Sound/list (Look for files contributed by Raulbot) and http://commons.wikimedia.org/w/index.php?title=Special:Contributions&tar...
Not bad, huh? :)
A few things though. The next "victim" will be the Oyez project, which puts US Supreme Court oral argument recordings online. (Like the first site I harvested, their content is CC-by-SA.)
I would like to see the upload limit on Commons bumped to at least 40 megs. It's hard to fit an entire SCOTUS oral argument into 20 megs -- even at a modest 48 kbit/s, an hour of audio runs about 21 megs, and these arguments go an hour or more.
--Mark