Timwi wrote:
The Wiki way is not to disallow something by technical means, but by community means.
In other words, the "correct" way to handle large files would be to have a Special page listing the largest files by size. In fact, I'm pretty sure one of the Special pages can already do this. Then all we need to do is get some volunteers to patrol this list for ridiculously large files that don't belong on Wikipedia.
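For reference, such a "largest files" listing boils down to a single ordered query. A minimal sketch in Python, against an illustrative "uploads" table (the table and column names here are assumptions, not MediaWiki's actual schema):

    import sqlite3

    def largest_files(conn, limit=50):
        """Return the `limit` largest uploads, biggest first."""
        cur = conn.execute(
            "SELECT name, size FROM uploads ORDER BY size DESC LIMIT ?",
            (limit,),
        )
        return cur.fetchall()

    if __name__ == "__main__":
        conn = sqlite3.connect("wiki.db")   # hypothetical database file
        for name, size in largest_files(conn):
            print(f"{size:>12}  {name}")

Patrolling then just means eyeballing the top of that list from time to time.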
Files are uploaded to the /tmp directory on the apache in question. If the file has the wrong extension or is over the recommended maximum of 150 KB, a form is displayed asking the user if they really want to upload it. There are two buttons, "save file" and "re-upload". Clicking "re-upload" deletes the file from /tmp. Clicking "save file" moves it to NFS and registers it in the database.

Doing neither leaves the file sitting on the local hard drive until the machine is restarted or someone manually cleans it off. Currently some of the apaches have only 10 GB free, and in the past they have occasionally run out completely.

Now it's true that an attacker could write a script to upload 2000 5 MB files and thereby fill up the hard drive, but that requires more technical knowledge than putting one's movie collection into a huge zip file and clicking "upload".
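To make the flow concrete, here is a minimal sketch in Python of the confirmation logic described above. The paths, the extension whitelist, the size limit's enforcement point, and the register() hook are all illustrative assumptions, not MediaWiki's actual upload code:

    import os
    import shutil

    MAX_RECOMMENDED = 150 * 1024                   # 150 KB soft limit
    ALLOWED_EXTENSIONS = {".png", ".jpg", ".ogg"}  # illustrative whitelist
    TMP_DIR = "/tmp"
    NFS_DIR = "/mnt/upload"                        # hypothetical NFS mount

    def needs_confirmation(filename):
        """Wrong extension or over the soft limit -> show the form."""
        ext = os.path.splitext(filename)[1].lower()
        size = os.path.getsize(os.path.join(TMP_DIR, filename))
        return ext not in ALLOWED_EXTENSIONS or size > MAX_RECOMMENDED

    def save_file(filename, register):
        """'save file': move from /tmp to NFS, register in the database."""
        shutil.move(os.path.join(TMP_DIR, filename),
                    os.path.join(NFS_DIR, filename))
        register(filename)                         # hypothetical DB hook

    def re_upload(filename):
        """'re-upload': discard the temporary copy."""
        os.remove(os.path.join(TMP_DIR, filename))

Note that nothing here removes the /tmp copy if the user walks away after the confirmation form is shown; that is exactly the stale-file problem described above.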
-- Tim Starling