Platonides wrote:
> Process? Tools?
> It would just be making a 'bigupload' right for people to bypass file
> size restrictions (or have an extremely high one).
> Then give it to sysops or a new group.
Tell me if I'm wrong, but as far as I know, the file size is limited by PHP, not
by MediaWiki. And it has to be: if we admitted huge files to be uploaded and
only rejected them later in MediaWiki, that would already be an attack
vector - because, afaik, PHP had the dumb idea of buffering uploads in RAM. So,
to kill the server, one would just upload a 5GB file.
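For reference, the PHP-side caps live in php.ini and are enforced before MediaWiki ever sees the request (MediaWiki additionally has its own $wgMaxUploadSize setting). The values below are illustrative only:

```ini
; Illustrative php.ini upload limits - PHP rejects anything larger
; before the request reaches MediaWiki
upload_max_filesize = 100M
post_max_size = 100M        ; must be >= upload_max_filesize
max_execution_time = 300    ; long uploads need more wall-clock time
```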
> The only thing is to keep those users from uploading gigantic files when
> storage nodes are running out of space. It could even be automated so the
> bigupload right would only be effective if there's at least X or X% free
> space on disk.
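The proposed policy - honor the right only while the volume has headroom - is easy to sketch. The function name and the 10% threshold below are illustrative assumptions, not part of MediaWiki:

```python
import shutil

# Assumed policy: the 'bigupload' right only takes effect while at
# least min_free_fraction of the storage volume is still free.
MIN_FREE_FRACTION = 0.10  # illustrative 10% headroom requirement

def bigupload_allowed(storage_path, min_free_fraction=MIN_FREE_FRACTION):
    usage = shutil.disk_usage(storage_path)
    return usage.free / usage.total >= min_free_fraction
```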
If this worked, that would be cool. But as I said, afaik it's not possible, for
technical reasons.
> Of course we would also need an interface able to at least resume
> interrupted uploads, to make it really useful.
That would be helpful. Also helpful would be the ability to upload archive files
containing multiple images. If we had a way to deal with uploading big files,
this would become feasible.
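Resuming an interrupted upload by offset is the simplest scheme: the receiver reports how many bytes it already stored and the sender transmits only the remaining tail. A minimal sketch, using local files to stand in for the network transfer (this is an illustration, not MediaWiki code):

```python
import os

CHUNK = 64 * 1024  # read in 64 KiB chunks rather than all at once

def resume_transfer(src_path, dst_path):
    # How much the "server" already has; 0 if nothing arrived yet
    offset = os.path.getsize(dst_path) if os.path.exists(dst_path) else 0
    with open(src_path, "rb") as src, open(dst_path, "ab") as dst:
        src.seek(offset)  # skip what was already transferred
        while True:
            chunk = src.read(CHUNK)
            if not chunk:
                break
            dst.write(chunk)
```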
> I made a proposal years
> ago based on an FTP upload interface. Maybe you are referring to
> something similar. Please keep me posted.
> Upload from URL and Firefogg should alleviate the issue, though.
A relatively simple way would be to allow big files to be uploaded via FTP or
any other protocol to "dumb storage", and then transfer and import them server
side. I'd propose a ticket system for this: people with a special right could
generate a ticket good for uploading a single file, for instance. But it's just
an idea so far.
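The ticket idea boils down to single-use tokens: a privileged user issues one, and importing a file from dumb storage consumes it. A sketch under those assumptions (the names and the in-memory store are illustrative, not an existing MediaWiki API):

```python
import secrets

# In-memory store of outstanding tickets; a real system would
# persist these and record who issued each one.
_tickets = set()

def issue_ticket():
    """Generate a one-time token authorizing a single big-file import."""
    token = secrets.token_urlsafe(16)
    _tickets.add(token)
    return token

def redeem_ticket(token):
    """Consume the ticket if valid; each ticket works exactly once."""
    if token in _tickets:
        _tickets.discard(token)
        return True
    return False
```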
-- daniel