Platonides wrote:
Tell me if I'm wrong, but as far as I know, the file size is limited by PHP, not
by MediaWiki. And it has to be: if we admitted huge files to be uploaded
before they are finally rejected by MediaWiki, that would already be an attack
vector, because, AFAIK, PHP got the dumb idea of buffering uploads in RAM. So,
to kill the server, just upload a 5 GB file.
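For reference, the PHP-side size limits the paragraph above refers to are set in php.ini; the relevant directives really exist, though the exact values below are just illustrative:

```ini
; Maximum size of an individual uploaded file
upload_max_filesize = 100M
; Maximum size of the whole POST body (must be >= upload_max_filesize)
post_max_size = 100M
; Per-request memory ceiling - whatever PHP does buffer counts against this
memory_limit = 128M
```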
Really? That makes sense for text POSTs, but it's not very smart for files...
I didn't check for myself, but that's what I was told when we discussed this
matter with Brion and Mark at FOSDEM.
And yes, it's utterly stupid. But that doesn't mean PHP won't do it.
I was thinking of an FTP server where you log in with your wiki
credentials and get a private temporary folder. You can view pending
files, delete, rename, append, and create new ones (you can't read them
back, though, to avoid the server being used as a file-sharing service).
You are given a quota, so you can upload a few large files or many
small ones. Files get deleted after being untouched for some period of time.
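The quota and expiry bookkeeping described above could be sketched roughly like this (all names here are hypothetical; the expiry period is unspecified in the proposal, so a one-week placeholder is used):

```python
import time

# Placeholder for the unspecified "X time untouched" expiry period.
MAX_AGE = 7 * 24 * 3600  # one week, in seconds

class StagingArea:
    """Per-user staging area with a byte quota and file expiry."""

    def __init__(self, quota_bytes):
        self.quota = quota_bytes
        self.files = {}  # name -> (size_bytes, last_touched_timestamp)

    def free(self):
        """Bytes of quota still available."""
        return self.quota - sum(size for size, _ in self.files.values())

    def upload(self, name, size, now=None):
        """Record a pending upload, rejecting it if it exceeds the quota."""
        if now is None:
            now = time.time()
        if size > self.free():
            raise IOError("quota exceeded")
        self.files[name] = (size, now)

    def finish(self, name):
        """Publish the file on the wiki; its share of the quota is returned."""
        self.files.pop(name)

    def expire(self, now=None):
        """Drop files untouched for longer than MAX_AGE."""
        if now is None:
            now = time.time()
        self.files = {n: (s, t) for n, (s, t) in self.files.items()
                      if now - t < MAX_AGE}
```

Finishing an upload (or letting it expire) frees the quota again, matching the "you are returned the file-size quota" step below.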
When you go to the page name the file would have on the wiki, there's a message
reminding you of the pending upload and inviting you to finish it, where
you get the normal upload fields. After the transfer, the file becomes
public and the file-size quota is returned to you.
Having a specific protocol for uploads also allows storing them directly
on the storage nodes, instead of writing them via NFS from the Apaches.
Yes, that sounds pretty nice.
-- daniel
_______________________________________________
Commons-l mailing list
Commons-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/commons-l