Tim Starling wrote:
PHP stores the entire contents of the POST request in memory as it is
receiving it. That's why we can't allow arbitrarily large uploads; the
server would run out of memory. In any case, HTTP does not support
resuming for uploads, so it's quite fragile.
The file share service YouSendIt.com, which now only provides 100 MB of
free upload, used to support uploads of files up to 1 GB with only a
basic web form.
Maybe they are willing to share how they do that if asked.
What may also be an idea is to use one specific server for the upload
of large files. When you enter a request to upload a large file, it is
not uploaded immediately but queued.
The user could get a message like: "There are 14 uploads before you.
Estimated time to completion: 3 hours 53 minutes."
That is not a fantastic system, but at least you could upload large files.
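A queue like that could estimate the wait from the total size of the
uploads ahead and an assumed server throughput. A minimal sketch (the
class, function, and throughput figure are all hypothetical, not part
of any existing MediaWiki code):

```python
from dataclasses import dataclass

@dataclass
class QueuedUpload:
    size_bytes: int

def estimate_wait(queue, throughput_bytes_per_sec):
    """Estimate total wait for everything ahead of you in the queue."""
    total = sum(u.size_bytes for u in queue)
    seconds = int(total / throughput_bytes_per_sec)
    hours, rem = divmod(seconds, 3600)
    minutes = rem // 60
    return len(queue), f"{hours} hours {minutes} minutes"

# Assumed example: 14 queued uploads of 500 MB each, 500 KB/s throughput.
queue = [QueuedUpload(500 * 1024**2) for _ in range(14)]
ahead, eta = estimate_wait(queue, 500 * 1024)
print(f"There are {ahead} uploads before you. "
      f"Estimated time to completion: {eta}")
```

The real estimate would of course fluctuate as uploads finish faster or
slower than assumed, so the message would need to be refreshed.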
Currently, when you upload a file in an unsupported format, I have the
very strong impression that the check of whether it is allowed is done
after the file is uploaded. That is annoying for the user, and
uploading the file is a waste of resources.
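At least the filename extension could be checked before any bytes are
transferred, since the browser sends the filename with the request. A
minimal sketch of such a pre-upload check (the function name and the
whitelist are illustrative assumptions; MediaWiki's actual allowed list
lives in its configuration):

```python
import os

# Hypothetical whitelist of allowed extensions, for illustration only.
ALLOWED_EXTENSIONS = {"png", "jpg", "jpeg", "gif", "svg", "ogg"}

def is_allowed_filename(filename):
    """Reject files with a disallowed extension before the transfer starts."""
    ext = os.path.splitext(filename)[1].lstrip(".").lower()
    return ext in ALLOWED_EXTENSIONS

print(is_allowed_filename("photo.JPG"))  # True
print(is_allowed_filename("movie.exe"))  # False
```

A check like this only inspects the name, not the contents, so the
server would still need its full validation after the upload; the point
is just to fail the obvious cases early.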
--
Contact: walter AT wikizine DOT org
Wikizine.org - news for and about the Wikimedia community