[Toolserver-l] Fwd: [Tagging] Download huge file via HTTP

Platonides platonides at gmail.com
Tue Nov 9 23:21:06 UTC 2010


Daniel Kinzler wrote:
> On 09.11.2010 15:23, bawolff wrote:
>> Considering that dumps.wikimedia.org serves files in excess of 100 GB,
>> I don't think the HTTP protocol is the issue. However, when you try to
>> request the file (or even just do a HEAD request) the web server
>> closes the connection before sending any data, suggesting it's a web
>> server issue, not an HTTP protocol issue.
>>
>> -bawolff
> 
> Some servers, clients and proxies break at the 31- or 32-bit boundary (2 or 4
> GB, respectively) for the Content-Length field. wget is a prominent example of
> this (or was, when I last tried it a couple of years ago).
> 
> -- daniel

wget works fine with big files.
I remember that some old wget versions had issues drawing the download
progress bar at that boundary (it went crazy), but still, I think it got
the job done.
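The 31/32-bit boundary Daniel mentions is easy to illustrate. A minimal sketch (hypothetical values, not the actual wget code) of what happens when a client stores the Content-Length header in a signed 32-bit integer:

```python
def as_signed_i32(n):
    """Truncate n to a signed 32-bit integer, as a buggy client might
    when parsing a large Content-Length header."""
    return (n + 2**31) % 2**32 - 2**31

five_gb = 5 * 1024**3   # Content-Length of a 5 GiB dump file
three_gb = 3 * 1024**3  # Content-Length of a 3 GiB dump file

print(as_signed_i32(five_gb))   # 1073741824 -- reported as 1 GiB, silently wrong
print(as_signed_i32(three_gb))  # -1073741824 -- negative; many clients abort here
```

A negative "length" explains clients that refuse the transfer outright, while a wrapped positive value explains progress bars that go crazy even though the download itself completes.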


