Daniel Kinzler wrote:
> Also, several HTTP clients don't like files over 2 GB - this is because
> the large number of bytes in the Length field causes an integer overflow
> (2 GB is the 31-bit limit). wget likes to die with a segmentation fault
> on those. I found that curl works.
wget supports such big files. Perhaps you're running an old version?
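For anyone curious what the 31-bit limit actually looks like: a client that stores the length in a signed 32-bit integer wraps to a negative number at exactly 2 GB, which is why size checks and progress code misbehave. A quick illustration (using ctypes just to emulate a C signed int):

```python
# Demonstrate why 2 GB breaks clients that keep the Content-Length
# in a signed 32-bit int: 2**31 doesn't fit and wraps negative.
import ctypes

length = 2**31                          # a file right at the 31-bit limit
as_int32 = ctypes.c_int32(length).value # what a C "int" would hold
print(as_int32)                         # -2147483648, so "size" looks negative
```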
IMHO the benefits of splitting the files are comparable to the
disadvantages. A side benefit would be that the hashes would be split,
too. If you were unlucky, knowing only that 'something' (perhaps just a
single bit) in the 150 GB you downloaded is wrong is not that helpful.
So having hashes for file sections on the big ones, even if not
'standard', would be an improvement.
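Per-section hashes wouldn't need any new tooling on the download side; a rough sketch of the idea (the function name and the 64 MiB section size are just made up for illustration, not any existing dump tool):

```python
# Sketch: hash every fixed-size section of a big dump separately,
# so a mismatch points at one section instead of the whole 150 GB.
import hashlib

CHUNK = 64 * 1024 * 1024  # 64 MiB sections; the granularity is arbitrary

def section_hashes(path, chunk=CHUNK):
    """Return one MD5 hex digest per `chunk`-sized section of the file."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            hashes.append(hashlib.md5(data).hexdigest())
    return hashes
```

Comparing the list against a published one then tells you which section to re-fetch instead of restarting the whole download.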