Hi there,
I am trying to download the German "old" table from http://download.wikimedia.org/#wikipedia. The link says that the size of that file is 13.97 GiB.
However, the file I get is just 2 GB (definitely), and there are no additional files like the ones described on the Wikipedia download help page.
Can somebody please point me to the proper file? The one I tried is not complete (which is what gzip tells me).
My guess is that the file is stored on a server which does not support files > 2GB and that this is just a mistake.
Any suggestions?
Thank you in advance,
merlin
Merlin wrote in gmane.science.linguistics.wikipedia.technical:
I am trying to download the German "old" table from http://download.wikimedia.org/#wikipedia. The link says that the size of that file is 13.97 GiB.
However, the file I get is just 2 GB (definitely), and there are no additional files like the ones described on the Wikipedia download help page.
are you sure the program you are using to download it supports large files?
? 101> curl >/dev/null http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz
  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
 22 14.0G   22 3.1G     0     0 27.5M      0  0:08:40  0:01:55  0:06:45  28.2M
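For reference, a quick way to check whether a given curl build has large-file support is to look for "Largefile" on the Features line of curl -V. The output below is only an example; the exact feature list depends on how curl was built:

$ curl -V
curl 7.12.1 (i386-redhat-linux-gnu) libcurl/7.12.1
Features: IPv6 Largefile libz

If "Largefile" is missing, the client typically stops (or its counters wrap around) at the 2 GB mark.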
merlin
kate.
Hi Kate,
Are you sure? That command gives me this output:

curl >/dev/null http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz
  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
  0 2047M    0 2600k    0     0   449k      0  1:17:48  0:00:05  1:17:42   511k

I am trying this on SuSE 9.1, and files over 2 GB are supported by the file system.
Regards, Merlin
Kate Turner wrote:
Merlin wrote in gmane.science.linguistics.wikipedia.technical:
I am trying to download the German "old" table from http://download.wikimedia.org/#wikipedia. The link says that the size of that file is 13.97 GiB.
However, the file I get is just 2 GB (definitely), and there are no additional files like the ones described on the Wikipedia download help page.
are you sure the program you are using to download it supports large files?
? 101> curl >/dev/null
http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz
  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
 22 14.0G   22 3.1G     0     0 27.5M      0  0:08:40  0:01:55  0:06:45  28.2M
merlin
kate.
Old versions of curl cannot download files >2 GB. Intermediate versions have a display bug. This is probably your problem.
Newer versions of curl can download files >2 GB without problems. Mine is:
$ curl -V
curl 7.12.1 (i386-redhat-linux-gnu) libcurl/7.12.1
The previous version I had (7.10 something) failed.
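Once you have a curl with large-file support, you should not need to start from scratch either: assuming the server accepts byte-range requests, a partial download can be resumed in place with -C - (the URL below is the one from this thread):

$ curl -C - -O http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz

Here -C - tells curl to work out the resume offset from the size of the local file, and -O keeps the remote file name.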
Alfio
On Mon, 23 May 2005, Merlin wrote:
Hi Kate,
Are you sure? That command gives me this output:

curl >/dev/null http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz
  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
  0 2047M    0 2600k    0     0   449k      0  1:17:48  0:00:05  1:17:42   511k

I am trying this on SuSE 9.1, and files over 2 GB are supported by the file system.
Regards, Merlin
Kate Turner wrote:
Merlin wrote in gmane.science.linguistics.wikipedia.technical:
I am trying to download the German "old" table from http://download.wikimedia.org/#wikipedia. The link says that the size of that file is 13.97 GiB.
However, the file I get is just 2 GB (definitely), and there are no additional files like the ones described on the Wikipedia download help page.
are you sure the program you are using to download it supports large files?
? 101> curl >/dev/null
http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz
  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
 22 14.0G   22 3.1G     0     0 27.5M      0  0:08:40  0:01:55  0:06:45  28.2M
merlin
kate.
Kate Turner wrote:
Merlin wrote in gmane.science.linguistics.wikipedia.technical:
I am trying to download the German "old" table from http://download.wikimedia.org/#wikipedia. The link says that the size of that file is 13.97 GiB.
However, the file I get is just 2 GB (definitely), and there are no additional files like the ones described on the Wikipedia download help page.
are you sure the program you are using to download it supports large files?
? 101> curl >/dev/null http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz
  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
 22 14.0G   22 3.1G     0     0 27.5M      0  0:08:40  0:01:55  0:06:45  28.2M
merlin
kate.
Hi,
I am having serious trouble updating the curl software on my running system. That is not as trivial as it seems. That said, I have read that many people have the same problem downloading the larger files. Is there a reason not to place the files on an FTP server? That seems to be the appropriate protocol for file transfers. Why HTTP? I cannot see a reason, but I might lack the technical knowledge for this. Can somebody shed some light on this?
Thank you in advance,
merlin
Merlin wrote:
I am having serious trouble updating the curl software on my running system. That is not as trivial as it seems. That said, I have read that many people have the same problem downloading the larger files. Is there a reason not to place the files on an FTP server? That seems to be the appropriate protocol for file transfers. Why HTTP? I cannot see a reason, but I might lack the technical knowledge for this. Can somebody shed some light on this?
Just because it's called "File Transfer Protocol" doesn't mean you shouldn't transfer files with anything else.
In fact, I prefer HTTP because it uses ONE connection only. FTP uses two.
Of course, this fact shouldn't really make any difference to you. Whether you use FTP or HTTP doesn't make the transfers any faster. Nor does it make curl (or any other download tool) work any better or worse.
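If you are curious, you can see the difference in curl's verbose output (trimmed here, and the exact lines vary by version). An HTTP transfer opens a single TCP connection:

$ curl -v -o /dev/null http://example.org/somefile
* Connected to example.org (...) port 80

whereas an FTP transfer first opens a control connection on port 21 and then a second, separate data connection for the file itself.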
Greetings, Timwi
Timwi wrote:
Merlin wrote:
I am having serious trouble updating the curl software on my running system. That is not as trivial as it seems. That said, I have read that many people have the same problem downloading the larger files. Is there a reason not to place the files on an FTP server? That seems to be the appropriate protocol for file transfers. Why HTTP? I cannot see a reason, but I might lack the technical knowledge for this. Can somebody shed some light on this?
Just because it's called "File Transfer Protocol" doesn't mean you shouldn't transfer files with anything else.
In fact, I prefer HTTP because it uses ONE connection only. FTP uses two.
Of course, this fact shouldn't really make any difference to you. Whether you use FTP or HTTP doesn't make the transfers any faster. Nor does it make curl (or any other download tool) work any better or worse.
Greetings, Timwi
Yes. Since the advent of HTTPS, range transfers, and MD5 digest authentication in HTTP 1.1, HTTP is now my preferred protocol for large file transfers. Being firewall-friendly and stateless is an added bonus.
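As a small illustration of range transfers (a sketch only, assuming the server honours Range requests): curl can ask for an arbitrary byte range of the dump, which is the same mechanism that makes resuming possible:

$ curl -r 0-1048575 -o old_table.part1 http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz

This fetches just the first mebibyte into a local file; -C - builds on the same feature to continue an interrupted transfer where it left off. The output file name is just a placeholder.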
-- Neil
Is rsync still set up for those files?
On 5/24/05, Ævar Arnfjörð Bjarmason avarab@gmail.com wrote:
In fact, I prefer HTTP because it uses ONE connection only. FTP uses two.
Not just two connections, but two connections using two different ports and two different protocols.
Timwi wrote:
Merlin wrote:
I am having serious trouble updating the curl software on my running system. That is not as trivial as it seems. That said, I have read that many people have the same problem downloading the larger files. Is there a reason not to place the files on an FTP server? That seems to be the appropriate protocol for file transfers. Why HTTP? I cannot see a reason, but I might lack the technical knowledge for this. Can somebody shed some light on this?
Just because it's called "File Transfer Protocol" doesn't mean you shouldn't transfer files with anything else.
In fact, I prefer HTTP because it uses ONE connection only. FTP uses two.
Of course, this fact shouldn't really make any difference to you. Whether you use FTP or HTTP doesn't make the transfers any faster. Nor does it make curl (or any other download tool) work any better or worse.
Greetings, Timwi
Hi Timwi,
that sounds plausible. However, it creates a huge problem, since (speaking for myself) I am not able to get curl updated. An update of openssl is needed, which affects the current Apache installation, and so on. Dependencies over and over. I can't get a client running on the SuSE 9.1 system that supports that download size.
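One possible workaround (sketched here, untested on SuSE 9.1): build a private copy of curl from source into a home directory. Since the dump URL is plain HTTP, SSL can be left out entirely, which sidesteps the openssl dependency and leaves the system packages untouched. Roughly, with the version number and install prefix as placeholders:

$ tar xzf curl-7.12.1.tar.gz && cd curl-7.12.1
$ ./configure --prefix=$HOME/curl --without-ssl
$ make && make install
$ ~/curl/bin/curl -C - -O http://download.wikimedia.org/wikipedia/de/20050516_old_table.sql.gz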
Does anybody have a suggestion? Thank you for any hint.
Merlin
Merlin wrote:
I am not able to get curl updated. An update of openssl is needed, which affects the current Apache installation, and so on. Dependencies over and over. I can't get a client running on the SuSE 9.1 system that supports that download size.
Hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha
All hail free software!
Does anybody have a suggestion? Thank you for any hint.
Use Windows.
On Thu, May 26, 2005 at 10:45:00PM +0100, Timwi wrote:
Merlin wrote:
I am not able to get curl updated. An update of openssl is needed, which affects the current Apache installation, and so on. Dependencies over and over. I can't get a client running on the SuSE 9.1 system that supports that download size.
Hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha
All hail free software!
Does anybody have a suggestion? Thank you for any hint.
Use Windows.
s/Windows/Debian/g;
I think that we've got a troll here. Is this list moderated? If not, I'm in a mood to start a flamewar ;-)