Hello,
does anyone know how I can download the English old table?
thank you!
Frederic
_____
From: wikipedia-l-bounces@Wikimedia.org [mailto:wikipedia-l-bounces@Wikimedia.org] On behalf of Frederic Malo
Sent: Tuesday, 13 January 2004 15:28
To: wikipedia-l@Wikimedia.org
Subject: [Wikipedia-l] trouble with gzip2 downloads
Hello,
While installing Wikipedia on my W2K3 server, I had no trouble with the French old and current gzip2 files or with the current English file.
Last week I downloaded the two files xaa.txt and xab.txt for the old_table from http://download.wikimedia.org/ . Yesterday I downloaded the big http://download.wikimedia.org/cgi-bin/old_table.sql.bz2.pl . All three files (I assume that xaa.txt and xab.txt are in fact bzip2 archives) turn out to be corrupted when I try to uncompress them with WinZip, 7-Zip, or bzip2.exe.
Where and/or how can I download the English old table? Perhaps this version http://download.wikimedia.org/archives/en/20040109_old_table.sql.bz2 would work, but access is denied.
Thank you for all this work and best regards,
Frederic
Thanks Brion!
-----Original Message-----
From: wikipedia-l-bounces@Wikimedia.org [mailto:wikipedia-l-bounces@Wikimedia.org] On behalf of Brion Vibber
Sent: Monday, 19 January 2004 21:06
To: wikipedia-l@Wikimedia.org
Subject: Re: [Wikipedia-l] trouble with gzip2 downloads
On Jan 19, 2004, at 12:00, Frederic Malo wrote:
does anyone know how I can download the english old table ?
Please wait until the current backup is done (should be within a day), I'll make a split version available.
-- brion vibber (brion @ pobox.com)
Okay, new backup is now available. It says it's three days old because it took so long to run on the slow machine. :P
Temporarily at: http://susan.bomis.com/
The English Wikipedia old revisions table is available in two parts (xaa, xab) to get around the 2GB-download problem with Apache on a 32-bit server.
Also -- all wikis will go OFFLINE shortly for transferring the database back to geoffrin, the fast server, which seems to be happy with a lobotomy down to 2GB of ram.
-- brion vibber (brion @ pobox.com)
Thanks Brion!
By the way, does anybody know of good stitching software (Windows)?
Thank you for all!
Frederic
-----Original Message-----
From: wikipedia-l-bounces@Wikimedia.org [mailto:wikipedia-l-bounces@Wikimedia.org] On behalf of Brion Vibber
Sent: Tuesday, 20 January 2004 10:52
To: wikipedia-l@Wikimedia.org
Subject: Re: [Wikipedia-l] trouble with gzip2 downloads
Okay, new backup is now available. It says it's three days old because it took so long to run on the slow machine. :P
Temporarily at: http://susan.bomis.com/
The English Wikipedia old revisions table is available in two parts (xaa, xab) to get around the 2GB-download problem with Apache on a 32-bit server.
Also -- all wikis will go OFFLINE shortly for transferring the database back to geoffrin, the fast server, which seems to be happy with a lobotomy down to 2GB of ram.
-- brion vibber (brion @ pobox.com)
On Wed, 2004-01-21 at 16:57, Frederic Malo wrote:
Thanks Brion!
By the way, does anybody know a good stitching software (windows) ?
The first thing I would try is the command-line "copy" command: start up a command prompt, and type:
copy /b xaa+xab 20040117_old_table.sql.bz2
My only question is whether this works on 2 GB files; I've never tried it on really large files, but I don't know of any reason it wouldn't work.
(The "/b" says to use binary mode; without it, copy in file-concatenation mode defaults to ASCII, which means it stops copying at the first Control-Z character.)
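For anyone who prefers a script to the command line, the same binary concatenation can be sketched in Python (this sketch is not from the thread; the part and output names below are placeholders for whatever you actually downloaded):

```python
import shutil

def join_parts(parts, output, chunk_size=1024 * 1024):
    """Concatenate split download parts byte-for-byte (binary mode),
    copying in chunks so multi-gigabyte parts never sit in memory."""
    with open(output, "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, out, chunk_size)

# e.g. join_parts(["xaa", "xab"], "20040117_old_table.sql.bz2")
```

Because the files are opened in binary mode, an embedded Control-Z byte is copied like any other byte, so the same ASCII-mode pitfall does not apply.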
Carl Witty
Hello,
Thank you Carl, but your idea does not seem to work with huge files.
However, I found a solution that works:
"type xab.txt >> xaa.txt"
I could then unzip the final bz2 file without trouble.
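After stitching the parts together this way, the joined archive can be sanity-checked by stream-decompressing it and throwing the output away, so nothing extra is written to disk. A hypothetical Python sketch (not from the thread):

```python
import bz2

def bz2_is_intact(path, chunk_size=1024 * 1024):
    """Return True if the .bz2 file decompresses cleanly end to end.
    Decompressed data is read and discarded in chunks, so even a
    multi-gigabyte dump is checked without writing any output."""
    try:
        with bz2.open(path, "rb") as f:
            while f.read(chunk_size):
                pass
        return True
    except (OSError, EOFError):  # invalid stream / truncated file
        return False
```

A truncated or mis-joined file typically raises EOFError ("Compressed file ended before the end-of-stream marker was reached"), while garbage bytes raise OSError.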
Great! Thanks for your help!
Frederic
-----Original Message-----
From: wikipedia-l-bounces@Wikimedia.org [mailto:wikipedia-l-bounces@Wikimedia.org] On behalf of Carl Witty
Sent: Thursday, 22 January 2004 03:29
To: wikipedia-l@Wikimedia.org
Subject: RE: [Wikipedia-l] trouble with gzip2 downloads
On Wed, 2004-01-21 at 16:57, Frederic Malo wrote:
Thanks Brion!
By the way, does anybody know a good stitching software (windows) ?
The first thing I would try is the command-line "copy" command: start up a command prompt, and type:
copy /b xaa+xab 20040117_old_table.sql.bz2
My only question is whether this works on 2gb files; I've never tried it on really large files, but I don't know any reason it wouldn't work.
(The "/b" says to use binary mode; without it, copy in file-concatenation mode defaults to ASCII, which means it stops copying at the first Control-Z character.)
Carl Witty
_______________________________________________
Wikipedia-l mailing list
Wikipedia-l@Wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/wikipedia-l
Hello,
I have not found any wiki meta page about this subject. What is the process for updating an uploaded Wikipedia database?
Do we have to upload the old and current tables each week? How is the insertion of current rows into the old table managed? What are the rules?
What happens if we forget to update the database once, or several times?
Thank you again,
Frederic