Hi all,
I am trying to download MediaWiki from the following link, but with no luck; the server seems to be down:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.0.tar.gz
Does anybody know where I should report this, and how the service could be restored? Thanks.
Hi Billy,
On 10.11.2010 19:48, Billy Chan wrote:
I am trying to download MediaWiki from the following link, but with no luck; the server seems to be down:
That’s a known problem (see the tech channel or the server admin log). You may use the following file instead: http://noc.wikimedia.org/mediawiki-1.16.0.tar.gz
Regards, Robin
Hi Robin,
Thanks for your link. Do you know where I can download the XML dumps now? Thanks.
The dump-generating process is halted. Also, the official XML download page is "offline" until they fix the hardware.
I don't know if there are mirrors. I don't think so.
There are some old dumps in the Internet Archive,[1] but I guess you are interested in the most recent ones.
Also, I have a copy of all the pages-meta-history.xml.7z from August 2010 at home, but I can't upload them anywhere; they are 100 GB.
[1] http://en.wikipedia.org/wiki/User:Emijrp/Wikipedia_Archive
* emijrp emijrp@gmail.com [Thu, 11 Nov 2010 16:04:53 +0100]:
There are some old dumps in the Internet Archive,[1] but I guess you are interested in the most recent ones.
Also, I have a copy of all the pages-meta-history.xml.7z from August 2010 at home, but I can't upload them anywhere; they are 100 GB.
Why are such large dumps not incremental? That could: 1. save a lot of disk space, 2. greatly increase the speed of dumping, 3. make it easier to upload and restore missing parts (something like P2P).
With XML import/export that could be done quite easily. And even with SQL you could probably limit the entries of at least some tables (revision, user, page). Though I am unsure whether it is possible to make incremental dumps of all the tables - that would probably require an SQL operations log instead of simple range inserts.
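As a rough sketch of the XML side (this is not an existing MediaWiki tool, just an illustration; the file name and cutoff timestamp below are made up): given a full pages-meta-history dump and the timestamp of the previous dump, the incremental part is simply the revisions newer than that timestamp.

import xml.etree.ElementTree as ET

# Made-up cutoff: the timestamp of the previous full dump.
CUTOFF = "2010-08-01T00:00:00Z"

def local(tag):
    # Strip the XML namespace so we can match on local element names.
    return tag.rsplit("}", 1)[-1]

def newer_revisions(dump_path, cutoff):
    """Yield (page title, revision id, timestamp) for revisions after the cutoff."""
    title = None
    for _event, elem in ET.iterparse(dump_path, events=("end",)):
        tag = local(elem.tag)
        if tag == "title":
            title = elem.text
        elif tag == "revision":
            rev_id = ts = None
            for child in elem:
                name = local(child.tag)
                if name == "id" and rev_id is None:
                    rev_id = child.text
                elif name == "timestamp":
                    ts = child.text
            # ISO 8601 timestamps compare correctly as plain strings.
            if ts is not None and ts > cutoff:
                yield title, rev_id, ts
            elem.clear()  # keep memory bounded on multi-gigabyte dumps

for page, rev, ts in newer_revisions("pages-meta-history.xml", CUTOFF):
    print(page, rev, ts)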
Then again, the developers are smart people, so probably I am wasting my time and there is no real need for incremental backups? Dmitriy
On Thu, Nov 11, 2010 at 4:32 PM, Dmitriy Sintsov questpc@rambler.ru wrote:
Then again, the developers are smart people, so probably I am wasting my time and there is no real need for incremental backups?
The main reason, I believe, is that it takes somebody to write such code.
Hoi, An incremental backup starts with a full backup. I think I remember that they were in the process of getting such a process started. Thanks, GerardM
Gerard Meijssen wrote:
Hoi, An incremental backup starts with a full backup. I think I remember that they were in the process of getting such a process started. Thanks, GerardM
I don't know about that. You may be thinking of dumps made using the previous dump: only new data is retrieved, but the result is a full dump, too. There are also the new split dumps, but they aren't incremental.
Dmitry wrote:
With XML import/export that could be done quite easily. And even with SQL you could probably limit the entries of at least some tables (revision, user, page). Though I am unsure whether it is possible to make incremental dumps of all the tables - that would probably require an SQL operations log instead of simple range inserts.
It would require the new dump to delete previous entries: pages deleted, revisions oversighted, articles renamed...
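To make that concrete with a minimal, made-up sketch (this is not any existing dump format, only an illustration of what an incremental consumer would need beyond "new revisions"):

# A hypothetical "incremental" changeset; no such dump format actually exists.
changeset = {
    "new_revisions": {},            # title -> list of new revision dicts
    "deleted_pages": set(),         # titles deleted since the last dump
    "suppressed_revisions": set(),  # revision ids that were oversighted
    "renamed_pages": {},            # old title -> new title
}

def apply_changeset(snapshot, cs):
    """Update a local snapshot (title -> list of revision dicts) in place."""
    for old, new in cs["renamed_pages"].items():
        if old in snapshot:
            snapshot[new] = snapshot.pop(old)
    for title in cs["deleted_pages"]:
        snapshot.pop(title, None)
    for title, revs in list(snapshot.items()):
        snapshot[title] = [r for r in revs if r["id"] not in cs["suppressed_revisions"]]
    for title, revs in cs["new_revisions"].items():
        snapshot.setdefault(title, []).extend(revs)
    return snapshot

A plain "revisions newer than X" dump only gives you the last loop; the first three are what makes truly incremental dumps hard.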
Sorry, where I said "from August 2010" I meant "of August 2010". I have only one .7z for every WMF wiki.