Kasimir Gabert wrote:
Hello,
I imported the dump from enwikibooks to test.kgprog.com and it worked perfectly. This is not one of the smaller wikis, and it has just confirmed my belief that Wikimedia dumps work. If you can demonstrate with a specific example what your error is, and provide the appropriate parts of the XML file, then I am sure that the developers will look at your case seriously.
Kasimir
enwiki, not wikibooks. I can get wikibooks to work too. The problem is related to large dumps, and it behaves like stack corruption. I have already posted the logs. You also have to consider that I have all of the images in an integrated setup, like Wikipedia's - squid clusters included - so a workstation test is probably not the same scenario. Try the November and February XML dumps on 1.8.2, not 1.9.3. I have gone through the upgrade path too many times already, each time hoping importDump would work better. It has improved, but it is still buggy. I doubt the problem is in the platform, or the site would not work at all, and it does work.
Jeff
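For a dump the size of enwiki, one option worth trying is to stream the compressed file straight into the importer instead of unpacking it to disk first; importDump.php can also read the XML on stdin when no filename is given. This is only a sketch - the filename below is a placeholder, not the exact November or February dump discussed here:

    # run from the maintenance directory of the MediaWiki install
    cd maintenance
    # stream the compressed dump straight into the importer
    # (/path/to/enwiki-pages-articles.xml.bz2 is a placeholder filename)
    bzip2 -dc /path/to/enwiki-pages-articles.xml.bz2 | php importDump.php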
On 2/24/07, Kasimir Gabert kasimir.g@gmail.com wrote:
On 2/24/07, Jeffrey V. Merkey jmerkey@wolfmountaingroup.com wrote:
Kasimir Gabert wrote:
Hello,
Just because of this post I feel compelled to take the time to
[snip]
every time. We are talking about whether or not dumps from Wikimedia work, not whether or not your system is good.
I created Local Area Networking, my friend. I think I know if a system works or not.
You invented LAN?? I guess everything I have heard about Robert Metcalfe is completely false because "Jeffrey V. Merkey" posted otherwise.
have installed appliances in the field and folks are coming back about this issue a lot when they try to use various XML dumps from the Foundation.
[snip]
importDump.php, it does not look like it would stop working after a larger xml file is loaded.
I have. Go back and read this list for the traces posted.
"Traces posted"? Do you mean... the debugging output from php5 when it is importing your database? I might be blind, but I can see nothing.
When MediaWiki gets into low memory conditions of any kind, the wheels fly off when imports, reading, and editing are going on at the same time. The import bugs have a large number of pages about this on Google and in a lot of other places.
Jeff
There is a big difference between whether or not imports work and whether or not your machine can handle them.
Nothing critical at http://bugzilla.wikimedia.org/buglist.cgi?query_format=specific&order=re...
You might be providing a true argument, but I am not convinced right now. All of my tests have worked; it could definitely be faster, but I do not understand why you are having issues. It seems to me that it is your fault, not Wikimedia's.
I would also like to know what your proposed solution would be? Allow people to dynamically load Wikipedia's content? Do you have any idea how ridiculous that sounds?
Kasimir
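On the low-memory point above, one concrete thing to try before assigning blame is to give the PHP command line more headroom for the import and keep a log of the run, so there is something specific to post. A minimal sketch, assuming the dump has already been placed at maintenance/dump.xml; the 512M limit and log filename are arbitrary examples, not values from this thread:

    # raise the CLI memory limit for this one run and capture all output
    # (512M and import.log are example values only)
    php -d memory_limit=512M importDump.php dump.xml > import.log 2>&1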
Kasimir Gabert wrote:
Hello,
I can verify that importing Wikimedia dumps into MediaWiki works.
I was able to successfully import (using *nothing but the built-in MW tools*) ba.wikipedia.org to test.kgprog.com in a few minutes with a brand new installation of MW 1.9.3.
The list of things that I had to do (in case Merkey really does want to learn how to do this), with a condensed shell version after the list:
- Download MediaWiki
- Download the dump (for BA wiki and this test it was
http://download.wikipedia.org/bawiki/20070222/bawiki-20070222-pages-articles... )
- Install MediaWiki (many tutorials on doing this)
- Extract the XML file from the bz2 compressed file (I used bunzip2 on Linux)
- Move the extracted xml file to maintenance/dump.xml
- In a terminal cd to maintenance
- Type in "php importDump.php dump.xml"
- Wait for the dump to finish
- Type in "php rebuildrecentchanges.php"
The wiki has now been successfully created with all of the dumped pages from the Wikimedia Foundation.
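For reference, the same procedure condensed into a single shell session. This is only a sketch: /var/www/wiki is an assumed install path and the .bz2 filename is assumed to follow the standard dump naming, so adjust both to match your own setup.

    # unpack the downloaded pages-articles dump
    # (filename assumed; use whatever the download above actually produced)
    bunzip2 bawiki-20070222-pages-articles.xml.bz2

    # place the XML where the maintenance scripts expect it
    # (/var/www/wiki is only an example install path)
    mv bawiki-20070222-pages-articles.xml /var/www/wiki/maintenance/dump.xml

    # run the importer, then rebuild the recent changes table
    cd /var/www/wiki/maintenance
    php importDump.php dump.xml
    php rebuildrecentchanges.php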
Kasimir
On 2/24/07, Jeffrey V. Merkey jmerkey@wolfmountaingroup.com wrote:
> Domas Mituzas wrote:
>
>> Dear Jeffrey,
>>
>>> to have some personal issue, but this is simply reports of bugs and issues which have been floating for over a year and have not been closed.
>>
>> This is open project, quality patches are always welcome.
>
> I am happy to post them.
>
> Jeff
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org http://lists.wikimedia.org/mailman/listinfo/wikitech-l
-- Kasimir Gabert