Hi everybody!
I'm a newbie on this list, so apologies in advance for anything wrong from me :-))
I think my problem matters to people who, like me, run a local xAMP+M environment (in my case x => W :-)) ) and want to load wiki dumps from time to time. Everything was fine up to June 23rd, when the last .sql dump of "my" Polish wiki was published: I could load the cur table into the db in about 10 minutes (roughly 300 pages per second). Now I find only an .xml file.

First, I completely do not understand this change. For production needs, for example restoring a db, .xml files are no substitute for .sql files, and the same goes for people like me. There was no documentation anywhere on how to use the new format. When I finally found help here (in Brion's post), it did not work out: gzip -dc pages_current.xml.gz | php importDump.php stops after loading approx. 15000 pages (out of about 190000), failing while executing line 47 of the importDump script (1.5rc2). Then I found bug 2979 about that line 47, which is still open (for 1.5rc4 too), and bug 3182, also still open.
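For comparison, the two workflows look roughly like this on my machine (the .sql dump filename and the db name/user below are placeholders, not the real ones):

  old:  gzip -dc plwiki_cur_table.sql.gz | mysql -u wikiuser -p wikidb    (~300 p/s)
  new:  gzip -dc pages_current.xml.gz | php importDump.php                (~5 p/s, then stops)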
Next I tried Kate's importDump.phps. Things went fine for a long time, but unfortunately at around page 142000 php was suddenly terminated, with no message except one from Windows. While php was running I could watch the constant growth in memory consumption reported in bug 3182, although it was not so drastic as to explain the failure by itself. And another very important point: the data flow through gzip & php is drastically slow, approx. 5 pages/s vs. 300 pages/s when I import an .sql dump. I suppose exporting to .xml is just as slow, which is not acceptable for production needs. Are the wikis' databases really dumped to .xml, and, when needed, restored from xml?
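Maybe raising the PHP memory limit for the run would at least postpone the termination, e.g. (I have not confirmed this gets past page 142000, and with a steady leak it probably only delays the failure):

  gzip -dc pages_current.xml.gz | php -d memory_limit=512M importDump.php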
So, is there any chance that people will be able to get .sql dumps from download.wikimedia.org again? XML dumps are completely useless for them.
Janusz 'Ency' Dorozynski