| -----Original Message-----
| From: ... Brion Vibber
| Sent: Monday, November 28, 2005 9:45 PM
|
| importDump.php is relatively inefficient, and is generally
| expected to be used with smallish data sets being copied in
| from another wiki. For bulk imports you'll get much much
| better performance with mwdumper.
Well, it's true: mwdumper is excellent. Here are my results for processing pages_current of the Polish wiki (233,491 pages): the XML->SQL conversion took 201 seconds, and the whole process (unpack, conversion, load) finished in 30 minutes. Thx, Brion.
Btw, can you change the mwdumper progress output? Currently I get
10 000 pages (292,235/sec), 10 000 revs (292,235/sec) ... 233 491 pages (1 186,643/sec), 233 491 revs (1 186,643/sec)
to
10.000 pages ( 292/sec), 10.000 revs ( 292/sec) ... 233.491 pages (1.186/sec), 233.491 revs (1.186/sec)
or (without a thousands separator)
10000 pages ( 292/sec), 10000 revs ( 292/sec) ... 233491 pages (1186/sec), 233491 revs (1186/sec)
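mwdumper is written in Java, where the default NumberFormat picks up the platform locale's grouping and decimal separators (the Polish locale's non-breaking-space separator is what comes out mangled as "á" above). A minimal sketch of locale-independent progress formatting along the lines requested; the class and method names here are hypothetical, not mwdumper's actual code:

```java
import java.text.NumberFormat;
import java.util.Locale;

public class ProgressFormat {
    // Format a count with no grouping separator at all,
    // so the output is identical on every locale.
    static String plain(long n) {
        NumberFormat nf = NumberFormat.getIntegerInstance(Locale.US);
        nf.setGroupingUsed(false);
        return nf.format(n);
    }

    // Round the per-second rate to a whole number instead of
    // printing locale-formatted fractional digits.
    static String rate(long n, double seconds) {
        return plain(Math.round(n / seconds)) + "/sec";
    }

    public static void main(String[] args) {
        // Example with the figures from this thread: 233491 pages
        // converted in 201 seconds.
        System.out.println(plain(233491) + " pages (" + rate(233491, 201.0) + ")");
    }
}
```

Pinning the formatter to a fixed locale (or disabling grouping entirely, as here) is the usual way to keep progress lines machine-readable regardless of where the JVM runs.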
Reg., Janusz 'Ency' Dorozynski
| > If not, I would highly appreciate any help. Unfortunately I cannot
| > use mwdumper, because OS X 10.3 only allows Java 1.4, and with
| > Java 1.4 I get exception errors.
|
| Java 1.5 for Mac OS X can be downloaded from www.apple.com.
|
| 1.4 will still be the default JVM, but you can run 1.5
| specifically with:
| /System/Library/Frameworks/JavaVM.framework/Versions/1.5/Commands/java
|
| -- brion vibber (brion @ pobox.com)