On 2015-12-31, LCJones <lcjones@jonesipedia.com> wrote:
On 12/31/2015 1:23 PM, Daniel Barrett wrote:
Chap Jones writes:
It takes an excruciatingly long time to import even a relatively small Wikipedia page into my wiki.
Are you importing the page using Special:Import, or are you running the script maintenance/importDump.php?
If you're using Special:Import, are things any faster if you use importDump.php?
DanB
MediaWiki-l mailing list
To unsubscribe, go to: https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
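For anyone following along, running the maintenance script from the command line looks roughly like this; the dump file name is an example, and the commands assume you are in the MediaWiki installation root:

```shell
# Import a local XML dump file; "dump.xml" is an example name.
php maintenance/importDump.php dump.xml

# importDump.php skips some secondary updates for speed,
# so rebuild recent changes afterwards.
php maintenance/rebuildrecentchanges.php
```

The script also reads from stdin if no file is given, which is handy for piping a decompressed dump straight in.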
DanB,
Thanks for pointing out the importDump script. Indeed, importDump.php takes as long as Special:Import. I was running top and tcpdump in separate terminals during a small import (484 KB of XML), hoping to spot some oddity, but no joy. However, Apache eats a lot of CPU during the import, and I did notice TCP resets in tcpdump, especially from upload-lb.eqiad.wikimedia.org.
How large are your XML dumps? The process is known to be pretty slow. What happens if you take the network out of the equation, i.e. import just a local file?
Saper
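To take the network out of the equation, as suggested above, one option is to fetch the page once via Special:Export and then import the saved file locally; the page title and wiki URL here are examples:

```shell
# Grab a single page as XML via Special:Export; "Sandbox" is an example title.
curl -o page.xml 'https://en.wikipedia.org/wiki/Special:Export/Sandbox'

# Time the local import to separate network cost from parsing/DB cost.
time php maintenance/importDump.php page.xml
```

If the local import is fast, the bottleneck is the network (consistent with the observed resets); if it is still slow, the time is going into parsing and database writes.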