I apologise for replying to myself...
I've done exactly the same with a tiny dump (amwiki) and it worked perfectly (a 100% match between the mwdumper and MySQL counts).
Is there any problem with bigger dumps?
--- Felipe Ortega glimmer_phoenix@yahoo.es wrote:
Hi all.
I remember something about this being posted approximately a year ago, but now we have new software versions, so... There's something strange when I try to import dumps.
I've downloaded the latest version of mwdumper.jar, and I get a full .7z dump (for research purposes) of, let's say, svwiki.
Well, I create a new database and build the 29 tables with the tables.sql script (but I remove the InnoDB table type, due to some compatibility problems with our MySQL; I don't think that matters).
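For reference, the setup was roughly as follows (the username and paths are just examples; depending on the MediaWiki version, the clause to strip is TYPE=InnoDB or ENGINE=InnoDB):

  mysql -u root -p -e "CREATE DATABASE svwiki"
  # drop the InnoDB engine clause so tables fall back to the server default (MyISAM)
  sed -e 's/TYPE=InnoDB//g; s/ENGINE=InnoDB//g' maintenance/tables.sql > tables-noinnodb.sql
  mysql -u root -p svwiki < tables-noinnodb.sql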
OK, I launch mwdumper.jar with the appropriate parameters, the import runs, and I finally obtain:
373,142 pages (228.901/sec), 2,659,777 revs (1,631.616/sec)
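For the record, the invocation was along these lines (the dump filename here is just an example; mwdumper reads .bz2/.gz directly, but a .7z dump has to be unpacked through 7za first):

  7za e -so svwiki-pages-meta-history.xml.7z \
    | java -jar mwdumper.jar --format=sql:1.5 \
    | mysql -u root -p svwiki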
Those totals are as they should be. But when I log into the svwiki database in MySQL, I get:
mysql> select count(page_id) from page;
+----------------+
| count(page_id) |
+----------------+
|          30000 |
+----------------+
1 row in set (0.00 sec)

mysql> select count(rev_id) from revision;
+---------------+
| count(rev_id) |
+---------------+
|        722180 |
+---------------+
1 row in set (0.00 sec)
Far lower figures than mwdumper reported.
Shouldn't I get the same counts? Could anyone help?
Once I fix this problem, I have a working script to recover per-user contribution statistics over several time periods (months, weeks, etc.); the core of it is sketched below.
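A minimal sketch of the central query, assuming the standard revision schema where rev_timestamp is a YYYYMMDDHHMMSS string:

  -- edits per user per month; LEFT(rev_timestamp, 6) yields e.g. '200601'
  SELECT rev_user_text AS editor,
         LEFT(rev_timestamp, 6) AS month,
         COUNT(*) AS edits
  FROM revision
  GROUP BY editor, month
  ORDER BY month, edits DESC;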
Thanks everybody.
Felipe.