When I run mwdumper on enwiki-20061001-pages-articles.xml.bz2 and pipe its output into mysql (see my other mail), it eventually prints a final progress line like "3,583,699 pages (490.212/sec), 3,583,699 revs (490.212/sec)" and exits normally.
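Roughly, the pipeline is something along these lines (user, password and database name are placeholders, and the exact flags are from memory):

    java -jar mwdumper.jar --format=sql:1.5 enwiki-20061001-pages-articles.xml.bz2 \
        | mysql -u wikiuser -p wikidb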
However, running a query like SELECT COUNT(*) FROM page against that database afterwards gives a count of only around 2.5 million (I don't have the exact number in my notes, sorry). The discrepancy is further confirmed by many missing articles: existing articles are full of red links where I can confirm that the target article is absent from the database but present in the dump.
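A typical spot check of one of those red links looks roughly like this (the title and credentials here are placeholders):

    # Title appears in the dump...
    bzcat enwiki-20061001-pages-articles.xml.bz2 | grep -c '<title>Some Missing Article</title>'
    # ...but not in the imported database:
    mysql -u wikiuser -p wikidb \
        -e "SELECT page_id FROM page WHERE page_namespace = 0 AND page_title = 'Some_Missing_Article'"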
Is mwdumper known to have problems in this area? How might I track this down?