[Mediawiki-l] HELP: wikipedia dump using MWdumper

Syed Tanveer Shahzad Gilani stshahzad at gmail.com
Mon Oct 16 13:28:30 UTC 2006


Hi, all!

I am new to the Wikipedia world. I have a Windows XP machine running XAMPP,
and I have installed MediaWiki 1.7.1, which is working fine. Now I want to
import an English Wiktionary dump. I have read about and tried many of the
applicable commands, but none seems to work. Today I used the following command:

java -jar mwdumper.jar --format=sql:1.5
enwiktionary-20060927-pages-articles.xml.bz2 | mysql -u user -p passw wikidb
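
One thing I am not sure about: as far as I know, the mysql client treats a
space after -p as "prompt for the password" and then reads the next word as
the database name, so perhaps the password has to be attached to the flag
(assuming the real password is passw):

java -jar mwdumper.jar --format=sql:1.5
enwiktionary-20060927-pages-articles.xml.bz2 | mysql -u user -ppassw wikidb

Could that be why mysql prints its variables table below (it looks like part
of the mysql --help output) instead of importing?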

Following a suggestion, I have placed mwdumper.jar and the dump .bz2 file in
the folder where mysql.exe lives. (Any other suggestions?)
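
Alternatively, I suppose I could leave the files where they are and call
everything with full paths; the paths below are just examples from my
install, not necessarily the right ones:

REM hypothetical paths -- adjust to the actual XAMPP / dump locations
java -jar C:\dumps\mwdumper.jar --format=sql:1.5 C:\dumps\enwiktionary-20060927-pages-articles.xml.bz2 | C:\xampp\mysql\bin\mysql.exe -u user -ppassw wikidb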

The command seems to run successfully; this is the output it produces:


ssl-capath                        (No default value)
ssl-cert                          (No default value)
ssl-cipher                        (No default value)
ssl-key                           (No default value)
ssl-verify-server-cert            FALSE
table                             FALSE
debug-info                        FALSE
safe-updates                      FALSE
i-am-a-dummy                      FALSE
connect_timeout                   0
max_allowed_packet                16777216
net_buffer_length                 16384
select_limit                      1000
max_join_size                     1000000
secure-auth                       FALSE
show-warnings                     FALSE
1.000 pages (598,086/sec), 1.000 revs (598,086/sec)
2.000 pages (446,828/sec), 2.000 revs (446,828/sec)
3.000 pages (537,827/sec), 3.000 revs (537,827/sec)
4.000 pages (672,495/sec), 4.000 revs (672,495/sec)
5.000 pages (750,863/sec), 5.000 revs (750,863/sec)
6.000 pages (831,025/sec), 6.000 revs (831,025/sec)
.............................................................................

It goes up to 279.000 pages (i.e. 279,000 -- mwdumper reports progress every
thousand pages) without any warning.
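
One test I still want to try is writing the SQL to a file first instead of
piping, in case the pipe on Windows is the problem, and then feeding that
file to mysql:

java -jar mwdumper.jar --format=sql:1.5 enwiktionary-20060927-pages-articles.xml.bz2 > dump.sql
mysql -u user -ppassw wikidb < dump.sql

That is a two-step version of the same import, so if it works, the pipe is
the culprit.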

But when I open wikidb in MySQL there is no record in any table, not a single
entry. Before the import I deleted the existing rows from the page, revision
and text tables, and there is no prefix set on the wikidb tables. I have
looked through all the forums I could find, and now it's driving me nuts!!
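
For reference, this is how I emptied the tables before the import and how I
check them afterwards (assuming the default, unprefixed MediaWiki table
names):

mysql -u user -ppassw wikidb -e "DELETE FROM page; DELETE FROM revision; DELETE FROM `text`;"
mysql -u user -ppassw wikidb -e "SELECT COUNT(*) FROM page; SELECT COUNT(*) FROM revision; SELECT COUNT(*) FROM `text`;"

After the import, all three counts are still 0.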

Can someone tell me what I am missing? PLEASE!

Thanks in advance!!


