Hi, all!
I am new to the Wikipedia world. I have a Windows XP machine running XAMPP. I have installed MediaWiki 1.7.1 and it is working fine. Now I want to import an English Wikipedia dump. I have read and tried a lot of the applicable commands, but none seems to work. Today I used the following command:
java -jar mwdumper.jar --format=sql:1.5 enwiktionary-20060927-pages-articles.xml.bz2 | mysql -u user -p passw wikidb
Following a suggestion, I have placed mwdumper.jar and the .bz2 dump file in the folder where mysql.exe lives. (Any other suggestions?)
The command seems to run successfully:
ssl-capath (No default value)
ssl-cert (No default value)
ssl-cipher (No default value)
ssl-key (No default value)
ssl-verify-server-cert FALSE
table FALSE
debug-info FALSE
safe-updates FALSE
i-am-a-dummy FALSE
connect_timeout 0
max_allowed_packet 16777216
net_buffer_length 16384
select_limit 1000
max_join_size 1000000
secure-auth FALSE
show-warnings FALSE
1.000 pages (598,086/sec), 1.000 revs (598,086/sec)
2.000 pages (446,828/sec), 2.000 revs (446,828/sec)
3.000 pages (537,827/sec), 3.000 revs (537,827/sec)
4.000 pages (672,495/sec), 4.000 revs (672,495/sec)
5.000 pages (750,863/sec), 5.000 revs (750,863/sec)
6.000 pages (831,025/sec), 6.000 revs (831,025/sec)
.............................................................................
It goes up to 279 pages without any warning.
But when I open wikidb in MySQL there is no record in any table, not a single entry. I have deleted the old rows from the page, revision and text tables. There is no prefix set on the wikidb tables. I have looked at every forum I could find, and now it's driving me nuts!
Can someone tell me what I am missing? PLEASE!
Thanks in advance!!
Syed Tanveer Shahzad Gilani wrote:
I am new to the Wikipedia world. I have a Windows XP machine running XAMPP. I have installed MediaWiki 1.7.1 and it is working fine. Now I want to import an English Wikipedia dump. I have read and tried a lot of the applicable commands, but none seems to work. Today I used the following command:
java -jar mwdumper.jar --format=sql:1.5 enwiktionary-20060927-pages-articles.xml.bz2 | mysql -u user -p passw wikidb
Following a suggestion, I have placed mwdumper.jar and the .bz2 dump file in the folder where mysql.exe lives. (Any other suggestions?)
[snip]
It goes up to 279 pages without any warning.
But when I open wikidb in MySQL there is no record in any table, not a single entry. I have deleted the old rows from the page, revision and text tables. There is no prefix set on the wikidb tables. I have looked at every forum I could find, and now it's driving me nuts!
1) Try saving the output to an .sql file rather than piping it directly to MySQL. This can make it easier to track down problems later:
java bla bla bla > bigfile.sql
2) Now do the SQL import:
mysql -u <username> -p<password> <dbname> < bigfile.sql
Watch the output for errors.
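For example, with the dump file name from the original message (the <username> and <password> placeholders and the wikidb database name are from your own setup, not fixed values):

# Step 1: convert the XML dump to SQL (file name taken from the command above)
java -jar mwdumper.jar --format=sql:1.5 enwiktionary-20060927-pages-articles.xml.bz2 > bigfile.sql
# Step 2: import the generated SQL, watching for errors
mysql -u <username> -p<password> wikidb < bigfile.sql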
Likely problems:
- You didn't create the tables first, so there's no tables to import into.
- Your tables are named with a prefix and you didn't pass the prefix option, so they don't go to the right place.
- You created tables with the MediaWiki installer, but didn't clear them afterwards.
The page, revision, and text tables must:
- exist
- be empty
for the import to proceed successfully.
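A quick way to check both conditions from the command line (a sketch; again substitute your own credentials and database name):

# each count should be 0; a "table doesn't exist" error means the table still needs to be created
mysql -u <username> -p<password> wikidb -e "SELECT COUNT(*) FROM page; SELECT COUNT(*) FROM revision; SELECT COUNT(*) FROM text;"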
To create the tables manually, source in maintenance/tables.sql from the MediaWiki distribution.
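For example, assuming you run this from the root of your MediaWiki 1.7.1 installation directory:

# maintenance/tables.sql is relative to the MediaWiki install directory
mysql -u <username> -p<password> wikidb < maintenance/tables.sql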
You can use the TRUNCATE TABLE command to clear existing tables. See dev.mysql.com for general documentation on MySQL.
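For example, to clear the three tables the import writes into (same placeholders as above):

# page, revision and text are the tables the dump import fills
mysql -u <username> -p<password> wikidb -e "TRUNCATE TABLE page; TRUNCATE TABLE revision; TRUNCATE TABLE text;"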
-- brion vibber (brion @ pobox.com)
Thanks, brother! I will try what you said and I hope it works. The page, revision and text tables exist and are empty. I will try writing the output to an .sql file and then see what happens.
Bye.
On 10/16/06, Brion Vibber brion@pobox.com wrote:
[snip]
Hi Brion. It's not working. Can you help any more? Please!
On 10/17/06, STSG stshahzad@gmail.com wrote:
[snip]