How do you import a Wikipedia SQL dump into MediaWiki?
$> zcat cur*gz | mysql yourdatabase
Please note that after that you'd either want to rebuild all links, or import links tables from http://download.wikimedia.org/
Domas
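The one-liner above can be tried safely on a toy file first. A minimal sketch of the streaming pattern (the demo file name and INSERT statement are invented; in real use the pipe ends in `| mysql yourdatabase`):

```shell
# zcat decompresses to stdout, so the multi-gigabyte dump never has
# to be unpacked on disk before it reaches mysql. The demo file below
# is made up; substitute the real cur table dump.
printf 'INSERT INTO cur VALUES (1);\n' | gzip > cur_demo.sql.gz
zcat cur_demo.sql.gz   # prints: INSERT INTO cur VALUES (1);
```

In real use: `zcat cur*gz | mysql yourdatabase`, exactly as Domas says.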
> $> zcat cur*gz | mysql yourdatabase
> Please note that after that you'd either want to rebuild all links, or import links tables from http://download.wikimedia.org/
Thanks for that - though I do not really understand this. I have EasyPHP1-8 installed on an XP machine, which I will be wanting to run the Wikipedia from. I have downloaded 20050421_cur_table.sql.gz and have unpacked it to 20050421_cur_table.sql - do I need the other downloads as well, like broken links, etc?
Could you, or someone else, provide me with a step-by-step guide that an idiot could follow (like me lol), so I can then start it off?
Thanks Again
james wrote:
> Thanks for that - though I do not really understand this. I have EasyPHP1-8 installed on an XP machine, which I will be wanting to run the Wikipedia from. I have downloaded 20050421_cur_table.sql.gz and have unpacked it to 20050421_cur_table.sql
OK, now click "Start", then "Run", then type "cmd" and press Enter. A DOS box opens. Then type the following:
mysql -D wikidb < C:\path-to-the-file\20050421_cur_table.sql
Of course, you will have to enter the actual real path to the file.
This assumes that you have already created a database called "wikidb". If you have installed MediaWiki and it's already running, then you have.
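If the database does not exist yet, a minimal sketch of creating it from the same DOS box (assumes the MySQL root account; 'wikiuser' and 'password' are placeholder credentials, not required names - match whatever your LocalSettings.php uses):

```shell
# You will be prompted for the MySQL root password each time.
# The user/password below are examples for illustration only.
mysql -u root -p -e "CREATE DATABASE wikidb;"
mysql -u root -p -e "GRANT ALL ON wikidb.* TO 'wikiuser'@'localhost' IDENTIFIED BY 'password';"
```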
> - do I need the other downloads as well, like broken links, etc?
You don't strictly *need* them; you can re-create them using the rebuildlinks.php script. But, of course, you can also download them and import them as well. I don't know which is faster.
Timwi
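For reference, a sketch of running that rebuild (assumes MediaWiki of this era, where the script lives in the maintenance/ subdirectory, and that php is on your PATH; the install path is a placeholder):

```shell
# Run from the root of your MediaWiki installation.
# On a full Wikipedia dump this can run for many hours.
cd C:\path-to-mediawiki
php maintenance\rebuildlinks.php
```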
On Wed, 18 May 2005, Timwi wrote:
> You don't strictly *need* them; you can re-create them using the rebuildlinks.php script. But, of course, you can also download them and import them as well. I don't know which is faster.
Downloading is a lot faster. For some reason rebuilding links takes ages.
Alfio
I think I am really stupid here - I cannot seem to get this set up properly. I think I should probably try something different from EasyPHP1-8... any suggestions? I want to run this on my home intranet, so an all-in-one solution would be great.
I keep on getting errors like the one below; I guess just installing it doesn't help, and it needs to be configured right, which I am not sure how to do.
ERROR 1045 (28000): Access denied for user 'ODBC'@'localhost' (using password: NO)
Anyone got any good suggestions?
On 5/18/05, james <jamessampford@supanet.com> wrote:
> I think I am really stupid here - I cannot seem to get this set up properly. I think I should probably try something different from EasyPHP1-8... any suggestions? I want to run this on my home intranet, so an all-in-one solution would be great.
> I keep on getting errors like the one below; I guess just installing it doesn't help, and it needs to be configured right, which I am not sure how to do.
> ERROR 1045 (28000): Access denied for user 'ODBC'@'localhost' (using password: NO)
> Anyone got any good suggestions?
as part of that mysql invocation, add
-u root -p
when you run it, it will prompt you for the mysql root password (if you've not set that up, just hit return)
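Putting that together with the earlier command, the full invocation looks like this (wikidb and the path placeholder are the examples used earlier in the thread):

```shell
# Type this in the DOS box; when prompted, enter the MySQL root
# password, or just press Enter if you never set one.
mysql -u root -p -D wikidb < C:\path-to-the-file\20050421_cur_table.sql
```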
> as part of that mysql invocation, add
> -u root -p
> when you run it, it will prompt you for the mysql root password (if you've not set that up, just hit return)
thanks for that - it's been going along now for 3 hours or so, with about 1 GB or so to go, I hope
once it has finished, I guess I'd better move on to the links one, and then to images... I see that this is a very time-consuming process
if anyone could explain what I should do after I have imported the above files as well, that would be great... like, does it just magically appear in MediaWiki, or what happens?
On 5/18/05, james <jamessampford@supanet.com> wrote:
> > as part of that mysql invocation, add
> > -u root -p
> > when you run it, it will prompt you for the mysql root password (if you've not set that up, just hit return)
> thanks for that - it's been going along now for 3 hours or so, with about 1 GB or so to go, I hope
Three hours is getting to be quite long (assuming you're loading a cur table, not the full history). If it gets to be very long (like eight hours), see this thread:
http://mail.wikipedia.org/pipermail/wikitech-l/2005-May/029156.html
> once it has finished, I guess I'd better move on to the links one, and then to images... I see that this is a very time-consuming process
Whether you need the links depends on what you intend to do with the final system. The links table powers "What links here"; if you don't need that, you don't need to load or rebuild it.
> if anyone could explain what I should do after I have imported the above files as well, that would be great... like, does it just magically appear in MediaWiki, or what happens?
Yep, once the load is done the wikidb will be unlocked and your baby's life begins.
> Three hours is getting to be quite long (assuming you're loading a cur table, not the full history). If it gets to be very long (like eight hours), see this thread:
> http://mail.wikipedia.org/pipermail/wikitech-l/2005-May/029156.html
Now at 4 hrs 15 min - how am I able to speed it up, other than using Linux?
I am using the current tables, not the old ones - 2.5 GB in size. So far the file in the MySQL data directory is at nearly 1.7 GB.
My system is a 2.4 GHz Intel Celeron with 1 GB RAM, which I think is a little lower-spec than the 32-minute machine in that thread you posted.
> Whether you need the links depends on what you intend to do with the final system. The links table powers "What links here"; if you don't need that, you don't need to load or rebuild it.
I don't know what that is - but it sounds like I won't be needing it
> Yep, once the load is done the wikidb will be unlocked and your baby's life begins.
Sweet - just have to wait quite a bit more at this rate - to see the end result!
On 5/18/05, james <jamessampford@supanet.com> wrote:
> > Three hours is getting to be quite long (assuming you're loading a cur table, not the full history). If it gets to be very long (like eight hours), see this thread:
> > http://mail.wikipedia.org/pipermail/wikitech-l/2005-May/029156.html
> Now at 4 hrs 15 min - how am I able to speed it up, other than using Linux?
If you read further down that thread, you'll see my import problems were solved by using the XAMPP packaging of MySQL. MySQL ships with several sample configurations, and the one XAMPP uses has settings that are sensible for a large database like Wikipedia's.
It's probably some of the settings in my.cnf (unfortunately I zapped my old version before installing the XAMPP version, so I don't know which things to change).
> I am using the current tables, not the old ones - 2.5 GB in size. So far the file in the MySQL data directory is at nearly 1.7 GB.
It looks like you're doing okay.
> My system is a 2.4 GHz Intel Celeron with 1 GB RAM, which I think is a little lower-spec than the 32-minute machine in that thread you posted.
That's roughly like mine (the one quoted belonged to someone else).
It came up with an error at about 11:30 last night - lost MySQL or something - though I tried connecting again and it worked, so I left it overnight... same thing.
So now I'm trying it out on this XAMPP thing. It gets to 1.2 GB pretty quickly, but then terminates with the error:
Error 1153 (08S01) at line 1323: Got a packet bigger than 'max_allowed_packet' bytes
How can I get around this?
If it is the .sql file, I am downloading the newest version now - unless it is something in my.ini, but I don't know what I should change there...
james wrote:
> Error 1153 (08S01) at line 1323: Got a packet bigger than 'max_allowed_packet' bytes
> How can I get around this?
Set max_allowed_packet to 16M in my.cnf / mysql.ini / whatever your OS calls it.
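A sketch of what that looks like in the config file (the setting must go in the [mysqld] section, the server side, for the import to benefit; restart MySQL afterwards):

```ini
[mysqld]
# must be at least as large as the biggest INSERT packet in the dump
max_allowed_packet = 16M
```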
This should be in the documentation for the download dumps page, but everything got moved when Domas changed the backend server for them and I'm not sure where the directions went. :(
-- brion vibber (brion @ pobox.com)
Right - after finally loading the database with no errors (thanks to the above poster :) ), it has finished with one file, cur.MYD, at 2.5-ish GB, and is now doing something with another file, cur.MYI. How big will this file get? Mainly because I want to go out and turn the PC off at some point, as it's been on for 30 hrs or so. Is it safe to turn it off, or would I have to start again?
wikitech-l@lists.wikimedia.org