Exactly what error message are you getting? If it is something like "MySQL server has gone away", it may be due to too small a "max_allowed_packet" setting. See http://stackoverflow.com/questions/19214572/can-not-import-large-sql-dump-in...
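You can check the current setting from the MySQL client before exporting or importing; a sketch (the 64M value below is just an example, not a recommendation):

```sql
-- Show the current packet limit on this server
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for the running server; this resets on restart,
-- so make it permanent in my.cnf as well. 64M is an example.
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;
```

Note that SET GLOBAL only affects new connections, so reconnect after changing it.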
My MediaWiki db is going on ~20GB now and I can export/import it fine with a max_allowed_packet of 16M and a wait_timeout of 90. If you have a lot of large articles in your wiki you may need a larger max_allowed_packet. For example, in my "/etc/my.cnf" file I have:
[mysqldump]
quick
max_allowed_packet = 16M
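For reference, a fuller sketch of the relevant my.cnf sections might look like the following (the [mysqld] values mirror the settings mentioned above and are assumptions to adjust for your own server):

```
[mysqld]
# Server-side limit; must be large enough for your biggest row/article
max_allowed_packet = 16M
# Seconds an idle connection may sit before the server drops it
wait_timeout       = 90

[mysqldump]
# Fetch rows one at a time instead of buffering whole tables in memory
quick
# Client-side limit for the dump connection
max_allowed_packet = 16M
```

The "quick" option is what keeps mysqldump from trying to hold an entire large table in RAM, which matters most on multi-GB databases.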
I don't see anything else obvious in my config file that would affect the export/import speed. There are a few good results if you google for "mysqldump fast restore" or similar.
Exports/imports are as simple as:
mysqldump -avz -u user -p database_name > file.sql
mysql -u user -p new_database_name < file.sql
If I'm transferring it to a different server I simply rsync/scp the SQL file over between the two steps. The export takes about 10 min and import close to 1 hour for me (I've never had a disconnect error).
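If disk space for the intermediate file is tight, you can also pipe the dump straight over SSH instead of dumping, copying, and importing as three steps. A sketch, assuming the new (empty) database already exists on the remote host and that both hosts read credentials from ~/.my.cnf (a pipeline leaves no terminal for a -p password prompt):

```shell
# --single-transaction gives a consistent snapshot for InnoDB tables
# without locking them for the duration of the dump.
mysqldump --single-transaction database_name \
  | gzip -c \
  | ssh user@newserver 'gunzip -c | mysql new_database_name'
```

The trade-off is that a dropped connection means restarting the whole transfer, whereas with a dump file you can rsync with --partial and resume.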
On 14 May 2015 at 22:52, John Foster jfoster81747@gmail.com wrote:
I have to move a MediaWiki site that has a database of approx 4.1GB. It is something I have done before and I usually just use the command line syntax to do it. However, I have tried 3 times to export it and the files are always incomplete due to a server (MySQL) disconnect. I finally did get one that seemed to complete OK, but then I tried several times to import it into the new server and likewise got server disconnect errors. I am aware that there are a multitude of possibilities, so what I'm asking here is for any tips on moving databases of this size or larger. I'm beginning to wonder if MySQL is the way to go here. Thanks!
-- John Foster JW Foster & Associates
MediaWiki-l mailing list
To unsubscribe, go to: https://lists.wikimedia.org/mailman/listinfo/mediawiki-l