Yes, I've done this before as well, and it worked for me. However, this time I don't want to use the same database. I've reconfigured the wiki because it will run on a remote site where I don't have shell access, so what I'm trying to do is port it to an identical setup on my own server, rebuild it there using the new database structure, and then upload it to the new off-site MediaWiki server. That is why I want to use the .xml dump. Thanks!
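In case it helps anyone following the thread, here is a minimal sketch of the XML route on the local rebuild (where shell access is available). It assumes the commands are run from the wiki's installation directory, that the target wiki is already configured against the new database, and that "pages.xml" is just a placeholder file name:

  # Export only the current revision of every page to XML
  php maintenance/dumpBackup.php --current > pages.xml

  # Import the XML into the freshly configured wiki
  php maintenance/importDump.php pages.xml

  # Rebuild derived data after a large import
  php maintenance/rebuildrecentchanges.php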
On Fri, 2013-05-31 at 14:27 -0400, Dave Humphrey wrote:
How are you exporting and importing the database? In the past I've successfully used "mysqldump" to export and then imported directly via the command-line "mysql" client, with a wiki database that exceeds 10 GB. The commands to do this are basically:
mysqldump --opt -u user -p wikidb > file.sql
mysql -u user -p newwikidb < file.sql
There are lots of resources explaining these commands in more detail if you need more information. Also remember to copy the image directory to the new wiki.
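A quick sketch of that image copy, assuming SSH access to the destination host and placeholder paths on both ends (adjust to the actual install locations):

  # Copy the wiki's uploaded files to the new host, preserving timestamps and permissions
  rsync -avz /var/www/wiki/images/ user@newhost:/var/www/wiki/images/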
On 31 May 2013 13:07, John W. Foster jfoster81747@gmail.com wrote:
I have a large site that I've been developing on my in-house server. I want to move the entire site to a secure hosted server off-site, and I'd like to know the best way to do this. I DO NOT want to use the same tables that are in my current wiki, as there seem to be some issues with the tables being corrupted. The site runs, but it is VERY slow, and I'm certain it's not the computer it's on; I likely screwed up the tables when I upgraded MediaWiki. In the past I've just exported the database to the new / upgraded wiki, and several times that has worked fine. But this time I want only the actual page content to be exported. I did that, and it generated a 256.5 MB XML file, which does not upload using the import function. Any tips are appreciated. Thanks, John
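For reference, Special:Import usually fails on a file that size because of PHP's upload and execution limits rather than anything wrong with the dump itself. A rough sketch of the php.ini settings involved, assuming you can change them on the target host (the values shown are illustrative, not recommendations):

  upload_max_filesize = 300M
  post_max_size = 300M
  max_execution_time = 0
  memory_limit = 512M

Where shell access is available, running "php maintenance/importDump.php pages.xml" bypasses the web upload limits entirely.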
MediaWiki-l mailing list
MediaWiki-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l