I have a large site that I've been developing on my in-house server. I want to move the entire site to a secure hosted server off-site, and I want to know the best way to do this. I do NOT want to use the same tables that are in my current wiki, as there seem to be some issues with the tables being corrupted. The site runs, but it is VERY slow, and I'm certain it's not the computer it's on. I likely corrupted the tables when I upgraded MediaWiki. In the past I've just exported the database to the new/upgraded wiki; that has worked fine several times. But this time I want only the actual page content to be exported. I did that, and it generated a 256.5 MB XML file, which does not upload using the import function. Any tips are appreciated. Thanks, John
On 05/31/2013 01:07 PM, John W. Foster wrote:
I did that, and it generated a 256.5 MB XML file, which does not upload using the import function.
The dumps can be imported using maintenance/importDump.php (see https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps#Using_importDump.php).
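For a file that size, running it from the shell is much more reliable than the web importer; a typical invocation would look something like this (the dump file name here is just a placeholder):

php maintenance/importDump.php pages-dump.xml
php maintenance/rebuildrecentchanges.php

The second script rebuilds the recent-changes data, which is generally recommended after a large import.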
Does this procedure export / import images?
"Mark A. Hershberger" mah@everybody.org wrote:
On 05/31/2013 01:07 PM, John W. Foster wrote:
I did that & it generated a 256.5 mb xml file. That does not upload, using
the
import function.
The dumps can be imported using maintenace/importDump.php (see https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps#Using_importDump.p...).
Love alone reveals the true shape of the universe. -- "Everywhere Present", Stephen Freeman
MediaWiki-l mailing list MediaWiki-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
On Fri, 2013-05-31 at 14:27 -0400, Steve VanSlyck wrote:
Does this procedure export / import images?
There are image links but no images. All wiki text.
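Right, the XML dump carries only the page text; the actual files under the wiki's images/ directory have to be moved separately. A minimal sketch, assuming the default images/ layout and SSH access to the target (host and paths are placeholders):

rsync -av /var/www/wiki/images/ user@newhost:/var/www/wiki/images/

Without shell access on the target, packing the directory into an archive and uploading it through the host's file manager does the same job.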
"Mark A. Hershberger" mah@everybody.org wrote:
On 05/31/2013 01:07 PM, John W. Foster wrote:
I did that & it generated a 256.5 mb xml file. That does not upload, using
the
import function.
The dumps can be imported using maintenace/importDump.php (see https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps#Using_importDump.p...).
Love alone reveals the true shape of the universe. -- "Everywhere Present", Stephen Freeman
MediaWiki-l mailing list MediaWiki-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
On Fri, 2013-05-31 at 14:24 -0400, Mark A. Hershberger wrote:
The dumps can be imported using maintenance/importDump.php [...]
Ah! I will try that. I was trying to use the import function from the MediaWiki Special pages area. Thanks!
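For what it's worth, Special:Import most likely chokes on a file that big because of PHP's web upload limits, not because of anything wrong with the dump. If you ever did need the web importer for a large file, the php.ini values to raise would be something like the following (the numbers are illustrative):

upload_max_filesize = 300M
post_max_size = 300M
max_execution_time = 600

The command-line importDump.php script sidesteps those limits entirely.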
How are you exporting and importing the database? In the past I've successfully used "mysqldump" to export and then import directly via the command line "mysql" with a wiki database that exceeds 10GB. The commands to do this are basically:
mysqldump --opt -u user -p wikidb > file.sql
mysql -u user -p newwikidb < file.sql
There are lots of resources explaining/detailing the commands if you need more information. Also remember to copy the image directory to the new wiki.
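One detail that is easy to miss: the target database has to exist before the mysql import will run, so create it first with something like this (the database name is just an example):

mysql -u user -p -e "CREATE DATABASE newwikidb"

and afterwards make sure the new wiki's LocalSettings.php ($wgDBname, $wgDBuser, $wgDBpassword) points at it.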
On 31 May 2013 13:07, John W. Foster jfoster81747@gmail.com wrote:
I have a large site that I've been developing on my server in-house. I want to move the entire site to a secure hosted server off-site. [...]
Yes, I've done this before as well, and it worked for me. However, this time I don't want to use the same database. I've reconfigured the wiki to perform differently, since it will run on a remote site where I don't have shell access. So what I'm trying to do is port it to an identical setup on my own server, rebuild it there with a fresh database structure, and then upload it to the new off-site MediaWiki server. That is why I want to use the XML dump. Thanks!
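In other words, the sequence I have in mind is roughly this (file and database names are placeholders):

php maintenance/importDump.php pages-dump.xml
php maintenance/rebuildrecentchanges.php
mysqldump --opt -u user -p cleanwikidb | gzip > cleanwiki.sql.gz

The gzipped SQL dump of the rebuilt database should then load on the off-site host through whatever web tool they provide (phpMyAdmin or similar), since I can't run the maintenance scripts there.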
On Fri, 2013-05-31 at 14:27 -0400, Dave Humphrey wrote:
How are you exporting and importing the database? [...]