I've used this to dump what I hoped would be a complete backup of my locally hosted MediaWiki. The purpose was to import the .xml file into a new working server. The import script did the job; however, the dumped .xml file did not contain all the articles. It imported 476 articles from a site that contains 5391. Just wondering why. I've done it 3 times with the same results. John
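For reference, a full-history dump and re-import with the stock maintenance scripts looks roughly like this, run from each wiki's installation directory; the file name is just a placeholder:

    php maintenance/dumpBackup.php --full > dump.xml    # export every page with full revision history
    php maintenance/importDump.php < dump.xml           # import the XML on the new server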
Hi John
dumpBackup.php isn't a full wiki backup; it only exports page content as XML, not images, users, or configuration.
Sounds like the script is timing out. Did you look at the data with an XML tool to see whether all the articles are in the file? It could also be a timeout on the upload; most hosts have a max execution time.
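A quick sanity check, assuming a standard MediaWiki XML dump named dump.xml: count the <page> elements and compare against the wiki's article count, and look at PHP's time limit:

    grep -c '<page>' dump.xml             # number of pages actually present in the dump
    php -i | grep max_execution_time      # CLI limit; the web server's php.ini value may differ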
Just use a command-line MySQL dump of the database on your local machine, then a command-line SQL restore on the host. That makes backups and moving the db easy.
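Something along these lines, with the user and database names as placeholders:

    mysqldump -u wikiuser -p wikidb > wikidb.sql    # dump the whole wiki database locally
    mysql -u wikiuser -p wikidb < wikidb.sql        # restore it into the database on the new host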
If you don't have command-line access, there are other methods to run a full backup and move the db. XCloner v3.3 standalone will do website files and the db, or one or the other.
Tom
On Wed, 2013-08-07 at 12:06 -0400, Tom wrote:
----------------------------
Is there any software or PHP script that will allow for a complete backup and reinstallation of the entire site to a new site on another physical server in another location? I'll look at XCloner but am seeking other suggestions. I do not want to use a MySQL dump of the old database to do this. That database is VERY old and has way too much cruft in it from MediaWiki upgrades and extension upgrades that have gone on for several years. I just want to move the articles to the new server. Thanks! John
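If the goal is articles only, without the old database's cruft, the XML route is actually a good fit once the dump completes; a sketch, assuming a current-revisions-only move is acceptable:

    php maintenance/dumpBackup.php --current > pages.xml    # latest revision of every page, no history
    # then, on the freshly installed wiki:
    php maintenance/importDump.php < pages.xml
    php maintenance/rebuildrecentchanges.php                # update Special:RecentChanges after the import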