[MediaWiki-l] How to export all articles on an existing wiki.

Emilio J. Rodríguez-Posada emijrp at gmail.com
Fri Aug 9 16:25:57 UTC 2013

If your MediaWiki version is not too old and the hosting is not very slow,
this may work: http://code.google.com/p/wikiteam/
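For what it's worth, the WikiTeam tool linked above is driven by a script called dumpgenerator.py, pointed at the wiki's api.php. A minimal sketch of an invocation might look like the following; the URL is a placeholder for your own wiki, and exact flags may vary by WikiTeam version:

```shell
# Sketch: grab a full-history XML dump (and optionally images) from a
# remote MediaWiki using WikiTeam's dumpgenerator.py.
# http://wiki.example.org/w/api.php is a placeholder; point --api at
# your wiki's own api.php endpoint.
python dumpgenerator.py --api=http://wiki.example.org/w/api.php --xml --images
```

Because it fetches pages through the API in batches, it sidesteps the single long-running PHP request that a web-based export depends on.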

2013/8/9 John W. Foster <jfoster81747 at gmail.com>

> So far I've tried dumpBackup.php, and that only gets part of it. It
> has been suggested that it's a PHP script timeout issue, and that's
> possible. It is a large site with over 5000 articles, so the dump will
> be large. I would appreciate any tips on how to do this. I've also
> looked at XCloner, as someone suggested, but it does not appear to
> provide the functionality I need. It does fine on existing static pages
> in the /html directory, but it does not seem to be able to pull pages
> from a MediaWiki installation and place them into an .xml file for
> importing. Even something that could break the backup into pieces so
> that it gets everything would help. Dumping the database (MySQL)
> directly will not work, as that is part of the issue: the existing
> database is somewhat cluttered with old, no-longer-relevant tables
> that slow it way down.
> Any tips, please?
> Thanks
> John
> _______________________________________________
> MediaWiki-l mailing list
> MediaWiki-l at lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
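As an aside on the dumpBackup.php attempt mentioned above: the script is meant to be run from the command line rather than through the web server, and CLI PHP normally has no request time limit, so a web-style timeout should not apply there. A typical full-history invocation, assuming a standard install layout, would be:

```shell
# Run from the wiki's installation root. --full exports every page with
# complete revision history; --current would export only the latest
# revision of each page, producing a much smaller file.
php maintenance/dumpBackup.php --full > dump.xml
```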
