I can do a little better than that: I can whip up something that copies
everything based on Special:Allpages. If you want me to do that, drop me an
email off-list and we can work out the details. ~~~~
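In outline, the sketch would look something like this (Python; it assumes a
reasonably recent MediaWiki with api.php enabled, and the base URL, batch
size, and output file names below are placeholders to adjust):

    import json
    import urllib.parse
    import urllib.request

    BASE = "http://www.example.com/w"  # placeholder: script path of the source wiki

    def all_titles():
        # Walk list=allpages via the API, following the continuation
        # marker so every page is seen no matter how many there are.
        cont = {}
        while True:
            params = {"action": "query", "list": "allpages",
                      "aplimit": "500", "format": "json"}
            params.update(cont)
            url = BASE + "/api.php?" + urllib.parse.urlencode(params)
            with urllib.request.urlopen(url) as resp:
                data = json.load(resp)
            for page in data["query"]["allpages"]:
                yield page["title"]
            if "continue" not in data:
                return
            cont = data["continue"]

    titles = list(all_titles())
    for i in range(0, len(titles), 100):  # export pages in batches of 100
        body = urllib.parse.urlencode({"pages": "\n".join(titles[i:i + 100]),
                                       "curonly": "1"}).encode()
        with urllib.request.urlopen(BASE + "/index.php?title=Special:Export",
                                    data=body) as resp:
            with open("export-%04d.xml" % (i // 100), "wb") as out:
                out.write(resp.read())

The resulting XML files can then be pulled into the new wiki with
Special:Import or maintenance/importDump.php.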
On Mon, Oct 21, 2013 at 8:19 PM, Wjhonson <wjhonson(a)aol.com> wrote:
How about a script that Googles site:www.myurl.com and then walks every
page and copies it ;)
-----Original Message-----
From: John Foster <jfoster81747(a)verizon.net>
To: mediawiki-list <mediawiki-l(a)lists.wikimedia.org>
Cc: MediaWiki announcements and site admin list <mediawiki-l(a)lists.wikimedia.org>
Sent: Mon, Oct 21, 2013 4:58 pm
Subject: Re: [MediaWiki-l] MediaWiki article export
On Mon, 2013-10-21 at 16:46 -0700, Yan Seiner wrote:
John W. Foster wrote:
> Is there any way to export ALL the articles and/or pages from a very slow
> but working MediaWiki? I want to move them to a much faster, upgraded
> MediaWiki server. I have tried the dumpBackup script in /maintenance, but
> that didn't get all the pages, only some, and I don't know why. Any tips
> are appreciated.
> Thanks,
> john
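(For reference, a full-history dump with dumpBackup.php is normally run
from the wiki's root directory along these lines; --quiet just suppresses
the progress reports, and the output file name is a placeholder:

    php maintenance/dumpBackup.php --full --quiet > full-dump.xml

If only the latest revision of each page is wanted, --current replaces
--full.)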
If it's the same version of MediaWiki you can always try dumping the
database directly and importing it into MySQL on the new server. I'm not
sure, but you might have to create the exact file structure as well...
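(That route usually comes down to something like the following, run on the
old and new servers respectively, where wikidb, wikiuser, and the dump file
name are placeholders for your own setup:

    mysqldump -u wikiuser -p wikidb > wikidb.sql
    mysql -u wikiuser -p wikidb < wikidb.sql

with LocalSettings.php on the new server pointed at the restored database.)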
Thanks.
I am aware of that solution, and in fact it is my preferred method for
moving a wiki. However, the reason this MediaWiki is slow is a totally
messed-up MySQL database, and I don't know how to fix it; I tried for over
a year, and the wiki has thousands of pages/articles. So I don't want to
carry that database's table structure over into the new, properly
functioning wiki.
Anything else, maybe?
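(The XML dump is the usual way to sidestep a damaged database: generate it
with dumpBackup.php as above, copy it across, and then run something along
these lines from the new wiki's root directory; the file name is a
placeholder:

    php maintenance/importDump.php < full-dump.xml
    php maintenance/rebuildRecentChanges.php

rebuildRecentChanges.php just repopulates the recent-changes table after a
large import.)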
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l