On Mon, 2013-10-21 at 22:44 -0700, Jan Steinman wrote:
cd root-of-destination
wget -r http://site.to.be.copied/wiki
I think that would just copy the directory contents of the existing wiki
to the new root, and that will not solve my issue.
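For reference, a fuller mirror invocation looks like the sketch below (URL is the placeholder from above). Note that a recursive wget only captures the rendered HTML pages, not the underlying wikitext, edit history, or database, which is why it can't really migrate a wiki:

```shell
# Mirror the rendered site (hypothetical URL). This does NOT export
# wikitext or history -- only the HTML as served.
cd root-of-destination
wget --mirror --convert-links --page-requisites --no-parent \
    http://site.to.be.copied/wiki/
```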
But I don't think that's what the OP wants.
I'd just copy the database tables over. But since he's copying to an
"upgraded server," that may not work well.
That is exactly what caused the issue with the old wiki. I did a MySQL
database dump and imported it into a new, UPGRADED wiki, and the tables
were all wrong. It takes forever for the wiki to find the articles. It
does eventually, but that is just unsatisfactory.
Maybe copy the database tables over to an install of the same rev, then upgrade the new
server.
Hmmm, haven't tried that. I figure that would either work or break the
old, slow but still-working wiki. I may try that as a last resort.
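A sketch of that approach, assuming shell access to both servers and hypothetical database/user names. The key step is running maintenance/update.php after upgrading the code, which migrates the old schema to the new version (skipping it would produce exactly the "tables were all wrong" symptom described above):

```shell
# On the old server: dump the wiki database (hypothetical names/credentials)
mysqldump -u wikiuser -p wikidb > wikidb.sql

# On the new server: load the dump into a MediaWiki install of the SAME
# version as the old wiki...
mysql -u wikiuser -p wikidb < wikidb.sql

# ...then upgrade the MediaWiki code and migrate the schema in place
php maintenance/update.php
```

The old wiki is never touched beyond the read-only mysqldump, so this shouldn't endanger the working install.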
Thanks
On 2013-10-21, at 17:43, mediawiki-l-request(a)lists.wikimedia.org wrote:
> From: John <phoenixoverride(a)gmail.com>
>
> I can do a little better than that, I can whip up something that copies
> everything based off Special:Allpages. If you want me to do that drop me a
> email off list and we can work out the details. ~~~~
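A rough sketch of what such a copier would do, assuming the old wiki's api.php is reachable (hypothetical URL): list every title via the API, then feed the titles to Special:Export, which emits XML that the new wiki can import:

```shell
# List page titles in the main namespace, 500 at a time (hypothetical URL;
# follow-up requests would use the "apcontinue" value from the response)
curl "http://old.wiki.example/w/api.php?action=query&list=allpages&aplimit=500&format=json"

# Special:Export turns a newline-separated list of titles into
# importable XML (curonly=1 = current revisions only)
curl -d "pages=Main_Page%0ASome_Other_Page" -d "curonly=1" \
    "http://old.wiki.example/w/index.php?title=Special:Export&action=submit" \
    > export.xml
```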
Maybe
>
> On Mon, Oct 21, 2013 at 8:19 PM, Wjhonson <wjhonson(a)aol.com> wrote:
>
>> How about a script that Googles site:www.myurl.com and then walks every
>> page and copies it ;)
Beyond my skill set. LOL
>>
>>
>>
>> Sent: Mon, Oct 21, 2013 4:58 pm
>> Subject: Re: [MediaWiki-l] Mediawiki article export
>>
>>
>> On Mon, 2013-10-21 at 16:46 -0700, Yan Seiner wrote:
>>> John W. Foster wrote:
>>>> Is there any way to export ALL the articles and/or pages from a very
>>>> slow but working MediaWiki? I want to move them to a much faster,
>>>> upgraded MediaWiki server.
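For the original question, the standard whole-wiki article export is the pair of maintenance scripts shipped with MediaWiki, run from each wiki's installation directory (no external tooling needed):

```shell
# On the slow server: export every page as XML
# (--current = latest revisions only; use --full to include history)
php maintenance/dumpBackup.php --current > pages.xml

# On the fast server: import the XML, then rebuild derived tables
php maintenance/importDump.php pages.xml
php maintenance/rebuildrecentchanges.php
```

This moves only the article content; users, preferences, and logs stay behind, which is often fine for a migration like the one described.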