Hi all. Oops. I have a wget crawl of an old MediaWiki site, and would like to load it into a newly installed MediaWiki. There's no easy way, I'm almost sure. Please tell me if I missed something. Cheers -- Rick
Yeah, that's basically impossible.
Maybe you could try using Parsoid to convert the HTML into wikitext-ish output. It would be lossy, though, since it isn't Parsoid-generated HTML being converted, and it would still involve some programming to connect all the parts.
-- Brian
On Wed, May 3, 2017 at 5:44 PM, Rick Leir rleir@leirtech.com wrote:
-- Sorry for being brief. Alternate email is rickleir at yahoo dot com
_______________________________________________
MediaWiki-l mailing list
To unsubscribe, go to: https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
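Brian's Parsoid route can be sketched roughly as follows. This is a minimal, untested sketch: it assumes your wiki (or a public one) exposes a REST endpoint compatible with the documented `/transform/html/to/wikitext` route, and `rest_base` is a placeholder you would replace with your own endpoint. As Brian notes, output will be lossy for HTML that Parsoid did not generate.

```python
# Sketch: convert crawled HTML to wikitext via a Parsoid-enabled REST
# endpoint. The /transform/html/to/wikitext path follows the documented
# transform route; the base URL is an assumption you must adapt.
import urllib.parse
import urllib.request


def build_transform_request(rest_base, html):
    """Build a POST request for the html-to-wikitext transform.

    rest_base: e.g. "https://en.wikipedia.org/api/rest_v1" (assumption:
    your target wiki exposes a compatible REST endpoint).
    """
    url = rest_base.rstrip("/") + "/transform/html/to/wikitext"
    data = urllib.parse.urlencode({"html": html}).encode("utf-8")
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    return urllib.request.Request(url, data=data, headers=headers)


def html_to_wikitext(rest_base, html):
    """Send the transform request and return the wikitext response.

    Lossy for HTML that did not originate from Parsoid.
    """
    req = build_transform_request(rest_base, html)
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

You would still need a loop over the crawled files and a way to create the resulting pages (e.g. via the action API or a maintenance script) to "connect all the parts".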
You could try my Html2Wiki extension.
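Whichever converter ends up doing the HTML-to-wikitext step, a raw wget crawl contains the full page skin (sidebar, footer, navigation), not just the article body; in MediaWiki's default skins the rendered article lives inside `<div id="mw-content-text">`. A rough stdlib-only sketch of isolating that div before conversion (the element id is the conventional one, but verify it against your actual crawl):

```python
# Sketch: pull the inner HTML of <div id="mw-content-text"> out of a
# crawled MediaWiki page, discarding the surrounding skin chrome.
from html.parser import HTMLParser


class ContentExtractor(HTMLParser):
    """Collect the inner HTML of the div with id="mw-content-text"."""

    def __init__(self):
        super().__init__()
        self.depth = 0   # div-nesting depth inside the target div
        self.parts = []  # collected inner-HTML fragments

    def handle_starttag(self, tag, attrs):
        if self.depth:
            self.parts.append(self.get_starttag_text())
            if tag == "div":
                self.depth += 1
        elif tag == "div" and dict(attrs).get("id") == "mw-content-text":
            self.depth = 1  # entered the target div; don't emit it itself

    def handle_startendtag(self, tag, attrs):
        if self.depth:
            self.parts.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if self.depth:
            if tag == "div":
                self.depth -= 1
                if self.depth == 0:
                    return  # closing tag of the target div: stop collecting
            self.parts.append("</%s>" % tag)

    def handle_data(self, data):
        if self.depth:
            self.parts.append(data)


def extract_content(page_html):
    """Return the article-body HTML of one crawled page, or "" if absent."""
    parser = ContentExtractor()
    parser.feed(page_html)
    return "".join(parser.parts)
```

This is only a sketch for well-formed pages; a robust pipeline would use a real HTML library and handle pages where the div is missing or differently named.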
On May 3, 2017 7:17 PM, "Brian Wolff" bawolff@gmail.com wrote:
Thanks, that looks promising!
On 2017-05-03 07:57 PM, G Rundlett wrote: