I have a simple wget script running now to extract wikis once they reach a stage we want.
However, the links don't update and fix themselves appropriately, and I was wondering if someone has tried to do the same and maybe got some kind of hack working. All I am trying to do is export to static HTML, have the links inside update, and keep the metadata.
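For reference, here is a sketch of the sort of command I mean (the URL is just a placeholder, and these are all standard wget options):

  # Mirror the wiki and rewrite links so the saved pages point at each
  # other locally; --html-extension saves pages with .html names so
  # --convert-links has local targets to rewrite to.
  wget --mirror \
       --convert-links \
       --html-extension \
       --page-requisites \
       --no-parent \
       http://example.org/wiki/

Even with --convert-links, the query-string style URLs MediaWiki uses (index.php?title=...) don't always get rewritten cleanly, which seems to be where my links break.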
Thanks
Joshua
On 27/12/05, Joshua Seagroves <joshua_seagroves@hotmail.com> wrote:
> All I am trying to do is export to static HTML, have the links inside update, and keep the metadata.
See http://meta.wikimedia.org/wiki/Alternative_parsers - particularly the "non-parser dumper" which uses MediaWiki itself to do the rendering and exporting.
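If you go that route, the non-parser dumper ships as a MediaWiki maintenance script; something along these lines should work, though check the script's usage output, as I'm quoting the -d destination flag from memory:

  # Run from the wiki's installation root; -d names the directory the
  # static HTML is written to (the path here is just an example).
  php maintenance/dumpHTML.php -d /var/www/static-wiki

Because it renders with MediaWiki's own parser, the internal links and page metadata come out the way the live wiki would produce them, rather than depending on wget's link rewriting.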
-- Rowan Collins BSc [IMSoP]