On Tue, May 09, 2006 at 03:46:00PM +0100, Ben Francis wrote:
> I'm interested in how much work has gone into this already, because it would be such a useful feature (both the exporting and the native DocBook storage). Actually, once the API gets implemented, there are enormous possibilities for using MediaWiki as a document management tool and for calling the API from different internal systems.
> An organisation could write documents using a desktop application but use MediaWiki for the back-end storage. They could then log in remotely to access and edit the documents from anywhere, via a web interface or another application.
> Imagine using MediaWiki for help systems, where the help appears inside a desktop or web application but is stored and collaboratively edited using a MediaWiki installation.
> If the XHTML wiki web pages only have to be one representation of the data, the possibilities are endless!
> The PHP tool written by Magnus Manske (http://magnusmanske.de/wiki2xml/w2x.php) looks good.
Yeah; at the moment, Magnus seems like the go-to guy on that. I've badly wanted to see something like this for some time, and would be glad (with 20 years of systems analyst experience :-) to kibitz on the design, if you like.
My personal target was mostly to be able to extract a partial page tree from a running MW install and dump it into a DocBook source file, processing xrefs and the like in some useful fashion, so that flat-file documentation can be generated from the MediaWiki used to maintain it.
(Read: I talked the MythTV people into converting from Moin, and this was one of the selling points. :-)
That usage, of course, implies a few extra requirements beyond what the general case would need, but I think it's one of the most useful targets for such a processing chain.
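To make that concrete, here's a rough sketch of the shape of pipeline I have in mind (not Magnus's tool, just an illustration, with a made-up wiki URL and made-up page titles). It pulls raw wikitext from a live install through the long-standing index.php action=raw entry point and wraps the pages in a bare DocBook skeleton; convert_wikitext() is deliberately a stub, since turning wiki markup into proper DocBook elements, xrefs included, is exactly the hard part that wiki2xml addresses.

import urllib.parse
import urllib.request
from xml.sax.saxutils import escape

WIKI = "http://wiki.example.org/index.php"   # hypothetical MediaWiki install
PAGES = ["Installation", "User Manual"]      # hypothetical page titles

def fetch_raw(title):
    # Pull the raw wikitext of one page via index.php?action=raw.
    url = WIKI + "?" + urllib.parse.urlencode({"title": title, "action": "raw"})
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def convert_wikitext(text):
    # Stub: a real converter (wiki2xml, say) would map wiki markup onto
    # DocBook elements and resolve [[links]] into <xref>s.  Here the
    # wikitext is just escaped and dropped into a single <para>.
    return "    <para>%s</para>" % escape(text)

def build_book(pages):
    # Assemble one <chapter> per wiki page inside a minimal <book>.
    chapters = []
    for title in pages:
        chapters.append(
            '  <chapter id="%s">\n    <title>%s</title>\n%s\n  </chapter>'
            % (title.replace(" ", "_"), escape(title),
               convert_wikitext(fetch_raw(title))))
    return ('<?xml version="1.0"?>\n<book>\n  <title>Extracted documentation</title>\n'
            + "\n".join(chapters) + "\n</book>")

if __name__ == "__main__":
    print(build_book(PAGES))

Everything interesting happens in convert_wikitext(); the fetching and the skeleton are trivial by comparison, which is why the parser work is the piece that really matters.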
Cheers,
-- jra