For inter-wiki moves (to meta or sep11), bot-runners, and spot-backups, it would be useful to have a way to grab individual pages in a machine-friendly package, which can be easily popped into or out of another wiki or other software.
Sample and (perhaps?) discussion of proposed XML wrapper format: http://meta.wikipedia.org/wiki/XML_import/export
I've checked in an export function on the dev branch as SpecialExport.php. Live demo: http://test.wikipedia.org/wiki/Special:Export
Current revision only: http://test.wikipedia.org/wiki/Special:Export/Wiki_table_test
With history: http://test.wikipedia.org/w/wiki.phtml?title=Special:Export&action=submi...
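A consumer of the export format might look something like the sketch below. The element names (`page`, `title`, `revision`, `text`) are guesses at the wrapper format still being discussed on the meta page, not a settled schema; the wiki markup inside `<text>` is treated as opaque data, as Brion describes.

```python
import xml.etree.ElementTree as ET

# Hypothetical export document: the element names are illustrative
# assumptions based on the proposal, not the final format.
SAMPLE = """\
<export>
  <page>
    <title>Wiki table test</title>
    <revision>
      <timestamp>2003-09-14T17:15:36Z</timestamp>
      <text>{| wiki table markup, treated as opaque text |}</text>
    </revision>
  </page>
</export>"""

def list_pages(xml_text):
    """Return (title, revision text) pairs from an export dump.

    The markup in <text> is passed through untouched -- the wrapper
    only carries it between wikis.
    """
    root = ET.fromstring(xml_text)
    return [(p.findtext("title"), p.findtext("revision/text"))
            for p in root.iter("page")]

for title, text in list_pages(SAMPLE):
    print(title)
```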
-- brion vibber (brion @ pobox.com)
Hello Brion,
On Sun, 14 Sep 2003 17:15:36 -0700, Brion Vibber wrote:
For inter-wiki moves (to meta or sep11), bot-runners, and spot-backups, it would be useful to have a way to grab individual pages in a machine-friendly package, which can be easily popped into or out of another wiki or other software.
Sample and (perhaps?) discussion of proposed XML wrapper format: http://meta.wikipedia.org/wiki/XML_import/export
If you really want to make it machine-friendly, the wiki markup must be "XML-ized" as well, so [[revision]] should be changed to something like <link type="wiki">revision</link> (and type="external" if it's an external link). The same should happen with
Current revision only: http://test.wikipedia.org/wiki/Special:Export/Wiki_table_test
tables, of course, and other things like lists, pictures, ...
There should also be an XSL stylesheet, which is missing at the moment.
... but I like the idea ;-) In the end we could offer a web-service interface for queries from other systems, e.g. for meta-crawlers using the Google and Wikipedia APIs to collect information about a specific topic.
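The conversion Lothar proposes could be sketched roughly as below. The `<link type="wiki">` and `type="external"` element shapes are his suggestion, not an agreed format, and the regexes only cover the simplest link forms (no piped links, no nesting):

```python
import re
from xml.sax.saxutils import escape

def xmlize_links(markup: str) -> str:
    """Rewrite simple wiki links as the XML elements Lothar suggests."""
    # [[Target]] -> <link type="wiki">Target</link>
    markup = re.sub(
        r"\[\[([^\]|]+)\]\]",
        lambda m: '<link type="wiki">%s</link>' % escape(m.group(1)),
        markup,
    )
    # [http://url label] -> <link type="external" href="url">label</link>
    markup = re.sub(
        r"\[(\S+) ([^\]]+)\]",
        lambda m: '<link type="external" href="%s">%s</link>'
                  % (escape(m.group(1), {'"': "&quot;"}), escape(m.group(2))),
        markup,
    )
    return markup

print(xmlize_links("See [[revision]] and [http://example.org the spec]."))
```

A real implementation would of course need the full parser Brion alludes to; this only shows the shape of the output.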
Regards, Lothar
On Sun, 2003-09-14 at 17:53, Lothar Kimmeringer wrote:
If you really want to make it machine-friendly, the wiki markup must be "XML-ized" as well,
No, this is solely for transferring the wiki markup from place to place; consider it opaque data with some headers.
-- brion vibber (brion @ pobox.com)
On Sun, 14 Sep 2003 18:24:32 -0700, Brion Vibber wrote:
On Sun, 2003-09-14 at 17:53, Lothar Kimmeringer wrote:
If you really want to make it machine-friendly, the wiki markup must be "XML-ized" as well,
No, this is solely for transferring the wiki markup from place to place; consider it opaque data with some headers.
So what exactly was meant with "which can be easily popped into or out of [...] other software" in your original post?
Regards, Lothar
On Sun, 2003-09-14 at 18:34, Lothar Kimmeringer wrote:
On Sun, 14 Sep 2003 18:24:32 -0700, Brion Vibber wrote:
On Sun, 2003-09-14 at 17:53, Lothar Kimmeringer wrote:
If you really want to make it machine-friendly, the wiki markup must be "XML-ized" as well,
No, this is solely for transferring the wiki markup from place to place; consider it opaque data with some headers.
So what exactly was meant with "which can be easily popped into or out of [...] other software" in your original post?
You _do_ know that software can be written that can work with something called "text", right? :)
-- brion vibber (brion @ pobox.com)
On Sun, 14 Sep 2003 18:36:38 -0700, Brion Vibber wrote:
On Sun, 2003-09-14 at 18:34, Lothar Kimmeringer wrote:
So what exactly was meant with "which can be easily popped into or out of [...] other software" in your original post?
You _do_ know that software can be written that can work with something called "text", right? :)
Of course, but the text is not only text; it contains more (links, lists, tables, ...).
So if the XML contained this information as well, other software could do much more without learning the whole wiki markup, such as creating links into the corresponding Wikipedia for references in the text, ... (as already mentioned in my first answer).
Regards, Lothar
On Mon, 15 Sep 2003 03:49:44 +0200, Lothar Kimmeringer wrote:
So if the XML contained this information as well, other software could do much more without learning the whole wiki markup, such as creating links into the corresponding Wikipedia for references in the text, ... (as already mentioned in my first answer).
Just saw that this is already a project on meta.wikipedia: http://meta.wikipedia.org/wiki/Wikipedia_DTD So there will be two versions of XML for Wikipedia? ;-)
Regards, Lothar
Brion Vibber wrote:
For inter-wiki moves (to meta or sep11), bot-runners, and spot-backups, it would be useful to have a way to grab individual pages in a machine-friendly package, which can be easily popped into or out of another wiki or other software.
The idea of allowing inter-wiki moves complete with history sounds pretty cool. But I don't like the idea of allowing users to forge history by editing these XML files. Do you think we could throw in an HMAC field? bin2hex(mhash(MHASH_MD5, $text, $key)) should do the trick.
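Tim's PHP one-liner computes an HMAC-MD5 over the revision text with a server-side key. The same idea in Python, for illustration; the key name and handling here are assumptions, and key distribution between wikis is out of scope:

```python
import hmac

# HMAC-MD5 over the revision text, keyed with a server-side secret,
# mirroring mhash(MHASH_MD5, $text, $key). An importer that knows the
# key can detect edited/forged history in the XML file.
def sign(text: bytes, key: bytes) -> str:
    return hmac.new(key, text, "md5").hexdigest()

def verify(text: bytes, key: bytes, mac: str) -> bool:
    # compare_digest avoids leaking timing information
    return hmac.compare_digest(sign(text, key), mac)

key = b"server-secret"  # hypothetical shared secret
mac = sign(b"revision text", key)
print(len(mac))  # 32 hex digits for an MD5 digest
```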
-- Tim Starling
wikitech-l@lists.wikimedia.org