2009/9/25 Dmitriy Sintsov <questpc(a)rambler.ru>:
XML is used by many standalone and web apps to store "human-produced"
rich formatted text. XML parsers are also very strict and spit out
errors on malformed input. As has been mentioned recently, XML is
really good for bots too, for that very reason (the input is an
"error-free" tree).
I wonder: if browsers can handle tag soup, can the MediaWiki parser
handle wikitext soup, too? E.g., instead of raising a parsing error,
it could properly close the open nodes. The existing wikitext of
millions of articles would have to be converted by a command-line
upgrade script if wikitext is abandoned, though I wonder whether it's
possible to keep a wikitext editing mode for backwards compatibility
by using the same method online.
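The error-recovery idea above (close open nodes instead of failing) can be
sketched with a minimal stack-based balancer. This is a hypothetical
illustration, not MediaWiki's or any browser's actual recovery algorithm:
it tracks open tags on a stack, auto-closes intervening tags when a close
tag arrives out of order, drops stray close tags, and closes anything
still open at the end.

```python
import re

def close_soup(text):
    """Naively balance a tag-soup fragment (hypothetical sketch).

    Ignores self-closing and void elements for simplicity.
    """
    stack = []   # names of currently open tags
    out = []
    # Split into tag tokens and text runs, keeping the tags.
    for token in re.split(r'(<[^>]+>)', text):
        m = re.match(r'<(/?)(\w+)[^>]*>$', token)
        if m and m.group(1):            # a closing tag
            name = m.group(2)
            if name in stack:
                # Auto-close any tags opened after it, innermost first.
                while stack[-1] != name:
                    out.append('</%s>' % stack.pop())
                stack.pop()
                out.append(token)
            # else: stray close tag with no matching open -- drop it
        elif m:                          # an opening tag
            stack.append(m.group(2))
            out.append(token)
        else:                            # plain text
            out.append(token)
    while stack:                         # close whatever is still open
        out.append('</%s>' % stack.pop())
    return ''.join(out)

print(close_soup('<b>bold <i>both</b> stray'))
# -> <b>bold <i>both</i></b> stray
```

A real parser would also have to handle void elements, attributes with
`>` in quoted values, and HTML's special nesting rules, but the basic
"repair rather than reject" shape is the same.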
Wikitext started as a shorthand for HTML. The horrible things that
have happened since then come from ad-hoc additions to the parser and
the lack of a formal spec, leaving the parser's behaviour as,
literally, the only definition.
- d.