Changing the format to XML (and updating the wikitext format at the same time) means we need four things: an 'old wikitext'->XML converter, an XML->'good wikitext' converter, a 'good wikitext'->XML converter and an XML->HTML parser (s/converter/parser, if you care about the exact words). The 'good wikitext' and HTML parsers should be fairly easy; the first is just plain hard.
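To make the division of labour concrete, here is a minimal sketch of the four pieces as Python stubs. The function names are mine and purely hypothetical; nothing like this exists in MediaWiki, it just shows the intended data flow.

    # Hypothetical names; these stubs only illustrate the data flow.

    def old_wikitext_to_xml(src: str) -> str:
        """The hard one: parse today's loose, ad-hoc wikitext into XML."""
        ...

    def xml_to_good_wikitext(xml: str) -> str:
        """The 'deparser': render stored XML back out as the new wikitext."""
        ...

    def good_wikitext_to_xml(src: str) -> str:
        """Parse the new, well-defined wikitext into XML. Fairly easy."""
        ...

    def xml_to_html(xml: str) -> str:
        """Render stored XML as HTML for readers. Also fairly easy."""
        ...

    # One-off migration:  old wikitext -> XML -> good wikitext
    # Ongoing viewing:    good wikitext -> XML -> HTML
    # Ongoing editing:    XML -> good wikitext (to fill the edit box)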
I've only ever used one system that worked like that: LambdaMOO. When you write code in that system, it compiles it to bytecode, then decompiles it the next time you want to edit it (a toy sketch of this round trip follows the list below). It had some interesting quirks, though:
- Whitespace was self-normalising (not a bad thing)
- Parentheses were self-normalising (sometimes a confusing thing)
- /* Comments */ were stripped out and not stored (a stupid thing)
- You couldn't save non-compiling code
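You can reproduce the same effect with Python's own ast module standing in for the MOO compiler/decompiler (this is only an analogy, nothing to do with LambdaMOO itself; it needs Python 3.9+ for ast.unparse): parse the source to a tree, print the tree back out, and see what survives.

    import ast
    import textwrap

    source = textwrap.dedent("""
        x   =   (1+2)    # the parser throws this comment away
        y = ((x * 3))    # ...and these redundant parentheses too
    """)

    tree = ast.parse(source)     # "compile": source -> tree
    print(ast.unparse(tree))     # "decompile": tree -> normalised source
    # Prints:
    #   x = 1 + 2
    #   y = x * 3
    # Whitespace and parentheses self-normalise; comments are simply gone.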
Those are fairly minor problems for a programming language. They are quite major problems for a language intended for laypeople to write articles in. Consider tables: at the moment, we use whitespace quite liberally and inconsistently to make tables easier to work with. Since the preferred layout varies from page to page, it would be impossible for the "deparser" to get it the way users want it.
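Here is a toy illustration of that problem, assuming a naive canonical printer (hypothetical code, not anything from MediaWiki). Two editors format the same table row differently; after one parse/deparse round trip, both come back in the deparser's single house style.

    def parse_row(line: str) -> list[str]:
        # Split a wikitext-style table row into trimmed cells.
        return [cell.strip() for cell in line.strip("|").split("||")]

    def deparse_row(cells: list[str]) -> str:
        # Print cells back out in the deparser's one canonical style.
        return "| " + " || ".join(cells)

    author_a = "| Name      || Population ||   Area"   # padded for alignment
    author_b = "|Name||Population||Area"               # no padding at all

    for src in (author_a, author_b):
        print(deparse_row(parse_row(src)))
    # Both print exactly the same line:
    #   | Name || Population || Area
    # Author A's careful alignment is lost, and the deparser has no way to
    # know which style each page's editors actually wanted.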
I think this is what would happen:
1) User creates new page with lots of wikitext.
2) User saves page.
3) User spots a mistake and clicks "Edit this page" to fix it.
4) User sees that everything has changed from when they saved it.
5) User runs away, never to be seen again.