----- Original Message -----
From: "David Gerard" <dgerard@gmail.com>
On 6 February 2012 21:02, Jay Ashworth <jra@baylink.com> wrote:
Correct, and it isn't merely investments in learning; there are likely investments in wrap-around-the-outside coding which assume access to markup as well. Not All Mediawikiae Are Wikipedia.
Your use of "likely" there turns out to be largely incorrect - one of the biggest problems with wikitext is that it's all but unparseable by anything other than the original parser routines in MediaWiki. That fact was one of the inspirations for this list existing at all: to come up with a definition of wikitext that machine parsers could actually consume.
I was around when wikitext-l forked; I know pretty much exactly how unparseable MWtext is. That doesn't preclude external code which *generates* MWtext for injection into wikis.
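(The distinction matters: a hypothetical sketch, not any particular tool from this thread, of why *generating* wikitext is easy even though parsing it is hard - emitting a MediaWiki table is just string assembly, with no grammar ambiguity to resolve.)

```python
def make_wikitable(headers, rows, caption=None):
    """Emit MediaWiki table markup for the given headers and row data."""
    lines = ['{| class="wikitable"']
    if caption:
        lines.append(f"|+ {caption}")
    # Header row: "!" starts a header cell, "!!" separates cells on one line.
    lines.append("! " + " !! ".join(headers))
    for row in rows:
        lines.append("|-")  # "|-" starts a new table row
        lines.append("| " + " || ".join(str(cell) for cell in row))
    lines.append("|}")  # close the table
    return "\n".join(lines)

# Example: a bot or report generator injecting a table into a wiki page.
print(make_wikitable(["Name", "Role"], [["dgerard", "editor"], ["jra", "admin"]]))
```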
And in fact, IIRC, there are 4 or 5 parser replacements that are between 97% and 99% accurate. Not good enough for Wikipedia, but they'd certainly be good enough for nearly anything else...
Cheers, -- jra