On 12/09/06, Timwi <timwi(a)gmx.net> wrote:
> Thus, even in wiki syntax, a specification must define what constitutes
> "valid" markup, and then define the behaviour for that valid mark-up.
I think you are completely and utterly wrong on this. Every
combination of wikitext has to be able to do something, because it is
a *language*.
If people who can't work computers put a character out of place and
the wiki engine just spits back "INVALID CONTENT", are they going to
edit an article ever again? *Hell* no.
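The error-tolerant approach can be sketched in miniature. This is a hypothetical toy renderer, not MediaWiki's actual parser: it handles one construct (''italics'') and, crucially, passes anything malformed through as literal text instead of rejecting the edit.

```python
def render_italics(text: str) -> str:
    """Toy lenient renderer for the '' italics '' wikitext construct.

    Every input produces output: matched '' pairs become <i>...</i>,
    and a stray unmatched '' is simply emitted as literal text.
    There is no "INVALID CONTENT" branch.
    """
    parts = text.split("''")
    out = []
    for i, part in enumerate(parts):
        if i % 2 == 1 and i + 1 < len(parts):
            # This segment sits between a matched pair of '' markers.
            out.append("<i>" + part + "</i>")
        elif i % 2 == 1:
            # Stray opening '' with no closing partner: keep it literal.
            out.append("''" + part)
        else:
            out.append(part)
    return "".join(out)
```

So `render_italics("a ''b'' c")` gives `a <i>b</i> c`, while the broken input `a ''b` comes back unchanged as `a ''b` rather than an error. That is the design choice at issue: the grammar assigns *some* meaning to every string, so a misplaced character degrades gracefully instead of locking the contributor out.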
HTML was invented as a human-editable page language. However, the
humans it was invented for were nuclear physicists, rather than the
general public.
When the general public started writing web pages, they didn't write
well-formed HTML - they would bash out something, preview it in
Netscape 1.1 and put it up. The geeks derided this as "tag soup".
But I submit that the "tag soup" approach was the natural one for
people to use, because that's how people learn a language: test,
change, test again, repeat. That's how people learn wikitext on
Wikipedia: if you just write plain text, it'll more or less work. If
you want to do something fancier, you'll look and see what other
people do and try something like that and see if it works.
People care much more about what they're putting on the page than
about constructing an immaculate piece of XML. If wikitext can't cope
sensibly with *anything and everything* people throw at it, then we
might as well just be requiring immaculate XML.
- d.