On Tue, Sep 12, 2006 at 05:44:29PM +0100, David Gerard wrote:
> On 12/09/06, Timwi <timwi@gmx.net> wrote:
> > Thus, even in wiki syntax, a specification must define what constitutes "valid" markup, and then define the behaviour for that valid markup.
> I think you are completely and utterly wrong on this. Every combination of wikitext has to be able to do something, because it is a *language*.
Well, *my* opinion is that you're in violent agreement, gents. :-)
> If people who can't work computers put a character out of place and the wiki engine just spits back "INVALID CONTENT", are they going to edit an article ever again? *Hell* no.
Of course not.
But Timwi's assertion, indirectly at least, is that it's necessary to define *how to behave when you get something ill-formed*, and that, as a subset, requires defining "well-formed" clearly.
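To make that concrete, here is a minimal sketch in Python of what "defining the behaviour for ill-formed input" could look like. It is hypothetical, not anything MediaWiki actually does: a toy renderer for wikitext-style '''bold''' markup whose spec covers the well-formed case *and* says exactly what happens to an unmatched marker (here: close it at end of input), so no input is ever rejected.

    import re

    def render_bold(wikitext: str) -> str:
        """Render wikitext-style '''bold''' markup to HTML.

        Well-formed pairs become <b>...</b>; an unmatched marker is
        handled by an explicit recovery rule (close it at end of
        input), so every possible input produces *some* output.
        """
        out, pos, is_open = [], 0, False
        for m in re.finditer(r"'''", wikitext):
            out.append(wikitext[pos:m.start()])
            out.append("</b>" if is_open else "<b>")
            is_open = not is_open
            pos = m.end()
        out.append(wikitext[pos:])
        if is_open:
            out.append("</b>")  # hypothetical recovery rule, not MediaWiki's
        return "".join(out)

    # Well-formed input behaves as the spec says...
    assert render_bold("a '''b''' c") == "a <b>b</b> c"
    # ...and ill-formed input still renders, per the recovery rule.
    assert render_bold("a '''b") == "a <b>b</b>"

The recovery rule itself is an arbitrary choice; the point is that it is written down, so "INVALID CONTENT" never appears.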
> HTML was invented as a human-editable page language. However, the humans it was invented for were nuclear physicists, rather than the general public.
Hee.
Can I quote you on that, David? :-)
> But I submit that the "tag soup" approach was the natural one for people to use, because that's how people learn a language: test, change, test again, repeat. That's how people learn wikitext on Wikipedia: if you just write plain text, it'll more or less work. If you want to do something fancier, you'll look and see what other people do and try something like that and see if it works.
Yep.
> People care much more about what they're putting on the page than constructing an immaculate piece of XML. If wikitext can't cope sensibly with *anything and everything* people throw at it, then we might as well just be requiring immaculate XML.
*Users* care much more about that. That is *precisely why* implementers have to care about the things Timwi suggests. IMHO.
Cheers,
-- jra