Thomas Dalton wrote:
What nonsense.
Wikimedia can and should do original research. It's only in
the main namespace of Wikipedia that we have that restriction, and that's
because we see an encyclopedia as a secondary source.
It's only since the addition of ISO 639-3 that it's even feasible to use
ISO 639 as a canonical list of languages, and that's only because SIL was
recognised as the most competent body to do such a thing. The Library of
Congress was doing a poor job of it, and I would absolutely stand by our
decision to add language editions to Wikipedia that they didn't recognise.

> How would you suggest deciding what is and isn't a language?
It's really quite easy. A language is a dialect with an army and a navy.
But that's not the question.
The question is whether we should have a Wikipedia edition in a given
language or dialect. This should be decided by the judgement of a
competent committee, following research into the nature of the language
and the opinions of its community of speakers.
There are two critical questions:
1) Are the participants prepared to accept a mixed-form or standard-form
wiki as representative of all speakers? For example, this appears to be
the case with modern English, where we have a mix of two standard forms
(US and UK), and where speakers of regional dialects are happy to write in
one of these standard forms.
2) Are the differences between proposed written forms trivially resolvable
in software? This is the case for zh-tw/zh-cn and sr-ec/sr-el, and it
would certainly be the case if there were ever a proposed split between
en_US and en_UK.
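To illustrate what "trivially resolvable in software" means, here is a minimal sketch of rule-based variant conversion in the spirit of MediaWiki's LanguageConverter. It is not the real converter, and the US/UK word table is purely illustrative:

```python
import re

# Illustrative mapping only; a real variant converter carries large
# curated tables plus markup to protect words that must not convert.
EN_US_TO_UK = {
    "color": "colour",
    "center": "centre",
    "organize": "organise",
}

def convert(text, table):
    """Replace whole words per the table, preserving an initial capital."""
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, table)) + r")\b",
        re.IGNORECASE,
    )

    def repl(match):
        word = match.group(0)
        replacement = table[word.lower()]
        # Keep a leading capital letter if the source word had one.
        return replacement.capitalize() if word[0].isupper() else replacement

    return pattern.sub(repl, text)

print(convert("The Color of the center", EN_US_TO_UK))
# prints: The Colour of the centre
```

The point is that when differences reduce to a deterministic word-level mapping like this, one wiki with per-reader conversion serves both communities; a new edition is only warranted when no such mapping exists.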
If the answer to both is no, then there's a case for starting a new wiki.
-- Tim Starling