There is a request for a Wikipedia in Ancient Greek. This request has so far
been denied. A lot of words have been used about it. Many people maintain
their positions and do not, for whatever reason, consider the arguments of
the other side. In my opinion there are a few roadblocks.
- Ancient Greek is an ancient language - the policy does not allow for
ancient languages.
- Text in ancient Greek written today about contemporary subjects
requires the reconstruction of Ancient Greek:
- it requires the use of existing words for concepts that did
not exist at the time when the language was alive
- neologisms will be needed to describe things that did not
exist at the time when the language was alive
- modern texts will not represent the language as it used to be
- Constructed and, by inference, reconstructed languages are effectively
not permitted.
We can change the policy if there are sufficient arguments and when we agree
on a change.
When a text is written in reconstructed ancient Greek, and when it is
clearly stated that it is NOT the ancient Greek of bygone days, it becomes
obvious that it is a great tool for learning the skills to read and write
ancient Greek, but that it is in itself not Ancient Greek. Ancient Greek as a
language is ancient. I have had a word with people who are involved in the
working group that deals with the ISO-639, I have had a word with someone
from SIL and it is clear that a proposal for a code for "Ancient Greek
reconstructed" will be considered for the ISO-639-3. For the ISO-639-6 a
code is likely to be given because a clear use for this code can be given.
We can apply for a code and, as it has a use bigger than Wikipedia alone, it
clearly has merit.
With modern texts clearly labelled as distinct from the original language,
it will be obvious that the innovations a writer needs for his writing are
part of the reconstruction and not of the original language.
This leaves the fact that constructed and reconstructed languages are not
permitted because of the notion that mother tongue users are required. In my
opinion, this has always been only a gesture to those people who are dead
set against any and all constructed languages. In the policies there is
something vague "*it must have a reasonable degree of recognition as
determined by discussion (this requirement is being discussed by the language
subcommittee <http://meta.wikimedia.org/wiki/Language_subcommittee>)."* It
is vague because even though the policy talks about a discussion, it is
killed off immediately by stating "The proposal has a sufficient number of
living native speakers to form a viable community and audience." In my
opinion, this discussion for criteria for the acceptance of constructed or
reconstructed languages has not happened. Proposals for objective criteria
have been ignored.
In essence, to be clear about it:
- We can get a code for reconstructed languages.
- We need to change the policy to allow for reconstructed and
constructed languages.
We need to do both in order to move forward.
The proposal for objective criteria for constructed and reconstructed
languages is in a nutshell:
- The language must have an ISO-639-3 code
- We need full WMF localisation from the start
- The language must be sufficiently expressive for writing a modern
encyclopaedia
- The Incubator project must have sufficiently large articles that
demonstrate both the language and its ability to write about a wide range of
subjects
- A sufficiently large group of editors must be part of the Incubator
project
There are opinions on Commons that Moeller's statement in this list
("[W]e've consistently held that faithful reproductions of
two-dimensional public domain works which are nothing more than
reproductions should be considered public domain for licensing
purposes") has been "overruled" by Mike Godwin's statement (which was
addressed a Wikisource case).
We should not accept such nonsense.
Every Wikimania bid has strengths and weaknesses. Once a bid is chosen,
the ritual of criticizing the selection by focusing on some weakness
seems to have become inevitable. I would be more impressed to reconsider
the jury's selection if somebody presented a serious evaluation that
reached a different result after weighing all the issues, instead of
harping on only the one most favorable to the argument.
Since that is not yet forthcoming, I'd like to refocus the discussion on
the concept of Wikimania in general, since it seems to produce so much
debate. As an idea, Wikimania is being pulled in too many directions,
and it cannot be all things to all people. Supposing we have a consensus
that in the most basic sense it's a good idea (do we have that?), what
can we make of this idea? What kind of event should it be? What values
do we prioritize - intimacy, mass appeal, accessibility, outreach,
infrastructure, culture? Others that I haven't listed? If we care about
diversity, what is that? When we consider costs, whose costs do we mean?
How do we balance the competing considerations?
Currently the conference is planned for roughly 400 people. So far I'm
not aware of any location having difficulty attracting attendees. The
argument for catering to the highest concentrations of contributors
would be more appealing if coupled with the idea that it makes sense to
accommodate more people. But expanding Wikimania would change other
dynamics of accessibility - the type of facility used, individual costs
and overall conference expenses, the character of the event. At least so
far, nobody has been presenting this as a vision for Wikimania's future.
Another consideration is that admission fees have consciously been kept
low. Otherwise Wikimania doesn't make Wikimedia contributors a priority
- at least, not the kind of contributors I gather everyone is referring
to here. For any location, most people already face costs related to
attendance; it's simply impossible to physically bring Wikimania to
everyone. Realistically, for any one person, Wikimania may be close
enough for you to come at minimal cost once or twice in your lifetime.
Some people may have to use a broad interpretation of "minimal" for even
that.
Geographic proximity only goes so far in any case. Talking about Europe
and North America may sound as if that still leaves a vast range of
options. In the first place, this would be more persuasive if we saw a
larger number of cities bidding. When it's just one from each, the
chances of producing a bid superior to a highly-motivated team from,
say, South America are not exactly overwhelming. Furthermore, even if
this were the very highest consideration, it's not exactly neutral
between those. The varying population distributions and distances,
especially for North America, would have obvious logical consequences.
Basically, we should prefer any bid from the European core (defined by
London on the west, Rome on the south, Berlin or Rome on the east,
Berlin or Amsterdam on the north); the east coast of North America would
be a secondary option (maybe we could disqualify Europe every third
year); by comparison, the odds for the rest of North America would be
decidedly inferior (after ten or so years, we might make it to Chicago
or Los Angeles).
Wikimania could be bigger or smaller, reach the developing world or only
the already-developed, more expensive or less so, rotated widely or
narrowly. Leaving aside the security concerns specific to Alexandria,
the choice of options would have the following undesirable consequences,
depending on which course is taken:
*Complaints that the event is impersonal, lacks a sense of community, or
is merely a stage-managed public relations show
*After a cycle or two, it seems to be pretty much just the same group of
people getting together every few years
*Objections that the amount being spent is a poor use of foundation
funds (depending on how it works out, this would be about either the
size of the event or the travel costs incurred by the foundation itself,
making distance from San Francisco a factor)
*Inability to accommodate anyone beyond the local audience, thus being
hardly different from a random meetup and failing to reflect the diverse
character of Wikimedia participants
*Rumors and misperceptions of unfairness in timing of when registration
is opened or how tickets are allocated
*Outrage over high admission charges, resembling more closely a
commercial conference than a community gathering
I would like to understand what vision people have for Wikimania, and
see how their vision would deal with all of these issues. So far I have
heard only complaints and rebuttals, nobody offering their own vision
(on this list, at least). I fear an end result of the fights over this
would be to either abandon the idea of Wikimania, or simply to hold it
in the Moscone Center every year like Macworld. Before we get there,
let's hear some better alternatives.
On Jan 8, 2008 6:19 AM, Brian McNeil <brian.mcneil(a)wikinewsie.org> wrote:
> Forgive me if some of this is retreading old ground, but I have over 50
> messages from this list since yesterday. Can we have a rerun (or a January
> run) of the top poster stats? I was 2nd last time and felt embarrassed
> despite having thought most of what I wrote was close to the topic in
> question.
Posts in December to Foundation-l
1 Thomas Dalton - 123
2 Anthony - 70
3 Andrew Whitworth - 64
4 David Gerard - 48
5 Florence Devouard - 47
6 Brian McNeil - 43
7 Nathan Awrich - 36
8 Ray Saintonge - 33
T9 GerardM - 32
T9 Mike Godwin - 32
T9 Erik Moeller - 32
The language subcommittee only allows languages that have a living
native community (except Wikisource, due to its archivist nature).
This is based on an interpretation of the Wikimedia Foundation mission
to "provide the sum of human knowledge to every human being". Thus,
the overriding purpose of allowing a wiki in a new language is to make
it accessible to more human beings. If a language has no native users,
allowing a wiki in that language does not fit our mission because it
does not make that project accessible to more human beings. Instead, a
wiki in their native languages should be requested if it doesn't
already exist.
Typically, the users requesting a wiki in an extinct language don't
want to provide educational material to more people at all, but only
want to promote or revive the language. While these are noble goals,
they are not those of the Wikimedia Foundation, so that a wiki should
not be created simply to fulfill them.
But that is my opinion. What do you think; should wikis be allowed in
every extinct language?
Jesse Plamondon-Willard (Pathoschild)
----- Original Message ----
From: Gerard Meijssen <gerard.meijssen(a)gmail.com>
To: Wikimedia Foundation Mailing List <foundation-l(a)lists.wikimedia.org>
Sent: Monday, April 14, 2008 1:37:38 AM
Subject: Re: [Foundation-l] Ancient Greek Wikipedia
Greek is understood to have several historical stages.
Koine is, according to Ethnologue, part of "Ancient Greek". This covers
Greek until 1453 AD according to the ISO standard. If the definition of
"Ancient" is wrongly applied, get this addressed at the appropriate places.
This is the right way to approach this. Again, this is most likely to lead
to a new code to acknowledge modern usage.
For Gerard and Pathoschild:
Is the need for a new class of ISO code really that important to you for
opening a Wikipedia?
1. "Ancient" is a necessary adjective to differentiate it from Modern Greek
(a native language), which is a different language with its own grammar,
syntax, etc. The adjective is needed even when the language is used in a
contemporary context, unless the modern language were no longer called
"Greek", which I suppose is impossible.
2. If you read my last post, you will see that even scholars do not
differentiate between extinct languages that are still in use and those that
are not (the only factor considered is the lack of native speakers, nothing
more). No term exists for this distinction; it is the perpetual problem of
all the social sciences, the lack of accurate terms. (Latin also has a code
as an extinct language.)
If ISO bases its decisions on these concepts, don't you believe that the
creation of a new kind of code is practically impossible? Inevitably, the
existing code has to be used; no other code exists for identifying ancient
Greek.
3. In fairness, I think it has been made more or less clear that ancient
Greek is still in use in many contexts. Wouldn't you agree it is time to
rethink the decision? Hasn't it been excessively restrictive?
Has there been any discussion on this matter? If a user is being disruptive
on a wiki, he or she will eventually end up getting blocked for it. If the
same user decides to continue the disruption he was blocked for on other
wikis - particularly sister projects such as Commons and Meta - how should
he or she be treated?
I know every wiki is independent. But letting a disruptive user become the
source of agony on many wikis seems like a problematic thing to do.
- White Cat
Perhaps a community discussion is necessary on the matter; I hereby initiate
one.
When a person tries to edit a page that contains a URL matching the spam
autoblocker regex, the user is prohibited from making the edit until the
spam link is removed. The spam autoblocker was intended to prevent the
addition of new spam.
Consider a scenario where a spambot adds spam links to Wikipedia, the spam
URL is only later added to the spam blacklist, and a user then tries to
edit a page that already contains spam added before the URL was
blacklisted. For a human this isn't much of a problem to deal with; it is
however a different story when it comes to bots.
Suppose you are operating a bot that makes non-controversial routine
maintenance edits on a regular basis. The spam autoblocker would prevent
such edits. If your bot's task is dealing with images renamed or deleted on
Commons, or if your bot's task is dealing with interwiki links, this is
particularly problematic. Interwiki bots and Commons delinking bots often
edit hundreds of pages a day on hundreds of wikis. That's a lot of logs. So
the suggestion that I should spend perhaps hours per day reading log files
for spam on pages in languages I cannot even understand (or necessarily
read the ?'s and %'s) is quite unreasonable. This is a task better dealt
with by the locals (humans) of the wiki community rather than by bots
performing mindless, routine and non-controversial tasks.
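The workaround such a bot operator would need can be sketched in a few
lines: check the existing page text against the blacklist before attempting
an edit, and skip (and log) pages that would trip the filter so a human can
clean them up. This is a hypothetical illustration - the blacklist entries,
helper names and skip policy are assumptions, not the actual MediaWiki or
bot-framework API:

```python
import re

# Hypothetical excerpt of a spam blacklist: one regex per entry,
# matched against URLs already present in the wikitext.
BLACKLIST = [re.compile(p) for p in [
    r"cheap-pills\.example\.com",
    r"casino-spam\.example\.net",
]]

def trips_blacklist(wikitext: str) -> bool:
    """Return True if the page text contains a blacklisted URL,
    meaning any save attempt would be rejected by the filter."""
    return any(p.search(wikitext) for p in BLACKLIST)

def bot_edit(title: str, old_text: str, new_text: str):
    """Apply a routine edit, but skip pages that already contain
    blacklisted links, leaving them for local human editors."""
    if trips_blacklist(old_text):
        print(f"SKIP {title}: pre-existing blacklisted link")
        return None
    return new_text  # in a real bot, this would be saved via the API

# A clean page gets its interwiki update; a spammed page is skipped.
bot_edit("Example", "Some text", "Some text [[de:Beispiel]]")
bot_edit("Spammed", "See http://casino-spam.example.net",
         "See http://casino-spam.example.net [[de:Beispiel]]")
```

The point of the pre-check is that the bot never triggers the autoblocker
at all, so no per-wiki log reading is required of the operator.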
There are also legitimate reasons to include spam links on pages, such as
an archived discussion of a spambot attack where example URLs are cited
before these make their way to the spam autoblocker.
- White Cat
We're still waiting for the FDL 1.3. Since there's been no resolution
within the timeframe we hoped for, we're going to re-allow the
creation of new Wikimedia wikis. To make sure that we can safely
transition to CC-BY-SA, we're going to dual-license them under both the
GFDL and CC-BY-SA 3.0.
We may remove this dual-licensing clause later, depending on what the
community decides with regard to licensing of existing and new wikis
based on the options that the FDL 1.3 will provide. This
dual-licensing of new wikis is purely intended to make sure that we
have the _option_ to transition these wikis to CC-BY-SA 3.0 (or later)
if we choose to.
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate