http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**? What on earth are they doing?!
cheers, Brianna
On 08/10/2007, Brianna Laugher brianna.laugher@gmail.com wrote:
http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**? What on earth are they doing?!
cheers, Brianna
-- They've just been waiting in a mountain for the right moment: http://modernthings.org/
Bots? Hardworking users?
On 08/10/2007, Majorly axel9891@googlemail.com wrote:
Bots? Hardworking users?
Bots indeed..
http://vo.wikipedia.org/wiki/Geban:SmeiraBot has 582945 edits. According to the meta page the project has 767129. So this single account has made over 75% of the edits on this entire wiki. That's a joke...
Hm, this wiki is like a catalogue of small-town America. This is the longest article: http://vo.wikipedia.org/wiki/Minneapolis_%28Minnesota%29 (don't be concerned that you won't be able to read it...)
*goes to look at the meta page*
cheers, Brianna
Brianna Laugher wrote:
On 08/10/2007, Majorly axel9891@googlemail.com wrote: Hm, this wiki is like a catalogue of small-town America. This is the longest article: http://vo.wikipedia.org/wiki/Minneapolis_%28Minnesota%29 (don't be concerned that you won't be able to read it...)
Hmmm, I can read it quite well - after the first three sections it is simply English text the bot did not translate...
[[User:Ahoerstemeier]]
On 08/10/2007, Brianna Laugher brianna.laugher@gmail.com wrote:
On 08/10/2007, Majorly axel9891@googlemail.com wrote:
Bots? Hardworking users?
Bots indeed..
http://vo.wikipedia.org/wiki/Geban:SmeiraBot has 582945 edits. According to the meta page the project has 767129. So this single account has made over 75% of the edits on this entire wiki. That's a joke...
So we've created the world's first Volapük gazetteer of the Western world. It may not quite be an encyclopedia, but it's still a rather pleasant niche achievement...
On Mon, Oct 08, 2007 at 10:53:20PM +1000, Brianna Laugher brianna.laugher@gmail.com wrote a message of 16 lines which said:
http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**?
250 users is not too bad if you take into account the fact that, according to [[en:Volapük]], there are only 20 people in the whole world who speak Volapük...
Brianna Laugher wrote:
http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**? What on earth are they doing?!
Don't forget one bot which especially likes to translate geo-stubs, as those do not need much human-written content. You might want to join in the big discussion at [[meta:Proposals for closing projects/Closure of Volapük Wikipedia]]
[[User:Ahoerstemeier]]
Hi Brianna,
http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**? What on earth are they doing?!
There is a very easy way to find that out: go to vo.wikipedia.org and keep pressing alt-x (random page) and you will see all the nice geographic stubs that they have generated. Then go to "history" (there are only 4 tabs, so you will find your way even without understanding Volapük). And then have a look at the contributions of those bots, like http://vo.wikipedia.org/wiki/Patikos:Contributions/SmeiraBot
Heiko
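For anyone who would rather script Heiko's spot check than keep pressing alt-x, here is a rough sketch in Python. It assumes nothing Volapük-specific, only the stock MediaWiki api.php endpoint and its standard query parameters:

# Rough automation of the alt-x spot check: sample a few random articles
# on vo.wikipedia.org and print who made the latest edit to each one.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "http://vo.wikipedia.org/w/api.php"

def api_get(**params):
    params["format"] = "json"
    with urlopen(API + "?" + urlencode(params)) as resp:
        return json.load(resp)

# A handful of random main-namespace pages (the alt-x trick, automated).
random_pages = api_get(action="query", list="random",
                       rnnamespace=0, rnlimit=5)["query"]["random"]

for page in random_pages:
    # Look at the latest revision: was the last editor a bot account?
    result = api_get(action="query", prop="revisions",
                     titles=page["title"], rvprop="user|timestamp")
    for p in result["query"]["pages"].values():
        rev = p["revisions"][0]
        print(page["title"], "- last edited by", rev["user"], "at", rev["timestamp"])

On a wiki built this way you would expect a bot account such as SmeiraBot to turn up as the last editor on most of the sample.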
So they are smart. What is wrong with that? Shall we punish all small wikis that get smart and do not want to struggle for a long time but want to get something done? It is information. Whatever way it was generated doesn't matter.
Waerth
Hi Brianna,
http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**? What on earth are they doing?!
There is a very easy way to find that out: go to vo.wikipedia.org and keep pressing alt-x (random page) and you will see all the nice geographic stubs that they have generated. Then go to "history" (there are only 4 tabs, so you will find your way even without understanding Volapük). And then have a look at the contributions of those bots, like http://vo.wikipedia.org/wiki/Patikos:Contributions/SmeiraBot
Heiko
On 09/10/2007, Waerth waerth@asianet.co.th wrote:
So they are smart. What is wrong with that? Shall we punish all small wikis that get smart and do not want to struggle for a long time but want to get something done? It is information. Whatever way it was generated doesn't matter.
If a small wiki decides to fill out its page count with largely untranslated articles from en:wp, that's really not building a wiki or a community, is it.
- d.
I was under the impression the articles were translated. If they are not, which seems to be the case in a number of them, then that is bad. If articles are made in the language the Wikipedia is in, then it doesn't really matter whether they are written by a bot or by people themselves. IMHO of course ;)
Waerth
On 09/10/2007, Waerth waerth@asianet.co.th wrote:
So they are smart. What is wrong with that? Shall we punish all small wikis that get smart and do not want to struggle for a long time but want to get something done? It is information. Whatever way it was generated doesn't matter.
If a small wiki decides to fill out its page count with largely untranslated articles from en:wp, that's really not building a wiki or a community, is it.
- d.
On 10/9/07, David Gerard dgerard@gmail.com wrote:
If a small wiki decides to fill out its page count with largely untranslated articles from en:wp, that's really not building a wiki or a community, is it.
Exactly. If we wanted that we could adjust MediaWiki to fall through to the enwp text. ;)
And if we wanted a machine translation we wouldn't bother with the whole multiple wikis thing.
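MediaWiki has no such fall-through feature; purely to make Gregory's thought experiment concrete, here is a toy client-side sketch of what it would amount to, using only the generic MediaWiki API (the Manhattan stub is just the example that comes up later in the thread):

# Toy illustration only, not an actual MediaWiki feature or patch: ask the
# local wiki for an article's wikitext and, if the page does not exist
# there, fall back to the English Wikipedia's text.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def fetch_wikitext(lang, title):
    """Return the current wikitext of `title`, or None if the page is missing."""
    params = urlencode({"action": "query", "prop": "revisions",
                        "rvprop": "content", "titles": title, "format": "json"})
    with urlopen("http://%s.wikipedia.org/w/api.php?%s" % (lang, params)) as resp:
        pages = json.load(resp)["query"]["pages"]
    page = next(iter(pages.values()))
    if "missing" in page:                # no such page on this wiki
        return None
    return page["revisions"][0]["*"]     # legacy API key for the revision text

def wikitext_with_fallback(lang, title):
    # Prefer the local wiki, fall through to en.wikipedia.org otherwise.
    return fetch_wikitext(lang, title) or fetch_wikitext("en", title)

text = wikitext_with_fallback("vo", "Manhattan (Nevada)")
print((text or "")[:300])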
On Tue, 2007-10-09 at 09:03 -0400, Gregory Maxwell wrote:
On 10/9/07, David Gerard dgerard@gmail.com wrote:
If a small wiki decides to fill out its page count with largely untranslated articles from en:wp, that's really not building a wiki or a community, is it.
Exactly. If we wanted that we could adjust MediaWiki to fall through to the enwp text. ;)
And if we wanted a machine translation we wouldn't bother with the whole multiple wikis thing.
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
Fran
On 10/9/07, Francis Tyers spectre@ivixor.net wrote:
On Tue, 2007-10-09 at 09:03 -0400, Gregory Maxwell wrote:
On 10/9/07, David Gerard dgerard@gmail.com wrote:
If a small wiki decides to fill out its page count with largely untranslated articles from en:wp, that's really not building a wiki or a community, is it.
Exactly. If we wanted that we could adjust MediaWiki to fall through to the enwp text. ;)
And if we wanted a machine translation we wouldn't bother with the whole multiple wikis thing.
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
If this is "machine translation with humane review", I might agree with you. But just a bunch of automatically translated articles to whatever topic won't help the reader much...
Michael
Fran
Francis Tyers wrote:
On Tue, 2007-10-09 at 09:03 -0400, Gregory Maxwell wrote:
On 10/9/07, David Gerard dgerard@gmail.com wrote:
If a small wiki decides to fill out its page count with largely untranslated articles from en:wp, that's really not building a wiki or a community, is it.
Exactly. If we wanted that we could adjust MediaWiki to fall through to the enwp text. ;)
And if we wanted a machine translation we wouldn't bother with the whole multiple wikis thing.
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
Fran
I completely agree with this statement :-)
Cheers, Sabine
Hello,
Sabine Cretella wrote:
Francis Tyers wrote:
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
I completely agree with this statement :-)
You overestimate the power of machine translation. I've just translated http://en.wikipedia.org/wiki/Serbin with Babelfish into German; if I didn't already know what that article is about, I wouldn't have known it after reading the German "translation".
Best regards, 32X
Machine translation = babelfish.
Babelfish is one little machine translator, one of the worst available.
And this neglects the use of machine translation in very closely related languages, such as Catalan and Aranese or Danish and Bokmål Norwegian, which, with human oversight, can often be very useful for writing articles.
Mark
On 11/10/2007, User 32X wikipedia@32x.de wrote:
Hello,
Sabine Cretella wrote:
Francis Tyers wrote:
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
I completely agree with this statement :-)
You overestimate the power of machine translation. I've just translated http://en.wikipedia.org/wiki/Serbin with Babelfish into German; if I didn't already know what that article is about, I wouldn't have known it after reading the German "translation".
Best regards, 32X
oops, make that !=
On 11/10/2007, Mark Williamson node.ue@gmail.com wrote:
Machine translation = babelfish.
Babelfish is one little machine translator, one of the worst available.
And this neglects the use of machine translation in very closely related languages, such as Catalan and Aranese or Danish and Bokmål Norwegian, which, with human oversight, can often be very useful for writing articles.
Mark
On 11/10/2007, User 32X wikipedia@32x.de wrote:
Hello,
Sabine Cretella wrote:
Francis Tyers wrote:
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
I completely agree with this statement :-)
You overestimate the power of machine translation. I've just translated http://en.wikipedia.org/wiki/Serbin with Babelfish into German; if I didn't already know what that article is about, I wouldn't have known it after reading the German "translation".
Best regards, 32X
-- Refije dirije lanmè yo paske nou posede pwòp bato.
On Fri, 2007-10-12 at 00:21 +0200, User 32X wrote:
Hello,
Sabine Cretella wrote:
Francis Tyers wrote:
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
I completely agree with this statement :-)
You overestimate the power of machine translation. I've just translated http://en.wikipedia.org/wiki/Serbin with Babelfish into German; if I didn't already know what that article is about, I wouldn't have known it after reading the German "translation".
I'd like to make a point here:
1) Neither English nor German is a smaller Wikipedia; in fact both are quite large!
Machine translation between distant languages is often poor. That is not to say that SYSTRAN (as available through Babelfish or Google) is the best you can get, but even with a local installation and customised dictionaries/rules there is still a fairly low ceiling on quality.
On the other hand, between closely related languages, machine translation is often very good, as I have demonstrated with a previous post. You might not be aware, but machine translation between Spanish-Catalan and Spanish-Galician is already in production use, in the first case at El Periódico de Catalunya, and in the second at La Voz de Galicia. Maybe you don't care about smaller languages, but if that is the case, why bother to post?
If you look at the current statistics for page size and quality, the Spanish-language Wikipedia is ahead of the Wikipedias in these other two languages. With machine translation this gap could be narrowed (in some cases significantly).
So to wind up, you've taken a microscopic test and extrapolated to an untenable conclusion. If you choose to revisit the issue, I would recommend you do it from the point where we set off, that of under-resourced, smaller Wikipedias.
Yours,
Fran
Best regards, 32X
Hellas!
Francis Tyers wrote:
On Fri, 2007-10-12 at 00:21 +0200, User 32X wrote:
You overestimate the power of machine translation. [...]
I'd like to make a point here:
- Neither English nor German is a smaller Wikipedia; in fact both are quite large!
The German Wikipedia is only about six times larger than the Volapük one. The latter is considered small.
On the other hand, between closely related languages, machine translation is often very good, as I have demonstrated with a previous post. You might not be aware, but machine translation between Spanish-Catalan and Spanish-Galician is already in production use, [...]
If there's actually a simple way of doing machine translation between two closely related languages, then why is there more than one Wikipedia for that language family?
The Serbian Wikipedia works in Cyrillic and Latin alphabets, but there's still only one Wiki that serves both forms.
Maybe you don't care about smaller languages, [...]
Believe me, I do. At the moment I'm waiting for the Lower Sorbian Wikipedia to come out of the incubator, and from time to time I try to read some articles in the Deitsch Wikipedia ([[pdc:]]). These small Wikipedias are often interesting, but only when there's a community that writes them. I don't need to read bot-created articles when the same articles exist, 1000 times better, in other languages I understand.
When a small Wikipedia doesn't have a special field of interest AND there is no large group of speakers, then it is – in most cases – not worth supporting.
Best regards, 32X
On Fri, 2007-10-19 at 18:23 +0200, User 32X wrote:
Hellas!
Francis Tyers wrote:
On Fri, 2007-10-12 at 00:21 +0200, User 32X wrote:
You overestimate the power of machine translation. [...]
I'd like to make a point here:
- Neither English nor German is a smaller Wikipedia; in fact both are quite large!
The German Wikipedia is only about six times larger than the Volapük one. The latter is considered small.
This is a ridiculous statement. If you take any metric apart from sheer "page count" there is no comparison.
On the other hand, between closely related languages, machine translation is often very good, as I have demonstrated with a previous post. You might not be aware, but machine translation between Spanish-Catalan and Spanish-Galician is already in production use, [...]
If there's actually a simple way of doing machine translation between two closely related languages, then why is there more than one Wikipedia for that language family?
When a language community decides that it speaks a language, not a dialect of another language, for whatever reason (for example national identity), they may want a Wikipedia of their own. Personally I don't think this is up to you to decide.
Unless of course you are suggesting merging the Catalan and Spanish Wikipedias? Or the Occitan and Catalan Wikipedias?
The Serbian Wikipedia works in Cyrillic and Latin alphabets, but there's still only one Wiki that serves both forms.
There are currently four Wikipedias for what until a few years ago was called Serbo-Croatian. There is likely to be another one in the future. I think it would be wonderful to see them merged. I'll let you get started on that.
Maybe you don't care about smaller languages, [...]
Believe me, I do. At the moment I'm waiting for the Lower Sorbian Wikipedia to come out of the incubator, and from time to time I try to read some articles in the Deitsch Wikipedia ([[pdc:]]). These small Wikipedias are often interesting, but only when there's a community that writes them. I don't need to read bot-created articles when the same articles exist, 1000 times better, in other languages I understand.
I don't see where I have suggested using a bot. Is that what you think I have in mind? Machine translation needs to be post-edited; that is a job for humans. It is always necessary to get the permission of the local Wikipedia community before embarking on such a project.
Perhaps you have the idea that smaller language Wikipedias are some kind of "cute experiment", something twee or exotic, rather than a useful resource for the given speech community? Well, that is up to you, but I'd disagree. I think it is valuable to have good articles on a wide range of topics, specialist and non-specialist, and for this, machine translation between closely-related languages can be useful.
When a small Wikipedia doesn't have a special field of interest AND there is no large group of speakers, then it is – in most cases – not worth supporting.
What on earth do you mean by a "special field of interest" ?
Yours,
Fran
Best regards, 32X
Hello,
Francis Tyers wrote:
On Fri, 2007-10-19 at 18:23 +0200, User 32X wrote:
The German Wikipedia is only about six times larger than the Volapük one. The latter is considered small.
This is a ridiculous statement. If you take any metric apart from sheer "page count" there is no comparison.
When I had a look at http://www.wikipedia.org/ I had to assume that there was some kind of comparison.
On the other hand, between closely related languages, machine translation is often very good, [...]
If there's actually a simple way of doing machine translation between two closely related languages, then why is there more than one Wikipedia for that language family?
When a language community decides that it speaks a language, not a dialect of another language, for whatever reason (for example national identity), they may want a Wikipedia of their own. Personally I don't think this is up to you to decide.
Okay, there's the wish (and it's really not up to me to decide), but why are they using the other content then? Having the same content several times is just a balkanization of the community.
The Serbian Wikipedia works in Cyrillic and Latin alphabets, but there's still only one Wiki that serves both forms.
There are currently four Wikipedias for what until a few years ago was called Serbo-Croatian. There is likely to be another one in the future. I think it would be wonderful to see them merged. I'll let you get started on that.
Nice try. See, the [[w:en:Little Rock Nine]] had to have an armed escort to enter Little Rock Central High School in 1957 – nearly a hundred years after the American Civil War.
The civil wars in the former Yugoslavia were only a decade ago and there's still too much potential for hate between these people. A combined Wikipedia would work for some of them, but not for all. My hope is that there'll some day be a portal-like Wikipedia in Serbo-Croatian which takes the content of the different Yugoslavian language versions and combines them somehow. (With the ability to easily switch between the same article in different languages.)
Maybe you don't care about smaller languages, [...]
Believe me, I do. [...] These small Wikipedias are often interesting, but only when there's a community that writes them. I don't need to read bot-created articles when the same articles exist, 1000 times better, in other languages I understand.
I don't see where I have suggested using a bot. Is that what you think I have in mind? Machine translation needs to be post-edited; that is a job for humans.
Okay, our positions are moving a bit closer together. After the recent activities on vo.wikipedia I'm a bit cautious when it comes to non-human content creation. As long as there's a community to work on that content, this shouldn't be that much of a problem. But still, provided the text is NPOV and not based on OR, these machine-based translations /could/ help to establish its contents.
When a small Wikipedia doesn't have a special field of interest AND there is no large group of speakers, then it is – in most cases – not worth supporting.
What on earth do you mean by a "special field of interest" ?
The previously mentioned pdc.wikipedia (obviously) has a lot of articles on German immigration in Pennsylvania. Many of these articles aren't in either the German or the English Wikipedia. These are the articles that make pdc.wikipedia worth reading; a dozen of these articles are more valuable than the 12,294 "articles" on German municipalities in the Volapük Wikipedia.
Best regards, 32X
On Sun, 2007-10-21 at 16:38 +0200, User 32X wrote:
Hello,
Francis Tyers wrote:
On Fri, 2007-10-19 at 18:23 +0200, User 32X wrote:
The German Wikipedia is only about six times larger than the Volapük one. The latter is considered small.
This is a ridiculous statement. If you take any metric apart from sheer "page count" there is no comparison.
When I had a look at http://www.wikipedia.org/ I had to assume that there was some kind of comparison.
There are better statistics here:
http://meta.wikimedia.org/wiki/List_of_Wikipedias
     Articles     Total       Edits  Admins   Users  Images
de     609770   1663501    35810679     283  425797  109499
vo      27686     30668      111338       5     173     368
But I saw some even better ones somewhere with statistics for article depth (e.g. how many edits per article), etc.
Page/article count is only one of the things that needs to be taken into account. For example, a wiki which has a page:edits ratio of 1:1 is not likely to have a very developed community.
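For illustration, the "how many edits per article" figure falls straight out of the table above; nothing here beyond the quoted numbers and a division:

# Edits per article and pages per article, from the figures quoted above.
stats = {
    "de": {"articles": 609770, "total": 1663501, "edits": 35810679},
    "vo": {"articles": 27686, "total": 30668, "edits": 111338},
}
for wiki, s in stats.items():
    print("%s: %.1f edits per article, %.2f pages per article"
          % (wiki, s["edits"] / s["articles"], s["total"] / s["articles"]))
# de comes out at roughly 58.7 edits per article, vo at roughly 4.0.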
On the other hand, between closely related languages, machine translation is often very good, [...]
If there's actually a simple way of doing machine translation between two closely related languages, then why is there more than one Wikipedia for that language family?
When a language community decides that it speaks a language, not a dialect of another language, for whatever reason (for example national identity), they may want a Wikipedia of their own. Personally I don't think this is up to you to decide.
Okay, there's the wish (and it's really not up to me to decide), but why are they using the other content then? Having the same content several times is just a balkanization of the community.
I agree :/
The Serbian Wikipedia works in Cyrillic and Latin alphabets, but there's still only one Wiki that serves both forms.
There are currently four Wikipedias for what until a few years ago was called Serbo-Croatian. There is likely to be another one in the future. I think it would be wonderful to see them merged. I'll let you get started on that.
Nice try. See, the [[w:en:Little Rock Nine]] had to have an armed escort to enter Little Rock Central High School in 1957 – nearly a hundred years after the American Civil War.
The civil wars in the former Yugoslavia were only a decade ago and there's still too much potential for hate between these people. A combined Wikipedia would work for some of them, but not for all. My hope is that there'll some day be a portal-like Wikipedia in Serbo-Croatian which takes the content of the different Yugoslavian language versions and combines them somehow. (With the ability to easily switch between the same article in different languages.)
That would be a great idea, but as I said, I don't see it happening any time soon :/
Maybe you don't care about smaller languages, [...]
Believe me, I do. [...] These small Wikipedias are often interesting, but only when there's a community that writes them. I don't need to read bot-created articles when the same articles exist, 1000 times better, in other languages I understand.
I don't see where I have suggested using a bot. Is that what you think I have in mind? Machine translation needs to be post-edited; that is a job for humans.
Okay, our positions are moving a bit closer together. After the recent activities on vo.wikipedia I'm a bit cautious when it comes to non-human content creation. As long as there's a community to work on that content, this shouldn't be that much of a problem. But still, provided the text is NPOV and not based on OR, these machine-based translations /could/ help to establish its contents.
Yes, quite right. A Wikipedia full of unchecked machine translations does a lot more harm than good. The same goes for a Wikipedia full of spam, or unchecked _human_ translations.
When a small Wikipedia doesn't have a special field of interest AND there is no large group of speakers, then it is – in most cases – not worth supporting.
What on earth do you mean by a "special field of interest" ?
The previously mentioned pdc.wikipedia (obviously) has a lot of articles on German immigration in Pennsylvania. Many of these articles aren't in either the German or the English Wikipedia. These are the articles that make pdc.wikipedia worth reading; a dozen of these articles are more valuable than the 12,294 "articles" on German municipalities in the Volapük Wikipedia.
I completely agree. Good articles are worth more than stubs, and that is what machine translation is really about. Anyone can write a bot that generates country/province stubs, even if they don't know the language.
But to translate anything more, for people who don't know the language, machine translation can help. Providing:
1) The community assents, and there are guidelines for the process.
2) There are people there to post-edit the output.
You can read some discussion of the benefits and drawbacks here:
http://af.wikipedia.org/wiki/Wikipedia:Geselshoekie/MT
I'm glad that after a bit of a heated conversation we're largely in agreement, and sorry if I came over a bit strong :)
Fran
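A minimal sketch of that workflow, assuming the apertium command-line tool and its es-ca language pair are installed locally; the maintenance tag at the top of the draft is invented for this example and is not a real template:

# Sketch of "machine translate, then hand over for post-editing": run the
# source text through Apertium and return a draft that is clearly marked
# as unreviewed, rather than publishing it directly.
import subprocess

def apertium_translate(text, pair="es-ca"):
    """Pipe text through the Apertium pipeline for the given language pair."""
    result = subprocess.run(["apertium", pair], input=text,
                            capture_output=True, text=True, check=True)
    return result.stdout

def draft_for_postediting(source_text, pair="es-ca"):
    draft = apertium_translate(source_text, pair)
    # A human post-editor still has to fix terminology and any words the
    # system could not translate (Apertium marks unknown words with '*').
    # The tag below is made up for this example, not an existing template.
    return "{{machine-translated draft, please post-edit}}\n\n" + draft

print(draft_for_postediting("El municipio tiene una población de 120 habitantes."))

The point is condition 2) above: the output is a starting draft for a human editor, not a finished article.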
Best regards, 32X
User 32X wrote:
Hello,
Sabine Cretella wrote:
Francis Tyers wrote:
Machine translation has a useful, and under-used, role to play in the development of smaller Wikipedias.
I completely agree with this statement :-)
You overestimate the power of machine translation. I've just translated http://en.wikipedia.org/wiki/Serbin with Babelfish into German; if I didn't already know what that article is about, I wouldn't have known it after reading the German "translation".
Try to translate from Spanish to Catalan using Apertium http://apertium.sourceforge.net
and then ask a Catalan speaker what he thinks ...
Babelfish is pure crap - they don't even know what machine translation means IMHO.
Apertium is programmed to translate from only one language to only one other language - for each language pair it follows very specific rules.
Also have a look at the Cherokee Wikipedia ... Jeff really did an amazing job with machine translation from English to Cherokee (I have had this confirmed by other people as well - I don't speak or understand Cherokee myself).
Cheers, Sabine
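To see why the pair-specific approach Sabine describes works so well between closely related languages, here is a deliberately tiny toy, not Apertium's actual machinery: between Spanish and Catalan the word order largely carries over, so even naive word-by-word lexical transfer already produces something readable. The little dictionary below is hand-picked for the one example sentence.

# Toy word-for-word "transfer" from Spanish to Catalan. This is NOT how
# Apertium works internally; it only illustrates why closely related
# pairs are good candidates for rule-based MT.
ES_CA = {
    "el": "el", "municipio": "municipi", "tiene": "té", "una": "una",
    "población": "població", "de": "de", "habitantes": "habitants",
}

def toy_transfer(sentence):
    out = []
    for word in sentence.rstrip(".").split():
        if word.isdigit():
            out.append(word)                      # numbers pass through
        else:
            # Unknown words are passed through marked with '*',
            # the same convention Apertium uses.
            out.append(ES_CA.get(word.lower(), "*" + word))
    return " ".join(out).capitalize() + "."

print(toy_transfer("El municipio tiene una población de 120 habitantes."))
# -> "El municipi té una població de 120 habitants."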
Hoi, Taking only English Wikipedia content and translating it is a certain way of bringing biased information to all nooks and crannies of this world. On the one hand it will definitely bring more information out, and that is GOOD. However, as the translation would be only one way, it is not possible to remedy the perceived and real biases.
This is translation using bots... it is not necessarily machine translation, and it is certainly not as good as machine translation gets.
Thanks, GerardM
On 10/9/07, Gregory Maxwell gmaxwell@gmail.com wrote:
On 10/9/07, David Gerard dgerard@gmail.com wrote:
If a small wiki decides to fill out its page count with largely untranslated articles from en:wp, that's really not building a wiki or a community, is it.
Exactly. If we wanted that we could adjust MediaWiki to fall through to the enwp text. ;)
And if we wanted a machine translation we wouldn't bother with the whole multiple wikis thing.
On 10/8/07, Brianna Laugher brianna.laugher@gmail.com wrote:
http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**? What on earth are they doing?!
In one word: crap:
Ermmmm sorry, but how much better is the "original" from en: ????
http://en.wikipedia.org/wiki/Manhattan%2C_Nevada
It is one huge template and 2 lines of text.
Waerth
On 10/8/07, Brianna Laugher brianna.laugher@gmail.com wrote:
http://meta.wikimedia.org/wiki/List_of_wikipedias#1_000_000.2B_articles
With only 5 admins and **250 users**? What on earth are they doing?!
In one word: crap:
http://vo.wikipedia.org/wiki/Manhattan_%28Nevada%29