-- Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי http://aharoni.wordpress.com “We're living in pieces, I want to live in peace.” – T. Moore
2018-02-28 1:03 GMT+02:00 Tim Landscheidt tim@tim-landscheidt.de:
Then of course there is the more fundamental problem: If those 100,000 monolingual speakers do not speak other languages, have no access to encyclopedias, etc., how do they interact with a computer now, which web sites do they visit, etc.?
Quite possibly, they don't visit any websites.
Can Wikipedia be a first website in a given language? Of course.
Who if not Wikipedia? In a lot of languages, the first, and sometimes the only, written work is a translation of the Bible or of the UDHR. (Reminder: The Bible was the first work published in a lot of European languages, too.) These are usually made by some kind of funded initiative that comes from religious or human rights organizations. Why shouldn't it be a translation of 10,000 Wikipedia articles? Why shouldn't it be an initiative from Wikimedia or another educational organization?
I just have a very hard time to imagine a community of 100,000 people under those circumstances who are only held back by not having access to a Wikipedia. On the contrary, this reminds me very much of traditional development practices where third world countries always seem to urgently need to buy what first world countries have to sell. IMHO, there is a considerable risk that this creates unhealthy dependencies.
Hey, if people don't want it, they don't have to read it, but I suspect that if you *let* people read useful information about geography, medicine, public policy, economics, etc., they will use it.
But in very simplified terms, I see it as a competition between the UN, JW, Facebook, and Wikimedia, and Wikimedia is hardly even participating. The UN is a fine organization, but not very useful in people's daily life. Religious materials' contribution to the development of publishing and literacy throughout history can't be denied, but the usefulness of their content can be questioned. Facebook is useful to a lot of people, and it can be localized easily, but it would be kind of depressing if that's the only thing that people do in their language. And Facebook is very actively trying to reach the farthest corners of the world and get people connected.
And this leaves Wikimedia, which is hardly doing anything proactive to get its materials *actually* written in more languages. We are making *technologies* for translation—Wikidata, Content Translation, and more—and they are used by thousands of translators to write in dozens of languages, but we are not doing anything proactive to expand the coverage of languages beyond the usual suspects: the 70 or so languages that John Erling mentioned in the email that started this thread. The ~70 big languages take care of themselves. We've been saying that the rest of the languages can take care of themselves, but that is naïve.