The Wikimedia Language Engineering team is pleased to announce the
first release of the MediaWiki Language Extension Bundle. The bundle
is a collection of selected MediaWiki extensions needed by any wiki
that wants to be multilingual.
This first bundle release (2012.11) is compatible with MediaWiki 1.19,
1.20 and 1.21alpha.
Get it from https://www.mediawiki.org/wiki/MLEB
The Universal Language Selector is a must-have, because it provides
essential functionality for every user, regardless of how many
languages they speak: language selection, font support for displaying
scripts poorly supported by operating systems, and input methods for
typing in languages that don't use the Latin (a-z) alphabet.
Maintaining multilingual content in a wiki is a mess without the
Translate extension, which is used by Wikimedia, KDE and
translatewiki.net, where hundreds of pieces of documentation and
interface translations are updated every day. With Localisation Update,
your users will always have the latest translations fresh out of the
oven. The Clean Changes extension keeps your recent changes page
free of clutter from translation activity and other distractions.
Don't miss the chance to practice your rusty language skills: use the
Babel extension to mark the languages you speak and to find other
speakers of the same languages on your wiki. And finally, the cldr
extension is a database of language and country name translations.
We aim to make new releases every month, so that you can easily stay
on the cutting edge of the constantly improving language support. The
bundle comes with clear installation and upgrade instructions. The
bundle is tested against MediaWiki release versions, so you can avoid
most of the temporary breakage you would see if you were using the
latest development versions instead.
Because this is our first release, there may be some rough edges.
Please give us plenty of feedback so that we can improve the next
release.
-Niklas
--
Niklas Laxström
Given the recent discussions on how to deal with person names in
Wikidata (e.g. how many properties to use, how to handle scripts,
automatic vs. manual labels/aliases/descriptions...) and the importance
username display has in MediaWiki (e.g. gendered namespaces, log system
restructure since 1.19, ...), it may be useful for someone to read this
thesis and summarise it to our benefit. :)
http://ulir.ul.ie/handle/10344/3450
«If a system does not possess the ability to capture, store, and
retrieve people names, according to their cultural requirements, it is
less likely to be acceptable on the international market.
Internationalisation of people names could reduce the probability of a
person’s name being lost in a system, avoiding frustration, saving time,
and possibly money. This study attempts to determine the extent to which
the human name can be internationalised, based upon published
anthroponymic data for 148 locales, by categorising them into eleven
distinctly autonomous parts: definite article, common title, honorific
title, nickname, by-name, particle, forename, patronymic or matronymic,
surname, community name, and generational marker. This paper provides an
evaluation of the effectiveness of internationalising people names;
examining the challenges of terminology conflicts, the impact of
subjectivity whilst pigeonholing personyms, and the consequences of
decisions made. It has demonstrated that the cultural variety of human
names can be expressed with the Locale Data Mark-up Language for 74% of
the world’s countries. This study, which spans 1,919 anthroponymic
syntactic structures, has also established, through the use of a unique
form of encoding, that the extent to which the human name can be
internationalised is 96.31% of the data published by Plassard (1996) and
Interpol (2006). Software developers, localisation engineers, and
database administrators may benefit from this paper, through recognition
of this problem and understanding the potential gains from accurately
handling people names within a system. The outcome of this study opens
up opportunities for future research into cultural name mapping that may
further enhance the Common Locale Data Repository.»
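The eleven name parts enumerated in the abstract can be modeled as a
simple data structure. A minimal sketch in Python (field names are taken
from the abstract; the example name and the rendering order are invented
for illustration, since real orderings would come from CLDR/LDML locale
data):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PersonName:
    """The eleven 'distinctly autonomous parts' of a person's name,
    as categorised in the thesis abstract quoted above."""
    definite_article: Optional[str] = None
    common_title: Optional[str] = None
    honorific_title: Optional[str] = None
    nickname: Optional[str] = None
    by_name: Optional[str] = None
    particle: Optional[str] = None
    forename: Optional[str] = None
    patronymic_or_matronymic: Optional[str] = None
    surname: Optional[str] = None
    community_name: Optional[str] = None
    generational_marker: Optional[str] = None

    def render(self, order: list) -> str:
        """Join the non-empty parts in a locale-specific order.
        The order list is illustrative, not taken from the thesis."""
        parts = [getattr(self, f) for f in order]
        return " ".join(p for p in parts if p)

# Illustrative example: an Icelandic-style name using a patronymic.
name = PersonName(forename="Björk", patronymic_or_matronymic="Guðmundsdóttir")
print(name.render(["forename", "patronymic_or_matronymic"]))
```

The point of such a model is that fields a locale does not use simply
stay empty, and only the rendering order changes per locale.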
Within search we are moving forward with cross-wiki searching.
This means that if you go to de.wikipedia.org and search for
"Ревест-Сен-Мартен", we should show a message like:
No results found on the German Wikipedia
Showing results from the Russian Wikipedia
Except of course, it should be (beware machine translation):
Keine Ergebnisse auf dem deutschen Wikipedia gefunden
Zeige Ergebnis aus der russischen Wikipedia
So somehow we have to map from a wiki id (ruwiki, enwiki, dewiki, etc.)
to a localized version of that project name (localized not just for
that wiki, but for every other wiki, in all languages). How can we go
about this?
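One possible shape for this (the table and function names below are
illustrative, not an existing MediaWiki API): keep one translatable
message per wiki id, keyed by the reader's interface language, and fall
back to English, then to the bare wiki id, when no translation exists.
A minimal sketch in Python:

```python
# Hypothetical message table: wiki id -> UI language -> localized
# project name. In MediaWiki this kind of data would live in
# localisable interface messages, translated like any other message.
PROJECT_NAMES = {
    "ruwiki": {
        "en": "the Russian Wikipedia",
        "de": "der russischen Wikipedia",
    },
    "dewiki": {
        "en": "the German Wikipedia",
        "de": "der deutschen Wikipedia",
    },
}

def localized_project_name(wiki_id: str, ui_lang: str) -> str:
    """Resolve a wiki id to a project name in the reader's UI
    language, falling back to English and finally to the raw id."""
    translations = PROJECT_NAMES.get(wiki_id, {})
    return translations.get(ui_lang) or translations.get("en") or wiki_id

print(localized_project_name("ruwiki", "de"))  # der russischen Wikipedia
```

The fallback chain matters: translators will never cover every wiki id
in every language, so the English name (or the raw id) has to be an
acceptable last resort.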
Hello,
A reminder that the online office hour hosted by the Wikimedia Language
Engineering team is scheduled for later today at 1500 UTC. You can join the
hangout or watch the session from:
https://plus.google.com/u/0/events/c1c0msurhua7enopsu3q8l42j3s
Please note that due to the limitations of Google Hangouts, only a few
seats are available, so do let us know beforehand if you would like to
participate in the hangout. We will also be on the IRC channel
#wikimedia-office to take questions. Please see below for the event
details, local time, and the original announcement.
Thanks
Runa
== Details ==
# Event: Wikimedia Language Engineering office hour session
# When: September 16th, 2015 (Wednesday) at 15:00 UTC (check local time
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20150916T1500)
# Where: https://plus.google.com/u/0/events/c1c0msurhua7enopsu3q8l42j3s and
on IRC #wikimedia-office (Freenode)
# Agenda: Content Translation updates and Q & A
---------- Forwarded message ----------
From: Runa Bhattacharjee <rbhattacharjee(a)wikimedia.org>
Date: Tue, Sep 8, 2015 at 5:21 PM
Subject: Language Engineering office hour and online meeting on 16
September 2015 (Wednesday) at 1500 UTC
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>, MediaWiki
internationalisation <mediawiki-i18n(a)lists.wikimedia.org>, "Wikimedia &
GLAM collaboration [Public]" <glam(a)lists.wikimedia.org>, Wikimedia Mailing
List <wikimedia-l(a)lists.wikimedia.org>
[x-posted announcement]
Hello,
The next office hour of the Wikimedia Language Engineering team is
scheduled for next Wednesday, September 16th at 15:00 UTC. As with our
last office hour, we are hosting it as an online discussion over
Hangouts/YouTube with a simultaneous IRC conversation. Due to the
limitations of Google Hangouts, only a limited number of slots are
available, so do let us know (on the event page
<https://plus.google.com/u/0/events/c1c0msurhua7enopsu3q8l42j3s>) if you
would like to participate in the Hangout. The IRC channel
#wikimedia-office and the Q&A channel for the YouTube broadcast will be
open for interaction during the session.
Our last online round-table session was held in June 2015. You can watch
the recording here: https://www.youtube.com/watch?v=vbXyHmpZJGE
Please read below for the event details and do let us know if you have any
questions.
Thank you
Runa
== Details ==
# Event: Wikimedia Language Engineering office hour session
# When: September 16th, 2015 (Wednesday) at 15:00 UTC (check local time
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20150916T1500)
# Where: https://plus.google.com/u/0/events/c1c0msurhua7enopsu3q8l42j3s and
on IRC #wikimedia-office (Freenode)
# Agenda: Content Translation updates and Q & A
--
Language Engineering Manager
Outreach and QA Coordinator
Wikimedia Foundation
extensions/BoilerPlate is now a standalone extension, so we can run
automated tests on it.
So I merged the patch removing the same files from extensions/examples
(which isn't one extension but a hodge-podge of files and
subdirectories). However, I noticed that extensions/BoilerPlate/i18n
has different i18n files than extensions/examples/BoilerPlate/i18n had.
Does translatewiki.net realize the same strings are now in a different
place? If not, is there some way to tell it about the new location?
Thanks,
--
=S Page WMF Tech writer