The Wikimedia Language Engineering team is pleased to announce the
first release of the MediaWiki Language Extension Bundle. The bundle
is a collection of selected MediaWiki extensions needed by any wiki
that wants to be multilingual.
This first bundle release (2012.11) is compatible with MediaWiki 1.19,
1.20 and 1.21alpha.
Get it from https://www.mediawiki.org/wiki/MLEB
The Universal Language Selector is a must-have, because it provides
essential functionality for every user regardless of how many languages
they speak: language selection, font support for displaying scripts
that operating systems render poorly, and input methods for typing in
languages that don't use the Latin (a-z) alphabet.
Maintaining multilingual content in a wiki is a mess without the
Translate extension, which is used by Wikimedia, KDE and
translatewiki.net, where hundreds of pieces of documentation and
interface translations are updated every day. With Localisation Update
your users will always have the latest translations, fresh out of the
oven. The Clean Changes extension keeps your recent changes page
uncluttered by translation activity and other distractions.
Don't miss the chance to practice your rusty language skills: use the
Babel extension to mark the languages you speak and to find other
speakers of the same languages on your wiki. And finally, the cldr
extension provides a database of language and country name translations.
We aim to make a new release every month, so that you can easily stay
on the cutting edge of the constantly improving language support. The
bundle comes with clear installation and upgrade instructions, and it
is tested against MediaWiki release versions, so you can avoid most of
the temporary breakage you would encounter if you used the latest
development versions instead.
Because this is our first release, there may be some rough edges.
Please give us plenty of feedback so that we can improve the next
release.
-Niklas
--
Niklas Laxström
Since to my mind this is a very interesting topic, I searched a bit more.
https://www.w3.org/International/articles/article-text-size.en
which quotes
http://www-01.ibm.com/software/globalization/guidelines/a3.html
According to the latter, for English source strings of more than 70
characters you can expect translations to average about 130% of the
original length. So, by an admittedly very loose inference, a
400-character limit for all languages is equivalent to a
400 / 1.3 ≈ 307 character limit for English. Would you say that a
307-character limit would be acceptable there?
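For a quick sanity check, here is a minimal Lua sketch of that
arithmetic (the 1.3 factor is the IBM figure quoted above; the other
factors are just hypothetical points of comparison):

    -- English-equivalent limit for a shared 400-character cap,
    -- given an average expansion factor for translated strings.
    local cap = 400
    for _, factor in ipairs({ 1.3, 1.5, 2.0 }) do
        print(string.format("%.1fx expansion -> %d English chars",
            factor, math.floor(cap / factor)))
    end

With the 1.3 factor this gives the 307 figure used above.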
On 29/12/2016 at 12:11, mathieu stumpf guntz wrote:
>
>
> On 28/12/2016 at 23:08, Yuri Astrakhan wrote:
>> The 400 char limit is to stay in sync with Wikidata, which has the
>> same limitation. The origin of this limit is to encourage storage of
>> "values" rather than full strings (sentences).
> Well, that's probably not the best constraint for a glossary then. To
> my mind, a 400 char limit regardless of the language is rather
> surprising. Surely you can say much more with a set of 400 ideograms
> than with, well, whatever language happens to have the longest average
> sentence length (any idea?). Also, at least for some translation
> pairs, there is a tendency for translations to be longer than the
> original[1].
>
> [1] http://www.sid.ir/en/VEWSSID/J_pdf/53001320130303.pdf
>> Also, it discourages storage of wiki
>> markup.
> What about disallowing it explicitly? You might even enforce that with
> a quick parse that prevents saving, or simply show a reminder when
> such a string is detected, to avoid blocking users in legitimate
> corner cases.
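> For instance, a quick heuristic could be a Lua pattern match along
> these lines (a hypothetical sketch, catching only the most common
> wikitext constructs):
>
>     -- Returns true if the string looks like wiki markup.
>     local function looksLikeWikitext(s)
>         return s:find("%[%[") ~= nil        -- [[links]]
>             or s:find("{{") ~= nil          -- {{templates}}
>             or s:find("''") ~= nil          -- ''italic'' / '''bold'''
>             or s:find("<%a+[^>]*>") ~= nil  -- HTML-style tags
>     end
>
> The storage layer could then refuse the save, or merely warn, when
> this returns true.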
>
>>
>> On Wed, Dec 28, 2016, 16:45 mathieu stumpf guntz <
>> psychoslave(a)culture-libre.org> wrote:
>>
>>> Thank you Yuri. Is there some rational explanation behind these
>>> limits? I understand the limit as a performance concern, and 2 MB
>>> seems already very large for the intended glossaries. But 400 chars
>>> might be problematic for some definitions I guess, especially since
>>> translations can lead to varying length needs.
>>>
>>>
>>> On 25/12/2016 at 17:03, Yuri Astrakhan wrote:
>>>> Hi Mathieu, yes, I think you can totally build up this glossary in a
>>>> dataset. Just remember that each string can be no longer than 400
>>>> chars, and the total size must be under 2 MB.
>>>>
>>>> On Sun, Dec 25, 2016, 10:45 mathieu stumpf guntz <
>>>> psychoslave(a)culture-libre.org> wrote:
>>>>
>>>>> Hi Yuri,
>>>>>
>>>>> Seems very interesting. Am I wrong in thinking this could help to
>>>>> create a multilingual glossary as drafted in
>>>>> https://phabricator.wikimedia.org/T150263#2860014 ?
>>>>>
>>>>>
>>>>> On 22/12/2016 at 20:30, Yuri Astrakhan wrote:
>>>>>> Gift season! We have launched structured data on Commons, available
>>>>>> from all wikis.
>>>>>>
>>>>>> TL;DR: One data store, usable everywhere. Upload table data to
>>>>>> Commons, with localization, and use it to create wiki tables or
>>>>>> lists, or use it directly in graphs. Works for GeoJSON maps too.
>>>>>> It must be licensed as CC0. Try this per-state GDP map demo, and
>>>>>> select multiple years. More demos at the bottom.
>>>>>> US Map state highlight
>>>>>> <https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
>>>>>>
>>>>>>
>>>>>> Data can now be stored as *.tab and *.map pages in the Data
>>>>>> namespace on Commons. That data may contain localization, so a
>>>>>> table cell can be in multiple languages. And that data is
>>>>>> accessible from any wiki, by Lua scripts, Graphs, and Maps.
>>>>>>
>>>>>> Lua lets you generate wiki tables from the data by filtering,
>>>>>> converting, mixing, and formatting the raw data. Lua also lets you
>>>>>> generate lists. Or any wiki markup.
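>>>>>>
>>>>>> For example, a minimal Scribunto module might look like this (a
>>>>>> sketch assuming the mw.ext.data.get Lua binding and the demo
>>>>>> weather page; adjust the page name and columns to your data):
>>>>>>
>>>>>>     local p = {}
>>>>>>     -- Render the first column of a Commons tabular page as a list.
>>>>>>     function p.list( frame )
>>>>>>         local tab = mw.ext.data.get( 'Weather/New_York_City.tab' )
>>>>>>         local out = {}
>>>>>>         for _, row in ipairs( tab.data ) do
>>>>>>             out[#out + 1] = '* ' .. tostring( row[1] )
>>>>>>         end
>>>>>>         return table.concat( out, '\n' )
>>>>>>     end
>>>>>>     return p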
>>>>>>
>>>>>> Graphs can use both .tab and .map directly to visualize the data
>>>>>> and let users interact with it. The GDP demo above uses a map from
>>>>>> Commons, and colors each segment based on a data table.
>>>>>>
>>>>>> Kartographer (<maplink>/<mapframe>) can use the .map data as an
>>>>>> extra layer on top of the base map. This way we can show endangered
>>>>>> species' habitat.
>>>>>> == Demo ==
>>>>>> * Raw data example
>>>>>> <https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab>
>>>>>> * Interactive Weather data
>>>>>> <https://en.wikipedia.org/wiki/Template:Graph:Weather_monthly_history>
>>>>>>
>>>>>> * Same data in Weather template
>>>>>> <https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo>
>>>>>> * Interactive GDP map
>>>>>> <https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
>>>>>>
>>>>>> * Endangered Jemez Mountains salamander - habitat
>>>>>> <https://en.wikipedia.org/wiki/Jemez_Mountains_salamander#/maplink/0>
>>>>>>
>>>>>> * Population history
>>>>>> <https://en.wikipedia.org/wiki/Template:Graph:Population_history>
>>>>>> * Line chart <https://en.wikipedia.org/wiki/Template:Graph:Lines>
>>>>>>
>>>>>> == Getting started ==
>>>>>> * Try creating a page at data:Sandbox/<user>.tab on Commons. Don't
>>>>>> forget the .tab extension, or it won't work. (A minimal example
>>>>>> follows below.)
>>>>>> * Try using some data with the Line chart graph template.
>>>>>> A thorough guide is needed; help is welcome!
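>>>>>>
>>>>>> For reference, a minimal .tab page is a JSON document along these
>>>>>> lines (a sketch; see the tabular help link below for the full
>>>>>> schema):
>>>>>>
>>>>>>     {
>>>>>>         "license": "CC0-1.0",
>>>>>>         "description": { "en": "Sandbox table" },
>>>>>>         "schema": { "fields": [
>>>>>>             { "name": "year", "type": "number" },
>>>>>>             { "name": "value", "type": "number" }
>>>>>>         ] },
>>>>>>         "data": [ [ 2015, 10 ], [ 2016, 12 ] ]
>>>>>>     }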
>>>>>>
>>>>>> == Documentation links ==
>>>>>> * Tabular help <https://www.mediawiki.org/wiki/Help:Tabular_Data>
>>>>>> * Map help <https://www.mediawiki.org/wiki/Help:Map_Data>
>>>>>> If you find a bug, create a Phabricator ticket with the
>>>>>> #tabular-data tag, or comment on the documentation talk pages.
>>>>>>
>>>>>> == FAQ ==
>>>>>> * Relation to Wikidata: Wikidata is about "facts" (small pieces of
>>>>>> information). Structured data is about "blobs": large amounts of
>>>>>> data like the historical weather or the outline of the state of
>>>>>> New York.
>>>>>>
>>>>>> == TODOs ==
>>>>>> * Add a nice "table editor": editing JSON by hand is cruel. T134618
>>>>>> * "What links here" should track data usage across wikis. This will
>>>>>> also allow quicker auto-refresh of the pages. T153966
>>>>>> * Support data redirects. T153598
>>>>>> * Mega epic: support external data feeds.
I was informed that system messages are no longer being updated daily on
Wikimedia wikis. I don't know why or whether it is intentional, so I
opened a task on the specific issue:
https://phabricator.wikimedia.org/T154302
Please follow updates there.
Nemo
Hello everybody,
Following the suggestion of Andre Klapper
<https://phabricator.wikimedia.org/T150933#2886290>, I'm turning to this
set of lists to see whether it can attract more feedback on the topic of
internationalized programming facilities within the WM environment
<https://phabricator.wikimedia.org/T150933>.
As described more extensively in the ticket, the idea is to implement
internationalization facilities (if they don't exist yet) in the
compilers used in the WM infrastructure, to enable contributors to
localize them (possibly through Translatewiki), and then to let them use
the localized versions if they wish.
Please let me know if you need more details or if you have any questions.
You can answer on the list or on phabricator, as you wish.
Kind regards,
Mathieu
Hi all!
When developing the RevisionSlider extension [1] earlier this year, we
(at Wikimedia Germany) learned quite a few things about making extensions
and gadgets accessible to RTL-language users. We have just published a
write-up [2] summarizing what we've discovered.
We hope that people developing new and existing tools, and in general
developers sensitive to RTL accessibility issues, will find these notes
interesting and/or helpful.
Please note that this write-up is a summary of lessons learned in the
case of this particular extension. It is not meant to serve as a general
and extensive guideline or manual on these topics (there are already docs
on directionality support on mediawiki.org [3], which I can only
recommend).
Any comments are welcome!
[1] https://www.mediawiki.org/wiki/Extension:RevisionSlider
[2] https://www.mediawiki.org/wiki/Extension:RevisionSlider/Developing_a_RTL-accessible_feature_in_MediaWiki_-_what_we%27ve_learned_while_creating_the_RevisionSlider
[3] https://www.mediawiki.org/wiki/Directionality_support
Best,
--
Leszek Manicki
Software Developer
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
http://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. That's our commitment.
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
[x-posted announcement]
Hello,
The next online office hour session of the Wikimedia Language team is
scheduled for next Wednesday, December 7th, 2016 at 13:00 UTC. The
session will be an online discussion over Google Hangouts/YouTube with a
simultaneous IRC conversation. Due to the limitations of Google Hangouts,
only a limited number of participation slots are available, so please
let us know if you would like to join the Hangout. During the session,
we will be taking questions from viewers only on the IRC channel
#wikimedia-office. The channel will be open for interaction during the
session.
Our last online round-table session was held on September 21, 2016. You can
watch the recording here: https://www.youtube.com/watch?v=NXgMZ7myEA4
Please read below for the event details, including local time and YouTube
session links, and do let us know if you have any questions.
Thank you
Runa
== Details ==
# Event: Wikimedia Language team's office hour session
# When: December 7, 2016 (Wednesday) at 13:00 UTC (check local time
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20161207T1300)
# Where: https://www.youtube.com/watch?v=n2PgVNSmohE and on IRC
#wikimedia-office (Freenode)
# Agenda:
Updates from the Language team and Q & A.
--
Language Engineering Manager
Wikimedia Foundation