Hello all,
I would like to announce the release of MediaWiki Language Extension
Bundle 2014.07. This bundle is compatible with MediaWiki 1.23.x and
MediaWiki 1.22.x releases.
* Download: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2014.07.tar…
* sha256sum: 8af5c001db9375bf8dfd16495c7a88fc8dc9b4fe281b1048f6bea6c490bc4a9d
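To verify the download, compare the tarball's SHA-256 digest with the
value above. Here is a minimal Node.js sketch, assuming the bundle has
been saved locally (the filename below is a placeholder for whatever
name you saved it under):

    // Compute the tarball's SHA-256 and compare it to the published value.
    var crypto = require('crypto');
    var fs = require('fs');

    var expected = '8af5c001db9375bf8dfd16495c7a88fc8dc9b4fe281b1048f6bea6c490bc4a9d';
    var hash = crypto.createHash('sha256');

    // Placeholder filename; use the actual name of the downloaded file.
    fs.createReadStream('downloaded-bundle.tar.bz2')
        .on('data', function (chunk) { hash.update(chunk); })
        .on('end', function () {
            console.log(hash.digest('hex') === expected ? 'OK' : 'MISMATCH');
        });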
Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://bugzilla.wikimedia.org
* Talk with us at: #mediawiki-i18n @ Freenode
Release notes for each extension are below.
-- Kartik Mistry
== Babel, CleanChanges and LocalisationUpdate ==
* Only localisation updates.
== CLDR ==
* Localisation updates.
* Updated to CLDR 25 and fixed rebuild script.
* Added Simple English translation to Persian.
== Translate ==
=== Noteworthy changes ===
* Bug 67921: Store translatable page translation units in variable
form to improve translation memory suggestions.
* Display the source language for pages in Special:Translate.
* Fixed ElasticSearchTTMServer to not return matches for single-word
messages only.
* Changes in Special:PageMigration:
** Bug 66162: Simplistic alignment based on h2 headers.
** Bug 65942: Split headers from other wiki text in translation units.
** Added a script for preparing the page for translation.
== UniversalLanguageSelector ==
=== Noteworthy changes ===
* Stopped using the deprecated jquery.json module, which makes ULS
slightly smaller.
* Added support for the Rutul language.
=== Input Methods ===
* Added Ludic (lud) transliteration layout.
* Added Tibetan (bo) EWTS layout.
--
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com
Hi everyone,
Different chapters are looking into organizing next year's hackathon. One
of the recurring questions is how much it costs to organize such an
event. To make it easier to answer this question, we published the budget
of the Amsterdam hackathon at
https://www.mediawiki.org/wiki/Amsterdam_Hackathon_2013/Budget . I
believe the intention is to also publish the budgets of this year's
hackathon (Zürich) and the 2012 one (Berlin). That should give quite a
complete overview.
One word of warning: your costs might differ from ours. Take, for
example, internet and wireless: we didn't spend any money on that.
Maarten
Hi all,
frontend development is greatly hindered by not having logs of errors that
happen in production. If there is a mistake in a PHP file, it is usually
quickly caught after deployment when a large number of exceptions show up
in the error log. If the mistake is in a JS file, it can take a long time
until the error is reported and reproduced, especially if it only
happens under exotic conditions.
Many sites solve this issue by setting up an error handler in JavaScript
which reports any errors that occur to a logging server. I tried to make
a laundry list of things that need to be done or considered if we want to
set up such logging for Wikimedia sites and/or MediaWiki in general; I put
it up as a draft RfC at [1]. I would appreciate feedback on whether this is
plausible or worthwhile.
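To make the idea concrete, here is a minimal sketch of such a handler,
assuming a hypothetical /errorlog endpoint (the endpoint and payload
fields are placeholders, not part of the RfC):

    // Report uncaught JS errors to a logging endpoint.
    // '/errorlog' and its query parameters are hypothetical.
    window.onerror = function ( message, file, line ) {
        // An image beacon keeps the report independent of XHR, which
        // may itself be affected by the error.
        var img = new Image();
        img.src = '/errorlog?' +
            'msg=' + encodeURIComponent( message ) +
            '&file=' + encodeURIComponent( file ) +
            '&line=' + encodeURIComponent( line );
        // Returning false lets the browser log the error as usual.
        return false;
    };

A production version would also have to consider sampling or rate
limiting, so that a frequently triggered error cannot flood the logging
server.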
[1]
https://www.mediawiki.org/wiki/Requests_for_comment/Server-side_Javascript_…
Hi Everyone,
Please join me in welcoming Joel Sahleen as software engineer in the
Language Engineering team.
Joel joins us from Adobe where he has been a globalization engineer for the
past 5 years. He developed the internationalization and localization system
for PHP components used in the Adobe Marketing Cloud. He also helped build
a continuous integration system for localization that currently supports
many different products, programming languages and file formats. Prior to
becoming a techie and taking the job at Adobe, Joel did a long stint as a
Teaching Fellow in the Departments of East Asian Languages and Literatures
and Religious Studies at Stanford University. Joel taught courses on Early
Chinese Language, Thought, History and Culture and developed course
materials used for Advanced Classical Chinese. Joel is very interested in
the development of collaborative, online, multilingual texts and is looking
forward to advancing platform support for languages, scripts, encodings and
formats in the Wikimedia universe.
In Joel’s own words: “My main reason for wanting to join the WMF is that I
believe in its mission. I believe providing access to knowledge is one of
the best things you can do to help a person live up to his or her full
potential, and as the world becomes more interconnected and interdependent,
I believe it is essential that the pursuit of knowledge becomes more
community-driven and accessible to everyone. Language engineering is key to
making this collaborative effort possible and I am glad to be joining a
team and an organization that is actively working to make the world a
better place.” I couldn’t agree with him more.
Joel lives right outside Salt Lake City, Utah with his wife, two boys,
nephew and two dogs. He enjoys traveling, writing, painting and most of all
- learning new things.
He can be reached by email at jsahleen at wikimedia.org and on our IRC
channels, including #mediawiki, #wikimedia-dev and #mediawiki-i18n.
He will also be at Wikimania, so feel free to say hello!
Joel - I am excited to have you on the language engineering team :-)
Welcome!
-
Alolita
Alolita Sharma
आलोलिता शर्मा
Director of Engineering
Internationalization & Localization
Wikimedia Foundation
On 17/07/2014 16:43, Lydia Pintscher wrote:
> Hey :)
>
> Wikidata will soon be able to also maintain article badges like "good
> article" and "featured article".
I am very happy that we won't use {{Link FA}} anymore, but I'm very
skeptical about storing that information on Wikidata.
As I previously wrote on the bug page, the page quality rank does not
actually pertain to sitelinks.
I renew my proposal of an extension to store a quality rank of a page on
the client wiki itself, in a structured format that would also be made
available to sister projects and to the Wikibase repository.
Keeping them on Wikidata would increase redundancy with the client
sites, and surely some bots would have to periodically update badges on
the repo.
While I appreciate your hard work on badges, I strongly believe that they
shouldn't be used for FAs/GAs.
Hello everyone,
I am Diwanshi Pandey, an OPW intern. I'd like your feedback on the
course I have created on Codecademy for the MediaWiki API with the help
of my mentor, Yuri Astrakhan.
A little insight:
The course is about querying the MediaWiki API and parsing the results.
Initially we created one course which included 44 exercises, but
according to Codecademy's guidelines their courses are for beginners and
should have at most 30 exercises each. So we split it into two courses:
one is "Introduction to Wikipedia API"
<http://www.codecademy.com/courses/web-beginner-en-vj9nh/0/1> and the
other is "Wikipedia:Query API"
<http://www.codecademy.com/courses/web-beginner-en-yd3lp/0/1>.
Also, due to API security and restrictions, we couldn't implement a
tutorial on "editing wiki pages through an API call" from a non-wiki
site yet. We are waiting until we find a good and easy way to demo that.
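To give a flavour of what the exercises build up to, here is a minimal
query sketch (jQuery with JSONP so it can run cross-site; the page
title is just an example, and the courses' exact exercises may differ):

    // Query basic page info from the English Wikipedia API via JSONP.
    $.getJSON(
        'https://en.wikipedia.org/w/api.php?callback=?',
        {
            action: 'query',
            titles: 'Albert Einstein',
            prop: 'info',
            format: 'json'
        },
        function ( data ) {
            // The API returns pages keyed by page ID.
            console.log( data.query.pages );
        }
    );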
Feedback may include:
* Are the exercises easy to understand for novice users/developers?
* Are changes needed in the look of the exercises?
* Are there any exercises that are unnecessary or go into too much depth?
* Anything else?
Thanks,
--
*Regards,*
*Diwanshi Pandey*
Hi everyone,
I'm very pleased to welcome Kristen Lans to the Wikimedia Foundation, as
the first ScrumMaster [1][2] hire into the newly formed Team Practices
Group [3]. Kristen will be taking over ScrumMaster duties for the Mobile
Web and Mobile Apps Engineering teams.
Prior to joining WMF, Kristen worked for six years with the TED
prize-winning Encyclopedia of Life project [4], a free, open resource that
aims to provide access to knowledge about all life on Earth. Kristen helped
to pilot and facilitate the Encyclopedia of Life organization's agile
development and planning processes, and enjoyed working with the project's
global community of contributors. Kristen is thrilled to be continuing to
work towards open knowledge sharing for all at an even larger scale.
Kristen currently lives on Cape Cod, Massachusetts with her husband and
dog, and is planning to relocate to the Bay Area by the end of the year.
She is looking forward to eating her way through San Francisco's tasty
restaurants and taking advantage of the amazing outdoor activities in the
area.
[1] https://en.wikipedia.org/wiki/ScrumMaster#Scrum_Master
[2]
https://www.mediawiki.org/wiki/Wikimedia_Mobile_engineering/imported/Mobile…
[3]
https://www.mediawiki.org/wiki/Wikimedia_Engineering/Team_Practices_Group
[4] http://eol.org
--
Arthur Richards
Team Practices Manager
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
This is about whether it's OK for MediaWiki core to depend on other PHP
libraries, and how to manage such dependencies.
Background: A while back, I proposed a simple class for assertions to be added
to core[1]. It was then suggested[2] that this could be placed in a separate
component, which could then be re-used by others via Composer. Since the
assertions are very little code and nicely self-contained, this should be easy
to do.
However, if we want to use these in MediaWiki core, core would now depend on the
assertion component. That means that either MediaWiki would require installation
via Composer, or we have to bundle the library in some other way.
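For concreteness, the Composer route would mean declaring the dependency
in MediaWiki's composer.json, roughly like this (the package name and
version constraint are hypothetical, since no such package has been
published yet):

    {
        "require": {
            "wikimedia/assert": "~0.1"
        }
    }

Running composer install would then fetch the library into vendor/ and
register its classes with the autoloader.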
What's the best practice for this kind of thing? Shall we just make the
assertion repo a git submodule, and then pull and bundle it when making
tarball releases? Should we switch the generation of tarballs to using
Composer? Or should we require Composer-based installs in the future? Are
there other options?
Cheers,
Daniel
[1] https://www.mediawiki.org/wiki/Requests_for_comment/Assert
[2]
https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Assert#Use_outside…