Hi everyone,
Please join me in welcoming Sucheta Ghoshal as an associate software engineer
on the Language Engineering team. Sucheta will be working on
internationalization and localization features for Wikimedia sites, as well
as maintaining other language engineering software using JavaScript,
related libraries, and MediaWiki.
Sucheta loves to code! She has been contributing to MediaWiki as a
volunteer developer for more than a year. She worked on Language
Coverage Matrix visualizations and contributed to EtherEditor as an intern
in the Outreach Program for Women (OPW) earlier this year. She also
participated in the Open Source Language Summit in Pune in February 2013.
Sucheta has been an open source contributor since high school and has
actively contributed to various open source projects, including Fedora and Mozilla.
She has been an active Wikimedian contributing in Bengali and has
participated as a member of Wikimedia Kolkata for over 3 years now. Sucheta
also claims to be a bookworm, cine buff and musician in her spare time.
She can be reached on email at sghoshal at wikimedia.org or as ‘sucheta’ on
our irc channels including #mediawiki, #wikimedia-dev and #mediawiki-i18n.
Welcome Sucheta! I am excited to have you on the language engineering team!
-Alolita
--
Alolita Sharma
Director of Engineering
Language Engineering (i18n/L10n)
Wikimedia Foundation
WebKit has added native support for the 'srcset' attribute on <img>
elements[1]. This means that future versions of Safari (and hopefully
Chrome) will automatically load high-resolution thumbnails in MediaWiki
pages when using a high-density display.
[1]
https://www.webkit.org/blog/2910/improved-support-for-high-resolution-displ…
We added 'srcset' entries to MediaWiki image output last year, along with
JavaScript fallback code to handle existing browsers. New browsers with
native support will have three advantages over the previous behavior:
1) Works even with JavaScript off.
2) Avoids extra network transfers by only downloading the right size images
for your screen.
3) Avoids brief 'flash' of low-resolution thumbnails before the JavaScript
triggers.
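For illustration, the selection among density descriptors that native
implementations perform can be sketched roughly like this. This is a
simplified model, not WebKit's actual algorithm; real browsers also treat the
plain src attribute as an implicit 1x candidate, which is omitted here:

```python
def pick_srcset_candidate(srcset, device_pixel_ratio):
    """Pick a URL from a srcset string of density descriptors.

    A simplified sketch of browser-side candidate selection:
    prefer the smallest density that still satisfies the display,
    otherwise fall back to the largest density available.
    """
    candidates = []
    for entry in srcset.split(","):
        url, _, descriptor = entry.strip().partition(" ")
        # A missing descriptor means an implicit 1x candidate.
        density = float(descriptor.strip().rstrip("x")) if descriptor else 1.0
        candidates.append((density, url))
    suitable = [c for c in candidates if c[0] >= device_pixel_ratio]
    density, url = min(suitable) if suitable else max(candidates)
    return url

# On a 2x (Retina-class) display, the 2x thumbnail is chosen:
print(pick_srcset_candidate("thumb.jpg 1.5x, thumb-2x.jpg 2x", 2))
```

On a 1x display the same call returns the 1.5x candidate, since it is the
smallest one that still meets or exceeds the screen density.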
I've confirmed on a current WebKit nightly on my Retina MacBook Pro that
this works as expected -- our fallback code automatically disables itself
and the native support handles things without a hitch!
After a few months this feature should trickle down to release versions of
Chrome and Safari on mobile devices and laptops, so everybody will get
slightly smoother page loads on their iDroids. :)
-- brion
Timezone-appropriate greeting, wikitech!
I've been working on a new extension, BetaFeatures[0]. A lot of you have
heard about it through the grapevine, and for the rest of you, consider
this an announcement for the developers. :)
The basic idea of the extension is to let features be enabled
experimentally on a wiki, on an opt-in basis, instead of launching
them immediately, sometimes hidden behind a checkbox that has no special
meaning in the interface. It also has a lot of cool design work on top
of it, courtesy of Jared and May of the WMF design team, so thanks very
much to them. There are still a few[1] things[2] we have to build out,
but overall the extension is looking pretty nice so far.
I am of course always soliciting advice about the extension in general,
but in particular, we have a feature request for the preference fields that
has been giving me a bit of trouble. We want to put a count of the users who
have each preference enabled on the page, but we don't want to, say, crash
the site with long SQL queries. Our theories thus far have been:
* Count all rows (grouped) in user_properties that correspond to properties
registered through the BetaFeatures hook. Potentially a lot of rows,
but we have at least decided to use an "IN" query, as opposed to "LIKE",
which would have been an outright disaster. Obviously: Caching. Caching
more would lead to more of the below issues, though.
* Fire off a job, every once in a while, to update the counts in a table
that the extension registers. Downsides: Less granular, sort of fakey
(since one of the subfeatures will be incrementing the count, live,
when a user enables a preference). Upside: Faster.
* Update counts with simple increment/decrement queries. Upside: Blazingly
fast. Potential downside: Might get out of sync. Maybe fire off jobs
even less frequently, to ensure it's not always out of date in weird
ways?
So my question is, which of these are best, and are there even better
ways out there? I love doing things right the first time, hence my asking.
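For what it's worth, the third option (live increments backed by a periodic
reconciliation job) can be sketched like this. All names here are
illustrative stand-ins, not the extension's actual schema; dicts stand in for
the user_properties table and the counts table:

```python
# Sketch of increment/decrement counting with periodic reconciliation.
# `user_properties` and `counts` are in-memory stand-ins for the real tables.

user_properties = {}   # (user_id, pref) -> enabled flag
counts = {}            # pref -> live counter

def set_preference(user_id, pref, enabled):
    """Flip a user's preference and adjust the live count in the same step."""
    was_enabled = user_properties.get((user_id, pref), False)
    user_properties[(user_id, pref)] = enabled
    if enabled and not was_enabled:
        counts[pref] = counts.get(pref, 0) + 1
    elif was_enabled and not enabled:
        counts[pref] = counts.get(pref, 0) - 1

def reconcile():
    """Periodic job: recount from the source of truth to repair any drift."""
    fresh = {}
    for (user_id, pref), enabled in user_properties.items():
        if enabled:
            fresh[pref] = fresh.get(pref, 0) + 1
    counts.clear()
    counts.update(fresh)

set_preference(1, "multimedia-viewer", True)
set_preference(2, "multimedia-viewer", True)
set_preference(1, "multimedia-viewer", False)
print(counts["multimedia-viewer"])  # 1
```

The trade-off is exactly the one described above: the live counter is cheap
but can drift (e.g. if an update is lost), and the reconciliation job bounds
how long any drift can persist.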
[0] https://www.mediawiki.org/wiki/Extension:BetaFeatures
[1] https://mingle.corp.wikimedia.org/projects/multimedia/cards/2
[2] https://mingle.corp.wikimedia.org/projects/multimedia/cards/21
P.S. One of the first features that we'll launch with this framework is
the "MultimediaViewer" extension which is also under[3] development[4]
as we speak. Exciting times for the Multimedia team!
[3] https://mingle.corp.wikimedia.org/projects/multimedia/cards/8
[4] https://mingle.corp.wikimedia.org/projects/multimedia/cards/12
--
Mark Holmquist
Software Engineer, Multimedia
Wikimedia Foundation
mtraceur(a)member.fsf.org
https://wikimediafoundation.org/wiki/User:MHolmquist
Now that we push/force/encourage https:// connections, are we going to do
anything about the http:// addresses in the emails from the wikis that
alert to changes? I know that there is a bugzilla for this, and thought
that it would have been handled at the same time as the conversion of the
login.
Regards, Billinghurst
http://gracehopper.org/2013/conference/grace-hopper-open-source-day/
"The purpose of Grace Hopper Open Source Day is to give attendees of the
conference and some of our friends from local universities the
opportunity to code, network and contribute to the greater social good."
I was going to lead this. But if possible, I'd like to concentrate on my
sabbatical and give someone else a chance to guide these folks (mostly
women who are studying computer science as grad students or
undergraduates). I can give you instructions and task lists for how to
help get these MediaWiki/Wikimedia newbies started. And WMF can pay for you
to get to Minneapolis and stay overnight for a few nights. (The closer
you are, the better.)
Can you help?
I'd like to get this taken care of by the end of this Friday; please
email me off-list if you're interested. Thanks.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
As multilingual content grows, interlanguage links are becoming longer on
Wikipedia articles. Articles such as "Barack Obama" or "Sun" have more than
200 links, and that becomes a problem for users who often switch among
several languages.
As part of the future plans for the Universal Language Selector, we were
considering the following:
- Show only a short list of the languages relevant to the user, based on
geo-IP, previous choices and browser settings of the current user. The
language users are looking for will be there most of the time.
- Include a "more" option to access the rest of the languages for which
the content exists with an indicator of the number of languages.
- Provide a list of the rest of the languages that users can easily scan
(grouped by script and region so that alphabetical ordering is possible)
and search (allowing users to search for a language name in another language,
using ISO codes, or even making typos).
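To illustrate the typo-tolerant search idea, here is a minimal sketch; the
language table and the typo threshold are made up for the example, and a real
implementation would also index autonyms and translated language names:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def search_languages(query, names, max_typos=2):
    """Return language codes matching the query by ISO code,
    name prefix, or small edit distance (typo tolerance)."""
    q = query.lower()
    hits = []
    for code, name in names.items():
        n = name.lower()
        if q == code or n.startswith(q) or edit_distance(q, n) <= max_typos:
            hits.append(code)
    return hits

# Illustrative table; a real selector would cover every content language.
languages = {"bn": "Bengali", "gu": "Gujarati", "en": "English"}
print(search_languages("Bengli", languages))  # ['bn'] despite the typo
```

Grouping by script and region, as proposed above, would then only affect how
the scannable list is ordered, not this matching logic.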
I have created a prototype <http://pauginer.github.io/prototype-uls/#lisa> to
illustrate the idea. Since it is not connected to the MediaWiki backend,
it lacks the advanced capabilities described above, but you can get the idea.
If you are interested in the missing parts, you can check the flexible
search and the list of likely languages (the "common languages" section) in the
language selector used at http://translatewiki.net/, which is connected to
the MediaWiki backend.
As part of the testing process for the ULS language settings, I included a
task to also test the compact interlanguage designs. Users seem to
understand their use (view the
recording<https://www.usertesting.com/highlight_reels/qPYxPW1aRi1UazTMFreR>),
but I wanted to get some feedback on changes affecting such an important
element.
Please let me know if you see any concerns with this approach.
Thanks
--
Pau Giner
Interaction Designer
Wikimedia Foundation
Hi everyone,
Please join me in welcoming Kartik Mistry as Software Engineer on the
Language Engineering team. Kartik started contributing part-time to the
team earlier this May. He will now be focusing full-time on improving our
jQuery internationalization libraries, specializing in input tools as
well as fonts. Kartik brings in-depth expertise in Indic font development
and Gujarati language content translation tools to the team.
Kartik is well known in India’s open source community for his many
contributions to Debian. He has been a Debian developer and package
maintainer since August 2008 and has contributed deeply to the
internationalization and localization of various packages and of the Debian
installer for Gujarati. Kartik actively maintains about 45 packages for
Debian including aspell-gu, fortune-debian-hints, nginx as well as Aakar,
Rekha and Kalapi fonts for Gujarati. He is a Wikipedian actively
contributing in Gujarati and English as well as on Commons. He also
contributes as a MediaWiki localizer through translatewiki.net.
Kartik's journey with open source started in 2004, and he believes his ultimate
goal is to keep contributing to various open source projects that are useful
to people across the world. When not at work, at Debian or in the FOSS world,
he loves to run marathons or spend time with his 6-year-old son.
Kartik can be reached on email at kmistry at wikimedia.org or as ‘kart_’ on
our irc channels including #mediawiki, #wikimedia-dev and #mediawiki-i18n.
Kartik blogs at http://kartikm.wordpress.com in Gujarati and
http://0x1f1f.wordpress.com in English.
Welcome again Kartik! I am excited to have you contribute full time to the
language engineering team!
-Alolita
--
Alolita Sharma
Director of Engineering
Language Engineering (i18n/L10n)
Wikimedia Foundation
Hello,
The MediaWiki code coverage tests have been segfaulting since the start.
Alexandros today provided us with new PHP packages that should fix the
issue, and I have deployed them on the continuous integration server.
If you see any weird PHP test results / segfaults, please make
noise so we can revert the PHP package on gallium :-]
The code coverage report is being built as I write this email; we will see soon
whether the new packages fixed the segfault issue:
"Jenkins: MediaWiki unit tests segfault on gallium"
https://bugzilla.wikimedia.org/43972
Coverage build:
https://integration.wikimedia.org/ci/job/mediawiki-core-code-coverage/189/c…
Next: I will get the packages on the beta cluster and ping the qa list
about it.
cheers,
--
Antoine "hashar" Musso