Hello everyone,
It’s with great pleasure that I’m announcing that Arlo Breault joined the Wikimedia Foundation as a Features Engineer. Note the past tense. :-D
Before joining us, Arlo worked as an independent open-source software developer. He’s held various contracts with the Tor Project[1], DuckDuckGo[2], Storify[3], and Rights & Democracy[4], amongst others, where he’s worked on everything from novel censorship-circumvention systems to navigable visualizations of the web graph.
I first found out about him in April of 2012, but we only managed to find a fit a year ago in July of 2013 as an international contractor. His first official day as a member of our staff was July 7, 2014. I tell myself that announcing Arlo and Marc together is the reason I’ve been tardy on these announcements. :-D Along with Marcoil (previous e-mail), the two of them work with Subbu Sastry and C. Scott Ananian to form the Parsoid team, which provides the back-end voodoo that turns your VisualEditing into wikitext and back again.[5]
Arlo studied physics and mathematics at McGill in Montréal, and now lives in Victoria, BC. He has been spending a lot of his free time lately looking at cryptographic protocols, contributing OTR.js[6] to Cryptocat[7], a privacy-preserving chat application. He’s otherwise typically Canadian, enjoying his maple syrup, hockey, and reading by the warm glow of a fire.
Please join me in a belated welcome of Arlo to the Wikimedia Foundation. :-)
Take care,
Terry
P.S. In keeping with Jared’s demand that a picture accompany every new hire announcement, here is one: https://avatars0.githubusercontent.com/u/123708?v=2&s=400
[1] https://www.torproject.org/
[2] https://duckduckgo.com/
[3] https://storify.com/
[4] https://en.wikipedia.org/wiki/International_Centre_for_Human_Rights_and_Dem…
[5] https://blog.wikimedia.org/2013/03/04/parsoid-how-wikipedia-catches-up-with…
[6] https://github.com/arlolra/otr
[7] https://crypto.cat/
terry chay 최태리
Director of Features Engineering
Wikimedia Foundation
“Imagine a world in which every single human being can freely share in the sum of all knowledge. That's our commitment.”
p: +1 (415) 839-6885 x6832
m: +1 (408) 480-8902
e: tchay(a)wikimedia.org
i: http://terrychay.com/
w: http://meta.wikimedia.org/wiki/User:Tychay
aim: terrychay
According to our algorithm (*), TorBlock currently has the worst track
record for reviewing code contributions -- even after Tim gave a -1 to one
of the three open patches last week (thanks!). There are two patches from
Tyler that haven't received any feedback at all since August 2013.
https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions…
Your help reviewing these patches is welcome.
It is not surprising that this extension has no maintainer listed at
https://www.mediawiki.org/wiki/Developers/Maintainers (someone suggested
Tim in that table; he disagreed and edited it accordingly).
Also, maybe someone is interested in maintaining this extension? Only
eleven patches have been submitted in the last 15 months.
(*) http://korma.wmflabs.org/browser/gerrit_review_queue.html -- the
algorithm happens to be buggy these days, but apparently equally buggy for
all repos, which still results in some kind of justice.
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Currently a lot of our extension Vagrant roles work like Swiss Army
knives: they do everything imaginable. For example, MobileFrontend
always installs 3 optional dependencies, while CirrusSearch includes its
unit-test configuration, which among other things enforces
$wgCapitalLinks = false, a setting atypical of most MediaWiki installs.
I think many of these actually make development harder. Solution? Can we
split some larger roles into "basic" and "advanced" parts (see the sketch
below), so that people who need an extension just to play around with, or
to satisfy a dependency, are not forced to emulate a significant part of
the WMF infrastructure?
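To make that concrete, here is a very rough sketch of what such a split
could mean on the settings side, using CirrusSearch as the example. The
layering is hypothetical, and everything beyond $wgCapitalLinks is an
assumption about the role's contents, not a description of the current code:

  // "basic" part: just enough to load and use the extension
  require_once "$IP/extensions/Elastica/Elastica.php"; // hard dependency
  require_once "$IP/extensions/CirrusSearch/CirrusSearch.php";
  $wgSearchType = 'CirrusSearch';

  // "advanced" part (opt-in): WMF-like setup for running the
  // extension's test suites
  $wgCapitalLinks = false; // today the role forces this on everyone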
--
Best regards,
Max Semenik ([[User:MaxSem]])
How feasible would it be to enable file access/linking to files on a given
filesystem without having to upload them?
Use case: I have a documentation system in /server/docs, to which I provide
access internally via a file share for all users. However, remote users are
unable to access that share.
How difficult/dangerous would it be to have an extension that provided
access to those files for our remote users?
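To make the question concrete, this is roughly what I am imagining, sketched
as plain configuration rather than a new extension. I am assuming here that
core's FSRepo class can be pointed at an arbitrary directory, and that the
share is also exported over HTTP somewhere (the URL below is made up):

  // LocalSettings.php: expose /server/docs to the wiki read-only.
  $wgForeignFileRepos[] = array(
      'class' => 'FSRepo',
      'name' => 'internaldocs',                 // arbitrary repo name
      'directory' => '/server/docs',            // the existing share
      'url' => 'https://wiki.example.org/docs', // hypothetical HTTP alias
      'hashLevels' => 0,  // files sit directly in the directory
  );

The dangerous part, I suspect, is that everything the repo can read becomes
visible to every user of the wiki.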
Join the Team Practices Group
<https://www.mediawiki.org/wiki/Team_Practices_Group> in
#wikimedia-teampractices on IRC <https://en.wikipedia.org/wiki/Irc> (
irc.freenode.net) to chat about software development practices, processes,
theories, and philosophy, as they pertain to Wikimedia engineering and
beyond.
--
Arthur Richards
Team Practices Manager
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
Hi all,
When some text gets translated into different languages, the result can be
very different in length depending on the languages involved.
This has implications for both development and design, and sometimes we
want to quickly check how long a specific translation can become. Since I
was asked about this recently, I thought it would be interesting to share
two simple methods I use:
The first one is to check all the translations for a given message at
translatewiki.net (where MediaWiki gets translated):
1. Go to translatewiki.net
2. Search for an expression such as "reply to", and you'll get a list of
all messages that contain those terms across all the projects that get
translated at translatewiki (you can filter by project if you want to
check a specific message in MediaWiki).
3. Click on "edit translation". By clicking on the translation ID you can
access the "All translations" option (as shown here
<http://i.imgur.com/GXNVAJv.png>). That will provide access to the list
of all available translations
<https://translatewiki.net/w/i.php?title=Special:Translations&message=Nocc%3…>
for a given message.
4. Take a look at the list and check the different lengths for the
translations ("回复", "Reply to", "Antwoorden naar", "ಉಲ್ಲದಕ್ಕೂ ಉತ್ತರಿಸು",
"இந்த முகவரிக்கு பதில் அளி"...).
Alternatively, for terms that exist on Wikipedia, Wikidata can also be used
for that purpose:
1. Go to a Wikipedia article for the term you are interested in (e.g.,
"Question")
2. At the end of the language list on the sidebar, click on "edit
links". That will give you access to a list with the titles of the
equivalent articles
<http://www.wikidata.org/wiki/Q189756#sitelinks-wikipedia> in different
languages.
3. Take a look at the length of the terms in the list.
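This list is also easy to fetch programmatically through the Wikidata API.
A small sketch using the Q189756 ("Question") item linked above; only the
output formatting is my own:

  <?php
  // Print the title of the equivalent article on every linked wiki,
  // together with its length in characters.
  $url = 'https://www.wikidata.org/w/api.php?action=wbgetentities'
      . '&ids=Q189756&props=sitelinks&format=json';
  $data = json_decode( file_get_contents( $url ), true );
  foreach ( $data['entities']['Q189756']['sitelinks'] as $link ) {
      printf( "%-14s %3d  %s\n",
          $link['site'], mb_strlen( $link['title'], 'UTF-8' ), $link['title'] );
  }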
Neither method is bulletproof, since both depend on the availability of
the content you are looking for (or something similar), but they can be
useful in different situations.
Pau
--
Pau Giner
Interaction Designer
Wikimedia Foundation
I just wanted to give a progress report on some of the things that have
been happening with MediaWiki-Vagrant. You may have seen the
announcement a couple of weeks ago [0] about changes to the multiwiki
support and a long-awaited Wikidata role. Since then we've had another
47 (!) changes land, including:
* A role for the MolHandler extension developed during GSoC
* Roles to install the MonoBook, Modern, and CologneBlue skins
* Support for testing Flickr uploads in the uploadwizard role
* Browser tests support for CentralAuth
* A role for the PhpTags extension
* A role for Gadgets 2.0
There have also been some new plumbing changes. MWV is now using hiera
[1] to manage the list of enabled roles for the VM. The puppet
manifests that were at puppet/manifests/roles/*.pp were moved to a
proper module at puppet/modules/role and the site.pp manifest has been
changed to no longer use the deprecated Puppet "import" statement.
This has also opened up the possibility of making the Puppet code
shipped with MediaWiki-Vagrant compatible with more environments.
Hiera lets us do a better job of separating code from configuration by
providing a means to override variables in a structured way. This is
already being used to some extent to provide better support for running
multiple wikis via labs-vagrant [2].
The final milestone I'd like to point out is that all of our puppet
code now passes the mediawiki-vagrant-puppetlint-lenient tests in
Jenkins [3] and the job has now been made "voting" in gerrit to help
patch contributors find and fix formatting errors quickly. As always,
MWV ships with a Rakefile that can be used to run the lint checks
locally as well.
[0]: http://www.gossamer-threads.com/lists/wiki/wikitech/494609
[1]: https://docs.puppetlabs.com/hiera/1/puppet.html
[2]: https://wikitech.wikimedia.org/wiki/Labs-vagrant
[3]: https://integration.wikimedia.org/ci/job/mediawiki-vagrant-puppetlint-lenie…
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855