(apologies for cross-posting)
I'm happy to announce that HHVM is available on all Wikimedia wikis for
intrepid beta testers. HHVM, you'll recall, is an alternative runtime for
PHP that provides substantial performance improvements over the standard
PHP interpreter. Simply put: HHVM is software that runs on Wikimedia's
servers to make your reading and editing experience faster.
You can read more about HHVM here: https://www.mediawiki.org/wiki/HHVM
* How do I enable HHVM?
You can enable HHVM by opting in to the beta feature. This short animated
GIF will show you how: <http://people.wikimedia.org/~ori/hhvm_beta.gif>.
Enabling the beta feature will set a special cookie in your browser. Our
servers are configured to route requests bearing this cookie to a pool of
servers that are running HHVM.
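The routing logic is roughly this simple. The sketch below is illustrative only: the cookie name "hhvm-beta" and the pool names are placeholders, not the actual values used in Wikimedia's configuration.

```python
# Sketch of cookie-based backend selection, as described above.
# The cookie name and pool names are hypothetical placeholders.

def select_backend(cookies: dict) -> str:
    """Route requests bearing the opt-in cookie to the HHVM pool."""
    if cookies.get("hhvm-beta") == "true":
        return "hhvm-pool"
    return "php5-pool"
```

Requests without the cookie continue to reach the regular PHP5 pool, so opting out is as simple as clearing the cookie.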
* How do I know that it's working?
Opting in to the beta feature does not change the user interface in any
way. If you like, you can copy the following code snippet to the global.js
subpage of your user page on Meta-Wiki:
https://meta.wikimedia.org/wiki/User:Ori.livneh/global.js
If you copy this script to your global.js, the personal bar will be
annotated with the name of the PHP runtime used to generate the page and
the backend response time. It looks like this:
http://people.wikimedia.org/~ori/hhvm_script.png
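For illustration only (the real script is JavaScript living in global.js): given a page's response headers, the annotation could be assembled like this. The header names "X-Powered-By" and "Backend-Timing" and the "D=<microseconds>" field format are assumptions made for this sketch, not a description of the actual script.

```python
# Build the "runtime (response time)" annotation from response headers.
# Header names and the timing format are hypothetical for this sketch.

def runtime_annotation(headers: dict) -> str:
    runtime = headers.get("X-Powered-By", "PHP5")
    timing = headers.get("Backend-Timing", "")
    micros = 0
    for part in timing.split():
        if part.startswith("D="):  # backend duration in microseconds
            micros = int(part[2:])
    return "%s (%.0f ms)" % (runtime, micros / 1000.0)
```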
Edits made by users with HHVM enabled will be tagged with 'HHVM'. The tag
is there as a precaution, to help us clean up if we discover that HHVM is
mangling edits somehow. We don't expect this to happen.
* What sort of performance changes should I expect?
We expect HHVM to have a substantial impact on the time it takes to load,
preview, and save pages.
At the moment, API requests are not being handled by HHVM. Because
VisualEditor uses the API to save articles, opting in to the HHVM beta
feature will not impact the performance of VisualEditor. We hope to have
HHVM handling API requests next week.
* What sort of issues might I encounter?
Most of the bugs that we have encountered so far resulted from minute
differences in how PHP5 and HHVM handle various edge cases. These bugs
typically cause a MediaWiki error page to be shown.
If you encounter an error, please report it on Bugzilla and tag it with
the 'HHVM' keyword.
We're not done yet, but this is an important milestone. The roll-out of
HHVM as a beta feature caps many months of hard work from many developers,
both salaried and volunteer, from the Wikimedia Foundation, Wikimedia
Deutschland, and the broader Wikimedia movement. I want to take this
opportunity to express my appreciation to the following individuals, listed
in alphabetical order:
Aaron Schulz, Alexandros Kosiaris, Brad Jorsch, Brandon Black, Brett
Simmers, Bryan Davis, Chad Horohoe, Chris Steipp, Erik Bernhardson, Erik
Möller, Faidon Liambotis, Filippo Giunchedi, Giuseppe Lavagetto, Greg
Grossmeier, Jack McBarn, Katie Filbert, Kunal Mehta, Mark Bergsma, Max
Semenik, Niklas Laxström, Rob Lanphier, and Tim Starling.
More good things to come! :)
Hello and welcome to the latest edition of the WMF Engineering Roadmap and
Deployment update.
The full log of planned deployments next week can be found at:
<https://wikitech.wikimedia.org/wiki/Deployments#Week_of_September_29th>
A quick list of notable items...
== Monday ==
* PDF Service conversion, see the announcement to wikitech-ambassadors
** <https://lists.wikimedia.org/pipermail/wikitech-ambassadors/2014-September/0…>
== Tuesday ==
* MediaWiki deploy
** group1 to 1.25wmf1: All non-Wikipedia sites (Wiktionary, Wikisource,
Wikinews, Wikibooks, Wikiquote, Wikiversity, and a few other sites)
** <https://www.mediawiki.org/wiki/MediaWiki_1.25/wmf1>
== Thursday ==
* MediaWiki deploy
** group2 to 1.25wmf1 (all Wikipedias)
** group0 to 1.25wmf2 (test/test2/testwikidata/mediawiki)
Thanks and as always, questions and comments welcome,
Greg
--
Greg Grossmeier
Release Team Manager
Executive summary:
I messed with ldap today. Gerrit handles ldap differently from all
other services, so it broke and it took several ops several hours to
sort out what was happening. Everything is working again now.
Details:
As part of an elaborate post-tampa ballet[1] I moved the ldap
servers from virt1000 and virt0 to ldap-eqiad (aka neptunium for the
time being) and ldap-codfw (aka labcontrol2001). This change was made
this morning via puppet[2].
Much to my delight, labs handled the change gracefully and without
any service interruptions.
Wikitech suffered a brief outage because I neglected to note that
it depends on an ldap server name in the MediaWiki config. I hotfixed
that on virt1000 and also submitted a proper patch[3] for review. With
that change wikitech returned to normal, although (as usual) caches are
broken and many users will have to log out and in again to get all the
labs features they're used to.
With the change in ldap server, Gerrit logins went down and stayed
down. At various times Marc, Rob, Brandon and I were all involved in
troubleshooting. Several changes were made to the ldap setup
cluster-wide[4][5] -- these changes are probably correct, but did Gerrit
no good (and getting them applied w/out gerrit was no walk in the park.)
After a great many more blind alleys, Marc noted that we typically
handle ldap certificate validation by specifying a root cert in
ldap.conf, and that is not the Proper Debian Way. Apparently we've just
been lucky so far that most of our ldap services use ldap.conf rather
than the systemwide ca-certificate system.
The right solution is to drop trusted certs into
/usr/local/share/ca-certificates and then regenerate
/etc/ssl/ca-certificates.crt by running update-ca-certificates. Marc
did this on ytterbium (the Gerrit host) and Gerrit immediately started
working again.
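For reference, the fix described above amounts to the following (run as root; the certificate filename here is just an example, and update-ca-certificates only picks up files ending in .crt):

```shell
# Install the root cert into Debian's system-wide trust store,
# then regenerate /etc/ssl/ca-certificates.crt from it.
cp our-ldap-root-ca.crt /usr/local/share/ca-certificates/
update-ca-certificates
```

After this, any service that trusts the system CA bundle (rather than a cert pinned in ldap.conf) will accept the ldap servers' certificates.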
Remaining tasks are:
1) Puppetize Marc's hotfix[6]
2) (Maybe) totally refactor how we use ldap everywhere so that it
conforms to Debian standards.
3) Document all the services that rely on ldap so the next time
someone (me, probably) messes with it, they know what to watch for[7]
Many thanks to Marc, Rob and Brandon for joining in when I called
out for help with this problem.
[1] https://wikitech.wikimedia.org/wiki/Ldap_rename
[2] https://gerrit.wikimedia.org/r/#/c/162689/
[3] https://gerrit.wikimedia.org/r/#/c/163189/
[4] https://gerrit.wikimedia.org/r/#/c/163183/
[5] https://gerrit.wikimedia.org/r/#/c/163194/
[6] https://gerrit.wikimedia.org/r/163222
[7] https://wikitech.wikimedia.org/wiki/LDAP
Hello,
James E. Blair, from the OpenStack project, wrote an asynchronous console
interface to Gerrit.
The reasoning is that when you write all your code in a terminal, it
makes sense to use a terminal interface to review code. It is also
asynchronous, so you can review while offline (e.g. during travel) and
sync back with Gerrit whenever you are connected again.
You can get it by doing:
pip install pbr gertty
Then create a ~/.gertty.yaml based on:
https://git.openstack.org/cgit/stackforge/gertty/plain/examples/openstack-g…
It might not work on our Gerrit setup though :-D
The announcement with more details:
http://lists.openstack.org/pipermail/openstack-infra/2014-September/001904.…
--
Antoine "hashar" Musso
Hello
My name is Alisha Jain. I am an engineering student from Punjab,
India. I came to know that your organization is participating in the
FOSS Outreach Program for Women this year. I have gone through your
projects and would like to have the opportunity to work with you. I am
quite interested in the "Extensive and robust localisation file format
coverage" project. I have worked on parsers, which is the main reason
for my interest in this project. I have made a two-way converter using
Flex, Bison, and C++, and a simple converter that converts text format
to DXF format. [0] is the link to my GitHub repository for the two-way
converter and [1] is for the DXF parser. I want to test my coding
skills on your project. Please guide me on how to proceed further and
get started with patches.
[0]https://github.com/alishajain/Converter
[1]https://github.com/alishajain/dxfwentities
--
Alisha Jain
blog - jainalisha14.wordpress.com
"Your Failure does not define you, but your determination does."
I write this email with regret to let you know that I've decided to leave
the Wikimedia Foundation after nearly four years working here, and that my
last day will be 30 September.
I go into my reasoning and plans in this personal blog post:
http://www.harihareswara.net/sumana/2014/09/12/0
I'm grateful for what I've learned here and will take these lessons
wherever I go. And I've made many friends here; thank you. I'll remain
[[User:Sumanah]] in any volunteer work I do onwiki. Quim Gil will be the
person to contact for any loose threads I leave behind; still, if you have
questions for me, the next two weeks would be a good time to ask them.
best,
Sumana Harihareswara
was Volunteer Development Coordinator, then Engineering Community Manager,
now Senior Technical Writer
Wikimedia Foundation
Ok, last attempt.
This needs merge: https://gerrit.wikimedia.org/r/#/c/151370/
Somebody shat on it (-2), so it looks ugly, sorry for that.
They didn't come back to clean up their mess, either; instead they told
me to advertise it here.
Sorry for putting it like this, in particular with Sumana's thread
right next to it, but really, I am fed up. The way this went makes all
that clamoring for contributors sound rather hollow, to my ears at
least - they are quite obviously not welcome. I mean being ignored
because everybody is apparently super busy is one thing. But being
made to jump through hoop after hoop for weeks and then being ignored
does feel less like being ignored and more like being made fun of. It
will be a long time until I waste my time again on trying to get some
patch merged.
Stephan
I just read
http://sarah.thesharps.us/2014/09/01/the-gentle-art-of-patch-review/ and it
made a lot of sense to me as a way to speed up the first response a new
patch gets.
"Instead of putting off reviewing first-time contributions and thoroughly
reviewing everything in the contribution at once, I propose a three-phase
review process for maintainers:
1. Is the idea behind the contribution sound?
2. Is the contribution architected correctly?
3. Is the contribution polished?"
The post author, a Linux kernel developer, goes into more detail in the
post; it's worth reading even if you decide this approach isn't your style.
(Reminder: https://www.mediawiki.org/wiki/Gerrit/Code_review/Getting_reviews
and https://www.mediawiki.org/wiki/Gerrit/Code_review are worth
re-skimming.)
Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation
Hello Spanish speakers of wikitech-l,
In case you are interested, Andrew Russell Green from the growth team at
the Wikimedia Foundation will be giving a talk at PHP Con Mexico 2014
<http://www.phpcon.mx/>. The conference will be in Spanish.
*Date:* September 26, 2014
*Talk:* The parallel evolution of MediaWiki and PHP, and how you can
contribute to the future of MediaWiki
*Remote viewing:* It will be streamed live from http://www.phpcon.mx/,
and the recording will be available from
https://www.youtube.com/user/ComunidadDePHP .
If there is a lot of interest, maybe we can get him to do a tech talk
with the English version.
Any specific questions? Ask Andrew!