The next RFC meeting will discuss the following RFC:
* API roadmap (Brad Jorsch, Yuri Astrakhan)
<https://www.mediawiki.org/wiki/Requests_for_comment/API_roadmap>
The API roadmap was last discussed at the Architecture Summit in January.
The meeting will be on the IRC channel #wikimedia-office on
irc.freenode.net at the following time:
* UTC: Wednesday 21:00
* US PDT: Wednesday 14:00
* Europe CEST: Wednesday 23:00
* Australia AEST: Thursday 07:00
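The local times above all follow from the single UTC time; as a quick sketch, they can be checked with Python's zoneinfo (the concrete date 2014-09-24, a Wednesday, is an assumption for illustration):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumed meeting date: a Wednesday at 21:00 UTC (2014-09-24 is illustrative).
utc = datetime(2014, 9, 24, 21, 0, tzinfo=ZoneInfo("UTC"))

# Representative zones for the listed abbreviations.
for label, tz in [("US PDT", "America/Los_Angeles"),
                  ("Europe CEST", "Europe/Berlin"),
                  ("Australia AEST", "Australia/Sydney")]:
    local = utc.astimezone(ZoneInfo(tz))
    print(label, local.strftime("%A %H:%M"))
```

Running this reproduces the Wednesday/Thursday split in the list above.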
-- Tim Starling
Forwarding from Siko Bouterse:
Greetings! The Wikimedia Foundation Individual Engagement Grants program is
accepting proposals for funding new experiments from September 1st to 30th.
<https://meta.wikimedia.org/wiki/Grants:IEG>
Your idea can improve Wikimedia projects by building a new tool or gadget,
organizing a better process on your wiki, conducting research on an
important issue, or providing other support for community-building. Whether
you need $200 or $30,000 USD, Individual Engagement Grants can cover your
own project development time in addition to funding for a team to help you.
The program has a flexible schedule and reporting structure, and
Grantmaking staff are there to support you through all stages of the
process.
Do you have a good idea, but worry that it isn't developed enough for a
grant? Put it into the IdeaLab, where volunteers and staff can give you
advice and guidance on how to bring it to life:
<https://meta.wikimedia.org/wiki/Grants:IdeaLab>
Also, IEG will be hosting three Hangout sessions for real-time discussions
to help you improve your proposal - the first will happen on September 16th:
<https://meta.wikimedia.org/wiki/Grants:IdeaLab/Events#Upcoming_events>
For inspiration, you can read more about past projects that received funding:
<https://blog.wikimedia.org/tag/individual-engagement-grants/>
or review open proposals:
<https://meta.wikimedia.org/wiki/Grants:IEG#ieg-reviewing>
We are excited to see some of the new ways your grant ideas can support our
community and make an impact on the future of Wikimedia projects.
Submit your proposal in September!
<https://meta.wikimedia.org/wiki/Grants:IEG#ieg-apply>
Dear Wikitech-l
REL1_24 was branched and announced ahead of schedule[0].
Our schedule[1] clearly states that we will announce a branch one week
before we make it, in order to allow developers to put any necessary work
into the branch and keep backports to a minimum.
The current release is planned for 2014-11-26[2]. According to our
schedule, the actual REL1_24 branch should be made on 2014-10-22.
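The one-week rule is simple date arithmetic; a minimal sketch using the dates from the timeline (variable names are illustrative):

```python
from datetime import date, timedelta

release_date = date(2014, 11, 26)   # planned 1.24.0 release[2]
branch_date = date(2014, 10, 22)    # when REL1_24 should actually be cut

# Per the schedule, the branch is announced one week before it is made.
announce_date = branch_date - timedelta(weeks=1)
print(announce_date)  # 2014-10-15
```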
Please excuse the error and do not deprecate anything while we make the
necessary corrections. Unless there is a good reason to keep the branch,
we will revert the current REL1_24 branch on Monday evening.
Finally, while we are very happy to have help with the release work, we
do ask that you coordinate with us to ensure a controlled process free
of disruption.
Signed,
Mark and Markus
[0] https://gerrit.wikimedia.org/r/#/q/project:mediawiki/core+branch:REL1_24,n,z
[1] https://www.mediawiki.org/wiki/WikiReleaseTeam/Release_process#Release_proc…
[2] https://www.mediawiki.org/wiki/WikiReleaseTeam/Release_timeline
(apologies for cross-posting)
I'm happy to announce that HHVM is available on all Wikimedia wikis for
intrepid beta testers. HHVM, you'll recall, is an alternative runtime for
PHP that provides substantial performance improvements over the standard
PHP interpreter. Simply put: HHVM is software that runs on Wikimedia's
servers to make your reading and editing experience faster.
You can read more about HHVM here: https://www.mediawiki.org/wiki/HHVM
* How do I enable HHVM?
You can enable HHVM by opting in to the beta feature. This short animated
gif will show you how: <http://people.wikimedia.org/~ori/hhvm_beta.gif>.
Enabling the beta feature will set a special cookie in your browser. Our
servers are configured to route requests bearing this cookie to a pool of
servers that are running HHVM.
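The routing described above amounts to a cookie check at the edge; a minimal sketch (the cookie name and pool names here are placeholders, not the actual production configuration):

```python
def pick_backend_pool(cookies: dict) -> str:
    """Route a request to the HHVM pool if the opt-in cookie is present.

    The cookie name 'hhvm' and the pool names are made up for
    illustration; the real cookie is set by the beta feature and
    may be named differently.
    """
    if cookies.get("hhvm") == "true":
        return "hhvm-appserver-pool"
    return "php5-appserver-pool"

print(pick_backend_pool({"hhvm": "true"}))  # hhvm-appserver-pool
print(pick_backend_pool({}))                # php5-appserver-pool
```

The design point is that opting in changes nothing server-side for anyone else: requests without the cookie keep hitting the standard PHP pool.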
* How do I know that it's working?
Opting in to the beta feature does not change the user interface in any
way. If you like, you can copy the following code snippet to the global.js
subpage of your user page on Meta-Wiki:
https://meta.wikimedia.org/wiki/User:Ori.livneh/global.js
If you copy this script to your global.js, the personal bar will be
annotated with the name of the PHP runtime used to generate the page and
the backend response time. It looks like this:
http://people.wikimedia.org/~ori/hhvm_script.png
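If you would rather not install the script, a rough alternative is to inspect the response headers, since HHVM typically identifies itself in an X-Powered-By header; treat the header name and value format here as assumptions, not a documented contract:

```python
def detect_runtime(headers: dict) -> str:
    """Guess the PHP runtime that served a response from its headers.

    Assumes HHVM advertises itself via X-Powered-By (e.g. 'HHVM/3.3.0');
    anything else is reported as the standard PHP interpreter.
    """
    powered_by = headers.get("X-Powered-By", "")
    return "HHVM" if powered_by.startswith("HHVM") else "PHP5"

print(detect_runtime({"X-Powered-By": "HHVM/3.3.0"}))  # HHVM
print(detect_runtime({}))                              # PHP5
```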
Edits made by users with HHVM enabled will be tagged with 'HHVM'. The tag
is there as a precaution, to help us clean up if we discover that HHVM is
mangling edits somehow. We don't expect this to happen.
* What sort of performance changes should I expect?
We expect HHVM to have a substantial impact on the time it takes to load,
preview, and save pages.
At the moment, API requests are not being handled by HHVM. Because
VisualEditor uses the API to save articles, opting in to the HHVM beta
feature will not impact the performance of VisualEditor. We hope to have
HHVM handling API requests next week.
* What sort of issues might I encounter?
Most of the bugs that we have encountered so far resulted from subtle
differences in how PHP5 and HHVM handle various edge cases. These bugs
typically cause a MediaWiki error page to be shown.
If you encounter an error, please report it on Bugzilla and tag it with
the 'HHVM' keyword.
We're not done yet, but this is an important milestone. The roll-out of
HHVM as a beta feature caps many months of hard work from many developers,
both salaried and volunteer, from the Wikimedia Foundation, Wikimedia
Deutschland, and the broader Wikimedia movement. I want to take this
opportunity to express my appreciation to the following individuals, listed
in alphabetical order:
Aaron Schulz, Alexandros Kosiaris, Brad Jorsch, Brandon Black, Brett
Simmers, Bryan Davis, Chad Horohoe, Chris Steipp, Erik Bernhardson, Erik
Möller, Faidon Liambotis, Filippo Giunchedi, Giuseppe Lavagetto, Greg
Grossmeier, Jack McBarn, Katie Filbert, Kunal Mehta, Mark Bergsma, Max
Semenik, Niklas Laxström, Rob Lanphier, and Tim Starling.
More good things to come! :)
Hello and welcome to the latest edition of the WMF Engineering Roadmap and
Deployment update.
The full log of planned deployments next week can be found at:
<https://wikitech.wikimedia.org/wiki/Deployments#Week_of_September_29th>
A quick list of notable items...
== Monday ==
* PDF Service conversion, see the announcement to wikitech-ambassadors
** <https://lists.wikimedia.org/pipermail/wikitech-ambassadors/2014-September/0…>
== Tuesday ==
* MediaWiki deploy
** group1 to 1.25wmf1: All non-Wikipedia sites (Wiktionary, Wikisource,
Wikinews, Wikibooks, Wikiquote, Wikiversity, and a few other sites)
** <https://www.mediawiki.org/wiki/MediaWiki_1.25/wmf1>
== Thursday ==
* MediaWiki deploy
** group2 to 1.25wmf1 (all Wikipedias)
** group0 to 1.25wmf2 (test/test2/testwikidata/mediawiki)
Thanks and as always, questions and comments welcome,
Greg
--
Greg Grossmeier
Release Team Manager
Executive summary:
I messed with ldap today. Gerrit handles ldap differently from all
other services, so it broke and it took several ops several hours to
sort out what was happening. Everything is working again now.
Details:
As part of an elaborate post-Tampa ballet[1] I moved the ldap
servers from virt1000 and virt0 to ldap-eqiad (aka neptunium for the
time being) and ldap-codfw (aka labcontrol2001). This change was made
this morning via puppet[2].
Much to my delight, labs handled the change gracefully and without
any service interruptions.
Wikitech suffered a brief outage because I neglected to note that
it depends on an ldap server name in the mediawiki config. I hotfixed
that on virt1000 and also submitted a proper patch[3] for review. With
that change wikitech returned to normal, although (as usual) caches are
broken and many users will have to log out and in again to get all the
labs features they're used to.
With the change in ldap server, Gerrit logins went down and stayed
down. At various times Marc, Rob, Brandon and I were all involved in
troubleshooting. Several changes were made to the ldap setup
cluster-wide[4][5] -- these changes are probably correct, but did Gerrit
no good (and getting them applied w/out gerrit was no walk in the park.)
After a great many more blind alleys, Marc noted that we typically
handle ldap certificate validation by specifying a root cert in
ldap.conf, and that is not the Proper Debian Way. Apparently we've just
been lucky so far that most of our ldap services use ldap.conf rather
than the systemwide ca-certificate system.
The right solution is to drop trusted certs into
/usr/local/share/ca-certificates and then regenerate
/etc/ssl/ca-certificates.crt by running update-ca-certificates. Marc
did this on ytterbium (the Gerrit host) and Gerrit immediately started
working again.
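The distinction between the two trust models matters for any TLS client: pointing at a single root cert (as ldap.conf did) bypasses the system trust store, while the Debian-style fix makes a cert visible to every client that reads the regenerated bundle. A small Python illustration, with standard Debian paths in the comments (the pinned-cert path is made up):

```python
import ssl

# System-wide trust: the default context loads the bundle that
# update-ca-certificates regenerates (/etc/ssl/certs/ca-certificates.crt
# on Debian), so a root cert dropped into
# /usr/local/share/ca-certificates is trusted by every such client.
system_ctx = ssl.create_default_context()

# Per-application pinning (what ldap.conf was doing): trust only the one
# root cert named here, ignoring the system store. Path is illustrative:
# pinned_ctx = ssl.create_default_context(cafile="/etc/ssl/certs/wmf-root.crt")

print(system_ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```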
Remaining tasks are:
1) Puppetize Marc's hotfix[6]
2) (Maybe) totally refactor how we use ldap everywhere so that it
conforms to Debian standards.
3) Document all the services that rely on ldap so the next time
someone (me, probably) messes with it, they know what to watch for[7]
Many thanks to Marc, Rob and Brandon for joining in when I called
out for help with this problem.
[1] https://wikitech.wikimedia.org/wiki/Ldap_rename
[2] https://gerrit.wikimedia.org/r/#/c/162689/
[3] https://gerrit.wikimedia.org/r/#/c/163189/
[4] https://gerrit.wikimedia.org/r/#/c/163183/
[5] https://gerrit.wikimedia.org/r/#/c/163194/
[6] https://gerrit.wikimedia.org/r/163222
[7] https://wikitech.wikimedia.org/wiki/LDAP
Hello,
James E. Blair, from the OpenStack project, wrote an async console
interface to Gerrit.
The reasoning is that when you write all your code in a terminal, it
makes sense to use a terminal interface to review code. It is also
async, so you can review while offline (e.g. during travel) and sync
back with Gerrit whenever you are connected again.
You can get it by doing:
pip install pbr gertty
Then create a ~/.gertty.yaml based on:
https://git.openstack.org/cgit/stackforge/gertty/plain/examples/openstack-g…
It might not work on our Gerrit setup though :-D
The announcement with more details:
http://lists.openstack.org/pipermail/openstack-infra/2014-September/001904.…
--
Antoine "hashar" Musso
Hello
My name is Alisha Jain. I am an engineering student from Punjab,
India. I learned that your organization is participating in the FOSS
Outreach Program for Women this year. I have gone through your
projects and would like to have the opportunity to work with you. I am
quite interested in the "Extensive and robust localisation file format
coverage" project. I have worked on parsers, which is the main reason
for my interest in this project. I have made a two-way converter
using Flex, Bison, and C++, and a simple converter which converts
text format to DXF format. [0] is the link to my GitHub repository for
the two-way converter and [1] is for the DXF parser. I want to test my
coding skills on your project. Please guide me on how to proceed
and get started with patches.
[0] https://github.com/alishajain/Converter
[1] https://github.com/alishajain/dxfwentities
--
Alisha Jain
blog - jainalisha14.wordpress.com
"Your Failure does not define you, but your determination does."