-------- Original Message --------
Subject: [teampractices] Front End Ops Conference, San Francisco in April
Date: Wed, 29 Jan 2014 11:00:51 -0700
From: Chris McMahon <cmcmahon(a)wikimedia.org>
Reply-To: A mailing list to discuss team practices in Wikimedia
organizations <teampractices(a)lists.wikimedia.org>
To: A mailing list to discuss team practices in Wikimedia organizations
<teampractices(a)lists.wikimedia.org>
Apropos of Continuous Delivery, production monitoring, QA, etc., this
might be of interest:
http://www.feopsconf.com/
CFP open until Feb 14.
-Chris
Dear all,
We are researchers at KAIST in Korea working on finding JavaScript bugs in web pages. While analyzing top websites from Alexa.com, we found an issue, which seems to be a bug, on the Wikipedia main web page (wikipedia.org). We would be grateful if you could either confirm that it is a bug (and, even better, fix it) or let us know what we're missing.
Here's the issue. When a user selects a language in which search results are displayed via the language selection button from the Wikipedia main web page, the following JavaScript function is executed:
1  function setLang(lang) {
2      var uiLang = navigator.language || navigator.userLanguage, date = new Date();
3
4      if (uiLang.match(/^\w+/) === lang) {
5          date.setTime(date.getTime() - 1);
6      } else {
7          date.setFullYear(date.getFullYear() + 1);
8      }
9
10     document.cookie = "searchLang=" + lang + ";expires=" + date.toUTCString() + ";domain=" + location.host + ";";
11 }
Depending on the evaluation result of the conditional expression on line 4, "uiLang.match(/^\w+/) === lang", the function either leaves or does not leave the selected-language information on the user's computer through a cookie. But we found that this expression always evaluates to false, which means the function always leaves cookies on users' computers. We think that changing the conditional expression, "uiLang.match(/^\w+/) === lang", to "uiLang.match(/^\w+/) == lang" will solve the problem.
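The comparison bug described above can be reproduced in isolation. This is a minimal standalone sketch (runnable in Node.js, not the live Wikipedia code; the `uiLang` value is an assumed example): `String.prototype.match` returns an array (or `null`), never a plain string, so strict equality against a string always fails, while loose equality coerces the one-element array to its string content.

```javascript
// Simulate a browser UI language value, e.g. navigator.language === "en-US".
var uiLang = "en-US";
var lang = "en";

// match() returns an Array like ["en"] (with extra index/input properties),
// or null if nothing matched -- never a bare string.
var result = uiLang.match(/^\w+/);

console.log(result === lang);    // false: an Array is never === a String
console.log(result == lang);     // true: == coerces ["en"] to the string "en"
console.log(result[0] === lang); // true: comparing the matched substring itself
```

Either the suggested `==` (which relies on array-to-string coercion) or the more explicit `result[0] === lang` would make the branch behave as intended; the latter would still need a `null` check for inputs that don't match.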
This problem may occur in the main web pages of all the Wikimedia sites. Could you kindly let us know what you think? Thank you in advance.
Best,
Changhee Park and Sukyoung Ryu
Hello all,
I’m delighted to announce that Ken Snider is joining the Wikimedia
operations team. He will start as an international contractor working
remotely from Toronto, Canada on June 10, and will be visiting SF in
the week of June 17. We're currently in the process of seeking work
authorization in the United States for the Director of TechOps
position.
CT has graciously agreed to support the ops leadership transition
full-time through June, and part-time through July. We’ll be starting
the handover while Ken is working remotely.
A bit more about Ken: Ken was apparently genetically predisposed to
become a sysadmin: he joined one of Canada's first large ISPs,
Primus, straight out of school in 1997 and helped build their
infrastructure until 2001. He then joined a startup called OpenCOLA,
co-founded by Cory Doctorow, which developed early P2P
precursors to tools like BitTorrent and Steam. It’s best known today
for the development of an open source (GPL’d) cola recipe which is
still in use (more than 150,000 cans sold if Wikipedia is to be
believed).
Ken got involved in one of Cory’s pet projects, BoingBoing.net which
some of you may have heard of ;-), and has been their sysadmin since
2003. After a stint from 2001-2005 at DataWire, Ken became Director of
Tech Ops at Federated Media, a role he held from 2005-2012.
Federated Media is an ad network that was founded to support high
traffic blogs and sites that want to stay independent of large
publishers, with a network that supports more than 1B requests/day.
One of the unusual challenges at FM was that the company grew through
acquisitions of various blogging and publishing networks. This led to
the challenge of integrating very heterogeneous operations and
engineering infrastructure, including multiple geographically
distributed ops teams and data-center locations. As DTO, Ken led these
efforts, such as OS standardization, development of a unified
deployment infrastructure, etc. Ken also ensured that the operations
group partnered effectively with the various engineering teams
developing site features and enhancements.
I want to again take this opportunity to thank CT Woo for his tireless
operations leadership since December 2010. I’d also like to thank
everyone who’s participated in the Director of TechOps search process.
Please join me in welcoming Ken to the Wikimedia Foundation and the
community. :-)
All best,
Erik
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Hi puppet wranglers,
We're trying to refactor the WMF puppet manifests to get rid of reliance
on dynamic scope, since puppet 3 doesn't permit it. Until now we've
done what is surely pretty standard puppet 2.x practice: assign
values to a variable in the node definition and pick it up in the
class from there dynamically. Example: we set $openstack_version
to a specific version depending on the node, and the nova and
openstack classes do different things depending on that value.
Without dynamic scoping that's no longer possible, so I'm wondering
what people are doing in the real world to address this, and
what best practices are, if any. Hiera? Some sort of class
parameterization? Something else?
It should go without saying that we'd like not to have to massively
rewrite all our manifests, if possible.
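For what it's worth, the puppet 3 idiom most people seem to land on is Hiera with automatic parameter lookup: the per-node value moves out of the node definition into hiera data, and the class declares a parameter. A minimal sketch, with hypothetical file paths, key names, and a notify placeholder that are illustrative only, not from the WMF repo:

```puppet
# hieradata/nodes/virt1000.yaml (hypothetical node data file):
#   openstack::version: 'havana'

# In puppet 3, a class parameter is automatically looked up in Hiera
# under the key <class>::<parameter>, so no dynamic scoping is needed.
class openstack ($version = 'grizzly') {
  notify { "building openstack ${version}": }
}

# Dependent classes take the value explicitly (here via a fully
# qualified reference) instead of reaching into another scope.
class openstack::nova ($version = $openstack::version) {
  notify { "nova for openstack ${version}": }
}
```

Under this scheme the rewrite is mostly confined to class headers and node definitions, which may keep the churn closer to what you're hoping for.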
Thanks,
Ariel Glenn
All,
I'm planning to upgrade Gerrit from our 2.7-rc2 custom build to the 2.8.1
stable release on Tuesday from 01:00 to 03:00 UTC (that's 17-19:00 in
SF on Monday evening).
This will result in some minor downtime while the database is upgraded.
I don't expect it to take the full 2-hour window, but I'm being cautious
because Gerrit.
-Chad
Hi!
As of now Extension:RelatedSites is used on Wikivoyage only, but I
think it would be a good idea to use this extension to unify
inter-project links across other WMF projects.
Currently different templates present these links in different
ways. This introduces interface inconsistency and requires
maintaining and using a set of templates on various projects.
Extension:RelatedSites needs some care:
* it should allow project names localization;
* it should be able to extract links from Wikidata.
The "Data item" link under Tools should also be displayed by this
extension for consistency with other projects.
Eugene,
This is a notice that on Tuesday, Jan 28th between 21:00-22:00 UTC (1-2pm
PST) Wikimedia Foundation will release critical security updates for
current and supported branches of the MediaWiki software and extensions.
Downloads and patches will be available at that time, with the git
repositories updated soon after. The vulnerable feature is not enabled in
MediaWiki by default; however, many sites will want to upgrade as soon as
the patch is available.
Hello all, I am User:TyA. I noticed on the [[Annoying large bugs]] page
there was a request for a Request Queue and I'm interested in working on
that. I have some basic ideas of how to lay it out; however, I'd like to
hear others' opinions on it as well. I'm also curious whether it is still needed.
My current plan is to have 2 special pages, one for the requesting and the
other for seeing the queue/examining the submitted items.
I currently have a rough draft of the Requesting form complete, [
http://i.imgur.com/zRAAyvh.png screenshot]. However before going too much
further, I wanted to get some feedback on whether I'm collecting enough
info.
My plan for the actual queue page is to have it create a table or list that
will give the current status of the request as well as the requester, the
submitted date, and the project name. The page will also accept /id
subpages to load up information about a certain request so that it can be
marked as done, not done, or needs more info.
Thanks