Welcome to the latest edition of the Wikimedia roadmap and deployments
highlights email.
Full schedule at:
https://wikitech.wikimedia.org/wiki/Deployments#Week_of_March_10th
Other than the usual allotment of standing deploy windows for various
teams (e.g. Flow, Parsoid, Search, and WP Zero), there aren't any
deployments of special note. This, of course, doesn't take into account
last-minute/exceptional things.
There is the normal MediaWiki train, as always.
Also, the WMF Labs migration continues. For more info, see:
http://lists.wikimedia.org/pipermail/labs-l/2014-February/002152.html
== Tuesday ==
MediaWiki upgrades
* group1 to 1.23wmf17: All non-Wikipedia sites (Wiktionary, Wikisource,
Wikinews, Wikibooks, Wikiquote, Wikiversity, and a few other sites)
** https://www.mediawiki.org/wiki/MediaWiki_1.23/Roadmap#Schedule_for_the_depl…
** https://www.mediawiki.org/wiki/MediaWiki_1.23/wmf17
== Thursday ==
MediaWiki upgrades
* group2 to 1.23wmf17 (all Wikipedias)
* group0 to 1.23wmf18 (test/test2/testwikidata/mediawiki)
** https://www.mediawiki.org/wiki/MediaWiki_1.23/wmf18 (branch/list of
changes created on Thursday)
As always, questions/comments welcome,
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Somehow wikitech-l got dropped from the recipient list of this thread; I
know some of the OSM folks are subscribed. Anyway, I've re-added wikitech-l
to this thread.
On Fri, Mar 7, 2014 at 1:53 PM, Jon Robson <jdlrobson(a)gmail.com> wrote:
> Sounds like a good idea. If there are no objections to my potentially
> crazy idea, I will drop them a note on their mailing list...
> On 7 Mar 2014 13:27, "Arthur Richards" <arichards(a)wikimedia.org> wrote:
>
>> Dunno if any of the OSM-y folks are planning to attend but I bet this
>> would be up their alley. At the very least, it would probably be good to
>> get their input on a project like this.
>>
>>
>> On Fri, Mar 7, 2014 at 11:44 AM, Jon Robson <jdlrobson(a)gmail.com> wrote:
>>
>>> Dan, awesome! Glad there's some interest - this should be a lot of fun!
>>> :-)
>>>
>>> On Wed, Mar 5, 2014 at 1:20 PM, Quim Gil <qgil(a)wikimedia.org> wrote:
>>> > fyi
>>> >
>>> >
>>> > -------- Original Message --------
>>> > Subject: [WikimediaMobile] Zurich Hackathon: Creating a map namespace
>>> > Date: Wed, 5 Mar 2014 12:54:52 -0800
>>> > From: Jon Robson <jdlrobson(a)gmail.com>
>>> > To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>, mobile-l
>>> > <mobile-l(a)lists.wikimedia.org>
>>> >
>>> > This may be extremely ambitious, but I'm keen to kick off development
>>> > around the creation of a map namespace during the Zurich hackathon.
>>> >
>>> > The goal would be to set up an editable map namespace that could be
>>> > used for a variety of things, one of which would be adding a map view
>>> > to the Special:Nearby page provided via the mobile site. The goal is a
>>> > proof of concept, not necessarily anything production-ready (but it
>>> > would be great if we could get to that point!)
>>> >
>>> > Please let me know if you would also be interested in hacking on
>>> > such a thing -
>>> >
>>> > https://www.mediawiki.org/wiki/Z%C3%BCrich_Hackathon_2014/Geo_Namespace
>>> > - or if doing so would be a terrible idea (but if you have to go down
>>> > that route please provide constructive reasoning on what would be a
>>> > less terrible idea)
>>> >
>>> > Excited to hack on cool things in Zurich!
>>> >
>>> --
>>> Jon Robson
>>> * http://jonrobson.me.uk
>>> * https://www.facebook.com/jonrobson
>>> * @rakugojon
>>>
>> --
>> Arthur Richards
>> Software Engineer, Mobile
>> [[User:Awjrichards]]
>> IRC: awjr
>> +1-415-839-6885 x6687
>>
>
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
Let's also take this into a new thread. There are a lot of different
conversations now going on....
On Fri, Mar 7, 2014 at 9:21 AM, Brad Jorsch (Anomie)
<bjorsch(a)wikimedia.org> wrote:
> On Fri, Mar 7, 2014 at 12:08 PM, C. Scott Ananian <cananian(a)wikimedia.org> wrote:
>
>> I agree. I think a better technical solution would be to halt Jenkins'
>> auto-merge for the 24-hour period, so that +2'ed changes are not
>> automatically merged until after the branch is cut.
>
>
> I don't see how that's any better. Things still aren't getting merged.
>
> If anything, the "cut using master@{24 hours ago}" approach is a much
> better idea.[1] It might be worth checking, though, whether Wednesday
> tends to be a relatively active bug-fixing day as the community on
> non-Wikipedia sites finds issues in the version deployed to them on
> Tuesday; in that case, keeping those fixes out of the new cut on Thursday
> (and so requiring more backports, or an extra week's wait for fixes)
> might not be so great.
>
>
> [1]: And yes, 'master@{24 hours ago}' is valid git syntax.
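> For instance (a sketch only; the branch name here is made up), the cut
> could be done with:
>
>     git branch wmf/example 'master@{24 hours ago}'
>
> which uses the reflog to resolve the commit master pointed at 24 hours
> earlier, rather than its current tip.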
>
> --
> Brad Jorsch (Anomie)
> Software Engineer
> Wikimedia Foundation
--
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon
Hi,
Thanks for the assistance last year in correcting some programming issues with the Memento Extension. We've been conducting performance testing in the interim and have been addressing issues as they've been discovered. For the most part, we've been pleased with the results, due largely to feedback from this group.
The extension looks for the presence of an 'Accept-Datetime' header in the HTTP request and, if present, constructs a 302 HTTP response redirecting the visitor to the oldid page closest to, but not after, the time they've requested.
Currently the extension starts its processing in the ArticleViewHeader hook. If we are going to construct a 302 response because an 'Accept-Datetime' header was detected, then it appears that some of the data loaded by that point never gets used, wasting processing time.
Is there an earlier hook in the page rendering process that can provide the name (title) of the page being requested and an OutputPage object so we can look at the request and construct the response?
We are avoiding all globals, like $wgOut, to comply with MediaWiki coding standards.
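To illustrate, here is a minimal sketch of our current approach (the
handler below is simplified; findClosestRevision() stands in for our
actual lookup code and is not a real MediaWiki function):

// Simplified sketch of our ArticleViewHeader-based handler.
public static function onArticleViewHeader( &$article, &$outputDone, &$useParserCache ) {
    $out = $article->getContext()->getOutput();
    $request = $out->getRequest();

    // WebRequest::getHeader() returns false when the header is absent.
    $acceptDatetime = $request->getHeader( 'Accept-Datetime' );
    if ( $acceptDatetime !== false ) {
        $title = $article->getTitle();

        // Find the revision closest to, but not after, the requested
        // datetime (placeholder for our actual lookup).
        $oldid = self::findClosestRevision( $title, $acceptDatetime );

        // 302 redirect to the oldid view; no $wgOut involved.
        $out->redirect( $title->getFullURL( array( 'oldid' => $oldid ) ), '302' );
        $outputDone = true;
    }
    return true;
}

By the time this hook fires, the article machinery has already loaded
data that a datetime-negotiated request will never use, hence the
question about an earlier hook.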
Thanks in advance,
Shawn M. Jones
Graduate Research Assistant
Department of Computer Science
Old Dominion University
[x-posted]
Hello,
The Wikimedia Language Engineering team will be hosting the monthly IRC
office hour on March 12, 2014 (Wednesday) at 1700 UTC / 1000 PDT in
#wikimedia-office.
In this edition, we will be talking about our ongoing projects, like the
Content Translation tool[1]. We would also like to extend this invitation
especially to students looking forward to participating in Google Summer
of Code (GSoC) 2014 and the Outreach Program for Women (OPW) Round 8 for
the Language Engineering projects[2] under Wikimedia. We will be happy to
answer your questions about our work and projects. Please see below for
the event details, and check the local time at your location.
Questions can also be sent to me before the event. See you all at the IRC
office hour!
Thanks
Runa
[1] https://www.mediawiki.org/wiki/Content_translation
[2]
https://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects#Intern…
Event Details:
==========
# Date: March 12, 2014
# Time: 1700-1800 UTC, 1000-1100 PDT (Check local time:
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20140312T1700)
# IRC channel: #wikimedia-office on irc.freenode.net
Agenda:
======
1. Ongoing Projects - Content Translation tool
2. GSoC and OPW - open house for Language Engineering projects
3. Q & A
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation
I came across Gerrit change 79948[1] today, which makes "VectorBeta"
use a pile of non-free fonts (with one free font thrown in at the end
as a sop). Is this really the direction we want to go, considering
that in many other areas we prefer to use free software whenever we
can?
Looking around a bit, I see this has been discussed in some "back
corners"[2][3] (no offense intended), but not on this list and I don't
see any place where free versus non-free was actually discussed rather
than being brought up and then seemingly ignored.
In case it helps, I did some searching through mediawiki/core and
WMF-deployed extensions for font-family directives containing non-free
fonts. The results are at
https://www.mediawiki.org/wiki/User:Anomie/font-family (use of
non-staff account intentional).
[1]: https://gerrit.wikimedia.org/r/#/c/79948
[2]: https://www.mediawiki.org/wiki/Talk:Wikimedia_Foundation_Design/Typography#…
[3]: https://bugzilla.wikimedia.org/show_bug.cgi?id=44394
This looks like a good time to fork this conversation, as this is a good
problem to fix: how can we notify developers when they break other things
in the stack?
On 7 Mar 2014 05:36, "Bartosz Dziewoński" <matma.rex(a)gmail.com> wrote:
> (continued, about the browser testing)
>
> (tl;dr where are the tests and how do I know they fail?)
>
> So, we have some slick browser tests. Awesome! But what's not good is
> that the tests run off-site; the results are not reported back to
> Gerrit, Bugzilla (unless someone manually files a bug, usually
> Chris), IRC, or anywhere else; and they are generally non-discoverable
> until someone shouts at you for breaking them. (As Tim guessed, I did
> not know about any failures until Jon told me.)
>
> In fact, I still have no idea what exactly the tests encompass (I've
> heard about some browser tests for VE because I lurk a lot, never
> heard of any for core) or where to find them or how to run them.
> Either I'm slow or we have a serious documentation failure here.
>
> Can something be done about it? Can we have the results reported
> somewhere visible – preferably to gerrit, as jenkins already reports
> some post-merge checks there? Or maybe we can have automatically filed
> bug reports if the build breaks? A bot reporting test status on
> #wikimedia-dev? Anything?
>
> (I understand that the tests take too long to run in the
> pre-merge checks.)
>
> (Jon proposed reverting problematic changes outright, but to me that
> seems like a bit of an overreaction – bugs in tests and false
> positives happen, let's not make a huge fuss out of that.)
>
> (to be continued: about the deployment)
>
Hello,
I have added a job in Jenkins which runs the MediaWiki core PHPUnit test
suite using the Facebook HipHop virtual machine.
The job is now being run alongside the other testing jobs. It is slightly
slower (4 min 30 s) than the others, so it would delay the reporting
back to Gerrit by roughly a minute. I have made the job time out
after 8 minutes to avoid unnecessarily blocking changes.
HHVM is installed on some labs instances and uses the version in our
apt repository (though it is not automatically upgraded: 'ensure => present').
The job page:
https://integration.wikimedia.org/ci/job/mediawiki-core-phpunit-hhvm/
It is very experimental; one of the builds segfaulted:
https://integration.wikimedia.org/ci/job/mediawiki-core-phpunit-hhvm/2/
Another one has one failing test:
https://integration.wikimedia.org/ci/job/mediawiki-core-phpunit-hhvm/3/test…
DjVuTest::testPageCount
Object of class UnregisteredLocalFile could not be converted to string
That one probably needs some code to be fixed.
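For illustration (generic PHP, not the actual DjVu handler code): that
message appears whenever an object lacking a __toString() method is used
in string context, e.g.:

// Generic sketch of the failure mode; FileStub is a made-up class,
// not MediaWiki's UnregisteredLocalFile.
class FileStub {
    public $name = 'LoremIpsum.djvu';
}

$file = new FileStub();
echo "Counting pages of $file";          // fatal: "Object of class
                                         // FileStub could not be
                                         // converted to string"
echo "Counting pages of {$file->name}";  // OK: interpolates a string

// Fixes include interpolating a string property, as above, or giving
// the class a __toString() method.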
Culprit: the install.php and update.php scripts are still using PHP.
Reference:
----------
Bug 62278: "write a jenkins job to use hhvm for mwcore"
https://bugzilla.wikimedia.org/show_bug.cgi?id=62278
--
Antoine "hashar" Musso
Hello,
First of all, sorry for the inappropriate way of presenting the content;
it appears there was a problem with my email web interface. As advised by
community members, I once again present my ideas regarding multilingual,
usable, and effective captchas on my proposal page for GSoC 2014, given
here:
https://www.mediawiki.org/wiki/User:AalekhN/GSoC_proposal_2014
I therefore request all members to please go through the proposal and give
your viewpoints/advice regarding its content.
Thank You
Aalekh Nigam
"aalekhN"
https://www.mediawiki.org/wiki/User:AalekhN
From: David Gerard <dgerard(a)gmail.com>
Date: Thu, Mar 6, 2014 at 9:44 PM
Subject: Re: [Wikitech-l] Should MediaWiki CSS prefer non-free fonts?
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
(Veering off topic: So what does WMF use for a usability lab, anyway?)
...not sure what Kaldari did. In this case, he may have simply sat down
with the UX designers and done a test in person.
We do not have a usability testing lab on-site in San Francisco, and
typically prefer to do remote usability tests. Sometimes we do this
"manually" by sending out a survey[1] and then running a Google Hangout,
which we record for later. This is good since it is guided by the person
who wrote the test script, so they can adapt to what the user is doing or
failing to do. It takes a lot more legwork, though.
More often, we write a testing script and use usertesting.com, which offers
more automated remote testing at $35/test (this is really cheap since
the going US rate for an in-person test is something like a $50 Amazon gift
card). The service uses people from all over the English-speaking world who
have a variety of levels of technical expertise, and the tests are recorded
for viewing after they're completed.[2]
The UX team is actually in the process of hiring a UX researcher, so expect
to hear more about this kind of qualitative research soon.
Steven
1. We recently did this kind of recruitment and testing for article drafts
work. https://www.mediawiki.org/wiki/Draft_namespace/Usability_testing and
the /Results subpage
2. This kind of testing is something we used during the account creation
redesign
https://www.mediawiki.org/wiki/Account_creation_user_experience/User_testing