Hi everyone,
I started poking at parser tests lately and found myself puzzled after a
while.
It seems that parser test files follow different rules when run with
phpunit.php than when run with parserTests.php.
If I observed this correctly, phpunit.php collects all articles to be
created with "!!article", creates them, and only then runs the tests. With
parserTests.php, on the other hand, everything is executed in the order it
is defined. In some tests it can be important whether an article already
exists or not.
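For context, here is a minimal sketch of a parser test file with an "!! article" block (the page name, wikitext, and test content are hypothetical). If the observation above holds, parserTests.php would process these blocks top to bottom, while phpunit.php would create all articles first:

```
!! article
Template:Example
!! text
Hello from the template.
!! endarticle

!! test
Transclusion of an existing template
!! input
{{Example}}
!! result
<p>Hello from the template.
</p>
!! end
```

A test that expects a red link to Template:Example would then pass under one runner and fail under the other, depending on whether the article was created before the test ran.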
There might be other behavioral differences here as well. The whole thing
seems incredibly odd to me, since there is also some redundant code, and the
initial globals set up in ParserTest::setupGlobals() differ slightly
from the globals set up in NewParserTest::setupGlobals().
If there is no good reason against it, the two classes, ParserTest and
NewParserTest, should be reduced to one, or should at least share a base
class/interface. The goal should be that parser tests behave exactly the
same when run with phpunit.php as when run with parserTests.php.
I already created a bug report for this, but it hasn't gotten any
attention so far, so I'm trying here:
https://bugzilla.wikimedia.org/show_bug.cgi?id=39473
I would very much appreciate it if anyone could explain to me why both of
these files exist and why we maintain (more or less) a whole bunch of
redundant code for these tests.
Cheers,
Daniel
--
Daniel Werner
Software Engineer
Wikimedia Deutschland e.V. | NEU: Obentrautstr. 72 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/681/51985.
Hi everyone,
Our Analytics crew have worked out how to generate a graph that gives
us a view into our code review backlog:
http://gerrit-stats.wmflabs.org/graphs/mediawiki
The red line is roughly the equivalent of this search in the Gerrit search box:
is:open -CodeReview=+2 -CodeReview=+1 -CodeReview=-1 -CodeReview=-2
project:^mediawiki.*
...which, in English, means "everything in the mediawiki/* projects
that hasn't been marked with a positive or negative review yet, and
hasn't been merged or abandoned yet".
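To make the query's semantics concrete, here is a minimal Python sketch of the same filter applied to change records. The dict shape is a hypothetical stand-in for what Gerrit returns, not its actual API:

```python
def is_backlog(change):
    """True if a change counts toward the review backlog: it is open,
    lives in a mediawiki/* project, and has no CodeReview vote
    (+2/+1/-1/-2) on its current patchset."""
    return (
        change["status"] == "open"
        and change["project"].startswith("mediawiki")
        and not change.get("code_review_votes")  # no positive or negative review yet
    )

# Hypothetical sample data: only the first change should be counted.
changes = [
    {"project": "mediawiki/core", "status": "open", "code_review_votes": []},
    {"project": "mediawiki/core", "status": "open", "code_review_votes": [2]},
    {"project": "operations/puppet", "status": "open", "code_review_votes": []},
    {"project": "mediawiki/extensions/Foo", "status": "merged", "code_review_votes": []},
]

backlog = [c for c in changes if is_backlog(c)]
print(len(backlog))  # prints 1
```

The red line on the graph is, roughly, this count computed over time.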
These numbers seem to be accurate only to within +/- 10 revisions, and the
error is not even across the history, so bear that in mind as you look at
the numbers. In particular, the graph seems to paint a slightly rosier
picture of how we're keeping up with the backlog than is actually the
case.[2]
That said, we seem to be doing pretty well at keeping up - better than
I would have thought had you asked me before I had the graph staring me in
the face. We still have quite a backlog, but it appears to be shrinking
by a modest amount. Our backlog appears to have peaked in mid-July. For
those of you who have been reviewing, thanks for keeping up!
As of this writing, there are 207 revisions that have neither positive
nor negative reviews associated with them. That still seems like a
pretty big number. 30 of those are more than a month old, and some
date back to May.
How is the process working for everyone? Is stuff getting reviewed
quickly enough? How has it been for the reviewers out there?
Rob
[1] https://github.com/wikimedia/limn
[2] For those interested in the gory details. Unfortunately, it's not
a perfect history due to the way Gerrit stores the history of
approvals (or rather, the fact that Gerrit doesn't store the
"history", just the current approval state for any given patchset).
In addition to known discrepancies, there may well be other issues.
In tracking it over the past couple of days, it looks like the last
few days are slightly undercounted (relative to the historical
numbers), as they drift upward every day or so. Everything prior to
August 12 is stable, though, so we seem to be getting *consistent*
numbers for everything before August 12 (though quite possibly
overcounted by 10-ish revisions). It also more-or-less lines up with
the few manual datapoints that I have.
This crossed my desk this morning. It is a long, detailed (and honest!)
account by an insider of Google's efforts to increase code quality and
product quality. I think it's relevant to what we're doing at WMF, and
to what we might do in the future.
http://mike-bland.com/2012/07/10/test-mercenaries.html
Greetings Wikitech Members,
The official release of the WLM Android App has been published on the
Google Play store!
https://play.google.com/store/apps/details?id=org.wikipedia.wlm
Now is the time to download the app from Google Play and tell us if
there are any issues before the contest starts on September 1st. All
uploads go to Commons, so send photos that you want the world to see.
In just a few days we've seen some amazing uploads including:
*
http://commons.wikimedia.org/wiki/File:Torre_de_H%C3%A9rcules_(taken_on_30A…
*
http://commons.wikimedia.org/wiki/File:Union_Square_(taken_on_27Aug2012_13h…
*
http://commons.wikimedia.org/wiki/File:Southern_Pacific_Railroad_Locomotive…
and you can see all of our mobile uploads below:
http://commons.wikimedia.org/wiki/Category:Uploaded_with_Android_WLM_App
For those of you in San Francisco, come by the third floor to see a
visualization of all the mobile uploads on the big screen in admin
alley. If you're not with us, you can load
http://jonrobson.me.uk/wlm/ to see the same. We took the majority of
these photos on Monday, when the team took a break from development and
wandered San Francisco, Tucson, and Chicago to field-test our app. This
was one of the best ways we could test it. Since Monday we've seen amazing
photos from Pittsburgh, South Dakota, Mexico, and Spain, from Wikipedians
we know and many we don't. We love when this happens!
Please write to us with any comments so that we can make the app even
better.
Thank you for your cooperation and support, and let's make 2012 the
best year ever!
Best regards,
The WMF Mobile Team
--
Phil Inje Chang
Product Manager, Mobile
Wikimedia Foundation
415-812-0854 m
415-882-7982 x 6810
This is an announcement of a security release of the
LdapAuthentication plugin. This release adds support for the fix
released in MediaWiki core versions 1.19.2 and 1.18.5, which addressed
data leakage and false authentication in situations where the
LdapAuthentication extension returned strict() as false.
The commit for this fix is: e67d0d392d261aa5a6b59f61dae4c42119aef2e3
The change-id in Gerrit is: I5244af48b895ebfb7ca79f04019924a172c417e4
It's also tagged in the git repo as 2.0c.
Additionally, after upgrading MediaWiki and the LdapAuthentication
extension, you should also purge leaked passwords from your local
database. Please see bug 39184
<https://bugzilla.wikimedia.org/show_bug.cgi?id=39184> for information
regarding purging the passwords.
- Ryan Lane
Hello,
After much hassle with Jenkins, Ant, and PHPUnit, I finally managed to
run an extension's PHPUnit test suite under Jenkins.
The winner is TitleBlacklist for which results are available via:
https://integration.mediawiki.org/ci/job/Ext-TitleBlacklist/
The Jenkins job is set up to report the build status back to Gerrit.
If you see any issues, please file a bug under Testing infrastructure.
TODO: Job does not run PHP lint yet.
--
Antoine "hashar" Musso
Hi All,
I have moved all search traffic to hosts now running Ubuntu 12.04
(precise). This should be a no-op, as the search software itself has not
changed and all automated tests looked good, but I wanted to make you all
aware, as I have neither the time nor the language skills to manually test
every language.
If you notice any irregularities, please let me know!
--peter