Hi, the RT migration to Phabricator is starting right now.
All the RT queues have been frozen. Please don't send any emails to
@rt.wikimedia.org addresses. This includes access-requests@ and
procurement@, the two RT queues that will not be migrated at this point;
they will be in read-only mode during the migration.
phabricator.wikimedia.org will be down between 00:00 UTC (16:00 Pacific) and
08:00 UTC (00:00 Pacific). During those eight hours, urgent issues that
cannot wait should be raised either in the #wikimedia-tech IRC channel or at
https://www.mediawiki.org/wiki/Project:Support_desk
As soon as Wikimedia Phabricator is back, requests to the Operations team
will be submitted through the regular process of creating a task and
associating it with the #Operations project. If you really miss sending
requests to Ops via email, a message to task(a)phabricator.wikimedia.org
that includes "#Operations" will do the job as well. Even requests sent to
the now-deprecated @rt.wikimedia.org email addresses will create Phabricator
tasks for a while, but the web UI process is clearly preferred.
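For example, a request sent through the email gateway could look roughly
like this (the subject becomes the task title; the request itself is made
up, purely to illustrate the format):

  To: task(a)phabricator.wikimedia.org
  Subject: Please increase the disk quota for my tool

  #Operations
  My tool's home directory is almost full; could someone raise its quota?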
Remember, the exceptions are access-requests@ and procurement@, which will
be migrated at a later stage.
The Operations team will be vigilant during the migration (as they usually
are, for that matter). Please contact them only for urgent requests during
this period. Thank you very much!
PS: we are updating https://www.mediawiki.org/wiki/Phabricator/versus_RT --
you will find all the details documented or linked from there.
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Andre won't tell you, but he wrote this:
http://blogs.gnome.org/aklapper/2014/12/17/welcome-phabricator/
It is the best in-depth analysis of a Bugzilla-to-Phabricator migration
available on the Internet. Not that there is much competition... I'm sure it
will be useful to someone, especially in conjunction with Chase's migration
script and the detailed logs and tasks that were archived.
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
All,
lsearchd/MWSearch has powered Wikimedia search for many years now. With
the successful deployment of Elasticsearch/CirrusSearch to all wikis we've
completed the migration, and lsearchd has reached its end of life: it is
no longer supported in development or usage. Any third-party users
of MWSearch are encouraged to move to CirrusSearch, which attempts to
remain largely compatible with MWSearch's behavior.
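For third-party wikis the switch is mostly a configuration change. A minimal
LocalSettings.php sketch, assuming an Elasticsearch server is already running
and the Elastica and CirrusSearch extensions are installed (the hostname
below is just a placeholder), would look roughly like:

  require_once "$IP/extensions/Elastica/Elastica.php";
  require_once "$IP/extensions/CirrusSearch/CirrusSearch.php";

  // Placeholder: point CirrusSearch at your Elasticsearch server(s).
  $wgCirrusSearchServers = array( 'elasticsearch.example.org' );

  // Use CirrusSearch as the wiki's search backend.
  $wgSearchType = 'CirrusSearch';

See the CirrusSearch extension documentation for the exact settings and the
maintenance scripts needed to build the initial index.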
I didn't want to break any gadgets or scripts, so I checked the API logs
going back about a month. enwiki, ruwiki and nlwiki still have a bit of
traffic using the old search backend, so I left it enabled there. lsearchd
has been disabled on all other Wikimedia wikis, though, and is now in ops'
hands for decommissioning. Once any bot/tool/etc. authors still using the
old search have been contacted (and helped, if possible), lsearchd will go
away there too.
lsearchd is dead. Long live Elasticsearch.
-Chad
(CCing wikitech-l)
Dimitar, this is great news! This will be our second year with a Wikimedia
stand. With Wikimedia Belgium officially constituted and conversations
started with the Wikimedia Shop, we should have a much better setup this
year. We should get other European chapters involved as well, so they can
bring swag and hopefully some volunteer time to take a shift.
I have just created two tasks to coordinate Wikimedia's presence at FOSDEM:
https://phabricator.wikimedia.org/T84972 is for general coordination. Any
Wikimedian attending is encouraged to subscribe to this task and use it to
stay in touch.
https://phabricator.wikimedia.org/T84971 is specifically about organizing
the Wikimedia stand.
On Wed, Dec 17, 2014 at 3:42 PM, Dimitar Dimitrov <
dimitar.dimitrov(a)wikimedia.de> wrote:
>
>
> ---------- Forwarded message ----------
> From: FOSDEM Stands Team <stands(a)fosdem.org>
> Date: 2014-12-17 15:35 GMT+01:00
> Subject: Your stand proposal for Wikimedia has been accepted
> To: Dimitar Dimitrov <dimitar.dimitrov(a)wikimedia.de>
>
> Hi Dimitar,
>
> The FOSDEM stands team is glad to be able to inform you that your request
> for a stand for Wikimedia has been accepted.
> There will be one table reserved for you.
>
> You will receive further information about what's expected of you closer
> to the event date.
>
> Looking forward to seeing you at FOSDEM 2015!
>
>
> Kind regards,
>
> Wynke Stulemeijer
> FOSDEM stands team
>
>
> --
> Dimitar Dimitrov
> Wikimedian in Brussels
>
> mobile: +32497720374
> landline: +32 2 540 2483
> Rue du Trône 51 Troonstraat
>
> *Imagine a world in which every single human being can freely share in the
> sum of all knowledge. That's our commitment. Help us with it in the EU!
> <http://meta.wikimedia.org/wiki/EU_Policy>*
> www.wikimedia.org
>
>
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello,
Jenkins runs the MediaWiki core unit tests under HHVM, and the job will
now prevent changes from being merged if it fails.
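If you want to reproduce this locally and have HHVM installed, running the
core suite under HHVM instead of Zend PHP should roughly be a matter of
invoking the usual test runner with the hhvm binary:

  hhvm tests/phpunit/phpunit.php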
Huge thanks to everyone who helped fix the tests and the HHVM code base!
--
Antoine "hashar" Musso
Hello,
I found out we do not run the 'structure' test suite of mediawiki/core on
extensions. It is made of three tests in tests/phpunit/structure:
AutoLoaderTest.php verifies that classes are properly registered and that
the autoloader entries point to an actual file.
ResourcesTest.php does the same for ResourceLoader.
StructureTest.php makes sure test file names end with Test.php.
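To give an idea of what AutoLoaderTest.php expects, here is a minimal,
purely hypothetical example of an extension's class registration; the test
fails if a class is not registered or if the registered path does not exist:

  // Hypothetical entry in an extension setup file; AutoLoaderTest.php checks
  // that the class is registered and that the target file actually exists.
  $wgAutoloadClasses['MyExtensionHooks'] = __DIR__ . '/MyExtensionHooks.php';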
Jenkins runs the 'extension' test suite on extensions; I will add the
'structure' directory to it tomorrow morning, ~9am UTC.
You can verify whether your extension passes by running:
php tests/phpunit/phpunit.php --testsuite structure
The patch for core is: https://gerrit.wikimedia.org/r/#/c/180496/
Ideally we should backport it to REL branches as well.
Filed as https://phabricator.wikimedia.org/T78798
--
Antoine "hashar" Musso
Can anyone think of a reason *not* to eliminate wgParser usage from
WikitextContent? I have a bug (T76651) that is resolved by making
WikitextContent use its own parser instance rather than wgParser. I wrote
an experimental fix for WikitextContent that clones the parser in the same
way MessageCache::getParser does, and I am trying to decide the best way forward.
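For reference, here is a rough sketch of the approach (simplified, not the
actual patch under review; property and method names are illustrative):
WikitextContent would lazily clone the global parser once and keep its own
instance, much like MessageCache::getParser does:

  // Simplified sketch only, not the actual change.
  private function getParser() {
      global $wgParser;
      if ( !$this->parser ) {
          // Initialise the global parser once, then keep a private copy so a
          // reentrant parse cannot clobber $wgParser's state (see T76651).
          $wgParser->firstCallInit();
          $this->parser = clone $wgParser;
      }
      return $this->parser;
  }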
Thanks,
Keith Welter