Hi,
So the time is upon us to finally upgrade Gerrit. I thank everyone in
advance for all of your patience
and good work testing things out for us. The plan is outlined in detail on
Phabricator[0], but I'll give
everyone the short version here.
The downtime will be on Monday, July 25th from 01:00 to 04:00 UTC; that's
Sunday night for those
of us who are US-based. This time was picked because it's one of our
lowest-traffic times
on Gerrit. There's never a *good* time to bring it down, it's always being
used, but this window looks like it'll
affect the fewest users. I do not anticipate the process actually
taking the full 3 hours,
but I'm giving us a lot of extra time just in case.
Jaime will be on hand to assist with a final DB snapshot of the old version
and a possible rollback,
and Daniel Z. is going to assist me with the puppet work. Most of this is
already prepared, so the
amount of "change" needed to do the swap has been kept to a minimum.
Gerrit is a critical service for all developers, so the plan includes a
generous provision to roll back
if things are not operating properly--I'd rather we be on the old version
that works on a Monday
morning than be stuck broken going into the work week.
I'll be sure to send a last minute reminder Sunday night prior to taking
services offline.
Thanks again!
-Chad
[0] https://phabricator.wikimedia.org/T70271#2482308
What proportion of MediaWiki installations run on 32-bit systems? How much
memory is available to a typical MediaWiki install? How often is the Oracle
database backend used?
These are the kinds of questions that come up whenever we debate changes
that impact compatibility. More often than not, the questions go
unanswered, because we don't have good statistical data about the
environments in which MediaWiki is running.
Starting with version 1.28, MediaWiki will provide operators with the
option of sharing anonymous data about the local MediaWiki instance and its
environment with MediaWiki's developer community via a pingback to a URL
endpoint on MediaWiki.org.
The configuration variable that controls this behavior ($wgPingback) will
default to false (that is: don't share data). The web installer will
display a checkbox for toggling this feature on and off, and it will be
checked by default (that is: *do* share data). This ensures (I hope) that
no one feels surprised or violated.
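For operators who skip the web installer and configure by hand, this would presumably be a one-line toggle in LocalSettings.php (a sketch based only on the variable named above):

```php
<?php
// Opt in to sharing anonymous environment data with MediaWiki.org.
// Omitting this line (or setting it to false) keeps the default: no data is sent.
$wgPingback = true;
```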
The information that gets sent is described in <
https://meta.wikimedia.org/wiki/Schema:MediaWikiPingback>. Here is a
summary of what we send:
- A randomly-generated unique ID for the wiki.
- The chosen database backend (e.g., "mysql", "sqlite")
- The version of MediaWiki in use
- The version of PHP
- The name and version of the operating system in use
- The processor architecture and integer size (e.g. "x86_64")
- The name of the web server software in use (e.g. "Apache/1.3.14")
Neither the wiki name nor its location is shared.
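To make the list above concrete, the payload might look something like this (illustrative values only; the authoritative field names and types are whatever the Schema:MediaWikiPingback page linked above defines, not this sketch):

```php
<?php
// Hypothetical example of the data a wiki might report; every value here
// is made up for illustration.
$pingbackData = [
    'database'       => 'mysql',
    'MediaWiki'      => '1.28.0-alpha',
    'PHP'            => '5.6.24',
    'OS'             => 'Linux 3.16.0-4-amd64',
    'arch'           => 64,          // integer size in bits
    'machine'        => 'x86_64',    // processor architecture
    'serverSoftware' => 'Apache/2.4.10',
];
```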
The plan is to make this data freely available to all MediaWiki developers.
Before that can happen, I will need to solicit reviews from security folks
and from the WMF's legal team, but I don't expect any major issues.
Please chime in if you have any thoughts about this. :)
The change-set implementing this functionality is <
https://gerrit.wikimedia.org/r/#/c/296699/>, if you want to take a look.
> The configuration variable that controls this behavior ($wgPingback) will
> default to false (that is: don't share data). The web installer will
> display a checkbox for toggling this feature on and off, and it will be
> checked by default (that is: *do* share data). This ensures (I hope) that
> no one feels surprised or violated.
Sounds sane, as long as the installer makes it quite clear what it is going
to be doing.
> - The chosen database backend (e.g., "mysql", "sqlite")
Would love to have DB version information as well (getServerVersion)
Lua version?
> Please chime in if you have any thoughts about this. :)
Many of the wikis I install are on intranets behind heavy firewalls. I'd be happy
to submit this data, however, if there were an optional method to do so.
--
Greg Sabino Mullane greg(a)endpoint.com
End Point Corporation
PGP Key: 0x14964AC8
Hi Wikitech,
I believe that Flow is being used on an opt-in basis on Wikidata and
Catalan Wikipedia, and is used on user talk pages by default on MediaWiki.
Is that correct?
Are there any wikis other than MediaWiki where Flow is enabled on all talk
pages by default?
Are there any wikis where Flow is widely used on an opt-in basis, even if
it's not the default?
Thanks,
Pine
(sorry for cross posting)
Hi all,
on July 21 23:00-00:00 (UTC) RevisionSlider got deployed as a beta feature
on German Wikipedia, Arabic Wikipedia and Hebrew Wikipedia.
The RevisionSlider extension adds a slider interface to the diff view, so
that you can easily move between revisions. It helps users view edit
summaries and other metadata of all revisions while hovering over the
slider interface. In its current state, the last 500 revisions can be
loaded. [1]
The RevisionSlider extension was developed by Wikimedia Deutschland's TCB
team and fulfills a wish from the German-speaking community's Technical
Wishlist. [2] It is based on a rough prototype by the Community Tech Team
who we love collaborating with. [3]
The feature was already presented at the WMF's last metrics meeting, so if
you are interested in hearing more about it, you can also have a look at the
video recording. [4]
Why German, Arabic and Hebrew Wikipedia in the first round?
As a first step, we want to see if RevisionSlider works well on both LTR
and RTL Wikipedias. The decision for German Wikipedia is probably obvious,
as RevisionSlider addresses a wish from the German-speaking community.
But the team also put some work into optimizing the feature for RTL
languages, and we talked to people from Arabic and Hebrew Wikipedia
at Wikimania. Both communities created a site request ticket to deploy
RevisionSlider as a Beta feature on their projects. [5] [6]
So far everything has gone well, and we also got our first really nicely
written bug report from Hebrew Wikipedia - https://phabricator.wikimedia.org/T141071
;).
We are really happy about the constructive and appreciative collaboration
style around RevisionSlider, it's so much fun working like this! Thanks a
lot to everyone who was/is involved in the work around the feature! This
includes people from German, Arabic and Hebrew Wikipedia and people from
the WMF, who did the security review and helped with other important stuff.
A special thanks goes to Moriel Schottlaender, who gave us valuable advice
on how to RTLize the feature and who even contributed code!
Next steps: We want to see how RevisionSlider works on dewiki, hewiki and
arwiki and then we can think about providing the feature also for other
language communities.
It would be great if you could find time to test the feature and give us
feedback! If you're not usually a user of German, Arabic or Hebrew
Wikipedia, you can also go to Beta Labs and try RevisionSlider on a set of
English-language articles. [7]
Thanks,
Birgit
[1] https://www.mediawiki.org/wiki/Extension:RevisionSlider
[2] https://meta.wikimedia.org/wiki/WMDE_Technical_Wishes
[3] https://meta.wikimedia.org/wiki/Community_Tech/RevisionSlider
[4]
https://meta.wikimedia.org/wiki/File:WMF_Metrics_and_Activities_Meeting_-_J…
[5] https://phabricator.wikimedia.org/T140551
[6] https://phabricator.wikimedia.org/T140545
[7]
http://simple.wikipedia.beta.wmflabs.org/wiki/Special:Preferences#mw-prefse…
--
Birgit Müller
Community Communications Manager
Software Development and Engineering
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us make that happen!
http://spenden.wikimedia.de/
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/681/51985.
https://phabricator.wikimedia.org/T119736 - "Could not find local user data for {Username}@{wiki}"
There was an order-of-magnitude increase in the rate of those errors,
starting on July 7th.
Investigation and remediation are ongoing.
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
All,
From what we in Multimedia can tell, the ImageMetrics extension was put
into production as an EventLogging source to measure data about users and
images, to make decisions about MediaViewer whilst it was being actively
developed.
As that's now in the past, is there any interest in continuing to store
this data, or can we kill it and reduce the number of extensions in
production by one?
Follow-up on https://phabricator.wikimedia.org/T140952 please.
J.
--
James D. Forrester
Lead Product Manager, Editing
Wikimedia Foundation, Inc.
jforrester(a)wikimedia.org | @jdforrester
In a custom extension, I have a unit test that was working fine in MediaWiki 1.26, but it fails in MediaWiki 1.27. I'd appreciate any pointers in the right direction.
The extension performs a page move, more or less like this (but with error checking):
$mp = new MovePage($source, $destination);
$result = $mp->move($context->getUser(), $reason, false);
The code works fine on a running wiki, but in a unit test under MW 1.27, I get this error:
MWException: No valid null revision produced in MovePage::moveToInternal
The test seems to be running into issues with the LinkCache. The call to MovePage::move hits this line:
$pageid = $this->oldTitle->getArticleID( Title::GAID_FOR_UPDATE );
which returns zero instead of the page's proper article ID. (This did not happen in MediaWiki 1.26.) Inside Title::getArticleID we find these lines:
if ( $flags & self::GAID_FOR_UPDATE ) {
    $oldUpdate = $linkCache->forUpdate( true );
    $linkCache->clearLink( $this );
    $this->mArticleID = $linkCache->addLinkObj( $this );
    $linkCache->forUpdate( $oldUpdate );
}
The call to LinkCache::addLinkObj( $this ) returns zero. As a result, MovePage::moveToInternal fails and we get the error message shown above.
So, I am wondering if anyone has any insights into why this began happening with MW 1.27, and what directions I should explore to make my test work again.
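One possible direction to dig in (a sketch only, not a confirmed fix, and assuming LinkCache::singleton() and clear() are available in 1.27 as in core): if the test environment's LinkCache singleton is holding a stale entry for the source title, clearing it and forcing a fresh lookup before the move might restore the correct article ID:

```php
<?php
// Hypothetical workaround sketch, not a confirmed fix: drop any stale
// LinkCache entry so getArticleID() re-reads from the database.
LinkCache::singleton()->clear();
$source->getArticleID( Title::GAID_FOR_UPDATE );

$mp = new MovePage( $source, $destination );
$result = $mp->move( $context->getUser(), $reason, false );
```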
Thank you very much.
DanB
We're happy to announce that after numerous tests and analyses[1] and a
fully operational demo[2], the Discovery Team is ready to release
TextCat[3] into production on wiki.
What is TextCat? It detects the language the search query was written
in, which allows us to look for results on a different wiki. TextCat is a
language detection library based on n-grams[4]. During a search, TextCat
will only kick in when the following three things occur:
1. fewer than 3 results are returned from the query on the current wiki
2. language detection is successful (meaning that TextCat is reasonably
certain what language the query is in, and that it is different from the
language of the current wiki)
3. the other wiki (in the detected language) has results
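For intuition, here is a toy sketch of the n-gram technique in Python, in the spirit of the classic "out-of-place" ranking method that n-gram language detectors use (the real TextCat library ships with trained profiles; the sample texts, profile size, and penalty below are all illustrative only):

```python
from collections import Counter

def ngram_profile(text, n_max=3, top=300):
    """Rank the most frequent character n-grams (1..n_max) of a text."""
    counts = Counter()
    for word in text.lower().split():
        padded = f"_{word}_"  # mark word boundaries
        for n in range(1, n_max + 1):
            for i in range(len(padded) - n + 1):
                counts[padded[i:i + n]] += 1
    ranked = [gram for gram, _ in counts.most_common(top)]
    return {gram: rank for rank, gram in enumerate(ranked)}

def out_of_place(query_profile, lang_profile):
    """Sum of each query n-gram's position in the language profile;
    n-grams missing from the profile get a maximum penalty.
    Lower total = closer match."""
    penalty = len(lang_profile)
    return sum(lang_profile.get(gram, penalty) for gram in query_profile)

def detect(query, profiles):
    """Return the language whose profile best matches the query."""
    query_profile = ngram_profile(query)
    return min(profiles, key=lambda lang: out_of_place(query_profile, profiles[lang]))

# Toy "training" texts; real profiles are built from large corpora.
profiles = {
    "en": ngram_profile("the quick brown fox jumps over the lazy dog and then some"),
    "de": ngram_profile("der schnelle braune fuchs springt über den faulen hund und noch mehr"),
}
print(detect("über den hund", profiles))  # prints "de" with these toy profiles
```

With real profiles and a confidence threshold (so that detection only fires when one language is clearly the best match), the same ranking idea scales to full query-language detection.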
Our analysis of the A/B test[5] (for English, French, Spanish, Italian and
German Wikipedias) showed that:
"...The test groups not only had a substantially lower zero results rate
(57% in control group vs 46% in the two test groups), but they had a higher
clickthrough rate (44% in the control group vs 49-50% in the two test
groups), indicating that we may be providing users with relevant results
that they would not have gotten otherwise."
This update is scheduled for production release during the week of
July 25, 2016 on the following Wikipedias:
- English [6]
- German [7]
- Spanish [8]
- Italian [9]
- French [10]
TextCat will then be added to this next group of Wikipedias at a later
date:
- Portuguese[11]
- Russian[12]
- Japanese[13]
This is a huge step forward in creating a search mechanism that is able to
detect - with a high level of accuracy - the language that was used and
produce results in that language. Another forward-looking aspect of the
TextCat work is the investigation of a confidence-measuring algorithm[14],
to ensure that the language detection results are the best they can be.
We will also be doing more[15] A/B tests using TextCat on non-Wikipedia
sites, such as Wikibooks and Wikivoyage. These new tests will give us
insight into whether applying the same language detection configuration
across projects would be helpful.
Please let us know if you have any questions or concerns on the TextCat
discussion page[16]. Also, for screenshots of what this update will look
like, please see this one[17], showing an existing search typed in on enwiki
in Russian ("первым экспериментом"), and this one[18], showing what it will
look like once TextCat is in production on enwiki.
Thanks!
[1] https://phabricator.wikimedia.org/T118278
[2] https://tools.wmflabs.org/textcatdemo/
[3] https://www.mediawiki.org/wiki/TextCat
[4] https://en.wikipedia.org/wiki/N-gram
[5]
https://commons.wikimedia.org/wiki/File:Report_on_Cirrus_Search_TextCat_AB_…
[6] https://en.wikipedia.org/
[7] https://de.wikipedia.org/
[8] https://es.wikipedia.org/
[9] https://it.wikipedia.org/
[10] https://fr.wikipedia.org/
[11] https://pt.wikipedia.org/
[12] https://ru.wikipedia.org/
[13] https://ja.wikipedia.org/
[14] https://phabricator.wikimedia.org/T140289
[15] https://phabricator.wikimedia.org/T140292
[16] https://www.mediawiki.org/wiki/Talk:TextCat
[17] https://commons.wikimedia.org/wiki/File:Existing-search_no-textcat.png
[18] https://commons.wikimedia.org/wiki/File:New-search_with-textcat.png
--
Deb Tankersley
Product Manager, Discovery
IRC: debt
Wikimedia Foundation
https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-07-20
= 2016-07-20 =
== Product ==
=== Reading ===
==== Reading Web ====
* No update, working on language switcher on mobile web
==== iOS native app ====
* 5.0.5 heading to regression today, expected release to Apple store later
this week or early next week
* Development of 5.1 is in progress
* Planning of 5.2 is in progress
==== Android native app ====
* Feed is released in beta! (Follow-up bugfix beta release is cooking as
we speak.)
* We are starting work on the navigation overhaul.
* Heads-up to RelEng: we are going to talk this week about whether we have
bandwidth this Q to transition to Differential code reviews.
==== Mobile Content Service ====
* First public feed endpoints are deployed: aggregated + smart random
==== Reading Infrastructure ====
=== Community Tech ===
* Patch for numeric sorting is ready for review (
https://gerrit.wikimedia.org/r/#/c/299108/)
** Will be rolling out on test wiki first. Need another test wiki before
English Wikipedia (preferably one already using UCA collation)
* Fixed security bug in Pageviews Analysis
* Architecture Committee RFC meeting about the cross-wiki watchlist back-end
today at 2pm
=== Editing ===
==== Collaboration ====
* Blocked - None
* Blocking - Krinkle would like us to stop using buildCssLinks to pave the
way for a refactoring. Otherwise, no change.
* Updates
** Turned off Echo transition flags, now that the maintenance scripts are
done. This should improve performance and avoid unexpected side effects.
** Echo features (such as animation when notifications move in list) and
bug fixes.
** Flow security fixes merged to master; they were already on the cluster.
==== Parsing ====
* In collaboration with Services (Marko) & Ops (Giuseppe), we transitioned
Parsoid to be based on service-runner. Parsoid deploys will resume tomorrow
/ Monday.
* Tim is working on addressing an HHVM segfault in the preprocessor, which
was reported by Giuseppe in a security bug.
* Scott & Tim are working on a PHP-only Tidy replacement, which is close to
being done.
* OCG (Offline Content Generator) outage this week due to unrelated proxy
misconfiguration: T140789
==== VisualEditor ====
* Blocked: None.
* Blocking: None known.
* Update: Quiet week. Mostly working on bugs and the new wikitext editor.
The CustomData extension dependency has been removed in master from all
three remaining Wikivoyage extensions that used it; we will be able to
de-deploy it in the next few weeks.
=== Discovery ===
* '''Blocking''': none
* '''Blocked''': none
* logstash.wikimedia.org upgraded to latest Kibana version
* TextCat A/B test results are in:
https://commons.wikimedia.org/wiki/File:Report_on_Cirrus_Search_TextCat_AB_…
** TLDR: Success
* TextCat demo has new design: https://tools.wmflabs.org/textcatdemo/
* GeoSearch launched:
https://www.mediawiki.org/wiki/Help:CirrusSearch#Geo_Search
=== Interactive ===
* Launched maps on meta, cawiki, hewiki, mkwiki
* This Friday (July 22) in Seattle - data visualization hackathon
https://www.mediawiki.org/wiki/DataViz_Seattle_hackathon
* This Weekend - the whole team is in Seattle for State of the Map US
conference
== Technology ==
=== Analytics ===
* Issues with EventBus deployment and new schemas; the service was rejecting
events and had to be restarted
* Reconstructing edit history from the MW database; pretty sure it is
possible, but we will know better after this week
* Scaling the pageview API; our new cluster has issues with being able to
load data and compact (we needed to change compaction from the old scheme)
=== Research ===
* Memory issues on scb1001/1002 related to ORES have been partially
addressed.
** Lower number of uwsgi processes
** Periodic restart of celery workers to address memory leak
https://phabricator.wikimedia.org/T140020
** Explore changing model from Random Forest to Gradient Boosting
https://phabricator.wikimedia.org/T139963
* We'll be seeking dedicated hardware. (It would be great if anyone in Ops
could reach out to us to help with that process.)
=== Services ===
* Feed endpoints deployed, but need to revisit using `/feed/featured` as it
takes a looong time for MCS to compute it
**
https://en.wikipedia.org/api/rest_v1/?doc#!/Feed/get_feed_featured_yyyy_mm_…
* Parsoid move to service-runner and service::node completed
* service-template-node v0.4.0 is out - please update soon.
** security issue addressed
** new feature - automatic metrics collection
* Marko out next week
=== Security ===
* Verifying T140366
* Reviewing 296699
* Drafting/editing security team job descriptions
* Request security reviews:
https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Security_reviews
* MediaWiki 1.27.1 security release planned for early August
=== RelEng ===
* Blocking
** Android to differential
* Blocked
** None
* Updates
** Zuul upgraded this week, should address a bunch of issues
** New SWAT deploy process going ok, reminder to install
https://wikitech.wikimedia.org/wiki/X-Wikimedia-Debug if you're putting
things up for SWAT
=== Fundraising Tech ===
* Civi post-upgrade bugfixes
** more work on batch contact de-duplication
* CentralNotice deployed (last week), watching closely for glitches
** No longer serving CN modules on special pages and action=edit (
https://phabricator.wikimedia.org/T139439)
* Upgraded payments to MW 1.27 (LTS!)
** Still hoping to get closer to master, but this buys us a lot of time
* Killed ancient homegrown form template engine (-9,000 loc !)
* Experimenting with scrutinizer-ci
* Pivoting to ActiveMQ replacement work
* Building out new servers
* No blockers
=== TechOps ===
* '''Blocking'''
** None
* '''Blocked'''
** https://phabricator.wikimedia.org/T135483 - HHVM crashes - raised to
UBN! after issue recurrence. Currently no one owns the ticket.
*** Looks like there's already a patch at
https://gerrit.wikimedia.org/r/299710 -- [cscott, for parsing (and tim
starling, who wrote the patch)]
* Updates:
** Insecure (non-HTTPS) POST traffic blocked completely as of yesterday;
may see reports of broken bots/tools -
https://phabricator.wikimedia.org/T105794
== Wikidata ==
* No blockers.
* Back into the regular 2-week Scrum sprint. Tying up loose ends to get
stuff done.
* Reworking jQuery based UI code (minimizing the code base).
* Still working on structured data for Commons.