https://www.mediawiki.org/wiki/Scrum_of_scrums/2018-01-17#Reading_Web
= 2018-01-17 =
== Callouts ==
* Grafana will be migrated to support native LDAP on Feb 12 2018. This
means the eventual demise of grafana-admin.wikimedia.org. Announcement to
be posted to engineering@ and wikitech@.
https://phabricator.wikimedia.org/T170150
== Audiences ==
=== Readers ===
==== iOS native app ====
* Blocked by:
* Blocking:
* Updates:
** Continuing work on 5.8 - synced reading lists
==== Android native app ====
* Blocked by: weird RESTBase cache behavior:
https://phabricator.wikimedia.org/T184833
* Blocking:
* Updates:
** On track to release update with performance enhancements for Reading
Lists (and "default" list, for feature parity with iOS)
** Continuing testing of reading list syncing.
==== Reading Web ====
* Blocked by:
** Release Engineering - need to set up some CI on the PDF service
https://phabricator.wikimedia.org/T179552
** Services - soon we will be looking at how to apply different styles to
the PDF service (https://phabricator.wikimedia.org/T181680) and how to use
firejail to limit resources used by the Chromium render service
(https://phabricator.wikimedia.org/T180626) - we may need some guidance.
* Blocking:
* Updates:
** We will be releasing a revamp of the mobile settings page. It also
brings structure to the mobile beta, meaning it will inform the user what
features are in beta at any given time. (see
https://phabricator.wikimedia.org/T182217)
** Added instrumentation to the print-to-PDF button to understand our
users better (https://phabricator.wikimedia.org/T181297)
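The firejail resource-limiting idea raised under blockers (T180626) could, as a rough sketch, look something like the following. The flag values and the chromium invocation are illustrative assumptions, not the deployed configuration:

```shell
# Hypothetical sketch for the T180626 discussion: wrap a headless
# Chromium render worker in firejail to cap its resources.
#   --rlimit-as     caps virtual memory per process (bytes)
#   --rlimit-nproc  caps how many processes the sandbox may spawn
#   --net=none      removes direct network access from the sandbox
CHROMIUM_CMD='chromium --headless --disable-gpu --print-to-pdf=/tmp/out.pdf'
SANDBOX='firejail --rlimit-as=1073741824 --rlimit-nproc=64 --net=none'
FULL_CMD="$SANDBOX $CHROMIUM_CMD"
echo "$FULL_CMD"
# On a host with firejail installed, run it directly instead:
# $FULL_CMD "https://en.wikipedia.org/wiki/Fulda"
```

Whether `--net=none` is viable depends on how the render service fetches pages; if the worker itself needs network access, a more targeted netfilter profile would be the alternative.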
==== Reading Infrastructure ====
* Blocked by:
** Services on code review for
https://github.com/wikimedia/restbase/pull/944
* Blocking:
* Updates:
** Switchover to MCS summary implementation is delayed until after Dev
Summit week.
===== Maps =====
* Updates: None
==== Multimedia ====
* Updates: None
==== Discovery ====
* Blocked by:
* Blocking:
* Updates:
=== Community Tech ===
* Blocked by: Security for GlobalPreferences review
* Blocking:
* Updates: analysing top 10 proposals
=== Contributors ===
==== Editing ====
* Blocked by:
* Blocking:
* Updates:
==== Parsing ====
* Blocked by: None
* Blocking: None
* Updates:
** https://gerrit.wikimedia.org/r/#/c/402455/ has some changes to how
Parsoid handles the interaction between templates and responsive wrappers.
Heads up to Parsoid clients: take a look at that patch and +1 if it
doesn't affect you.
** We will roll out changes this week to replace <span> with <sup> for ref
linkbacks ( https://phabricator.wikimedia.org/T45094)
==== Global Collaboration ====
* Blocked by: Security on 5, but waiting for all-hands to talk
* Blocking: Ops on Flow dumps probably; haven't checked in with Matt/Ariel
about this recently
* Updates:
** Flow respects robot policies (noindex) now
==== UI Standardization ====
* Updates:
** OOUI v0.25.1 released
https://phabricator.wikimedia.org/diffusion/GOJU/browse/master/History.md
*** Renaming to unified “OOUI” in code documentation and comments
*** Remove buggy `translateZ` hack on scrollable PanelLayouts – keep an eye
on scrolling perf in Blink/Webkit if you use PanelLayout
* Ongoing:
** OOUI & OOUI-based products:
*** icons: Unify, refine and align to WikimediaUI Style Guide
https://phabricator.wikimedia.org/T177432 – first patches in:
https://gerrit.wikimedia.org/r/#/c/402757/
== Technology ==
=== Analytics ===
* Blocked by:
* Blocking:
* Updates:
** Found some Event data missing for January 3 through January 7. If your
reports/dashboards look weird, rerun them:
https://wikitech.wikimedia.org/wiki/Analytics/Systems/Reportupdater#Re-runs
** Clickstream blogpost:
https://blog.wikimedia.org/2018/01/16/wikipedia-rabbit-hole-clickstream/
** working on new APIs to report pageviews per project per country (sorting
out ISO codes and data shape)
** testing our java 1.8 upgrade in labs
** Working with Brandon on TLS
** Meltdown reboots continue
=== Cloud Services ===
* Blocked by:
* Blocking:
* Updates:
=== Fundraising Tech ===
* Blocked by:
* Blocking:
* Updates:
** Various CiviCRM improvements
** Making our Amazon Pay SDK fork support TCP proxy
** Re-starting work on new API for our main credit card processor
** Stats projects:
*** Andrew Green's Druid banner impressions lib:
https://github.com/AndrewGreen/centralnotice_analytics
*** talked with Analytics, planning to use EventLogging for banner and
donatewiki stats
=== MediaWiki Platform ===
* Blocked by:
* Blocking:
* Updates:
* Triaged/updated RFCs
* Multi-Content Revisions:
** Patches under review:
*** https://gerrit.wikimedia.org/r/#/c/380669/
*** https://gerrit.wikimedia.org/r/#/c/402932/
*** https://gerrit.wikimedia.org/r/#/c/393929/
* cleanupUsersWithNoId: Wikibase blocker has movement.
* Comment table: Schema change looks even closer to done.
* A decision has been reached on ExternalStore de-PHP-serialization; patch
still needs review
* Third-party developer support Phab tasks coordinated with Tech
Collaboration:
** T184606: Evaluate and set up a test instance of FOSS persistent chat
software as a companion to Q&A system for communication with third-party
developers
** T184648: Create and publish a multi-tiered support level system for
MediaWiki extensions frequently used by third parties
* Dev Summit planning
* Audiences Technology Working Group participation
=== Performance ===
* Blocked by: N/A
* Blocking: N/A
* Updates:
** Team offsite this week; will not be attending
** Prep for monitoring/evaluating Singapore at go-live
** Fixes for database "Domain ID" logic -
https://gerrit.wikimedia.org/r/404060 and
https://gerrit.wikimedia.org/r/#/c/404056/
** Docs on running performance testing on an Android phone -
https://wikitech.wikimedia.org/wiki/Measure_Performance#Testing_performance…
** Added alerts for NavigationTiming report rates -
https://phabricator.wikimedia.org/T179555#3887559
=== Release Engineering ===
* Blocking:
** None?
* Blocked by:
** "Stack overflow when Redis is down" -
https://phabricator.wikimedia.org/T185055
*** Need help from Operations and/or Performance
* Updates:
** Catching up the train this week and rolling out the last version before
DevSummit/All Hands and RelEng team offsite weeks.
*** https://phabricator.wikimedia.org/T180749#3897321
** We moved Wednesday morning's SWAT window 1 hour earlier (to 10am) to
give us an hour break before the new MW version rolls to the second set of
wikis (all non-Wikipedias); this was a follow-up from a recent post-mortem.
*** https://lists.wikimedia.org/pipermail/wikitech-l/2018-January/089404.html
*** https://phabricator.wikimedia.org/T182733
** We broke git-fat deploy repos in scap (old config no longer valid);
a workaround/fix is available in all relevant repos.
*** https://phabricator.wikimedia.org/T184882#3899710
*** (Yes, we're re-doing how the CI for scap is done; see
https://phabricator.wikimedia.org/T184628)
** Updated the Debian packaging for Zuul (CI task scheduler) and released
2.5.0-8-gcbc7f62-wmf6, unblocking an upgrade of Gerrit.
*** https://phabricator.wikimedia.org/T158243
** Converted our home-grown Docker image builder to `docker-pkg` from
Giuseppe.
*** https://phabricator.wikimedia.org/T177276
** Getting started with the basics of planning our team offsite before the
Barcelona Hackathon. Submitted travel request form and let eng-admin@ know.
** Working on browser tests with Search ("selenium-CirrusSearch-jessie
daily Jenkins job").
=== Research ===
* Blocked by:
* Blocking:
* Updates:
=== Scoring Platform ===
* Blocked by:
* Blocking:
* Updates:
=== Search Platform ===
* Blocked by:
* Blocking:
* Updates:
** Improving LTR training mechanisms
https://phabricator.wikimedia.org/T184547
** Working on fixes for the completion suggester & redirect namespaces
https://phabricator.wikimedia.org/T115756
** Investigating Elasticsearch phonetic search
https://phabricator.wikimedia.org/T182708
** Working on Serbian analysis plugins for ES
https://phabricator.wikimedia.org/T183015
** Working on refactoring search profiles to make them more config-like
https://phabricator.wikimedia.org/T183279
** Finished the machine-learning ranking test on Hebrew wiki
https://phabricator.wikimedia.org/T182616; result analysis is next
** Working on enabling WDQS-driven deep category search
https://phabricator.wikimedia.org/T181549
=== Security ===
* Blocked by:
* Blocking:
* Updates:
=== Services ===
* Blocked by: none
* Blocking: none
* Updates:
** All REST traffic is served from the Cassandra 3 cluster
** Remaining Cassandra 2 nodes are being moved to the Cassandra 3 cluster
** 50% of htmlCacheUpdate jobs are processed via Kafka
*** All except enwiki, commons, and wikidata
=== Technical Operations ===
* Blocked by:
** Global Collaboration on the Flow dumps.
https://phabricator.wikimedia.org/T164262
* Blocking:
** None
* Updates:
** Team restructuring happening. Don't expect much fallout, but we are SREs
these days.
** Grafana will be migrated to support native LDAP on Feb 12 2018. This
means the eventual demise of grafana-admin.wikimedia.org. Announcement to
be posted to engineering@ and wikitech@.
https://phabricator.wikimedia.org/T170150
== Wikidata ==
* Blocked by:
* Blocking:
* Updates:
== German Technical Wishlist ==
* Blocked by:
* Blocking:
* Updates:
== SoS Meeting Bookkeeping ==
* Updates:
Hello developers,
I would like to gauge the community's interest in taking part in a 1-day
hackathon in London on February 3. It's a bit short notice, but we have
been offered 1 day during a more general hackathon on the future of wikis.
It's going under the general title of Darvoz (Portuguese for 'to give
voice'), and Katherine Maher will be in London that evening and will
be giving a talk at the same venue from 7-9.
So I would like to see how many developers would be interested in coming
down to join the hackathon and attend the talk in the evening. At the
moment, the focus of the hackathon is undecided, but that's why I would
like to see if any community members could come and take part, and
potentially help me to organise the day. Please let me know if you are
interested and whether you could come to London for the day (and whether
you might need somewhere to stay the night, or travel expenses to get
there).
Here's the darvoz.org site.
John Lubbock
Communications Coordinator
Wikimedia UK
+44 (0) 203 372 0767
Wikimedia UK is a Company Limited by Guarantee registered in England and
Wales, Registered No. 6741827. Registered Charity No.1144513. Office 1,
Ground Floor, Europoint, 5 - 11 Lavington Street, London SE1 0NZ.
Wikimedia UK is the UK chapter of a global Wikimedia movement. The
Wikimedia projects are run by the Wikimedia Foundation (who operate
Wikipedia, amongst other projects). *Wikimedia UK is an independent
non-profit charity with no legal control over Wikipedia nor responsibility
for its contents.*
Hi all!
TechCom decided to use this week's IRC discussion slot for a brainstorming
session about Evolving the MediaWiki Architecture. "MediaWiki Architecture" is
supposed to be interpreted broadly here - anything that helps us to manage and
serve content is in scope.
The idea is to provide input for the corresponding workshop session at the
summit, in which we will identify focus areas for technical development for the
years to come. You can find more information about the session on phabricator,
please provide input there as well: <https://phabricator.wikimedia.org/T183313>.
The IRC session will take place in #wikimedia-office on Wednesday January 17,
21:00 UTC (2pm PDT, 23:00 CEST).
--
Daniel Kinzler
Principal Platform Engineer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Hi,
MediaWiki code search is a fully free software tool that lets you
easily search through all of MediaWiki core, extensions, and skins
that are hosted on Gerrit. You can limit your search to specific
repositories, or types of repositories too. Regular expressions are
supported in both the search string, and when filtering by path.
Try it out: https://codesearch.wmflabs.org/search/
I started working on this because the only other options for searching
the entire MediaWiki codebase were either cloning everything locally
(takes up space, and needs to be kept up to date manually) or using
GitHub (not free software, has extraneous repositories). The backend
is powered by hound, a code search tool written by Etsy, based on
Google's Code Search.
Please let me know what you think! More documentation and links are
at: <https://www.mediawiki.org/wiki/Codesearch>.
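Since the backend is hound, its JSON search API can presumably be scripted as well. A hedged sketch follows; the endpoint path matches upstream hound, but whether the codesearch.wmflabs.org frontend exposes it at exactly this path is an assumption:

```shell
# Hypothetical: build a query against the hound API behind
# codesearch.wmflabs.org. Upstream hound serves
# /api/v1/search?q=<regex>&repos=<list>; the exact path on this
# instance is an assumption, so verify before relying on it.
QUERY='wfMessage'
BASE='https://codesearch.wmflabs.org/search/api/v1/search'
URL="${BASE}?q=${QUERY}&repos=*"
echo "$URL"
# To actually run the search and list which repos matched:
# curl -s "$URL" | python3 -m json.tool
```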
-- Legoktm
NPM v5 supports shrinkwrapping dependencies
<http://blog.npmjs.org/post/161081169345/v500> via a "package-lock.json"
lockfile (example
<https://gerrit.wikimedia.org/r/#/c/403724/1/package-lock.json>). Should we
generally be committing these lockfiles or ignoring them in Wikimedia repos?
The downsides of lockfile usage that I'm aware of are:
1. No CI support yet. (CI simply ignores this file when installing
dependencies because it uses an older version of NPM.)
2. Platform differences can create different lockfiles for optional
dependencies, but this is a known issue
<https://github.com/npm/npm/issues/17722>.
3. Developers must remember to use NPM v5 or greater when adding or
updating dependencies.
4. The format is a bit verbose.
The pros of lockfiles are:
1. Considered best practice and the default behavior of NPM.
2. The officially supported use case for reproducible builds.
The topic is tracked in T179229 <https://phabricator.wikimedia.org/T179229> but
there's lots of activity around lockfiles
<https://phabricator.wikimedia.org/search/query/NFhYM5EmMLlB/#R> outside of
it.
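The commit-the-lockfile path the pros point toward can be sketched as a quick version check plus the standard commands. This is plain npm v5 usage, not a settled Wikimedia convention:

```shell
# Sketch of the commit-the-lockfile workflow under npm >= 5.
# Downside 3 above is exactly this check: older npm silently
# ignores package-lock.json.
REQUIRED_MAJOR=5
NPM_MAJOR=0
if command -v npm >/dev/null 2>&1; then
  NPM_MAJOR=$(npm --version | cut -d. -f1)
fi
if [ "$NPM_MAJOR" -ge "$REQUIRED_MAJOR" ]; then
  echo "npm ${NPM_MAJOR}.x: package-lock.json will be written/updated"
  # npm install                              # refreshes the lockfile
  # git add package.json package-lock.json   # commit both together
else
  echo "npm too old (or absent): lockfile would be ignored, as current CI does"
fi
```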
Stephen
Sorry for cross-posting!
Reminder: Technical Advice IRC meeting again **tomorrow, Wednesday 4-5 pm
UTC** on #wikimedia-tech.
The Technical Advice IRC meeting is open for all volunteer developers,
topics and questions. This can be anything from "how to get started" to
"who would be the best contact for X" to specific questions on your project.
If you know already what you would like to discuss or ask, please add your
topic to the next meeting:
https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
Hope to see you there!
Michi (for WMDE’s tech team)
--
Michael F. Schönitzer
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Imagine a world in which every human being can freely share in the sum
of all knowledge. Help us make it happen!
http://spenden.wikimedia.de/
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
I have just merged the "stretch-migration" feature branch to master
for MediaWiki-Vagrant.
Major changes:
* Debian Stretch (Debian 9) base image
* Default PHP runtime is Zend PHP 7.0 (HHVM available via role)
* Database is MariaDB 10.1
* Puppet 4
Once you update your local MediaWiki-Vagrant clone to 59e3b49c or
later you will need to create a new VM based on the Debian Stretch
base image in order to use `vagrant provision`. Upgrading your local
VM may be as easy as using `vagrant destroy` to delete the current VM
followed by `vagrant up` to make a new one. Note that this will *not*
save the contents of any local wikis in the VM. You will need to
manually back up and restore the databases or export and import pages
you have created.
See <https://phabricator.wikimedia.org/T181353> for more information
and a few known open bugs.
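The backup-and-rebuild path described above could be sketched as below. The database name `wiki` and the mysqldump invocation are assumptions about a default MediaWiki-Vagrant install, so verify them against your own roles first:

```shell
# Hypothetical rebuild script for the Jessie -> Stretch migration.
# 'wiki' as the database name is an assumption about a default install;
# the echoes show the commands rather than running them destructively.
DB=wiki
BACKUP="${DB}-backup.sql"
echo "Would back up database '${DB}' to ${BACKUP}, then rebuild the VM:"
echo "  vagrant ssh -- mysqldump -u root ${DB} > ${BACKUP}"
echo "  vagrant destroy -f && vagrant up"
echo "  vagrant ssh -- mysql -u root ${DB} < ${BACKUP}"
# Before running for real, confirm the database name with:
#   vagrant ssh -- mysql -u root -e 'show databases'
```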
I have also created a jessie-compat branch that can be used by users
who are not ready to destroy their current Jessie based virtual
machines and start over with Stretch. A simple `git checkout
jessie-compat` should be all that is needed to switch your local
MediaWiki-Vagrant clone to the new compatibility branch. This branch
will probably receive few updates, so you are encouraged to create new
Stretch based VMs soon.
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Manager, Cloud Services Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hi all,
I am forwarding you this email because we have a specific technical
question. With DBpedia as middleware, we can create a global view of all
the data that is in Wikipedia's infoboxes and Wikidata and compare them
(for details see the email below and also the proposal).
We were wondering what is the latest and most appropriate tech to
interface with the editors of infoboxes.
VisualEditor seems appropriate, but I checked here for example:
https://en.wikipedia.org/wiki/Fulda
It seems to be possible to edit some values, but there is no Wikidata
support, and the population value has a reference that does not show up
in the VisualEditor.
Do you think it would be a good way to provide comparative facts from
other language versions in the VisualEditor? Or would you choose
something else?
All the best,
Sebastian
-------- Forwarded Message --------
Subject: [Wikidata] GlobalFactSync
Date: Mon, 15 Jan 2018 19:57:04 +0100
From: Magnus Knuth <knuth(a)informatik.uni-leipzig.de>
Reply-To: Discussion list for the Wikidata project.
<wikidata(a)lists.wikimedia.org>
To: wikidata(a)lists.wikimedia.org
Dear all,
last year, we applied for a Wikimedia grant to feed qualified data from Wikipedia infoboxes (i.e. missing statements with references) via the DBpedia software into Wikidata. The evaluation was already quite good, but some parts were still missing and we would like to ask for your help and feedback for the next round. The new application is here: https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/GlobalFactSync
The main purpose of the grant is:
- Wikipedia infoboxes are quite rich, are manually curated and have references. DBpedia is already extracting that data quite well (i.e. there is no other software that does it better). However, extracting references is not a priority on our agenda. They would be very useful to Wikidata, but there are no user requests for this from DBpedia users.
- DBpedia also has all the info from all infoboxes of all Wikipedia editions (>10k pages), so we also know quite well where Wikidata is already used and where information is available in Wikidata or one language version but missing in another.
- side-goal: bring the Wikidata, Wikipedia and DBpedia communities closer together
Here is a diff between the old and new proposal:
- extraction of infobox references will still be a goal of the reworked proposal
- we have been working on the fusion and data comparison engine (the part of the budget that came from us) for a while now and there are first results:
6823 birthDate_gain_wiki.nt
3549 deathDate_gain_wiki.nt
362541 populationTotal_gain_wiki.nt
372913 total
We only took three properties for now and showed the gain where no Wikidata statement was available. birthDate/deathDate is already quite good. Details here: https://drive.google.com/file/d/1j5GojhzFJxLYTXerLJYz3Ih-K6UtpnG_/view?usp=…
Our plan here is to map all Wikidata properties to the DBpedia Ontology and then have the info to compare coverage of Wikidata with all infoboxes across languages.
- we will remove the text extraction part from the old proposal (which is here for your reference: https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/CrossWikiFact). This will still be a focus during our work in 2018, together with Diffbot and the new DBpedia NLP department, but we think that it distracted from the core of the proposal. Results from the Wikipedia article text extraction can be added later once they are available and discussed separately.
- We proposed to make an extra website that helps to synchronize all Wikipedias and Wikidata with DBpedia as its backend. While the external website is not an ideal solution, we are lacking alternatives. The Primary Sources Tool is mainly for importing data into Wikidata, not so much synchronization. The MediaWiki instances of the Wikipedias do not seem to have any good interfaces to provide suggestions and pinpoint missing info. Especially to this part, we would like to ask for your help and suggestions, either per mail to the list or on the talk page: https://meta.wikimedia.org/wiki/Grants_talk:Project/DBpedia/GlobalFactSync
We are looking forward to a fruitful collaboration with you and we thank you for your feedback!
All the best
Magnus
--
Magnus Knuth
Universität Leipzig
Institut für Informatik
Abt. Betriebliche Informationssysteme, AKSW/KILT
Augustusplatz 10
04109 Leipzig DE
mail: knuth(a)informatik.uni-leipzig.de
tel: +49 177 3277537
webID: http://magnus.13mm.de/
_______________________________________________
Wikidata mailing list
Wikidata(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata