The Search Platform Team
<https://www.mediawiki.org/wiki/Wikimedia_Search_Platform> usually holds
office hours the first Wednesday of each month. Come talk to us about
anything related to Wikimedia search!
Feel free to add your items to the Etherpad Agenda for the next meeting.
Details for our next meeting:
Date: Wednesday, June 3rd, 2020
Time: 15:00-16:00 GMT / 08:00-09:00 PDT / 11:00-12:00 EDT / 17:00-18:00 CEST
Etherpad: https://etherpad.wikimedia.org/p/Search_Platform_Office_Hours
Google Meet link: https://meet.google.com/vyc-jvgq-dww
Join by phone in the US: +1 786-701-6904 PIN: 262 122 849#
Hope to talk to you in a week!
—Trey
Trey Jones
Sr. Software Engineer, Search Platform
Wikimedia Foundation
UTC-4 / EDT
Hooks::run() was soft-deprecated in Nikki Nikkhoui's HookContainer
patch, merged on April 17. [1] And my patch to remove almost all
instances of it in MediaWiki Core was finally merged over the weekend.
[2] That means that everyone writing core code now needs to learn how
to use the new hook system.
HookContainer is a new service that replaces the functionality previously
provided by static methods in the Hooks class. HookContainer
contains a generic run() method which runs a specified hook, analogous
to Hooks::run(). However, unlike Hooks::run(), you generally should
not use HookContainer::run() directly. Instead, you call a proxy
method in a hook runner class.
Hook runner classes give hooks machine-readable parameter names and types.
How to call a hook
------------------
In MediaWiki Core, there are two hook runner classes: HookRunner and
ApiHookRunner. ApiHookRunner is used in the Action API, and HookRunner
is used everywhere else.
How you get an instance of HookRunner depends on where you are:
* In classes that use dependency injection, a HookContainer object is
passed in as a constructor parameter. Then the class creates a local
HookRunner instance:
$this->hookRunner = new HookRunner( $hookContainer );
* In big hierarchies like SpecialPage, there are always
getHookRunner() and getHookContainer() methods which you can use.
* Some classes use the ProtectedHookAccessor trait, which provides
getHookRunner() and getHookContainer() methods that get their
HookContainer from the global service locator. You can also call
MediaWikiServices::getHookContainer() in your own code, if dependency
injection is not feasible.
* There is a convenience method for static code called
Hooks::runner(), which returns a HookRunner instance using the global
HookContainer.
* Extensions should generally not use HookRunner, since the available
hooks may change without deprecation. Instead, extensions should have
their own HookRunner class which calls HookContainer::run().
Once you have a HookRunner object, you call the hook by simply calling
the relevant method.
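For example, a call site might look like the following hedged sketch (the MovePage hook and its parameter list are used illustratively here; check the hook's own documentation for the real signature):

```php
// Sketch only: assumes a class that received a HookContainer via its
// constructor, and a hook named "MovePage".
$hookRunner = new HookRunner( $hookContainer );

// The proxy method is "on" + the hook name. As with the old
// Hooks::run(), a false return from a handler aborts the operation.
if ( !$hookRunner->onMovePage( $oldTitle, $newTitle, $user ) ) {
	return false; // a hook handler aborted the move
}
```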
How to add a hook
-----------------
* Create an interface for the hook. The interface name is always the
hook name with "Hook" appended. The interface should contain a single
method, which is the hook name with a prefix of "on". So for example,
for a hook called MovePage, there will be an interface called
MovePageHook with a method called onMovePage(). The interface will
typically be in a "Hook" subnamespace relative to the caller namespace.
* Add an "implements" clause to HookRunner.
* Implement the method in HookRunner.
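Put together, the steps above can be sketched as follows (the MovePage parameter list here is illustrative rather than authoritative):

```php
// 1. The hook interface, in a "Hook" subnamespace:
namespace MediaWiki\Hook;

interface MovePageHook {
	/**
	 * @return bool|void True or no return value to continue,
	 *   false to abort the move
	 */
	public function onMovePage( $old, $new, $user );
}

// 2 & 3. Add the "implements" clause and the method to HookRunner:
class HookRunner implements /* ..., */ MovePageHook {
	public function onMovePage( $old, $new, $user ) {
		return $this->container->run(
			'MovePage',
			[ $old, $new, $user ]
		);
	}
}
```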
Note that the name of the interface is currently not enforced by CI.
Alphabetical sorting of interface names and method names in HookRunner
is also not enforced. Please be careful to follow existing conventions.
How to deprecate a hook
-----------------------
Hooks were previously deprecated by passing options to Hooks::run().
They are now deprecated globally by adding the hook to an array in the
DeprecatedHooks class.
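As a sketch (the hook name and version here are illustrative; check the DeprecatedHooks class itself for the exact array shape), a deprecation entry looks roughly like:

```php
// In the DeprecatedHooks class:
private $deprecatedHooks = [
	'MovePage' => [ 'deprecatedVersion' => '1.35' ],
];
```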
Using the new system in extensions
----------------------------------
Extensions should create their own HookRunner classes and use them to
call hooks. HookContainer::run() should be used instead of Hooks::run().
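An extension-owned hook runner might be sketched like this (all names below are hypothetical; it mirrors the core HookRunner pattern but stays under the extension's control):

```php
namespace MediaWiki\Extension\Example;

use MediaWiki\HookContainer\HookContainer;

// Runner for a hook defined by this (hypothetical) extension.
class ExampleHookRunner implements ExampleThingDoneHook {
	/** @var HookContainer */
	private $container;

	public function __construct( HookContainer $container ) {
		$this->container = $container;
	}

	public function onExampleThingDone( $thing ) {
		return $this->container->run( 'ExampleThingDone', [ $thing ] );
	}
}
```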
As for handling hooks, I think it's too early for a mass migration of
extensions to the new registration system as described in the RFC.[3]
Extension authors who are keen to pilot the new system can give it a
go. Make sure you add Nikki and me as reviewers.
More information about the new system can be found in docs/Hooks.md
[4]. The patch to add it should soon be merged.
[1] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/571297
[2] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/581225
[3] https://phabricator.wikimedia.org/T240307
[4]
<https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/323ac073d38…>
-- Tim Starling
Hello,
sadly, all blocks prevent account autocreation (see T249444). That means
that even if you have a Wikimedia account, you can't edit through an
anon-only block, because the account autocreation doesn't go through.
Also, if the community wishes to let you bypass a hard block, they can't grant
you an IPBE, because you can't get the account autocreated.
On the other hand, it can confuse checkusers and the like to see users
having account autocreated through a valid block.
I welcome your comments on a possible solution to this problem at
https://phabricator.wikimedia.org/T249444.
Thanks,
Martin Urbanec
Hello colleagues,
== Announcements for this month ==
The next Wikimedia Café meetup will occur on 30 May 2020 at 9:30 AM
Pacific / 12:30 PM Eastern / 4:30 PM UTC / 10 PM IST.
This month's meetup will focus on the recently announced 2030 strategy
recommendations. See
https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recomme….
The organizers of the Café have not finalized the format for this
month, but this is likely to be a two hour meetup with time for
discussions regarding each of the strategy recommendations.
The Café has been well attended during the past few months. There have
been multiple requests for an alternate meetup time for the Café. This
subject is being discussed on the Café talk page, and is also on the
agenda for May's meetup. One possibility under consideration is
offering an additional Café meetup on a different day of the week and
a different time of day. Please feel free to join the discussion on
the talk page if you cannot attend the Café in its current time slot
and would like to comment regarding this issue.
== General information about the Café ==
More information regarding the agenda and links to strategy documents
are available at https://meta.wikimedia.org/wiki/Wikimedia_Café.
As usual, the meeting style for the Café will emphasize discussion
rather than presentation. People are welcome to participate as
listeners only if they prefer.
Please see the page on Meta for more information about the Café.
Please watch the page for any updates, particularly to the schedule or
the agenda. Signing up for the meeting is optional, but is helpful to
the organizers so that we can estimate how many people will attend.
Signing up for the meeting also informs us who we should notify
individually if there are significant changes.
If there are any problems with connecting to the meeting or if you
have any questions or comments, then please write on the Meta talk
page or send me an email.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
Hello,
if you haven't made any changes to the database schema of MediaWiki core,
let me explain the process to you (if you already know it, feel free to skip
this paragraph):
* MediaWiki core supports three types of RDBMS: MySQL, SQLite, and Postgres.
It used to be five (plus Oracle and MSSQL).
* For each of these types, you need to do three things: 1- change the
tables.sql file so new installations get the new schema; 2- make a .sql schema
change file, like an "ALTER TABLE", for current installations so they can
upgrade; 3- wire that schema change file into the *Updater.php file.
* For example, this is a patch to drop a column:
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/473601 The patch touches
14 different files, adds 94 lines, and removes 30.
This is bad for several reasons:
* It is extremely complicated to do even a simple schema change. Usually
something as simple as adding a column takes me a whole day. There are
lots of complicating factors; for example, SQLite has very limited ALTER
TABLE support, so when you want to make a patch for adding a column, you
need to make a temporary table with the new column, copy the old table's
data to it, drop the old table, and then rename the temporary table.
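For SQLite, the "add a column" patch therefore ends up looking roughly like this (hypothetical table and column names; real patches use the /*_*/ table-prefix placeholder shown below):

```sql
-- Create a replacement table that includes the new column ...
CREATE TABLE /*_*/foo_tmp (
  foo_id INTEGER NOT NULL PRIMARY KEY,
  foo_name BLOB NOT NULL,
  foo_new_col INTEGER NOT NULL DEFAULT 0
);
-- ... copy the existing rows over ...
INSERT INTO /*_*/foo_tmp (foo_id, foo_name)
  SELECT foo_id, foo_name FROM /*_*/foo;
-- ... then drop the old table and rename the new one into place.
DROP TABLE /*_*/foo;
ALTER TABLE /*_*/foo_tmp RENAME TO /*_*/foo;
```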
** Imagine the pain and sorrow when you want to normalize a table, meaning
you need to do several schema changes: 1- add a table; 2- add a column on
the old table; 3- make the new column not-nullable once it's filled, and
make the old column nullable instead; 4- drop the old column.
* It's almost impossible to test all DBMS types. I don't have MSSQL or
Oracle installed, and I don't even know how they differ from MySQL. I
assume most other developers are proficient in one type, not all.
* Writing raw SQL, especially duplicated SQL, and doubly especially when we
don't have CI to test it (because we won't install proprietary software in
our infrastructure), is very error-prone. My favourite example: a new
column was actually added to the wrong table in MSSQL, and it
went unnoticed for two years (four releases, including one LTS).
* It's impossible to support more DBMS types through extensions or other
third-party systems, because the maintainer would need to keep up with all
the patches we add to core and write their equivalents.
* For lots of reasons, these schemas have been diverging; there has been
ongoing work just to reduce the differences to a minimum.
There was an RFC to introduce abstract schemas and schema changes; it was
accepted, and I have been working to implement it:
https://phabricator.wikimedia.org/T191231
This is not a small task, and like any big piece of work, it's important to
cut it into small pieces and improve things gradually. My plan is to first
abstract the schema (the tables.sql files), then slowly abstract schema
changes. For now, the plan is to make these .sql files automatically
generated by maintenance scripts. So we will have a file called
tables.json, and when running something like:
php maintenance/generateSchemaSql.php --json maintenance/tables.json --sql
maintenance/tables-generated.sql --type=mysql
It would produce the tables-generated.sql file. The code that produces it
uses Doctrine DBAL, which is already installed as a dev dependency of core:
you only need Doctrine if you want to make a schema change; if you just
maintain an instance, you should not need anything extra. Most of the work
for automatically generating the schema is already merged, and the last part
that wires it up (and migrates two tables) is up for review:
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/595240
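To give a feel for it, an abstract table definition in tables.json might look roughly like the sketch below (a hypothetical table; the exact format is defined by the patch under review):

```json
[
	{
		"name": "foo",
		"columns": [
			{ "name": "foo_id", "type": "integer",
			  "options": { "notnull": true, "unsigned": true, "autoincrement": true } },
			{ "name": "foo_name", "type": "binary",
			  "options": { "notnull": true, "length": 255 } }
		],
		"indexes": [
			{ "name": "foo_name", "columns": [ "foo_name" ], "unique": false }
		],
		"pk": [ "foo_id" ]
	}
]
```

Doctrine DBAL can then render a definition like this into the MySQL, SQLite, or Postgres dialect, so the per-DBMS .sql files no longer need to be written by hand.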
My request: I need to make lots of patches, and since I'm doing this
in my volunteer capacity, I need developers to review (and potentially help
with the work, if you're as excited about this as I am). Let me know if
you're willing to be added to future patches; the current patch also
welcomes any feedback: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/595240
I have documented the plan and future changes at
https://www.mediawiki.org/wiki/Manual:Schema_changes. The ideal end state is
that when you want to do a schema change, you just change tables.json and
create a JSON file that is a snapshot of the table before and after
(remember, SQLite has very limited ALTER TABLE support, meaning the
generator has to know the whole table). Also, once we are in good shape
migrating MediaWiki core, we can start cleaning up extensions.
Any feedback is also welcome.
Best
--
Amir (he/him)
The 1.35.0-wmf.34 version of MediaWiki is blocked[0].
The new version is deployed to group0[1]; however, canary checks failed
during the deployment of wmf.34 to group1 wikis, and it was rolled back.
The train can proceed no further until this issue is resolved:
Exception from TermStoreWriterFactory: Local entity source does not have
items. <https://phabricator.wikimedia.org/T253804>
Once resolved, the train can resume. If resolved on Friday, the train will
resume on Monday.
Thanks for your help in resolving the issue.
-- Your humble train toiler
[0]. https://phabricator.wikimedia.org/T253022
[1]. <https://tools.wmflabs.org/versions/>
Hi,
for HTML version see https://www.mediawiki.org/wiki/Scrum_of_scrums/2020-05-27
Željko
--
= 2020-05-27 =
== Callouts ==
* Release Engineering
** [All] Review guidance at [[wikitech:Deployments/Covid-19]] and Code
Deployment Office Hour at 17:00 UTC in #wikimedia-office
** "scap sync" will be renamed to "scap sync-world" in the next
release. If you use "scap sync" non-interactively, please add a note
to: [[phab:T250302]]
== Product ==
=== Growth ===
* Blocking:
** Release Engineering - [Low priority] Growth team: Fix Flow DB
errors from phpunit tests – [[phab:T249839]], blocking
[[phab:T246358]]
=== iOS native app ===
* Updates:
** 6.6 release (mobile-html) released yesterday in scaled rollout -
[[phab:project/view/4273]]
** Working on bug fix 6.6.1 release
=== Android native app ===
* Updates:
** Mobile-html integration and Commons Image Tagging in production
** Working on minor update to address user feedback and small bugs.
=== Web ===
* Updates:
** '''Summary''': collapsible sidebar UI and persistence continues for
Desktop Improvements Project (DIP), scaffolding Vue.js search.
** [[Reading/Web/Desktop_Improvements|Desktop Improvements Project
(Vector / DIP)]]:
*** [[phab:T253329|<nowiki>Deprecate the `.menu` class</nowiki>]]
*** [[phab:T252841|<nowiki>Update the method Wikibase uses to inject
edit language links into the language portal</nowiki>]]
*** [[phab:T252774|<nowiki>Rename mediawiki.toc.styles ResourceLoader
module</nowiki>]]
*** [[phab:T252917|<nowiki>Drop support for SkinTemplateToolboxEnd in
Vector, finding suitable replacement</nowiki>]]
*** [[phab:T252800|<nowiki>Regression: Option add links in other
languages has disappeared</nowiki>]]
*** [[phab:T249372|<nowiki>[Dev] DRY up the menu templating code</nowiki>]]
*** [[phab:T246419|<nowiki>Build collapsible sidebar and sidebar
button </nowiki>]]
*** [[phab:T251212|<nowiki>[Dev] Drop VectorTemplate usage in Vector</nowiki>]]
*** [[phab:T60137|<nowiki>Deprecate the
SkinTemplateOutputPageBeforeExec hook</nowiki>]]
*** [[phab:T191021|<nowiki>Standardize `.mw-ui-icon` to overhauled
icon canvas size 20x20</nowiki>]]
*** [[phab:T246427|<nowiki>[Spike 8hrs] Make collapsible sidebar
persistent across sessions for logged-in users, for sessions for
logged-out users</nowiki>]]
*** [[phab:T244392|Vue.js search case study]]:
**** [[phab:T251968|<nowiki>[Spike] Prototype a single component for
new Vue.js search project</nowiki>]]
**** [[phab:T251832|<nowiki>[Spike] Build and deploy a Vue.js search
prototype to labs</nowiki>]]
**** [[phab:T253357|<nowiki>Name the Vue.js component library</nowiki>]]
**** [[phab:T249350|<nowiki>[Spike] Build step, ResourceLoader, or
both for Vue.js search?</nowiki>]]
** Mobile website (MinervaNeue / MobileFrontend):
*** [[phab:T253084|<nowiki>Don't count startup script
resourceloader.exception events in WebClientError error
counting</nowiki>]]
*** [[phab:T246767|<nowiki>Implement Tap to show for lazy loaded
images without MutationObserver - Scrolling on a Kai OS mobile device
can be slow on articles with lazy loading</nowiki>]]
*** [[phab:T240622|<nowiki>[Technical debt payoff] Remove
InlineDiffFormatter and InlineDifferenceEngine from
MobileFrontend</nowiki>]]
*** [[phab:T32405|<nowiki>[EPIC] MobileFrontend extension should stop
special-casing main page</nowiki>]]
*** [[phab:T246838|<nowiki>Avoid SEO performance and user experience
penalties by addressing our handling of lazy loaded images using
IntersectionObserver</nowiki>]]
=== Product Infrastructure ===
* Blocked by:
** SRE service-ops on deploying chromium-render, mobileapps in
kubernetes, working on it.
=== Structured Data ===
* Updates:
** vue.js port of computer-aided-tagging on beta
** had SDAW (structured data across wikipedias) offsite last week
== Technology ==
=== Fundraising Tech ===
* Updates:
** Updated payments-wiki to stop making direct DB calls to backend
** Building CiviCRM extension to sync info about employer's matching
gift policies to our db
** Updating CiviCRM to latest point release
** Adding feature to email donors automatically on failed recurring payment
** getting ready to release CentralNotice features that require schema changes
=== Engineering Productivity ===
==== Release Engineering ====
* Blocked by:
** [Low priority] SRE Service Ops: Provide our special component/php72
in buster-wikimedia — [[phab:T250515]]
** [Low priority] Wikibase team: Fix WikibaseLexeme DB errors from
phpunit tests – [[phab:T249838]], blocking [[phab:T246358]]
** [Low priority] Growth team: Fix Flow DB errors from phpunit tests –
[[phab:T249839]], blocking [[phab:T246358]]
* Updates:
** "scap sync" will be renamed to "scap sync-world" in the next
release. If you use "scap sync" non-interactively, please add a note
to: [[phab:T250302]]
** [All] Deployments/Covid-19 [[wikitech:Deployments/Covid-19]]
** Train Health
*** Last week: 1.35.0-wmf.33 - No train (EngProd virtual off-site)
*** This week: 1.35.0-wmf.34 - [[phab:T253022]]
**** Call to a member function getUser() on boolean (
CoreParserFunctions::revisionuser ?) [[phab:T253725]]
**** Fatal: Class 'MessageIndexException' not found [[phab:T253748]]
**** No localisation cache found for English. Please run
maintenance/rebuildLocalisationCache.php. in production when running
populateSitesTable for aawikibooks with foreachwikiindblist
[[phab:T253756]]
*** Next week: 1.35.0-wmf.34 - [[phab:T253023]]
=== Site Reliability Engineering ===
* Blocking:
** Research on deploying recommendation-api to kubernetes
** Product infrastructure on deploying chromium-render, mobileapps
** Release Engineering - [Low priority] SRE Service Ops: Provide our
special component/php72 in buster-wikimedia — [[phab:T250515]]
== Wikimedia DE ==
=== Wikidata ===
* Blocking:
** Release Engineering - [Low priority] Wikibase team: Fix
WikibaseLexeme DB errors from phpunit tests – [[phab:T249838]],
blocking [[phab:T246358]]
Hi Patrick,
That screenshot is of the Wikipedia app, not the Commons app. :) But
hopefully the Wikipedia app team will see your post.
Best regards,
Josephine
On Wed, 27 May 2020 at 01:42, Patrick Fiset <patrick.fiset(a)gmail.com> wrote:
>
> The image now attached.
>
> On Tue, 26 May 2020 at 11:40, Patrick Fiset <patrick.fiset(a)gmail.com>
> wrote:
>
>> Hello all,
>>
>>
>> For some time now, the descriptions of the 3 "read more" articles have
>> been truncated, as you can see here :
>>
>>
>> [image: image.png]
>>
>>
>>
>> It would be better to at least show the full first phrase.
>>
>> Could this be done ?
>>
>> Or can I revert to an older version of the app ?
>>
>>
>>
>> Thank you,
>>
>> Patrick
>>
Hi all,
Hope you are safe and well. We've just released v2.13 of the Commons
Android app[1] to beta, which includes:
- A new media details UI, which includes the ability to zoom and pan
around images
- When the user uploads a picture with a geotag, the app will check for
Nearby places that need photos around that location, and if one is found, it
will ask the user "Is this a picture of Place X?"
- Modifications to Nearby filters based on user feedback
- Bug and crash fixes for stuff that got broken by the codebase overhaul
Our next release will likely contain structured data integration, bookmarks
for the Nearby map, and a couple of other new features.
Thank you all for your support and encouragement - feedback, bug reports,
and suggestions are always welcome in our GitHub repo[2]. :)
[1]: https://play.google.com/store/apps/details?id=fr.free.nrw.commons
[2]: https://github.com/commons-app/apps-android-commons/issues/
Best regards,
Josephine / @misaochan (Commons app project lead)