Hello,
The committee has finished selecting new members. The new committee
candidates are (in alphabetical order):
- Amir Sarabadani
- Martin Urbanec
- MusikAnimal
- Tonina Zhelyazkova
- Tony Thomas
The auxiliary members will be (also in alphabetical order):
- Huji
- Jayprakash12345
- Matanya
- Nuria Ruiz
- Tpt
You can read more about the members at [0].
The changes compared to last term are:
* Lucie resigned in September 2019. Tpt (as one of the auxiliary members)
filled her seat in the meantime, and now Tpt is moving back to being an
auxiliary member.
* Martin Urbanec is joining the main committee to fill Lucie's/Tpt's seat.
* Jay Prakash is joining the auxiliary members.
* Rosalie is leaving the auxiliary members.
This is not the final structure. According to the CoC [1], the current
committee publishes the new members and calls for public feedback for *six
weeks*; after that, the current committee may change the structure based on
the public feedback.
Please let the committee know if you have any concerns regarding the members
or the committee's structure by *09 June 2019*. After that, the new
committee will take effect and will serve for a year.
[0]:
https://www.mediawiki.org/wiki/Code_of_Conduct/Committee/Members/Candidates
[1]:
https://www.mediawiki.org/wiki/Code_of_Conduct/Committee#Selection_of_new_m…
Best,
Amir, on behalf of the Code of Conduct committee
Hi Everyone,
It's time for Wikimedia Tech Talks 2020 Episode 4! This talk will take
place on 5 June 2020 at 5 PM UTC.
Title: API portal and gateway
Speaker: Evan Prodromou, Product Manager for APIs in the Platform Team
Summary: How does Wikimedia become "the essential infrastructure in the
ecosystem of free knowledge"? One way is by making a platform that helps
software developers become successful. In this talk, Evan Prodromou,
Product Manager for APIs in the Platform Team, discusses the ongoing work
to provide a Wikimedia developer platform. With this platform, app creators
can include Wikimedia data and content into their software in new and
emergent ways. From modernizing our API paradigm, through unified user
authorization, documentation, and developer onboarding, the Platform team
is working to make a developer experience that rivals those from other
major Internet players.
The link to the YouTube livestream can be found here:
https://youtu.be/gedV-OScuQY
During the live talk, you are invited to join the discussion on IRC at
#wikimedia-office
You can browse past Tech Talks here:
https://www.mediawiki.org/wiki/Tech_talks
If you are interested in giving your own tech talk, you can learn more
here:
https://www.mediawiki.org/wiki/Project:Calendar/How_to_schedule_an_event#Te…
Note: This is a public talk. Feel free to distribute through appropriate
email and social channels!
Kindly,
Sarah R. Rodlund
Technical Writer, Developer Advocacy
<https://meta.wikimedia.org/wiki/Developer_Advocacy>
srodlund(a)wikimedia.org
Hi All,
I am using the VE model in a gadget, and I am wondering how I can get the
headings that have been entered on the VE surface, for further processing.
Thanks
The Search Platform Team
<https://www.mediawiki.org/wiki/Wikimedia_Search_Platform> usually holds
office hours the first Wednesday of each month. Come talk to us about
anything related to Wikimedia search!
Feel free to add your items to the Etherpad Agenda for the next meeting.
Details for our next meeting:
Date: Wednesday, June 3rd, 2020
Time: 15:00-16:00 GMT / 08:00-09:00 PDT / 11:00-12:00 EDT / 17:00-18:00 CEST
Etherpad: https://etherpad.wikimedia.org/p/Search_Platform_Office_Hours
Google Meet link: https://meet.google.com/vyc-jvgq-dww
Join by phone in the US: +1 786-701-6904 PIN: 262 122 849#
Hope to talk to you in a week!
—Trey
Trey Jones
Sr. Software Engineer, Search Platform
Wikimedia Foundation
UTC-4 / EDT
Hooks::run() was soft-deprecated in Nikki Nikkhoui's HookContainer
patch, merged on April 17. [1] And my patch to remove almost all
instances of it in MediaWiki Core was finally merged over the weekend.
[2] That means that everyone writing core code now needs to learn how
to use the new hook system.
HookContainer is a new service which replaces the functionality previously
provided by static methods in the Hooks class. HookContainer has a generic
run() method which runs a specified hook, analogous to Hooks::run().
However, unlike Hooks::run(), you generally should not use
HookContainer::run() directly. Instead, you call a proxy method in a hook
runner class.
Hook runner classes give hooks machine-readable parameter names and types.
How to call a hook
------------------
In MediaWiki Core, there are two hook runner classes: HookRunner and
ApiHookRunner. ApiHookRunner is used in the Action API, and HookRunner
is used everywhere else.
How you get an instance of HookRunner depends on where you are:
* In classes that use dependency injection, a HookContainer object is
passed in as a constructor parameter. Then the class creates a local
HookRunner instance:
$this->hookRunner = new HookRunner( $hookContainer );
* In big hierarchies like SpecialPage, there are always
getHookRunner() and getHookContainer() methods which you can use.
* Some classes use the ProtectedHookAccessor trait, which provides
getHookRunner() and getHookContainer() methods that get their
HookContainer from the global service locator. You can also call
MediaWikiServices::getHookContainer() in your own code, if dependency
injection is not feasible.
* There is a convenience method for static code called
Hooks::runner(), which returns a HookRunner instance using the global
HookContainer.
* Extensions should generally not use core's HookRunner class, since the
available hooks may change without deprecation. Instead, extensions should
have their own HookRunner class which calls HookContainer::run().
Once you have a HookRunner object, you call the hook simply by calling the
relevant method.
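For example, for a hook called MovePage, the call site looks like this
(the parameter list here is illustrative, not the hook's real signature):

```php
// Illustrative only: the exact parameters depend on the hook's interface.
// $this->hookRunner is a HookRunner instance obtained as described above.
$this->hookRunner->onMovePage( $oldTitle, $newTitle, $user );
```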
How to add a hook
-----------------
* Create an interface for the hook. The interface name is always the
hook name with "Hook" appended. The interface should contain a single
method, which is the hook name with a prefix of "on". So for example,
for a hook called MovePage, there will be an interface called
MovePageHook with a method called onMovePage(). The interface will
typically be in a "Hook" subnamespace relative to the caller namespace.
* Add an "implements" clause to HookRunner.
* Implement the method in HookRunner.
Note that the name of the interface is currently not enforced by CI.
Alphabetical sorting of interface names and method names in HookRunner
is also not enforced. Please be careful to follow existing conventions.
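As a sketch of the steps above for the MovePage example (the parameter
names and types are illustrative, not the hook's real signature):

```php
// Step 1: an interface named after the hook, with "Hook" appended,
// containing a single "on"-prefixed method.
interface MovePageHook {
	/**
	 * @param Title $old Old title (illustrative parameter)
	 * @param Title $new New title (illustrative parameter)
	 * @param User $user User performing the move (illustrative parameter)
	 * @return bool|void True or no return value to continue, false to abort
	 */
	public function onMovePage( $old, $new, $user );
}

// Steps 2 and 3: HookRunner implements the interface and proxies the
// call to HookContainer::run().
class HookRunner implements MovePageHook {
	public function onMovePage( $old, $new, $user ) {
		return $this->container->run(
			'MovePage',
			[ $old, $new, $user ]
		);
	}
}
```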
How to deprecate a hook
-----------------------
Hooks were previously deprecated by passing options to Hooks::run().
They are now deprecated globally by adding the hook to an array in the
DeprecatedHooks class.
Using the new system in extensions
----------------------------------
Extensions should create their own HookRunner classes and use them to
call hooks. HookContainer::run() should be used instead of Hooks::run().
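A minimal extension-side hook runner might look like this (the extension
namespace, hook name, and parameters are hypothetical):

```php
namespace MediaWiki\Extension\Example;

use MediaWiki\HookContainer\HookContainer;

class HookRunner implements ExampleContentSavedHook {
	/** @var HookContainer */
	private $container;

	public function __construct( HookContainer $container ) {
		$this->container = $container;
	}

	// Proxy method for the hypothetical ExampleContentSaved hook.
	public function onExampleContentSaved( $page, $content ) {
		return $this->container->run(
			'ExampleContentSaved',
			[ $page, $content ]
		);
	}
}
```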
As for handling hooks, I think it's too early for a mass migration of
extensions to the new registration system as described in the RFC.[3]
Extension authors who are keen to pilot the new system can give it a
go. Make sure you add Nikki and me as reviewers.
More information about the new system can be found in docs/Hooks.md
[4]. The patch to add it should soon be merged.
[1] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/571297
[2] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/581225
[3] https://phabricator.wikimedia.org/T240307
[4]
<https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/323ac073d38…>
-- Tim Starling
Hello,
sadly, (all) blocks prevent account autocreation (see T249444). That means
that even if you have a Wikimedia account, you can't edit through an
anon-only block, because the account autocreation doesn't go through.
Also, if the community wishes to let you bypass a hard block, they can't
grant you an IPBE, because you can't get the account autocreated.
On the other hand, it can confuse checkusers and the like to see users
having accounts autocreated through a valid block.
I welcome your comments on a possible solution to this problem at
https://phabricator.wikimedia.org/T249444.
Thanks,
Martin Urbanec
Hello colleagues,
== Announcements for this month ==
The next Wikimedia Café meetup will occur on 30 May 2020 at 9:30 AM
Pacific / 11:30 AM Eastern / 4:30 PM UTC / 10 PM IST.
This month's meetup will focus on the recently announced 2030 strategy
recommendations. See
https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recomme….
The organizers of the Café have not finalized the format for this
month, but it is likely to be a two-hour meetup with time for
discussions regarding each of the strategy recommendations.
The Café has been well attended during the past few months. There have
been multiple requests for an alternate meetup time for the Café. This
subject is being discussed on the Café talk page, and is also on the
agenda for May's meetup. One possibility under consideration is
offering an additional Café meetup on a different day of the week and
a different time of day. Please feel free to join the discussion on
the talk page if you cannot attend the Café in its current time slot
and would like to comment regarding this issue.
== General information about the Café ==
More information regarding the agenda and links to strategy documents
are available at https://meta.wikimedia.org/wiki/Wikimedia_Café.
As usual, the meeting style for the Café will emphasize discussion
rather than presentation. People are welcome to participate as
listeners only if they prefer.
Please see the page on Meta for more information about the Café.
Please watch the page for any updates, particularly to the schedule or
the agenda. Signing up for the meeting is optional, but is helpful to
the organizers so that we can estimate how many people will attend.
Signing up for the meeting also informs us who we should notify
individually if there are significant changes.
If there are any problems with connecting to the meeting or if you
have any questions or comments, then please write on the Meta talk
page or send me an email.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
Hello,
If you haven't made any changes to the database schema of MediaWiki core,
let me explain the process to you (if you know this, feel free to skip this
paragraph):
* MediaWiki core supports three types of RDBMS: MySQL, SQLite, and Postgres.
It used to be five (plus Oracle and MSSQL).
* For each of these types, a schema change has three parts: 1- Change the
tables.sql file so new installations get the new schema. 2- Make a .sql
schema change file, like an "ALTER TABLE", for current installations so
they can upgrade. 3- Wire that schema change file into the *Updater.php
file.
* For example, this is a patch to drop a column:
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/473601 It touches
14 different files, adds 94 lines and removes 30.
This is bad for several reasons:
* It is extremely complicated to do even a simple schema change. Usually
something as simple as adding a column takes me a whole day. There are
lots of complicating factors; for example, SQLite has only very limited
ALTER TABLE support, so when you want to make a patch for adding a column,
you need to make a temporary table with the new column, copy the old
table's data into it, drop the old table, and then rename the temporary
table.
** Imagine the pain and sorrow when you want to normalize a table, meaning
you need to do several schema changes: 1- add a table, 2- add a column to
the old table, 3- make the new column not-nullable once it's filled and
make the old column nullable instead, 4- drop the old column.
* It's almost impossible to test all DBMS types: I don't have MSSQL or
Oracle installed, and I don't even know how they differ from MySQL. I
assume most other developers are proficient with one type, not all.
* Writing raw SQL, especially duplicated SQL, and doubly so when we don't
have CI to test it (because we won't install proprietary software in our
infra), is very error-prone. My favourite example: a new column was
actually added to the wrong table in MSSQL, and it went unnoticed for two
years (four releases, including one LTS).
* It's impossible to support more DBMS types through extensions or other
third-party systems, because their maintainers would need to keep up with
every patch we add to core and write the equivalents.
* For lots of reasons, these schemas have been diverging; there has been
considerable work just to reduce the divergence to a minimum.
There was an RFC to introduce abstract schemas and schema changes; it was
accepted, and I have been working to implement it:
https://phabricator.wikimedia.org/T191231
This is not a small task, and like any big piece of work, it's important to
cut it into small pieces and improve things gradually. So my plan is to
first abstract the schema (the tables.sql files), then slowly abstract
schema changes. For now, the plan is to generate these .sql files
automatically through maintenance scripts. We will have a file called
tables.json, and running something like:
php maintenance/generateSchemaSql.php --json maintenance/tables.json --sql
maintenance/tables-generated.sql --type=mysql
would produce the tables-generated.sql file. The code that produces it uses
Doctrine DBAL, which is already installed as a dev dependency of core: you
only need Doctrine if you want to make a schema change; if you just
maintain an instance, you should not need anything extra. Most of the work
for automatically generating the schema is already merged, and the last
part, which wires it up (and migrates two tables), is up for review:
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/595240
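For illustration, a tables.json entry might look roughly like this (the
table, the column names, and the exact option keys are made up for this
example; see the patch above for the authoritative format):

```json
[
	{
		"name": "example_notes",
		"columns": [
			{
				"name": "en_id",
				"type": "integer",
				"options": { "autoincrement": true, "notnull": true }
			},
			{
				"name": "en_text",
				"type": "binary",
				"options": { "length": 255, "notnull": true }
			}
		],
		"indexes": [
			{ "name": "en_text_idx", "columns": [ "en_text" ], "unique": false }
		],
		"pk": [ "en_id" ]
	}
]
```

Doctrine DBAL would then turn each abstract column type into the
appropriate concrete type for MySQL, SQLite, or Postgres.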
My request: I need to make lots of patches, and since I'm doing this in my
volunteer capacity, I need developers to review (and potentially help with
the work, if you're as excited about this as I am). Let me know if you're
willing to be added to future patches; the current patch also welcomes any
feedback: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/595240
I have documented the plan and future changes at
https://www.mediawiki.org/wiki/Manual:Schema_changes. The ideal goal is
that when you want to do a schema change, you just change tables.json and
create a JSON file that is a snapshot of the table before and after
(remember, SQLite has only very limited ALTER TABLE support, meaning it has
to know the whole table). Also, once we are in good shape migrating
MediaWiki core, we can start cleaning up extensions.
Any feedback is also welcome.
Best
--
Amir (he/him)
The 1.35.0-wmf.34 version of MediaWiki is blocked[0].
The new version is deployed to group0[1]; however, canary checks failed
during the deployment of wmf.34 to group1 wikis, and it was rolled back.
The train can proceed no further until this issue is resolved:
Exception from TermStoreWriterFactory: Local entity source does not have
items. <https://phabricator.wikimedia.org/T253804>
Once resolved, the train can resume. If it is resolved on Friday, the train
will resume on Monday.
Thanks for your help in resolving the issue.
-- Your humble train toiler
[0]. https://phabricator.wikimedia.org/T253022
[1]. <https://tools.wmflabs.org/versions/>