Hi everyone,
the Design System Team (DST) is proposing the following changes to
MediaWiki browser support [1]:
- Drop support for Internet Explorer 11 (IE 11)
- Drop support for all versions of Edge Legacy
- Drop support for Opera
- Increase Basic (Grade C) support for Chrome and Firefox to versions 49+,
Safari and iOS to versions 10+.
What this means: the browsers we’re phasing out will no longer be tested
for layout rendering. Users on these browsers will still be able to read
and interact with content in a basic way, but they may experience some
visual quirks. This step helps us integrate modern web features more
seamlessly.
These changes will unlock the ability to use specific newer browser
features that cannot be safely used today without requiring a fallback,
notably CSS custom properties (used in upcoming reading customization
features like Night Mode) and the <summary> and <details> HTML elements
that can be used to replace the checkbox hack.
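As a rough illustration (not part of the announcement), here is the kind of markup simplification the <details>/<summary> change enables. The class and id names in the checkbox-hack version are hypothetical, and both versions are shown as plain strings so the comparison is self-contained:

```javascript
// The "checkbox hack": a hidden <input type="checkbox"> paired with a
// <label>, toggled purely via CSS sibling selectors (names hypothetical).
const checkboxHack = [
  '<input type="checkbox" id="more-toggle" class="collapsible-toggle">',
  '<label for="more-toggle">More info</label>',
  '<div class="collapsible-content">Hidden until toggled</div>'
].join('\n');

// Native replacement: <details>/<summary> needs no extra input, label,
// or CSS state selectors, and is keyboard-accessible out of the box.
const detailsElement = [
  '<details>',
  '  <summary>More info</summary>',
  '  Hidden until toggled',
  '</details>'
].join('\n');

console.log(detailsElement);
```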
This will reduce the amount of code sent to 99.9% of users and cut down on
software development costs and maintenance burdens.
See the full announcement for more details; PDF to download [2].
On behalf of the Wikimedia Foundation Design System Team,
Volker
[1] https://www.mediawiki.org/wiki/Compatibility#Browser_support_matrix
[2] https://phabricator.wikimedia.org/F52025988
Hi there,
I'm User:Diskdance, and I'm currently developing a default gadget for Chinese Wikipedia that enhances MediaWiki's variant handling logic. Under certain circumstances it shows a prompt at page load asking for a user's preferred variant. Think of it as a conditional cookie notice; an English screenshot can be found at https://commons.wikimedia.org/wiki/File:VariantAlly-En.png.
I know this can be very disruptive to the UX, so I want to be careful about its negative impact on page views. If the gadget could collect telemetry data about the prompt's display frequency and user interactions (using e.g. WikimediaEvents), I could gauge its likely impact.
Is this possible? It would be much appreciated if anybody could provide assistance.
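Not an authoritative answer, but as a sketch of the kind of instrumentation described above: gadgets can emit analytics events via mw.track, which WikimediaEvents-style code can subscribe to. The topic and field names below are hypothetical, and mw is stubbed so the sketch runs standalone:

```javascript
// In a real gadget, `mw` is provided by ResourceLoader; stubbed here so
// the sketch is self-contained and runnable outside a wiki. Events are
// collected into an array instead of being sent anywhere.
const events = [];
const mw = {
  track: (topic, data) => events.push({ topic, data })
};

// Hypothetical event emitter for the variant prompt's lifecycle.
function logVariantPromptEvent(action, variant) {
  mw.track('variantPrompt.interaction', {
    action,                   // e.g. 'impression', 'select', 'dismiss'
    variant: variant ?? null, // e.g. 'zh-hans'; only set for 'select'
    ts: Date.now()
  });
}

logVariantPromptEvent('impression');
logVariantPromptEvent('select', 'zh-hans');
console.log(`logged ${events.length} events`);
```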
Best wishes,
Diskdance
I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.
tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)
(Is there a place on mw.org to put this tidbit on?)
------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200
Hi Bartosz
In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.
The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.
Cheers
Brian
--
Matma Rex
We just released a new version of Research:FAQ on Meta [1], significantly
expanded and updated, to make our processes at WMF more transparent and to
meet an explicit FDC request to clarify the role and responsibilities of
individual teams involved in research across the organization.
The previous version – written from the perspective of the (now inactive)
Research:Committee, and mostly obsolete since the release of WMF's open
access policy [2] – can still be found here [3].
Comments and bold edits to the new version of the document are welcome. For
any question or concern, you can drop me a line or ping my username on-wiki.
Thanks,
Dario
[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953
*Dario Taraborelli *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
As described on Phabricator, a bug [1] surfaced whereby the "pages-articles"
XML dumps on https://dumps.wikimedia.org/ contain incomplete records.
A possible fix has been identified, and it involves bumping the dump schema
version from version 0.10 to version 0.11 [2], which could be a breaking
change for some.
MORE DETAILS:
Due to the bug, a nontrivial number of <text> nodes representing article
text appear empty, like so:
<text bytes="123456789" />
A potential fix in T365155 [3] has been identified. Assuming further
testing looks good, XML dumps will be kicked off again starting next week
in order to restore the missing records as soon as possible. It will take a
while for new dumps to be generated as it is a compute intensive operation.
More progress will be reported at T365155, and new dumps will eventually
show up on dumps.wikimedia.org.
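If you want to check whether a dump you already downloaded is affected, a quick-and-dirty scan for the symptom (self-closing <text> elements) might look like the sketch below. This is only illustrative; a real pipeline should use a streaming XML parser rather than a regex:

```javascript
// Count <text> elements that are self-closing, i.e. declare a byte count
// but carry no article text -- the symptom of the dump bug.
function countEmptyTextNodes(xmlFragment) {
  return (xmlFragment.match(/<text\b[^>]*\/>/g) ?? []).length;
}

// Tiny illustrative fragment: one affected revision, one healthy one.
const fragment = [
  '<page>',
  '  <revision><text bytes="123456789" /></revision>',
  '  <revision><text bytes="5">Hello</text></revision>',
  '</page>'
].join('\n');

console.log(countEmptyTextNodes(fragment)); // 1
```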
Although a number of pipelines may not notice the change associated with
the schema bump, if your dump ingestion tooling or use of Special:Export
relies on the specific shape of the XML at version 0.10 (e.g., because of
code generation tools), please examine the differences between version 0.10
and version 0.11. One notable change in version 0.11 is the addition of MCR
[4] fields.
Thank you for your patience while this issue is resolved.
-Adam
[1]
https://phabricator.wikimedia.org/T365501
[2]
https://www.mediawiki.org/xml/export-0.10.xsd
and
https://www.mediawiki.org/xml/export-0.11.xsd
Schema version 0.11 has existed in MediaWiki for over 6 years, but
Wikimedia wikis have been using version 0.10.
[3]
https://phabricator.wikimedia.org/T365155#9851025
and
https://phabricator.wikimedia.org/T365155#9851160
[4]
https://www.mediawiki.org/wiki/Multi-Content_Revisions
Hello all,
The next language community meeting is scheduled in a few weeks - May 31st
at 16:00 UTC. If you're interested, you can sign up on this wiki page: <
https://www.mediawiki.org/w/index.php?title=Wikimedia_Language_engineering/…
>.
This is a participant-driven meeting, where we share language-specific
updates related to various projects, collectively discuss technical issues
related to language wikis, and work together to find possible solutions.
For example, in the last meeting, the topics included the machine
translation service (MinT) and the languages and models it currently
supports, localization efforts from the Kiwix team, and technical
challenges with numerical sorting in files used on Bengali Wikisource.
Do you have any ideas for topics to share technical updates related to your
project? Any problems that you would like to bring for discussion during
the meeting? Do you need interpretation support from English to another
language? Please reach out to me at ssethi(a)wikimedia.org and add
agenda items to the document here: <
https://etherpad.wikimedia.org/p/language-community-meeting-may-2024>.
We look forward to your participation!
Cheers,
Jon, Mary, Oscar, Amir and Srishti
*Srishti Sethi*
Senior Developer Advocate
Wikimedia Foundation <https://wikimediafoundation.org/>
TLDR: fresh-node now defaults to Node.js 20, and we're introducing the "fresh-npm" security feature.
Get started:
https://gerrit.wikimedia.org/g/fresh#fresh-environment
Changelog: https://gerrit.wikimedia.org/g/fresh/+/HEAD/CHANGELOG.md
Commits: https://gerrit.wikimedia.org/r/q/project:fresh+is:merged
Hi all,
Fresh 24.05 is upon us!
*What's new?*
The fresh-node22 command has been introduced by James Forrester, and is now open for early testing. This uses the "releng/node22-test-browser" Docker image that is also available to Jenkins jobs in WMF CI. Standalone libraries and tools are welcome to opt in and switch their CI jobs in Zuul config if they pass under node22.
The default fresh-node command was updated from Node.js 18 to Node.js 20, similarly re-using the same Docker images that we use in WMF CI. These feature the same Debian Linux version, same pre-installed packages, and versions thereof. This makes it as easy as possible to reproduce CI failures locally. Vice versa, if you use Fresh in local development, you're unlikely to encounter failures in CI. You can continue to develop on older versions via the fresh-node18 and fresh-node16 commands. The fresh-node14 command has been removed (unsupported since last year <https://github.com/nodejs/Release#end-of-life-releases>).
This release includes the first contribution to Fresh by Marius Hoch (WMDE), who fixed a bug <https://gerrit.wikimedia.org/r/c/fresh/+/1034847> affecting projects with a space in their working directory name. Thanks Marius!
Finally, this release introduces the experimental "fresh-npm" feature. You can opt-in by cloning the repo and running `bin/fresh-install --secure-npm`. This will shadow the npm command in the shell on your main workstation, and avoids accidentally running potentially insecure scripts outside Fresh. Other npm commands are unaffected. It can be bypassed as-needed by specifying the full path to npm, which is also printed at the end of any fresh-npm help or error message. I previously maintained this under the name "secpm" in a local patch <https://gerrit.wikimedia.org/r/c/fresh/+/675346> since 2021. It has served myself and a handful of others well. I hope it can be useful to others!
To report issues or browse tasks, find us on Phabricator at https://phabricator.wikimedia.org/tag/fresh/.
*What is Fresh?*
Fresh is a fast way to launch isolated environments from your terminal. These can be used to work more securely and responsibly <https://timotijhof.net/posts/2019/protect-yourself-from-npm/> with Node.js-based developer tools, especially those installed from npm such as ESLint, QUnit, Grunt, Webdriver, and more. Example guide: https://www.mediawiki.org/wiki/Manual:JavaScript_unit_testing. Get started https://gerrit.wikimedia.org/g/fresh#fresh-environment
--
Timo Tijhof,
Principal Engineer,
Wikimedia Foundation.
I've been trying to clean up TitleKey a bit (hackathon project).
One issue was the deprecation of PrefixSearchBackend, the associated hook
and the class TitlePrefixSearch in 1.41. Apparently this was already meant
to be deprecated since 1.27, but never really properly carried out.
However, core's SearchEngine itself still uses all of this, and the only
proper way to override it is by reimplementing the completionSearchBackend
method in your own search engine backend, which means that you have to
provide all other search functionality via that alternative backend as
well. This is also how CirrusSearch does it.
However, for TitleKey, we essentially want to bolt this on top of an
existing backend and that's not a simple job any longer. I'm now ending up
with this:
https://gerrit.wikimedia.org/r/c/mediawiki/extensions/TitleKey/+/1036312
Three new subclasses for the three core search engines that the system admin
installing TitleKey has to choose from. Not really convenient.
Issues I see with the deprecation:
- There is no alternative hook to modify prefix search results.
- Does it even make sense for the class TitlePrefixSearch to be deprecated
(without replacement), if it implements the same logic for all core
search engine backends right now via the abstract SearchEngine class?
- Another nice option might be to make TitlePrefixSearch a service,
similar to TitleMatcher. That would make it easier to replace the code.
- Should core perhaps replace TitlePrefixSearch with a search based on a
near match (fuzzy match) via the TitleMatcher service?
Additionally:
- There is no concept of near-match title matching in core search
engines (CirrusSearch can do this, and it does so via the completion
search, but that seems a bit of a hack).
- There is also StringPrefixSearch, deprecated since 1.27 but only
officially so since 1.41, which is used exclusively by
Extension:MassEditRegex and seems to be in the same boat.
- There are more questions about the future and implementation of the
TitleKey extension itself, but I think those are better left for another time.
So my question is: where do we want to take prefix/completion search in
core? Do we want to address this, and if so, what are your suggestions?
DJ