Hello,
Recently, I stumbled upon a technical writing course that I found very
useful, and I thought of sending an email to wikitech-l recommending it.
I've also been looking, without much luck, for a resource about VueJS,
and I wanted to send an email asking if anyone knows of one.
Instead, I had the idea of a virtual library for developers, so they can
share useful resources with each other. You go to a wiki page and see
lists of courses, books, and conference videos on each topic, with
different people recommending them. You can also request a resource on a
topic, and people respond to you. If the wiki page grows too big, we can
split it into subpages by topic, and so on.
I started the page at https://www.mediawiki.org/wiki/User:Ladsgroup/Library
but I'm planning to move it to the main namespace if no one objects. Please
take a look, add more recommendations, co-sign, request a resource,
respond to a request for a resource, etc.
What do you think? Please let me know if you think it's a horrible idea or
if you have feedback on the details (mediawiki.org? Maybe we should move it
to wikitech.wikimedia.org?).
I hope this will be useful.
Best
--
Amir (he/him)
It does the same as the @ operator, except that it takes care to prevent a
very bad bug that existed before PHP 7. Details at
https://phabricator.wikimedia.org/T253461
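For context, here is a minimal, self-contained sketch of the suppression mechanics. The function names only mirror `Wikimedia\AtEase\AtEase`; this is an illustration of the scoped-suppression style, not the library code, and the nonexistent path is just an example (the pre-PHP 7 bug itself is detailed in the task).

```php
<?php
// Hypothetical sketch: AtEase-style suppression toggles error_reporting
// around a call, instead of silencing an expression with the @ operator.

$savedLevel = null;

function suppressWarnings() {
    global $savedLevel;
    // Keep fatal and parse errors visible; hide warnings/notices.
    $savedLevel = error_reporting( E_ERROR | E_PARSE );
}

function restoreWarnings() {
    global $savedLevel;
    error_reporting( $savedLevel );
}

// With @, suppression is baked into the expression itself:
$contents = @file_get_contents( '/nonexistent/example/path' );

// AtEase-style: explicit, scoped, and easy to audit.
suppressWarnings();
$contents = file_get_contents( '/nonexistent/example/path' );
restoreWarnings();

var_dump( $contents ); // bool(false), with no warning emitted
```

The scoped form makes the suppressed region explicit, which is part of why the overhead question in the task matters: the convenience of `@` is traded for clarity and safety.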
If there are other issues or benefits, please write them on the task. The
overhead of AtEase is pretty minor, so really any benefit at all is likely
to tip the balance toward keeping it. But in the event that there isn't
any, then perhaps we should slowly phase it out.
Best,
-- Timo
Hello,
The current logo of MediaWiki was adapted slightly more than fifteen years
ago and hasn't changed since. This logo, despite the nice concept of the
sunflower, is old. The sunflower represents diversity, constant growth,
and also wildness.
Among its biggest issues: it's a bitmap picture, so it's unusable at large
sizes (like large posters), and it's too realistic, making it unusable at
small sizes.
Most, if not virtually all, software products use a simpler and more
abstract form: for example, Docker, Kubernetes, Ubuntu, VueJS, React,
Apache Kafka, and many more. It's a good time for MediaWiki to follow suit.
My request is only for changing the logo of MediaWiki; I have no plans or
interest in changing the logo of any other project.
Please show your support, opposition, or comments on the discussion page.
You can also add more suggestions.
The discussion page:
https://www.mediawiki.org/wiki/Project:Proposal_for_changing_logo_of_mediaw…
Best
--
Amir (he/him)
Phabricator users,
This is to let you know that the "aphlict" service has been disabled on
Phabricator (for now) because it caused stability issues.
This means you will not get realtime (pop-up) notifications on Phabricator
(if you had those enabled in the first place).
Regular notifications (that do not pop up) and emails are not affected by
this.
https://phabricator.wikimedia.org/T238593
--
Daniel Zahn <dzahn(a)wikimedia.org>
Operations Engineer
Hi Everyone,
Mark your calendars! Wikimedia Tech Talks 2020 Episode 6 will take place
on Wednesday, 12 August 2020, at 17:00 UTC.
Title: Retargeting extensions to work with Parsoid
Speaker: Subramanya Sastry
Summary:
The Parsing team is aiming to replace the core wikitext parser with Parsoid
for Wikimedia wikis sometime late next year. Parsoid models and processes
wikitext quite differently from the core parser (all that Parsoid
guarantees is that the rendering is largely identical, not the specific
process of generating it). This means that extensions that extend the
behavior of the parser will need to adapt in order to provide similar
functionality with Parsoid [1]. With that in mind, we have been working to
more clearly specify how extensions need to adapt to the Parsoid regime.
At a high level, here are the questions we needed to answer:
1) How do extensions "hook" into Parsoid?
2) When the registered hook listeners are invoked by Parsoid, how do they
process any wikitext they need to process?
3) How is the extension's output assimilated into the page output?
Broadly, the (highly simplified) answers are as follows:
1) Extensions now need to think in terms of transformations (convert this
to that) instead of events (at this point in the pipeline, call this
listener). So: more transformation hooks, fewer parsing-event hooks.
2) Parsoid provides all registered listeners with a ParsoidExtensionAPI
object to interact with it, which extensions can use to process wikitext.
3) The output is treated as a "fully processed" page/DOM fragment. It is
appropriately decorated with additional markup and slotted into place in
the page. Extensions need not make any special efforts (e.g. strip state)
to protect it from the parsing pipeline.
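As a rough sketch of the three answers above, a tag extension under the draft API [2] might look something like the following. The `<myext>` tag, class names, and option values here are hypothetical, and the exact interfaces and method names may differ from the draft, so treat this as an illustration of the "transformation, not event" model rather than a working extension.

```php
<?php
// Hedged sketch of a Parsoid tag extension (draft API [2]; names may
// have changed). Not runnable outside a Parsoid install.

namespace MyExt;

use Wikimedia\Parsoid\Ext\ExtensionModule;
use Wikimedia\Parsoid\Ext\ExtensionTagHandler;
use Wikimedia\Parsoid\Ext\ParsoidExtensionAPI;

class MyExtHandler extends ExtensionTagHandler {
	// A transformation hook (answer 1): convert the tag's wikitext
	// source directly into a fully processed DOM fragment (answer 3),
	// rather than registering for parsing events.
	public function sourceToDom(
		ParsoidExtensionAPI $extApi, string $src, array $extArgs
	) {
		// Use the ParsoidExtensionAPI object (answer 2) to process any
		// nested wikitext, instead of calling back into the parser.
		// Method name and options are per the draft docs and may differ.
		return $extApi->extTagToDOM( $extArgs, '', $src, [
			'wrapperTag' => 'div',
			'parseOpts' => [ 'extTag' => 'myext', 'context' => 'block' ],
		] );
	}
}

class MyExtModule implements ExtensionModule {
	// Registration: declare which tags this module handles and which
	// handler class implements the transformation.
	public function getConfig(): array {
		return [
			'name' => 'MyExt',
			'tags' => [
				[ 'name' => 'myext', 'handler' => MyExtHandler::class ],
			],
		];
	}
}
```

Note that there is no strip-state bookkeeping anywhere: the returned fragment is slotted into the page by Parsoid itself.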
In this talk, we will go over the draft Parsoid API for extensions [2] and
the kind of changes that would need to be made. While in this initial
stage, we are primarily targeting extensions that are deployed on the
Wikimedia wikis, eventually, all MediaWiki extensions that use parser hooks
or use the "parser API" to process wikitext will need to change. We hope to
use this talk to reach out to MediaWiki extension developers and get
feedback about the draft API so we can refine it appropriately.
[1] https://phabricator.wikimedia.org/T258838
[2] https://www.mediawiki.org/wiki/Parsoid/Extension_API
The link to the Youtube Livestream can be found here:
<https://www.youtube.com/watch?v=jNNy8ALGjaE>
https://www.youtube.com/watch?v=lS1xPkERWCM
During the live talk, you are invited to join the discussion on IRC at
#wikimedia-office
You can browse past Tech Talks here:
https://www.mediawiki.org/wiki/Tech_talks
If you are interested in giving your own tech talk, you can learn more here:
https://www.mediawiki.org/wiki/Project:Calendar/How_to_schedule_an_event#Te…
Kindly,
Sarah R. Rodlund
Senior Technical Writer, Developer Advocacy
<https://meta.wikimedia.org/wiki/Developer_Advocacy>
srodlund(a)wikimedia.org
Hi,
tl;dr: Help would be appreciated testing a new MediaWiki codesearch
UI: <https://codesearch-beta.wmcloud.org/>. Sticking "-beta" in
existing URLs should just work.
The current codesearch interface is a pretty bad hack based on upstream's
UI. I originally implemented it that way on the assumption that upstream
would continue to make improvements that we could benefit from, but that
didn't really happen.
The new beta UI implements some features I wanted/others have asked for:
* Switching search profiles doesn't lose your search query
* An overview listing which repositories have results
* An option to get a Phabricator checklist based on the results
Some features that I don't use are missing (e.g. the manual repository
selector), but if people want them, I can implement them. Please send
requests or general feedback via email or the codesearch Phabricator
project.
If people are happy/satisfied with the new UI, I'd like to replace the
old one in a few weeks.
I'd also appreciate some help with some of the minor layout/styling
issues. The code[1] is written in Rust and compiled to WebAssembly,
but the styling is all Bootstrap so hopefully it's easy to work with.
I'm not sure it'll remain implemented in client-side Rust; the
performance really isn't that great.
[1]
https://gerrit.wikimedia.org/g/labs/codesearch/+/refs/heads/master/frontend/
Thanks,
-- Legoktm
The 1.36.0-wmf.2 version of MediaWiki is currently blocked at group1.[0]
The new version can proceed no further until this issue is resolved:
* Argument 1 passed to Wikimedia\Parsoid\Utils\DOMDataUtils::getDataMw()
must be an instance of DOMElement
- https://phabricator.wikimedia.org/T259311
Thanks for any help resolving this issue. Since we're past today's
cutoff and there are no deploys on Friday, the train can roll forward on
Monday morning, assuming a fix.
Thanks to Mholloway, Msantos, Legoktm, tgr, and RoanKattouw for their
assistance with earlier blockers this week.
-- Your temporary train trundler
[0]. <https://phabricator.wikimedia.org/T257970>
[1]. <https://tools.wmflabs.org/versions/>
Hi all,
Here are the minutes from this week's TechCom meeting:
== RFC: Stop supporting legacy PHP entry point extensions ==
<https://phabricator.wikimedia.org/T258845>
* New RFC by Kunal (Legoktm) that proposes gradually ending support for
legacy PHP entry points
* TS: The RFC proposes ending support for legacy entry points by
MediaWiki 1.43, which could be five years from now.
* GL: That long a timeline might be too conservative, considering that
there's a viable migration path. Wikidata is removing it from production
this week. The percentage of extensions on Gerrit using the new system
leads me to believe that we can do this on a shorter timeline, especially
if there are advantages to not having to support the legacy entry points.
* RK: 1.35 will be an LTS release that ships with support for legacy PHP
entry points, giving longer support to projects that need it. We could
remove them before 1.39, our next LTS release.
* TS: If there's a large extension or project that needs it, we can be
sensitive to the timing of the deprecation.
* Discussion continues on the task
== Consolidate language metadata into a 'language-data' library and use in MediaWiki ==
<https://phabricator.wikimedia.org/T190129>
* NL: No remaining unanswered questions that would change the direction
of the proposal.
* Ready to move from P3 (Explore) to P4 (Tune)
== RFC: Render data visualizations on the server ==
<https://phabricator.wikimedia.org/T249419>
* DA: This RFC is stalled due to pushback on the technical side coupled
with inactivity on the product side. Client-side visualization seems to be
uncontroversial so far, but I took the inactivity as a decrease in
priority. I'm still interested in the problem technically, but it needs
proper resourcing and stewardship. Any change away from the current
architecture runs into other problems. I proposed taking a deeper look at
how we store graph definitions.
* GL: We un-deployed Graphoid from groups 0 and 1 only, which may be the
cause of the lack of feedback.
* DA: I wouldn't suggest re-deploying Graphoid. The idea was to deploy a
new version in a new repository.
* TT: Removing Graphoid from group 2 could change the priority. When we
un-deploy Graphoid, there will be a gap when getting things from Siri, as
well as impact to the graphs on pages related to COVID-19. I did hear some
interest, but we need actual resourcing.
* GL: For anything going into production from now on, we're considering
the typical performance expected and determining an error budget. If a
project overruns its error budget for the quarter, the responsible team
will be required to do something about it: either make it more stable or
lose SRE support.
== RFC: Normalize MediaWiki link tables ==
<https://phabricator.wikimedia.org/T222224>
* TT: There's currently a straw-man proposal open for feedback. Waiting
for DBA feedback.
* No IRC discussion scheduled for next week
You can also find our meeting minutes at
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>
See also the TechCom RFC board
<https://phabricator.wikimedia.org/tag/mediawiki-rfcs/>.
If you prefer, you can subscribe to our newsletter here:
<https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar>
--
Alex Paskulin
Technical Writer
Wikimedia Foundation