Hey all, if you are not watching
https://www.mediawiki.org/wiki/Parsing/Get_involved (and you should!),
here's an update that Subbu wanted to share with all of us about the
progress on the current initiative to fix wikitext patterns that behave
differently with RemexHTML.
Some wikis have really taken action, which is great, but some big
communities (notably enwp) are lagging behind...
As a reminder, the numbers you'll see in some dashboards do *not* mean you
need to fix millions of pages: most of the time you'll fix one template
and most, if not all, of the errors will be gone.
Also: if you want to support the communities in figuring out what's to be
done, note that the page is marked for translation, but you could probably
focus just on section 5 (the one I linked).
Finally, the team is always, always available for questions and
clarifications. Just leave a message on the FAQ's talk page ;)
-------- Forwarded Message --------
Subject: Followup (Re: Tidy will be replaced by RemexHTML on Wikimedia
wikis latest by June 2018)
Date: Thu, 10 Aug 2017 14:42:31 -0400
From: Subramanya Sastry <ssastry(a)wikimedia.org>
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
On 07/06/2017 08:02 AM, Subramanya Sastry wrote:
> The Parsing team wants to replace Tidy with a RemexHTML-based solution on the
> Wikimedia cluster by June 2018. This will require editors to fix pages and
> templates to address wikitext patterns that behave differently with
> RemexHTML. Please see the 'What editors will need to do' section on the Tidy
> replacement FAQ.
> 9. Monitoring progress
> In order to monitor progress, we plan to do a weekly (or some such periodic
> frequency) test run that compares the rendering of pages with Tidy and with
> RemexHTML on a large sample of pages (in the 50K range) from a large subset
> of Wikimedia wikis (~50 or so). This will give us a pulse of how fixups are
> going, and when we might be able to flip the switch on different wikis.
I wanted to post some followups on this.
1. We have a revived dashboard that tracks linter error counts on wikis
for all linter categories.
2. We track the error counts as they change and publish weekly snapshots
   comparing counts to a July 24th baseline (which is when I first
   started collecting stats).
3. We also have a pixel-diffs test run (previously called visual diffs)
that compares page rendering with Tidy and with RemexHTML. The test
set has 73K pages sampled from 60 wikis. These diffs more accurately
reflect what kind of rendering differences we can expect to see if
pages are not fixed.
4. Based on the runs above, I identified one more high-priority linter
   category, stemming from a Tidy whitespace bug, that needs to be fixed
   (expect mostly templates, especially navboxes, based on what I've seen
   in the test run above). Once the code is reviewed and deployed to the
   cluster, we'll start populating this category.
See https://gerrit.wikimedia.org/r/#/c/371068/ and
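To make the comparison idea above concrete, here is a minimal sketch (not
the Parsing team's actual test harness) of how a periodic run might flag
pages whose Tidy and RemexHTML renderings differ. The sample structure and
function name are hypothetical; it assumes the two serialized HTML outputs
have already been fetched for each sampled page.

```javascript
// samples: { pageTitle: { tidy: '<serialized HTML>', remex: '<serialized HTML>' } }
// Returns the titles whose two renderings are not identical, i.e. the
// pages that would need a closer (e.g. pixel-diff) look or a fixup.
function flagDifferingPages(samples) {
    return Object.entries(samples)
        .filter(([, r]) => r.tidy.trim() !== r.remex.trim())
        .map(([title]) => title);
}
```

A real run would of course diff rendered screenshots rather than raw HTML,
since markup differences don't always produce visible rendering differences.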
I have a short six-sentence message that I would like translated into
German, to share with the German Wikipedia community about an upcoming A/B
test. Help with the translation would be most appreciated.
Mein Deutsch ist schrecklich. Ich bin außer Übung! :) [My German is
terrible. I'm out of practice!]
I haven't tried this, but it sounds like translators and Language
Engineering folks might be interested.
---------- Forwarded message ----------
From: Alex Brollo <alex.brollo(a)gmail.com>
Date: Tue, Aug 1, 2017 at 12:10 AM
Subject: [Wikisource-l] An it.source gadget to manage diacritics
To: wikisource list <wikisource-l(a)lists.wikimedia.org>
Just to let it be known: some it.source contributors are using a handy
gadget to manage diacritics - it can delete, replace, or add a pretty large
list of diacritical marks to any character with a single click.
It uses the .normalize() string method, decomposing and recomposing (when
possible) Unicode characters and allowing diacritics to be managed
independently of the base ASCII character.
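The decompose/recompose technique the gadget relies on can be sketched like
this (the helper names are mine, not the gadget's):

```javascript
// NFD normalization splits a precomposed character into its base letter
// plus combining marks; the marks can then be removed or swapped
// individually before recomposing with NFC.
function stripDiacritics(s) {
    // U+0300-U+036F is the Combining Diacritical Marks block.
    return s.normalize('NFD').replace(/[\u0300-\u036F]/g, '').normalize('NFC');
}

function addDiacritic(ch, combiningMark) {
    // Recompose where a precomposed character exists, e.g. 'e' + U+0301 -> 'é'.
    return (ch + combiningMark).normalize('NFC');
}
```

Note that .normalize() only recomposes when a precomposed code point exists
in Unicode; otherwise the base character and combining mark simply stay as
a two-code-point sequence, which still renders correctly.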
Perhaps this gadget is "rediscovering the wheel"...? Anyway, the code is