I am pleased to announce the launch of the third Inspire Campaign for
IdeaLab, focused on addressing harassment of Wikimedia project contributors:
<https://meta.wikimedia.org/wiki/Grants:IdeaLab/Inspire>
Harassment diminishes the experience of contributing and participating for
a substantial number of individuals, even those who simply witness it.
Current methods of dealing with harassment are considered unacceptable as
they often do not lead to productive outcomes.[1]
During the month-long campaign, you are invited to submit & review ideas on
how to better address harassment. Consider joining a team to help make an
idea happen. Ideas can be submitted in any language, and may focus on
research, building tools or software, outreach efforts, or something
completely new. Grants are available from the Wikimedia Foundation to fund
projects that are eligible for financial support.[2] Ideas focused on
changes to community policies and guidelines are also welcome. Google
Hangout sessions are also scheduled in June if you’d like to discuss your
idea or have questions about WMF grants.[3]
Questions about the campaign can be directed to the Inspire talk page.[4]
An FAQ page about the campaign is also available.[5]
If you want to help make your projects safer for everyone to participate
in, I encourage you to participate in this Inspire Campaign. I believe we
can work together to address this difficult and important issue.
With thanks,
Chris "Jethro" Schilling
I JethroBT (WMF) <https://meta.wikimedia.org/wiki/User:I_JethroBT_(WMF)>
Community Organizer, Wikimedia Foundation
<https://wikimediafoundation.org/wiki/Home>
[1] <https://commons.wikimedia.org/wiki/File:Harassment_Survey_2015_-_Results_Re…>
[2] <https://meta.wikimedia.org/wiki/Grants:Start>
[3] <https://meta.wikimedia.org/wiki/Grants:IdeaLab/Events>
[4] <https://meta.wikimedia.org/wiki/Grants_talk:IdeaLab/Inspire>
[5] <https://meta.wikimedia.org/wiki/Grants:IdeaLab/Inspire/FAQ>
From time to time, emails like the one below pass by, and I wish that
voting were generalized at a technical level, so that any organization
participating in the wikiverse could conduct votes and reuse voting
rights. Would this be something of broader value?
Best
Rupert
---------- Forwarded message ----------
From: "Sandister Tei" <sandistertei(a)gmail.com>
Date: May 31, 2016 16:33
Subject: Re: [Wikimedia-GH] Help us choose volunteers to serve you
To: "Sadat" <masssly(a)ymail.com>
Cc: "Planning Wikimedia Ghana Chapter" <wikimedia-gh(a)lists.wikimedia.org>
And if they vote with another account?
Regards,
Sandister Tei
www.sandistertei.com
Via mobile
On 31 May 2016 2:28 p.m., "Mohammed S. Abdulai" <masssly(a)ymail.com> wrote:
> You can just restrict multiple entries, I believe you're familiar with
> that process.
>
> Cheers
>
> -Masssly
>
> Sent from my Samsung Galaxy smartphone.
> -------- Original message --------
> From: Sandister Tei <sandistertei(a)gmail.com>
> Date: 05/31/2016 10:38 (GMT+00:00)
> To: Sadat <masssly(a)ymail.com>
> Cc: Planning Wikimedia Ghana Chapter <wikimedia-gh(a)lists.wikimedia.org>
> Subject: RE: [Wikimedia-GH] Help us choose volunteers to serve you
>
> You can suggest another means to check double voting.
>
> Regards,
> Sandister Tei
>
> www.sandistertei.com
>
> Via mobile
> On 31 May 2016 10:07 a.m., "Mohammed S. Abdulai" <masssly(a)ymail.com>
> wrote:
>
>> For best practices we shouldn't REQUIRE prospective respondents to enter
>> their names.
>>
>> I would like to see that relaxed.
>>
>> Thanks
>>
>> -Masssly
>>
>>
>> Sent from my Samsung Galaxy smartphone.
>> -------- Original message --------
>> From: Sandister Tei <sandistertei(a)gmail.com>
>> Date: 05/31/2016 08:00 (GMT+00:00)
>> To: Planning Wikimedia Ghana Chapter <wikimedia-gh(a)lists.wikimedia.org>
>> Subject: [Wikimedia-GH] Help us choose volunteers to serve you
>>
>> Hello all.
>>
>> Help us choose volunteers to serve you.
>> Please take this survey <http://goo.gl/forms/NQSLDL0CRGwwy3hi2>. It's
>> just two questions.
>>
>>
>> *Regards, *
>> *Sandister Tei *
>>
>> *www.sandistertei.com <http://www.sandistertei.com>*
>>
>
_______________________________________________
Wikimedia-GH mailing list
Wikimedia-GH(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikimedia-gh
Hi all,
I wrote an application for an IEG grant on creating a testing environment
for Lua scripts.[1] Perhaps it is interesting for you. It is mainly a tool
for on-wiki testing of scripts, and I'm not sure how interesting it would
be for off-wiki testing.
When I wrote the application I said it was about BDD-style testing, but it
would probably be better to describe it as testing with spec-statements.[2]
It is somewhat similar to the library "Busted",[3] but it uses expect
instead of assert.
No changes to the Lua core are necessary to run these kinds of tests; the
changes are mostly for proper localization of messages and a bit more
stability and repeatability.
Likewise, I sketched an alternate future grant application for ATDD-style
testing, which is testing with step-statements. The BDD library will only
do spec-type testing.
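The distinction between assert-style and expect-style spec statements can be sketched as follows. This is in Python purely for illustration (the proposed library itself is Lua), and the `Expectation` class and its method names are hypothetical, not the grant's actual API:

```python
# Minimal illustration of spec-style "expect" statements versus plain asserts.
# The Expectation class and method names are hypothetical, shown only to
# contrast the two styles; the real Lua library's API may differ.

class Expectation:
    def __init__(self, actual):
        self.actual = actual

    def to_equal(self, expected):
        # Unlike a bare assert, which aborts with a terse message, an
        # expect-style framework can return a structured result that a
        # test runner can report, count, and localize.
        if self.actual != expected:
            return (False, f"expected {expected!r}, got {self.actual!r}")
        return (True, "ok")

def expect(actual):
    return Expectation(actual)

# assert-style: hard failure, terse message
assert 1 + 1 == 2

# expect-style: structured result for the runner to present
ok, message = expect(1 + 1).to_equal(2)
print(ok, message)
```

The structured result is what makes "proper localization of messages" possible: the runner, not the assertion, decides how to render a failure.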
Add yourself as an endorser if you think this is a good idea! :)
[1]
https://meta.wikimedia.org/wiki/Grants:IEG/Lua_libs_for_behavior-driven_dev…
[2] https://en.wikipedia.org/wiki/Behavior-driven_development
[3] http://olivinelabs.com/busted/
Hello,
Here's what the performance team has been up to.
== Dashboards & instrumentation ==
We spent time instrumenting software and curating displays of performance
data. We have several new dashboards to share with you:
* Global edit rate and save failures (new)
https://grafana.wikimedia.org/dashboard/db/edit-count
* Performance metrics (revamped)
https://grafana-admin.wikimedia.org/dashboard/db/performance-metrics
* Page load performance
https://grafana.wikimedia.org/dashboard/db/navigation-timing
...by continent:
https://grafana.wikimedia.org/dashboard/db/navigation-timing-by-continent
...by country:
https://grafana.wikimedia.org/dashboard/db/navigation-timing-by-geolocation
...by browser:
https://grafana.wikimedia.org/dashboard/db/navigation-timing-by-browser
* We found that certain browsers were reporting wildly inaccurate timing
data and skewing our summary performance metrics, and reacted by validating
browser metric data more strictly against Navigation Timing API specs.
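One basic sanity check implied by the Navigation Timing processing model is that the timing marks must be present, non-negative, and monotonically ordered. A sketch of that kind of validation (the mark subset and the filter itself are illustrative, not the team's actual validation rules):

```python
# Sketch of validating Navigation Timing samples before aggregating them.
# The ordering below follows the Navigation Timing processing model: events
# later in the page load cannot have earlier timestamps. This is an
# illustrative filter, not the production validation code.

ORDERED_MARKS = [
    "navigationStart",
    "fetchStart",
    "responseStart",
    "responseEnd",
    "domComplete",
    "loadEventEnd",
]

def is_plausible(timing: dict) -> bool:
    """Reject samples whose marks are missing, negative, or out of order."""
    previous = 0
    for mark in ORDERED_MARKS:
        value = timing.get(mark)
        if value is None or value < previous:
            return False
        previous = value
    return True

good = {"navigationStart": 1, "fetchStart": 5, "responseStart": 120,
        "responseEnd": 150, "domComplete": 900, "loadEventEnd": 950}
skewed = {"navigationStart": 1, "fetchStart": 5, "responseStart": 120,
          "responseEnd": 90, "domComplete": 900, "loadEventEnd": 950}
print(is_plausible(good), is_plausible(skewed))  # True False
```

Dropping samples like `skewed` before aggregation is what keeps a handful of misreporting browsers from skewing the summary metrics.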
== ResourceLoader ==
ResourceLoader is the MediaWiki subsystem responsible for loading CSS,
JavaScript, and i18n interface messages for dynamic site features. It is
critical to site performance. Changes to ResourceLoader are focused on
reducing backend response time, ensuring we make efficient use of the
browser cache, and reducing time to first paint (the time it takes any
content to appear). This work is led by Timo Tijhof.
* The "/static/$mwBranch" entry point has been deprecated and removed in
favor of wmfstatic, a new multiversion-powered entry point accessed via
"/w" (via RewriteRule)
https://phabricator.wikimedia.org/T99096
* Restricting addModuleStyles() to style-only modules (ongoing)
https://phabricator.wikimedia.org/T92459
* The startup module check is now based on a feature test instead of a
browser blacklist
https://phabricator.wikimedia.org/T102318
== WebPageTest ==
Page load performance varies by browser, platform, and network. To
anticipate how code changes will impact page performance for readers and
editors, we use WebPageTest (https://wikitech.wikimedia.org/wiki/WebPageTest),
a web performance browser automation tool. WebPageTest loads pages on
Wikimedia wikis using real browsers and collects timing metrics. This work
is led by Peter Hedenskog.
* We now generate waterfall charts for page loads on Firefox. Previously we
were only able to produce them with Chrome.
* We tracked down two bugs in WebPageTest that caused it to report an
incorrect value for time-to-first-byte, and reported them upstream.
https://phabricator.wikimedia.org/T130182
https://phabricator.wikimedia.org/T129735
* We upgraded the WebPageTest agent instance after observing variability in
measurements when the agent is under load.
https://phabricator.wikimedia.org/T135985
* We designed a new dashboard to help us spot performance regressions
https://grafana.wikimedia.org/dashboard/db/webpagetest
== Databases ==
The major effort in backend performance has been to reduce replication lag.
Replication lag occurs when a slave database is not able to reflect changes
on the master database quickly enough and falls behind. Aaron Schulz set
out to bring peak replication lag down from ten seconds to below five, by
identifying problematic query patterns and rewriting them to be more
efficient. We are very close to hitting that target: replication lag is
almost entirely below five seconds on all clusters.
https://phabricator.wikimedia.org/T95501
* High lag on databases used to generate special pages no longer stops job
queue processing
https://phabricator.wikimedia.org/T135809
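A common shape for the query rewriting described above is to break large write operations into small batches and wait for replicas to catch up between batches, so no single statement generates more change than the replicas can replay promptly. A conceptual sketch (the function names, batch size, and lag threshold are illustrative, not MediaWiki's actual API):

```python
# Conceptual sketch of lag-aware batched writes: instead of one huge write
# that replicas struggle to replay, issue small batches and pause whenever
# replication lag exceeds a threshold. All names are illustrative.

import time

MAX_LAG_SECONDS = 5
BATCH_SIZE = 100

def delete_rows_in_batches(rows, delete_batch, get_replica_lag):
    """delete_batch(batch) performs the write; get_replica_lag() -> seconds."""
    for start in range(0, len(rows), BATCH_SIZE):
        delete_batch(rows[start:start + BATCH_SIZE])
        # Give replicas time to catch up before issuing the next batch.
        while get_replica_lag() > MAX_LAG_SECONDS:
            time.sleep(1)

# Example with in-memory stand-ins for the database:
deleted = []
delete_rows_in_batches(
    rows=list(range(250)),
    delete_batch=deleted.extend,
    get_replica_lag=lambda: 0,  # pretend replicas are caught up
)
print(len(deleted))  # 250
```

The threshold here mirrors the five-second target mentioned above; the essential idea is that the writer, not the replicas, absorbs the delay.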
== Multi-DC ==
"Multi-DC" refers to ongoing work to make it possible to serve reads from a
secondary data center. Having MediaWiki running and serving requests in
more than one data center will reduce latency and improve site reliability.
This project is led by Aaron Schulz.
In order for this to be possible, we need to be able to anticipate which
requests will need the master database, so we can route them accordingly.
The plan is to achieve this by making sure that GET requests never require
a master database connection. We've made incremental progress
here, most recently by changing action=rollback to use JavaScript to
perform HTTP POST requests.
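The routing rule this implies, reads can be served locally while writes must reach the primary, can be sketched by dispatching on the HTTP method (the data center names follow Wikimedia's eqiad/codfw convention, but the function and defaults are illustrative, not production code):

```python
# Sketch of routing requests by HTTP method so that only write requests
# need the primary ("master") data center. Names are illustrative.

READ_ONLY_METHODS = {"GET", "HEAD", "OPTIONS"}

def choose_datacenter(method: str, primary="eqiad", secondary="codfw"):
    """Read-only requests can be served from a secondary DC; writes go to the primary."""
    if method.upper() in READ_ONLY_METHODS:
        return secondary
    return primary

print(choose_datacenter("GET"))   # codfw
print(choose_datacenter("POST"))  # eqiad
```

This is why changes like the action=rollback one matter: a rule this simple only works once no GET request ever performs a write.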
We also need to be able to broadcast cache purges across data centers. The
major work on this front has been the addition to core of EventBus classes
that relay cache proxy and object cache purges. Stas Malyshev of the
discovery team is assisting with this work.
== Thumbor ==
"Thumbor" is shorthand for the project to factor thumbnail rendering out of
MediaWiki and into a standalone service based on Thumbor (
http://thumbor.org/). This project is led by Gilles Dubuc. The following
list summarizes recent progress:
- Simplified the VCL as much as possible
- Added client throttling with the tbf vmod
- Added progressive JPEG support to ImageMagick engine
- Added configurable chroma subsampling support
- Made SVG detection more robust
- Added multilanguage SVG support
- Reproduced the temp folder security mechanism MediaWiki uses for SVGs,
for all file types
- Ported Swift's rewrite.py to Vagrant; on Vagrant, Thumbor now hooks into
the same point in the stack as it will in production
- Swift storage implemented (shard support left to do)
- Matched Content-Disposition behavior to MediaWiki
- Vastly improved JPEG processing performance by using a long-running
exiftool process and named pipes to pass commands to it
- Made one instance of thumbor run on each available core on vagrant, since
thumbor is single-threaded
- Debian packaging is well under way: https://phabricator.wikimedia.org/T134485
All dependencies are covered except one; 14 backports and 17 new packages so
far. Working with Filippo to get as many of these into Debian proper as
possible.
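The long-running exiftool change in the list above amortizes process startup cost: rather than spawning a process per image, commands are streamed to one persistent worker. A generic sketch of that pattern (using an ordinary subprocess and anonymous pipes for portability; the production setup uses exiftool and named pipes, and the worker below is a trivial stand-in):

```python
# Sketch of the persistent-worker pattern: one long-running child process
# receives commands on its stdin and answers on its stdout, so process
# startup cost is paid once instead of once per image.
# The worker here is a trivial stand-in for exiftool.

import subprocess
import sys

WORKER_SOURCE = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    sys.stdout.write(line.upper())\n"
    "    sys.stdout.flush()\n"
)

worker = subprocess.Popen(
    [sys.executable, "-c", WORKER_SOURCE],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

def run_command(command: str) -> str:
    """Send one command line to the persistent worker and read its reply."""
    worker.stdin.write(command + "\n")
    worker.stdin.flush()
    return worker.stdout.readline().strip()

results = [run_command(c) for c in ("strip metadata", "subsample chroma")]
worker.stdin.close()
worker.wait()
print(results)  # ['STRIP METADATA', 'SUBSAMPLE CHROMA']
```

Each call now costs one pipe round-trip instead of a fork/exec, which is where the JPEG-processing speedup comes from.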
Until next time,
Aaron, Gilles, Ori, Timo, and Peter
With https://gerrit.wikimedia.org/r/288633, some new classes in the Babel
extension were put into a new namespace, "MediaWiki\Babel\BabelBox".
Legoktm approved, but Thiemo Mättig (WMDE) disagrees. PHPUnit tests are
already in the namespace "Babel\Tests".
Hi,
I just released MediaWiki-Codesniffer 0.7.2, which is a bugfix release
for 0.7.1. There was an issue[1] in the SpacyParenthesisSniff that would
sometimes eat the content inside of parentheses or brackets. I ran a
script to check, and the only affected extension with bad code committed
was ArticlePlaceholder, which has already been fixed.
Thanks to Thiemo and Nikerabbit for reporting, and PleaseStand for the
patch. Additionally, EBernhardson overhauled how our tests run so we now
have the ability to test sniffs that automatically fix code, which will
hopefully prevent regressions like this in the future!
I submitted autogenerated patches to upgrade all extensions already on
0.7.1 to 0.7.2; they should only touch composer.json.
[1] https://phabricator.wikimedia.org/T134857
-- Legoktm