I'm writing to let you know of a cool new facility for debugging MediaWiki
code on the Wikimedia production cluster -- the X-Wikimedia-Debug
<https://wikitech.wikimedia.org/wiki/X-Wikimedia-Debug> HTTP header.
By setting this header on requests to Wikimedia wikis, you can:
- Bypass the cache.
- Force Varnish to pass your request to a specific backend server.
- Profile requests and log profiling data to XHGui.
- Turn on all log channels and send log messages to a special view in
Kibana / Logstash.
- Force MediaWiki to process the request in read-only mode.
And the best part: there are browser extensions for Chrome and Firefox that
provide a friendly user interface for these features. Read the docs on
Wikitech <https://wikitech.wikimedia.org/wiki/X-Wikimedia-Debug> for more
details.
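As a rough illustration, the header value is a small list of options. The sketch below (not from the announcement; option names mirror the Wikitech docs, and the backend hostname in the usage comment is a hypothetical example) shows how a client might compose it:

```javascript
// Compose an X-Wikimedia-Debug header value from an options object.
// This is an illustrative helper, not an official client library.
function buildDebugHeader(options) {
  const parts = [];
  if (options.backend) {
    // Route the request to a specific backend server.
    parts.push('backend=' + options.backend);
  }
  // Boolean flags toggle verbose logging, XHGui profiling, and read-only mode.
  ['log', 'profile', 'readonly'].forEach(function (flag) {
    if (options[flag]) {
      parts.push(flag);
    }
  });
  return parts.join('; ');
}

// Usage (hypothetical backend host):
// fetch('https://en.wikipedia.org/wiki/Main_Page', {
//   headers: { 'X-Wikimedia-Debug': buildDebugHeader({ backend: 'mwdebug1001.eqiad.wmnet', log: true }) }
// });
```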
We’ve gotten good participation as we’ve worked on sections of the Code
of Conduct over the past few months, and have made considerable
improvements to the draft based on your feedback.
Given that, and the community approval expressed through the discussions on
each section, the best approach is to proceed by approving the draft section
by section until the last section is done.
So, please continue to improve the Code of Conduct by participating now
and as future sections are discussed. When the last section is
completed and approved on the talk page, the Code of Conduct will become
policy and no longer be marked as a draft.
Also, two more discussions regarding the Code of Conduct have been
resolved and incorporated into the draft.
* "Enforcement issues" addressed the reporting process and clarified
that Committee decisions could not be circumvented
* "Marginalized and underrepresented groups" forbids discrimination
A heads up that the Reading Web team has been updating the MobileFrontend
schema code to use mw.eventLog rather than the custom Schema class.
This will mean that MobileFrontend won't support logging events anymore if
sendBeacon is not supported. We used to use localStorage as a fallback for
browsers without sendBeacon support.
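A sketch of the resulting behavior (these are assumptions for illustration, not MobileFrontend's actual code): without a localStorage fallback, events are only delivered when the browser provides navigator.sendBeacon.

```javascript
// Illustrative event logger: nav is the browser's navigator object
// (passed in here so the sketch is testable outside a browser).
function logEvent(nav, url, payload) {
  if (typeof nav.sendBeacon === 'function') {
    // Queues the event for async delivery, even during page unload.
    return nav.sendBeacon(url, JSON.stringify(payload));
  }
  // Previously a localStorage queue covered this case; now the event is dropped.
  return false;
}
```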
Process update starting next week:
* If you'd like something you say to make it into the weekly Tech News, tag
it with #technews
= *2016-03-30* =
== Technology ==
=== Analytics ===
* '''Blocking''': none
* '''Blocked''': none
** Unique Devices data released; pageviews data officially released;
pagecounts-raw and pagecounts-all-sites deprecated;
dumps.wikimedia.org/analytics has details
** Request Breakdown reports from Hadoop are ready; they replace the old
Wikistats squid reports. Some UI improvements coming shortly
=== Architecture ===
=== Performance ===
=== Release Engineering ===
** scap 3.1 will be in production on Tuesday, April 5th, now with large
binary support! (via git-fat, which Trebuchet used, for feature parity)
** Code freeze week of April 18th (DC switchover)
=== Research ===
** Deployed ORES swagger & v2 paths. Can now ask for feature lists and do
feature injection when scoring. Announcement coming.
=== Security ===
=== Services ===
*** Redirects for File: titles active only for the native mobile apps
*** general availability blocked on VE -
*** bumped s-maxage for purged content to 1 week
** deployed the change-propagation service (via scap3)
** working on docs - https://www.mediawiki.org/wiki/Documentation/Services
=== Technical Operations ===
** by no one
** changeprop is live, worked with Marko from Services team
** working on ORES and scap3 integration with Amir
== Product ==
=== Community Tech ===
=== Discovery ===
** Not that we are aware of
** Quick Surveys: not known to be blocked on anyone in SoS, but we might ask
** Existing: ops https://phabricator.wikimedia.org/T127014 and security
** We are experimenting with publishing weekly status updates:
=== Editing ===
==== Collaboration ====
** External store work
** Working on support for Flow notifications being properly hidden on
** Work on the Echo special page
==== Language ====
** Work on the Parallel Corpora dump will need some time from Ops (Ariel),
==== Multimedia ====
==== Parsing ====
[ Subbu: I am feeling under the weather and will probably be napping /
resting and won't show up for this SoS .. Scott might show up if he sees
his email in time, but I've updated the etherpad in any case ]
** The VE and CX teams already know about the tasks I've filed against
them for setting user-agent and accept headers. I've filed a ticket
against Flow as well to set the user-agent header in your requests. Not
urgent, but good to get it done sooner rather than later.
** Work is ongoing to move data-mw out of an inlined attribute. CX, VE,
Flow: please start thinking about what this means and how you will process
data-mw and HTML from separate API requests. The Parsoid and RESTBase side
of the work will be done in 2-3 weeks' time (which includes performance
evaluation and the impact of supporting old versions, etc.). We won't turn
this on in production (3-4 weeks away at least) without consulting all
affected clients. The accept: header is your way of getting the old version
until you are ready to switch, but we prefer that the switch not be delayed
inordinately. I've already filed tickets against these projects, so this
is just a heads up.
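As a hedged sketch of the content-negotiation idea above: a client can pin itself to an older HTML version via the Accept header. The profile URI pattern and the version number below are illustrative placeholders; check the Parsoid/RESTBase documentation for the exact value your client should send.

```javascript
// Build an Accept header that requests a specific content-format version.
// The profile URI scheme shown here is an assumption for illustration.
function acceptHeaderFor(version) {
  return 'text/html; profile="https://www.mediawiki.org/wiki/Specs/HTML/' + version + '"';
}

// Usage: keep requesting the old HTML version until the client is ready
// to process data-mw and HTML from separate API requests.
// fetch(pageHtmlUrl, { headers: { accept: acceptHeaderFor('1.2.1') } });
```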
==== VisualEditor ====
=== Fundraising Tech ===
** More DonationInterface refactoring
** Investigating ActiveMQ replacement options
** More work towards mass (reversible) contact de-duping in CiviCRM
** Scoping work to update PayPal integration
=== Reading ===
==== Android ====
** Content Service is rolling out successfully. (25% of Android production
app as of Monday evening)
==== iOS ====
** 5.0.1 Deployed last week
** 5.0.2 Being deployed this week
(Just decided this on Monday - we received a call from Apple that there is
an OS bug causing our app to crash and we needed to work around it)
==== Web ====
* Language switcher overlay being deployed
==== Reading Infrastructure ====
* AuthManager for 1.27 branch in progress (disabled switch for now)
* Flow: able to tackle https://phabricator.wikimedia.org/T129397 ?
* Wikidata: https://phabricator.wikimedia.org/T131176 ?
I'm prepared to participate in IEG and have an idea closely linked to the Accuracy Review Project raised by James Salsman. Here is a brief summary of my proposal:
Out-of-date information and references are common in Wikipedia entries, especially in the Chinese Wikipedia. I would therefore like to evaluate some existing solutions for identifying out-of-date content, and create a new bot to identify such information based on the test results. More detailed tests will then be arranged using selected Wikipedia entries and the cases that we compile.
And here is the URL of the proposal:
Because there has already been related discussion on this mailing list, please comment on the proposal on its discussion board at:
* Review site scripts to verify that no wikibits methods are used.
* If you find usage of wikibits, refactor the code to use newer methods
instead, or add a dependency on the 'mediawiki.legacy.wikibits' module.
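The two options above can be sketched as follows. mw.loader.using() and addOnloadHook() are browser-side MediaWiki APIs; tiny stand-ins are stubbed at the top only so the example is self-contained outside a wiki page.

```javascript
// Stand-ins for the MediaWiki environment (illustration only):
const mw = { loader: { using: function (module, ready) { ready(); } } }; // stub
const addOnloadHook = function (fn) { fn(); };                           // wikibits stub

// Option 1 - refactor to the modern replacement, e.g. jQuery:
//   $( function () { /* init code */ } );

// Option 2 - temporary stop-gap: declare the legacy module explicitly
// instead of relying on it being loaded by default on every page.
let initialized = false;
mw.loader.using('mediawiki.legacy.wikibits', function () {
  addOnloadHook(function () {
    initialized = true; // legacy init code would run here
  });
});
```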
The removal of the wikibits module has been a long time coming. This
announces another step in that direction.
ResourceLoader in MediaWiki 1.17. To accommodate pre-ResourceLoader code
created to allow existing sites (such as Wikimedia wikis) to load
"mediawiki.legacy.wikibits" by default on all pages. While most code no
longer uses wikibits, it remains set to this day.
In 2013, we introduced mw.log.deprecate in MediaWiki 1.23 to help you
detect any use of deprecated methods in the browser's developer console.
We have seen a big reduction in the use of such methods.
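To illustrate the mechanism, here is a self-contained imitation (not MediaWiki's actual implementation) of how an mw.log.deprecate-style wrapper surfaces legacy usage: the first access to the property logs a warning in the console, while the original value keeps working.

```javascript
// Wrap a property so reads warn once, then behave normally.
function deprecate(obj, key, value, msg) {
  let warned = false;
  Object.defineProperty(obj, key, {
    get: function () {
      if (!warned) {
        console.warn('Use of "' + key + '" is deprecated. ' + (msg || ''));
        warned = true;
      }
      return value;
    }
  });
}

// Usage: wrap a legacy global so existing calls keep working but warn.
const win = {};
deprecate(win, 'addOnloadHook', function (fn) { fn(); }, 'Use $( fn ) instead.');
win.addOnloadHook(function () {}); // logs the deprecation warning once
```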
In 2015, we made significant performance improvements in MediaWiki 1.26. To
avoid breaking undeclared use of wikibits, we made an exception for it so
that it continued to load before other modules.
Now, as the last step before removing wikibits from MediaWiki, it will
first no longer load by default in MediaWiki 1.27. This change will roll
out on Wikimedia wikis in April 2016. If you find usage of wikibits
features without a dependency, please refactor this code to use the modern
replacements, or add an explicit dependency as a temporary stop-gap while
figuring out how to refactor the code.
Before refactoring, please remember to first check whether the associated
code is working. Many wikibits methods have become empty placeholders to
avoid cascading failures. As such, blind updates may cause old or broken
code that is currently invisible to re-activate itself. Removing dead code
speeds up wikis for all users, and reduces the risk of things going wrong.
For third-party wikis, this will ship in MediaWiki 1.27.0. If needed, you
can keep loading wikibits by default for a while longer.
This will give you time to fix missing dependencies on wikibits. In
MediaWiki 1.28, to be released in November 2016, the wikibits module will
be removed entirely.
Citations and references are the building blocks of Wikimedia projects.
However, as of today, they are still treated as second-class citizens.
Structured databases such as Wikidata offer a unique opportunity to
turn into reality over a decade of endeavors to build the sum of all
citations and bibliographic metadata into a centralized repository. To
coordinate upcoming work in this space, we're organizing a technical event
in late May and opening up applications for prospective participants.
*WikiCite 2016 <https://meta.wikimedia.org/wiki/WikiCite_2016>* is a
hands-on event focused on designing data models and technology to *improve
the coverage, quality, standards-compliance and machine-readability of
citations and source metadata in Wikipedia, Wikidata and other Wikimedia
projects*. Our goal, in particular, is to define a technical roadmap for
building a repository of all Wikimedia references in Wikidata.
We are bringing together Wikidatans, Wikipedians, software engineers, data
modelers, and information and library science experts from organizations
including *Crossref*, *Zotero*, *CSL*, *ContentMine*, *Google*, *Datacite*,
*NISO*, *OCLC* and the *NIH*. We are also inviting academic researchers
with experience working with Wikipedia's citations and bibliographic data.
WikiCite will be hosted in *Berlin* on *May 25-26, 2016*. Participation in
the event is capped at about 50 participants, and we expect to have a number
of open slots for applicants:
- if you were pre-invited and have already filled in a form, you will
receive a separate note from the organizers
- if you have not been invited but you would like to participate, please
fill in this application form <http://goo.gl/forms/Yv6rve2wCt> to give
us some information about you and your interest and expected contribution
to the event.
Please help us pass this on to anyone who has done important technical work
on Wikimedia references and citations.
- *March 29, 2016*: applications open
- *April 11, 2016*: applications close
- *April 15, 2016*: notifications of acceptance are issued (if you
applied for a travel grant, we'll be able to confirm by this date if we can
cover the costs of your trip)
For any questions, you can contact the organizing committee:
*Dario Taraborelli*, Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter