Hi everyone,
The important part of this email is this link:
https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey/Status_repor…
This is the second Community Wishlist Survey status report. In November and
December, active contributors to Wikimedia projects proposed, discussed and
voted on the features and fixes that they most want to see. The Wikimedia
Foundation Community Tech team has been tasked with working on
these. Additionally, Wikimedia Deutschland's Technical Wishes team has been
working on wishes from the German-speaking community. There's overlap
between the two wishlists, and the teams are collaborating on various
wishes, so this report includes progress made by both teams as well as
great work being done by volunteer developers and other WMF staff.
So far, we (in the broad sense) have added support for:
*) Migrating dead external links to archives (but there's more work to be
done!)
*) Pageview stats
*) Global notifications
*) A category watchlist
We're currently working on:
*) Improving the plagiarism detection bot
*) Improving the diff compare screen
*) Numerical sorting in categories
*) The ability to add an expiry date to watchlist items
*) A revision slider to help editors navigate through diff pages
For more information on these projects as well as upcoming tasks, see the
full status report on Meta:
https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey/Status_repor…
We're looking forward to talking and working with you as we go along.
Thanks,
//Johan Jönsson
User:Johan (WMF)
--
Hey all,
Wikimedia Deutschland and the Wikimedia Foundation hosted the WikiCite
<https://meta.wikimedia.org/wiki/WikiCite_2016> event in Berlin last week,
bringing together a large group
<https://meta.wikimedia.org/wiki/WikiCite_2016#Participant_list> of
Wikidatans, Wikipedians, librarians, developers and researchers from all
over the world.
The event built a lot of momentum around the definition of the data models,
workflows and technology needed to better represent source and citation
data across Wikimedia projects, and in Wikidata in particular.
While we're still drafting a human-readable report
<https://meta.wikimedia.org/wiki/WikiCite_2016/Report>, I thought I'd share
a preview of the notes from the various workgroups, to give you a sense of
what we worked on and to let everyone join the discussion:
Main workgroups
Modeling bibliographic source metadata
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_1>
Discuss and draft data models to represent different types of sources as
Wikidata items
Reference extraction and metadata lookup tools
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_2>
Design or improve tools that extract identifiers and bibliographic data
from Wikipedia citation templates and then look up and retrieve the
corresponding metadata (a toy extraction sketch follows the workgroup lists)
Representing citations and citation events
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_3>
Discuss how to express the citation of a source in a Wikimedia artifact
(such as a Wikipedia article, a Wikidata statement, etc.) and review
alternative ways to represent such citations
(Semi-)automated ways to add references to Wikidata statements
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_4>
Improve tools for semi-automated statement and reference creation
(StrepHit, ContentMine)
Use cases for source-related queries
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_5>
Identify use cases for SPARQL queries involving source metadata. Obtain a
small, openly licensed bibliographic and citation graph dataset to build a
proof of concept of the querying and visualization potential of source
metadata in Wikidata.
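To give a concrete taste of what group 5 has in mind, here is a minimal
Python sketch of such a query, assuming sources are modeled as items cited
via "stated in" (P248); the query shape and the placeholder QID are
illustrative assumptions, not a model agreed on at the event:

    # Minimal sketch: find items whose statement references cite a given
    # work. Assumes references use pr:P248 ("stated in"); the QID passed
    # in below is a placeholder.
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "https://query.wikidata.org/sparql"

    QUERY = """
    SELECT DISTINCT ?item ?itemLabel WHERE {
      ?item ?prop ?statement .
      ?statement prov:wasDerivedFrom ?ref .
      ?ref pr:P248 wd:%s .
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    LIMIT 20
    """

    def items_citing(source_qid):
        """Return (item URI, label) pairs for items citing source_qid."""
        client = SPARQLWrapper(ENDPOINT)
        client.setQuery(QUERY % source_qid)
        client.setReturnFormat(JSON)
        data = client.query().convert()
        return [(row["item"]["value"], row["itemLabel"]["value"])
                for row in data["results"]["bindings"]]

    for uri, label in items_citing("Q1"):  # placeholder QID
        print(uri, label)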
Additional workgroups
Wikidata as the central hub for license information on databases
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_6>
Add license information to Wikidata so that it can serve as the central
hub for license information about databases
Using citations and bibliographic source metadata
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_7>
Merge groups working on citation structure and source metadata models and
integrate their recommendations
Citoid-Wikidata integration
<https://meta.wikimedia.org/wiki/WikiCite_2016/Group_8>
Extend Citoid to write source metadata into Wikidata
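As an example of the extraction work group 2 discussed (referenced above),
here is a toy Python sketch that pulls DOIs out of citation templates using
mwparserfromhell; the template and parameter names follow English Wikipedia
conventions and are assumptions for illustration:

    # Toy sketch: extract DOI identifiers from citation templates in
    # wikitext. Template/parameter names ("cite ...", "doi") follow
    # English Wikipedia conventions.
    import mwparserfromhell

    def extract_dois(wikitext):
        """Return DOIs found in citation templates of the given wikitext."""
        code = mwparserfromhell.parse(wikitext)
        dois = []
        for tpl in code.filter_templates():
            name = str(tpl.name).strip().lower()
            if name.startswith("cite") and tpl.has("doi"):
                dois.append(str(tpl.get("doi").value).strip())
        return dois

    sample = "{{cite journal |title=Example |doi=10.1000/xyz123}}"
    print(extract_dois(sample))  # ['10.1000/xyz123']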
We're opening up the wikicite-discuss@wikimedia.org mailing list to anyone
interested in interacting with the participants in the event (we encouraged
them to use the official wikidata list for anything of interest to the
broader community). Phabricator also has a dedicated tag
<https://phabricator.wikimedia.org/tag/wikicite/> for related initiatives.
The event was generously funded
<https://meta.wikimedia.org/wiki/WikiCite_2016#Funding> by the Alfred P.
Sloan Foundation, the Gordon and Betty Moore Foundation, and Crossref.
We'll be exploring the feasibility of a follow-up event in the next 6-12
months to continue the work we started in Berlin and to bring in more
people than we could host this time due to funding and capacity constraints.
Best,
Dario
on behalf of the organizers
Hi Community Metrics team,
This is your automatic monthly Phabricator statistics mail.
Accounts created in (2016-05): 256
Active users (any activity) in (2016-05): 861
Task authors in (2016-05): 484
Users who have closed tasks in (2016-05): 260
Projects which had at least one task moved from one column to another on
their workboard in (2016-05): 0
Tasks created in (2016-05): 2572
Tasks closed in (2016-05): 2275
Open and stalled tasks in total: 30026
Median age in days of open tasks by priority:
Unbreak now: 19
Needs Triage: 170
High: 280
Normal: 433
Low: 743
Lowest: 555
(How long tasks have been open, not how long they have had that priority)
TODO: Numbers which refer to closed tasks might not be correct, as
described in https://phabricator.wikimedia.org/T1003 .
Yours sincerely,
Fab Rick Aytor
(via community_metrics.sh on iridium at Wed Jun 1 00:00:10 UTC 2016)
https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-05-25
= 2016-05-25 =
== Technology ==
=== Analytics ===
- Trying to puppetize Druid; it's harder than it looks
- Working on scaling the pageview API and Cassandra; doing perf testing
on new nodes
- Trying to add throttling to the pageview API
- Working on harvesting edit data from DBs/dumps into Hadoop (WIP); main
goal this quarter
- Deployed visualization of Unique Devices:
https://vital-signs.wmflabs.org/#projects=ptwiki/metrics=UniqueDevices
=== Services ===
* RESTBase
** rate limiting in prod, log-only
*** Analytics: we need to discuss pageview limits
* Cassandra
** expanding from 2 to 3 instances per node in prod
** upgrade to 2.2.6 next
* Change prop
** handling updates for summary and mobile-sections* endpoints
** will move purging to it soon
* Math
** MathML is on by default on all Wikibooks; early next week it will be on
all projects
* Heads up: the Services team will be at Wikimania, then at an off-site at
the end of June
=== Release Engineering ===
* '''Blocking''': ???
* '''Blocked''': none
* '''Updates''':
** wmf.3 is rolling forward this week
** rc.0 of 1.27 should be out this week
=== Technical Operations ===
* '''Blocking''':
** none
* '''Blocked''':
** none
* Updates:
** misc Varnish cluster en route to being upgraded to Varnish 4 again
** getting rid of tech debt on the database front (m1 cluster to be
reimaged)
** helping releng with scap3
** Finally getting a redundant esams-eqiad link
** libicu upgrade. See email from Giuseppe on wikitech-l
=== Security ===
* Two-factor authentication has been deployed to CentralAuth wikis, with
the permission enabled for staff
* Abbey and Daisy are assisting with usability surveying of two-factor
authentication
* Darian and Chris are working on knowledge transfer prior to Chris' last
day on Friday, May 27th
* Darian is working on onboarding documentation in anticipation of Security
Team hires in the near-ish future
* Security review schedule remains on track (
https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Schedule ) and Brian
Wolff will be assisting
== Product ==
=== Reading ===
==== Web ====
* Getting ready for the Hovercards A/B test (finalising bug fixes)
** Fundraising tech engineers can have a look at the JS variable indicating
whether Popups is enabled/disabled
==== Android ====
* released Beta with Reading Lists
==== iOS ====
==== Mobile Content Service ====
* Working on feed endpoints
==== Reading Infrastructure ====
* AuthManager: please read email from Brad on wikitech-l
* https://gerrit.wikimedia.org/r/#/c/290269 (testing gem) pending review as
of 24-May-2016
=== Community Tech ===
* No blockers
* Numerical sorting in categories
** New indexes are in place thanks to JCrespo (
https://phabricator.wikimedia.org/T130692 )
** Will be proposing on Wikitech-l that we switch to "uca-default" as the
default page collation, rather than "uppercase" (
https://phabricator.wikimedia.org/T136113 ); a toy sketch of the sorting
difference follows this section
* Launched MassViews interface for pageview stats (
http://tools.wmflabs.org/massviews/ )
* Working on new CopyPatrol tool for detecting plagiarism (
http://tools.wmflabs.org/plagiabot )
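For the curious, here is a toy Python sketch of the ordering difference at
stake; MediaWiki's real collations are implemented with ICU/UCA, so the
regex-based key below is only an illustration of numeric-aware sorting, not
the actual implementation:

    # Toy illustration: plain codepoint order (roughly what the "uppercase"
    # collation gives) versus a numeric-aware key (the kind of ordering the
    # numerical-sorting work aims for). Not MediaWiki's implementation.
    import re

    titles = ["File 10", "File 2", "File 1"]

    print(sorted(titles))
    # ['File 1', 'File 10', 'File 2']  <- "10" sorts before "2"

    def numeric_key(title):
        # Compare digit runs as integers, everything else as text.
        return [int(tok) if tok.isdigit() else tok
                for tok in re.split(r"(\d+)", title)]

    print(sorted(titles, key=numeric_key))
    # ['File 1', 'File 2', 'File 10']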
=== Editing ===
==== Parsing ====
(Subbu won't be there, updates only)
* Work ongoing to migrate Parsoid to use service-runner after a bunch of
fixes were pushed to service-runner - hoping to push this past the finish
line by next week before attempting a migration of Parsoid cluster to
jessie / node v4 (Follow along on
https://phabricator.wikimedia.org/T135176 and
blocking tasks). Conversation ongoing with services team to resolve details.
* Tidy replacement work proceeding well. After the last round of fixes and
visual diff testing, ~88% of test pages render with pixel-perfect accuracy
and ~97% of pages with < 1% pixel diffs with HTML5depurate. Additional CSS
fixes have landed since then, and a new round of visual diff testing is in
progress. Tim is working on fixes to doBlockLevels in the core parser to
iron out kinks there that lead to different behavior in HTML5depurate
compared to Tidy (similar effects in Parsoid).
* Kunal's linker rewrite patch merged. Follow up work in progress to use
the new linker code.
* VE / CX: Please start thinking about how your code needs to change to use
the split data-mw format. The data-mw split code is probably 2-3 weeks away
from being ready for deployment, but you can start testing it with Parsoid
master, which can provide you with the split data-mw (ping arlolra on IRC
for details).
==== Language ====
* Blocked:
** Preference section "Internationalisation" balloons after clicking "More
language settings" https://phabricator.wikimedia.org/T133114 (Frontend/Style
libs)
libs)
** Can't load In Progress or Published translation list
https://phabricator.wikimedia.org/T135743 (For: Collaboration/Roan)
* Updates:
** Compact Language Links (as a non-beta feature) deployed in
Beta/testwikis; work on it continues along with ULS
=== Fundraising tech ===
* (force) merged security patches to fr branch, deployed
** got tests passing again Monday
* CentralNotice: API for A/B testing
* More work to get off ActiveMQ and remove a single point of failure (it
just bit us again last week)
* Prepping for fundraising in Israel, Japan and Ukraine
** Trying to adjust language fallbacks on the payments cluster so we never
show Russian messages to users whose preferred language is Ukrainian (see
the sketch after this list)
* Enhancing fraud & DoS mitigation measures
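A minimal sketch of the fallback behavior we're after, assuming a simple
chain-resolution model; the real logic lives in MediaWiki's language
fallback machinery, so the names and structures here are illustrative only:

    # Minimal sketch (assumed model, not the payments-cluster code):
    # resolve a message through a language fallback chain, with an override
    # so Ukrainian ('uk') skips its default Russian ('ru') fallback.
    DEFAULT_FALLBACKS = {"uk": ["ru", "en"]}  # MediaWiki's default chain
    PAYMENTS_OVERRIDES = {"uk": ["en"]}       # never fall back to ru here

    def fallback_chain(lang):
        chain = PAYMENTS_OVERRIDES.get(lang) or DEFAULT_FALLBACKS.get(lang, [])
        return [lang] + [code for code in chain if code != lang]

    def pick_message(translations, lang):
        """Return the first available translation along the chain."""
        for code in fallback_chain(lang):
            if code in translations:
                return translations[code]
        return translations.get("en", "")

    msgs = {"ru": "ru version", "en": "en version"}
    print(pick_message(msgs, "uk"))  # 'en version', not the Russian one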
=== Discovery ===
* '''Blocking''': none
* '''Blocked''': none
* Team offsite last week; made plans for the next quarter and year
* Upgrade to ElasticSearch 2.3 is coming on Thursday (
https://phabricator.wikimedia.org/T133124)
* Portal team added descriptive texts to project links on the portal after
an A/B test showed a (small) positive impact
* Survey results for portal visitors:
https://commons.wikimedia.org/wiki/File:Wikipedia_Portal_Survey_-_May_2016.…
* Maps in Wikivoyage now support external layers:
https://en.wikivoyage.org/wiki/Wikivoyage:Travellers%27_pub#Maps_with_extra…
== Wikidata ==
* Blockers: none.
* Deleting files on Commons that are used in Wikidata statements was not
possible due to a bug. https://phabricator.wikimedia.org/T135485
* First prototype for structured data on Commons is close, finally.
https://phabricator.wikimedia.org/T125822
* QUnit tests timed out on Jenkins more often this week. Does anybody know
why?
https://phabricator.wikimedia.org/T136303
So the RFC process page says I should email wikitech-l to propose an RFC, thus:
Content-Security-Policy (CSP) is an HTTP header that disables certain
JavaScript features commonly used to exploit XSS vulnerabilities, in order
to mitigate the risk of XSS. I think we could benefit massively from this
technology, since XSS is probably the most common security issue in
MediaWiki. The downside is that it would break compatibility with older
user scripts.
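To make that concrete, here is a rough Python sketch of the kind of header
involved; the directive values and report endpoint below are assumptions
for illustration, not the exact policy in the proposal:

    # Rough sketch of building a CSP header. Directive values and the
    # report endpoint are illustrative, not the RFC's exact policy; the
    # Report-Only variant logs violations without enforcing anything.
    def build_csp(report_only=True):
        directives = [
            "script-src 'self' *.wikimedia.org",  # no 'unsafe-inline'
            "default-src *",
            "report-uri /csp-report",  # hypothetical reporting endpoint
        ]
        name = ("Content-Security-Policy-Report-Only" if report_only
                else "Content-Security-Policy")
        return name, "; ".join(directives)

    header, value = build_csp()
    print(header + ": " + value)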
Please see the full text of my proposal at
https://www.mediawiki.org/wiki/Requests_for_comment/Content-Security-Policy
The associated phabricator ticket is: https://phabricator.wikimedia.org/T135963
I'd appreciate any comments anyone might have.
Thanks,
Brian