TemplateStyles could really use this, since people object to e.g. the
TemplateStyles CSS being able to mess with the diff tables. I posted an
analysis and some options at
https://phabricator.wikimedia.org/T37247#3181097. Feedback would be
appreciated, particularly from someone familiar with how exactly content
gets into VE and Flow, as to whether anything else might be needed to get
the new div output in those extensions.
Thanks.
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
Hi everybody!
As a reminder, the CREDIT Showcase is next week on Wednesday,
1 February 2017 (see https://www.mediawiki.org/wiki/CREDIT_showcase for
details). Also, as I mentioned previously, we're conducting a survey about
CREDIT and we'd appreciate your feedback! Here is a link to the survey
(which is hosted on a third-party service) and, for information about
privacy and data handling, the survey privacy statement:
Survey: https://docs.google.com/a/wikimedia.org/forms/d/e/1FAIpQLSedAtyPfcEhT6OVd26…
Privacy statement: https://wikimediafoundation.org/wiki/CREDIT_Feedback_Survey_Privacy_Stateme…
This email is being sent to several mailing lists in order to reach
multiple audiences. As always, please follow the list link at the very
bottom of this email in case you want to manage your list subscription
options such as digest, unsubscribe, and so on.
And, as usual, if you'd like to share the news about the upcoming CREDIT,
here's some suggested verbiage.
Hi <FNAME>
I hope all is well with you! I wanted to let you know about CREDIT, a
monthly demo series that we’re running to showcase open source tech
projects from Wikimedia’s Community, Reading, Editing, Discovery,
Infrastructure and Technology teams.
CREDIT is open to the public, and we welcome questions and discussion. The
next CREDIT will be held on February 1st at 11am PT / 2pm ET / 19:00 UTC.
There’s more info on MediaWiki
<https://www.mediawiki.org/wiki/CREDIT_showcase>, and on Etherpad
<https://etherpad.wikimedia.org/p/CREDIT>, which is where we take notes and
ask questions. You can also ask questions on IRC in the Freenode chatroom
#wikimedia-office (web-based access here
<https://webchat.freenode.net/?channels=%23wikimedia-office>). Links to
video will become available at these locations shortly before the event.
Please feel free to pass this information along to any interested folks.
Our projects tend to focus on areas that might be of interest to folks
working across the open source tech community: language detection,
numerical sort, large data visualizations, maps, and all sorts of other
things.
If you have any questions, please let me know! Thanks, and I hope to see
you at CREDIT.
YOURNAME
Thanks!
Adam Baso
Director of Engineering, Reading
Wikimedia Foundation
abaso(a)wikimedia.org
This affects your wiki if you are using both Flow and Nuke.
We recently fixed https://phabricator.wikimedia.org/T162621, an issue
with Flow's Nuke integration.
This has now been merged to master as well as the two supported Flow
release branches (1.27 and 1.28):
master - https://gerrit.wikimedia.org/r/#/c/348407/ (merged)
1.27 - https://gerrit.wikimedia.org/r/#/c/348408/1
1.28 - https://gerrit.wikimedia.org/r/#/c/348409/1
This has already been deployed to WMF production.
There is an unrelated Jenkins issue affecting the 1.27 and 1.28 branches.
Until those patches are merged, you can download them using
Download->Checkout in the top-right of Gerrit. Sorry for the inconvenience.
Matt Flaschen
Hey folks,
In this update, I'm going to change some things up to try and make this
update easier for you to consume. The biggest change you'll notice is that
I've broken up the [#] references in each section. I hope that saves you
some scrolling and confusion. You'll also notice that I have changed the
subject line from "Revision scoring" to "Scoring Platform" because it's now
clear that, come July, I'll be leading a new team with that name at the
Wikimedia Foundation. There'll be an announcement about that coming once
our budget is finalized. I'll try to keep this subject consistent for the
foreseeable future so that your email clients will continue to group the
updates into one big thread.
*Deployments & maintenance:*
In this cycle, we've gotten better at tracking our deployments and noting
what changes go out with each deployment. You can click on the phab task
for a deployment and look at its sub-tasks to find out what was deployed.
We've had three deployments of ORES since mid-March[1,2,3], two
deployments of Wikilabels[4,5], and we've added a maintenance notice for a
short period of downtime that's coming up on April 21st[6,7].
1. https://phabricator.wikimedia.org/T160279 -- Deploy ores in prod
(Mid-March)
2. https://phabricator.wikimedia.org/T160638 -- Deploy ORES late march
3. https://phabricator.wikimedia.org/T161748 -- Deploy ORES early April
4. https://phabricator.wikimedia.org/T161002 -- Late march wikilabels
deployment
5. https://phabricator.wikimedia.org/T163016 -- Deploy Wikilabels mid-April
6. https://phabricator.wikimedia.org/T162888 -- Add header to Wikilabels
that warns of upcoming maintenance.
7. https://phabricator.wikimedia.org/T162265 -- Manage wikilabels for
labsdb1004 maintenance
*Making ORES better:*
We've been working to make ORES easier to extend and more useful. ORES now
reports its relevant versions at https://ores.wikimedia.org/versions[8].
We've also reduced the complexity of our "precaching" system that scores
edits before you ask for them[9,10]. We're taking advantage of Logstash to
store and query our logs[11]. We've also implemented some nice
abstractions for requests and responses in ORES[12] that allowed us to
improve our metrics tracking substantially[13].
8. https://phabricator.wikimedia.org/T155814 -- Expose version of the
service and its dependencies
9. https://phabricator.wikimedia.org/T148714 -- Create generalized
"precache" endpoint for ORES
10. https://phabricator.wikimedia.org/T162627 -- Switch `/precache` to be a
POST end point
11. https://phabricator.wikimedia.org/T149010 -- Send ORES logs to logstash
12. https://phabricator.wikimedia.org/T159502 -- Exclude precaching
requests from cache_miss/cache_hit metrics
13. https://phabricator.wikimedia.org/T161526 -- Implement
ScoreRequest/ScoreResponse pattern in ORES
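As a quick illustration of the /versions endpoint mentioned above, it just
returns a JSON document that any client can fetch. A tiny, purely
illustrative PHP snippet (the exact response structure may change):

  <?php
  // Fetch ORES's /versions endpoint and print the component versions
  // it reports. Illustrative only.
  $json = file_get_contents( 'https://ores.wikimedia.org/versions' );
  $versions = json_decode( $json, true );
  print_r( $versions );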
*New functionality:*
In the last month and a half, we've added basic support for Korean
Wikipedia[14,15]. Props to Revi for helping us work through a bunch of
issues with our Korean language support[16,17,18].
We've also gotten the ORES Review tool deployed to Hebrew
Wikipedia[19,20,21,22] and Estonian Wikipedia[23,24,25]. We're also
working with the Collaboration team to implement the threshold test
statistics that they need to tune their new Edit Review interface[26] and
we're working towards making this kind of work self-serve so that the
product team and other tool developers won't have to wait on us to
implement these threshold stats in the future[27].
14. https://phabricator.wikimedia.org/T161617 -- Deploy reverted model for
kowiki
15. https://phabricator.wikimedia.org/T161616 -- Train/test reverted model
for kowiki
16. https://phabricator.wikimedia.org/T160752 -- Korean generated word
lists are in chinese
17. https://phabricator.wikimedia.org/T160757 -- Add language support for
Korean
18. https://phabricator.wikimedia.org/T160755 -- Fix tokenization for Korean
19. https://phabricator.wikimedia.org/T161621 -- Deploy ORES Review Tool
for hewiki
20. https://phabricator.wikimedia.org/T130284 -- Deploy edit quality models
for hewiki
21. https://phabricator.wikimedia.org/T160930 -- Train damaging and
goodfaith models for hewiki
22. https://phabricator.wikimedia.org/T130263 -- Complete hewiki edit
quality campaign
23. https://phabricator.wikimedia.org/T159609 -- Deploy ORES review tool to
etwiki
24. https://phabricator.wikimedia.org/T130280 -- Deploy edit quality models
for etwiki
25. https://phabricator.wikimedia.org/T129702 -- Complete etwiki edit
quality campaign
26. https://phabricator.wikimedia.org/T162377 -- Implement additional
test_stats in editquality
27. https://phabricator.wikimedia.org/T162217 -- Implement "thresholds",
deprecate "pile of tests_stats"
*ORES training / labeling campaigns:*
Thanks to a lot of networking at Wikimedia Conference and some help from
Ijon (Asaf Bartov), we've found a bunch of new collaborators to help us
deploy ORES to new wikis. A critical step in this process is deploying
labeling campaigns so that Wikipedians can help us train ORES.
We've got new editquality labeling campaigns deployed to Albanian[28],
Finnish[29], Latvian[30], Korean[31], and Turkish[32] Wikipedias.
We've also been working on a new type of model: "Item quality" in
Wikidata. We've deployed, labeled, and analyzed a pilot[33], fixed some
critical bugs that came up[34,35], and we've finally launched a 5k item
campaign which is already 17% done[36]! See
https://www.wikidata.org/wiki/Wikidata:Item_quality_campaign if you'd like
to help us out.
28. https://phabricator.wikimedia.org/T161981 -- Edit quality campaign for
Albanian Wikipedia
29. https://phabricator.wikimedia.org/T161905 -- Edit quality campaign for
Finnish Wikipedia
30. https://phabricator.wikimedia.org/T162032 -- Edit quality campaign for
Latvian Wikipedia
31. https://phabricator.wikimedia.org/T161622 -- Deploy editquality
campaign in Korean Wikipedia
32. https://phabricator.wikimedia.org/T161977 -- Start v2 editquality
campaign for trwiki
33. https://phabricator.wikimedia.org/T159570 -- Deploy the pilot of
Wikidata item quality campaign
34. https://phabricator.wikimedia.org/T160256 -- Wikidata items render
badly in Wikilabels
35. https://phabricator.wikimedia.org/T162530 -- Implement "unwanted pages"
filtering strategy for Wikidata
36. https://phabricator.wikimedia.org/T157493 -- Deploy Wikidata item
quality campaign
*Bug fixing:*
As usual, a few weird bugs got in our way. We needed to move
to a bigger virtual machine in "Beta Labs" because our models take up a
bunch of hard drive space[37]. We found that Wikilabels wasn't removing
expired tasks correctly and that this was making it difficult to finish
labeling campaigns[38]. We also had a lot of right-to-left issues when we
did an upgrade of OOjs UI[39], and we fixed an old bug with one of our
message keys on https://translatewiki.net[40].
37. https://phabricator.wikimedia.org/T160762 -- deployment-ores-redis
/srv/ redis is too small (500MBytes)
38. https://phabricator.wikimedia.org/T161521 -- Wikilabels is not cleaning
up expired tasks for Wikidata item quality campaign
39. https://phabricator.wikimedia.org/T161533 -- Fix RTL issues in
Wikilabels after OOjs UI upgrade
40. https://phabricator.wikimedia.org/T132197 -- qqq for a wiki-ai message
cannot be loaded
-Aaron
Principal Research Scientist
Head of the Scoring Platform Team
Hiya!
tl;dr: if your repo has a patch from me[0], please merge it :)
The longer explanation for these patches is that the deployment server
from which your code is fetched by targets is set via the git_server
configuration variable. This variable will be updated in Puppet when the
primary deployment server changes; however, updating it in every repo
would be time-consuming. Yesterday, I made a bunch of patches to remove
this configuration variable from any repo where it is set. Once it is
removed from individual repos, all repos will respect the global
git_server value that is set in Puppet, meaning that repo owners
shouldn't have to worry about making updates when the deployment server
changes.
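For illustration only, here is roughly what such a patch looks like against
a repo's scap config. The file path, repo name, and the other keys below
are made up; git_server is the only variable these patches actually touch:

  # scap/scap.cfg (hypothetical example), before the patch:
  [global]
  git_repo: example/myservice
  ssh_user: deploy-service
  git_server: old-deploy-server.example.net

  # After the patch, the git_server line is simply gone, and the repo
  # picks up the global git_server value that Puppet manages.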
If you have any questions let me know via email or IRC in
#wikimedia-releng.
Thank you for your help!
-- Tyler
[0]. <https://gerrit.wikimedia.org/r/#/q/topic:T162814+status:open>
Hi all, over the past few weeks, the Reading Web team has made some
releases and improvements we would like to share:
- New Header for the Mobile Web [1]
We have just released a new header on the mobile web! Our goal is to use
clear branding to raise awareness among readers that the content we are
providing comes from Wikipedia or related projects. Low brand awareness
was one of the key findings of the research performed by the New Readers
team [2]. According to their findings, users were often not aware that
they were reading Wikipedia, and sometimes mistook it for a search engine
or social media platform [3]. An increase in brand awareness could
therefore lead to increased readership and retention.
- Page Previews Stage 0 deployment
The Page Previews (aka Hovercards) [4] feature is now live on the
following Wikipedias: Russian, Italian, Greek, and Catalan. We will continue
with rollouts on the Hungarian and Hebrew Wikipedias to conclude stage 0.
Currently, we are in the process of consulting communities about the next
stage of rollouts.
- PageImages restricted to the lead section [5]
Previously, the PageImage was drawn from the first image found anywhere on
the page. As a result, many of the images chosen from outside the lead
section were out of context or did not properly represent the subject of
the article. With this change, the PageImage for an article will be drawn
from the lead section and infobox only, to ensure that the most accurate
image is used.
Thank you, and let us know if there are any questions, concerns, or any
other feedback!
- Olga
[1]
https://www.mediawiki.org/wiki/Reading/Web/Projects/Improve_site_branding
[2] https://meta.wikimedia.org/wiki/New_Readers/Next_steps#Awareness
[3]
https://commons.wikimedia.org/w/index.php?title=File%3AWikimedia_Foundation…
[4] https://www.mediawiki.org/wiki/Beta_Features/Hovercards
[5] https://phabricator.wikimedia.org/T152115
--
Olga Vasileva // Product Manager // Reading Web Team
https://wikimediafoundation.org/
Here's my requirement:
- a wiki page is one JSON document
- when editing, the user edits the JSON directly
- when viewing, I have a viewer that turns the JSON into wikitext, and that
wikitext gets rendered as wikitext and turned into HTML by MediaWiki
I have several options, including:
1) register a parser hook for a tag like <json>, and write an extension that
parses the content between the tags and turns it into wikitext (not ideal,
as I wouldn't use any of the existing JSON support, and also I could have
several such tags per page, which does not fit my requirements)
2) I found the JsonConfig extension by yurik. This allows me to do almost
all of the things above - but it returns HTML directly, not wikitext. It
doesn't seem trivial to return wikitext instead of HTML, but hopefully I am
wrong? Also, this ties in nicely with the Code Editor.
3) there is actually a JsonContentHandler in core. But looking through it,
it seems that this suffers from the same limitation - I can return HTML,
but not wikitext.
3 seems to have the advantage of being more actively worked on than 2 (which
is not based on 3, probably because it is older than 3). So future goodies
like a JSON Schema validator will probably go to 3, but not to 2, so I
should probably go with 3.
Writing this down, one solution could be to create the wikitext, and then
call the wikitext parser manually and have it create HTML?
I have already developed the extension in 1, and then fully rewritten it in
2. Before I go and rewrite it again in 3, I wanted to ask whether I am
doing it right, or if I should do it completely differently, and also if
there are examples of stuff developed in 3, i.e. of extensions or features
using the JsonContent class.
Example:
I have a JSON document
{ "username": "Denny" }
which gets turned into wikitext
''Hello, [[User:Denny|Denny]]!''
which then gets turned into the right HTML and displayed to the user, e.g.
<i>Hello, <a href="...">Denny</a>!</i>
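Concretely, and purely as an untested sketch of what I mean by option 3
(the class name and conversion logic are made up, and the exact
fillParserOutput signature may well differ between MediaWiki versions),
I imagine something like:

  // A content class extending core's JsonContent: turn the JSON into
  // wikitext, then run the normal wikitext parser to get the HTML.
  class UserGreetingContent extends JsonContent {
      protected function fillParserOutput( Title $title, $revId,
          ParserOptions $options, $generateHtml, ParserOutput &$output
      ) {
          // My own JSON -> wikitext conversion (hypothetical logic).
          $data = FormatJson::decode( $this->getNativeData() );
          $name = $data->username;
          $wikitext = "''Hello, [[User:$name|$name]]!''";

          // Hand the wikitext to the regular parser to produce HTML.
          global $wgParser;
          $output = $wgParser->parse( $wikitext, $title, $options,
              true, true, $revId );
      }
  }

(Plus the usual content model / ContentHandler registration, which I've
left out here.)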
Cheers,
Denny