Hello everyone,
some time ago, WMF started to reorganize into a new structure focused on
specific topics. Now that the main restructuring seems to be done, I want
to bring a topic back to the to-do list that is very important to me (and,
I think and hope, to some others too): mobile editing.
In the past, the mobile web team (to which I mostly contributed, too)
maintained the mobile wikitext editor in a very basic state, and the
VisualEditor team maintained the VisualEditor for tablet users. Now most
people from the former mobile web team have moved to the new
“Readership” vertical, so I'm pretty sure that the responsibility for
editing on mobile isn't in that team anymore (reading ≠ editing); I
assume that the new Editing vertical now maintains all the editing stuff,
including mobile editing. This raises some questions:
1. MobileFrontend is missing from the list of maintained extensions on
https://www.mediawiki.org/wiki/Editing. Are there plans to move all editing
features out of MobileFrontend, or is the editing code in MobileFrontend
unmaintained now? How are volunteers able to contribute in the right place?
What is the bigger picture for editing code on mobile?
2. Is there a roadmap for the mobile VisualEditor? At the moment it's
only available for tablet users, which makes contributing on smartphones a
bit hard; there is only a bare-bones wikitext editor (see my next point).
I know about task T93325[1]; is that meant to enable mobile VE on
smartphones, too? What is the roadmap for it?
3. Are you planning to support a wikitext editor on mobile, too (like
a lightweight WikiEditor)? At the moment the mobile wikitext editor is very
basic (essentially without any features). I submitted a change[2] to add
some really basic features to it, which technically seems mergeable, but
no one from the Editing team has reviewed it, nor is it possible for a
volunteer (me :)) to find out whether such a feature matches the overall
vision for mobile editing.
4. And the last point (and maybe the most important one): who is the
person (or group of people) I can contact for mobile-related editing
matters? In the days of the former mobile web team I knew whom to ping on
IRC, by e-mail, or via mailing lists; now I'm a bit confused about who
is responsible for mobile editing: it's like a gray veil :)
It would be great to have these points answered :)
[1] https://phabricator.wikimedia.org/T93325
[2] https://gerrit.wikimedia.org/r/#/c/194945/
Have a nice weekend!
Best,
Florian
http://devhub.wmflabs.org is a prototype of the "Data and developer hub", a
portal and set of articles and links whose goal is to encourage third-party
developers to use Wikimedia data and APIs. Check it out; your feedback is
welcome! You can comment on the talk page of the project page
https://www.mediawiki.org/wiki/dev.wikimedia.org , or file Phabricator
tickets in the project dev.wikimedia.org [1].
Since December 2013, Moiz Syed and others have discussed creating "a thing"
to expose our APIs and data to developers. When S Page moved into the WMF
tech writer role, he wrote some articles for this on mediawiki.org and,
with Quim Gil, developed a landing page from the wireframe designs [2].
The prototype uses the Blueprint skin and runs on a Labs instance,
but the articles are all regular wiki pages on mediawiki.org that we
regularly import to http://devhub.wmflabs.org
Thanks to everyone who participated in the gestation of this idea!
-- S Page and Quim Gil
== FAQ ==
Q: How can I feature my awesome API or data set?
A: Create a task in the #dev.wikimedia.org and #documentation projects [3]
with "Article" in the title. You can draft an article yourself, following
the guidelines [4].
Q: Yet another site? Arghh!
A: Agreed, T101441 "Integrate new Developer hub with mediawiki.org" [5].
It's a separate site for now in order to present a different appearance.
Q: But why a different appearance? Why a separate skin?
A: We are competing for developer mindshare with sites like
https://developers.google.com/ . We believe looking like a 2000s wiki page
is a *deterrent* to using Wikimedia APIs and data. We hope that many
third-party developers join our communities and eventually contribute to
MediaWiki, but "How to contribute to MediaWiki" [6] is not the focus,
providing free open knowledge is.
Q: Why the Blueprint skin?
A: The Design team (now Reading Design) developed it for the OOUI Living
Style Guide [7] and it has some nice features: a fixed header, and a
sidebar that gets out of the way and combines page navigation and the TOC
of the current page.
Q: So why not use the Blueprint skin on mediawiki.org?
A: Agreed, T93613 "Deploy Blueprint on mediawiki.org as optional and
experimental skin" is a blocker for T101441. We appreciate help with it and
its blockers.
Q: I hate the appearance.
A: That's not a question :) You can forget the prototype exists and view
the same content at
https://www.mediawiki.org/wiki/API:Data_and_developer_hub
Q: What is "dev.wikimedia.org"?
A: http://dev.wikimedia.org will be the well-known shortcut to the landing
page. And dev.wikimedia.org is the project name for this "Data and
developer hub".
Q: I thought dev.wikimedia.org was going to integrate source
documentation/replace doc.wikimedia.org/enumerate all Wikimedia software
projects/cure cancer, what happened?
A: One step at a time. For now, its goal is, to repeat, "to encourage
third-party developers to use Wikimedia data and APIs".
Q: Why are the pages in the API: namespace?
A: That's temporary, they will probably end up in a dev: namespace on
mediawiki.org that uses the Blueprint skin by default (T369).
Q: Where are the talk pages?
A: It's a bug that the sidebar doesn't have a "Discussion" link (T103785).
The talk pages on the prototype all redirect to the talk pages for the
original pages on mediawiki.org, and Flow is enabled on them.
[1]
https://phabricator.wikimedia.org/maniphest/task/create/?projects=dev.wikim…
[2] https://www.mediawiki.org/wiki/Dev.wikimedia.org#Structure
[3]
https://phabricator.wikimedia.org/maniphest/task/create/?projects=dev.wikim…
[4] https://www.mediawiki.org/wiki/dev.wikimedia.org/Contributing
[5] https://phabricator.wikimedia.org/T93613 and its blockers
[6] https://www.mediawiki.org/wiki/How_to_contribute (a fine general entry
point)
[7] http://livingstyleguide.wmflabs.org/
--
=S Page WMF Tech writer
As has been announced several times (most recently at
https://lists.wikimedia.org/pipermail/wikitech-l/2015-April/081559.html),
the default continuation mode for action=query requests to api.php will be
changing to be easier for new coders to use correctly.
*The date is now set:* we intend to merge the change to ride the deployment
train at the end of June. That should be 1.26wmf12, to be deployed to test
wikis on June 30, non-Wikipedias on July 1, and Wikipedias on July 2.
If your bot or script is receiving the warning about this upcoming change
(as seen here
<https://www.mediawiki.org/w/api.php?action=query&list=allpages>, for
example), it's time to fix your code!
   - The simple solution is to include the "rawcontinue" parameter
   with your request to continue receiving the raw continuation data (
example
<https://www.mediawiki.org/w/api.php?action=query&list=allpages&rawcontinue=1>).
No other code changes should be necessary.
- Or you could update your code to use the simplified continuation
documented at https://www.mediawiki.org/wiki/API:Query#Continuing_queries
(example
<https://www.mediawiki.org/w/api.php?action=query&list=allpages&continue=>),
which is much easier for clients to implement correctly.
Either of the above solutions may be tested immediately; you'll know it
works because you stop seeing the warning.
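For bot authors, the simplified continuation protocol amounts to a small loop: opt in with an empty "continue" parameter, then after each response merge the returned "continue" object into the next request's parameters until the server stops sending one. A minimal Python sketch (the fetch argument is a stand-in for whatever HTTP client your bot uses):

```python
def iterate_query(fetch, params):
    """Follow the simplified continuation protocol documented at
    API:Query#Continuing_queries: start with continue="", then merge
    each response's 'continue' object into the follow-up request
    until the response no longer contains one."""
    params = dict(params, **{"continue": ""})  # opt in to new-style continuation
    while True:
        data = fetch(params)
        if "query" in data:
            yield data["query"]
        cont = data.get("continue")
        if cont is None:
            return
        # Merge e.g. apcontinue and the opaque continue value verbatim.
        params = dict(params, **cont)
```

With the requests library, fetch could be as simple as `lambda p: requests.get("https://www.mediawiki.org/w/api.php", params=dict(p, format="json")).json()`.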
I've compiled a list of bots that have hit the deprecation warning more
than 10000 times over the course of the week May 23–29. If you are
responsible for any of these bots, please fix them. If you know who is,
please make sure they've seen this notification. Thanks.
AAlertBot
AboHeidiBot
AbshirBot
Acebot
Ameenbot
ArnauBot
Beau.bot
Begemot-Bot
BeneBot*
BeriBot
BOT-Superzerocool
CalakBot
CamelBot
CandalBot
CategorizationBot
CatWatchBot
ClueBot_III
ClueBot_NG
CobainBot
CorenSearchBot
Cyberbot_I
Cyberbot_II
DanmicholoBot
DeltaQuadBot
Dexbot
Dibot
EdinBot
ElphiBot
ErfgoedBot
Faebot
Fatemibot
FawikiPatroller
HAL
HasteurBot
HerculeBot
Hexabot
HRoestBot
IluvatarBot
Invadibot
Irclogbot
Irfan-bot
Jimmy-abot
JYBot
Krdbot
Legobot
Lowercase_sigmabot_III
MahdiBot
MalarzBOT
MastiBot
Merge_bot
NaggoBot
NasirkhanBot
NirvanaBot
Obaid-bot
PatruBOT
PBot
Phe-bot
Rezabot
RMCD_bot
Shuaib-bot
SineBot
SteinsplitterBot
SvickBOT
TaxonBot
Theo's_Little_Bot
W2Bot
WLE-SpainBot
Xqbot
YaCBot
ZedlikBot
ZkBot
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Some of you may have noticed a bot [1] providing reviews for the
MobileFrontend and Gather extensions.
This is a grassroots experiment [2] to see whether we can reduce
regressions by running browser tests against every single commit. It's
very crude and we'll have to maintain it ourselves, but we see it as a
stop-gap solution until gerrit-bot takes care of this for us.
Obviously we want to do this for all extensions, but we wanted to get
something good enough, even if not scalable, to start exploring the idea.
So far it has caught various bugs for us, and our browser test builds
are finally starting to become consistently green, a few beta labs
flakes aside [3].
Running tests on beta labs is still useful, but now we can use it to
identify failures caused by other extensions. Too often we were finding
that our tests failed simply because we had been neglecting them.
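For the curious, the core of such a bot is small: poll Gerrit's REST API for open changes to an extension, and run the browser-test suite against each patchset you haven't tested yet. The sketch below covers only the Gerrit-response handling (the project query and the test command are up to you; the real implementation is the Barry bot linked below):

```python
import json

def parse_gerrit_response(raw):
    """Gerrit prefixes every JSON REST response with a ")]}'" guard
    line to prevent cross-site script inclusion; strip that first
    line before decoding the JSON body."""
    return json.loads(raw.split("\n", 1)[1])

def changes_to_test(raw, already_tested):
    """Return the change numbers we haven't run browser tests for yet."""
    return [c["_number"] for c in parse_gerrit_response(raw)
            if c["_number"] not in already_tested]
```

A poll loop would fetch something like `https://gerrit.wikimedia.org/r/changes/?q=project:mediawiki/extensions/MobileFrontend+status:open` with an HTTP client, feed the body through `parse_gerrit_response`, and shell out to the browser-test runner for each new change.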
In case others are interested in how this is working and want to set
one up themselves I've documented this here:
https://www.mediawiki.org/wiki/Reading/Setting_up_a_browser_test_bot
Please let me know if you have any questions, and feel free to edit and
improve this page. If you want to jump into the code that's doing this
and know Python, check out:
https://github.com/jdlrobson/Barry-the-Browser-Test-Bot
(Patches welcome, and apologies in advance for the code.)
[1] https://gerrit.wikimedia.org/r/#/q/reviewer:jdlrobson%252Bbarry%2540gmail.c…
[2] https://phabricator.wikimedia.org/T100293
[3] https://integration.wikimedia.org/ci/view/Mobile/job/browsertests-MobileFro…
Hi,
In the last few days I've been looking into the causes of unnecessary
<nowiki> tags appearing in pages. This mostly happens because of various
VisualEditor and Parsoid issues. The developers have been very good at
fixing them, and now it happens very rarely, but there are still lots of
these useless tags lurking in pages.
Two examples are:
* '''<nowiki/>''' - this doesn't do anything at all. I couldn't reproduce
it in any way, so it's probably a bug that has since been fixed.
* <nowiki> </nowiki> at the beginning of a paragraph. This was added in the
past to avoid putting the paragraph in <pre>, but it's entirely useless
because the spaces are trimmed. Now they are pre-trimmed, so this is also a
fixed bug, but a lot of pages still have it.
There may be more - I'm still looking for these.
It would be easy to write bots to fix such common cases, but they
would have to run on every project. Would it make sense to write them as
maintenance scripts that clean up pages everywhere when people upgrade VE?
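To illustrate, a bot for the two cases above could be little more than a couple of regular expressions. This is a rough sketch, not a production-safe sanitizer; the patterns and the function name are my own:

```python
import re

# '''<nowiki/>''' renders nothing at all, so the whole construct can go.
# (Removing only the <nowiki/> would leave '''''' behind, which parses
# differently, so the bold markers must be dropped together with it.)
EMPTY_BOLD_NOWIKI = re.compile(r"'''<nowiki\s*/>'''")

# <nowiki> </nowiki> at the start of a line was only there to defeat
# <pre>; drop it along with any trailing spaces, which would otherwise
# re-trigger <pre> themselves.
LEADING_NOWIKI = re.compile(r"^<nowiki>\s*</nowiki>[ \t]*", re.MULTILINE)

def strip_useless_nowiki(wikitext):
    """Remove the two useless <nowiki> patterns described above."""
    wikitext = EMPTY_BOLD_NOWIKI.sub("", wikitext)
    wikitext = LEADING_NOWIKI.sub("", wikitext)
    return wikitext
```

A real bot would of course need to skip legitimate <nowiki> content and pages where the markup is intentional.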
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
cc wikitech
(thanks!)
On Tue, Jun 30, 2015 at 3:05 PM, Jon Robson <jrobson(a)wikimedia.org> wrote:
> Thanks for the quick reply! :)
> Yes, site notices should be disabled on mobile, but site notices on
> mobile are currently treated differently from CentralNotice banners
> (rightly or wrongly), so I can only assume that this is served via
> CentralNotice?
>
> The campaign has to be targeted specifically at mobile browsers, however.
> Without seeing the campaign in CentralNotice I'm not able to provide
> much more help, but the options would be to not target mobile phones, or
> to add some additional styles to make it work on mobile (it looks like it
> has a fixed width).
>
>
> On Tue, Jun 30, 2015 at 3:00 PM, Legoktm <legoktm.wikipedia(a)gmail.com> wrote:
>> Hi,
>>
>> On 06/30/2015 02:49 PM, Jon Robson wrote:
>>> I noticed a banner on the mobile site that renders the site unusable:
>>> http://imgur.com/qVGz3mZ
>>>
>>> I'm not sure who is responsible for "Freedom of Panorama in Europe in
>>> 2015" but can someone disable this on mobile asap or make it work on
>>> mobile?
>>
>> The sitenotice has been temporarily disabled for other reasons right now.
>>
>>> Please also reach out to us on the mobile-l mailing list ahead of
>>> running these campaigns if you are unsure how to test campaigns, we're
>>> happy to help.
>>
>> In InitialiseSettings.php, I see:
>>
>> 'wmgMFEnableSiteNotice' => array(
>> 'default' => false,
>> ),
>>
>> So I assumed that it wouldn't show up on mobile at all. Is that no
>> longer the case?
>>
>> -- Legoktm
>>
>> _______________________________________________
>> Mobile-l mailing list
>> Mobile-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/mobile-l
I noticed a banner on the mobile site that renders the site unusable:
http://imgur.com/qVGz3mZ
I'm not sure who is responsible for "Freedom of Panorama in Europe in
2015" but can someone disable this on mobile asap or make it work on
mobile?
Please also reach out to us on the mobile-l mailing list ahead of
running these campaigns if you are unsure how to test campaigns, we're
happy to help.
Jon
On Mon, Jun 29, 2015 at 10:31 AM, Jon Robson <jrobson(a)wikimedia.org> wrote:
> Hi Jacob
> I've cc'ed wikitech to get an update on bug
> https://phabricator.wikimedia.org/T483
> The last update here was in February 2015. I personally think this is
> one of our most urgent bugs to fix, but it's not clear who is
> responsible for this, and who has the expertise to help resolve it.
>
> The fundamental issue this privacy policy hits is one many of our editors
> are also hitting: there is no way to style wikitext content
> differently on a mobile screen. This has been a recurring problem for
> some time now. https://phabricator.wikimedia.org/T483
>
> The only way to make any progress here currently is the following:
> * Add the nomobile class to an element to hide it
> * Add a reset rule to MediaWiki:Common.css to reset problematic styles
> e.g. https://www.mediawiki.org/w/index.php?title=MediaWiki:Mobile.css
>
> I'm nearly positive these very same issues were discussed over a year ago
> for the privacy policy on English Wikipedia, and I think the conclusion
> was that little could be done in the current form (if anyone can
> remember where that conversation happened).
>
> Note: For wikimediafoundation.org it might be acceptable to move all
> inline style rules into
> https://m.wikimediafoundation.org/w/index.php?title=MediaWiki:Mobile.css
> and use media queries to style content differently. Any capable web
> developer should be able to help you with that (I'm not sure who is
> building the privacy policy for you).
>
> Jon
>
> On Thu, Jun 25, 2015 at 2:33 PM, Adam Baso <abaso(a)wikimedia.org> wrote:
>> Moving to mobile-l. Discuss.
>>
>> -Adam
>>
>> On Wed, Jun 24, 2015 at 10:05 PM, Jon Robson <jrobson(a)wikimedia.org> wrote:
>>>
>>> cc. reading-list. You'll get more feedback there :)
>>> Short reply: There are lots of bugs and larger problems here that need
>>> to be solved.
>>>
>>>
>>> On Wed, Jun 24, 2015 at 5:42 PM, Jacob Rogers <jrogers(a)wikimedia.org>
>>> wrote:
>>> > Hi Jon,
>>> >
>>> > James A suggested you might be the right person to talk with about
>>> > improving
>>> > the readability of the WMF privacy policy on mobile devices. Currently,
>>> > it's
>>> > pretty difficult to look at. It starts with the massive language list,
>>> > the
>>> > disclaimer renders 1-2 words a line, and the blue boxes also render in
>>> > hard
>>> > to read lines as well as pushing the main section to scroll off the
>>> > screen.
>>> >
>>> > If you are the right person, what I'm hoping we can do is make the
>>> > language
>>> > list into an expandable menu, get rid of the blue boxes on the sides if
>>> > necessary, and possibly make the examples into an expandable view rather
>>> > than have everything shown by default.
>>> >
>>> > If you're not the right person for this, could you forward me on to
>>> > someone
>>> > that might be able to help?
>>> >
>>> > Many thanks,
>>> > Jacob
>>> > --
>>> >
>>> > Jacob Rogers
>>> > Legal Counsel
>>> > Wikimedia Foundation
>>> >
>>> > NOTICE: This message might have confidential or legally privileged
>>> > information in it. If you have received this message by accident, please
>>> > delete it and let us know about the mistake. As an attorney for the
>>> > Wikimedia Foundation, for legal/ethical reasons I cannot give legal
>>> > advice
>>> > to, or serve as a lawyer for, community members, volunteers, or staff
>>> > members in their personal capacity. For more on what this means, please
>>> > see
>>> > our legal disclaimer.
>>> >
>>>
>>> _______________________________________________
>>> reading-wmf mailing list
>>> reading-wmf(a)lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/reading-wmf
>>
>>
>>
>> _______________________________________________
>> Mobile-l mailing list
>> Mobile-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/mobile-l
>>
Hello everyone,
On behalf of the parsing team, here is an update about Parsoid, the
bidirectional wikitext <-> HTML parser that supports Visual Editor,
Flow, and Content Translation.
Subbu.
-----------------------------------------------------------------------
TL;DR:
1. Parsoid[1] roundtrips 99.95% of the 158K pages in round-trip testing
without introducing semantic diffs[2].
2. With trivial simulated edits, the HTML -> wikitext serializer used
in production (selective serialization) introduces ZERO dirty diffs
in 99.986% of those edits[3]. 10 of those 23 edits with dirty diffs
are minor newline diffs.
-----------------------------------------------------------------------
A couple of days back (June 23rd), Parsoid achieved 99.95%[2] semantic
accuracy in the wikitext -> HTML -> wikitext roundtripping process on a set
of about 158K pages randomly picked from about 16 wikis back in 2013.
Keeping this test set constant has let us monitor our progress over time.
We were at 99.75% last year around this time.
What does this mean?
--------------------
* Despite the practical complexities of wikitext, the mismatch in the
processing models of wikitext (string-based) and Parsoid (DOM-based),
and the various wikitext "errors" that are found on pages, Parsoid is
able
to maintain a reversible mapping between wikitext constructs and their
equivalent HTML DOM trees that HTML editors and other tools can
manipulate.
The majority of differences in the 0.05% arise because of wikitext
errors:
links in links, 'fosterable'[4] content in tables, and some scenarios
with unmatched quotes in attributes. Parsoid does not support
round-tripping (RT) of these.
* While this is not a big change from how it has been for about a year now
in terms of Parsoid's support for editing, this is a notable milestone
for us in terms of the confidence we have in Parsoid's ability to handle
the wikitext usage seen in production wikis and our ability to RT them
accurately without corrupting pages. This should also boost confidence
of all applications that rely on Parsoid.
* In production, Parsoid uses a selective serialization strategy which
tries to preserve unedited parts of wikitext as far as possible.
As part of regular testing, we also simulate a trivial edit by adding
a new comment to the page and run the edited HTML through this
selective serializer. All but 23 pages (0.014% of trivial edits) had
ZERO dirty diffs[3]. Of these 23, 10 of the diffs were minor newline
diffs.
In production, the dirty diff rate will be higher than 0.014% because of
more complex edits and because of bugs in any of 3 components involved
in visual editing on Wikipedias (Parsoid, RESTBase[5] and Visual Editor)
and their interaction. But, the base accuracy of Parsoid's roundtripping
(both in terms of full and selective serialization) is critical to
ensuring
clean visual edits. The above milestones are part of ensuring that.
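To make the measurement concrete, here is a toy version of the round-trip check. Parsoid's real diff classifier is far more sophisticated; this only shows the shape of it, and the normalization (folding away the newline-only differences the report above calls minor) is my own simplification:

```python
import difflib

def normalize(wikitext):
    """Fold away newline-only noise: trailing whitespace on each line
    and leading/trailing blank lines."""
    return "\n".join(line.rstrip() for line in wikitext.strip().splitlines())

def is_clean_roundtrip(original, roundtripped):
    """True when wt -> HTML -> wt reproduced the page up to normalization."""
    return normalize(original) == normalize(roundtripped)

def dirty_diff(original, roundtripped):
    """Unified diff of the normalized texts; empty for a clean round trip."""
    return list(difflib.unified_diff(
        normalize(original).splitlines(),
        normalize(roundtripped).splitlines(),
        lineterm=""))
```

Running a check like this over a fixed corpus of pages is what yields a stable percentage that can be tracked over time.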
What does this not mean?
------------------------
* If you edit one of those 0.05% of pages in VE, the VE-Parsoid combination
will break the page. NO!
If you edit the broken part of the page, Parsoid will very likely
normalize
the broken wikitext to the non-erroneous form (break up nested links,
move fostered content out of the table, drop duplicate transclusion
parameters, etc.) In the odd case, it could cause a dirty diff that
changes
the semantics of those broken constructs.
* Parsoid's visual rendering is 99.95% identical to PHP parser
rendering. NO!
RT tests are focused on Parsoid's ability to support editing without
introducing dirty diffs. Even though Parsoid might render a page
differently than the default read view (and might even be incorrect),
we are nevertheless able to RT it without breaking the wikitext.
On the way to 99.95% RT accuracy, we have improved the rendering and fixed
several rendering bugs in Parsoid. The rendering is also largely identical
to the default read view (otherwise, VE editors would definitely complain).
However, we haven't done sufficient testing to systematically identify
rendering incompatibilities and quantify this. In the coming quarters,
we are going to turn our attention to this problem. We have a visual
diffing infrastructure to help us with this (we take screenshots of
Parsoid's output and the default output and compare those images and find
diffs). We'll have to tweak and fix our visual-diffing setup and then fix
rendering problems we find.
* 100% roundtripping accuracy is within reach. NO!
The reality is that there are a lot of pages out there that have various
kinds of broken markup (mis-nested html tags, unmatched html tags,
broken templates) in production. There are probably other edge case
scenarios that trigger different behavior in Parsoid and the PHP parser.
Because we go to great lengths in Parsoid to avoid dirty diffs, our
selective serialization works quite well. There have been very few
reports
of page corruption over the last year. And, where they have surfaced,
we've
usually moved pretty quickly to fix them, and we'll continue to do so.
In addition, our diff classification algo will never be perfect and there
will always be false positives. Overall, we may crawl further along by
0.01% or 0.02%, but we are not holding our breath and neither should you.
* If we pick a new corpus of 100K pages, we'll have similar accuracy. MAYBE!
Because we've tested against a random sample of pages across multiple
Wikipedias, we expect that we've encountered the vast majority of
scenarios
that Parsoid will encounter in production. So, we have a very high degree
of confidence that our fixes are not tailored to our test pages.
As part of https://phabricator.wikimedia.org/T101928 we will be doing
a refresh of our test set, focusing more on enwp pages, non-Wikipedia
test
pages, and probably introducing a set of high traffic pages.
Next steps
----------
Given where we are now, we can start thinking about the next level with
a bit more focus and energy. Our next steps are to bring the PHP parser and
Parsoid closer together, both in terms of output and long-term capabilities.
Some possibilities:
* Replace Tidy ( https://phabricator.wikimedia.org/T89331 )
* Pare down rendering differences between the two systems so that
we can start thinking about using Parsoid HTML instead of MWParser HTML
for read views. ( https://phabricator.wikimedia.org/T55784 )
* Use Parsoid as a WikiLint tool
  https://phabricator.wikimedia.org/T48705
  https://www.mediawiki.org/wiki/Parsoid/Linting/GSoC_2014_Application
* Support improved templating abilities (data-driven tables, etc.)
* Improve Parsoid's parsing performance.
* Implement stable ids to be able to attach long-lived metadata to the
DOM and track it across edits.
* Move wikitext to a DOM-based processing model, using Parsoid as a bridge.
This could make several useful things possible, e.g. much better
automatic edit conflict resolution.
* Long-term: Make Parsoid redundant in its current complex avatar.
References
----------
[1] https://www.mediawiki.org/wiki/Parsoid -- bidirectional parser
    supporting visual editing
[2] http://parsoid-tests.wikimedia.org/failsDistr
    http://parsoid-tests.wikimedia.org/topfails shows the actual failures
[3]
http://parsoid-tests.wikimedia.org/rtselsererrors/aa5804ca89dc644f744af24c4…
[4] http://dev.w3.org/html5/spec-LC/tree-construction.html#foster-parenting
[5] https://www.mediawiki.org/wiki/RESTBase#Use_cases