*tl;dr: We'll be stripping all content contained inside brackets from the
first sentence of articles in the Wikipedia app.*
The Mobile Apps Team is focused on making the app a beautiful and engaging
reader experience, and on supporting use cases like wanting to look
something up quickly to find what it is. Unfortunately, there are several
aspects of Wikipedia at present that are actively detrimental to that goal.
One example of this is the lead sentences.
As mentioned in the other thread on this matter,
lead sentences are poorly formatted and contain information that is
detrimental to quickly looking up a topic. The team did a quick audit of
the information available inside brackets in the first sentences, and
typically it is pronunciation information, which is probably better placed
in the infobox than breaking up the first sentence. The other
problem is that this information was typically inserted and previewed on a
platform where space is not at a premium, and that calculation is different
on mobile devices.
In order to better serve the quick lookup use case, the team has reached
the decision to strip anything inside brackets in the first sentence of
articles in the Wikipedia app.
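As a rough illustration of the transformation involved (a sketch only, not the app's actual implementation, which operates on parsed HTML rather than raw text), stripping possibly-nested parenthetical spans from a lead sentence might look like:

```python
def strip_parentheticals(text):
    """Remove (possibly nested) parenthetical spans from a sentence.

    Illustrative only: the real apps would also need to handle other
    bracket styles and markup, not just plain parentheses.
    """
    out = []
    depth = 0
    for ch in text:
        if ch == "(":
            depth += 1
        elif ch == ")" and depth > 0:
            depth -= 1
        elif depth == 0:
            out.append(ch)
    # Collapse the double space left where a span was removed.
    return " ".join("".join(out).split())

lead = ("Niccolò Machiavelli (Italian: [nikkoˈlɔ mmakjaˈvɛlli]; "
        "3 May 1469 - 21 June 1527) was an Italian Renaissance historian.")
print(strip_parentheticals(lead))
# Niccolò Machiavelli was an Italian Renaissance historian.
```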
Stripping content is not a decision to be made lightly. People took the
time to write it, and that should be respected. We realise this is
controversial. That said, it's the opinion of the team that the problem is
pretty clear: this content is not optimised for users quickly looking
things up on mobile devices at all, and will take a long time to solve
through alternative means. A quicker solution is required.
The screenshots below are mockups of the before and after of the change.
These are not final, I just put them together quickly to illustrate what
I'm talking about.
- Before: http://i.imgur.com/VwKerbv.jpg
- After: http://i.imgur.com/2A5PLmy.jpg
If you have any questions, let me know.
Associate Product Manager, Mobile Apps
Fwd, forgot to press reply all :(
Sent from my HTC
----- Forwarded message -----
From: "Florian Schmidt" <florian.schmidt.welzow(a)t-online.de>
To: "Aaron Halfaker" <ahalfaker(a)wikimedia.org>
Subject: RE: [WikimediaMobile] [Wikitech-l] Anonymous editing impact on mobile
Date: Thu., Apr. 30, 2015 16:26
Great to read this, thanks Aaron :)
Sent from my HTC
----- Original message -----
From: "Aaron Halfaker" <ahalfaker(a)wikimedia.org>
To: "Wikimedia developers" <wikitech-l(a)lists.wikimedia.org>, "mobile-l" <mobile-l(a)lists.wikimedia.org>
Subject: [WikimediaMobile] [Wikitech-l] Anonymous editing impact on mobile
Date: Thu., Apr. 30, 2015 01:09
As requested, I started a research project page to do some analysis around
It's just a stub now. I'll have to clear a few other projects off my plate
in order to pick this one up. You should expect to see updates there in
On Wed, Apr 29, 2015 at 12:10 PM, Jon Robson <jdlrobson(a)gmail.com> wrote:
> On Wed, Apr 29, 2015 at 8:19 AM, Robert Rohde <rarohde(a)gmail.com> wrote:
> > On Tue, Apr 28, 2015 at 10:31 PM, Jon Robson <jrobson(a)wikimedia.org>
> > > <snip>
> > Any community members interested in helping out here? I'm very sad the
> > > increase in errors wasn't picked up sooner... :-/
> > >
> > What does event_action = 'error' actually mean?
> > If the action is stopped by the AbuseFilter is that counted as an error?
> It means at some point during the editing workflow the user hit an
> error that stopped them from finishing their edit.
> We do capture AbuseFilter hits in this process (but with some of these
> errors you can recover and complete the edit).
> We also store the error associated, although a quick scan shows this
> is currently not very helpful.
> In theory 'http' error should only happen when a user cannot get an
> edit token - I've updated the bug for those interested.
> > -Robert
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> Jon Robson
> * http://jonrobson.me.uk
> * https://www.facebook.com/jonrobson
> * @rakugojona
> Wikitech-l mailing list
Summary of message:
* Reading web engineers should take on Reading-oriented api.php work and,
if they like and as appropriate, Node.js work
* App engineers who will come forward to work on Reading-oriented api.php
or Node.js appreciated
* Reading Infra (Bryan/Brad/Gergo) mandatory for Q4 code review of
API/Core; in Q1, please engage Reading Infra for up front consulting and code
review for API/Core
Now, the full message:
On another thread there was some discussion around transfer of API features
from Max to Reading web engineers. Concurrently, we've been cataloging pent
up demand for api.php and Node.js API/Services style work.
Bryan, Toby, and I met yesterday to discuss the engagement model for
Reading web/apps and Reading Infrastructure. The outcome of that discussion
was carried into the standing Reading API/Services tri-weekly meeting
afterward, where there was basic agreement on an approach going forward.
Last evening, I discussed this with Jon Katz and Kristen as well. And the
Reading web/app leads met this morning to socialize further. Here's where
the thinking is:
* Generally, Reading web engineers will need to take on Reading-oriented
api.php work or, if they think it the right fit for a given problem,
Node.js service work, or both. App engineers should come forward if they're
interested in fuller stack development - for example, Bernd is already
writing Node.js code and it sounds like Brian may be interested in pairing
to tackle Text Extracts api.php stuff related to Share a Fact.
* Q4: Reading Infrastructure (Bryan, Brad, Gergo) available for code review
of API and MediaWiki core-related patches. Otherwise, it is focused on a
core set of problems that are set for Q4 (also, Gergo is half time on web
engineering for this quarter, although he'll taper off).
* Q1 and beyond: Reading Infrastructure available to Reading engineers for
** Up front consulting on API and MediaWiki core-related work. It's
critical to avoid missteps up front.
** Code review of API and MediaWiki core-related patches
** Approximately one api.php task implementation per Reading Infrastructure
resource per quarter for the web/apps area. In the API/Services meeting we
talked about perhaps something like pageimages and createaccount related
Phabricator tasks for Q1 as good candidates. The purpose of this work is to
demonstrate good examples of how work can be implemented, incrementally
increasing velocity, and providing an opportunity for web/app engineers to
pair up with their Reading Infrastructure peers.
This model mirrors the Services (not part of Reading) team model for
consulting, code review, and software architecture scaffolding. As the
Services team focuses much of its time on broadly available software
architecture, so too does the Reading Infrastructure team plan to focus
most of its work there.
Practically speaking, it's hard to think at all levels of the stack. But we
have an extremely talented team, and an engineer can deepen their
experience tier by tier over time.
In Reading web engineering there's a level of familiarity and expertise
with PHP - and where there's room for improvement I'm confident
professional growth goals can make this a reality. As with HTML and CSS,
fluency in PHP - and the MediaWiki-oriented way of coding it - is critical
to delivering high quality user experiences. We fully support and
encourage pursuit of training / dedicated study in this area.
With respect to api.php and extension code hooked into it: a really key
part of writing this sort of code is understanding the boilerplate and
scaffolding pieces. Bryan's team is able to point people to existing good
examples and a number of web engineers are comfortable in this area as well.
As for growing more application server (PHP) skills in the MediaWiki
environment outside of api.php specifically (i.e., Core and other thornier
pieces of the codebase), Sam has offered to pair with anyone who wants his
help.
Thinking ahead, there is the more systemic question of staffing mix
dedicated to application layer services in Reading. The current plan, which
I have to note could be subject to change (budgeting is in flight, and we
have backfill and new resource requests in there), is to acquire talent for
web engineering in application layer services for the Reading experience.
While it's good for all web engineers (and some app engineers) to be able
to code to api.php or Node.js, there's value in having someone really
focused on this middle tier / middleware stuff in the Reading web/app area,
too. Max Semenik is now focused on geo/Search & Discovery, and he has deep
expertise in this area, so we should be thinking about staffing for
this sort of skill level.
In a related matter, I've heard a number of people talking about a more
convergence-based approach to software development in Reading. For example,
implement a feature in the API portion of the MobileFrontend extension,
then leverage it in the apps and desktop web. This is the place we should
be for some (maybe for new stuff, most?) set of features, and part of
getting to this place requires small steps:
* Picking a few simple problems to solve and working across web & apps (&
* Cataloging features by channel in some sort of matrix, and delivering
We now have a weekly Reading web/apps engineering leads meeting, where we
can make some incremental progress on the first part for picking a few
For the cataloging piece, I'm working to figure out an approach (ideally
with a SPOC). The more intentional we are about which features get rolled
out where and in which order, plus think carefully about convergence versus
doubled effort (and to be sure there will be cases for both), the greater
our success will be delivering the most beautiful and channel relevant
experiences for our readers.
Just wanted to send my notes on the hack demos from this weekend, and
potentially start a discussion about shipping some of them!
*Apple Watch: *Corey & Jason did an incredible job on this, and I'm
essentially sold that a watch companion app could augment our *read later* and
*search* flows for what seems like a reasonable development cost.
Not sure who did it, but server-side image/upload validation could
further facilitate cross-platform mobile uploads?
*Image surface content gap*: IOW, which pages are most in need of
pictures. Seems like a "micro edit" workflow that could work well for apps
when combined with location/geo-fencing.
*Haikus from recent changes:* More of a technical inspiration: I'd love to
discuss building a service that sends push notifications in response to RC
*Dmitry's "Wikipedia Lite" hack*: I think we should do some prototypes on
this. In particular, I'd like to play around with parsoid to create a
"reader view" for Wikipedia pages. Aside from streamlining content for
mobile and improving performance, this could also make it easier to bring
back a lot of reader-centric features. Dynamic font sizes & color/contrast
configurations are two things I see pop up from time to time in OTRS &
*Bernd's map view for Nearby:* This seems like a no-brainer. I sent a
separate email to mobile-l, because it seems like some progress has been
made in this area since I last looked into it.
What were your favorite hacks?
EN Wikipedia user page: https://en.wikipedia.org/wiki/User:Brian.gerstle
Thanks for sharing this, Adam. Aside from engagement/funnel data, the critical question for this feature is: does it bring back eyeballs to the site from social media? It looks like it doesn’t yet, at least not in a substantial way, even with the caveat that App traffic is a very small fraction of total mobile traffic.
Having looked into referrals for this feature before and after comparing them to Twitter’s own engagement analytics (and finding some big discrepancy), you should consider removing spiders/crawlers from the data (see ) to avoid inflating pageviews with non-human activity.
I’m a big fan of this feature and look forward to seeing how you guys intend to scale it.
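The crawler-filtering suggestion above can be sketched in a few lines. This is a hypothetical illustration (the row format and the bot pattern are assumptions, not the production pipeline, which works against the webrequest tables):

```python
import re

# A rough pattern for common automated clients; real pipelines use a much
# more complete ruleset (e.g. the ua-parser project's definitions).
BOT_PATTERN = re.compile(r"bot|crawl|spider|slurp|archiver", re.IGNORECASE)

def human_pageviews(rows):
    """Drop rows whose user agent looks like an automated client."""
    return [r for r in rows if not BOT_PATTERN.search(r["user_agent"])]

rows = [
    {"user_agent": "Mozilla/5.0 (iPhone; ...)", "count": 120},
    {"user_agent": "Twitterbot/1.0", "count": 45},
    {"user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)", "count": 30},
]
# Only the first (human) row survives the filter.
print(sum(r["count"] for r in human_pageviews(rows)))
```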
 https://github.com/ewulczyn/wmf/blob/b9f726ee3468852c3fed2780af1d8ac0004eda…
> On May 21, 2015, at 12:37 PM, Toby Negrin <tnegrin(a)wikimedia.org> wrote:
> Hi all - some interesting analysis on the share-a-fact feature from the mobile team.
> Begin forwarded message:
>> From: Adam Baso <abaso(a)wikimedia.org>
>> Date: May 21, 2015 at 12:05:29 PDT
>> To: mobile-l <mobile-l(a)lists.wikimedia.org>
>> Subject: [WikimediaMobile] Share a Fact Initial Analysis
>> Hello all,
>> We’ve been looking at some initial results from the Share a Fact feature introduced on the Wikipedia apps for Android and iOS in its basic "minimal viable product" implementation. Here’s some analysis, using data from one day (20150512) with respect to the latest stable versions of the apps (2.0-r-2015-04-23 on Android and 4.1.2 on iOS) for that day.
>> * On iOS, when a user initiates the first step of the default sharing workflow - tapping the up-arrow box share button (6,194 non-highlighting instances for the day under question) - about 11.7% of the time it yielded successful sharing.
>> * On Android, it’s not possible to easily tell when the sharing workflow was carried through to successful share, but we anticipate the Android success rate is currently much higher, as general engagement percentage up to the point of picking an app for sharing is higher on Android than on iOS.
>> * On Android, when presented with the share card preview, 28.0% of the time the ‘Share as image’ button was tapped and 55.5% of the time the 'Share as text' button was tapped, whereas on iOS it was 8.4% ‘Share as image’ and 16.8% ‘Share as text’.
>> * The forthcoming 4.1.4 version of the iOS app will relax its default sharing snippet generation rules and be more like the Android version in that respect. We anticipate this will result in higher engagement with both the ‘Share as image’ and ‘Share as text’ buttons on iOS, and we should be able to verify this once the 4.1.4 iOS version is released and generally adopted (usually takes 4-5 days after release; the 4.1.4 release isn’t released yet).
>> * On the Android app the ‘Share’ option is located on the overflow menu, not as part of the main set of UI buttons. This potentially increases the likelihood of Android users being primed to step through the workflow. On the iOS app, the share button (up-arrow box) is plainly visible from the main UI and not an overflow menu, and this probably creates a different priming dynamic for the iOS demographic.
>> * When users on iOS tapped on the ‘Share as image’ or ‘Share as text’ buttons, there is a pretty sharp drop off at the next stage - the system sharesheet. Once the sharesheet was presented to iOS users, 41.6% of the time it resulted in active abandonment. We believe this probably has something to do with the relatively small set of default apps listed on the sharesheet and the extra work involved with exposing additional social apps for sharing in that context. As with the Android app, the labels of ‘Share as image’ and ’Share as text’ may also pose something of a hurdle at least for first time users of the feature. To this end, there is an onboarding tutorial planned at least on Android.
>> * For a one hour period (2015051201) there were about 100 pageviews in some sense attributable to Share a Fact using a provenance parameter available on the latest stable versions of the apps at that time; this may slightly overstate the number of pageviews attributable to the two specific apps reviewed in this analysis, but probably not too much (n.b., previously a different source parameter was used than the new wprov provenance parameter). Pageviews are not the sole motivation for the feature, but following the trendline over the long run should be interesting. Impact on social media and the destinations of shares is a little harder to capture directly, but https://twitter.com/search?f=realtime&q=%40wikipedia%20-%40itzwikipedia%20f… gives one a sense about image shares, at least.
>> * A couple potential options for increasing sharing include:
>> ** Trying to add support for sharing to the Photos app on iOS. People may be interested in using images from the Photos apps for various workflows, as Dan Garry has noted.
>> ** Offering a more concise app picklist, in particular explicitly adding the native OS app components (namely, Twitter and Facebook, and as mentioned, Photos if possible), with an option to expose the sharesheet for additional options if necessary. This is probably also somewhat confined to iOS, although conceivably a similar approach could be possible on Android. On Android the full list of applications in its equivalent of the sharesheet is by default readily available to the user, though.
>> ** On Android, exposing the diagonal arrow share button on the main interface akin to how the iOS version of the app shows the up-arrow share button. This may introduce more opportunities for sharing (and thus numbers of abandons would go up in tandem with numbers of shares), but would also partially clutter the interface and probably increase abandon. A controlled experiment may be useful for observing the impact of such an approach.
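The iOS funnel numbers quoted above imply roughly the following (a back-of-the-envelope sketch using only the figures stated in this message):

```python
# Hypothetical funnel arithmetic from the iOS numbers above: share-button
# taps for the day and the fraction that ended in a completed share.
share_taps = 6194          # non-highlighting share initiations (one day)
success_rate = 0.117       # ~11.7% yielded a successful share

completed = round(share_taps * success_rate)
print(completed)  # roughly 725 completed shares for the day
```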
>> * As a point of reference, for the app versions in scope for this analysis over a single day, there appeared to be approximately 3.78 million Wikipedia for Android pageviews and 1.19 million Wikipedia Mobile for iOS app pageviews. There were about 6.73 million app pageviews on the “modern” versions of these apps total for this particular day, meaning there were about 1.75 million pageviews on other modern versions of the apps.
>> * Examination of the categories of successful shares on iOS showed the following distributions:
>> 48.5% messaging
>> 25.5% sharesheet copy
>> 22.9% social
>> 1.8% productivity
>> 0.9% reading
>> 53.6% messaging
>> 31.9% sharesheet copy
>> 7.1% social
>> 5.4% reading
>> 2.0% productivity
>> Here were some queries used in the analysis:
>> == SHARE A FACT ATTRIBUTABLE PAGEVIEWS FOR ONE HOUR ==
>> select wprov, uri_host, count(*) from (select x_analytics_map['wprov'] as wprov, uri_host
>> from webrequest where year = 2015 and month = 5 and day = 12 and hour = 1 and is_pageview = true and uri_host like '%.wikipedia.org' and x_analytics_map['wprov'] is not null) t
>> group by wprov, uri_host;
>> == PAGE VIEWS FOR THE DAY FOR THE “MODERN” VERSIONS OF THE APPS ==
>> select user_agent, count(*)
>> from webrequest tablesample(BUCKET 1 OUT OF 100 ON rand())
>> where YEAR = 2015
>> AND MONTH = 5
>> AND DAY = 12
>> AND is_pageview = TRUE
>> AND lower(uri_host) like '%.wikipedia.org'
>> AND user_agent like 'WikipediaApp%'
>> GROUP BY user_agent;
>> == HIGHLIGHTING SESSION CASE FOR SPECIFIC VERSIONS OF THE APPS ==
>> select CASE WHEN t2.userAgent LIKE 'WikipediaApp/2.0-r-2015-04-23%' THEN 'Android'
>> WHEN t2.userAgent LIKE 'WikipediaApp/4.1.2%' THEN 'iOS' END AS 'ua',
>> t1.event_action, t1.event_sharemode, t1.event_target, count(*)
>> from MobileWikiAppShareAFact_11331974 t1
>> inner join MobileWikiAppShareAFact_11331974 t2
>> on t1.event_shareSessionToken = t2.event_shareSessionToken
>> where t1.timestamp > '20150512' and t1.timestamp < '20150513'
>> and t2.timestamp > '20150512' and t2.timestamp < '20150513'
>> and t1.event_action != 'highlight' and t2.event_action = 'highlight'
>> and (t2.userAgent like 'WikipediaApp/2.0-r-2015-04-23%' or t2.userAgent like 'WikipediaApp/4.1.2%')
>> group by ua, t1.event_action, t1.event_sharemode, t1.event_target;
>> == NON-HIGHLIGHTING SESSION CASE FOR SPECIFIC VERSIONS OF THE APPS ==
>> n.b., subtract the highlighting cases from the non-highlighting cases to arrive at the default sharing behavior. Technically, inner joins can be used to do more comprehensive session analysis, but the queries take a long time.
>> select CASE
>> WHEN userAgent LIKE 'WikipediaApp/2.0-r-2015-04-23%' THEN 'Android'
>> WHEN userAgent LIKE 'WikipediaApp/4.1.2%' THEN 'iOS'
>> END AS 'ua', event_action, event_sharemode, event_target,
>> count(*) from MobileWikiAppShareAFact_11331974 where timestamp > '20150512' and timestamp < '20150513' and (userAgent like 'WikipediaApp/2.0-r-2015-04-23%' or userAgent like 'WikipediaApp/4.1.2%') group by ua, event_action, event_sharemode, event_target;
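The subtraction described in the note above (highlighting cases removed from the non-highlighting totals to isolate default sharing behavior) can be sketched with Counter arithmetic. The counts here are made up for illustration; the real numbers come from the two queries:

```python
from collections import Counter

# Hypothetical per-(action, sharemode) counts from the two queries above.
all_sessions = Counter({("share", "image"): 500, ("share", "text"): 900})
highlighting = Counter({("share", "image"): 120, ("share", "text"): 150})

# Default (non-highlighting) sharing behavior = all minus highlighting.
default_behavior = all_sessions - highlighting
print(default_behavior[("share", "image")],
      default_behavior[("share", "text")])  # 380 750
```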
>> Mobile-l mailing list
>> Mobile-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/mobile-l
> Analytics mailing list
On Friday, May 29, 2015, Brandon Black <bblack(a)wikimedia.org> wrote:
> We've merged up https://gerrit.wikimedia.org/r/#/c/214705/1 to address
> this, and I've purged the caches to ensure the update is fully live
> already. It should address the issue, assuming that the block on RL's
> /w/load.php entry point was the only part of the problem.
> On Fri, May 29, 2015 at 8:43 PM, Jon Robson <jrobson(a)wikimedia.org
> > Yup what brandon said.
> > On 29 May 2015 1:35 pm, "Jon Robson" <jrobson(a)wikimedia.org
> >> It's all in the report. We block w/ in robots.txt always have.
> >> There have been a bunch of changes to improve performance of the site
> >> overall which might have led to that. Mailing this to the public mailing
> >> list wikitech would give you a better idea.
> >> Our site /is/ mobile friendly, it's just we tell google not to load
> >> styles from w/load.php so no need to panic.
> >> The question is what is the penalty of us failing this tool? Does it
> >> impact our google search rankings?
> >> Fix is trivial.. Update robots.txt but first the question is why are we
> >> blocking scripts and styles on that url?
> >>> Hi Readership team and broader community,
> >>> Any changes we might have recently made to cause this warning to appear
> >>> about googlebot not being able to access our site?
> >>> I consider this to be a very serious issue.
> >>> The examples below are not mobile, but same issue applies when you try
> >>> en.m. version of the pages.
> >>> Best,
> >>> Jon
> >>> ---------- Forwarded message ----------
> >>> Date: Fri, May 29, 2015 at 12:47 PM
> >>> Subject: Mobile Friendly
> >>> Jon,
> >>> Google notified us of the followin...
> >>> "We recently noticed that there was a change with how you embed CSS & JS
> >>> which results in us not being able to use the CSS & JS to recognize what the
> >>> page looks like. That's making some of your pages fail the
> >>> mobile-friendly-test, for example. You used to load CSS & JS from
> >>> bits.wikimedia.org, but now they're loaded through /w/load.php?...
> >>> from the Wikipedia host, where that path is blocked by robots.txt.
> >>> You can see how we render the pages with Fetch as Google in Webmaster
> >>> Tools / Search Console, you can also see most of that with the
> test-page at:
> >>> Some of the pages still pass the test there (example), but the CSS is
> >>> broken there too since it's blocked. "
> >>> Any ideas what can be causing this?
> >>> Regards,
> >>> Wes
> >>> _______________________________________________
> >>> reading-wmf mailing list
> >>> https://lists.wikimedia.org/mailman/listinfo/reading-wmf
> > _______________________________________________
> > Ops mailing list
> > https://lists.wikimedia.org/mailman/listinfo/ops
> Ops mailing list
+external mobile and wikitech
Shoot. I meant to send this to the external list. For those of you just
joining us, we recently got an email from google letting us know that some
of our pages are now failing the mobile-friendly test, which has an adverse
impact on our search results. It appears that most of our pages are also
blocking style info.
Google doesn't offer much insight into penalties, but if they're sending us
the email then it has or will have some impact. I'd like to see if we can
better understand the other side of the equation - what is the cost of
fixing it? I think Jon's questions below are the ones to start with.
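For what it's worth, the robots.txt side of the question is easy to sanity-check locally with Python's standard robotparser. The rules below are illustrative only, not the production file (check the live https://en.wikipedia.org/robots.txt for the real directives):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block /w/ in general but carve out load.php,
# which is roughly the fix being discussed in this thread.
rules = """
User-agent: *
Allow: /w/load.php
Disallow: /w/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# With the carve-out, crawlers may fetch styles/scripts via load.php...
print(rp.can_fetch("Googlebot", "https://en.wikipedia.org/w/load.php?modules=site"))
# ...while the rest of /w/ stays blocked.
print(rp.can_fetch("Googlebot", "https://en.wikipedia.org/w/index.php?action=edit"))
```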
On Fri, May 29, 2015 at 1:35 PM, Jon Robson <jrobson(a)wikimedia.org> wrote:
> It's all in the report. We block w/ in robots.txt always have.
> There have been a bunch of changes to improve performance of the site
> overall which might have led to that. Mailing this to the public mailing
> list wikitech would give you a better idea.
> Our site /is/ mobile friendly, it's just we tell google not to load styles
> from w/load.php so no need to panic.
> The question is what is the penalty of us failing this tool? Does it
> impact our google search rankings?
> Fix is trivial.. Update robots.txt but first the question is why are we
> blocking scripts and styles on that url?
> On 29 May 2015 1:17 pm, "Jon Katz" <jkatz(a)wikimedia.org> wrote:
>> Hi Readership team and broader community,
>> Any changes we might have recently made to cause this warning to appear
>> about googlebot not being able to access our site?
>> I consider this to be a very serious issue.
>> The examples below are not mobile, but same issue applies when you try an
>> en.m. version of the pages.
>> ---------- Forwarded message ----------
>> From: Wes Moran <wmoran(a)wikimedia.org>
>> Date: Fri, May 29, 2015 at 12:47 PM
>> Subject: Mobile Friendly
>> To: Jon Katz <jkatz(a)wikimedia.org>
>> Cc: Adam Baso <abaso(a)wikimedia.org>
>> Google notified us of the followin...
>> "We recently noticed that there was a change with how you embed CSS & JS
>> which results in us not being able to use the CSS & JS to recognize what
>> the page looks like. That's making some of your pages fail the
>> mobile-friendly-test, for example. You used to load CSS & JS from
>> bits.wikimedia.org, but now they're loaded through /w/load.php?...
>> directly from the Wikipedia host, where that path is blocked by robots.txt.
>> You can see how we render the pages with Fetch as Google in Webmaster
>> Tools / Search Console, you can also see most of that with the test-page at:
>> Some of the pages still pass the test there (example
>> but the CSS is broken there too since it's blocked. "
>> Any ideas what can be causing this?
>> reading-wmf mailing list
> reading-wmf mailing list