Hi everyone,
*tl;dr: We'll be stripping all content contained inside brackets from the
first sentence of articles in the Wikipedia app.*
The Mobile Apps Team is focussed on making the app a beautiful and engaging
reading experience, and on supporting use cases like quickly looking
something up to find out what it is. Unfortunately, there are several
aspects of Wikipedia at present that are actively detrimental to that goal.
One example of this is lead sentences.
As mentioned in the other thread on this matter
<https://lists.wikimedia.org/pipermail/mobile-l/2015-March/008715.html>,
lead sentences are poorly formatted and contain information that is
detrimental to quickly looking up a topic. The team did a quick audit
<https://docs.google.com/a/wikimedia.org/spreadsheets/d/1BJ7uDgzO8IJT0M3UM2q…>
of the information found inside brackets in first sentences; typically it
is pronunciation information, which is probably better placed in the
infobox than breaking up the first sentence. The other problem is that this
information was typically inserted and previewed on a platform where space
is not at a premium, and that calculation is different on mobile devices.
In order to better serve the quick lookup use case, the team has reached
the decision to strip anything inside brackets in the first sentence of
articles in the Wikipedia app.
Stripping content is not a decision to be made lightly. People took the
time to write it, and that should be respected. We realise this is
controversial. That said, it's the opinion of the team that the problem is
pretty clear: this content is not at all optimised for users quickly
looking things up on mobile devices, and solving it through alternative
means would take a long time. A quicker solution is required.
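To make the transformation concrete, here is a rough sketch of the kind of
stripping involved, expressed as a Hive regexp_replace over a made-up lead
sentence. This is purely illustrative - it is not the app's actual
implementation, and a single regex like this does not handle nested
brackets:

SELECT regexp_replace(
  'Paris (/ˈpærɪs/; French: [paʁi]) is the capital and most populous city of France.',
  '\\s*\\([^()]*\\)',  -- optional leading whitespace plus one non-nested round-bracketed group
  '');
-- returns: Paris is the capital and most populous city of France.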
The screenshots below are mockups of the before and after of the change.
These are not final; I just put them together quickly to illustrate what
I'm talking about.
- Before: http://i.imgur.com/VwKerbv.jpg
- After: http://i.imgur.com/2A5PLmy.jpg
If you have any questions, let me know.
Thanks,
Dan
--
Dan Garry
Associate Product Manager, Mobile Apps
Wikimedia Foundation
FYI. Google just announced an open source project to create a speedier
framework for mobile browsing. It might be worth looking at what they're
doing:
From:
http://tech.slashdot.org/story/15/10/08/035200/googles-effort-to-speed-up-t…
Google has officially taken the wraps off its AMP project — Accelerated
Mobile Pages <https://www.ampproject.org/> — which aims to speed up the
delivery of web content to mobile devices
<https://www.ampproject.org/how-it-works/>. They say, "We began to
experiment with an idea: could we develop a restricted subset of the things
we'd use from HTML, that's both fast and expressive, so that documents
would always load and render with reliable performance?" That subset is now
encapsulated in AMP, their proof-of-concept. They've posted the code to
GitHub <https://github.com/ampproject/amphtml> and they're asking for help
from the open source community to flesh it out. Their conclusions are
familiar to the Slashdot crowd: "One thing we realized early on is that
many performance issues are caused by the integration of multiple
JavaScript libraries, tools, embeds, etc. into a page. This isn't saying
that JavaScript immediately leads to bad performance, but once arbitrary
JavaScript is in play, most bets are off because anything could happen at
any time and it is hard to make any type of performance guarantee. With
this in mind we made the tough decision that AMP HTML documents would not
include any author-written JavaScript, nor any third-party scripts."
They're seeing speed boosts anywhere from 15-85%, but they're also looking
at pre-rendering options to make some content capable of loading
instantaneously. Their FAQ <https://www.ampproject.org/faq/> has a few more
details.
Google blog announcement:
https://googleblog.blogspot.com/2015/10/introducing-accelerated-mobile-page…
It's a stats morning (depending on your timezone)!
The two dashboards in the link below are a pretty neat look at what happens
when people visit https://wikipedia.org/ .
-Adam
---------- Forwarded message ----------
From: Oliver Keyes <okeyes(a)wikimedia.org>
Date: Mon, Nov 30, 2015 at 7:37 AM
Subject: Re: [discovery] Announcement: New dashboard covering the Wikipedia
portal
To: A public mailing list about Wikimedia Search and Discovery projects <
discovery(a)lists.wikimedia.org>
My bad; the /non-beta/ link is http://discovery.wmflabs.org/portal/ ;)
On 30 November 2015 at 10:09, Oliver Keyes <okeyes(a)wikimedia.org> wrote:
> Hey all,
>
> The Discovery Analysis team is pleased to report we have released a
> new dashboard, providing basic data about usage of the Wikipedia
> portal (https://www.wikipedia.org). It can be found at
> http://discovery-beta.wmflabs.org/portal/
>
> Thanks,
>
> Oliver, on behalf of the Discovery Analysis team
> --
> Oliver Keyes
> Count Logula
> Wikimedia Foundation
--
Oliver Keyes
Count Logula
Wikimedia Foundation
_______________________________________________
discovery mailing list
discovery(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/discovery
Heads up in case you query Event Logging tables.
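If you want to check whether a table you query was affected, one quick
sanity check is an events-per-hour count over the outage window described
below (just a sketch; the schema here is simply one that appears elsewhere
in this digest - substitute the table you actually use):

SELECT LEFT(timestamp, 10) AS hour, COUNT(*) AS events
FROM log.MobileWikiAppDailyStats_12637385
WHERE timestamp BETWEEN '20151127000000' AND '20151127080000'
GROUP BY hour
ORDER BY hour;
-- hours that are missing entirely, or have unusually low counts, between
-- 01:00 and 07:00 UTC are the ones awaiting backfill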
---------- Forwarded message ----------
From: *Marcel Ruiz Forns* <mforns(a)wikimedia.org>
Date: Monday, November 30, 2015
Subject: [Analytics] EventLogging outage in progress?
To: "A mailing list for the Analytics Team at WMF and everybody who has an
interest in Wikipedia and analytics." <analytics(a)lists.wikimedia.org>
Team, I checked and, indeed, the EventLogging database needs backfilling
from 2015-11-27 01:00 until 2015-11-27 07:00. I updated the docs and
started the backfilling process. I'll let you know when it is finished.
Cheers
On Fri, Nov 27, 2015 at 8:31 PM, Oliver Keyes <okeyes(a)wikimedia.org> wrote:
> It seems like it would depend on the class of error. 48 hours for
> events not syncing, fine. 48 hours of /total data loss/ is a
> completely different class of problem.
>
> On 27 November 2015 at 11:35, Nuria Ruiz <nuria(a)wikimedia.org> wrote:
> >> Unfortunately, the only team members working full-time yesterday and
> >> today are we Europe folks.
> >> We weren't there when that happened, and we don't get those alerts on
> >> the phone; we should, though.
> > Given that this system is tier-2, I do not think we need an immediate
> > response; 24 hours should be an acceptable ETA. I would say even 48.
> >
> > On Fri, Nov 27, 2015 at 2:31 AM, Marcel Ruiz Forns
> > <mforns(a)wikimedia.org> wrote:
> >>
> >> Thanks, Ori, for having a look at this and restarting EL.
> >>
> >> I understand it was 01:30 UTC on Friday (today), not Thursday. It went
> >> on for 5-6 hours.
> >> Unfortunately, the only team members working full-time yesterday and
> >> today are we Europe folks.
> >> We weren't there when that happened, and we don't get those alerts on
> >> the phone; we should, though.
> >>
> >> This problem already happened about a month ago. We'll backfill the
> >> missing events and will investigate.
> >> Thanks again for the heads-up.
> >>
> >> On Fri, Nov 27, 2015 at 8:01 AM, Ori Livneh <ori(a)wikimedia.org> wrote:
> >>>
> >>> On Thu, Nov 26, 2015 at 10:46 PM, Ori Livneh <ori(a)wikimedia.org> wrote:
> >>>>
> >>>> Seems that eventlog1001 has not received any events since 01:30 UTC on
> >>>> Thursday
> >>>>
> >>>>
> >>>>
> >>>> http://ganglia.wikimedia.org/latest/graph.php?r=day&z=xlarge&c=Miscellaneou…
> >>>>
> >>>> This is pretty severe; I'd page if it wasn't a US holiday.
> >>>
> >>>
> >>> Kafka clients on eventlog1001 were in an "Autocommitting consumer
> >>> offset" death-loop and not receiving any events from the Kafka brokers.
> >>> I ran eventloggingctl stop / eventloggingctl start and they recovered.
> >>> Needs to be investigated more thoroughly. Otto, can you follow up?
> >>>
> >>>
> >>
> >>
> >>
> >> --
> >> Marcel Ruiz Forns
> >> Analytics Developer
> >> Wikimedia Foundation
> >>
> >
> >
>
>
>
> --
> Oliver Keyes
> Count Logula
> Wikimedia Foundation
>
--
*Marcel Ruiz Forns*
Analytics Developer
Wikimedia Foundation
Hi all,
here is the weekly look at our most important readership metrics (apologies
for the delay). Apart from the usual data, this time there is an additional
chart to illuminate how our mobile readership ratio has developed since
this spring, the iOS app retention stats are back after Apple fixed their
data, and we conclude with some inspiring quotes about climate change
awareness ;)
As laid out earlier
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009773.html>,
the main purpose of this report is to raise awareness about how these are
developing, call out the impact of any unusual events in the preceding
week, and facilitate thinking about core metrics in general. We are still
iterating on the presentation and eventually want to create dashboards for
those which are not already available in that form. Feedback and
discussion welcome.
Now to the usual data. (All numbers below are averages for November 16-22,
2015 unless otherwise noted.)
Pageviews
Total: 540 million/day (-0.0% from the previous week)
Context (April 2015-November 2015):
(see also the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>)
The Analytics team improved web crawler detection further last week
<https://meta.wikimedia.org/w/index.php?title=Dashiki%3APageviewsAnnotations…>,
meaning an apparent (as opposed to real) drop in human pageviews from
November 19 on - though presumably smaller than the one for September that
we reported in the preceding report.
Desktop: 57.2% (previous week: 57.5%)
Mobile web: 41.6% (previous week: 41.3%)
Apps: 1.2% (previous week: 1.2%)
Context (April 2015-November 2015):
These percentages usually don’t change rapidly from week to week. For a
wider perspective, I’m including a chart of the (aggregate) mobile
percentage this time, too. Technically this information is already
contained in the usual chart above, but here we can see even clearer
indications of the impact of the HTTPS-only switchover during June (it
appears to have mainly affected desktop traffic), as well as the strong
weekly periodicity (higher mobile ratio on weekends). It looks like mobile
won’t overtake desktop anytime soon.
Global North ratio: 77.3% of total pageviews (previous week: 77.6%)
Context (April 2015-November 2015):
New app installations
Android: 30.9k/day (-44.2% from the previous week)
Daily installs per device, from Google Play
Context (last month):
As described in the previous report, the Android Wikipedia app was featured
in the "New and Updated Apps" section of the Google Play store from
November 5-12, and while the huge overall positive impact on download
numbers is obvious, downloads also decreased markedly afterwards. They seem
to be coming back up a bit now, but we are still waiting for some more data
before making a final estimate of the overall effect, and have also
contacted Google to see if they can help us illuminate the mechanism behind
this apparent effect.
iOS: 4.69k/day (+2.2% from the previous week)
Download numbers from App Annie
Context (last three months):
No news here.
App user retention
Android: 14.8% (previous week: 15.2%)
(Ratio of app installs opened again 7 days after installation, among all
installed during the previous week. 1:100 sample)
Context (last three months):
iOS: 12.0% (previous week: 11.9%)
(Ratio of app installs opened again 7 days after installation, among all
installed during the previous week. From iTunes Connect, opt-in only = ca.
20-30% of all users)
Context (installation dates from October 18-November 15, 2015):
This metric was left out of last week’s report because of inconsistencies.
Indeed, Apple has since issued a correction notice
<http://www.talkingnewmedia.com/2015/11/24/apple-issues-corrected-itunes-con…>.
Unfortunately it looks like the data underlying the report for the week
until November 8 was affected too, so please disregard the iOS retention
figure given in that report.
Unique app users
Android: 1.190 million / day (-2.2% from the previous week)
Context (last three months):
This too will need another look.
iOS: 281k / day (+0.1% from the previous week)
Context (last three months):
No news here.
After publishing this report regularly for a bit over two months, we may be
rethinking the weekly publication schedule a little - also to keep the
balance between newsworthiness and maintaining general awareness of
long-term developments. In that vein, here are some inspiring quotes about
a weekly climate change newsletter
<http://www.niemanlab.org/2015/11/climate-change-is-depressing-and-horrible-…>
that begins every issue by reciting the current CO2 concentration in the
atmosphere as a KPI ;)
Ultimately, Meyer said, the newsletter comes out of the idea that “if
you’re worried about something, you should pay regular attention to it.”
“By paying attention to it over time, and watching its texture change over
time, you will come to have ideas about it,” he said. “You will come to
understand it in a new way, and you will contribute in a very small way to
how society addresses this big problem.”
[...]
So it seemed as if a newsletter might be a good way to cover the issue.
[...] “You can get a continuity of storyline,” Meyer said. “You can’t cover
all of everything that’s happening every week in the climate, but you can
watch certain parts develop, and hopefully bring people in over time.” He
leads off the “Macro Trends” section of each issue with the molecules per
million of carbon dioxide in the atmosphere:
The atmosphere is filling with greenhouse gases. The Mauna Loa Observatory
measured an average of 398.51 CO2 molecules per million in the atmosphere
this week. A year ago, it measured 395.84 ppm. Ten years ago, it measured
376.93 ppm.
“What we’re doing now won’t show up in that number for a decade or so,” he
said. “But by reminding myself of it every week, and thinking about its
contours and its direction, that’s a way to stay focused on what matters.”
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/w/index.php?title=Special:ListFiles&offset=20…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-16"
AND "2015-11-22";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) as date,
sum(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
SUM(view_count) AS allviews FROM wmf.projectview_hourly WHERE year=2015 AND
agent_type = 'user' GROUP BY year, month, day ORDER BY year, month, day
LIMIT 1000;
hive (wmf)> SELECT access_method, SUM(view_count)/7 FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-16"
AND "2015-11-22" GROUP BY access_method;
hive (wmf)> SELECT SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0))/SUM(view_count) FROM wmf.projectview_hourly WHERE
agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-16"
AND "2015-11-22";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")), SUM(view_count) AS
`all`, SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0)) AS Global_North_views FROM wmf.projectview_hourly
WHERE year = 2015 AND agent_type='user' GROUP BY year, month, day ORDER BY
year, month, day LIMIT 1000;
https://console.developers.google.com/storage/browser/pubsite_prod_rev_0281…
(“overview”)
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
SELECT LEFT(timestamp, 8) AS date, SUM(IF(event_appInstallAgeDays = 0, 1,
0)) AS day0_active, SUM(IF(event_appInstallAgeDays = 7, 1, 0)) AS
day7_active FROM log.MobileWikiAppDailyStats_12637385 WHERE timestamp LIKE
'201511%' AND userAgent LIKE '%-r-%' AND userAgent NOT LIKE '%Googlebot%'
GROUP BY date ORDER BY DATE;
(with the retention rate calculated as day7_active divided by day0_active
from seven days earlier, of course)
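For clarity, that last division can also be done in SQL, e.g. by joining
the result of the query above - abbreviated below as a hypothetical derived
table named daily with columns date, day0_active and day7_active - against
itself with a seven-day offset. Just a sketch:

SELECT d7.date,
       d7.day7_active / d0.day0_active AS retention_rate
FROM daily AS d7
JOIN daily AS d0
  ON STR_TO_DATE(d0.date, '%Y%m%d')
     = STR_TO_DATE(d7.date, '%Y%m%d') - INTERVAL 7 DAY;
-- daily stands in for the output of the retention query above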
https://analytics.itunes.apple.com/#/retention?app=324715238
hive (wmf)> SELECT SUM(IF(platform = 'Android',unique_count,0))/7 AS
avg_Android_DAU_last_week, SUM(IF(platform = 'iOS',unique_count,0))/7 AS
avg_iOS_DAU_last_week FROM wmf.mobile_apps_uniques_daily WHERE
CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20151116 AND
20151122;
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS Android_DAU FROM wmf.mobile_apps_uniques_daily
WHERE platform = 'Android';
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS iOS_DAU FROM wmf.mobile_apps_uniques_daily WHERE
platform = 'iOS';
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Amazing talk from one of the senior designers at Booking.com about using
flexbox to enhance responsiveness in layouts without using media queries
while maintaining compatibility with old browsers.
https://youtu.be/_98SE8WUvLk
37 minutes; a really practical and useful talk.
Hi,
I'm working on a project to improve the categorization of pictures in the
Upload to Commons Android app <https://phabricator.wikimedia.org/T115101>
with Nicolas_Raoul and Niedzielski as part of the Outreachy Dec '15
program. I'm very much just starting out, so I'd love any feedback or
suggestions that anyone might have.
We are planning on two major phases/releases for app updates, with Phase 1
enabling category suggestions based on the picture's location, and Phase 2
enabling more flexible category search results. It would be a huge help if
we had some volunteers to test these releases (planned for Jan and Feb
respectively), so it would be great if anyone could download the app <
https://play.google.com/store/apps/details?id=fr.free.nrw.commons> and try
it out. The source code and issues are stored on GitHub <
https://github.com/misaochan/apps-android-commons>.
Many thanks! :)
Josephine
Hi all,
here is the weekly look at our most important readership metrics, a bit
late this time.
As laid out earlier
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009773.html>,
the main purpose is to raise awareness about how these are developing, call
out the impact of any unusual events in the preceding week, and facilitate
thinking about core metrics in general. We are still iterating on the
presentation and eventually want to create dashboards for those which are
not already available in that form. Feedback and discussion welcome.
For readers of this report who haven’t already seen it, I’d like to mention
the exciting announcement
<https://lists.wikimedia.org/pipermail/analytics/2015-November/004529.html>
of the new pageview API for per-article readership metrics.
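(For illustration only - the article and date range below are arbitrary
examples, see the API documentation for the exact parameters - daily views
for a single article can be fetched from a URL of this shape:
https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/all-agents/Albert_Einstein/daily/2015110100/2015113000 )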
Now to the usual data. (All numbers below are averages for November 9-15,
2015 unless otherwise noted.)
Pageviews
Total: 540 million/day (+0.7% from the previous week)
Context (April 2015-November 2015):
(see also the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>)
Some may remember that back in September, this weekly report called
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009794.html>
out
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009785.html>
a “conspicuous 4.3% drop” in total pageviews during the week until Sept 20
(followed by another 0.7% decrease the following week). Well, last week the
Analytics team solved that mystery
<https://phabricator.wikimedia.org/T114379#1802927>: An improvement in
detection of web crawlers had caused many more pageviews to be classified
as non-human, from Sept 16 on (e.g. for Commons, estimated human traffic
dropped
<https://vital-signs.wmflabs.org/#projects=commonswiki/metrics=Pageviews>
from about 12 million to about 4 million per day).
Desktop: 57.5% (previous week: 57.5%)
Mobile web: 41.3% (previous week: 41.3%)
Apps: 1.2% (previous week: 1.2%)
Global North ratio: 77.6% of total pageviews (previous week: 77.5%)
Context (April 2015-November 2015):
New app installations
Android: 55.3k/day (-8.8% from the previous week)
Daily installs per device, from Google Play
Context (last month):
As already mentioned in last week’s report, the Android Wikipedia app got
featured in the "New and Updated Apps" section of the Google Play store on
November 5, enabled by the Android team’s recent update work and
facilitated by the Partnerships team. The promotion lasted one week and we
can now see that it was a huge success (with the effect on download numbers
much more clearly discernible than in the case of the “Back to School”
feature we discussed last month
<https://lists.wikimedia.org/pipermail/mobile-l/2015-October/009835.html>).
Predictably, uninstalls went up slightly too, but most of the new users
kept the app on their phone. What is a little concerning, though, is that
after the promotion, install numbers fell below the previous baseline, with
the install base even shrinking a tiny bit right afterwards. (One
possibility is that we are seeing some sort of depletion effect, due to
people who would have installed the app around this time anyway, but did so
earlier because of the promotion.) For that reason, we will wait a bit
longer before estimating the overall impact of this promotion.
iOS: 4.59k/day (+4.3% from the previous week)
Download numbers from App Annie
Context (last 12 months):
No big news here - things are back to normal after the App Store feature
last month.
App user retention
Android: 15.2% (previous week: 13.9%)
(Ratio of app installs opened again 7 days after installation, among all
installed during the previous week. 1:100 sample)
Context (last three months):
Recall that this metric lags one week behind, so to speak; i.e. the effects
of the Play Store promotion are not fully visible above yet (spoiler,
though, having already looked at a few more days of data: retention for
installs that came in during the promotion does not appear to have been
lower than usual, which is good news).
In general, this data is quite noisy due to the low (1:100) sample rate.
iOS: N/A
(Ratio of app installs opened again 7 days after installation, among all
installed during the previous week. From iTunes Connect, opt-in only = ca.
20-30% of all users)
Unfortunately I encountered some data quality issues with this metric this
week. I will investigate and report iOS retention again once this is sorted
out. (The numbers and charts provided in iTunes Connect App Analytics
appear to have changed quite a bit retroactively.)
Unique app users
Android: 1.217 million / day (+2.7% from the previous week)
Context (last three months):
A somewhat noticeable rise that could well be connected with the
aforementioned Play Store promotion, but still needs a closer look once
more data is in.
iOS: 281k / day (+0.2% from the previous week)
Context (last three months):
No news here.
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/w/index.php?title=Special:ListFiles&offset=20…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-09"
AND "2015-11-15";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) as date,
sum(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
SUM(view_count) AS allviews FROM wmf.projectview_hourly WHERE year=2015 AND
agent_type = 'user' GROUP BY year, month, day ORDER BY year, month, day
LIMIT 1000;
hive (wmf)> SELECT access_method, SUM(view_count)/7 FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-09"
AND "2015-11-15" GROUP BY access_method;
hive (wmf)> SELECT SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0))/SUM(view_count) FROM wmf.projectview_hourly WHERE
agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-09"
AND "2015-11-15";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")), SUM(view_count) AS
`all`, SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0)) AS Global_North_views FROM wmf.projectview_hourly
WHERE year = 2015 AND agent_type='user' GROUP BY year, month, day ORDER BY
year, month, day LIMIT 1000;
https://console.developers.google.com/storage/browser/pubsite_prod_rev_0281…
(“overview”)
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
SELECT LEFT(timestamp, 8) AS date, SUM(IF(event_appInstallAgeDays = 0, 1,
0)) AS day0_active, SUM(IF(event_appInstallAgeDays = 7, 1, 0)) AS
day7_active FROM log.MobileWikiAppDailyStats_12637385 WHERE timestamp LIKE
'201511%' AND userAgent LIKE '%-r-%' AND userAgent NOT LIKE '%Googlebot%'
GROUP BY date ORDER BY DATE;
(with the retention rate calculated as day7_active divided by day0_active
from seven days earlier, of course)
https://analytics.itunes.apple.com/#/retention?app=324715238
hive (wmf)> SELECT SUM(IF(platform = 'Android',unique_count,0))/7 AS
avg_Android_DAU_last_week, SUM(IF(platform = 'iOS',unique_count,0))/7 AS
avg_iOS_DAU_last_week FROM wmf.mobile_apps_uniques_daily WHERE
CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20151109 AND
20151115;
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS Android_DAU FROM wmf.mobile_apps_uniques_daily
WHERE platform = 'Android';
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS iOS_DAU FROM wmf.mobile_apps_uniques_daily WHERE
platform = 'iOS';
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Cross-posting - this is great!
---------- Forwarded message ----------
From: Antoine Musso <hashar+wmf(a)free.fr>
Date: Thu, Nov 19, 2015 at 5:19 AM
Subject: [Wikitech-l] CI and cross repository dependencies
To: wikitech-l(a)lists.wikimedia.org
Hello,
We often have the case of a change to an extension depending on a
pending patch to MediaWiki core. I upgraded our CI scheduler - Zuul - a
couple of weeks ago, and it now supports marking dependencies even across
different repositories.
Why does it matter? To make sure the dependency is fulfilled, one
usually either:
* CR-2 the patch until the dependent change is merged, or
* write a test that exercises the required patch in MediaWiki.
With the first solution (lack of a test), once both are merged, nothing
prevents one from cherry-picking a patch without its dependent patch,
for example for MediaWiki minor releases or Wikimedia deployment branches.
When a test covers the dependency, it will fail until the dependent one
is merged, which is rather annoying.
Zuul now recognizes the 'Depends-On' header in git commit messages, similar
to 'Change-Id' and 'Bug'. 'Depends-On' takes a change-id as its parameter,
and multiple ones can be added.
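For illustration, a commit message footer using it might look like this
(the subject, bug number, and change-ids below are all made up):

    Fix RTL rendering of search results

    Bug: T12345
    Depends-On: I0123456789abcdef0123456789abcdef01234567
    Change-Id: Iabcdef0123456789abcdef0123456789abcdef01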
When a patch is proposed in Gerrit, Zuul looks for Gerrit changes
matching the 'Depends-On' change-ids and verifies whether any are still
open. If so, it crafts references for the open patches so that all the
dependencies can be tested as if they had been merged.
Real world example
------------------
The ContentTranslation extension is tested together with the Wikidata one
and was not passing the test. Wikidata created a patch, and we did not want
to merge it until we had confirmed the ContentTranslation one passed
properly.
The Wikidata patch is https://gerrit.wikimedia.org/r/#/c/252227/
Change-Id: I0312c23628d706deb507b5534b868480945b6163
On ContentTranslation we indicated the dependency:
https://gerrit.wikimedia.org/r/#/c/252172/1..2//COMMIT_MSG
+ Depends-On: I0312c23628d706deb507b5534b868480945b6163
Which is the Wikidata patch.
Zuul:
* received the patch for ContentTranslation
* looked up the change-id and found the Wikidata patch
* created git references in both repos pointing to the proper patches
Jenkins:
* zuul-cloner cloned both repos and fetched the references created by
the Zuul service
* ran the tests
* SUCCESS
That confirmed to us that the Wikidata patch actually fixed the issue for
ContentTranslation. Hence we CR+2'd both and everything merged fine.
Please take a moment to read the upstream documentation:
http://docs.openstack.org/infra/zuul/gating.html#cross-repository-dependenc…
Wikidata/ContentTranslation task:
https://phabricator.wikimedia.org/T118263
--
Antoine "hashar" Musso
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l