Hi all,
here is the weekly look at our most important readership metrics, a bit
belatedly this time.
As laid out earlier
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009773.html>,
the main purpose is to raise awareness about how these are developing, call
out the impact of any unusual events in the preceding week, and facilitate
thinking about core metrics in general. We are still iterating on the
presentation and eventually want to create dashboards for those which are
not already available in that form. Feedback and discussion welcome.
For readers of this report who haven’t already seen it, I’d like to mention
the exciting announcement
<https://lists.wikimedia.org/pipermail/analytics/2015-November/004529.html>
of the new pageview API for per-article readership metrics.
Now to the usual data. (All numbers below are averages for November 9-15,
2015 unless otherwise noted.)
Pageviews
Total: 540 million/day (+0.7% from the previous week)
Context (April 2015-November 2015):
(see also the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>)
Some may remember that back in September, this weekly report called
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009794.html>
out
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009785.html>
a “conspicuous 4.3% drop” in total pageviews during the week until Sept 20
(followed by another 0.7% decrease the following week). Well, last week the
Analytics team solved that mystery
<https://phabricator.wikimedia.org/T114379#1802927>: An improvement in
detection of web crawlers had caused many more pageviews to be classified
as non-human from Sept 16 on (e.g. for Commons, estimated human traffic
dropped
<https://vital-signs.wmflabs.org/#projects=commonswiki/metrics=Pageviews>
from about 12 million to about 4 million per day).
Desktop: 57.5% (previous week: 57.5%)
Mobile web: 41.3% (previous week: 41.3%)
Apps: 1.2% (previous week: 1.2%)
Global North ratio: 77.6% of total pageviews (previous week: 77.5%)
Context (April 2015-November 2015):
New app installations
Android: 55.3k/day (-8.8% from the previous week)
Daily installs per device, from Google Play
Context (last month):
As already mentioned in last week’s report, the Android Wikipedia app got
featured in the "New and Updated Apps" section of the Google Play store on
November 5, enabled by the Android team’s recent update work and
facilitated by the Partnerships team. The promotion lasted one week and we
can now see that it was a huge success (with the effect on download numbers
much more clearly discernible than in the case of the “Back to School”
feature we discussed last month
<https://lists.wikimedia.org/pipermail/mobile-l/2015-October/009835.html>).
Predictably, uninstalls went up slightly too, but most of the new users
kept the app on their phone. What is a little concerning though is that
after the promotion, install numbers fell below the previous baseline, with
the install base even shrinking a tiny bit right afterwards. (One
possibility is that we are seeing some sort of depletion effect, due to
people who would have installed the app anyway around this time, but saw it
earlier due to the promotion.) For that reason, we will wait a bit longer
before estimating the overall impact of this promotion.
iOS: 4.59k/day (+4.3% from the previous week)
Download numbers from App Annie
Context (last 12 months):
No big news here - things are back to normal after the App Store feature
last month.
App user retention
Android: 15.2% (previous week: 13.9%)
(Ratio of app installs opened again 7 days after installation, among all
installed during the previous week. 1:100 sample)
Context (last three months):
Recall that this metric effectively lags one week behind, i.e. the effects
of the Play Store promotion are not fully visible above yet. (Spoiler,
though, having already looked at a few more days of data: retention for
installs that came in during the promotion does not appear to have been
lower than usual, which is good news.)
In general, this data is quite noisy due to the low (1:100) sample rate.
iOS: N/A
(Ratio of app installs opened again 7 days after installation, among all
installed during the previous week. From iTunes Connect, opt-in only = ca.
20-30% of all users)
Unfortunately I encountered some data quality issues with this metric this
week. I will investigate and report iOS retention again once this is sorted
out. (The numbers and charts provided in iTunes Connect App Analytics
appear to have changed quite a bit retroactively.)
Unique app users
Android: 1.217 million / day (+2.7% from the previous week)
Context (last three months):
A somewhat noticeable rise that could well be connected with the
aforementioned Play Store promotion, but still needs a closer look once
more data is in.
iOS: 281k / day (+0.2% from the previous week)
Context (last three months):
No news here.
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/w/index.php?title=Special:ListFiles&offset=20…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-09"
AND "2015-11-15";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) as date,
sum(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
SUM(view_count) AS allviews FROM wmf.projectview_hourly WHERE year=2015 AND
agent_type = 'user' GROUP BY year, month, day ORDER BY year, month, day
LIMIT 1000;
hive (wmf)> SELECT access_method, SUM(view_count)/7 FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-09"
AND "2015-11-15" GROUP BY access_method;
hive (wmf)> SELECT SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0))/SUM(view_count) FROM wmf.projectview_hourly WHERE
agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-09"
AND "2015-11-15";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")), SUM(view_count) AS
all, SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0)) AS Global_North_views FROM wmf.projectview_hourly
WHERE year = 2015 AND agent_type='user' GROUP BY year, month, day ORDER BY
year, month, day LIMIT 1000;
https://console.developers.google.com/storage/browser/pubsite_prod_rev_0281…
(“overview”)
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
SELECT LEFT(timestamp, 8) AS date, SUM(IF(event_appInstallAgeDays = 0, 1,
0)) AS day0_active, SUM(IF(event_appInstallAgeDays = 7, 1, 0)) AS
day7_active FROM log.MobileWikiAppDailyStats_12637385 WHERE timestamp LIKE
'201511%' AND userAgent LIKE '%-r-%' AND userAgent NOT LIKE '%Googlebot%'
GROUP BY date ORDER BY DATE;
(with the retention rate calculated as day7_active divided by day0_active
from seven days earlier, of course)
https://analytics.itunes.apple.com/#/retention?app=324715238
hive (wmf)> SELECT SUM(IF(platform = 'Android',unique_count,0))/7 AS
avg_Android_DAU_last_week, SUM(IF(platform = 'iOS',unique_count,0))/7 AS
avg_iOS_DAU_last_week FROM wmf.mobile_apps_uniques_daily WHERE
CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20151109 AND
20151115;
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS Android_DAU FROM wmf.mobile_apps_uniques_daily
WHERE platform = 'Android';
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS iOS_DAU FROM wmf.mobile_apps_uniques_daily WHERE
platform = 'iOS';
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Anything meaningful in the drop from #200 to #240 position in global
Alexa rank? http://www.alexa.com/siteinfo/commons.wikimedia.org
We know Alexa has many deficiencies, can HTTPS/HSTS have disrupted their
statistics?
Nemo
Dear Data Enthusiasts,
In collaboration with the Services team, the analytics team wishes to
announce a public Pageview API
<https://wikimedia.org/api/rest_v1/?doc#!/Pageviews_data/get_metrics_pagevie…>.
For an example of what kind of UIs someone could build with it, check out
this excellent demo <http://analytics.wmflabs.org/demo/pageview-api> (code)
<https://gist.github.com/marcelrf/49738d14116fd547fe6d#file-article-comparis…>
.
The API can tell you how many times a wiki article or project is viewed
over a certain period. You can break that down by views from web crawlers
or humans, and by desktop, mobile site, or mobile app. And you can find
the 1000 most viewed articles
<https://wikimedia.org/api/rest_v1/metrics/pageviews/top/es.wikipedia/all-ac…>
on any project, on any given day or month that we have data for. We
currently have data back through October and we will be able to go back to
May 2015 when the loading jobs are all done. For more information, take a
look at the user docs
<https://wikitech.wikimedia.org/wiki/Analytics/AQS/Pageview_API>.
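As a quick illustration of the per-article endpoint described in those
docs, here is a minimal Python sketch that builds a request URL (the
article name, date range, and helper function are my own example, not from
the announcement; fetch the URL with any HTTP client):

```python
# Build a request URL for the Pageview API's per-article endpoint.
# Path segments: project / access / agent / article / granularity / start / end
# (timestamps are YYYYMMDDHH; agent "user" excludes crawler traffic).

BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def per_article_url(project, article, start, end,
                    access="all-access", agent="user", granularity="daily"):
    # Article titles use underscores, as in wiki URLs.
    article = article.replace(" ", "_")
    return f"{BASE}/{project}/{access}/{agent}/{article}/{granularity}/{start}/{end}"

print(per_article_url("en.wikipedia", "Albert Einstein",
                      "2015110200", "2015110800"))
```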
After many requests from the community, we were really happy to finally
make this our top priority and get it done. Huge thanks to Gabriel, Marko,
Petr, and Eric from Services, Alexandros and all of Ops really, Henrik for
maintaining stats.grok, and, of course, the many community members who have
been so patient with us all this time.
The Research team’s Article Recommender tool <http://recommend.wmflabs.org/>
already uses the API to rank pages and determine relative importance. Wiki
Education Foundation’s dashboard <https://dashboard.wikiedu.org/> is going
to be using it to count how many times an article has been viewed since a
student edited it. And there are other grand plans for this data like
“article finder”, which will find low-rated articles with a lot of
pageviews; this can be used by editors looking for high-impact work. Join
the fun, we’re happy to help get you started and listen to your ideas.
Also, if you find bugs or want to suggest improvements, please create a
task in Phabricator and tag it with #Analytics-Backlog
<https://phabricator.wikimedia.org/tag/analytics-backlog/>.
So what’s next? We can think of too many directions to go in, for pageview
data and Wikimedia project data in general. We need to work with
you to make a great plan for the next few quarters. Please chime in here
<https://phabricator.wikimedia.org/T112956> with your needs.
Team Analytics
Hey!
As y'all may have seen, we have a new pageviews API, with much finer
granularity and better recall than the existing data. Since I had
advance notice of the release, I was able to put together an R client
already - you can get it at https://github.com/Ironholds/pageviews if
R is your language of choice, and it'll be up on CRAN shortly.
Thanks,
--
Oliver Keyes
Count Logula
Wikimedia Foundation
Agree with everyone else, this is great!
I just have a question. Is this an evolving thing in a sense that more
data sources will be used to define page views? Let me give an example.
Reading Web team is working on a new web app prototype that caches pages
which can be viewed without hitting the back end. Since no request is
made, no page view will be recorded. We may add an event logging event to
record a page view which would be another source of information for this
API end point.
Baha
On Tue, 17 Nov 2015 02:50:20 +0500, Dan Andreescu
<dandreescu(a)wikimedia.org> wrote:
> Dear Data Enthusiasts,
>
> In collaboration with the Services team, the analytics team wishes to
> announce a public Pageview API [...]
--
Baha
I've been working on book search at the Internet Archive, and I've
been using Wikipedia article titles and redirects as entities and
synonyms. I wanted to build autocomplete for this gizmo, so I
downloaded 7 days of pageviews for the en Wikipedia, and wrote
a tiny script to sum them up. It worked great!
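For readers curious what such a summing script can look like, here is a
minimal sketch (assuming the hourly dump format of space-separated
`project title count bytes` lines; the sample data below is hypothetical):

```python
# Sum hourly per-article pageview counts into one total per title.
from collections import Counter

def sum_pageviews(lines, project="en"):
    """lines: iterable of 'project title count bytes' strings."""
    totals = Counter()
    for line in lines:
        parts = line.split(" ")
        if len(parts) >= 3 and parts[0] == project:
            title, count = parts[1], parts[2]
            totals[title] += int(count)
    return totals

sample = [
    "en Quebec 3 1024",
    "en Quebec 5 2048",
    "en Quentin_Tarantino 7 4096",
    "de Quebec 2 512",  # different project, skipped
]
print(sum_pageviews(sample).most_common(2))
# [('Quebec', 8), ('Quentin_Tarantino', 7)]
```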
Here's the demo (currently live, will disappear eventually).
"number" is the pageviews count.
curl http://researcher3.fnf.archive.org:8080/autocomplete?q=Que | json_pp
{
"autocomplete" : [
{
"number" : 68310,
"label" : "Queen Victoria"
},
{
"number" : 53283,
"label" : "Quentin Tarantino"
},
{
"number" : 29192,
"label" : "Quebec"
},
{
"number" : 23717,
"label" : "Queen Elizabeth The Queen Mother"
},
{
"number" : 20500,
"label" : "Quetiapine"
}
]
}
Very interesting read (via Brandon Harris):
http://recode.net/2015/07/07/doing-something-about-the-impossible-problem-o…
"the vast majority of negative behavior ... did not originate from the
persistently negative online citizens; in fact, 87 percent of online
toxicity came from the neutral and positive citizens just having a bad day
here or there."
"... incidences of homophobia, sexism and racism ... have fallen to a
combined 2 percent of all games. Verbal abuse has dropped by more than 40
percent, and 91.6 percent of negative players change their act and never
commit another offense after just one reported penalty."
I have plenty of ideas for how to apply this to Wikipedia, and I am sure
Dario and his team do as well :) - and there is some opportunity for the
communities to use such results.
Cheers,
Denny
Team:
Please take a look at the MediaWiki API data needs; they made a nice wiki
page for us to understand what type of data they need.
https://www.mediawiki.org/wiki/User:BDavis_%28WMF%29/Projects/Action_API_re…
We already talked with them about using our user_agent data in the wmf
table so they can start on those reports right away, so you might see some
Oozie CRs in that regard. Please keep in mind that API folks need raw user
agents (as every API client should have a unique one) rather than
processed ones.
Thanks,
Nuria
Hi all,
here is the weekly look at our most important readership metrics (CCing
Wikitech-l too this time).
As laid out earlier
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009773.html>,
the main purpose is to raise awareness about how these are developing, call
out the impact of any unusual events in the preceding week, and facilitate
thinking about core metrics in general. We are still iterating on the
presentation and eventually want to create dashboards for those which are
not already available in that form. Feedback and discussion
continue to be welcome.
As it might be of interest for readers of this report who haven’t already
seen the news on Analytics-l or Wikitech-l, I’d like to mention the
exciting news that the monthly pageview data on Wikistats
<https://stats.wikimedia.org/EN/TablesPageViewsMonthlyCombined.htm> has
been transitioned to the new pageview definition
<https://lists.wikimedia.org/pipermail/analytics/2015-November/004502.html>.
Now to the usual data, this time introducing one new metric as well.
(All numbers below are averages for November 2-8, 2015 unless otherwise
noted.)
Pageviews
Total: 536 million/day (+2.2% from the previous week)
Context (April 2015-November 2015):
We more than reversed the -1.5% drop from last week, yay!
(See also the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>)
Desktop: 57.5%
Mobile web: 41.3%
Apps: 1.2%
Global North ratio: 77.5% of total pageviews (previous week: 77.1%)
Context (April 2015-November 2015):
New app installations
Android: 60.6k/day (+63.2% from the previous week)
Daily installs per device, from Google Play
Context (last three months):
On November 5, the app started to get featured in the "New and Updated
Apps" section of the Google Play store, enabled by the Android team’s
recent update work. The effect is already clearly visible here; we’ll have
a fuller view of the impact in the next report after the placement ends
today.
iOS: 4.41k/day (+11.2% from the previous week)
Download numbers from App Annie
Context (September 2014-September 2015):
Things are back to normal after the iOS app had been featured in the App
Store in mid-October. (Much of the 11.2% rise over the preceding week can
be tied to the - still unexplained - drop on Oct 30.)
App user retention
With this issue of the report, we’re adding a new metric that should be
more directly tied to how new users perceive the quality and usefulness of
the apps. Day-7 retention (D7) is defined as the proportion of users who
used the app again on the seventh day after they first opened it. The iOS
team has set themselves a quarterly goal to bring this “stickiness” metric
to at least 15% with their 5.0 update
<https://commons.wikimedia.org/wiki/File:IOS_Wikipedia_App_5.0_Update.pdf>
(p.5 of that doc contains some further context on this metric and links on
how it is perceived elsewhere in the industry; the following post is also
useful for perspective: “losing 80% of mobile users is normal, and why the
best apps do better”
<http://andrewchen.co/new-data-shows-why-losing-80-of-your-mobile-users-is-n…>).
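The definition above can be sketched as code (a minimal Python
illustration with hypothetical counts: day-7 actives on a given day are
divided by the day-0 cohort from seven days earlier):

```python
# Sketch of the D7 retention calculation: retention on day d is
# day7_active[d] / day0_active[d - 7], i.e. the users active on the
# seventh day after install, divided by the cohort that first opened
# the app seven days earlier.

def d7_retention(daily_counts):
    """daily_counts: list of (day0_active, day7_active) tuples,
    one per calendar day, in chronological order."""
    rates = []
    for i in range(7, len(daily_counts)):
        day0_cohort = daily_counts[i - 7][0]
        day7_today = daily_counts[i][1]
        rates.append(day7_today / day0_cohort if day0_cohort else None)
    return rates

# Example: 1000 installs first opened on a given day; a week later,
# 139 of that cohort are active again -> 13.9% retention.
counts = [(1000, 0)] * 7 + [(1200, 139)]
print(d7_retention(counts))  # [0.139]
```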
Android: 13.9% (previous week: 11.5%)
(1:100 sample)
Context (last three months):
iOS: 13.1% (previous week: 10.6%)
(from iTunes Connect, opt-in only = ca. 20-30% of all users)
Context (October 11-November 8, 2015):
Unique app users
Android: 1.185 million / day (+2.0% from the previous week)
Context (last three months):
There are already signs of a small but discernible rise in active users due
to the app being featured, but we’ll need to wait until later to fully
assess this.
iOS: 280k / day (+1.0% from the previous week)
Context (last three months):
No news here
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/wiki/Category:Wikimedia_readership_metrics_re…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-02"
AND "2015-11-08";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) as date,
sum(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
SUM(view_count) AS allviews FROM wmf.projectview_hourly WHERE year=2015 AND
agent_type = 'user' GROUP BY year, month, day ORDER BY year, month, day
LIMIT 1000;
hive (wmf)> SELECT access_method, SUM(view_count)/7 FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-02"
AND "2015-11-08" GROUP BY access_method;
hive (wmf)> SELECT SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0))/SUM(view_count) FROM wmf.projectview_hourly WHERE
agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-02"
AND "2015-11-08";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")), SUM(view_count) AS
all, SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0)) AS Global_North_views FROM wmf.projectview_hourly
WHERE year = 2015 AND agent_type='user' GROUP BY year, month, day ORDER BY
year, month, day LIMIT 1000;
https://console.developers.google.com/storage/browser/pubsite_prod_rev_0281…
(“overview”)
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
SELECT LEFT(timestamp, 8) AS date, SUM(IF(event_appInstallAgeDays = 0, 1,
0)) AS day0_active, SUM(IF(event_appInstallAgeDays = 7, 1, 0)) AS
day7_active FROM log.MobileWikiAppDailyStats_12637385 WHERE timestamp LIKE
'201510%' AND userAgent LIKE '%-r-%' AND userAgent NOT LIKE '%Googlebot%'
GROUP BY date ORDER BY DATE;
(with the retention rate calculated as day7_active divided by day0_active
from seven days earlier, of course)
https://analytics.itunes.apple.com/#/retention?app=324715238
hive (wmf)> SELECT SUM(IF(platform = 'Android',unique_count,0))/7 AS
avg_Android_DAU_last_week, SUM(IF(platform = 'iOS',unique_count,0))/7 AS
avg_iOS_DAU_last_week FROM wmf.mobile_apps_uniques_daily WHERE
CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20151102 AND
20151108;
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS Android_DAU FROM wmf.mobile_apps_uniques_daily
WHERE platform = 'Android';
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS iOS_DAU FROM wmf.mobile_apps_uniques_daily WHERE
platform = 'iOS';
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB