Hi Team,
I just wanted to update you on the results of something we internally
referred to as the 'browse' prototype.
TL;DR: as implemented, the mobile 'browse by category' test did not drive
significant engagement; in fact, it seemed inferior to blue links. However,
we started with a very rough, low-impact prototype, so a few tweaks would
give us more definitive results.
Here is the doc I am pasting from below:
https://docs.google.com/document/d/1Mqw-awAcp01IcLhHPsHmWsqaAyK1l2-w_LMDtiz…
Questions/comments welcome!
Best,
J
Browse Prototype Results
Contents:
- Intro
- Process
- Results
- Blue links in general
- Category tags
- Conclusion and Next Steps
- Process
- Do people want to browse by categories?
Intro
As outlined in this doc
<https://docs.google.com/presentation/d/1ZssE8G0P5WVg8XmkBTi5G3n4OdLHPFGWZDZ…>,
the concept is a tag that allows readers to navigate WP via categories that
are meaningful and populated in order of 'significance' (as determined by
user input). The hypothesis:
- users will want to navigate by category if there are fewer, more
meaningful categories per page and those category pages show the most
'notable' members first.
Again, see the full doc
<https://docs.google.com/presentation/d/1ZssE8G0P5WVg8XmkBTi5G3n4OdLHPFGWZDZ…>
to understand the premise.
Process
The first step was to validate: do users want to navigate via category? So
we built a very lightweight prototype on mobile web, en.wikipedia (stable,
not beta), using hardcoded config variables, in the following categories
(~4000 pages). Here we did not look into sub-categories, with one exception
(see T94732 <https://phabricator.wikimedia.org/T94732> for details). There
was also an error, and 2 of the categories did not have tags implemented
(struck through, below).
Category                               | Pagecount
NBA All Stars                          | 400
American Politicians                   | 818
Object-Oriented Programming Languages  | 164
European States                        | 24
American Female Pop Singers            | 326
American drama television series       | 1048
Modern Painters                        | 983
Landmarks in San Francisco, California | 270
Here is how it appeared on the Alcatraz page.
When the user clicked the tag, they were taken to a Gather-like collection
based on manually estimated relevance (sorry, cropped shot).
The category pages were designed to show the entries most relevant (as
deemed by me) to the broadest audience first. Here is the ordering:
https://docs.google.com/spreadsheets/d/12xLXQsH1zcg6E8lDuSonumZNdBvfaBuHOS1…
This was intended to stand in contrast with our current category pages,
which are alphabetical and not really intended for human browsing:
https://en.wikipedia.org/wiki/Category:American_male_film_actors
We primarily measured a few things:
- when a tag was seen by a user
- when a tag was clicked on by a user
- when a page in the new ‘category view’ was clicked on by a user
As a side effort, I looked to see if overall referrals from pages with tags
went up. This was a timed intervention rather than an A/B test, and given
the click-through on the tags, the impact would have been negligible
anyway. This was confirmed by some very noisy results.
Results
Blue links in general
One benefit of the side study mentioned in the previous paragraph is that I
was able to generate a table for the pages in question, from before we
started the test, showing the ratio of pageviews referred by a page to
total pageviews (an estimate of how many links were opened from that page).
Though it covers literally just 0-1 GMT on 6/29/15, now that we have the
pageview_hourly table, a more robust analysis can tell us how categories
differ in this regard:
Category                                           | links clicked | #pvs  | clicks/pvs
Category:20th-century American politicians         | 761           | 1243  | 61%
Category:American drama television series          | 5981          | 8844  | 68%
Category:American female pop singers               | 2502          | 4280  | 58%
Category:Landmarks in San Francisco,               | 104           | 287   | 36%
Category:Modern painters                           | 136           | 369   | 37%
Category:National Basketball Association All-Stars | 1908          | 3341  | 57%
Category:Object-oriented programming languages     | 48            | 181   | 27%
Category:Western Europe                            | 657           | 1221  | 54%
Grand Total                                        | 12099         | 19766 | 50%
You can see here that for pages in the category ‘Landmarks in San
Francisco’, if there are 10 pageviews, about 3.6 clicks to other pages are
generated on average.
I do not have the original queries for this handy, but can dig them up if
you’re really interested.
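The clicks/pvs column above is just referred clicks divided by pageviews. A minimal sketch (the helper function and data layout are mine; the numbers are from the table):

```python
# Recompute the clicks/pvs column of the table above.
# The helper and data structure are illustrative, not the original queries.

def click_ratio(links_clicked, pageviews):
    """Estimated fraction of pageviews that opened a link from the page."""
    return links_clicked / pageviews

rows = {
    "Category:20th-century American politicians": (761, 1243),
    "Category:American drama television series": (5981, 8844),
    "Category:Landmarks in San Francisco,": (104, 287),
}

for category, (clicks, pvs) in rows.items():
    print(f"{category}: {click_ratio(clicks, pvs):.0%}")
```

This reproduces the 61%, 68%, and 36% figures from the corresponding table rows.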
Category tags
Full data and queries here:
https://docs.google.com/a/wikimedia.org/spreadsheets/d/1vD3DopxGyeh9FQsuTQD…
The tags themselves generated an average click-through rate of 0.18%.
Given that the overall click-through rate on these pages was estimated
above at ~50%, this single tag is not driving anything significant. Leila
and Bob’s paper suggests that this is performing no better than a
mid-article click; however, given that mobile web sections are collapsed, I
would need to understand more about their method to know just how to
interpret their results against our mobile-web-only implementation.
Furthermore, our click-through rate used the number of times the tag
appeared on screen as the denominator, whereas their research looked at
overall pageviews.
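To make that denominator difference concrete (all numbers below are hypothetical except the 0.18% figure):

```python
# Hypothetical illustration: the same number of clicks yields a different
# "CTR" depending on whether on-screen impressions or total pageviews are
# used as the denominator.

def rate(clicks, denominator):
    return clicks / denominator

clicks = 18
impressions = 10_000  # times the tag actually appeared on screen
pageviews = 25_000    # total views of the pages carrying the tag

print(f"per impression: {rate(clicks, impressions):.2%}")  # 0.18%
print(f"per pageview:   {rate(clicks, pageviews):.2%}")    # 0.07%
```

Since a tag only counts as an impression once it scrolls on screen, the per-impression rate is always at least as high as the per-pageview rate, so the two studies are not directly comparable.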
This being noted, the tag was implemented to be as obscure as possible to
establish a baseline. Furthermore, any feature like this would probably be
different in the following ways:
- each page would be in 1-4 tag groups (as opposed to just 1)
- each page would be tagged, creating the expectation on the part of the
user that this was something to look for
- presumably the categories could be implemented as a menu item as opposed
to being buried at the bottom of the page (and competing with features like
‘read more’)
- based on the learnings from ‘read more’, tags with images or buttons
would likely fare much better
The following graph shows:
- number of impressions on the right axis
- click-through rate on the left axis
When you look at click-through rates on the ‘category’ pages themselves,
you see that they average 41% (chart below), meaning that for every 10
times a user visited a category page, there were 4.1 clicks to one of those
pages as a result.
Here is the same data, broken out by category:
Each ‘category’ page here had at least 400 visits, and you can see that the
interest seems to vary dramatically across categories. It is worth noting
that the top three categories here are the ones with the fewest entities.
Each list, however, was capped at ~50 articles, so it is unclear what might
be causing this effect, if it is real.
As mentioned above, the average article page has an overall click rate of
50%, so this page of categories did not match the click-through rate of an
article page. However, this page had summaries of each of the pages, so it
could be that users were getting value beyond what a blue link would
provide. A live-user test of Gather collections, from which this format was
borrowed, suggested that the format used up too much vertical space per
article and was hard to flip through. Shortening the amount of text or
image space might be something to try to make the page more useful.
Conclusion and Next Steps
Process
- This was the first time I am aware of that we ran a live prototype and
learned something without building a scalable solution. Win.
- Developer time was estimated at 1 FTE for 2 weeks (by pheudx), but the
chronological time for pushing to stable took a quarter. Room for
improvement.
- The time to analysis was almost 2 quarters, due to a lack of data
analysis support (I ran the initial analysis within 2 weeks of launch,
during paternity leave, but was unable to go back and get it ready to
distribute for 3 months). Room for improvement; possibly solved by an
additional Data Analyst.
This experiment was not designed to answer questions definitively in one
round, but with the understanding that multiple iterations would allow us
to fully answer our questions.
The long turn-around time, particularly around analysis and communication,
meant that tweaking a variable to test the conclusions or the new questions
that arose below will involve a whole lot more work and effort than if we
had been able to explore modifications within a few weeks of the initial
launch.
Do people want to browse by categories?
Category tags at the bottom of the mobile web page in a dull gray
background that lead to manually curated categories are not a killer
feature :)
I would be reluctant to say that this means users are not interested in
browsing by category, however. For instance, it is likely that:
- users did not notice the tag, even if it appeared on screen
- users are accustomed to our current category tags on desktop and not
interested in that experience
- users who did like the tag were unlikely to find another page that had
it; there was no feedback mechanism by which the improved category page
would drive additional tag interactions
- the browse experience created was not ideal
If we decide to pursue what is currently termed “cascade c: update ux”, I
would like to proceed with more tests in this arena, by altering the
appearance and position of the tags, and by improving the flow of the
‘category’ pages. If we choose a different strategy, hopefully other teams
can build off of what was learned here.
Hi all,
here is the weekly look at our most important readership metrics. Last
week’s report was skipped for various reasons (one of them being that there
really wasn’t much news in the week-over-week data then), but for now we’re
still trying to keep this regular schedule.
As laid out earlier, the main purpose is to raise awareness about how these
are developing, call out the impact of any unusual events in the preceding
week, and facilitate thinking about core metrics in general.
We are still iterating on the presentation (e.g. to better take seasonality
into account, in particular including year-over-year comparisons) and
eventually want to create dashboards for those that are not already
available in that form. Feedback and discussion welcome.
Some other recent items of interest:
- There is a new dissertation containing a chapter/paper about forecasting
the time series of Wikimedia pageviews. I wrote up a review/summary for the
monthly research newsletter: “Predicting Wikimedia pageviews with 2%
accuracy”
<https://meta.wikimedia.org/wiki/Research:Newsletter/2015/September#Predicti…>.
(NB: that low error rate refers to a timespan of one week.)
- Like the whole of WMF, the Reading team held its quarterly review
meeting last week; the presentation slide deck
<https://commons.wikimedia.org/wiki/File:WMF_Reading_Quarterly_Review_Q1_201…>
contains a lot of readership metrics too.
- In the wake of the “Google kills Wikipedia” media stories some weeks ago
(which Oliver tried to rebut based on our internal data), one SEO blogger
argues that actually “Google Still Loves Wikipedia (More Than Its Own
Properties)”
<https://www.stonetemple.com/google-still-loves-wikipedia-more-than-its-own-…>,
although “Wikipedia did slide a bit in the rankings” since April, which
according to him is “the reason for the Wikipedia traffic loss”. The latter
seems quite speculative to me, but the ranking slide, while tiny, appears
to be based on a sufficiently large dataset.
Now to the usual data. (All numbers below are averages for October 5-11,
2015 unless otherwise noted.)
Pageviews
Total: 524 million/day (+1.8% from the previous week)
week until Oct 4: 514 million/day (-0.9% from the previous week)
Context (April 2015-October 2015):
Total pageviews increased for the first time again this week, after having
fallen for three weeks in a row. (See also the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>.)
Desktop: 57.6%
Mobile web: 41.2%
Apps: 1.2%
Global North ratio: 77.0% of total pageviews
Context (April 2015-October 2015):
Unique app users
Android: 1.160 million/day (+1% from the previous week)
week until Oct 4: 1.148 million/day (-0.9% from the previous week)
Context (January 2015-October 2015):
Note: Due to a typo in the calculation, the average daily unique app users
numbers for both iOS and Android were too high in the report for the week
until September 27. It should have said 1.158 million for Android and 280k
for iOS. Also, due to a separate issue (data corruption in the underlying
database, https://phabricator.wikimedia.org/T114406 ) the charts plotting
these two metrics were off for some earlier days in August and September,
affecting all the weekly reports so far. I’m now posting this report with
guesstimated data while the Analytics team is rolling out a fix. Does not
seem to change the big picture in any way, but we want to get this right ;)
iOS: 278k/day (+1.5% from the previous week)
week until Oct 4: 274k / day (-2.5% from the previous week)
Context (January 2015-October 2015):
In the aforementioned quarterly review (minutes still to be posted), Lila
asked about an apparent
<https://commons.wikimedia.org/w/index.php?title=File%3AWMF_Reading_Quarterl…>
worsening of the app store ratings in September (after bugs had cost us
users earlier this year already). Josh and I looked a bit through reviews
from that time. There were several users complaining about seeing the UI in
the wrong language and some of them connect that bug with the iOS 9 upgrade
(which rolled out from September 16). That might have something to do with
the decrease in the above chart in recent weeks - we don’t know for sure
though.
New app installations
Android: 39,461/day (-7.1% from the previous week)
(Daily installs per device, from Google Play)
week until Oct 4: 42,482/day (-0.07% from the previous week)
Context (July-October 2015):
Showing this in a different context this time, compared to the number of
uninstalls (#devices where the app was removed, 28,742/day on average last
week), and zooming in on the last three months instead of a full year.
We didn’t discuss it earlier because it happened before the start of these
reports, but there was a notable rise in installs from August 20 to August
23, largely sustained afterwards, and not offset by an equally large rise
in uninstalls. The reason is not clear - putting it out here in case anyone
has further insights - but it’s tantalizingly close to the dates where the
app was part of a “Back to School” promotion in the Play store (which the
WMF Partnerships team worked on with Google): August 13-20 and August
27-September 2, in the US only. So the dates don’t quite match, but our
contact at Google pointed out that there could still be a causal
connection, e.g. because of a third-party social media post inspired by the
campaign some days earlier, or due to residual effects from the “Back to
School” collection’s overall success (e.g. if a user downloaded another
similar app within the collection, our app would be more likely to be
recommended to them in the future since the algorithm recognizes similar
apps).
iOS: 4,523/day (-2.4% from the previous week)
(download numbers from App Annie)
week until Oct 4: 4,633/day (+0.7% from the previous week)
Context (July-October 2015):
Switching to a three-month view here too this time. No source of uninstall
data for iOS that I’m aware of.
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/w/index.php?title=Special:ListFiles&offset=20…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-10-05"
AND "2015-10-11";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) as date,
sum(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
SUM(view_count) AS allviews FROM wmf.projectview_hourly WHERE year=2015 AND
agent_type = 'user' GROUP BY year, month, day ORDER BY year, month, day
LIMIT 1000;
hive (wmf)> SELECT access_method, SUM(view_count)/7 FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-10-05"
AND "2015-10-11" GROUP BY access_method;
hive (wmf)> SELECT SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0))/SUM(view_count) FROM wmf.projectview_hourly WHERE
agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-10-05"
AND "2015-10-11";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")), SUM(view_count) AS
all, SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0)) AS Global_North_views FROM wmf.projectview_hourly
WHERE year = 2015 AND agent_type='user' GROUP BY year, month, day ORDER BY
year, month, day LIMIT 1000;
hive (wmf)> SELECT SUM(IF(platform = 'Android',unique_count,0))/7 AS
avg_Android_DAU_last_week, SUM(IF(platform = 'iOS',unique_count,0))/7 AS
avg_iOS_DAU_last_week FROM wmf.mobile_apps_uniques_daily WHERE
CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20151005 AND
20151011;
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS Android_DAU FROM wmf.mobile_apps_uniques_daily
WHERE platform = 'Android';
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS iOS_DAU FROM wmf.mobile_apps_uniques_daily WHERE
platform = 'iOS';
https://play.google.com/apps/publish/?dev_acc=02812522755211381933#StatsPla…
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Hello everyone,
with this e-mail I want to inform you that, with change I20e46165fb76 [1],
we removed a long-lived hook in MobileFrontend called EnableMobileModules.
It was added to allow extensions to add modules to OutputPage only if the
page was recognized as a request for "mobile"-friendly output (a mobile
device, or the "Mobile/Desktop" switch at the bottom of every page). The
hook was deprecated with a notice a long time ago and has now been
completely removed.
Although it seems that the hook wasn't used in any extension hosted in
Wikimedia's git, I want to give some hints on how to achieve what this
removed hook was used for, for extensions that may not be hosted in
Wikimedia's git.
== Register mobile (only) modules ==
If you used the EnableMobileModules hook to register a module only if the
page is a "mobile" page, you should migrate to the ResourceLoader. It
provides an easy way to specify the target of a module (as you might know
already). So you can define the module with the mobile target only
('targets' => array( 'mobile' )) and add the module through the
BeforePageDisplay hook. If you need to invest some time in this migration,
please allow me to mention that it could also be a good step to think about
the size of the modules you want to (or already) load for the mobile page
:) (Think of people's data plans and network speed :P)
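A rough sketch of the two steps (the extension, module, and file names are placeholders, and this follows 2015-era MediaWiki conventions; double-check against the current ResourceLoader documentation):

```php
// Hypothetical extension setup file; all names are placeholders.

// 1) Define the module with the 'mobile' target only:
$wgResourceModules['ext.myExtension.mobile'] = array(
	'scripts' => array( 'resources/ext.myExtension.mobile.js' ),
	'targets' => array( 'mobile' ),
	'localBasePath' => __DIR__,
	'remoteExtPath' => 'MyExtension',
);

// 2) Add it via BeforePageDisplay instead of EnableMobileModules;
// ResourceLoader will only deliver it to the mobile site because of
// the 'targets' setting above:
$wgHooks['BeforePageDisplay'][] = function ( OutputPage $out, Skin $skin ) {
	$out->addModules( 'ext.myExtension.mobile' );
	return true;
};
```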
== Run code in mobile mode only ==
If you need to run some code only in the "mobile page" context (only if a
page is requested in mobile mode), you may have used the
EnableMobileModules hook (even though it wasn't intended for this use
case). Going forward, you can use the BeforePageDisplayMobile hook, which
has already been running alongside EnableMobileModules for a while. It's
nearly a copy of the BeforePageDisplay hook (with the same signature and
behavior), but it's only executed in mobile mode.
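A minimal sketch of that replacement (the module name is a placeholder; the signature mirrors BeforePageDisplay as described above):

```php
// Hypothetical example: code that should run only for mobile page views.
$wgHooks['BeforePageDisplayMobile'][] = function ( OutputPage $out, Skin $skin ) {
	// Same signature as BeforePageDisplay, but only fired in mobile mode.
	$out->addModules( 'ext.myExtension.mobileTweaks' );
	return true;
};
```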
If you have any questions about how to migrate some code or some other
mobile (code) related things, feel free to e-mail mobile-l or ask in
#wikimedia-mobile!
Have a nice weekend!
Best,
Florian
[1] https://gerrit.wikimedia.org/r/#/c/243226/6
Hi,
I'm Josephine, an Outreachy applicant who's hoping to work on the Upload to
Commons Android app project -
https://phabricator.wikimedia.org/T114358#1701261
With the guidance of Nicolas_Raoul (co-mentor for this project), I have
started contributing to the app on GitHub and have completed the microtask.
However, we are still needing one more mentor for this project.
If anyone is willing to help out as a mentor or point me to someone who
would, I'd hugely appreciate it.
--
Regards,
Josephine
Hi
This was a long-standing feature request and not the easiest one to
implement; that's why the Kiwix development team is pretty proud to
announce the release of Kiwix for iOS v1.0.
With Kiwix for iOS you can easily download and read recent snapshots (with
or without pictures) of all Wikimedia projects on your iPhone/iPad.
This first version of the app provides basic functionality such as:
* ZIM reader
* Navigation through articles
* Search based on article title
* Content library/downloader
* Bookmarks
* Browse history
* iTunes sync (already in beta testing for v1.1)
The app has been developed over the last 18 months by Chris, a New
York-based developer. He was (a bit) mentored by an older member of the
Kiwix dev team, and the project was supported by Wikimedia CH for the
administrative work with Apple and the coverage of mandatory hardware
costs.
With this last stone, Kiwix, which is already available for Windows, OS X,
Linux and Android, has reached a milestone. 10 years after the project's
creation, Kiwix is a cutting-edge solution for accessing Wikipedia (and a
lot more) offline, on most computers, whatever the platform.
Download Kiwix for iOS on iTunes:
https://itunes.apple.com/us/app/kiwix/id997079563
Have a look at the source code:
https://github.com/kiwix/iOS
Regards
Emmanuel
--
Kiwix - Wikipedia Offline & more
* Web: http://www.kiwix.org
* Twitter: https://twitter.com/KiwixOffline
* more: http://www.kiwix.org/wiki/Communication
Sharing with mobile-l.
-Sam
---------- Forwarded message ----------
From: Sam Smith <samsmith(a)wikimedia.org>
Date: Thu, Oct 1, 2015 at 9:29 AM
Subject: Re: Mobile event log sampling
To: Bernd Sitzmann <bernd(a)wikimedia.org>
Cc: Neil Quinn <nquinn(a)wikimedia.org>, Jon Katz <jkatz(a)wikimedia.org>
I can confirm that MobileWebEditing (now Edit [0]) wasn't (isn't) sampled
either.
-Sam
[0]
https://github.com/wikimedia/mediawiki-extensions-MobileFrontend/commit/c0a…
On Wed, Sep 30, 2015 at 10:09 PM, Bernd Sitzmann <bernd(a)wikimedia.org>
wrote:
> Correct, MobileWikiAppEdit
> <https://meta.wikimedia.org/wiki/Schema:MobileWikiAppEdit> is not
> sampled. See [1] and compare to sample parameters used for other schemas,
> like [2].
>
> Cheers,
> Bernd
>
> [1]
> https://github.com/wikimedia/apps-android-wikipedia/blob/master/app/src/mai…
> [2]
> https://github.com/wikimedia/apps-android-wikipedia/blob/master/app/src/mai…
>
> On Wed, Sep 30, 2015 at 1:07 PM, Jon Katz <jkatz(a)wikimedia.org> wrote:
>
>> Hey folks,
>> Can you answer Neil's question for apps and web respectively? You are the
>> tech-owners of this. Moving forward, I think the sampling rate should be
>> documented on the discussion page.
>> Thanks,
>>
>> J
>> ---------- Forwarded message ----------
>> From: Neil P. Quinn <nquinn(a)wikimedia.org>
>> Date: Wed, Sep 30, 2015 at 11:32 AM
>> Subject: Mobile event log sampling
>> To: Jon Katz <jkatz(a)wikimedia.org>
>>
>>
>> Jon,
>>
>> Just to confirm: the MobileWikiAppEdit
>> <https://meta.wikimedia.org/wiki/Schema:MobileWikiAppEdit> and
>> MobileWebEditing
>> <https://meta.wikimedia.org/wiki/Schema:MobileWebEditing> logs are not
>> sampled, right? (Asking you because you're listed as a maintainer for both).
>>
>> Thanks!
>>
>> --
>> Neil P. Quinn <https://meta.wikimedia.org/wiki/User:Neil_P._Quinn-WMF>,
>> product analyst
>> Wikimedia Foundation
>>
>>
>
Sharing this on mobile-l.
-Adam
---------- Forwarded message ----------
From: C. Scott Ananian <cananian(a)wikimedia.org>
Date: Wed, Sep 30, 2015 at 12:14 PM
Subject: [Design] The page design of our dreams
To: Wikimedia designers <design(a)lists.wikimedia.org>
Let's revisit the basic way that Mediawiki lays out media and content.
It has worked well enough for twenty years, but perhaps we can do
better.
In particular, I would like to be able to (a) make Wikimedia projects
look Really Beautiful (b) on a variety of different devices and
formats.
Mobile and print are the forerunners here: in both cases we'd like
more flexibility to lay out infoboxes, media, tables, and content in
not-strictly-linear ways:
1) We'd like to be able to tag lead images, and use them more
generally (backgrounds for page titles, previews, etc)
2) Infoboxes, references, footnotes, etc want to be untethered from
their source location in the content and moved around -- for example,
to sidebars or popups on mobile; to the footer or dedicated pages in
print.
3) We would like to be able to crop and scale images better, but need
focal point information or a box around critical regions of the image.
(If the article is about the sun, and the photo is of a sunny day,
cropping the sun out would be bad. Other images have critical
features at the edges of the image we don't want to lose.) We
currently have a single option "thumb", and a single user-specified
scaling factor, meant for accessibility --- but an accessible size
will differ on different devices, and the scaling factor doesn't apply
to all images, only to those using "thumb".
4) We need more semantic information about images in order to make
better layouts: in print, is this a "wide" image appropriate for
spanning across multiple columns, or a "feature" image appropriate for
having a page to itself? Is this a meaningful parallel grouping of
images (ie, before and after) which shouldn't be broken up (but could
be arranged either horizontally or vertically, or perhaps with a
slider)? Should this image be laid out inline (rarely) or can it
float to a more aesthetic location?
5) Even text content might be unmoored -- why can't we have pull
quotes or sidebars in our articles?
6) What else? What other features of magazine, newspaper, or
encyclopedia design are we missing?
From a technical perspective, I'd like to move eventually toward a
system with greater separation of layout and content (think of
something like Adobe PageMaker), where changes to layout can be made
without editing the article text. But I'd also like to make sure that
the technical issues don't overshadow the actual goal:
* What beautiful designs would you like for article content?
* What tools could we build to enable these designs?
Eventually we'd like to boil this down into a concrete design for a
better image styling system, which seems like a reasonable first step
in revamping what mediawiki can do for designers. That RFC is
https://phabricator.wikimedia.org/T90914 -- ideally the RFC will be
guided by a concrete design for a specific article, say,
http://en.wikipedia.beta.wmflabs.org/wiki/Moon, so that the
implementation of the RFC can focus on building the capabilities
needed to execute that specific design. That way we're certain we're
building something useful and beautiful for designers and readers, not
just implementing something whose PHP code seems elegant.
--scott
--
(http://cscott.net)
_______________________________________________
Design mailing list
Design(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/design
Hi all,
here is the weekly look at our most important readership metrics.
As laid out last week, the main purpose is to raise awareness about how
these are developing, call out the impact of any unusual events in the
preceding week, and facilitate thinking about core metrics in general.
We will iterate on the presentation (e.g. to better take seasonality into
account, in particular including year-over-year comparisons) and eventually
want to create dashboards for those that are not already available in that
form. Feedback and discussion welcome.
(All numbers below are averages for September 14-20, 2015 unless otherwise
noted.)
Pageviews:
Total: 523 million/day
Context (April 2015-September 2015):
Pageviews were down 4.3% from last week, a somewhat noticeable drop (for a
more dramatized view, see the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>).
Desktop: 58.1%
Mobile web: 40.7%
Apps: 1.2%
Global North ratio: 77.4% of total pageviews
Context (April 2015-September 2015):
Unique app users
Android: 1.303 million/day
Context (January 2015-September 2015):
A slight rise in the last two weeks, but we’re not seeing a clear effect of
the recent media promotion for the Android app (blog post
<https://blog.wikimedia.org/2015/09/10/wikipedia-app-new-navigation-features/>
on September 10, media coverage
<https://meta.wikimedia.org/wiki/Communications_committee/Press_clippings/20…>
on September 10, 11 and 13) - compare the bump in April around the time
when the app was featured in the Play store and the media promotion for the
iOS app happened (with much wider media coverage
<https://meta.wikimedia.org/wiki/Communications_committee/Press_clippings/20…>
).
iOS: 319k/day
Context (January 2015-September 2015):
No news here.
New app installations
Android: 44,066/day (installs per device, from Google Play, September 14-19)
Context (Aug 21, 2015 - Sep 21, 2015):
One is tempted to see a bit of a bump after the start of the media
promotion on September 10 (concretely, average daily installs in the time
from September 10-19 were 2.5% higher than in the time from August
21-September 9), but in the end we can’t really be certain whether it had
an effect. (Compare again the very visible peak in April in the
year-over-year graph
<https://commons.wikimedia.org/wiki/File:Wikipedia_Android_app_-_daily_insta…>
I posted last week.)
The drop on Sunday September 20 seems to be a problem with Google Play’s
statistics (several other metrics like uninstall numbers show a similar
drop) and not something we need to worry about; I left it out from the
average above as an outlier.
iOS: 4,839/day (download numbers from App Annie)
Context (September 2014-September 2015):
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts have been updated on Commons,
too
<https://commons.wikimedia.org/w/index.php?title=Special:ListFiles&offset=20…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions
  FROM wmf.projectview_hourly
  WHERE agent_type = 'user'
    AND CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-09-14" AND "2015-09-20";

hive (wmf)> SELECT year, month, day,
    CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) AS date,
    SUM(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
    SUM(view_count) AS allviews
  FROM wmf.projectview_hourly
  WHERE year = 2015 AND agent_type = 'user'
  GROUP BY year, month, day
  ORDER BY year, month, day
  LIMIT 1000;

hive (wmf)> SELECT access_method, SUM(view_count)/7
  FROM wmf.projectview_hourly
  WHERE agent_type = 'user'
    AND CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-09-14" AND "2015-09-20"
  GROUP BY access_method;

hive (wmf)> SELECT SUM(IF(FIND_IN_SET(country_code,
    'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
    > 0, view_count, 0))/SUM(view_count)
  FROM wmf.projectview_hourly
  WHERE agent_type = 'user'
    AND CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-09-14" AND "2015-09-20";

hive (wmf)> SELECT year, month, day,
    CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")),
    SUM(view_count) AS `all`,
    SUM(IF(FIND_IN_SET(country_code,
    'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
    > 0, view_count, 0)) AS Global_North_views
  FROM wmf.projectview_hourly
  WHERE year = 2015 AND agent_type = 'user'
  GROUP BY year, month, day
  ORDER BY year, month, day
  LIMIT 1000;

hive (wmf)> SELECT SUM(IF(platform = 'Android', unique_count, 0))/7 AS avg_Android_DAU_last_week,
    SUM(IF(platform = 'iOS', unique_count, 0))/7 AS avg_iOS_DAU_last_week
  FROM wmf.mobile_apps_uniques_daily
  WHERE CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20150914 AND 20150920;

hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) AS date,
    unique_count AS Android_DAU
  FROM wmf.mobile_apps_uniques_daily
  WHERE platform = 'Android';

hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) AS date,
    unique_count AS iOS_DAU
  FROM wmf.mobile_apps_uniques_daily
  WHERE platform = 'iOS';
https://play.google.com/apps/publish/?dev_acc=18062848292902787645#StatsPla…
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Hi all,
here is the weekly look at our most important readership metrics.
As laid out earlier, the main purpose is to raise awareness about how these
are developing, call out the impact of any unusual events in the preceding
week, and facilitate thinking about core metrics in general.
We will continue to iterate on the presentation (e.g. to better take
seasonality into account, in particular by including year-over-year
comparisons) and eventually want to create dashboards for those metrics
which are not already available in that form. Feedback and discussion welcome.
(All numbers below are averages for September 21-27, 2015 unless otherwise
noted.)
Pageviews
Total: 519 million/day
Context (April 2015-September 2015):
After the conspicuous 4.3% drop the previous week, pageviews decreased a
bit further (0.7%) this time. (For a more dramatic view, see the Vital
Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>).
Desktop: 57.5%
Mobile web: 41.3%
Apps: 1.2%
Global North ratio: 77.4% of total pageviews
Context (April 2015-September 2015):
Unique app users
Android: 1.340 million /day
Context (January 2015-September 2015):
iOS: 328k / day
Context (January 2015-September 2015):
New app installations
Android: 42,782/day (Daily installs per device, from Google Play, September
21-27)
Context (September 2014-September 2015):
iOS: 4,603/day (download numbers from App Annie)
Context (September 2014-September 2015):
As a bonus track (because this week’s report doesn’t offer much news
otherwise ;) here’s a chart of the *day 1 retention rate* for the iOS app.
(That’s defined as the percentage of users who used the app on the day
after first installing it.)
(from August 30 = leftmost bar to September 26, source: iTunes Connect)
It’s not quite clear why the rate dropped around September 16 and then rose
again last Wednesday - no feature changes or influx of downloaders from
particular sources that we’re aware of during that time.
(Per discussion in last week’s thread, I’ll look into including similar
metrics for Android too.)
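The day-1 retention definition above (percentage of users who use the app again on the day after first installing it) can be made concrete in code. The data shapes here are a hypothetical illustration, not the actual iTunes Connect feed, which computes this metric internally:

```python
from datetime import date, timedelta

def day1_retention(installs, active):
    """Fraction of installers who were active on the day after their install.

    installs: dict mapping user -> first-install date
    active: set of (user, date) pairs recording app-usage days
    (Hypothetical data shapes for illustration only.)
    """
    returned = sum(
        1 for user, d in installs.items()
        if (user, d + timedelta(days=1)) in active
    )
    return returned / len(installs)

# Toy example: users "a" and "c" come back the next day, "b" does not.
installs = {"a": date(2015, 9, 1), "b": date(2015, 9, 1), "c": date(2015, 9, 2)}
active = {("a", date(2015, 9, 2)), ("c", date(2015, 9, 3))}
print(day1_retention(installs, active))  # two of three installers returned
```

Cohorting by install date (as the iTunes Connect chart does, one bar per day) would just mean grouping the installs dict by date before applying the same calculation.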
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/w/index.php?title=Special:ListFiles&offset=20…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions
  FROM wmf.projectview_hourly
  WHERE agent_type = 'user'
    AND CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-09-21" AND "2015-09-27";

hive (wmf)> SELECT year, month, day,
    CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) AS date,
    SUM(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
    SUM(view_count) AS allviews
  FROM wmf.projectview_hourly
  WHERE year = 2015 AND agent_type = 'user'
  GROUP BY year, month, day
  ORDER BY year, month, day
  LIMIT 1000;

hive (wmf)> SELECT access_method, SUM(view_count)/7
  FROM wmf.projectview_hourly
  WHERE agent_type = 'user'
    AND CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-09-21" AND "2015-09-27"
  GROUP BY access_method;

hive (wmf)> SELECT SUM(IF(FIND_IN_SET(country_code,
    'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
    > 0, view_count, 0))/SUM(view_count)
  FROM wmf.projectview_hourly
  WHERE agent_type = 'user'
    AND CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-09-21" AND "2015-09-27";

hive (wmf)> SELECT year, month, day,
    CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")),
    SUM(view_count) AS `all`,
    SUM(IF(FIND_IN_SET(country_code,
    'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
    > 0, view_count, 0)) AS Global_North_views
  FROM wmf.projectview_hourly
  WHERE year = 2015 AND agent_type = 'user'
  GROUP BY year, month, day
  ORDER BY year, month, day
  LIMIT 1000;

hive (wmf)> SELECT SUM(IF(platform = 'Android', unique_count, 0))/7 AS avg_Android_DAU_last_week,
    SUM(IF(platform = 'iOS', unique_count, 0))/7 AS avg_iOS_DAU_last_week
  FROM wmf.mobile_apps_uniques_daily
  WHERE CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20150921 AND 20150927;

hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) AS date,
    unique_count AS Android_DAU
  FROM wmf.mobile_apps_uniques_daily
  WHERE platform = 'Android';

hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) AS date,
    unique_count AS iOS_DAU
  FROM wmf.mobile_apps_uniques_daily
  WHERE platform = 'iOS';
https://play.google.com/apps/publish/?dev_acc=02812522755211381933#StatsPla…
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
https://analytics.itunes.apple.com/#/retention?app=324715238
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB