Right. Please take a look at the docs about unique devices; we report
unique devices per project, per site:
On Thu, Nov 17, 2016 at 1:22 PM, Dan Andreescu <dandreescu(a)wikimedia.org>
wrote:
Hi Melody,
I'm cc-ing our public list which is the best place to ask questions like
this.
So, the unique devices and pageviews numbers on the vital signs dashboard
are fetched from our public APIs. They don't allow bulk download. To get
bulk numbers, you can:
* download everything from our public dumps [1] where you'll find
pageviews by article or by project [2] and unique device numbers by project
[3]
* ask one of our analysts to crunch numbers (in this case, the reading
team would be the relevant one to ask)
* use our internal cluster to crunch numbers yourself (I can help show you
around)
One important thing to keep in mind: you can aggregate pageviews by
language or project, but you can't do that for unique devices, because the
same device might be used to visit many sites and there's no way to
deduplicate that. We're working on counting global unique devices so we
have those numbers as well; Tilman from the reading team also has some
interesting work on that.
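(A toy illustration of the deduplication problem, with made-up device
identifiers: the same device can show up on several sites, so summing
per-site unique-device counts overcounts.)

```python
# Hypothetical device IDs seen on two sites during the same month.
enwiki_devices = {"d1", "d2", "d3"}
dewiki_devices = {"d2", "d3", "d4"}

# Naively adding the per-site counts double-counts d2 and d3.
naive_sum = len(enwiki_devices) + len(dewiki_devices)   # 6

# The true global count needs the union, i.e. deduplication across sites,
# which the published per-site numbers don't let you do.
true_uniques = len(enwiki_devices | dewiki_devices)     # 4

print(naive_sum, true_uniques)  # prints "6 4"
```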
[1] https://dumps.wikimedia.org/other/analytics/
[2] https://dumps.wikimedia.org/other/pageviews/
[3] https://dumps.wikimedia.org/other/unique_devices/
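(For what it's worth, here is a minimal Python sketch of how one might
query the public Analytics REST API for per-project unique devices. It
assumes the /metrics/unique-devices endpoint under
wikimedia.org/api/rest_v1; the projects and date range are placeholders.)

```python
# Sketch: build request URLs for the public Analytics REST API
# (assumes the /metrics/unique-devices endpoint layout:
#  {project}/{access-site}/{granularity}/{start}/{end}).
AQS = "https://wikimedia.org/api/rest_v1/metrics/unique-devices"

def unique_devices_url(project, access_site="all-sites",
                       granularity="monthly",
                       start="20161001", end="20161101"):
    """Return the API URL for unique devices on one project."""
    return f"{AQS}/{project}/{access_site}/{granularity}/{start}/{end}"

for project in ["en.wikipedia.org", "de.wikipedia.org"]:
    print(unique_devices_url(project))
    # Fetch with e.g. urllib.request.urlopen(...) and read the JSON
    # "items" array; remember these numbers are per project and, per the
    # note above, must not be summed across projects.
```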
On Thu, Nov 17, 2016 at 11:44 AM, Melody Kramer <mkramer(a)wikimedia.org>
wrote:
Hey Dan and Mikhail,
I'm working on a map of the Wikimedia universe that will show the
relative size of entities under the Wikimedia umbrella (Wikipedia,
Wikibooks, Wikinews, etc.) grouped by language, articles contributed and
then pageviews and/or unique devices.
On this site:
https://analytics.wikimedia.org/dashboards/vital-signs/#projects=eswiki,itwiki,enwiki,jawiki,dewiki,ruwiki,frwiki,enwikibooks,enwikinews,wikidatawiki,commonswiki/metrics=UniqueDevices
I'm able to manually enter each language/wikiproject to see them all on
the graph.
Is there a way to acquire everything at once, and download it into a csv?
Or say "Show all?"
I'm happy to say more! Thanks so much in advance for your help/expertise
in this area (and if there's someone else I should reach out to, please
let me know who that might be!)
Mel
--
Melody Kramer
Read a random featured article from Wikipedia!
<https://en.wikipedia.org/wiki/Special:RandomInCategory/Featured_articles>
mkramer(a)wikimedia.org
_______________________________________________
Analytics mailing list
Analytics(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/analytics