Just a reminder, we will be deprecating the pagecounts datasets at the end
of May, as we mentioned earlier this year [0]. This means these files will
remain available for researchers, but no new files will be generated going
forward.
*Pagecounts datasets that will be deprecated*
pagecounts-raw
pagecounts-all-sites
Options for switching to the new datasets [1]:
pageviews for the same format but better-quality data (see the short
reading sketch below)
pagecounts-ez for compressed data
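For those switching, here is a minimal reading sketch, assuming the new
hourly pageviews files keep the per-line layout "domain_code page_title
count_views bytes" that pagecounts-raw used (the filename below is just an
example):

import gzip
from collections import Counter

def top_titles(path, domain="en", n=10):
    """Sum views per title for one domain code in a gzipped hourly file."""
    counts = Counter()
    with gzip.open(path, "rt", encoding="utf-8", errors="replace") as f:
        for line in f:
            parts = line.split(" ")
            if len(parts) >= 3 and parts[0] == domain:
                counts[parts[1]] += int(parts[2])
    return counts.most_common(n)

# e.g. top_titles("pageviews-20160501-000000.gz")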
[0] https://lists.wikimedia.org/pipermail/analytics/2016-March/005060.html
[1] https://dumps.wikimedia.org/other/analytics/
Hi,
Welcome to the first of a series of semi-regular updates on our progress
towards Wikistats 2.0. As you may have seen from the banners on
stats.wikimedia.org, we're working on a replacement for Wikistats. Erik
talked about this in his announcement [1]. To summarize it from our point
of view:
* Wikistats has served the community very well so far, and we're looking to
keep every bit of value in the upgrade
* Wikistats depends on the dumps generation process, which is getting slower
and slower due to its architecture. Because of this, most editing metrics
are delayed by weeks through no fault of the Wikistats implementation
* Finding data on Wikistats is a bit hard for new users, so we're working
on new ways to organize what's available and present it in a comprehensive
way along with other data sources like dumps
This regular update is meant to keep interested people informed on the
direction and progress of the project.
Of course, Wikistats 2.0 is not a new project. We've already replaced the
data pipeline behind the pageview reports on stats.wikimedia.org.
But the end goal is a new data pipeline for editing, reading, and beyond,
plus a nice UI to help guide people to what they need. Since this is the
first update, I'll lay out the high level milestones along with where we
are, and then I'll give detail about the last few weeks of work.
1. [done] Build pipeline to process and analyze *pageview* data
2. [done] Load pageview data into an *API*
3. [ ] *Sanitize* pageview data with more dimensions for public
consumption
4. [ ] Build pipeline to process and analyze *editing* data
5. [ ] Load editing data into an *API*
6. [ ] *Sanitize* editing data for public consumption
7. [ ] *Design* UI to organize dashboards built around new data
8. [ ] Build enough *dashboards* to replace the main functionality
of stats.wikimedia.org
9. [ ] Officially replace stats.wikimedia.org with *(maybe)
analytics.wikipedia.org*
*. [ ] Bonus: *replace dumps generation* based on the new data
pipelines
Our focus last year was pageview data, and that's how we got 1 and 2 done.
3 is mostly done except deploying the logic and making the data
available. So 4, 5, and 6 are what we're working on now. As we work on
these pieces, we'll take vertical slices of different important metrics and
take them from the data processing all the way to the dashboards that
present the results. That means we'll make incremental progress on 8 and 9
as we go. But we won't be able to finish 7 and 9 until we have a cohesive
design to wrap around it all. We don't want to introduce yet more
dashboard hell; we want to save you, the consumers, from all that.
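As a concrete aside on milestones 1 and 2: the pageview data is already
queryable through the public Pageview API. A small hedged example (the
article, project, and date range are only illustrative values):

import requests

url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/all-access/all-agents/Wikipedia/daily/20160501/20160531")
resp = requests.get(url, headers={"User-Agent": "wikistats-2.0-update-example"})
resp.raise_for_status()
for item in resp.json()["items"]:
    print(item["timestamp"], item["views"])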
So the focus right now is on the editing data pipeline. What do I mean by
this? Data is already available in Quarry and via the API. That's true,
but here are some problems with that data:
* lack of historical change information. For example, we only have
pageview data by the title of the page. If we want to get all the
pageviews for a page that's now called C, but was called B two months ago
and A three months before that, we have to manually parse PHP-serialized
parameters in the logging table to trace back those page moves (see the
first sketch after this list)
* no easy way to look at data across wikis. If someone asks you to run a
Quarry query to look at data from all Wikipedias, you have to run hundreds
of separate queries, one for each database (see the second sketch after
this list)
* no easy way to look at a lot of data. Quarry and other tools enforce
time limits to protect themselves. Downloading dumps is a way to get
access to more data, but the files are huge and analysis is hard
* querying the API with complex multi-dimensional analytics questions isn't
possible
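To make the first problem concrete, here is a rough sketch of the manual
work that tracing one page move requires today. It assumes the third-party
phpserialize package, and the '4::target' key is only how newer move rows
store the destination title; older rows are plain text, so treat this as
illustrative rather than a complete parser:

import phpserialize

def move_target(log_params):
    """Best-effort extraction of the destination title from a 'move' log row."""
    try:
        params = phpserialize.loads(log_params)
    except ValueError:
        # Very old rows keep the parameters as plain, unserialized text.
        text = log_params.decode("utf-8", errors="replace").strip()
        return text or None
    # phpserialize returns bytes keys and values by default.
    target = params.get(b"4::target")
    return target.decode("utf-8") if target else None

And that only answers one hop: getting from C back through B to A means
repeating this for every earlier move of the same page.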
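For the second problem, the cross-wiki fan-out looks roughly like this
(the hostname, credentials file, and query are placeholders, and the real
database list is hundreds of entries long):

import pymysql

WIKI_DBS = ["enwiki", "dewiki", "frwiki"]  # ...and hundreds more in reality

def edits_since(dbname, timestamp="20160401000000"):
    """Count revisions since a timestamp on one wiki's database replica."""
    conn = pymysql.connect(host="analytics-replica.example", db=dbname,
                           read_default_file="~/.my.cnf")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT COUNT(*) FROM revision WHERE rev_timestamp >= %s",
                        (timestamp,))
            return cur.fetchone()[0]
    finally:
        conn.close()

totals = {db: edits_since(db) for db in WIKI_DBS}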
These are the kinds of problems we're trying to solve. Our progress so far:
* Retraced history through the logging table to piece together what names
each page has had throughout its life. Deleted pages were included in this
reconstruction
* Found what names each user has had throughout their life, and what
rights and blocks were applied to or removed from users.
* Wrote event schemas for Event Bus, which will feed data into this
pipeline in near real time (so metrics and dashboards can be updated in
near-real-time)
* Came up with a single denormalized schema that holds every kind of
event possible in the editing world. This is a join of the Event Bus
schemas mentioned above and can be fed either in batch from our
reconstruction algorithm or in real time. If you're familiar with lambda
architecture, this is the approach we're taking to make our editing data
available (sketched just below)
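To give a feel for what that last item means, here is an invented
illustration, not the real schema: one denormalized row per event, carrying
page, user, and revision context, so a single table can answer questions
that today require many joins across many wikis:

from dataclasses import dataclass
from typing import Optional

@dataclass
class EditingEvent:
    wiki: str                                # e.g. "enwiki"
    event_entity: str                        # "page", "user", or "revision"
    event_type: str                          # "create", "move", "delete", ...
    event_timestamp: str                     # e.g. "20160501123456"
    page_id: Optional[int] = None
    page_title: Optional[str] = None         # title at the time of the event
    page_title_latest: Optional[str] = None  # title after all known renames
    user_id: Optional[int] = None
    user_text: Optional[str] = None
    revision_id: Optional[int] = None

In the batch path these rows would come from the reconstruction algorithm;
in the streaming path they would be derived from the Event Bus events,
which is what makes the lambda-architecture approach possible.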
Right now we're testing the accuracy of our reconstruction against
Wikistats data. If this works, we'll open up the schema to more people to
play with so they can give feedback on this way of doing analytics. And if
all that looks good, we'll be loading the data into Druid and Hive and
running the highest-priority metrics on this new platform. We hope to be
done with this by the end of this quarter. To weigh in on what reports are
important, make sure you visit Erik's page [2]. We'll also do a tech talk
on our algorithm for historical reconstruction and lessons learned on
MediaWiki analytics.
If you're still reading: congratulations, and sorry for the wall of text. I
look forward to keeping you all in the loop, and to making steady progress
on this project that's very dear to our hearts. Feel free to ask questions
and if you'd like to be involved, just let me know how. Have a nice
weekend :)
[1]
http://infodisiac.com/blog/2016/05/wikistats-days-will-be-over-soon-long-li…
[2]
https://www.mediawiki.org/wiki/Analytics/Wikistats/DumpReports/Future_per_r…