Dear All:
Maybe this question is a little bit too simple, but I did not immediately find the answer in the docs.
How does the API differentiate between the two agent types, "spider" and "bot"?
I'm asking because for some articles, there seems to be no bot traffic at all, including the main page in August: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia...
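For concreteness, here is roughly the kind of query I mean (a sketch in Python; the article and date range are illustrative, since the link above is truncated, and the URL template is the one from the REST API docs):

    import requests

    # Per-article pageviews filtered to the "bot" agent type. A 404 from
    # this endpoint means there are no rows at all for that agent type.
    URL = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
           "en.wikipedia/all-access/bot/Main_Page/daily/20150801/20150831")

    resp = requests.get(URL, headers={"User-Agent": "example-script/0.1"})
    resp.raise_for_status()
    for item in resp.json()["items"]:
        print(item["timestamp"], item["views"])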
Another, unrelated question: by my recollection, I read somewhere that the data available via the API dates back to sometime in May 2015. However, when I ran queries today, the API only returned data starting on August 1, 2015. Is that correct?
Best, Felix
Hey Felix,
To answer some questions in order:
1. Bots are automated systems with a Wikimedia-specific tag (WikimediaBot, IIRC) in their user agent. We don't expect this to be widely adopted yet because it hasn't been widely advertised; the standard itself is very new, which is probably why you don't see any traffic referring to it in August. (A minimal sketch of the convention follows below.)
2. The idea is to have traffic going back no earlier than May 2015, because that's when the new pageview definition was instrumented (and so that's the earliest point we have data from), but that doesn't mean all the data has been loaded in yet.
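To illustrate point 1, a minimal sketch of the convention (the tool name and contact address are made up; the only part that matters to the classifier is the WikimediaBot token in the User-Agent):

    import requests

    # Any client that includes "WikimediaBot" in its User-Agent should be
    # counted under the "bot" agent type by the pageview pipeline.
    HEADERS = {
        "User-Agent": "MyResearchTool/1.0 (me@example.org) WikimediaBot"
    }

    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={"action": "query", "meta": "siteinfo", "format": "json"},
        headers=HEADERS,
    )
    print(resp.json()["query"]["general"]["sitename"])  # "Wikipedia"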
Thanks for the quick answers, Oliver!
+1 Oliver - user agents tagged with WikimediaBot are tagged as bot. I do agree that our documentation on this can be improved; I'll update the Webrequest and Pageview tables docs to reflect this.
The backfilling jobs for May-July are paused at the moment; the plan is to resume backfilling in 2-3 weeks.
2-3 weeks? What are you doing, taking /vacations at Christmas/? Unacceptable!
More seriously: the work on the API thus far - the data that has been moved in, the responsiveness around bug reports, the intuitive nature of the interface from a client library POV - has been fantastic. I hope you all enjoy your break :). I am honoured to call you coworkers.
I could not agree more. The API implementation has progressed remarkably well over the last few months.
Congrats to all involved!
Hear, hear!
+1 on better documenting what "bot" refers to; it's not 100% intuitive.
On Tue, Dec 15, 2015 at 10:51 AM, Madhumitha Viswanathan mviswanathan@wikimedia.org wrote:
+1 Oliver - user agents tagged with WikimediaBot are tagged as bot. I do agree that our documentation on this can be improved; I'll update the Webrequest and Pageview tables docs to reflect this.
Where was this announced? I don't believe pywikibot does this, or was notified that it should do this...?
Are accounts with the bot flag also tagged as bot?
On Mon, Dec 21, 2015 at 5:15 PM, John Mark Vandenberg jayvdb@gmail.com wrote:
Where was this announced? I don't believe pywikibot does this, or was notified that it should do this...?
Apologies, it wasn't. Here is a task for it - https://phabricator.wikimedia.org/T108599 - and it's in our pipeline to get done.
Are accounts with the bot flag also tagged as bot?
I believe bot flags associated with accounts are not part of the webrequest data, so we don't look at them. Currently, we use UA-parser plus some custom regex (https://github.com/wikimedia/analytics-refinery-source/blob/c7f1973053122476b6297d373d49105ec08285e9/refinery-core/src/main/java/org/wikimedia/analytics/refinery/core/Webrequest.java#L56) to identify and mark spiders. So if you have not adopted the WikimediaBot convention, your bot would currently be tagged as a spider if its UA matched this regex. Only those bots that explicitly tag themselves with WikimediaBot will register as a bot.
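As a rough sketch of that classification order (simplified Python, not the production Java; the stand-in pattern below is illustrative, and the real spider regex lives in the linked Webrequest.java):

    import re

    # Simplified stand-in for the custom spider regex in refinery-core.
    SPIDER_RE = re.compile(r"bot|crawler|spider|wget|curl|http", re.IGNORECASE)

    def classify_agent(user_agent: str) -> str:
        # The explicit WikimediaBot tag wins; otherwise generic crawler
        # patterns mark the request as a spider; everything else is a user.
        if "WikimediaBot" in user_agent:
            return "bot"
        if SPIDER_RE.search(user_agent):
            return "spider"
        return "user"

    print(classify_agent("pywikibot/2.0 WikimediaBot"))              # bot
    print(classify_agent("Mozilla/5.0 (compatible; Googlebot/2.1)")) # spider
    print(classify_agent("Mozilla/5.0 (X11; Linux x86_64)"))         # user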
I have also added notes about this 'bot' agent type to https://wikitech.wikimedia.org/wiki/Analytics/Data/Pageview_hourly and https://wikitech.wikimedia.org/wiki/Analytics/Data/Webrequest.
--Madhu :)
On Tue, Dec 22, 2015 at 12:23 PM, Madhumitha Viswanathan mviswanathan@wikimedia.org wrote:
Are accounts with the bot flag also tagged as bot?
I believe bot flags associated with accounts are not part of the webrequest data, so we don't look at them.
There is a 'bot' request parameter associated with many write actions, and assert=bot has been available for all API requests since 1.23 (and earlier with Extension:AssertEdit); see https://www.mediawiki.org/wiki/API:Assert. Why can't those be used? They are validated data.
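For example (a sketch; run anonymously, this fails with an assert error rather than returning data):

    import requests

    # With assert=bot, the API refuses to serve the request unless the
    # session is logged in to an account with the bot flag; anonymous
    # requests get an "assertbotfailed" error instead of results.
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={"action": "query", "meta": "userinfo",
                "assert": "bot", "format": "json"},
    )
    print(resp.json())  # {'error': {'code': 'assertbotfailed', ...}}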
A user agent containing 'WikimediaBot' is not validated data; anyone can change their user agent and it magically becomes a bot? That sounds like a way to ensure this data is not reliable, and a waste of effort to build.
On 21 December 2015 at 21:00, John Mark Vandenberg jayvdb@gmail.com wrote:
There is a 'bot' request parameter associated with many write actions, and assert=bot has been available for all API requests since 1.23 (and earlier with Extension:AssertEdit); see https://www.mediawiki.org/wiki/API:Assert. Why can't those be used? They are validated data.
Because many "bot" requests go nowhere near the API; because almost no pageviews go near the API; because assert is designed exclusively for logged-in API requests, which most API requests are not; and because assert is designed (primarily) for edits, which no pageviews are.
A user agent containing 'WikimediaBot' is not validated data; anyone can change their user agent and it magically becomes a bot? That sounds like a way to ensure this data is not reliable, and a waste of effort to build.
Anyone can change their user agent and magically become considered automated software, yes. This is absolutely no different from the situation at the moment, where anyone can change their user agent to, say, the GoogleBot user agent and also become considered automated software. The vast, vast, vast majority of actual human users never do this, and those that do tend not to be interested in distorting our automata statistics, but in not providing a consistent user agent for privacy purposes - in which case they use browser extensions to roll between an array of actual human UAs. There's no real incentive to roll between automata UAs, because some sites restrict the features you can access (for example: not providing JavaScript) if they think you're a crawler. As of yesterday, I have been handling the raw webrequest logs for two solid years, and in that time the number of obviously-human "automata" I've seen has been minuscule.
Thanks for the information Oliver.
Hi John -- I just wanted to point out, in a friendly way, that your original email would have been just as effective if you had omitted the last line about "a waste of effort to build". We always like to get feedback and questions from the community, but the analytics team works hard to make good decisions and to use donor money wisely. I'd love to see more constructive language on these lists.
Warmly,
-Toby
+1 (and my apologies if my reply was somewhat sharp too). I have worked with the Analytics team pretty extensively and they are excellent people. As another consumer of this data, I only tend to disagree with them about prioritisation - and even when I think they're /wrong/ I've never seen a situation where they didn't have a really good reason for disagreeing with me ;p. A bit of AGF goes a really long way.