+ product, analytics as fyi
On Thu, Jan 15, 2015 at 3:27 PM, Jon Katz jkatz@wikimedia.org wrote:
Hi Folks, Just sharing some data gleaned from the logs that I pulled in relation to the collections project. I looked at clicks on the watchlist star (see image), because we are considering that to be the first entry point for creating a collection from an article on mobile web. Nothing earth-shattering, but the data suggests that the pilot will not generate enough data without:
- a tutorial element/signage
- improved account creation funnel
- additional entry points (such as search results)
-J
[image: Inline image 1][image: Inline image 4]
*Summary: *
- Data suggests additional entry points might be necessary in order to derive meaningful results from the pilot.
- Current expected user population reaching the collections entry point without further intervention: ~2.4M per month. Of those, 28.6k are logged in and able to access the feature funnel.
- Another 8.5k users end up signing up as a result of the prompt (.3% conversion rate).
- Baseline established to measure interventions against.
- More baselines to come.
*Results*
- In a 30-day period in April, ~28.6k logged-in users clicked on the watchlist star (2.4M anonymous users clicked and were prompted to create an account).
- In a 30-day period in June*, ~8.5k of the previously logged-out users successfully created accounts after clicking on the watchlist star (.3% conversion rate). I do not know if these should be counted in the 28.6k logged-in "watch" events.
- Logins that resulted from the CTA were not tracked, but are assumed to be low and might trigger the measured events we did track.
- Chart view, clicks on watchlist star (see Methods > Assumptions, below, for the anonymous-user calculation):
[image: Inline image 2]
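For reference, the headline numbers above boil down to simple arithmetic; this is just a restatement of the figures already quoted in this email, nothing newly measured:

```python
# Back-of-the-envelope restatement of the results above.
anon_star_clickers = 2_400_000  # ~2.4M anonymous users clicked the star (30 days, April)
logged_in_clickers = 28_600     # ~28.6k logged-in users clicked the star (same period)
new_accounts = 8_500            # ~8.5k accounts created after the star prompt (30 days, June)

conversion_rate = new_accounts / anon_star_clickers
print(f"anon click -> account creation: {conversion_rate:.2%}")  # ~0.35%, the ~.3% quoted above
print(f"users already able to enter the feature funnel: {logged_in_clickers:,}")
```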
*How this informs Collections:*
- If nothing changes, ~2.4M people have the potential to see the watchlist creation prompt, of whom 28.6k are immediately eligible to create (don't need login/account?). I think this latter number is a little low for a pilot. Do you agree? Should we explore other entry points for the pilot, like search results?
- We have a baseline of ~81.5k watchlist clicks per month by logged-in users and 2.8M from logged-out users. How much can we increase these numbers? I don't think this number *needs* to grow significantly for success, because one collection could drive infinite views, but it would definitely help.
  - Any overlay or product introduction will need to increase this number to be successful.
  - Changing the icon from a star to a heart or something else more collection-y would also have to increase this number to be successful.
- Account creation rate has room to grow.
  - The CTA has a .3% conversion rate, which is very low. If we can boost this, we might be able to have a large impact on the number of people who are eligible to create.
  - Can we drive this number up by offering collections as an incentive?
*Next Steps*
- We need to explore alternate entry points, such as search, as we continue to build out the MVP.
- I think we should measure the following baselines to establish what we should measure against and set realistic expectations:
  - Book creations over time (to compare to collection creations)
  - Edits on mobile (to compare to collection creations/edits)
  - Watchlist menu clicks (similar path to editing your collections)
  - Referrals via social (to establish a baseline for social/share traffic)
  - Any others?
*Methods:*
- Queries used:
  - Queried MobileBetaWatchlist_5281061 for 30 days -- tracks clicks on the star (April 2014). Link to query: https://docs.google.com/a/wikimedia.org/document/d/1-bfW9H2kApvOOOTKL40TjYKH0eO7dUeErpg0dNW554I/edit#heading=h.eqhhm1kvdn6e
  - Queried ServerSideAccountCreation_5487345 for 30 days in June. Link to query: https://docs.google.com/a/wikimedia.org/document/d/1-bfW9H2kApvOOOTKL40TjYKH0eO7dUeErpg0dNW554I/edit#heading=h.d11qmgmu7pqb
  - A rough sketch of these counts in query form appears at the end of this section.
- Assumptions
  - Assume unique [IP+UserAgent] = unique anonymous user. This definitely undercounts users.
  - Assume the number of users who log in as a result of clicking on the watchlist star is small (not currently measured) and/or that successful logins from this prompt successfully add the article to the watchlist and trigger an event that we *did* measure. So the above count might be underestimating total events.
  - Assuming addition is appropriate between the logged-in watchlist clicks and the account creations. I ran a join to see if a successful account creation then triggered a successful version of the original watchlist event (i.e. anon click to watch article --> account signup prompt --> create account --> article watch click triggered), but it kept stalling out. This is the query, in case you're curious: https://docs.google.com/a/wikimedia.org/document/d/1-bfW9H2kApvOOOTKL40TjYKH0eO7dUeErpg0dNW554I/edit#heading=h.eg0uxzfv84qe
- Data alert
  - I had a rough time with some queries because the event logs had events from Jan-Dec, but there were obvious dropoffs and spikes due to changes that were only caught when I ran a daily report for events in each table. Without this the data would have been very skewed (my original count for anon watchlist clicks was 57k per month... it is actually 2.4M). Here is a link https://docs.google.com/a/wikimedia.org/document/d/1-bfW9H2kApvOOOTKL40TjYKH0eO7dUeErpg0dNW554I/edit#heading=h.d0ywz6oufh2m to a helpful daily event query; a sketch of this kind of daily check also appears at the end of this section.
* It was necessary to use different time periods because the logs did not cleanly overlap.
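To make the Methods section a bit more concrete, here is a rough sketch of the two kinds of queries described above (the 30-day unique counts and the daily sanity check). The table names are the ones named above; the column names (timestamp, clientIp, userAgent, event_username) and the database cursor are assumptions on my part -- the linked query docs have the real field names.

```python
# Sketch only: 30-day unique-user counts and a daily event report for an
# EventLogging-style table. Table names are from this email; column names
# (timestamp, clientIp, userAgent, event_username) are assumed placeholders.

WATCHLIST_TABLE = "MobileBetaWatchlist_5281061"
ACCOUNT_TABLE = "ServerSideAccountCreation_5487345"

def unique_anon_star_clickers(cur, start_ts, end_ts):
    """Unique anonymous clickers in a window, using the [IP + UserAgent] =
    unique-anonymous-user assumption from the Assumptions section."""
    cur.execute(
        f"""
        SELECT COUNT(DISTINCT CONCAT(clientIp, userAgent))
        FROM {WATCHLIST_TABLE}
        WHERE timestamp BETWEEN %s AND %s
          AND event_username IS NULL      -- assumed marker for logged-out users
        """,
        (start_ts, end_ts),
    )
    return cur.fetchone()[0]

def daily_event_counts(cur, table):
    """Events per day for one table -- the kind of daily report that exposed the
    dropoffs/spikes (and the 57k/month vs. ~2.4M/month discrepancy) noted above."""
    cur.execute(
        f"""
        SELECT LEFT(timestamp, 8) AS day,   -- assumes YYYYMMDDHHMMSS string timestamps
               COUNT(*) AS events
        FROM {table}
        GROUP BY day
        ORDER BY day
        """
    )
    return cur.fetchall()

# e.g. daily_event_counts(cur, WATCHLIST_TABLE) and daily_event_counts(cur, ACCOUNT_TABLE)
```

Eyeballing the daily counts for both tables before trusting any 30-day total is what caught the skew described in the data alert.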
One suggestion would be to engage a small dark pattern here. If we're getting a not-insignificant number of anonymous taps on the star, we should use it as an opportunity for value add. Normally I'd say value add for creating an account, but in this case I'd suggest you focus it solely on value add for list creation: allow users to create a list, or at least start it, prior to creating an account, and force account creation when they try to add a second page or attempt to share the list. This will get people a little more invested before asking them to make an account.
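To make that concrete, here's a minimal sketch of the gating I mean -- all names and the one-page limit are illustrative, not from any existing codebase:

```python
# Illustrative sketch: let anonymous users start a list, and only require an
# account on the second add or on share. Names and limits are hypothetical.

ANON_PAGE_LIMIT = 1  # anonymous users may start a list with a single page

class AccountRequired(Exception):
    """Raised when the next action should route the user to account creation."""

class DraftCollection:
    def __init__(self, owner_is_anonymous=True):
        self.owner_is_anonymous = owner_is_anonymous
        self.pages = []

    def add_page(self, title):
        if self.owner_is_anonymous and len(self.pages) >= ANON_PAGE_LIMIT:
            # Second page: the user is now invested, ask them to create an account.
            raise AccountRequired("Create an account to keep building this list.")
        self.pages.append(title)

    def share(self):
        if self.owner_is_anonymous:
            # Sharing is the other gate for account creation.
            raise AccountRequired("Create an account to share this list.")
        return "https://example.org/collections/draft"  # placeholder share URL
```

The point is just that the account wall moves to the moment the user already has something to lose.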
Just a suggestion.
*Jared Zimmerman * \ Director of User Experience \ Wikimedia Foundation
M +1 415 609 4043 \ @jaredzimmerman http://loo.ms/g0
Jon,
What are you testing for? What are the success criteria?
Your numbers are statistically significant, so I am not convinced that total entry numbers are an issue at this point. This depends on what the test is meant to examine, but so far I think you have enough total uniques to do analysis.
L
Hi Lila, great questions. My responses are inline below. Best,
Ja
On Wed, Jan 21, 2015 at 12:42 PM, Lila Tretikov lila@wikimedia.org wrote:
Jon,
What are you testing for? What are the success criteria?
Given that this is a pilot, I would like to quantify initial success in terms of what we learn. Since we are not likely to have much time with a live feature before April 1, here are the draft measurement goals for this feature in Q3:
- Begin collecting baseline usage statistics (creating, adding, consuming) and characterizing use cases
- Decide on further development of this feature based on initial baseline stats on:
  - *number of creators* (as compared to mobile web editors and desktop page creators)
  - *user class of creators* (are we reaching outside the power editor group?)
  - *number of shares*
  - *pageview traffic* to content viewed via shares
Excerpted from: https://www.mediawiki.org/wiki/Wikimedia_Engineering/2014-15_Goals/Q3
If by "success" you mean: whether or not we move forward, I appreciate the ask. I think that we will need further develop an understanding of what we can build this quarter and I will need to look at our baselines for comparable features (if thats possible), before we can define a reasonable baseline. You can certainly hold me to having rough expectations prior to launch. Any thoughts you have as to what your minimum expectations would be for key metrics are welcome.
Your numbers are statistically significant, so I am not convinced that
total entry numbers are an issue at this point. This depends on what the test is meant to examine, but so far I think you have enough total uniques to do analysis.
The numbers of people who click on the entry point are, as you pointed out, large enough to run tests for *that* action. However, given that this is a funnel with a few steps, rather than a single action, the number of users creating collections will be dramatically smaller. We have <50k logged-in users per month who click the star and will see "collections" as an option. If .3% finish a collection (using the same conversion metric we have for star-->account creation), then we only have <150 users across wikis finishing a collection in a month. That number would be hard to draw conclusions from, especially as we don't want to wait 30 days to analyze results. Let me know if you disagree or want to discuss further.
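For concreteness, the rough math behind that estimate (reusing the .3% star-to-account conversion rate as a stand-in for the unknown star-to-finished-collection rate):

```python
# Funnel math behind the "<150" estimate above. The 0.3% rate is the measured
# star -> account-creation conversion, borrowed as a stand-in for the unknown
# star -> finished-collection rate.
logged_in_star_clickers = 50_000   # <50k logged-in users/month click the star and would see "collections"
assumed_completion_rate = 0.003    # .3%
collections_finished = logged_in_star_clickers * assumed_completion_rate
print(collections_finished)        # 150.0 -> "<150 users across wikis" finishing a collection per month
```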
L