Hello research colleagues,
When I look at the WMF Report Card, it appears to me that the global active editor stats and the number of new accounts registered per month have been relatively flat since at least 2011.
For those of you who work in editor engagement (EE) research and analytics: is there a summary of techniques you have found that produce statistically significant improvements in editor retention? I know that some of you write tools, design projects, or pull and analyze data about editors. It looks to me like WMF is investing significant effort in research and tool creation, but we're not moving the needle enough to create the results we had hoped to achieve. So I'd like to ask what we have learned, from all of our time working on editor engagement, about techniques and programs that do improve the EE stats in significant ways, so that we can hopefully accelerate the implementation of programs and techniques that have demonstrated success.
I'd also like to ask what barriers you think prevent us from becoming more effective at increasing the number of users who register and the number of active editors. For example, are users who come in through GettingStarted often deterred when they are quickly confronted by experienced editors in ways that make them want to leave? If that is a significant problem, how do you suggest addressing it?
One of my concerns about investing further in developing Flow, analytics tools like Wikimetrics, and further complex editor engagement research projects is that the most important challenges related to editor engagement may be problems that can only be solved primarily through interpersonal and social means rather than through software tools and mass communication. I like Wikimetrics and I use it, and I think there's an important place for analytics and tool development in EE work, but I wonder if WMF should scale up its emphasis on grassroots social and interpersonal efforts, particularly in the context of the 2015+ Strategic Plan and Jimmy's speech at the 2014 Wikimania. What do you think, and if your answer is yes, how do you think WMF can do this while respecting the autonomy and social processes of the volunteer projects?
Thanks,
Pine
Can we see data on the number of new editors and the number of editors dropping out (by some definition of sustained inactivity), to see whether the problem is initial recruitment or dropping out? And in terms of edit counts, when does inactivity set in? My suspicion is that we are getting plenty of new editors but that our ability to retain them is the problem.
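As a rough sketch of the analysis that would answer this, the snippet below splits the headline number into monthly cohorts of newly active editors and the share of each cohort still active N months later. It is only a sketch in Python: the input shape (one row per editor per month with an edit count), the column names, and the 5-edit threshold for "active" are all assumptions, and the real data would have to come from something like the analytics replicas or a Wikimetrics export.

import pandas as pd

def cohort_survival(edits: pd.DataFrame, active_threshold: int = 5) -> pd.DataFrame:
    """edits: one row per (user_id, month) with an n_edits count."""
    # keep only the months in which the editor met the assumed "active" threshold
    active = edits[edits["n_edits"] >= active_threshold]
    first_month = active.groupby("user_id")["month"].min().rename("cohort")
    active = active.join(first_month, on="user_id")
    # age = months since the editor's first active month
    active["age"] = (active["month"] - active["cohort"]).apply(lambda d: d.n)
    cohort_size = first_month.value_counts().rename("new_active_editors")
    still_active = (
        active.groupby(["cohort", "age"])["user_id"].nunique().unstack(fill_value=0)
    )
    # rows = cohort month, columns = months since first activity, values = share surviving
    return still_active.div(cohort_size, axis=0)

# toy example; real input would come from the analytics replicas or a Wikimetrics export
toy = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "month": pd.PeriodIndex(
        ["2014-01", "2014-02", "2014-04", "2014-01", "2014-03", "2014-02"], freq="M"),
    "n_edits": [7, 6, 9, 12, 2, 30],
})
print(cohort_survival(toy))

If cohort sizes hold steady while one- and three-month survival keeps falling, retention is the problem; the reverse pattern would point at recruitment.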
I'd be betting on reverts as the cause of loss of newer editors and conflicts as the cause of loss of more experienced editors. Personally, I think reverting needs a tick-the-box step (like the licensing question on Commons) to make it harder to revert unless it's vandalism, a BLP problem, nonsense, etc. Try to make it harder to revert "because I don't happen to like it like that" and encourage more pleasant and helpful discussion. I think we need to upgrade from "civil", which appears to mean little more than "don't swear".
Sent from my iPad
And I would comment that I don't see why the foundation should respect the autonomy and social processes of any group whose norms effectively work against the foundation's objectives. I would be inclined to say "our platform, our rules" in that case.
I find it fascinating
Sent from my iPad
Hoi, The point of research is that it gives us an understanding of the problems we face and of whether we are trending towards success or failure.
Thanks to the numbers we know the extent of the growth in our mobile readers and editors. The trend is uncontroversial: it grows, and it offsets the decline in readers and editors on desktop computers. Simple research shows that talk pages are unworkable on mobiles and tablets.
Dear Pine, do you agree that such research exists, and do you agree that I fairly summarize the data that is available?
When you want more engagement from our public, ask yourself how we can use our numbers and analyse what points to areas where we could or should mobilise our community: numbers that show clearly why it makes sense for us to ask volunteers to volunteer. I'll give you one set of numbers we do not have... the number of searches that return no results, per individual Wikipedia. Thanks, GerardM
It would be very interesting to know the size of edits done on mobile vs desktop (and even better if we could distinguish between phones and tablets, because of the different form factors). I appreciate that we have a problem of definition, as a person on a phone can use the desktop interface and vice versa, so there is potentially a matrix of device and interface.
When I say "size of edit", I would really prefer to know the size of the delta, not the difference in article size as reported in the history. My personal hypothesis is that the smaller the form factor, the smaller the edits. As much as I love my iPad, it is no substitute for my laptop for serious editing: most edits are harder and slower on the iPad than on my laptop, and it's a pain to do citations on a mobile device. If my hypothesis is correct, I am not convinced that the loss of a desktop edit is compensated by the gain of a mobile edit; even if it results in the same total number of edits, I think the extent to which an article is improved will be lower on mobile (on average). I'm not sure how we measure that, but KPIs like delta size and the addition of citations might be interesting. Or, with enough data, we could use the automatic assessment tool to look for articles that change assessment (as measured by the tool) and compare the mobile vs desktop edit counts and ratios.
Sent from my iPad
Hoi, The problem with this approach is that, as it stands, the functionality for editing on tablets and phones is not well developed at all. As a consequence the results will not be that meaningful.
It is only recently that it became possible to edit on these devices at all. So realistically there are several important factors: the development of enabling technology, and the number of readers coming from a tablet or mobile.
The personal argument of current editors that they prefer their computer for complex work essentially makes the newbies on the other platform second-class citizens. The realisation that our technology currently favours computer usage does not. The first argument sounds like "do not bother, it does not matter"; the other leaves room for "we need to work on improving the mobile/tablet experience".
Arguably, calling something a KPI may introduce a bias from the start. Thanks, GerardM
It's always been possible to read or edit from a mobile phone or tablet using the desktop interface, but I agree that the development of more mobile-friendly tools alters things. Still, I would argue that having some "measurements" (if KPI is too biased a term) is useful for judging whether the effort put into those tools is worth the return. Does adding features alter the way Wikipedia is edited, and is it for better or worse? For example, when the new mobile tool was released a little while ago, I saw a lot of edits tagged "Mobile edit" that were vandalising Wikipedia. Fortunately it died down, but obviously if most of the edits coming from mobile tools were vandalism, we might well ask whether the tool is worth having.
I often check my watchlist on my iPad and "knock off" the easy ones ("ok, ok, ugh, vandalism, revert, ok, ok, leave that one until I'm on my laptop, ok, ok, ok"), so it may be that people just reorganise their editing around the device they are using, in which case it might change what they do from hour to hour but not over (say) a week.
I think we have a range of metrics to pick from in terms of quantity of activity (how many, how large, etc.). What we probably need to complement them with are some kinds of quality metrics. We could start with some easy ones: how often an edit summary is left and how long it is, how often the activity is a revert or a reverted edit, and whether the edits are to mainspace, talk, user, user talk, etc. But I think we also want to consider the macroscopic "quality assessment" issues.
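For the easy ones, something along the following lines would do as a first pass. It is only a sketch: the field names ('comment' and 'ns' on each revision record) and the comment-based revert heuristic are assumptions, not an agreed definition of quality.

from statistics import mean

REVERT_MARKERS = ("revert", "undid", "rv ", "rvv")  # crude heuristic, not exhaustive

def summary_metrics(revisions):
    """revisions: list of dicts, each with a 'comment' string and an 'ns' namespace number."""
    if not revisions:
        return {}
    comments = [r.get("comment", "") for r in revisions]
    with_summary = [c for c in comments if c.strip()]
    n = len(revisions)
    return {
        "edits": n,
        "summary_rate": len(with_summary) / n,
        "mean_summary_length": mean(len(c) for c in with_summary) if with_summary else 0.0,
        "revert_rate": sum(c.lower().startswith(REVERT_MARKERS) for c in comments) / n,
        "mainspace_share": sum(r.get("ns", 0) == 0 for r in revisions) / n,
    }

# toy usage with made-up revision records
sample = [
    {"comment": "Reverted edits by 1.2.3.4", "ns": 0},
    {"comment": "", "ns": 0},
    {"comment": "add 2011 census reference", "ns": 0},
    {"comment": "reply", "ns": 3},
]
print(summary_metrics(sample))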
Kerry
I agree that the shift to mobile is a big deal. I remain concerned that tech-centric approaches to editor engagement like VE and Flow, while perhaps having a modest positive impact, do little to fix the incivility problem that is so frequently cited as a reason people leave. Creating more efficient ways for people to communicate seems unlikely to significantly alter the substance of the messages that are exchanged. So I am thinking that cultural change is at least as important as Flow, VE, and GettingStarted, and that cultural change should be resourced accordingly. The question I have is what WMF's role should be.
Pine
Kerry, the problem with the "leave that one until I'm on my laptop" edits is that by the time I'm on my laptop, the watchlist has changed again and priorities get reshuffled, causing those tablet-unfriendly tasks to get buried until they eventually become "ghost edits" that never get done at all.
Jane
Sent from my iPad
I have email notification turned on for my watchlist, and a message rule diverts those notifications into a specific mail folder. I delete each one when I have "dealt" with it. It's very useful, because the folder stays chronologically ordered and stops me from forgetting things or letting new priorities overtake. Sometimes there is something, like needing to add a citation that requires a trip to a particular library, that I can't address for weeks, but the watchlist email sits there patiently reminding me.
If I used the web-based watchlist, then absolutely I would have the problem you describe (which is why I don't use it). I need the discipline of my email approach.
Kerry
_____
From: Jane Darnell [mailto:jane023@gmail.com] Sent: Monday, 15 September 2014 7:18 AM To: kerry.raymond@gmail.com; Research into Wikimedia content andcommunities Subject: Re: [Wiki-research-l] What works for increasing editor engagement?
Kerry, the problem with the "leave that one until I'm on my laptop" edits, is that by the time I'm on my laptop, the watch list has changed again and priorities get reshuffled, causing those "tablet-unfriendly" tasks to get buried where they become eventually "ghost edits", which means they never get done at all.
Jane Sent from my iPad
On Sep 14, 2014, at 12:59 AM, "Kerry Raymond" kerry.raymond@gmail.com wrote:
It's always been possible to read or edit from a mobile phone or tablet using the desktop interface, but I agree that the development of more mobile friendly tools alters things, but still I would argue having some "measurements" (if KPI is too biased a term) is useful to judge whether the effort put into those tools is worth it for the return. Does adding features alter the way Wikipedia is edited and is it for better or worse? For example when the new mobile tool was released a little while ago, I saw a lot of edits tagged "Mobile Edits" that were vandalising Wikipedia. Fortunately it died down, but obviously if most of the edits coming from mobile tools were vandalism, we might well ask if it is worth having.
I often check my watchlist on my iPad and "knock off" the easy ones "ok, ok, ugh vandalism revert, ok, ok, leave that one until I'm on my laptop, ok, ok, ok" so it may be that people just reorganise their editing around the device they are using in which case it might change what they do from hour to hour but not over (say) a week.
I think we have a range of metrics to pick from in terms of quantity of activity (how many, how large, etc). What we probably need to complement are some kinds of quality metrics. We could go with some easy ones (how often an edit summary is left, how long is the edit summary), how often is the activity a revert or a reverted edit, are the edits to mainspace, talk, user, user talk, etc. But I think we do want to consider things like the macroscopic "quality assessment" issues.
Kerry
_____
From: wiki-research-l-bounces@lists.wikimedia.org [mailto:wiki-research-l-bounces@lists.wikimedia.org] On Behalf Of Gerard Meijssen Sent: Sunday, 14 September 2014 3:23 AM To: Research into Wikimedia content and communities Subject: Re: [Wiki-research-l] What works for increasing editor engagement?
Hoi,
The problem with this approach is that as it is, the functionality for editing on tablets and phones is not well developed at all. As a consequence the results will not be that meaningful.
It is only recently that it became possible to edit. So realistically there are several important factors... The development of enabling technology, the numbers of readers from a tablet / mobile.
The personal argument of current editors that they prefer their computer for complex stuff essentially makes the newbies on that other platform second class citizens. The realisation that currently our technology favours computer usage is not. The first is an argument that sounds like "do not bother, it does not matter", the other leaves room for "we need to work on improving the mobile/tablet experience".
Arguably, calling things a KPI may mean a bias from the start.
Thanks,
GerardM
On 13 September 2014 14:41, Kerry Raymond kerry.raymond@gmail.com wrote:
It would be very interesting to know the size of edits done on mobile vs desktop (it would be even better if we could distinguish between phones and tablets because of the different form factors. I appreciate that we have the problem of definition as a person on a phone can use the desktop interface and vice versa, so there's a matrix of device and interface potentially.
when I say "size of edit", I would really prefer to know the size of the delta, not the difference in the size of the article as reported in the history. My personal hypothesis is that the smaller the form factor the smaller the edits. As much as I love my ipad, it is no substitute for my laptop for serious editing, most edits are harder and slower on the ipad than my laptop, and it's a pain to,do citations on a mobile device. If my hypothesis is correct, I am not personally convinced that the loss of a desktop edit is compensated by the gain of a mobile edit, even it results in the same total number of edits, I think the extent to which an article is improved will be lower on mobile (on average). Not sure how we measure that but KPIs like size of delta and addition of citations would be something that might be interesting. Or, with enough data, we could use the automatic assessment tool to look for articles that change assessment (as measured by the tool) and look at the mobile vs desktop edit counts and ratios etc.
Sent from my iPad
On 11 Sep 2014, at 8:20 pm, Gerard Meijssen gerard.meijssen@gmail.com wrote:
Hoi,
The point of research is that it provides us with understanding that indicates one way or the other the problems we face and, how we are trending towards success or failure.
Thanks to numbers we know the extend of the growth of our mobile readers and editors. The trend is uncontroversial; it grows and it offsets the readers and editors that are declining from computers. Simple research shows that talk pages are unworkable on mobiles and tablets.
Dear Pine, do you agree that such research exists, do you agree that I fairly summarize the data that is available ?
When you want more engagement by our public, ask yourself how can we use our numbers and analyse what might point to things where we could / should mobilise our community. Numbers that show clearly why it makes sense for us to ask volunteers to volunteer. I give you one set of numbers we do not have... The number of negative results from the searches in our Wikipedias individually.
Thanks,
GerardM
On Mon, Sep 15, 2014 at 12:29 PM, Kerry Raymond kerry.raymond@gmail.com wrote:
I have email notification for my watch list
How many items are on your watchlist? I appear to have accumulated 14,871 items on mine since I last zeroed it. Right now there are 159 changes in the last 24 hours.
I'm not sure I could cope with that volume.
Part of the problem is probably my participation in WP:BLP/N, which means that at least once a week I edit an article that's getting lots of edits and is likely to keep getting them for some time.
cheers stuart
I have set the preference to "put anything I edit on my watchlist" (so I can be aware of any short-term reactive edits to my own edits), but I balance that by always asking myself, when I get a watchlist notification, whether the article deserves to stay on the watchlist. I made a decision a while back that I can't do everything, so I chose to make Queensland geography, history and biography my focus, and I generally pare back my watchlist to articles in that space (plus a few other odds and ends that I am particularly fond of). By doing that, I have slowly brought my watchlist down from around 10K to about 4K, which is manageable in terms of daily load, but obviously some topic spaces are more active than others (I wish there were more people interested in Queensland to share the load with).
Kerry
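For anyone wanting to do Kerry's paring in bulk rather than one notification at a time, here is a rough sketch. It assumes you have pasted your raw watchlist (one title per line, as shown at Special:EditWatchlist/raw) into a local file, and it uses a simple keyword filter as a stand-in for "articles in my topic area"; the pared list can then be pasted back on that same special page.

# Keep only watchlist titles that match the chosen topic keywords.
KEEP_KEYWORDS = ("Queensland", "Brisbane")  # example keywords, adjust to taste

with open("watchlist.txt", encoding="utf-8") as f:
    titles = [line.strip() for line in f if line.strip()]

kept = [t for t in titles
        if any(k.lower() in t.lower() for k in KEEP_KEYWORDS)]

with open("watchlist_pared.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept) + "\n")

print("kept", len(kept), "of", len(titles), "titles")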
When my watchlist went over 13,000 I changed my preferences so that I only add things to it that I want on it, and like Kerry I started to pare things back. At first I was just unwatching a trickle of articles: I would look at edits on my watchlist by unfamiliar editors, revert any vandalism, and unwatch the article if it was a good edit and I couldn't remember why I had watchlisted it. Then I did a huge purge, and now I have only a few thousand articles watchlisted. Above a certain size a watchlist becomes a chore, and with the rise of the edit filters I don't find much vandalism by looking at my watchlist nowadays.
Regards
Jonathan Cardy
Hi Pine,
to answer your question on results about improving editor retention: there is a new paper, authored by me and Dario and coming out soon, about MoodBar, an early EE experiment whose aim was to elicit feedback from newly registered editors. It shows that lightweight socialization (e.g. reporting feedback about the editing experience and receiving help from more experienced users) improves long-term editor retention.
The pre-print of the paper is up on arXiv: http://arxiv.org/abs/1409.1496
I also gave a talk about it at the Mediawiki metrics meeting earlier this summer: https://www.youtube.com/watch?v=Rn4-cBYxttA
Cheers,
Giovanni Luca Ciampaglia
✎ 919 E 10th ∙ Bloomington 47408 IN ∙ USA ☞ http://www.glciampaglia.com/ ✆ +1 812 855-7261 ✉ gciampag@indiana.edu
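To make "long-term editor retention" concrete for readers who have not opened the paper, here is a toy illustration of one common operationalization: whether a newly registered editor edits again in some later window, compared across cohorts. The field names, the 30-60 day window, and the cohorts are assumptions for the example, not taken from the MoodBar study.

from datetime import datetime, timedelta

def retained(registration, edit_timestamps, start_days=30, end_days=60):
    # True if the user edited at least once in the window
    # [registration + start_days, registration + end_days).
    lo = registration + timedelta(days=start_days)
    hi = registration + timedelta(days=end_days)
    return any(lo <= ts < hi for ts in edit_timestamps)

def retention_rate(users):
    # users: list of dicts with 'registration' (datetime) and 'edits'
    # (list of datetimes). Returns the fraction of users retained.
    flags = [retained(u["registration"], u["edits"]) for u in users]
    return sum(flags) / len(flags) if flags else 0.0

# Invented cohorts: editors whose MoodBar feedback got a response vs. not.
reg = datetime(2014, 1, 1)
responded = [{"registration": reg, "edits": [reg + timedelta(days=45)]}]
no_response = [{"registration": reg, "edits": [reg + timedelta(days=2)]}]
print(retention_rate(responded), retention_rate(no_response))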
With MoodBar, the communication back to the editor was through their user talk page and email (when known). Do you have any data showing where they saw it (or from where they responded to it)? I've long suspected that new users don't know about User Talk, and that this frustrates our efforts to communicate with them, so I would be interested to know whether there was any difference in reaction between those contacted via user talk alone and those who also got email, and what that might say about user talk as a means of communicating with new users. I note that on the mobile interface running on my iPad, I cannot find a way to get to my User Talk page, short of entering the URL manually or switching to the desktop interface, which makes user talk a pretty useless way of communicating with mobile users.
Sent from my iPad