Hi everyone,
I'm excited to share that our annual survey about Wikimedia communities is now published!
This survey included 170 questions and reached over 4,000 community members across four audiences: Contributors, Affiliate Organizers, Program Organizers, and Volunteer Developers. It helps us hear about the experiences of Wikimedians from across the movement so that teams can use community feedback in their planning and their work. It also helps us learn about long-term changes in communities, such as community health or demographics.
The report is available on meta: https://meta.wikimedia.org/wiki/Community_Engagement_Insights/2018_Report
For this survey, we worked with 11 teams to develop the questions. Once the results were analyzed, we spent time with each team to help them understand their results. Most teams have already identified how they will use the results to help improve their work to support you.
The report could be useful for your work in the Wikimedia movement as well! What are you learning from the data? Take some time to read the report and share your feedback on the talk pages. We have also published a blog post that you can read.[1]
We are hosting a livestream presentation[2] on September 20 at 1600 UTC. Hope to see you there!
Feel free to email me directly with any questions.
All the best, Edward
[1] https://wikimediafoundation.org/2018/09/13/what-we-learned-surveying-4000-co...
[2] https://www.youtube.com/watch?v=qGQtWFP9Cjc
Hi Edward,
Thanks for this publication. This research is likely to be of interest to the WikimediaAnnounce-l (and by extension, Wikimedia-l) and Wikitech-l subscribers, so I suggest that you cross-post this publication to those lists.
After reading this report, I have a question which may be challenging to answer: what should we do to improve our diversity? Many of us, inside and outside of WMF, have wanted to see progress on diversity metrics for years, and I get the impression that while significant attention and resources are being given to diversity, our progress has been disappointing. Perhaps that's a subject that can be discussed further during the video presentation, but I'd also be interested in hearing your comments here on Research-l.
Have a good weekend,
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Thanks for your note, Pine. I believe I have already shared this on Wikimedia-l; I haven't shared it to Announce, so I can do that.
"Diversity" is multifaceted. I think that some areas offer some hope (e.g. program organizers & affiliate organizers have higher proportion of women and geographic representation), others I am not uncertain whether we put a lot of attention (Education & Age), and in others we are seeing little progress (gender on the projects). And perhaps some aren't even on our radar. I think many teams are still working to understand what are the problems and possible levers that can help us to bring change to these measures. Some of those teams include Contributors/Audiences team, Anti-Harassment Tools, Trust & Safety and Community Resources. Each of these teams bringing their own strengths and angles to the problem. I invite you to read the team reports https://meta.wikimedia.org/wiki/Community_Engagement_Insights/2018_Report/Team_Reports .
The Research team is also working on finding a way to capture demographic data this year. While we gather this data through CE Insights, it is not the optimal way to measure demographics. There was also the recent email by Erik Zachte about language diversity (email subject: "Wikipedias, participation per language"). It is always good to start by measuring what you want to change.
I also invite you (and perhaps everyone on this list) to reflect: which numbers related to diversity concern you most? What could you do to improve diversity on the projects? And then decide how you would like to take action.
Hope this helps! Edward
Hi Edward, I'm surprised that this thread only appears in my email under Research-l, but I can see in the WMF mail archives that you sent the email to other lists as well. I wonder if that happened because you used BCC; maybe there is a bug in Gmail. On the topic of diversity research, thanks for the link to the team reports. I'll put those on my list of things to browse.
Regarding the topic of harassment that the person with the email "80hnhtv4agou" raised, I think that it's good to ask what more could and should be done. My view is that WMF shouldn't be directly intervening in community activities, but WMF support for community self-governance is welcome, with actions such as developing better moderation tools and providing financial support to affiliates and community members who want to develop evidence-based training modules. Sydney Poore is on the Anti-Harassment Tools team, and I'm pinging her here to invite her to add any comments that she has.
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
It comes as no great surprise to me that these survey results show very little change in matters of some concern (e.g. diversity, community health). Quite simply, if you don't change the system, don't expect the outcomes to change. I can't speak about most projects, but I don't see any change in how en.WP operates since the last WMF strategic plan, published in 2011. We had a non-diverse, toxic culture then; nothing changed; the culture remains the same. Our active editor numbers go down while the number of articles to be maintained goes up; do the maths and you can see the long-term problem. Admin numbers are also declining.
One big potentially positive change was the Visual Editor. WMF built the Visual Editor specifically to open up editing to a wider group of users and, as someone who does training for new users, I can say it is a game changer for them. However, en.WP didn't change. VE is not the default for new editors on en.WP. It is not enabled for en.WP talk pages, project pages, or even the Teahouse, or any forum where new users might report problems or harassment. Almost every how-to help page gives information only for source-editor users. Commons has blocked new users from using the VE to upload own-work photos, and no useful error message tells them what to do; something generic like "server error" is returned, because Commons just fails the upload and doesn't pass a reason back to the VE.
The old adage "praise in public, criticise in private" remains inverted in the world of Wikipedia. Everyone can see reverted edits and the criticisms on User Talk pages. Meanwhile "Thanks" (our lightest-weight way to praise) is effectively private (yes, I know there is a public log, but at most it tells you who likes whom). And what the public log does show is that most people never thank anyone anyway, which again speaks volumes about our culture. We are all for transparency except, curiously, when thanking someone for a particular edit. The lack of privacy that comes with transparency is a turn-off to some new users. I know from training that some new users don't think it's OK that everyone can read their User Talk page or that their entire contribution history is visible to all. They generally believe that if they were to misbehave, then of course someone in authority (admins in our world) should be able to look at such things to keep the place safe and functioning effectively, but they don't see why just anyone should be able to monitor them, which is a means by which you can stalk or wikihound someone on Wikipedia. Interestingly, pretty much all of those who raise these concerns are women, who are, in real life, the most common victims of privacy invasions (think "up-skirting" vs "up-trousering", think Peeping Tom vs Peeping Tomasina) and stalking. So should we look at trading off some transparency in order to get more diversity?
Vandalism. Many years ago, when I questioned our very soft policy on vandalism (it takes four warnings before you can request that an account be blocked), I was told: "yes, there is a lot of vandalism now, but Wikipedia is new, and once people realise its value and that vandals get blocked, it will stop happening over time". Sadly, nobody told the vandals this; based on my watchlist, they are still very active and still mostly IPs. I note we have not changed our IP policy or our pseudonymous-account policy; editors remain as unaccountable in the real world as always. Many online newspapers and other forums are turning off comments, having learned that anonymous/pseudonymous accounts lead to completely unproductive name-calling and defamatory comments rather than the constructive, civil debate envisaged; yet at en.WP we persist in believing that the same approach can create a positive collaborative culture, which it clearly has not.
There's no willingness even to experiment with anything that might change the culture and I see little likelihood that en.WP's culture will change of its own accord.
However, there is one easy win for diversity at WMF: start diversifying the WMF livestream times. WMF livestreams are usually between 2 and 4 am here in Australia, so I'd like to see a bit of support for Global East diversity by shifting the livestream times so everyone gets a chance to participate live. One small step that WMF could take ...
Kerry
I am a bit more optimistic than Kerry, although I agree that wider support for VE and more publicity for the "thanks" feature would be good.
I agree with Kerry's concern about our labor supply being too small for the demand. Related to this is the difficult situation with our diversity statistics for content contributors; I would hope that any improvement in diversity could be achieved in a way that creates a net positive for the labor supply.
I would not trade down transparency for other possible benefits, and I believe that *off-wiki* WMF and its associates like AffCom should be more transparent about problematic situations and bad news.
I'm not sure that I'd agree that vandalism on ENWP is a huge problem. It's a problem, but I don't think that it's going to overwhelm the encyclopedia soon. However, I do think that it's a nontrivial timesink for experienced and ambitious users who want to protect the quality of the encyclopedia. It would be interesting if there were research that estimated the amount of time that good-faith editors on ENWP spend cleaning up vandalism and handing out blocks.
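A rough sketch of how such an estimate might begin (my own illustration, with the caveats that counting reverts is only a proxy for time spent, and that it assumes the mw-rollback and mw-undo change tags are enabled on ENWP): one could tally tagged revert edits through the public MediaWiki Action API, for example:

# Count recent revert edits on English Wikipedia by change tag.
# Hypothetical sketch: the recentchanges feed only covers roughly the
# last 30 days, and the tag names depend on the wiki's configuration.
import requests

API = "https://en.wikipedia.org/w/api.php"

def count_tagged_edits(tag, newest, oldest):
    """Count recent changes carrying a given change tag between two timestamps."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rctag": tag,          # e.g. "mw-rollback" or "mw-undo"
        "rcstart": newest,     # enumeration runs from newest to oldest
        "rcend": oldest,
        "rclimit": 500,
        "format": "json",
    }
    total = 0
    while True:
        data = requests.get(API, params=params).json()
        total += len(data["query"]["recentchanges"])
        if "continue" not in data:
            return total
        params.update(data["continue"])  # follow the API's pagination cursor

for tag in ("mw-rollback", "mw-undo"):
    print(tag, count_tagged_edits(tag, "2018-09-19T00:00:00Z", "2018-09-12T00:00:00Z"))

Turning counts like these into editor hours would still require assumptions about the time per revert, but it would at least bound the scale of the cleanup workload.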
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Pine, I would absolutely disagree with you about off-wiki transparency. Why should a woman have to publicly disclose the contents of a thoroughly disgusting sexual email for public entertainment because she reverted some guy's edit? Why should a woman be expected to provide details of unwanted physical contact at an event for other men to pontificate about? That's what transparency would mean: the right of the 90% of Wikipedia contributors who are men to decide whether a woman has the right to be offended by these things. Let's put it all out there in the open so everyone can get involved.
"Couldn't it just have been a friendly hug?". "So did his hand actually tweak your nipple or just brush part of your breast?" And so on.
And of course anyone in the world with a web browser could watch on too: the woman's partner, her parents, her children, her colleagues. And of course IPs and new accounts could come along and join in the interrogation too. "How low-cut was your dress? Did you have a bra on?"
Transparency would not work off-wiki, and I don't think it works on-wiki for harassment issues. You might think it does, but only because a lot of stuff never gets reported on the public forums. The folks in private processes (such as oversight) probably see a lot of ugly stuff that the rest of us don't, or the woman just walks away from Wikipedia because she doesn't know there are private ways to report problems, or thinks it's easier just to walk away.
If you want to address diversity, I think you have to address the need for privacy in complaints processes. Although I have only outlined issues relating to women here, I am sure there are similar issues for people of other races, religions, cultures, and so on.
Kerry
Vandalism used to be dealt with entirely manually; then it became semi-automated with tools like Huggle; nowadays much of it is rejected by the edit filters without the vandals managing to save an edit. So while it is still a problem, it is much less of a problem than it used to be. Far less gets through to need human attention, and far less is actually seen by readers: if an edit is rejected by the edit filters or held up by pending changes, readers never see it. We could do better; the German-language Wikipedia has a stronger system called flagged revisions.
Against that, we have an issue with IP contributions that newspapers and others don't have: we recruit our editing community by being easy to edit. One theory of Wikipedia recruitment is that a large proportion of new editors make an IP edit or two before deciding to create an account, and that closing down IP editing would reduce our recruitment of new editors. Of course, that has to be balanced against the possibility that we would get rid of lots of vandals. But here we have to remember another theory: that vandals and trolls will do the minimum registration necessary to do their vandalism or trolling, while people who were going to be helpful and point out a typo are easily deterred. Perhaps someone on this list would fancy doing a research project on this, but I am inclined to assume that the trend among news sites is to drop comments sections entirely rather than merely restrict them to those who create an account, and that this implies the model of restricting comments to those willing at least to create a throwaway account keeps more of the toxicity than it does of the good-faith contributions.
I have seen plenty of sites that have restricted comments further by requiring new accounts to disclose an email address, and that step might deter a larger proportion of bad-faith users than good-faith ones. But I can't see Wikimedia taking such a drastic step in reducing openness, especially with 2018 looking like a year of declining editor volumes now that the rally of 2015/16 has ended.
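If anyone wants a quick first number before designing a proper study, here is a minimal sketch (my own illustration, relying only on the documented public recent-changes API) that samples the latest edits on en.WP and reports the share made from IP addresses:

# Sample the 500 most recent edits on English Wikipedia and report the
# share made by IP (anonymous) editors. Illustrative only: a real study
# would need larger samples, revert follow-up, and account-creation data.
import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "list": "recentchanges",
    "rctype": "edit",
    "rcprop": "user|timestamp",  # with "user", anonymous edits carry an "anon" flag
    "rclimit": 500,
    "format": "json",
}
changes = requests.get(API, params=params).json()["query"]["recentchanges"]
ip_edits = sum(1 for rc in changes if "anon" in rc)
print(f"IP share of the last {len(changes)} edits: {ip_edits / len(changes):.1%}")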
I'm going to respond to Kerry and Jonathan in two parts of one email.
--
Hi Kerry, I did not say that transparency should be a free-for-all, and it's important to keep in mind that transparency from my perspective is intended to ensure due process for everyone involved. That includes ensuring that people who are adjudicating cases are not callously dismissing complaints, mistreating people who have been victimized, neglecting evidence, or rushing to conclusions. I would oppose, for example, people who are adjudicating a case deciding to engage in questioning that is completely unnecessary for dealing with the relevant allegations.
On a related issue, I don't trust WMF to adjudicate cases or involve itself directly in deciding who gets to be on Wikimedia sites or attend Wikimedia events; WMF is not the same thing as Wikimedia, and I remain deeply unhappy with some of WMF's choices over the years and its lack of apology for those choices. I would be more trusting of a somewhat less transparent process for adjudicating off-wiki problems if it were led by people elected from the community, similar to English Wikipedia Arbitration Committee elections. Arbcom is far from perfect, but I have modestly more faith in Arbcom than I do in WMF.

On the other hand, arbitrators are volunteers, and over the years I have seen more than one instance of arbitrators appearing to be stressed. Volunteers with high skill levels and good intentions are a precious resource, and if one of the outcomes of WMF's strategy process is a move toward a global Arbitration Committee, then one of the difficult questions will be how to get an adequate supply of highly skilled, well-intentioned people to volunteer.

On a related note, I prefer to avoid identity politics when deciding who should be on arbitration committees; I feel that identity politics are often poisonous and make it very difficult to have civil dialogue. How to balance the virtue of diversity with the virtue of avoiding identity politics is an issue that I haven't worked out.
We're getting off the topic of research and into more of a policy discussion, so if you'd like to continue this topic, I suggest doing so on Wikimedia-l or on Meta.
--
Hi Jonathan, I'd be supportive of running small experiments with blocking IP editing on ENWP and mid-sized Wikipedias to see whether that is a net positive. As you noted, the research would be somewhat complicated, since researchers would want to check for both positive and negative side effects, but I think it would be worth doing. Would you like to make a proposal in IdeaLab?
Regards,
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Thanks Pine,
In case I didn't make it clear, I am very much in the camp that IP editing is our lifeline, the way we recruit new members. If someone isn't happy with Citizendium et al. as tests of that proposition, then feel free to propose tests. I am open to being proved wrong, if someone doesn't mind wasting their time checking what seems obvious to me.
Just please, if you do so, make sure you test for the babies that I fear would be thrown out with the bathwater, i.e. the good-faith newbies.
I am not short of promising lines of enquiry, or of more productive uses of my time. My choice is which promising lines of enquiry to follow with the time I have available, and banning IPs isn't one of them.
One area where we might have more agreement is the default of four warnings and a block for vandalism. I think it bonkers that we block edit warriors for a first offence but usually don't block vandals until a fifth. I know that the four-warnings-and-a-block approach dates back to some of the earliest years of the wiki, but I am willing to bet that it wasn't very scientifically arrived at, and that a study of the various behaviours we treat this way would probably conclude that we could reduce the number of warnings for vandals, while we might want a longer dialogue with non-neutral editors, copy-pasters, and those who add unsourced material. After all, many of our editors started without understanding issues like neutrality, and whilst the few former vandals we have don't generally hold a grudge that their early vandalism led to a block, the same isn't always true of others.
The other issue that could really use some research is the chilling-effect theory. Here the community is divided: some honestly believe that the high-quality work of certain individuals justifies a certain level of snark, even to the point of harassment. Others, including myself, believe that tolerance of bad behaviour drives away some good editors and fails to improve the behaviour of some who would comply with stricter civility enforcement. It would be really useful to have a study one could point to when that argument next recurs.
(Re: Jonathan's 'Chilling Effect' theory and Kerry's call for experiments to increase gender diversity)
Kerry: In a magic world, where I could experiment with anything I wanted to without having to get permission from communities, I would experiment with enforceable codes of conduct that covered a wider range of harassing and hostile behavior, coupled with robust & confidential incident reporting and review tools. But that's not really an 'experiment', that's a whole new social/software system.
I actually think we're beyond 'experiments' when it comes to increasing gender diversity. There are too many systemic factors working against increasing non-male participation. In order to do that you would need to increase newcomer retention dramatically, and we can barely move the needle there on EnWiki, for both social and technical reasons. But one non-technical intervention might be carefully revising and re-scoping policies like WP:NOTSOCIAL that are often used to arbitrarily and aggressively shut down modes of communication, self-expression, and collaboration that don't fit so-and-so's idea of what it means to be a Wikipedian.
Initiatives that start off-wiki, like women-oriented edit-a-thons and outreach campaigns, are vitally important, and participants could certainly be supported better in maintaining a sense of community once the event is over and they find they're stuck alone in hostile wiki-territory. But I'm not sure what the best strategy is there, and these kinds of initiatives are not large-scale enough to make a big overall impact on active editor numbers on their own, though they set important precedents, create infrastructure, change the conversation, and do lead to new editors.
The Community Health initiative team ( https://en.wikipedia.org/wiki/Wikipedia:Community_health_initiative ) just hired a new researcher who has lots of experience in the online harassment space. I don't feel comfortable announcing their name yet, since they haven't officially started, but I'll make sure they subscribe to this list, and will point out this thread.
Jonathan: This study https://dl.acm.org/citation.cfm?id=2145265 is the one I cite. There's a more recent (paywalled!) follow-up https://link.springer.com/article/10.1007/s11199-015-0573-y (expansion?) that I haven't read yet, but which may provide new insights. And this short but powerful ethnographic study https://dl.acm.org/citation.cfm?id=2702514. And this lab study https://www.sciencedirect.com/science/article/pii/S0747563216306781 on gendered perceptions of feedback and anonymity. And the (ancient, by now) former contributors survey https://strategy.wikimedia.org/wiki/Former_Contributors_Survey_Results, which IIRC shows that conflict fatigue is a significant reason people leave. And of course there's a mountain of credible evidence at this point that antisocial behaviors drive away newcomers, irrespective of gender.
Thanks for raising these questions,
- J
On Wed, Sep 19, 2018 at 3:21 AM, Jonathan Cardy <werespielchequers@gmail.com> wrote:
Thanks Pine,
In case I didn't make it clear, I am very much in the camp that holds IP editing is our lifeline, the way we recruit new members. If someone isn't happy with Citizendium et al. as tests of that proposition, then feel free to propose tests. I am open to being proved wrong if someone doesn't mind wasting their time checking what seems obvious to me.
Just please, if you do so, make sure you test for the babies that I fear would be thrown out with the bathwater, i.e. the good-faith newbies.
I am not short of promising lines of enquiry, nor of more productive uses of my time. My choice, for the time I have available for such things, is which promising lines of enquiry to follow, and banning IPs isn't one of them.
One place where we might have more agreement is the default of four warnings and a block for vandalism. I think it bonkers that we block edit warriors for a first offence but usually don't block vandals until a fifth offence. I know that the four-warnings-and-a-block approach dates back to some of the earliest years of the wiki, but I am willing to bet that it wasn't very scientifically arrived at, and that a study of the various behaviours we treat this way would probably conclude that we could reduce the number of warnings for vandals, whilst we might want a longer dialogue with non-neutral editors, copy-pasters, and those who add unsourced material. After all, many of our editors started without grasping issues like neutrality, and whilst the few former vandals we have don't generally hold a grudge that their early vandalism led to a block, the same isn't always true of others.
The other issue that could really use some research is the chilling effect theory. Here the community is divided: some honestly believe that the high-quality work of certain individuals justifies a certain level of snark, even to the point of harassment. Others, including myself, believe that tolerance of bad behaviour drives away some good editors and fails to improve the behaviour of some who would comply with stricter civility enforcement. It would be really useful to have a study one could point to when that argument next recurs.
On Wednesday, September 19, 2018 at 8:29 AM, Pine W wiki.pine@gmail.com wrote:
I'm going to respond to Kerry and Jonathan in two parts of one email.
--
Hi Kerry, I did not say that transparency should be a free-for-all, and it's important to keep in mind that transparency from my perspective is intended to ensure due process for everyone involved. That includes ensuring that people who are adjudicating cases are not callously dismissing complaints, mistreating people who have been victimized, neglecting evidence, or rushing to conclusions. I would oppose, for example, people who are adjudicating a case deciding to engage in questioning that is completely unnecessary for dealing with the relevant allegations.
I agree there are some systemic factors that may prevent us achieving 50-50 male-female participation (or, in these enlightened non-binary times, 49-49-2). Studies continue to show that wives still spend more hours on domestic tasks than their husbands, even when both are in full-time employment, and clearly less free time means less time for Wikipedia. Still, men now do more housework than they once did. (My husband would argue that I have never let housework take priority over Wikipedia, but maybe I'm not typical!) Similarly, we have not yet seen pay rates for women reach parity with men, but they are moving closer. A gender balance of 90-10 that might once have been the norm in many occupations is now unusual. Wikipedia is a child of the 21st century; one might expect it to reflect the societal norms of this century, not the 19th.
Women use wikis like Confluence in workplaces without apparent difficulty. But I note that modern for-profit wikis have visual editing and tools that import/export from Word as normal modes of contribution.
I agree entirely with you about outreach and off-wiki activities. I said, when there was the big push to "solve the women problem" by such events, that it wouldn't make the difference, because the problem is on-wiki. The majority of people who attend my training classes and come to the events I support are women. It's not that women can't do it. It's not that they don't want to do it. As you say, it's just that it's such an unpleasant environment to do it in, and that's what women don't like. For that matter, a lot of men don't like it either.
What shall we write on Wikipedia's tombstone? "Wikipedia: an encyclopedia written by the most unpleasant people"?
Can one create cultural change? Yes, I've seen it done in organisations. You tell people what the new rules are, and you illustrate with examples of acceptable and unacceptable behaviours. You offer a voluntary redundancy program for those who don't wish to stay, and you make it clear that those who wish to stay and continue to engage in the unacceptable behaviours will be "managed out" through performance reviews. You run surveys that measure your culture throughout the whole process. Interestingly, the cultural change almost always involved becoming less critical, more collaborative, less micromanaged, more goal-oriented, and more self-starting, many of which I would say apply here (except perhaps being more self-starting; I don't think that's our problem).
En.WP can change but WMF will have to take a stand and state what the new culture is going to be. En.WP will not change of its own accord; we have years of evidence to demonstrate that.
Kerry
A recently published report which is relevant to this discussion: https://meta.wikimedia.org/wiki/Gender_equity_report_2018/Barriers_to_equity
I'm appreciative that we're having this conversation - not in the sense that I'm happy with the status quo, but I'm glad that some of us are continuing to work on our persistent difficulties with contributor retention, civility, and diversity.
I've spent several hours on ENWP recently, and I've been surprised by the willingness of people to revert good-faith edits, sometimes with blunt commentary or with no explanation. I can understand how a newbie who experienced even one of these incidents would find it to be unpleasant, intimidating, or discouraging. Based on these experiences, I've decided that I should coach newbies to avoid taking reversions personally if their original contributions were in good faith.
I agree with Jonathan Morgan that WP:NOTSOCIAL can be overused.
Kerry, I appreciate your suggestions about cultural change. I can think of two large-scale ways to influence culture on English Wikipedia.
1. I think that there should be more and higher-quality training and continuing education for administrators in topics like policies, conflict resolution, communication skills, legal issues, and setting good examples. These trainings would be one way through which cultural change could gradually happen over time. For what it's worth, I think that there are many excellent administrators who do a lot of good work (which can be tedious and/or stressful) with little appreciation. Also, my impression is that ENWP Arbcom has become more willing over the years to remove admin privileges from admins who misuse their tools. I recall having a discussion a while back with Rosie on the topic of training for administrators, and I'm adding her to this email chain as an invitation to participate in this discussion. In short, I think that offering training to administrators could help facilitate changes to ENWP culture.
2. I think that I can encourage civil participation in ENWP in the context of my training project https://meta.wikimedia.org/wiki/Grants:Project/Rapid/Pine/Continuation_of_educational_video_and_website_project which I'm hoping WMF will continue to fund. ENWP is a complex and sometimes emotionally difficult environment, and I'm trying to set an encouraging tone in the online training materials. I hope to teach newbies about the goals of Wikipedia as well as policies, how to use tools, and Wikipedia culture. I am hopeful that the materials will improve the confidence of new contributors, improve their retention, and help them increase the quality and quantity of their contributions. I hope that the early portions of the project will be well received and that, over time, as it incrementally increases in scale and reach, it will influence the overall culture of ENWP to be more civil.
Regards,

Pine ( https://meta.wikimedia.org/wiki/User:Pine )
I believe administrators outside of the US, on en.wikipedia and on Wikidata etc., do not understand our freedom of speech and our right to due process, and that there is a cultural misunderstanding and a lack of patience on their part, which leads to an abuse of power and a breaking of the rules when it comes to blocking IPs and others for just standing up for themselves. To that end, they do not see the good-faith edits that were made and not reverted, and they judge based on others' level of intelligence, not their own. Everything starts out nice, in tea rooms, on noticeboards, forums, and on their talk pages etc., and then it all goes south, as on en.wikipedia, where administrators with a now "conflict of interest" just block you to end it.

On Wikidata, which is more technically challenging, editors who claim ownership of pages, come from outside of North America and Europe, revert over misunderstandings, and cannot express themselves in English just rely on the administrators' noticeboard to complain against IPs without warning, not giving the IP the chance to defend himself and to explain that it was an edit war. Administrators who see these posts at hundreds an hour just block the IPs or the pages without any kind of investigation, based on the lies of the accusers. And these same administrators, who have participated on their talk pages, are now in a "conflict of interest", being directly involved.

And on ru.wikipedia and ru.wikidata, English speakers are not welcome, from their board down to their users.
While I have no objection to the administrator training, I don't think most of the problem lies with administrators. There's a lot of biting of good-faith newbies done by "ordinary" editors (although I have seen some admins do it too). And, while I agree that there are many good folk out there on en.WP, unfortunately the newbie tends to meet the other folk first; or perhaps it's that one bad experience has more impact than one good experience.
Similarly, while Arbcom's willingness to desysop folks is good, I doubt a newbie knows how or where to complain in the first instance, and there's a high level of defensive reaction if they do. Some of my trainees have contacted me about being reverted for clearly good-faith edits on the most spurious of reasons. When I have restored their edit with a hopefully helpful explanation, I often get reverted too. If a newbie takes any action themselves, it is likely to be an undo, and that road leads to a 3RR block or at least a 3RR warning. The other action they take is to respond on their User Talk page (when there is a message there to respond to). However, such replies are usually ignored; whether the other user isn't watching for a reply or just doesn't like having their authority challenged, I don't know. But it rarely leads to a satisfactory resolution.
One of the problems we have with Wikipedia is that most of us tend to see it edit-by-edit (whether we are talking about a new edit or a revert of an edit), we don't ever see a "big picture" of a user's behaviour without a lot of tedious investigation (working through their recent contributions one by one). So, it's easy to think "I am not 100% sure that the edit/revert I saw was OK but I really don't have time to see if this is one-off or a consistent problem". Maybe we need a way to privately "express doubt" about an edit (in the way you can report a Facebook post). Then if someone starts getting too many "doubtful edits" per unit time (or whatever), it triggers an admin (or someone) to take a closer look at what that user is up to. I think if we had a lightweight way to express doubt about any edit, then we could use machine learning to detect patterns that suggest specific types of undesirable user behaviours that can really only be seen as a "big picture".
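A minimal sketch of how such a private "doubt" signal might be aggregated, assuming entirely hypothetical thresholds (no such feature exists in MediaWiki today; the names and numbers here are invented for illustration):

    from collections import defaultdict, deque
    from time import time

    # Hypothetical parameters: surface an editor for human review once
    # enough *distinct* editors have expressed doubt about their edits
    # within a sliding window.
    DOUBT_THRESHOLD = 5          # distinct doubters needed to trigger review
    WINDOW_SECONDS = 7 * 86400   # one-week sliding window

    # Per-editor queue of (timestamp, doubter) pairs, oldest first.
    recent_doubts = defaultdict(deque)

    def express_doubt(editor, doubter, now=None):
        """Record a private doubt about one of `editor`'s edits; return
        True if the editor's recent edits now warrant a closer look."""
        now = time() if now is None else now
        doubts = recent_doubts[editor]
        doubts.append((now, doubter))

        # Expire doubts that have aged out of the sliding window.
        while doubts and doubts[0][0] < now - WINDOW_SECONDS:
            doubts.popleft()

        # Require distinct doubters so one grudge can't trigger review.
        return len({d for _, d in doubts}) >= DOUBT_THRESHOLD

A per-editor doubt rate computed this way could also serve as a feature, or a training label, for the machine-learning pattern detection suggested above.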
Given this is the research mailing list, I guess we should be talking about ways research can help with this problem.
Kerry
Hi Kerry,
Your comments are well taken (at least by me)!
I like the idea of letting users upvote or downvote edits, and having a time-weighted average of those scores be public or at least visible to administrators. Users who accumulate a significant number of downvotes would be good for admins to review, especially if those downvotes come from multiple users in a short period of time. Upvotes could be closely linked to the "Thanks" feature, except that users could be offered the option to thank anonymously or non-anonymously. I suggest that you post your proposal in IdeaLab, and I may make some comments on the IdeaLab post. The Anti-Harassment Tools Team might be interested in that idea for their own reasons.
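One common way to implement the time-weighted average described above is exponential decay; the half-life and the vote log below are invented example values, not an existing feature:

    import math
    import time

    HALF_LIFE = 30 * 86400  # hypothetical: a vote loses half its weight in 30 days

    def decayed_score(votes, now=None):
        """Time-weighted score for one user's edits.

        `votes` is an iterable of (timestamp, value) pairs, where value
        is +1 for a (possibly anonymous) thanks/upvote and -1 for a
        downvote. Recent votes count almost fully; old ones fade to zero.
        """
        now = time.time() if now is None else now
        return sum(
            value * math.exp(-math.log(2) * max(0.0, now - ts) / HALF_LIFE)
            for ts, value in votes
        )

    # Three downvotes in the past few days outweigh a month-old upvote.
    now = time.time()
    log = [(now - d * 86400, v) for d, v in [(1, -1), (2, -1), (3, -1), (40, +1)]]
    print(round(decayed_score(log, now), 2))  # -2.47

A production version would presumably also discount repeated votes from the same user and weight a burst of downvotes from many distinct users more heavily, as suggested above.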
Regarding reversions, I think that I heard Jonathan Morgan once say that reverting good-faith new editors makes them significantly more likely to stop editing. Perhaps he could share some research or thoughts on that point, and any other thoughts about the problem with excessively aggressive reversions and/or comments on reversions.
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Pine
This paper has some good data about gender, new editors, and reverting:
https://www.researchgate.net/profile/Shilad_Sen/publication/221367798_WPClub...
It shows that both male and female newbies are equally likely to drop out after being reverted for good-faith edits, BUT that female newbies are more likely to be reverted than male newbies, leading to a greater proportion of them dropping out.
It also shows that male and female editors tend to be attracted to different types of topic. "There is a greater concentration of females in the People and Arts areas, while males focus more on Geography and Science." (see Table 1 in the paper). And their engagement with History seems lower.
So why are newbie women reverted more? This paper does not investigate that. But I think it has to be either that they are reverted because they are women (i.e. conscious discrimination) or that women's edits are less acceptable in some way.
I have *hypothesised* that newbie women may get reverted more because women show higher interest in People but not in History, suggesting women are more likely to be editing articles about living people than about dead people. BLP policy is stricter on verification compared with dead-people topics, or with male-attracting topics like Geography and Science, so women are perhaps doing more BLP edits as newbies and are more likely to be reverted because they fail to provide a citation, or because their citation comes from a source which may not be considered reliable (e.g. a celebrity magazine).
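If one had a sample of good-faith newcomer edits labelled with self-reported gender and with whether the target page is a BLP, this hypothesis could be checked with a simple breakdown of revert rates. A sketch (the column names and sample data are entirely made up):

    import pandas as pd

    # Hypothetical columns: one row per good-faith newcomer edit.
    # gender: self-reported; is_blp: target is a biography of a living
    # person; reverted: the edit was reverted within 48 hours.
    edits = pd.DataFrame({
        "gender":   ["f", "f", "f", "m", "m", "m", "f", "m"],
        "is_blp":   [True, True, False, False, True, False, True, False],
        "reverted": [True, False, False, False, True, False, True, False],
    })

    # Revert rate broken down by gender and BLP status.
    print(edits.groupby(["gender", "is_blp"])["reverted"].mean())

    # Share of each gender's edits that land on BLPs: the hypothesis
    # predicts this exposure differs by gender.
    print(edits.groupby("gender")["is_blp"].mean())

If the gender gap in revert rates shrinks once BLP status is held constant, that would favour the exposure explanation over direct discrimination.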
If this could be established as at least a part of the problem, there might be targeted solutions to address it. E.g. maybe newbies should not be allowed to edit articles which are BLPs or which have a high revert history (suggesting it's dangerous territory for some reason, e.g. real-world controversy or "ownership"), and instead be deflected to the Talk page to suggest edits (as with a protected or semi-protected article). Currently we auto-confirm user accounts at 10 edits or 4 days (from memory). But those thresholds are based on the likelihood of vandalism (early good-faith behaviour is a good predictor of future good-faith behaviour). Having trained people, I know that the auto-confirmation threshold should not be used as a "beyond newbie" indicator; they are newbies for many more edits.
How many edits do you need to stop being a newbie? I don't know, but, as I know from my own experience with over 100k edits, if I edit an article outside my normal interests I am far more likely to be reverted than in my regular topic area, so we can all be newbies in unfamiliar topic spaces. There is a lot of convention, pre-existing consensus, and other "norms" in topic spaces that the newbie-to-this-topic doesn't know. All editors in this situation may back off, but the established editor has a comfort zone (their normal topic space) to return to; the total newbie does not.
Kerry
Hoi, FYI https://tools.wmflabs.org/scholia/work/Q27797938
The point is that the relevance of research and of its authors becomes increasingly clear from the data we hold in Wikidata. Thanks, GerardM
Thanks for the link, that was an interesting piece of research.
I’m glad, though not surprised, to see that among the regulars women are more likely to become admins than men. I would like to count that as evidence that the core community is not consciously sexist, though it may also be support for a theory of mine about the heyday of adminship, 2004-2007 - just before I started editing. This was the era when “good vandalfighter” was sufficient qualification to pass adminship. It was also the era when a lot of teenagers and adolescents came to the defence of Wikipedia and patrolled it for the sort of vandalism that is now reverted by computer programs. My hypothesis about that era is that teenage boys would oppose each other's requests for adminship (RFA) if they thought that other boys were younger or less qualified than they had been when they became admins, but teenage boys didn’t oppose the RFAs of teenage girls. Girls, and women in general, are less likely to push the boundaries by running for adminship before they are clearly qualified, and here I think the vagueness of the criteria for adminship may deter women more than men. It would be interesting to look at the gender ratios of clear versus narrow results at RFA; I suspect that women are disproportionately among the near-unanimous results (>95% support) as opposed to those with more substantial opposition.
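That comparison would be cheap to prototype. Below is a minimal sketch in Python, assuming a hypothetical dataset of RFA outcomes with a gender label (however obtained) and a support fraction; the field names, the sample numbers, and the 95% cut-off are illustrative assumptions, not real data.

    # Sketch of the proposed RFA analysis: compare women's share among
    # near-unanimous RFAs versus RFAs with substantial opposition.
    from dataclasses import dataclass

    @dataclass
    class Rfa:
        gender: str     # "female", "male", or "unknown"
        support: float  # support fraction, e.g. 0.97

    def share_of_women(rfas):
        known = [r for r in rfas if r.gender in ("female", "male")]
        if not known:
            return float("nan")
        return sum(r.gender == "female" for r in known) / len(known)

    def clear_vs_narrow(rfas, cutoff=0.95):
        clear = [r for r in rfas if r.support > cutoff]
        narrow = [r for r in rfas if r.support <= cutoff]
        return share_of_women(clear), share_of_women(narrow)

    # Made-up sample: prints women's share in >95% RFAs vs the rest.
    sample = [Rfa("female", 0.98), Rfa("male", 0.97), Rfa("male", 0.80),
              Rfa("female", 0.96), Rfa("male", 0.91)]
    print(clear_vs_narrow(sample))

If women really are disproportionately in the near-unanimous group, the first number would consistently exceed the second across years.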
I am disappointed to hear that the gender ratio is not improving, but considering the ossification of a broadly stable community and the shorter Wikipedia career of women, the risk is that the gender gap grows over time unless we can lengthen the Wiki career of women or get an infusion of new women into the community.
The higher revert rate, and greater contention, does leave me wondering if Wikipedia’s notability criteria may be institutionally sexist, or at least a reflection of sexism in society. In academia and many other professions it is, or has been, harder to get to the top if you take a career break to have a family. Wikipedia’s notability criteria generally skew our biographies towards those who get to the top of their professions. This could be a double whammy for women editors: if they write about other women, they may be steering themselves towards subjects deemed by Wikipedia to be more marginally notable, and if they seek to avoid controversy by writing about the unsung figures in a field, they are skewing themselves towards articles where sufficient notability for inclusion in Wikipedia is itself controversial.
The huge differential between female readership and female editorship, and our relative failure compared to some other sites, leaves me wondering how much of this is down to our problems with mobile editing. In theory Wikipedia can be edited on the mobile platform; however, very few do that, and the mobile platform is much closer to being a broadcast medium than the “desktop” platform. If the gender ratio among PC and netbook users is different from the gender ratio among tablet and smartphone readers, then this could account for some of our gender imbalance - and make improving editing on mobile a gendergap issue as well as an ethnic-gap issue.
WSC
WSC,
I think that we'd need to be very careful about lowering the bar for BLPs on ENWP, because there are innumerable non-notable professionals who seem to pay people to add their biographies (and/or small organizations) to Wikipedia, and I am more than happy to keep them out of the world's encyclopedia unless they've done something more significant than publishing an occasional scholarly article, owning a small consultancy, and receiving a few professional distinctions like "adjunct professor of cardiology at XYZ University". I'm not saying that we can't lower the bar, but we'd want to be very careful about doing so in order to avoid giving marketers and PR people a wider opening for using Wikipedia as a marketing and PR platform.
I'm very supportive of improving the user experience for aspiring contributors who use mobile devices, but I am not optimistic that this will lead to a substantial increase in the population of ENWP Wikipedians who can become proficient with the details of our many policies, are willing to persist through negative experiences with other contributors (including vandals, overzealous patrollers, POV-pushers, etc.), and volunteer their time for high profile roles like WikiProject coordinator or ENWP administrator. Perhaps non-English Wikipedias do better with editor retention; I'm also thinking that Commons might be a good place for new contributors to start if and when mobile editing becomes more user-friendly.
I think that making reversions feel less hostile would be good for diversity and good for editor retention in general, so I'd suggest that WMF prioritize working on that point. I'm also hoping to improve user onboarding with my video project and in collaboration with the WMF Growth team. I generally appreciate how Kerry is thinking about these problems; she and I have both given feedback to the WMF Growth team.
Regards,
Further thought regarding the notability criteria for BLPs: Asaf made a suggestion a while ago, and unfortunately I can't remember exactly where I heard about it, but I thought that it was a good idea. He suggested being more context-specific when setting the bar for BLPs. I think that his statement went something like this: in a culture where having information about someone published in newspapers is a rarity, the absence of newspaper coverage is not a good test of whether someone should be considered notable. I think that Asaf's proposal was more nuanced than I'm describing, but in general I thought that it was worth seriously considering.
If someone meets a revised notability bar for a BLP, there may still be a problem with finding information that is verifiable and reliable. I don't know of a good way to deal with that. I think that we have a problem with believing (this is a bit of an exaggeration, but I think you'll understand my point) that if something is written in a book published by a reputable publisher then it must be reliable and verifiable, while something communicated only orally, in a culture where written communications are rare or nonexistent, is not. I don't know how to deal with that problem, but I do think that it's a problem.
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Stripping out a long email trail ...
I am not advocating lowering the BLP bar as there are genuine legal needs to prevent libel.
What I am advocating is not letting new users do their first edits in “high risk” articles. When I do training, I pick exercises for the group which deliberately take place in quiet backwaters of Wikipedia, e.g. adding schools to local suburb articles. Such articles have low readership, few watchers, and no BLP considerations, i.e. they are low-risk articles. If the newbie's first edit is a bit of a mess, probably no reader will see it before it is fixed by a subsequent edit. They will be able to get help from me to fix it before anyone is harmed by it and before anyone reverts them.
The “organic” newbie can dive into any article. It would be a very interesting research question to look at reverts and see if we can develop risk models that predict which articles are at higher risk of reverted edits (e.g. quality rating, length, type of article such as BLP, level of readership, number of active watchers, etc.), and there might be separate models specifically for newbie revert risk and female-newbie revert risk.
Or we could simply calculate the proportion of reverted edits and just declare anything over some threshold as “high risk”, without bothering to find out what the article characteristics are. We could also calculate the newbie revert rate.
Then we have something actionable. We could treat the high risk articles (by predictive model or straight stats) as semi-protected and divert newbies from making direct edits. Or at least warn them before letting them loose. For that matter, warn any user if they are entering into a high conflict zone.
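The “straight stats” option is easy to prototype. Here is a minimal sketch in Python that flags any article whose recent revert proportion exceeds a threshold; the 20% threshold and the input format are illustrative assumptions, and a richer model could add quality rating, BLP status, watcher counts, and so on.

    # Flag articles whose revert proportion exceeds a threshold, and
    # report the newbie-specific revert rate alongside for comparison.
    def revert_rate(edits):
        """edits: list of dicts like {"reverted": bool, "newbie": bool}."""
        if not edits:
            return 0.0
        return sum(e["reverted"] for e in edits) / len(edits)

    def high_risk_articles(edits_by_article, threshold=0.20):
        flagged = {}
        for title, edits in edits_by_article.items():
            overall = revert_rate(edits)
            if overall > threshold:
                newbie_edits = [e for e in edits if e["newbie"]]
                flagged[title] = {"overall": overall,
                                  "newbie": revert_rate(newbie_edits)}
        return flagged

    # Made-up data: only "Some BLP" crosses the threshold.
    data = {"Some BLP": [{"reverted": True, "newbie": True},
                         {"reverted": False, "newbie": False},
                         {"reverted": True, "newbie": True}],
            "Quiet suburb": [{"reverted": False, "newbie": True}] * 10}
    print(high_risk_articles(data))

An article flagged this way could then be treated as semi-protected for new accounts, or simply trigger the warning suggested above.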
When you learn to drive a car, you normally start in the quiet streets, not a busy high speed freeway, not narrow winding roads without guard rails up a mountain. Why shouldn’t we take the same attitude to Wikipedia? Start where it is safe.
Kerry
Those all sound like good suggestions. I have flagged this entire conversation for me to review if and when I get funding for continuing work on my project. I hope that the WMF Growth team is also aware of this conversation.
By the way, Edward, if you're still reading this, thanks for letting us have an extended conversation about community health in the thread that you started about the CEI survey.
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Kerry,
I like this a lot except for one small, but critical, distinction. I want to get your take on it (yours specifically, in this case, because of your background and the thought you've put into this issue).
I think that explicitly forbidding newcomers from performing certain kinds of actions, or editing certain pages, is a mistake. This was a mistake with ACTRIAL, and it would be a mistake with any other newcomer quality-control or harm-mitigation strategies--however well intentioned.
It's a mistake for two reasons. First, it runs counter to the spirit of Wikipedia. Wikipedia has become more 'closed' over time in both formal and informal ways. This is a common pattern for social movements as well as organizations--it's not unexpected, and to a certain extent it may be necessary, but in *Wikipedia's* case it directly violates the fundamental values and goals of the project. That means creeping bureaucracy and "in-group" mentalities are inherently more damaging to Wikipedia than they would be to, say, Microsoft, or Facebook, or even Stackexchange.
Second, being explicitly denied the opportunity to make particular kinds of contributions (as opposed to being nudged towards other options, explained to why something is a bad idea, or shown the likely outcomes of certain actions) is an even bigger motivation-killer, long term, than having bad experiences due to stumbling onto the "freeway" (nice metaphor!).
Especially considering that both the current EnWiki community and the current content embed major biases and gaps, we can't afford to keep the new people who have the expertise, the perspective, and the passion to correct those biases and fill those gaps from participating as full-fledged members of the community. Full stop. You *can* have higher walls and easier quality control, but you can't have higher walls and higher newcomer retention (or diversity).
Wikipedia (esp. EnWiki) has basically two options at this point, with maybe some narrow-ish middle ways between them: 1. Continue to make it harder and harder for new people to contribute, through political and technological means, thus preserving the current content to a great degree, but diminishing the relevance of the project as a whole as it becomes increasingly incomplete, out of date, and limited in scope.
2. Try to make it as easy as possible for newcomers (with their new knowledge, sometimes different values, and yes, sometimes *mixed motivations*) to contribute, and try to make the project feel as exciting for them as it was for people who joined in 2004; accept that taking this track will lead to a degree of vandalism and COI (although probably not different in scale than current or historical levels), and invest heavily in algorithmic quality control, streamlined onboarding and socialization, diversity-friendly policy change, expansive and public offline initiatives, and all the other "suite" of methods intended to scale the ability of the current community to handle additional growth and diversity in content and contributors.
#1 involves no great risk to the "community" besides gradual obsolescence; Wikipedia will go the way of many other social institutions that failed to adapt. But it will do so slowly, and continue to provide value in the process. It just won't ever be the world's encyclopedia.
#2 involves risk because the intention behind it is that the community will look different, the content will look different, the mechanisms for contributing will look different, and the policies will look different in 10 years vs. today. But it is the only shot at continuing to meaningfully pursue the original mission at this point. I personally would love to see this happen--as a contributor, as a scholar, as a world citizen who believes in Wikipedia--but it involves risk because it means that people who have power will need to give it up. That's never easy.
(Opinions my own, not those of WMF) - J
I am with you 100% on the principle that if we don't change how we do things, nothing will change in terms of our outcomes. But I guess what we are debating is what the change should be.
Our problem is indeed one of ideology as we have three statements of ideology underpinning Wikipedia. We have the vision:
"Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That's what we're doing."
We have the 5 Pillars, which I assume we all know, so I won't elaborate here,
and we have the main page that says "Welcome to Wikipedia, the free encyclopaedia that anyone can edit."
Frankly, these various ideologies don't combine terribly well, and I think that "anyone can edit" is something that we do have to re-think. At the end of the day we are building and (increasingly) maintaining an encyclopaedia. We do need adequately educated people to do this. The ability to research and write is not innate; most people have to learn it through a formal education process. Now, I am not suggesting a formal education barrier to participation, but really, if you can't cite, you can't write for Wikipedia. Maybe you can fill other roles in Wikipedia, but not as a content writer.
We absolutely do need new contributors. We know we have a contributor gap and a content gap, and there is research that shows these are related. But I am not convinced that the vandals and self-promoters are part of our contributor gap. I suspect our bad-faith editors are predominantly white, male, and first-world, and we have plenty of good-faith contributors from that group already. Do we have any evidence that vandals turn into productive contributors? Have we surveyed our existing editor community on how many of them started out as vandals?
Maybe we could turn CoI and bias around to be a motivator? A lot of the self-promoters seem to be quite well educated. Let's have some new namespaces, e.g. "CV" (for CVs) and "Essay" (for opinions). Maybe you earn the right to create one of these for every N productive edits you do in mainspace. Obviously they get displayed to the reader in a way that makes clear these are "personal views" or whatever words are appropriate, so there is no misrepresentation of what they are. And of course they should be subject to our normal rules about puffery, hate speech etc. And their owners can choose to have or not have an associated talk page. But I would put one caveat on these new namespaces: verified identity. If you want to advertise yourself and your views, you need to stand up and be honest about who you are (but it doesn't have to be linked to your normal user name or IP editing, for those who edit on "sensitive" topics). After enough good mainspace edits, you get a token that you can "cash in" for one of these personal statement pages. This works well for the paid editors. They can write good edits on mainspace topics to earn tokens to write CVs and personal statements for their clients (as long as their clients are happy to verify their real-world identity). And as the easiest way to get a good edit is to revert vandalism, maybe we can solve our vandalism problem that way.
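To make the token mechanics concrete, here is a minimal sketch in Python; the figure of one token per 50 productive edits, the class shape, and the namespace names are purely illustrative assumptions, not a worked-out proposal.

    # Earn one personal-page token per N productive mainspace edits;
    # redeeming requires a verified real-world identity.
    N_EDITS_PER_TOKEN = 50  # illustrative value

    class Account:
        def __init__(self, name, identity_verified=False):
            self.name = name
            self.identity_verified = identity_verified
            self.productive_edits = 0
            self.tokens_spent = 0

        def tokens_available(self):
            return self.productive_edits // N_EDITS_PER_TOKEN - self.tokens_spent

        def redeem_personal_page(self, namespace):
            if not self.identity_verified:
                raise PermissionError("personal pages require a verified identity")
            if self.tokens_available() < 1:
                raise PermissionError("not enough productive edits banked")
            self.tokens_spent += 1
            return f"{namespace}:{self.name}"

    acct = Account("Jane Doe", identity_verified=True)
    acct.productive_edits = 120
    print(acct.redeem_personal_page("CV"))  # "CV:Jane Doe", one of two earned tokens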
Maintenance is a problem. In 2016 we had a census in Australia. We still have loads of town/suburb articles with 2011 census data, and I stumble over 2006 data too. (Note this is not easy to automate, as the internal identifiers used for the places are not stable from one census to the next -- if they were, we would have automated this.) Let's set this kind of stuff up in a pipeline, like Mechanical Turk, as another way to get "good edits". Indeed, let's consider whether the price of paying folks in the third world to do this kind of maintenance might be worth it. They are pretty cheap and they need the money.
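As a sketch of what that pipeline could look like: since the identifiers are not stable, one could match places by normalised name and queue the leftovers for a human, which is exactly where the instability bites. The data format and place names below are illustrative assumptions.

    # Build census-update tasks from two censuses keyed by place name.
    def norm(name):
        return " ".join(name.lower().split())

    def build_tasks(old_census, new_census):
        """Each census: dict mapping place name -> population."""
        new_by_name = {norm(n): pop for n, pop in new_census.items()}
        auto, manual = [], []
        for name, old_pop in old_census.items():
            new_pop = new_by_name.get(norm(name))
            if new_pop is not None:
                auto.append({"article": name, "old": old_pop, "new": new_pop})
            else:
                manual.append(name)  # renamed or re-coded: needs a human
        return auto, manual

    auto, manual = build_tasks({"Toowong": 9000, "South Brisbane": 5000},
                               {"Toowong": 10500, "Brisbane South": 6800})
    print(auto)    # one automatic update task for Toowong
    print(manual)  # "South Brisbane" queued for human matching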
We need to nurture the good-faith new contributors. Could we have something that isn't "un-do" but rather "re-do", which acts as a kind of referral to a more caring part of Wikipedia than your average editor, to help them learn how to do it better? E.g. Teahouse-type people.
But back to the contributor gap. We do need to do something about oral knowledge, such as we have in Australian Indigenous communities. At the moment, this is a verification problem. But Indigenous people don't have a verification problem. They know who their elders are and they know whom they trust to hear their lore from. Maybe we need a family of templates, e.g. {{Oral Quandamooka}}, that tells the reader that what's inside this box (or however we present it) is oral knowledge provided by "SoAndSo, Elder of the Quandamooka People". Within such templates, normal verification does not apply, but there is some culturally appropriate real-world verification used to authorise certain user names to use that template. It might not be the respected elder themselves, as they may not be technologically savvy, but it might be someone they designate to assist them with the task. And of course, we can already include a sound or video file on Commons where the elder speaks for themselves or demonstrates something, and allow that to be included in mainspace articles with a similar special template that lets the reader understand what they are seeing/hearing, so that if the reader doesn't want to hear/see/read this material, they know what it is and can ignore it.
Just throwing ideas out there ...
Kerry
Again, to bring this back to a research question: why do female newbie editors get reverted more?
Possible research question: where (in which topic spaces) are the reverts happening, and what types of reasons are given? Is there any sign that men and women are affected differently? To what extent does level of editing experience affect this?
One research side-question: should we just be comparing male vs female, or should we also look at the unknowns? Some people think we have more women than we realise, but that they choose not to self-identify as such on Wikipedia. If we compared various statistics for no-gender editors with those of self-identifying male and female editors, would it give us any insight into the likely gender composition of the no-gender group? For example, if among self-identifying editors we know there is a 90-10 gender split, and the no-gender group has the same 90-10 split, then statistics about the no-gender editors should show the corresponding averages (male stat * 0.9 + female stat * 0.1). If they do not, can we use a range of statistics to back-calculate the likely gender split of the no-gender group? Has anyone ever done this?
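Here is a minimal sketch in Python of that back-calculation: treat each of the no-gender group's average statistics as a mixture p*female + (1-p)*male and solve for the p that fits best across several statistics by least squares. All the numbers are made up for illustration.

    # Estimate the female share p of the no-gender group from group means.
    def estimate_female_share(female_stats, male_stats, unknown_stats):
        """Each argument: list of group means for the same statistics."""
        # Model: unknown = m + p*(f - m); least-squares solution for p.
        num = sum((u - m) * (f - m)
                  for f, m, u in zip(female_stats, male_stats, unknown_stats))
        den = sum((f - m) ** 2 for f, m in zip(female_stats, male_stats))
        p = num / den  # assumes the stats actually differ by gender
        return min(max(p, 0.0), 1.0)  # clamp to a valid proportion

    # Illustrative stats (e.g. edits/month, revert rate, talk-page share):
    female  = [12.0, 0.22, 0.30]
    male    = [20.0, 0.15, 0.18]
    unknown = [18.8, 0.16, 0.20]
    print(estimate_female_share(female, male, unknown))  # ~0.15 here

A single fitted p from a few noisy means could be badly off, so in practice one would also want a confidence interval, but this is the shape of the calculation.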
Kerry
Hello Kerry,
While I agree with most of what you said, I think that the bigger picture should include this: newbies are not always good contributors, and not always good-faith contributors. And even if they have good faith, that does not mean that they can be trained to become good contributors. Dealing with newbies always means filtering. Maybe different people are differently optimistic about the probability of turning a newbie into a good contributor.
Kind regards, Ziko
Kerry Raymond kerry.raymond@gmail.com wrote on Thu, 27 Sep 2018 at 06:47:
While I have no objection to the administrator training, I don't think most of the problem lies with administrators. There's a lot of biting of the good-faith newbies done by "ordinary" editors (although I have seen some admins do it too). And, while I agree that there are many good folk out there on en.WP, unfortunately the newbie tends to meet the other folk first, or perhaps it's that one bad experience has more impact than one good experience.
Similarly, while Arbcom's willingness to desysop folks is good, I doubt a newbie knows how or where to complain in the first instance. Also, there's a high level of defensive reaction if they do. Some of my trainees have contacted me about being reverted for clearly good-faith edits on the most spurious of reasons. When I have restored their edit with a hopefully helpful explanation, I often get reverted too. If a newbie takes any action themselves, it is likely to be an undo, and that road leads to a 3RR block or at least a 3RR warning. The other action they take is to respond on their User Talk page (when there is a message there to respond to). However, such replies are usually ignored; whether the other user isn't watching for a reply or whether they just don't like their authority being challenged, I don't know. But it rarely leads to a satisfactory resolution.
One of the problems we have with Wikipedia is that most of us tend to see it edit-by-edit (whether we are talking about a new edit or a revert of an edit), we don't ever see a "big picture" of a user's behaviour without a lot of tedious investigation (working through their recent contributions one by one). So, it's easy to think "I am not 100% sure that the edit/revert I saw was OK but I really don't have time to see if this is one-off or a consistent problem". Maybe we need a way to privately "express doubt" about an edit (in the way you can report a Facebook post). Then if someone starts getting too many "doubtful edits" per unit time (or whatever), it triggers an admin (or someone) to take a closer look at what that user is up to. I think if we had a lightweight way to express doubt about any edit, then we could use machine learning to detect patterns that suggest specific types of undesirable user behaviours that can really only be seen as a "big picture".
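As a minimal sketch of those mechanics in Python: count doubt reports per editor in a sliding window and surface anyone who exceeds a rate threshold for closer human (or machine-learning) review. The one-week window and the tolerance of five reports are invented for illustration.

    # Aggregate lightweight "doubt" reports and flag heavy accumulators.
    from collections import defaultdict

    WINDOW_HOURS = 24 * 7  # illustrative: look at the last week
    MAX_DOUBTS = 5         # illustrative: reports tolerated per window

    class DoubtTracker:
        def __init__(self):
            self.reports = defaultdict(list)  # editor -> [timestamps, in hours]

        def report(self, editor, ts_hours):
            self.reports[editor].append(ts_hours)

        def needs_review(self, editor, now_hours):
            recent = [t for t in self.reports[editor]
                      if now_hours - t <= WINDOW_HOURS]
            return len(recent) > MAX_DOUBTS

    tracker = DoubtTracker()
    for t in range(6):  # six doubts within six hours
        tracker.report("ExampleUser", t)
    print(tracker.needs_review("ExampleUser", 6.0))  # True -> take a closer look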
Given this is the research mailing list, I guess we should be talking about ways research can help with this problem.
Kerry
-----Original Message----- From: Pine W (via Wiki-research-l) Sent: Wednesday, 26 September 2018 1:07 PM To: Wiki Research-l; Rosie Stephenson-Goodknight Subject: Re: [Wiki-research-l] Results from 2018 global Wikimedia survey are published!
I'm appreciative that we're having this conversation - not in the sense that I'm happy with the status quo, but I'm glad that some of us are continuing to work on our persistent difficulties with contributor retention, civility, and diversity.
I've spent several hours on ENWP recently, and I've been surprised by the willingness of people to revert good-faith edits, sometimes with blunt commentary or with no explanation. I can understand how a newbie who experienced even one of these incidents would find it to be unpleasant, intimidating, or discouraging. Based on these experiences, I've decided that I should coach newbies to avoid taking reversions personally if their original contributions were in good faith.
I agree with Jonathan Morgan that WP:NOTSOCIAL can be overused.
Kerry, I appreciate your suggestions about cultural change. I can think of two large-scale ways to influence culture on English Wikipedia.
- I think that there should be more and higher-quality training and continuing education for administrators in topics like policies, conflict resolution, communications skills, legal issues, and setting good examples. I think that these trainings would be one way through which cultural change could gradually happen over time. For what it's worth, I think that there are many excellent administrators who do a lot of good work (which can be tedious and/or stressful) with little appreciation. Also, my impression is that ENWP Arbcom has become more willing over the years to remove admin privileges from admins who misuse their tools. I recall having a discussion a while back with Rosie on the topic of training for administrators, and I'm adding her to this email chain as an invitation for her to participate in this discussion. I think that offering training to administrators could be helpful in facilitating changes to ENWP culture.
- I think that I can encourage civil participation in ENWP in the context of my training project < https://meta.wikimedia.org/wiki/Grants:Project/Rapid/Pine/Continuation_of_ed... > that I'm hoping that WMF will continue to fund. ENWP is a complex and sometimes emotionally difficult environment, and I'm trying to set a tone in the online training materials that is encouraging. I hope to teach newbies about the goals of Wikipedia as well as policies, how to use tools, and Wikipedia culture. I am hopeful that the online training materials will improve the confidence of new contributors, improve the retention of new contributors, and help new editors to increase the quality and quantity of their contributions. I hope that early portions of the project will be well received and that, over time and if the project is successful as it incrementally increases in scale and reach, it will influence the overall culture of ENWP to be more civil.
Regards,
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
I believe administrators outside of the US, on en Wikipedia and on Wikidata etc., do not understand our freedom of speech and our right to due process, and that there is a cultural misunderstanding and a lack of patience on their part, which leads to an abuse of power and a breaking of the rules when it comes to blocking IPs and others for just standing up for themselves. To that end, they do not see the good-faith edits that were made and not reverted, and judge based on others' intelligence level, not their own. Everything starts out nice, in tea rooms, on noticeboards, forums, and on their talk pages etc., and then it all goes south; as on en Wikipedia, an administrator who now has a "conflict of interest" just blocks you to end it. On Wikidata, which is more technically challenging, editors who claim ownership of pages, who come from outside of North America and Europe, revert based on misunderstandings and cannot express themselves in English, so they just rely on the administrators' noticeboard to complain against IPs without warning, not giving the IP the chance to defend himself and to explain that it was an edit war. Administrators who see these posts at hundreds an hour just block the IPs or the pages without any kind of investigation, based on the lies of the accusers. And these same administrators, who have participated on their talk pages, are now in a "conflict of interest", being directly involved. And on ru Wikipedia and ru Wikidata, English speakers are not welcome, from their board down to their users.
Well, I run training and events. The folk who turn up to these are always good-faith, typically middle-aged and older, mostly women, of above-average education for their age (our oldest Australians will not all have had the opportunity to go to high school), and with generally acceptable IT skills. I think most of them are capable of being good contributors, and their errors are mostly unintentional; e.g. copyright is not always well understood, and so there are photo uploads from "family albums" or "our local history collection" where the provenance of the image is unknown and hence its copyright status is unclear. But off-line activities like mine are too few in number to make a significant impact on en.WP. We have to get better at attracting and on-boarding people on-line.
Obviously on my watchlist I see plenty of blatant and subtle vandalism, so I am not naïve about that, but I also see what appears to be good-faith behaviour from newbies. I suspect people who only see their watchlist have a more negative view of newbies than I do.
So, yes, we may have to filter out some of the good-faith folks if their behaviour remains problematic, but reverting them for any small problem in their early edits certainly isn't proving to be an effective strategy.
Kerry
From: Ziko van Dijk [mailto:zvandijk@gmail.com] Sent: Saturday, 29 September 2018 3:27 PM To: Research into Wikimedia content and communities wiki-research-l@lists.wikimedia.org; kerry.raymond@gmail.com Cc: Rosie Stephenson-Goodknight rosiestep.wiki@gmail.com Subject: Re: [Wiki-research-l] Results from 2018 global Wikimedia survey are published!
Hello Kerry,
While I agree to most what you said, I think that the bigger picture should include that: newbies are not always good contributors, and not always good-faith contributors. And even if they have good faith, that does not mean that they can be trained to become good contributors. Dealing with newbies means always a filtering. MAybe different people are differently optimistic about the probability to make a newbie a good contributor.
Kind regards,
Ziko
Kerry Raymond <kerry.raymond@gmail.com mailto:kerry.raymond@gmail.com > schrieb am Do. 27. Sep. 2018 um 06:47:
While I have no objection to the administrator training, I don't think most of the problem lies with administrators. There's a lot of biting of the good-faith newbies done by "ordinary" editors (although I have seen some admins do it too). And, while I agree that there are many good folk out there on en.WP, unfortunately the newbie tends to meet the other folk first or perhaps it's that 1 bad experience has more impact than one good experience.
Similarly while Arbcom's willingness to desysop folks is good, I doubt a newbie knows how or where to complain in the first instance. Also there's a high level of defensive reaction if they do. Some of my trainees have contacted me about being reverted for clearly good-faith edits on the most spurious of reasons. When I have restored their edit with a hopefully helpful explanation, I often get reverted too. If a newbie takes any action themselves, it is likely to be an undo and that road leads to 3RR block or at least a 3RR warning. The other action they take is to respond on their User Talk page (when there is a message there to respond to). However, such replies are usually ignored, whether the other user isn't watching for a reply or whether they just don't like their authority to be challenged, I don't know. But it rarely leads to a satisfactory resolution.
One of the problems we have with Wikipedia is that most of us tend to see it edit-by-edit (whether we are talking about a new edit or a revert of an edit); we don't ever see a "big picture" of a user's behaviour without a lot of tedious investigation (working through their recent contributions one by one). So it's easy to think "I am not 100% sure that the edit/revert I saw was OK, but I really don't have time to see if this is a one-off or a consistent problem". Maybe we need a way to privately "express doubt" about an edit (in the way you can report a Facebook post). Then if someone starts getting too many "doubtful edits" per unit time (or whatever), it triggers an admin (or someone) to take a closer look at what that user is up to. I think if we had a lightweight way to express doubt about any edit, then we could use machine learning to detect patterns that suggest specific types of undesirable user behaviours that can really only be seen as a "big picture".
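To make that concrete, here is a minimal sketch of the rate-based trigger part of the idea (Python; the function name, the one-week window, and the threshold of five are all invented for illustration -- nothing like this exists in MediaWiki today):

    from collections import defaultdict, deque
    import time

    DOUBT_WINDOW_SECONDS = 7 * 24 * 3600  # hypothetical: one week
    DOUBT_THRESHOLD = 5                   # hypothetical: flags before review

    # editor -> timestamps of private "doubt" flags on their recent edits
    doubts = defaultdict(deque)

    def express_doubt(editor, now=None):
        """Record one private doubt flag against an editor.

        Returns True when the editor's flag rate crosses the threshold
        and a human (admin or patroller) should take a closer look.
        """
        now = now if now is not None else time.time()
        window = doubts[editor]
        window.append(now)
        # Drop flags that have aged out of the sliding window.
        while window and now - window[0] > DOUBT_WINDOW_SECONDS:
            window.popleft()
        return len(window) >= DOUBT_THRESHOLD

A sliding window like this is deliberately forgiving: occasional doubts age out, and only a burst of them surfaces the editor for human review. The flags themselves never cause any automated action.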
Given this is the research mailing list, I guess we should be talking about ways research can help with this problem.
Kerry
-----Original Message----- From: Wiki-research-l [mailto:wiki-research-l-bounces@lists.wikimedia.org] On Behalf Of Pine W Sent: Wednesday, 26 September 2018 1:07 PM To: Wiki Research-l <wiki-research-l@lists.wikimedia.org>; Rosie Stephenson-Goodknight <rosiestep.wiki@gmail.com> Subject: Re: [Wiki-research-l] Results from 2018 global Wikimedia survey are published!
I'm appreciative that we're having this conversation - not in the sense that I'm happy with the status quo, but I'm glad that some of us are continuing to work on our persistent difficulties with contributor retention, civility, and diversity.
I've spent several hours on ENWP recently, and I've been surprised by the willingness of people to revert good-faith edits, sometimes with blunt commentary or with no explanation. I can understand how a newbie who experienced even one of these incidents would find it to be unpleasant, intimidating, or discouraging. Based on these experiences, I've decided that I should coach newbies to avoid taking reversions personally if their original contributions were in good faith.
I agree with Jonathan Morgan that WP:NOTSOCIAL can be overused.
Kerry, I appreciate your suggestions about cultural change. I can think of two ways to influence culture on English Wikipedia in large-scale ways.
1. I think that there should be more and higher-quality training and continuing education for administrators in topics like policies, conflict resolution, communications skills, legal issues, and setting good examples. I think that these trainings would be one way through which cultural change could gradually happen over time. For what it's worth, I think that there are many excellent administrators who do a lot of good work (which can be tedious and/or stressful) with little appreciation. Also, my impression is that ENWP Arbcom has become more willing over the years to remove admin privileges from admins who misuse their tools. I recall having a discussion a while back with Rosie on the topic of training for administrators, and I'm adding her to this email chain as an invitation for her to participate in this discussion. I think that offering training to administrators could be helpful in facilitating changes to ENWP culture.
2. I think that I can encourage civil participation in ENWP in the context of my training project https://meta.wikimedia.org/wiki/Grants:Project/Rapid/Pine/Continuation_of_educational_video_and_website_project that I'm hoping that WMF will continue to fund. ENWP is a complex and sometimes emotionally difficult environment, and I'm trying to set a tone in the online training materials that is encouraging. I hope to teach newbies about the goals of Wikipedia as well as policies, how to use tools, and Wikipedia culture. I am hopeful that the online training materials will improve the confidence of new contributors, improve the retention of new contributors, and help new editors to increase the quality and quantity of their contributions. I hope that early portions of the project will be well received and that, over time, if the project is successful as it incrementally increases in scale and reach, it will influence the overall culture of ENWP to be more civil.
Regards,
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Hello Kerry,
Sorry, I did not see all the mails and the context before.
I remember a gentleman in a training lesson who wanted to write about his grandfather. Notability was no problem, and there was no obvious bias. Why not assume good faith? But still, one might ask oneself whether this is an ideal situation. It is tricky. In general I totally agree that the hostility is a problem.
Kind regards Ziko
I have seen this too in face-to-face situations. While it is COI, if it’s notable and written factually, I don’t worry too much (I might swing past it later and just remove any puffery that may have crept in). I do stop them writing about themselves or other living people with whom they may have a COI. There’s a fine line between “having an interest” and “having a conflict of interest” and I find the dead/living distinction tends to make a difference. An article about a dead person is unlikely to be promotional, which is the big concern with COI.
I find edit-a-thons have more risk around COI and notability, particularly when the organisers have not provided a list of possible topics but let the participants choose their own (I generally support these events as an experienced Wikipedian rather than organising them). Also, they are often larger groups than training sessions, so it is a lot more difficult for me to know what they are all writing about and be able to chat to them about why they chose that topic, so I am far less likely to be aware if there is COI.
Kerry
Ziko,
That's certainly true. I think that Aaron Halfaker and the ORES team were hoping to use ORES to identify with greater certainty which newbies are likely to be good-faith very early in their edit counts, so as to try to route those newbies to the Teahouse and other places where they could get support. Perhaps Aaron or Jonathan Morgan could comment on how successful, or unsuccessful, those efforts with ORES were.
More recently ORES seems to be focusing on helping experienced Wikimedians to identify vandalism.
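For anyone who wants to explore this, ORES exposes those predictions over a simple HTTP API. A sketch in Python (the v3 endpoint and the response layout below match the API as I understand it; check the ORES documentation before relying on the exact field names):

    import requests

    def goodfaith_probability(revid, wiki="enwiki"):
        """Ask ORES how likely a single revision is to be good-faith."""
        url = f"https://ores.wikimedia.org/v3/scores/{wiki}/"
        resp = requests.get(url, params={"models": "goodfaith", "revids": revid})
        resp.raise_for_status()
        score = resp.json()[wiki]["scores"][str(revid)]["goodfaith"]["score"]
        return score["probability"]["true"]

    # A routing experiment of the kind described above would be little
    # more than this score, a threshold (say, above 0.9 -- an arbitrary
    # cut-off), and an invitation message to the Teahouse.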
Gerard makes a good point that moving the needle in a statistically significant way on a huge project like ENWP is a challenging goal. On the other hand, the continuing inflow of new editors, on ENWP and elsewhere, gives me hope that we have some time to increase the viability and sustainability of the projects. Also, on ENWP and other large projects, if someone finds a way to increase the retention of good-faith contributors by a relatively small percentage, because the numbers involved are so large, a small percentage change can be very valuable in terms of absolute numbers.
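To put rough numbers on that last point (the figures below are invented purely to show the shape of the argument, not measured):

    # Illustrative only -- these inputs are made up, not measured.
    new_editors_per_month = 20_000   # hypothetical inflow on a large wiki
    baseline_retention    = 0.05     # hypothetical share still active later

    baseline = new_editors_per_month * baseline_retention            # 1000
    improved = new_editors_per_month * (baseline_retention + 0.01)   # 1200
    print(improved - baseline)  # +200 retained editors every month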
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
Hoi, To move the needle on English Wikipedia, the numbers involved are huge, so at best things change incrementally. What fails in most of the research is that it only considers English Wikipedia, whereas changes are much easier on the smaller projects.
I would go so far as to say that, in order to become more inclusive, we should stop focusing our attention, spending, and research on English Wikipedia and on English itself. Then again, there are too many systemic impediments. Thanks, GerardM
On Fri, 21 Sep 2018 at 02:44, Jonathan Morgan jmorgan@wikimedia.org wrote:
(Re: Jonathan's 'Chilling Effect' theory and Kerry's call for experiments to increase gender diversity)
Kerry: In a magic world, where I could experiment with anything I wanted to without having to get permission from communities, I would experiment with enforceable codes of conduct that covered a wider range of harassing and hostile behavior, coupled with robust & confidential incident reporting and review tools. But that's not really an 'experiment', that's a whole new social/software system.
I actually think we're beyond 'experiments' when it comes to increasing gender diversity. There are too many systemic factors working against increasing non-male participation. In order to do that you would need to increase newcomer retention dramatically, and we can barely move the needle there on EnWiki, for both social and technical reasons. But one non-technical intervention might be carefully revising and re-scoping policies like WP:NOTSOCIAL that are often used to arbitrarily and aggressively shut down modes of communication, self-expression, and collaboration that don't fit so-and-so's idea of what it means to be Wikipedian.
Initiatives that start off wiki, like women-oriented edit-a-thons and outreach campaigns, are vitally important and could certainly be supported better in terms of maintaining a sense of community among participants once the event is over and they find they're now stuck alone in hostile wiki-territory. But I'm not sure what the best strategy is there, and these kinds of initiatives are not large-scale enough to make a large overall impact on active editor numbers on their own, though they set important precedents, create infrastructure, change the conversation, and do lead to new editors.
The Community Health https://en.wikipedia.org/wiki/Wikipedia:Community_health_initiative team just hired a new researcher who has lots of experience in the online harassment space. I don't feel comfortable announcing their name yet, since they haven't officially started, but I'll make sure they subscribe to this list, and will point out this thread.
Jonathan: This study https://dl.acm.org/citation.cfm?id=2145265 is the one I cite. There's a more recent--paywalled!--follow-up https://link.springer.com/article/10.1007/s11199-015-0573-y (expansion?) that I haven't read yet, but which may provide new insights. And this short but powerful ethnographic study https://dl.acm.org/citation.cfm?id=2702514. And this lab study https://www.sciencedirect.com/science/article/pii/S0747563216306781 on the gendered perceptions of feedback and anonymity. And the--ancient, by now--former contributors survey https://strategy.wikimedia.org/wiki/Former_Contributors_Survey_Results, which IIRC shows that conflict fatigue is a significant reason people leave. And of course there's a mountain of credible evidence at this point that antisocial behaviors drive away newcomers, irrespective of gender.
Thanks for raising these questions,
- J
On Wed, Sep 19, 2018 at 3:21 AM, Jonathan Cardy <werespielchequers@gmail.com> wrote:
Thanks Pine,
In case I didn’t make it clear, I am very much of the camp that IP editing is our lifeline, the way we recruit new members. If someone isn’t happy with Citizendium et al as tests of that proposition then feel free to propose tests. I am open to being proved wrong if someone doesn’t mind wasting their time checking what seems obvious to me.
Just please, if you do so, make sure you test for the babies that I fear would be thrown out with the bathwater, i.e. the good-faith newbies.
I am not short of promising lines of enquiry, and more productive uses of my time. My choice for my time available for such things is which promising lines of enquiry to follow, and banning IPs isn’t one of them.
One where we might have more agreement is over the default four warnings and a block for vandalism. I think it bonkers that we block edit warriors for a first offence but usually don’t block vandals till a fifth offence. I know that the four-warnings-and-a-block approach dates back to some of the earliest years on Wiki, but I am willing to bet that it wasn’t very scientifically arrived at, and that a study of the various behaviours that we treat this way would probably conclude that we could reduce the number of warnings for vandals, whilst we might want a longer dialogue with non-neutral editors, copy-pasters and those who add unsourced material. After all, many of our editors started without getting issues like neutrality, and whilst the few former vandals who we have don’t generally have a grudge that their early vandalism led to a block, the same isn’t always true of others.
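That study would not need to be elaborate. Assuming the warning events had already been extracted from user talk histories (the `events` records and their fields below are invented for illustration), a first pass is just a tally of re-offence rates per behaviour type and warning level:

    from collections import Counter

    # Hypothetical pre-extracted data: one record per warning issued.
    # behaviour: "vandalism", "npov", "copyvio", "unsourced", ...
    # level: 1-4 (the warning template level); reoffended: did the user
    # repeat the behaviour after receiving this warning?
    events = [
        {"behaviour": "vandalism", "level": 1, "reoffended": True},
        {"behaviour": "npov", "level": 1, "reoffended": False},
        # ... thousands more rows from a talk-page dump ...
    ]

    issued = Counter()
    repeat = Counter()
    for e in events:
        key = (e["behaviour"], e["level"])
        issued[key] += 1
        if e["reoffended"]:
            repeat[key] += 1

    for key in sorted(issued):
        print(key, repeat[key] / issued[key])  # re-offence rate per level

If re-offence rates for vandals barely move between the second and fourth warning, that would support cutting the number of warnings; if non-neutral editors respond well to later warnings, that would support the longer dialogue.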
The other issue that could really use some research is the chilling effect theory. Here the community is divided: some honestly believe that the high-quality work of certain individuals justifies a certain level of snark, even to the point of harassment. Others, including myself, believe that tolerance of bad behaviour drives away some good editors and fails to improve the behaviour of some who would comply with stricter civility enforcement. It would be really useful to have a study one could point to when that argument next recurs.
From: Wiki-research-l <wiki-research-l-bounces@lists.wikimedia.org> on behalf of Pine W <wiki.pine@gmail.com> Sent: Wednesday, September 19, 2018 8:29:32 AM To: Wiki Research-l Subject: Re: [Wiki-research-l] Results from 2018 global Wikimedia survey are published!
I'm going to respond to Kerry and Jonathan in two parts of one email.
--
Hi Kerry, I did not say that transparency should be a free-for-all, and it's important to keep in mind that transparency from my perspective is intended to ensure due process for everyone involved. That includes ensuring that people who are adjudicating cases are not callously dismissing complaints, mistreating people who have been victimized, neglecting evidence, or rushing to conclusions. I would oppose, for example, people who are adjudicating a case deciding to engage in questioning that is completely unnecessary for dealing with the relevant allegations.
On a related issue, I don't trust WMF to adjudicate cases or involve itself directly in deciding who gets to be on Wikimedia sites or attend Wikimedia events; WMF is not the same thing as Wikimedia, and I remain deeply unhappy with some of WMF's choices over the years and its lack of apology for those choices. I would be more trusting of a somewhat less transparent process for adjudicating off-wiki problems if it was led by people who are elected from the community, similar to English Wikipedia Arbitration Committee elections. Arbcom is far from perfect, but I have modestly more faith in Arbcom than I do in WMF. On the other hand, arbitrators are volunteers, and over the years I have seen more than one instance of arbitrators appearing to be stressed; volunteers with high skill levels and good intentions are a precious resource, and if one of the outcomes of WMF's strategy process is a move toward having a global Arbitration Committee then one of the difficult questions will be how to get an adequate supply of highly skilled people with good intentions to volunteer. On a related note, I prefer to avoid identity politics when deciding who should be on arbitration committees; I feel that identity politics are often poisonous and make it very difficult to have civil dialogue. How to balance the virtue of diversity with the virtue of avoiding identity politics is an issue that I haven't worked out.
We're getting off the topic of research and into more of a policy discussion, so if you'd like to continue on this topic then I suggest doing so on Wikimedia-l or on Meta.
--
Hi Jonathan, I'd be supportive of running small experiments about blocking all IP editors on ENWP and mid-sized Wikipedias to see whether that is a net positive. As you noted, the research would be somewhat complicated when keeping in mind that the researchers would want to check for positive and negative side effects, but I think that it would be worth doing. Would you like to make a proposal in IdeaLab?
Regards,
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
-- Jonathan T. Morgan Senior Design Researcher Wikimedia Foundation User:Jmorgan (WMF) https://meta.wikimedia.org/wiki/User:Jmorgan_(WMF)
Yes, it may well be easier to try experiments on smaller Wikipedias where there isn't an immovable dominant culture that would strenuously resist the experiments. I understand the new on-boarding experiments are happening (or will happen soon) on Czech and Korean Wikipedia, so there is an example.
German Wikipedia (of its own choice) decided to experiment with making the Visual Editor the default for new users a couple of years ago and were happy with the result:
https://wikimania2017.wikimedia.org/wiki/Submissions/From_open_hostility_to_...
So I don't see a problem conducting gender experiments on other Wikipedias. I guess they would have to have a documented gender imbalance in the first place.
Kerry
Kerry,
This discussion about reverts, combined with my recent experience on ENWP, makes me wonder if there's a way to make reverts feel less hostile on average. Do you have any ideas about how to do that?
Thanks,
Part of the reason we have a problem with dealing with good-faith new users is that we assume they understand things the way we do. They don't.
Try and imagine you are a genuine good-faith new user (hard for us, but I get to see them face-to-face so I get some insight into their experience). Imagine that you have just spent quite a number of minutes making your first ever change to a Wikipedia article. You found it quite difficult: strange jargon, incomprehensible toolbars, etc. Lots of things didn't work, so you explored more menu options, blah blah blah. But you finally prevailed! You saved your edit and you could see your change on the screen in the article. Hurray! Do a little dance to celebrate! Sacrifice a goat! I must show Mum!
Then the edit gets reverted.
The first question to ask is: how does the user know it got reverted?
* The article does not show their edit when they look at it later; they do not know it was reverted
This is the likely scenario if they are not still logged in to their user account (or are at a different IP address if they did an IP edit). They find out the next time they look at the article. Remember how proud they are of that edit. They may show the article to someone: "look how I changed Wikipedia... hey, why can't I see the change I made?"
Now how do they react to this? They may be thinking "maybe it's awaiting a review" (remember, newbies don't know how things work), so they wait and wait...
Maybe, perhaps after some waiting, they decide they must not have got it right. Remember, they struggled to do that edit; they found it difficult; they can imagine that they did something wrong. So they might just give up, thinking "I am not tech savvy enough to change Wikipedia". Or they might think "I have to give it another go and see if I can get it right this time." So they repeat the edit (possibly not being logged in) and presumably it gets reverted again.
* They may see an alert or notification so they know it was reverted
If they are still logged in or at the same IP address, they may see an alert or notification. I say "may" because, not being a new user, I am not sure how they are shown a reverted edit; someone else who knows will have to answer this. But I do know from face-to-face observation that new users often do not notice things in the user interface like alerts, notifications, and messages, even when they remain logged in. Their eye focus is entirely on the article content. Lots of eye-tracking studies and their heat maps show us that this is normal behaviour on most web pages: people focus on what they think is relevant to them. Since this user's experience of Wikipedia is 99.99% as a reader, they are 99.99% pre-programmed to look straight at the article content. As regular contributors, we are probably far more aware of things like alerts and notifications (but equally, would you notice a change in the elements of, say, the left-hand toolbar as quickly?).
Assuming they see that there is an alert or notification, do they know to click the alert or notification to find out that their edit was reverted? Again, stuff we take for granted, but it's their first time. So they may still not know their edit has been reverted.
Assuming they managed to navigate the GUI to get to the revert notification, they might be seeing the edit summary on the reversion and/or a talk page entry (probably a Twinkle-or-other-tool template).
Edit summaries are by their very nature short, and they can be empty, very cryptic, or use unfamiliar jargon or link off to pages full of more jargon [[WP:SOMEPOLICY]]. Messages on talk pages can be longer but are not necessarily any more helpful. For example, the default Twinkle response for a revert (level 1 vandalism) says that the reverted edit "did not appear constructive" and points the user to the Sandbox (not helpful) or to the Help Desk (potentially helpful). Also, if the user did their original edit with the Visual Editor, they may be unable to interpret a page they are pointed to that uses markup examples (which occurs if they have done something wrong technically rather than policy-wise).
If they got this far, it is very likely that although the user knows their edit was reverted, they may still not know why, either in general or in terms of what in particular was wrong with their edit. Or they may know what was wrong but be unclear on how to fix it. Why was my citation not reliable enough? Etc.
Assuming they have not given up, they will probably feel the need to talk to someone about their reverted edit. Depending on how they were notified of the revert, there are a range of places that they may have been shown as somewhere to have such a conversation. These include their own user talk page, the user talk page of the person who wrote a message on their user talk page, the Help Desk, the Teahouse, the article Talk page, talk pages of Wikipedia policies, etc. We don't know how they choose where to go, but there are problems with all of them. The first problem is technical: we are asking a new user who needs help with Wikipedia to get that help via Wikipedia's own method of communication (Talk), with which they are not familiar. Plus, if they did their first edit with the Visual Editor, they have the scary markup hurdle as well. So that's the technical hurdle to asking for help.
But there are other hurdles in asking for help. They don't speak our language. They don't know how to provide a diff link, for example, which may make it harder for anyone to respond, particularly if their enquiry does not come from the user account (or IP) of the original edit (i.e. nobody can connect the user account to the problem edit). So their description of the problem could be quite confusing, which may make it hard for anyone to give them a good response. And if the person responding is not the reverting editor, they may be unsure what problem the reverting editor (who might be a subject matter expert familiar with reliable sources and conventions in that topic space) saw in the edit.
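As an aside, "providing a diff" is less mysterious than it sounds to us: a diff link is just a URL built from the revision ID of the edit. A minimal sketch in Python, assuming English Wikipedia and an invented revision ID:

    # A minimal sketch, assuming en.WP; the revision ID below is
    # invented for illustration.
    BASE = "https://en.wikipedia.org/w/index.php"

    def diff_url(rev_id):
        # diff=prev compares a revision against the one before it,
        # i.e. exactly the change the newbie made.
        return f"{BASE}?diff=prev&oldid={rev_id}"

    print(diff_url(123456789))
    # -> https://en.wikipedia.org/w/index.php?diff=prev&oldid=123456789

The point is that the software knows the revision ID of the reverted edit, even when the newbie cannot articulate it.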
The final problem is social. Some of the places I mentioned above are just not good places to go. Responding on their own User Talk page *should* work, but reverting editors do not appear to actively watch such pages for responses, so their reply may be ignored. So even if they successfully write a message in one of these places, there is the possibility that nobody is actively watching it, or that even if it is actively watched, those watching do not think it is their responsibility to reply ("I didn't revert the edit, not my problem"). The talk page of the reverting editor *should* get a response, but that depends on that editor's personal goodwill towards new users. Presumably the Help Desk or Teahouse would respond, so these are probably the best places, but the newbie doesn't know that.
Assuming someone does respond to their plaintive cry for help, how does the newbie know they have received a reply? Remember, they don't know about page watching. While the folks at the Teahouse etc. do tend to ping (probably for this reason), it's very possible that there is a reply but they simply never see it (again, if they become logged out, they will not see it). Once again, we are relying on alerts and notifications to reach them.
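For what it's worth, those alerts and notifications are machine-readable: the Echo extension exposes them through the API, so a tool that wanted to check whether a newbie has unread notifications could do something like the following sketch (assuming an authenticated requests session; the parameter names are from the Echo notifications API as I understand it):

    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def unread_notifications(session):
        # Ask Echo for this user's unread notifications only.
        params = {
            "action": "query",
            "meta": "notifications",
            "notfilter": "!read",
            "format": "json",
        }
        r = session.get(API, params=params)
        r.raise_for_status()
        return r.json()["query"]["notifications"]["list"]

So the information exists; the problem is purely that nothing pushes it to a user who never looks at the right corner of the screen.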
OK, let's assume they've received the reply. They may now have an answer they can work with, or they may have been referred to a policy page (which they don't understand), or told to ask at a different Talk page (e.g. the Help Desk often tells the person to ask the question on the article Talk page).
I think if we draw the newbie revert experience out as a flow chart, it becomes very clear that there are plenty of ways the newbie can reach a dead end or get into an infinite loop, and perhaps not so many ways they can get to a sufficient understanding of what they did wrong and how to do it right (assuming it can be fixed; as Ziko points out, not all good faith edits are acceptable to Wikipedia).
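To make that concrete, here is a toy sketch of the journey as a graph; every state and transition is my own invention based on the steps above, but even this crude version shows how the loop and the dead ends dominate:

    # Toy model of the newbie revert journey; states and transitions
    # are invented from the steps described above.
    flow = {
        "edit reverted":  ["notices alert", "never notices"],
        "never notices":  ["repeats edit", "gives up"],
        "repeats edit":   ["edit reverted"],             # the loop
        "notices alert":  ["reads template", "gives up"],
        "reads template": ["asks for help", "gives up"],
        "asks for help":  ["gets reply", "no reply"],
        "no reply":       ["gives up"],
        "gets reply":     ["fixes edit", "sent to policy jargon"],
        "sent to policy jargon": ["asks for help", "gives up"],
        "fixes edit":     [],  # the one happy ending
        "gives up":       [],  # the dead end almost everything reaches
    }
    # How many states lead directly to "gives up"?
    print(sum("gives up" in nxt for nxt in flow.values()))  # -> 5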
What I think this enumeration of steps draws out is that the things we could do to improve the new user experience are:
1. Find ways to communicate with them so they know their edit was reverted in the first place (if they have an email address on their account, email them). Encourage them to add an email address at account creation by explaining the benefits: currently it just says "optional" without explaining how it will be used. At the very least there could be a link called "benefits of providing your email" mentioning password recovery and easier ways to get help, to provide positive motivation.

2. Provide individual feedback, not generic templates, in the first instance of reverting (yes, I see the obvious problem with this, and I doubt we can do much to change the behaviour of random reverting editors).

3. To get help, don't force them to use Talk; let them use email or chat (and by chat, I don't mean IRC) that may be more familiar to them, and make sure it happens in a way in which they can't miss the reply.

4. Don't give them too many options on where to seek help -- try to funnel them to a single place where they will receive individual, specific help in newbie-friendly language (the Teahouse is probably the best option, if it had an email/chat interface). This means making sure all the templated response systems include this information prominently (or better still, a clickable link; see below).

5. Automate the asking-for-help process in some way, so that if they are talking about a reverted edit, that edit is presentedted to the person trying to help them (solving the not-logged-in and can't-provide-a-diff problems). Ideally any revert communication (whether an alert, a user talk message, an email, etc.) should have a "click here to ask for help about this reverted edit" link with the diff embedded in it (see the sketch below).
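Item 5 is more mechanical than it might sound. MediaWiki already supports preloading a new talk page section via URL parameters, so a revert notification could carry a link like the one this sketch produces (the preload page name is invented; the URL parameters themselves are standard MediaWiki, as far as I know):

    from urllib.parse import urlencode

    def help_link(rev_id):
        # Opens a new section at the Teahouse with the diff of the
        # reverted edit baked in. "Wikipedia:Teahouse/Revert-help" is
        # a hypothetical preload page containing a $1 placeholder.
        params = {
            "title": "Wikipedia:Teahouse",
            "action": "edit",
            "section": "new",
            "preloadtitle": "Help with my reverted edit",
            "preload": "Wikipedia:Teahouse/Revert-help",
            "preloadparams[]": f"diff=prev&oldid={rev_id}",
        }
        return "https://en.wikipedia.org/w/index.php?" + urlencode(params)

The newbie clicks once and the helper sees exactly which edit is being discussed, regardless of whether the newbie is still logged in or can describe a diff.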
Apart from item 2 (which depends heavily on the time and goodwill of individual users), these steps are all achievable. There may be some objection to increased use of email over Talk because of the loss of transparency, but email communication is already supported by Wikipedia. And I can see that if we encourage new users to communicate via email/chat, they may expect to continue using those methods as they become more experienced users. It may be that we need some email/chat gateway to Talk (but then we have to be careful that everyone understands the visibility of the email they send). Or of course we could just ditch Talk completely and move to Flow (or anything vaguely 21st century). Why should it be so easy to have a conversation on Facebook and so hard on Wikipedia? (How many colons do I need ...)
Kerry
-----Original Message----- From: Wiki-research-l [mailto:wiki-research-l-bounces@lists.wikimedia.org] On Behalf Of Pine W Sent: Sunday, 30 September 2018 5:01 AM To: Wiki Research-l wiki-research-l@lists.wikimedia.org Subject: Re: [Wiki-research-l] Results from 2018 global Wikimedia survey are published!
Kerry,
This discussion about reverts, combined with my recent experience on ENWP, makes me wonder if there's a way to make reverts feel less hostile on average. Do you have any ideas about how to do that?
Thanks,
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
As advice to an individual editor on how to deal with good faith but problematic edits, I would say: give the newbie feedback on exactly what the specific problem is with their reverted edit, explain how to fix it, continue to watch the article and their user talk page to see how they are going, and keep offering help until they get it right.
For my long answer on how to do it at scale, see my other longer email.
Kerry
Instead of putting down every idea as not being able to work without the benefit of an experiment, let's reverse the question.
Researchers: forgetting for a moment whether the community would accept it, if you were asked by the WMF Board of Trustees to make recommendations on experiments to run on en.WP to try to make it more attractive to women (since that's the aspect of diversity on which we seem to have the most data and the most research), what changes would you suggest for the experiment, and why?
Let's at least get the ideas onto the table before knocking them down.
Or do we genuinely believe this is something that cannot be solved?
Kerry
Hi everyone,
The presentation about this report will start in 30 minutes. We will be watching the YouTube stream[2] and IRC for your comments or questions.
[2] https://www.youtube.com/watch?v=qGQtWFP9Cjc
Thanks! Edward