Hello,
We have an update on the community health initiative announced following the Board's Statement on Healthy Community Culture, Inclusivity, and Safe Spaces.[1]
As Patrick Earley from the Support and Safety team noted on Wikimedia-l last month[2], we’ve been developing a community health initiative to help address the harassment issues discussed in the Board's statement. We believe an important aspect of our efforts to combat harassment is providing the volunteer community with better tools to more effectively respond to instances of harassment as they arise.
We’re excited to announce that the Craig Newmark Foundation and craigslist Charitable Fund have agreed to provide initial funding to help the Wikimedia Foundation begin this work. These two restricted seed grants, totaling $500,000, will enable the Foundation to scale up our support of these efforts and provide the resources to do this work right.
In preparing for this work, we’ve been talking with active administrators and functionaries about problems with the current tools and processes. These discussions have identified several key areas where admins and functionaries see immediate needs: better reporting systems for volunteers, smarter ways to detect and address problems early, and improved tools and workflows for the blocking process.
In the coming months, the Community Tech team, working with the Support and Safety team, will expand its work on developing these tools. The long-term goal for this effort is to build up the toolbox that volunteers can use to combat harassment and other disruptive behavior on our wikis.
Specifically, there are four areas where we think new tools will help:
1. Detection - Improve our detection and prevention tools, like AbuseFilter, and build new features to detect aggressive behavior.
2. Reporting - Design ways to report harassment that are less chaotic, more respectful of privacy, and less stressful than the current workflow.
3. Evaluating - Offer admins tools that make evaluating harassment reports easier, so that they can make good decisions.
4. Blocking - Make it more difficult for someone who has been blocked from the site to return under a different username or IP address.
Of course, these improvements need to be made with the participation and support of the volunteers who will be using the tools. We don't want to create new systems and workflows that add more work for an already overburdened group of wiki administrators. We want to make these tasks less grueling and help them produce effective outcomes more consistently.
Work in other areas - such as project policies and better training for administrators and functionaries - still needs to be done to comprehensively tackle the overall issue of harassment and harmful behavior on the projects. However, we believe that building better tools for the volunteers currently most engaged in this effort is a necessary first step.
We welcome your feedback on this approach, and invite you to join us in thanking Craig and his charitable organizations for their support of this initiative!
We’ll be sharing regular updates about the progress of this work in the coming months. If you have any questions in the meantime, please reach out to us on the talk page of the Meta-Wiki page where you can find more information: https://meta.wikimedia.org/wiki/Community_health_initiative
You can also find more details about this announcement in this blog post: https://blog.wikimedia.org/2017/01/26/community-health-initiative-grant
Danny Horn (Product Manager, Community Tech) and Patrick Earley (Manager, Support & Safety)
[1] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Novem...
[2] https://lists.wikimedia.org/pipermail/wikimedia-l/2016-December/085668.html