Hello everyone,
Today, the Wikimedia Foundation Board of Trustees unanimously passed a resolution and published a statement[1] regarding the urgent need to make our movement safer and more inclusive by addressing harassment and incivility on Wikimedia projects. The statement builds on prior statements from 2016 and 2019,[2][3] affirms the forthcoming introduction of a universal code of conduct, and directs the Wikimedia Foundation to rapidly and substantively address these challenges in complement with existing community processes.
This includes developing sustainable practices and tools that eliminate harassment, toxicity, and incivility, promote inclusivity, cultivate respectful discourse, reduce harms to participants, protect the projects from disinformation and bad actors, and promote trust in our projects.
Over the past nearly twenty years, the movement has taken a number of unique and sometimes extraordinary steps to create an environment unlike anything else online: a place to share knowledge, to learn, and to collaborate. For the movement to continue to thrive and make progress toward our mission, it is essential to build a culture that is welcoming and inclusive.
Research has consistently shown that members of our communities have been subject to hostility and toxic behavior in Wikimedia spaces.[4][5] The Wikimedia 2030 movement strategy recommendations have also identified the safety of our Wikimedia spaces as a core issue to address if we are to reach the 2030 goals, with concrete recommendations which include a universal code of conduct, pathways for users to privately report incidents, and a baseline of community responsibilities.[6]
While the movement has made progress in addressing harassment and toxic behavior, we recognize there is still much more to do. The Board’s resolution and statement today are a step toward establishing clear, consistent guidelines around acceptable behavior on our projects, and toward guiding the Wikimedia Foundation in supporting the movement’s ability to ensure a healthy environment for those who participate in our projects. This includes:
* Developing and introducing, in close consultation with volunteer contributor communities, a universal code of conduct that will be a binding minimum set of standards across all Wikimedia projects;
* Taking actions to ban, sanction, or otherwise limit the access of Wikimedia movement participants who do not comply with these policies and the Terms of Use;
* Working with community functionaries to create and refine a retroactive review process for cases brought by involved parties, excluding those cases which pose legal or other severe risks; and
* Significantly increasing support for and collaboration with community functionaries primarily enforcing such compliance in a way that prioritizes the personal safety of these functionaries.
Together, we have made our movement what it is today. In this same way, we must all be responsible for building the positive community culture of the future, and accountable for stopping harassment and toxic behavior on our sites.
We have also made this statement available on Meta-Wiki for translation and wider distribution.[1]
On behalf of the Board, María, Board Chair
[1] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/May_2...
[2] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Novem...
[3] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Board_noticeboard/Archi...
[4] https://meta.wikimedia.org/wiki/Research:Harassment_survey_2015
[5] https://meta.wikimedia.org/wiki/Community_Insights/2018_Report#Experience_of...
[6] https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recommen...
== Statement on Healthy Community Culture, Inclusivity, and Safe Spaces ==
Harassment, toxic behavior, and incivility in the Wikimedia movement are contrary to our shared values and detrimental to our vision and mission. They negatively impact our ability to collect, share, and disseminate free knowledge, harm the immediate well-being of individual Wikimedians, and threaten the long-term health and success of the Wikimedia projects. The Board does not believe we have made enough progress toward creating welcoming, inclusive, harassment-free spaces in which people can contribute productively and debate constructively.
In recognition of the urgency of these issues, the Board is directing the Wikimedia Foundation to directly improve the situation in collaboration with our communities. This should include developing sustainable practices and tools that eliminate harassment, toxicity, and incivility, promote inclusivity, cultivate respectful discourse, reduce harms to participants, protect the projects from disinformation and bad actors, and promote trust in our projects.
Specifically, the Foundation shall:
* Develop and introduce a universal code of conduct (UCoC) that will be a binding minimum set of standards across all Wikimedia projects.
** The first phase, covering policies for in-person and virtual events, technical spaces, and all Wikimedia projects and wikis, and developed in collaboration with the international Wikimedia communities, will be presented to the Board for ratification by August 30, 2020.
** The second phase, outlining clear enforcement pathways, and refined with broad input from the Wikimedia communities, will be presented to the Board for ratification by the end of 2020;
* Take actions to ban, sanction, or otherwise limit the access of Wikimedia movement participants who do not comply with these policies and the Terms of Use;
* Work with community functionaries to create and refine a retroactive review process for cases brought by involved parties, excluding those cases which pose legal or other severe risks; and
* Significantly increase support for and collaboration with community functionaries primarily enforcing such compliance in a way that prioritizes the personal safety of these functionaries.
Until such directives are implemented, the Board instructs the Foundation to adopt and implement policies for reducing harassment and toxicity on our projects and minimizing legal risks for the movement, in collaboration with communities whenever practicable. Until these two phases of the UCoC are complete and operational, an interim review process involving community functionaries will be in effect. In this interim period, the Product Committee of the Board of Trustees will also advise the Trust & Safety team.
To that end, the Board further directs the Foundation, in collaboration with the communities, to make additional investments in Trust & Safety capacity, including but not limited to: development of tools needed to assist our volunteers and staff, research to support data-informed decisions, development of clear metrics to measure success, development of training tools and materials (including building communities’ capacities around harassment awareness and conflict resolution), and consultations with international experts on harassment, community health and children’s rights, as well as additional hiring.
The above efforts will be undertaken in coordination and collaboration with appropriate partners from across the movement, seek to increase effective community governance of conduct and behavioral standards, and reduce the long-term need of the Foundation to act. It is the shared goal of the Board and Foundation that these efforts advance a sustainable Wikimedia movement and support, rather than substitute, effective models of community governance.
We urge every member of the Wikimedia communities to collaborate in a way that models the Wikimedia values of openness and inclusivity, step forward to do their part to create a safe and welcoming culture for all, stop hostile and toxic behavior, support people who have been targeted by such behavior, assist good-faith people learning to contribute, and help set clear expectations for all contributors.
"Work with community functionaries to create and refine a retroactive review process for cases brought by involved parties, excluding those cases which pose legal or other severe risks "
What does "retroactive review process" mean?
I hope it doesn't mean applying standards that were not promulgated at the time to past actions and applying severe sanctions to the alleged perpetrators.
Hello, Dennis!
Not at all. What it means is that this is not a process that comes into play *before* a decision to act is made, but *after*. It should stand as an option for those who want to ensure that actions taken are fair, as long as the case does not relate to legal risks or other severe concerns.
Best regards, antanana / Nataliia Tymkiv
NOTICE: You may have received this message outside of your normal working hours/days, as I can usually work more as a volunteer during the weekend. You should not feel obligated to answer it during your days off. Thank you in advance!
While I'm pretty sure that this wasn't the intention, that sounds a lot like "ban first and ask questions later". As Pine noted, this is a topic where great care must be taken to communicate intentions clearly and diplomatically. This point was likely introduced to respond to concerns about unappealable Office Actions. The way it was phrased, however, diminishes the point it was trying to make and also implies that community input is only applicable after the fact, and only from functionaries.
Would it be fair to say that:
- Enforcement of a universal code of conduct would happen through a fair, clearly defined process without significant bias and with significant community oversight and input
- Universal code of conduct enforcement actions would be appealable through a fair, clearly defined process with significant community oversight that allowed statements from involved parties and uninvolved community members
- To ensure proper community oversight, code of conduct enforcement actions and appeals would be made as public as possible as often as possible (excepting issues where public disclosure would harm privacy or safety)
AntiComposite
I like the concept, as it means the WMF can step up and address the dodgy corporate players a lot more effectively across all platforms, including taking big-stick tools to prevent them from whitewashing articles or providing paid-for services.
On Sun, 24 May 2020 at 04:25, AntiCompositeNumber <anticompositenumber@gmail.com> wrote:
Would it be fair to say that:
- Enforcement of a universal code of conduct would happen through a fair, clearly defined process without significant bias and with significant community oversight and input
- Universal code of conduct enforcement actions would be appealable through a fair, clearly defined process with significant community oversight that allowed statements from involved parties and uninvolved community members
- To ensure proper community oversight, code of conduct enforcement actions and appeals would be made as public as possible as often as possible (excepting issues where public disclosure would harm privacy or safety)
AntiComposite
Yes! These are fundamental requirements that need to be met by the process that will be implemented in the second phase (Aug - end of 2020). It seems there will be an opportunity to incorporate these requirements:
The second phase, outlining clear enforcement pathways, and *refined with broad input from the Wikimedia communities*, will be presented to the Board for ratification by the end of 2020;
I'd add a few more points:
- To handle workload and different languages, local boards should be selected as the first step of the process, with possible escalation to a global board if necessary (e.g. for conflict-of-interest reasons).
- To minimize bias, the boards should consist of people from different areas. As long as the local DR processes remain operational (ANI and the like), there should be a clear separation of powers: CoC board members should not be involved with local DR, to avoid concentration of power. Being an admin should not be a requirement; in fact, adminship and dispute resolution should be separate roles, as the latter requires specific training or experience, which is not part of the requirements to be an admin.
- There should be at least two independent global boards so one can review the other's decisions and handle appeals. Cases should be evaluated by the board that has more members unrelated to the involved parties.
- Functionaries and board members should be regularly reviewed, and terms should be limited to a few years.
About the DR process:
- Most of our communication is publicly visible on-wiki; therefore, cases should be resolved in public. Transparency is crucial for community review and a great learning opportunity about dispute resolution.
- Privately handled cases should only happen when all parties agree to it, so one party can't use "privacy" as a means to avoid the burden of proof. Non-public evidence should only be taken into account if there is a very strong justification, proportional to the sanction that comes from it.
- Reports, however, should be created privately and published only when the case opens. Before the case opens, the reporter might seek advice and help to create the report from people they trust. I've outlined a process draft for this in the context of the User Reporting System: https://meta.wikimedia.org/wiki/Talk:Community_health_initiative/User_reporting_system_consultation_2019#Factual,_evidence_based_reporting_tool_-_draft,_proposal
- Reports should be treated with respect, as the personal experience of a person. Nobody should be sanctioned for what a report contains, whether the boards or the community find it true or false, as that would be a deterrent to reporting influential users who made a mistake or lost their way.
- The focus should be on dispute *resolution*. Disputes and the resulting reports often start with disagreements, not bad intent towards each other. Mediation is an effective approach to finding a mutually agreeable resolution in these situations. Such resolutions create a more cooperative environment and allow for personal growth and learning from mistakes. Mediators should be hired and board members offered mediator training to support this path.
- When necessary, only the minimal sanctions should be applied that prevent the reported behaviour, to reduce the abuse potential of blocking. Partial blocks were a great step in this direction: typical conduct issues should be addressed early on with minor sanctions, not after years of misconduct, when a ban becomes warranted. Bans and project-wide blocks should only be used after numerous escalations and repeated sanctions, or in clear-cut cases of extreme misconduct.
Dispute resolution is difficult and often requires effort from all parties. The above approaches are unusual compared to the traditional handling of disputes, which often results in one-sided sanctioning of the party with less support from the community. However, adopting new ways of dispute resolution is necessary to create an inclusive community, where editors are treated equally and fairly, regardless of their status.
These are just superficial thoughts, which I'll detail in the second phase.
Thanks, Aron (Demian)
That's a tricky topic, especially when local dispute resolution bodies (which should in most cases be approached first, I agree) cannot solve the dispute, or when multiple projects are involved. At the moment there is in fact a lack of such a body, and of course it should be transparent, composed of community members from diverse backgrounds who are trained and supported by professional mediation, etc., as pointed out. Currently, stewards like me are quite often approached with such topics, but this user group is more focused on technical matters like user rights. A former steward fellow and I discussed this topic at the Safety Space at Wikimania. Due to the nature of the space, the discussion has not been documented, but you can find the presentation, with background on the situation and open questions, on Commons (https://commons.wikimedia.org/wiki/File:Wikimania_2019_%E2%80%93_Do_we_need_a_global_dispute_resolution_committee%3F.pdf). Maybe it can give some ideas on how to proceed with this.
Best, Martin/DerHexer
the reported behaviour, to reduce the abuse potential of blocking. Partial blocks was a great step in this direction: typical conduct issues should be addressed early on with minor sanctions, not after years of misconduct, when a ban becomes warranted. Bans and project-wide blocks should only be used after numerous escalations and repeated sanctions, or in clear-cut cases of extreme misconduct.
Dispute resolution is difficult and often requires effort from all parties. The above approaches are unusual compared to the traditional handling of disputes, which often results in one-sided sanctioning of the party with less support from the community. However, adopting new ways of dispute resolution is necessary to create an inclusive community, where editors are treated equally and fairly, regardless of their status.
These are just superficial thoughts, which I'll detail in the second phase.
Thanks, Aron (Demian) _______________________________________________ Wikimedia-l mailing list, guidelines at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and https://meta.wikimedia.org/wiki/Wikimedia-l New messages to: Wikimedia-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe
The board resolution aims at "addressing harassment and incivility on Wikimedia projects".
I don't see that this covers "disputes", i.e. disputes over content. We can, of course, disagree with someone totally over a topic, as long as we discuss our differences in a civil and respectful way - and consider our opponent's point of view and arguments seriously.
Regards, Thyge - Sir48
A former steward fellow and I discussed this topic at the Safety Space at Wikimania. Due to the nature of the space, the discussion has not been documented, but you can find the presentation with background on the situation and open questions on Commons: https://commons.wikimedia.org/wiki/File:Wikimania_2019_%E2%80%93_Do_we_need_a_global_dispute_resolution_committee%3F.pdf. Maybe it can give some ideas on how to proceed with this.
Yes -- I was just thinking of your discussions of this while reading the thread. I hope these steward reflections are considered as people move forward.
The case of disputes that embroil an entire community and their admins should (also) specifically be addressed. S
What Martin mentions should be covered in the recommendations for the 2030 strategy, with the measures mentioned here being "fast-tracked" to provide a starting point for improving Community Health. Conflict resolution needs to happen at the lowest possible level so that we don't run into the situations we've encountered in the past. Of course it's difficult for one aspect to work without the other, so the overall goal won't be achieved until every part is in place.
Is anyone not already aware of the recent issue facing Facebook over compensation for moderators? https://techcrunch.com/2020/05/12/facebook-moderators-ptsd-settlement/ To me there appears to be a potential risk that the Board and the WMF must consider in relation to any role that involves any form of moderation:
1. Is there a problem with setting standards against harassment, toxic behavior, and incivility that are, at a minimum, equal, understandable, and respected on all projects, committees, affiliates, events and everything else we do?
2. Is there concern about being asked to contribute at these standards?
3. Is the concern how much the WMF needs to be part of the process?
4. Or how long should it be allowed to go unaddressed before it is escalated?
I go back a long way: I remember group-targeted stalking of female admins, and I was part of a group of admins willing to take action against that group. We lost some very good people during that. Harassment has been an ongoing issue for all of my 15 years; we have had some of the worst people become tool holders, while others have created thousands of socks. There are still trolls and harassers contributing today, and we know that our failures to deal with this effectively and quickly are legendary. Whatever we do, we need to keep improving our response and our ability to respond across projects; the alternative is that the Board and the WMF will have to step in and take responsibility out of the community's hands.
With regard to the issue Facebook is having: if that were to become an issue on Wikimedia projects, something likely would have happened already. The majority of disturbing content is handled by volunteers, and what T&S handles is often sent to them by volunteers.
Also, given the relatively complicated upload process (compared to Facebook), we simply don’t get nearly as many problematic uploads as they do.
We of course do not have as many problematic uploads as Facebook does (and, to be honest, from personal experience I am not really impressed with the quality of their moderators), but we still get several hundred obvious copyright violations uploaded to Commons every day, and several hundred junk articles started on the English Wikipedia that do not pass the new page patrol barrier (deleted or draftified forever). I am sure the situation is similar in other big projects. All of this is cleaned up by a very few people who, on top of the time lost on these tasks, are also subject to constant abuse. Note that I am not saying that the WMF must pay admins compensation (stronger still, I will likely leave WMF projects if it starts doing so), but the emotional drain on those who are dealing with this shit on a daily basis is real. I am afraid, though, that it has no solution, because we know that the obvious solution - getting more people - does not work.
I am not even talking about off-wiki harassment, which in my experience is rarer but much worse, because you do not know how real the threats are. Last time I had to report to the police. This one has no solution either.
Best Yaroslav
Thanks for this step - I wish it weren't necessary. I'm not sure of all the implications, but I was mostly wondering: will this be primarily a stick, or is the Foundation also going to invest more heavily in carrots and education?
I get the impression that we have much progress to make in training, educating and exposing correct behavior (some chapters have made attempts at this). So much of our energy already goes into the bad behavior that it exhausts many community members. I'm confident that the Trust and Safety team lives through a more extreme version of that daily.
I wish we could manage to build an ecosystem that encourages good behavior, diverts bad behavior at a very early stage, and removes the bad actors who cannot be corrected. That's probably not as popular as punishing people, but hopefully more constructive for the community as a whole.
Lodewijk
On Fri, May 22, 2020 at 4:52 PM Nataliia Tymkiv ntymkiv@wikimedia.org wrote:
Hello, Dennis!
Not at all. What it means is that this is not a process that goes into play *before* a decision to act is made, but *after*. It should stand as an option for those who want to ensure that actions taken are fair, as long as the case does not relate to legal risks or other severe concerns.
Best regards, antanana / Nataliia Tymkiv
NOTICE: You may have received this message outside of your normal working hours/days, as I can usually work more as a volunteer during weekends. You should not feel obligated to answer it during your days off. Thank you in advance!
On Sat, May 23, 2020, 01:58 Dennis During dcduring@gmail.com wrote:
"Work with community functionaries to create and refine a retroactive review process for cases brought by involved parties, excluding those
cases
which pose legal or other severe risks "
What does "retroactive review process" mean?
I hope it doesn't mean applying standards that were not promulgated at
the
time to past actions and applying severe sanctions to the alleged perpetrators.
Hello,
I share the Board's interest in many of these matters. However, I remind the WMF Board that it exists to serve the Wikimedia community, not the other way around. Also, I recommend that the Board take great care in approaching these matters diplomatically and with appreciation for the volunteers who spend considerable amounts of personal time developing and protecting the Wikimedia projects, and who are not WMF's employees or servants. I suggest to the WMF Board that a self-righteous tone is likely to be counterproductive for its intended outcomes.
Enormous amounts of volunteer time have already been spent considering and deliberating on civility and related complex issues.
I suggest to the Board that while being appropriately mindful of where there are problems, the Board should also spend at least as much time considering what is going well and expressing gratitude to the thousands of volunteers who strive to protect Wikipedia and the sister projects.
I remind the Board that it is not the governing entity for the Wikimedia community. The WMF Board governs the Wikimedia Foundation corporate entity. The corporate entity exists to serve the community, not the other way around.
I also remind the Board that ill-considered interventions may do more harm than good. Goodwill is easy to destroy.
Worked out great the last time WMF tried to pull something like this, didn't it?
https://en.wikipedia.org/wiki/Wikipedia:Community_response_to_the_Wikimedia_...
Oh, wait. By "worked out great" I mean "was an unmitigated disaster." One wonders if the folks at the WMF are capable of learning from mistakes, and one is not encouraged by the apparent answer.
Todd
How is a one-off ban comparable in any way with a structured effort to develop a policy in consultation with the community, and then implement it together?
Lodewijk