Dear Richard and all,
The recommendations from this 2020 report that you have now published make
for interesting reading:
– Some (UCoC, Human Rights Policy) have clearly been implemented since you
received the report two years ago.
– Others (training for admins and advanced rights holders) are in the
process of being implemented.
– Some ("audit protocol to assess projects that are at high risk of capture
or government-sponsored disinformation") have been at least partially
implemented (Croatian Wikipedia, disinformation hires; Japanese Wikipedia?).
– Others ("provide access to a geotargeted suicide prevention hotline at
the top of the articles on Suicide Methods") have neither been discussed
(to my knowledge) nor implemented to date.
– Yet others ("develop a Content Oversight Committee (COC) to review
content with a focus on bias and have the ability to make binding editorial
decisions in line with ICCPR 19") have not been discussed, and
implementation status in the various language versions is unknown.
Could you provide an overview here or on Meta as to the status of each of
the priority recommendations?
I append the complete set of priority recommendations below for everybody's
reference.
Article One developed a suite of recommendations to address each category
of salient risks. *We recognize the need to engage and secure input from
Wikimedia’s vast volunteer base and as such recommend that the Foundation
consult with volunteers and other experts to determine the best path
forward.* Priority recommendations include:
Strategies for the Foundation
1. Develop a standalone Human Rights Policy that commits to
respecting all internationally recognized human rights by referencing the
International Bill of Human Rights.
2. Conduct ongoing human rights due diligence to continually assess
risks to rightsholders. A Foundation-level HRIA should be conducted every
three years or whenever significant changes could have an effect on human
rights.
3. Develop rights-compatible channels to address human rights
concerns, including private channels, and ensure alignment with the UNGPs'
effectiveness criteria.
Harmful content
1. Develop an audit protocol to assess projects that are at high
risk of capture or government-sponsored disinformation.
2. Develop a Content Oversight Committee (COC) to review content
with a focus on bias and have the ability to make binding editorial
decisions in line with ICCPR 19.
3. Continue efforts outlined in the Knowledge Integrity white paper
to develop: a) a machine-readable representation of knowledge that exists
within Wikimedia projects along with its provenance; b) models to assess
the quality of information provenance; and c) models to assess content
neutrality and bias. Ensure that all AI/ML tools are designed to detect
content and action that would be considered illegal under international
human rights law, and that the response aligns with the three-part ICCPR
test requiring that any restriction on the right to free expression be
legal, proportional, and necessary.
4. Provide access to a geotargeted suicide prevention hotline at the
top of the articles on Suicide Methods.
Harassment
1. Develop and deploy training programs for admins and volunteers
with advanced rights on detecting and responding to harassment claims.
2. Commission a “social norms marketing” research project to assess
what type of messaging is likely to reduce and prevent harassing comments.
3. Explore opportunities to rate the toxicity of users, helping to
identify repeat offenders and patterns of harassment. Consider awards for
projects with the lowest toxicity levels.
4. Consider developing admin metrics focused on enforcing civility
and applying the forthcoming Universal Code of Conduct (UCoC).
5. Ensure that the UCoC and its accompanying governance mechanism
are reviewed by human rights experts, including experts on free expression
and incitement to violence.
Government surveillance and censorship
1. Continue efforts underway as part of the IP-masking project to
further protect users from public identification.
2. Develop awareness-raising tools and programs for all volunteers
to understand and mitigate risks of engagement. Tools should be made
publicly available and should be translated into languages spoken by
volunteers in higher-risk regions.
Risks to child rights
1. Conduct a child rights impact assessment of Wikimedia projects,
including conducting interviews and focus groups with child contributors
across the globe.
2. Create child safeguarding tools, including child-friendly
guidance on privacy settings, data collection, reporting of grooming
attempts, and the forthcoming UCoC, as well as a “Child’s Guide to Editing
Wikimedia Projects” to help advance the right of children to be civically
engaged.
Limitations on knowledge equity
1. Support retention by developing peer support and mentoring for
new contributors.
2. Engage stakeholders on how the “notability” requirement may be
shifted to be more inclusive of oral histories, and to identify what
definitions resonate with under-represented communities.
3. Adapt Wikimedia projects to be more accessible via mobile phones.
On Tue, Jul 12, 2022 at 4:31 PM Richard Gaines <rgaines(a)wikimedia.org>
wrote:
In December 2021, the Wikimedia Foundation announced
our new Human Rights Policy. At the time, we also committed
to publishing more information from the Human Rights Impact Assessment
(HRIA) that helped to inform the policy. Today, I am glad to be able to
share that HRIA with you all.
Wikimedia projects play an important role in enabling people around the
world to exercise their human rights, including the right to access
knowledge. As host of these projects, the Foundation is committed to
protecting and respecting the human rights of all those who use our
projects to
access, share, and contribute knowledge. In recognizing this critical role,
we commissioned the Human Rights Impact Assessment in 2020 to provide a
better understanding of potential human rights harms our projects may
inadvertently cause as well as recommendations to address them. We release
the report now to invite broader discussion on the report’s findings and
recommendations in order to determine which ones the Foundation should
prioritize moving forward.
You can find the assessment on Meta-Wiki,
which includes a foreword authored by the Foundation, an executive summary
of the assessment, and a detailed analysis of the five categories of human
rights risks identified in the report (harmful content, harassment,
government surveillance and censorship, impacts on child rights, and
limitations on knowledge equity). The foreword and executive summary are
also available in Arabic.
You can also find more information about the assessment in a blog post.
We will be hosting a series of conversation hours in the coming weeks
where you can share feedback and ask questions about the assessment:
14 July (18:00–19:30 UTC): Community Affairs Committee Meeting
28 July (12:00 UTC): Global Advocacy Community Conversation Hour
28 July (17:00 UTC): Global Advocacy Community Conversation Hour
We also invite questions and feedback on the discussion page of the
assessment on Meta-Wiki, as well as through the Movement Strategy Forum.
This assessment can help all stakeholders in the Wikimedia movement to
better understand the human rights risks and threats that we jointly face,
and the work required to reduce those risks. By doing so, the Foundation,
volunteers, and affiliates can work together to protect both our movement
and our people.
We look forward to engaging with all of you on this HRIA, its
recommendations, and your thoughts on how to move forward on the insights
offered by this report.
*Ricky Gaines *(he/him)
Senior Human Rights Advocacy Manager
Wikimedia-l mailing list -- wikimedia-l(a)lists.wikimedia.org, guidelines
Public archives at
To unsubscribe send an email to wikimedia-l-leave(a)lists.wikimedia.org