Dear Richard and all,
The recommendations from this 2020 report, which you have now published, make for interesting reading.
– Some (UCoC, Human Rights Policy) have clearly been implemented since you received the report two years ago.
– Others (training for admins and rights holders) are in the process of implementation/community negotiation.
– Some ("audit protocol to assess projects that are at high risk of capture or government-sponsored disinformation") have been at least partially implemented (Croatian Wikipedia, disinformation hires; Japanese Wikipedia?).
– Others ("provide access to a geotargeted suicide prevention hotline at the top of the articles on Suicide Methods") have neither been discussed (to my knowledge) nor implemented to date.
– Yet others ("develop a Content Oversight Committee (COC) to review content with a focus on bias and have the ability to make binding editorial decisions in line with ICCPR 19") have not been discussed, and implementation status in the various language versions is unknown.
Could you provide an overview here or on Meta as to the status of each of the priority recommendations?
I append the complete set of priority recommendations below for everybody's reference.
Best, Andreas
Article One developed a suite of recommendations to address each category of salient risks. *We recognize the need to engage and secure input from Wikimedia’s vast volunteer base and as such recommend that the Foundation consult with volunteers and other experts to determine the best path forward.* Priority recommendations include:

Strategies for the Foundation
1. Develop a standalone Human Rights Policy that commits to respecting all internationally recognized human rights by referencing the International Bill of Human Rights.
2. Conduct ongoing human rights due diligence to continually assess risks to rightsholders. A Foundation-level HRIA should be conducted every three years or whenever significant changes could have an effect on human rights.
3. Develop rights-compatible channels to address human rights concerns, including private channels, and ensure alignment with the UNGPs’ effectiveness criteria.

Harmful Content
1. Develop an audit protocol to assess projects that are at high risk of capture or government-sponsored disinformation.
2. Develop a Content Oversight Committee (COC) to review content with a focus on bias and have the ability to make binding editorial decisions in line with ICCPR 19.
3. Continue efforts outlined in the Knowledge Integrity white paper to develop: a) a machine-readable representation of knowledge that exists within Wikimedia projects along with its provenance; b) models to assess the quality of information provenance; and c) models to assess content neutrality and bias. Ensure that all AI/ML tools are designed to detect content and action that would be considered illegal under international human rights law, and that the response aligns with the three-part ICCPR test requiring that any restriction on the right to free expression be legal, proportional, and necessary.
4. Provide access to a geotargeted suicide prevention hotline at the top of the articles on Suicide Methods.

Harassment
1. Develop and deploy training programs for admins and volunteers with advanced rights on detecting and responding to harassment claims.
2. Commission a “social norms marketing” research project to assess what type of messaging is likely to reduce and prevent harassing comments and actions.
3. Explore opportunities to rate the toxicity of users, helping to identify repeat offenders and patterns of harassment. Consider awards for projects with the lowest toxicity levels.
4. Consider developing admin metrics focused on enforcing civility and applying the forthcoming Universal Code of Conduct (UCoC).
5. Ensure that the UCoC and its accompanying governance mechanism are reviewed by human rights experts, including experts on free expression and incitement to violence.

Government surveillance and censorship
1. Continue efforts underway as part of the IP-masking project to further protect users from public identification.
2. Develop awareness-raising tools and programs for all volunteers to understand and mitigate risks of engagement. Tools should be made publicly available and should be translated into languages spoken by volunteers in higher risk regions.[1]

Risks to child rights
1. Conduct a child rights impact assessment of Wikimedia projects, including conducting interviews and focus groups with child contributors across the globe.
2. Create child safeguarding tools, including child-friendly guidance on privacy settings, data collection, reporting of grooming attempts, and the forthcoming UCoC, as well as a “Child’s Guide to Editing Wikimedia Project” to help advance the right of children to be civically engaged.

Limitations on knowledge equity
1. Support retention by developing peer support and mentoring for under-represented contributors.
2. Engage stakeholders on how the “notability” requirement may be shifted to be more inclusive of oral histories, and to identify what definitions resonate with under-represented communities.
3. Adapt Wikimedia projects to be more accessible via mobile phones.
On Tue, Jul 12, 2022 at 4:31 PM Richard Gaines <rgaines@wikimedia.org> wrote:
Hello!
In December 2021, the Wikimedia Foundation announced our new Human Rights Policy (https://diff.wikimedia.org/2021/12/09/what-the-wikimedia-foundations-new-human-rights-policy-means-for-our-movement/). At the time, we also committed to publishing more information from the Human Rights Impact Assessment (HRIA) that helped to inform the policy (https://foundation.wikimedia.org/wiki/Policy:Human_Rights_Policy/Frequently_asked_questions#Frequently_asked_questions_(published_9_December_2021)). Today, I am glad to be able to share that HRIA with you all.
Wikimedia projects play an important role in enabling people around the world to exercise their human rights, including the right to access knowledge. As host of these projects, the Foundation is committed to protecting and respecting the human rights of all those who use our projects to access, share, and contribute knowledge. In recognition of this critical role, we commissioned the Human Rights Impact Assessment in 2020 to provide a better understanding of the potential human rights harms our projects may inadvertently cause, as well as recommendations to address them. We are releasing the report now to invite broader discussion of its findings and recommendations, and to determine which ones the Foundation should prioritize moving forward.
You can find the assessment on Meta-Wiki (https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Human_Rights_Impact_Assessment). It includes a foreword authored by the Foundation, an executive summary of the assessment, and a detailed analysis of the five categories of human rights risks identified in the report (harmful content, harassment, government surveillance and censorship, impacts on child rights, and limitations on knowledge equity). The foreword and executive summary are also available in Arabic (https://commons.wikimedia.org/wiki/File:Wikimedia_HRIA_Foreword_%2B_Executive_Summary_(Arabic).pdf?useskin=vector-2022), Chinese (Traditional) (https://commons.wikimedia.org/wiki/File:Wikimedia_HRIA_Foreword_%2B_Executive_Summary_Chinese_(Traditional).pdf?useskin=vector-2022), French (https://commons.wikimedia.org/wiki/File:Wikimedia_HRIA_Foreword_%2B_Executive_Summary_(French).pdf?useskin=vector-2022), Russian (https://commons.wikimedia.org/wiki/File:Wikimedia_HRIA_Foreword_%2B_Executive_Summary_(Russian).pdf?useskin=vector-2022), and Spanish (https://commons.wikimedia.org/wiki/File:Wikimedia_HRIA_Foreword_%2B_Executive_Summary_(Spanish).pdf?useskin=vector-2022). You can also find more information about the assessment in a blog post on Diff (https://diff.wikimedia.org/2022/07/12/what-does-the-wikimedia-foundations-human-rights-impact-assessment-mean-for-the-wikimedia-movement/).
We will be hosting a series of conversation hours in the coming weeks, where we invite you to share feedback and ask questions about the assessment:
14 July (18:00–19:30 UTC): Community Affairs Committee Meeting
28 July (12:00 UTC): Global Advocacy Community Conversation Hour (https://meta.wikimedia.org/wiki/Public_policy/Conversation_hours_and_Events)
28 July (17:00 UTC): Global Advocacy Community Conversation Hour (https://meta.wikimedia.org/wiki/Public_policy/Conversation_hours_and_Events)
We also invite questions and feedback on the assessment's discussion page on Meta-Wiki (https://meta.wikimedia.org/wiki/Talk:Wikimedia_Foundation_Human_Rights_Impact_Assessment), as well as through the Movement Strategy Forum (https://forum.movement-strategy.org/t/did-you-see-the-wikimedia-foundation-published-a-human-rights-impact-assessment/1175).
This assessment can help all stakeholders in the Wikimedia movement better understand the human rights risks and threats that we jointly face, and the work required to reduce those risks. With that shared understanding, the Foundation, volunteers, and affiliates can work together to protect both our movement and our people.
We look forward to engaging with all of you on this HRIA, its recommendations, and your thoughts on how to move forward on the insights offered by this report.
Thank you,
--
Ricky Gaines (he/him)
Senior Human Rights Advocacy Manager
Wikimedia Foundation
rgaines@wikimedia.org