As we have stated in our annual plan, “currently, community members
must search many pages and places to stay informed about Foundation
activities and resources.” Over the past two quarters we have worked
to create a single point of entry. We call it the Wikimedia Resource
Center, and its alpha version is now live on Meta-Wiki:
As the movement expands to include more affiliates and more programmatic
activities every year, newer Wikimedians often lack experience with the
movement and its various channels for requesting support. To support
Wikimedia communities’ efforts, we want to provide easy access to
resources that underpin their important work. The [[m:Wikimedia
Resource Center]] is a hub designed in response to this issue: it is
intended to evolve into a single point of entry for Wikimedians all over
the world to the variety of resources and types of staff support they may
need to develop new initiatives or expand existing ones.
This version of the Resource Center is only the beginning. In phase two of
the project, we will enable volunteer Wikimedians to add resources
developed by other individuals or organizations to the Wikimedia Resource
Center, and in phase three, the Wikimedia Resource Center will include
features to better connect Wikimedians to other Wikimedians who can offer support.
We want to hear what you think about this prototype and our plans for it!
If you have comments about the Wikimedia Resource Center, you can submit
your feedback publicly on the talk page, or privately via a survey hosted
by a third party, which should take no more than four minutes to complete.
A feedback button is in the top right corner of every page of the hub.
Looking forward to more collaborations!
Communications and Outreach Project Manager, Community Engagement
"A Call to Men UK has 55 coaches working in schools, youth justice
departments and youth centres across Worcestershire. The organisation has
one principal aim, explains development manager Michael Conroy: to spark a
'cultural shift in the way boys relate to girls', and through this to
prevent violence against women and girls.... 'As a culture it’s time that
we gave our young men permission to be complex, sensitive and happy human
beings who transmit positivity and respect to others'.” 
They have a program "for young men from 11-19", which, if you think about
it, is pretty much the demographic of Wikimedia's admins and functionaries.
This is all the more interesting right now because of the recent Newmark
Foundation grant to combat harassment, which it seems is to be used for
developing more forceful blocking tools for admins and functionaries "with
the participation and support of the volunteers who will be using the
tools". If anyone has not seen the Susan J. Fowler / Uber piece on
harassment that has started going viral in the last 24 hours, it is
worth reading: "... didn't do anything because the manager who
threatened me was a 'high performer.'" Sound familiar? This happened in
a company with HR oversight; Wikimedia admins and functionaries have no
oversight at all.
We are preparing our bias detection tool <http://net.wanderingliquen.com/>
to help us track content bias about women that can be edited during this
month. To see how the tool works, you can watch this video.
We are inviting groups working on events and edit-a-thons to let us know
which categories they will be working on, so that we can upload those
categories as new maps in the app. (For example, Wikimujeres is working on
an event to edit content related to women in science, and our suggestion
is to work with the category "scientists"; if the category is too big,
for example "science", the app may take too long to render the map, which
would be a major usability setback.)
We will release weekly reports listing the problematic pages per category,
indicating which bias has been detected on each page, so any volunteer can
pick up the lead and review and improve any of the listed pages.
The idea is that these weekly reports will run for the whole month as a
kind of online support for the Women's Day events.
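As a rough illustration of where the category maps come from, here is a minimal sketch, assuming the standard MediaWiki Action API: it fetches the members of a category (the pages and subcategories that would become nodes in one of the app's maps). "Category:Scientists" is just the example category mentioned above; the app's real pipeline may work differently.

```python
# Sketch: list the members of a Wikipedia category via the MediaWiki
# Action API. These titles are the raw material for a category map.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def member_titles(payload):
    """Extract page/subcategory titles from a categorymembers response."""
    return [m["title"] for m in payload["query"]["categorymembers"]]

def category_members(category, limit=50):
    """Query the Action API for up to `limit` members of `category`."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": str(limit),
        "format": "json",
    }
    url = API + "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(
        url, headers={"User-Agent": "category-map-sketch/0.1"}
    )
    with urllib.request.urlopen(req) as resp:
        return member_titles(json.load(resp))

# Example (makes a live request):
# category_members("Category:Scientists")
```

A very large category like "science" would return many members (and need continuation requests), which is exactly why the suggestion above is to pick a narrower category.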
I am aware that these days are really busy, but if you want to let me know
which categories you would like uploaded for reviewing, I would highly
appreciate it.
creative director and co-founder
Media & Memory Research Initiative
University of Hull (UK)
+44 (0) 7923 528128
Hello, everyone. I’m Trevor, a new Product Manager here at the WMF and I’ll
be working on the Community Health Initiative.
If you’re looking to read press about this new initiative, I personally
think the two best articles are the Wikimedia Foundation’s blog post
published on January 26th 2017 and Craig Newmark’s blog post published
the following day.
This specific Newmark Foundation grant will be used to staff a software
development team to research, design, and build tools for wiki contributors
and functionaries to detect, report, and evaluate incidents of harassment
and block offenders if appropriate. At this moment in time we believe all
four pillars — detection, reporting, evaluation, and blocking — are equally
important. This initiative is not just about blocking users.
If you’re looking for in-depth details, the best places to learn more are
the Meta page on the Community Health Initiative (which is admittedly
still a work-in-progress) and the 14-page grant proposal.
But ‘tools’ are just part of the equation. An equally important part of
this initiative will be working with the communities to evaluate and
improve the conduct policies and dispute resolution processes. This is
outside the scope of this specific Newmark Foundation grant but is
crucial to the goal of reducing the frequency of harassment and the
complexity of resolving incidents.
One of the challenges for this initiative will be crafting, committing to,
and executing an inclusive and open communication strategy that encourages
and fosters constructive participation in the process. Community input is
vital to this initiative. As we staff up this new team we’ll begin making
decisions on how best to include everyone who wants to be involved. Until
then, I personally encourage you to discuss this initiative on its talk page.
I am writing to you on behalf of my group (www.liquendatalab.com)
<https://www.liquendatalab.com/>. Over the last months we have been
working on building an online bias detection tool to help fight
systemic bias in Wikipedia, but unfortunately we lost our developer
before finishing the project. This is why I would love to ask you for
assistance to finish this digital project in time for the next Art+Feminism
edit-a-thon, which is in less than 4 weeks.
Our prototype has been designed to allow women, Black and queer folks to
react to biased content without exposing themselves to online abuse,
taking into consideration issues of oppressed group identities within the
community. You can access the prototype here --> http://net.wanderingliquen.com/
A brief description of the project:
When accessing the prototype for Art+Feminism, users can explore 7
Wikipedia categories and react with the feedback buttons when they think
there is biased information. Users need to register before accessing.
After registering, the app offers a series of maps (vector graphs) to
explore Wikipedia categories. The maps are built as vector graphs where
nodes represent pages and subcategories. Red nodes indicate controversy on
the page (controversy is calculated by counting the number of comments on
the discussion page of each Wikipedia entry). When clicking a node, a
module appears on the left side of the screen showing the article abstract
and a link to the Wikipedia page. Users can then check the
information on Wikipedia and, if they think it is biased,
react with the feedback buttons.
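The controversy heuristic described above (counting comments on the discussion page) could be sketched roughly as follows. This is only an illustration: comments are approximated by counting signature timestamps in the talk-page wikitext, and the red-node threshold is an invented value, not the prototype's actual cutoff.

```python
# Sketch of the controversy heuristic: count the comments on an
# article's talk page, approximated by counting the "(UTC)" signature
# timestamps that end each comment in standard wikitext discussions.
import re

# e.g. "09:15, 3 March 2017 (UTC)"
TIMESTAMP = re.compile(r"\d{2}:\d{2}, \d{1,2} \w+ \d{4} \(UTC\)")

def comment_count(talk_wikitext: str) -> int:
    """Approximate the number of comments via signature timestamps."""
    return len(TIMESTAMP.findall(talk_wikitext))

def is_controversial(talk_wikitext: str, threshold: int = 10) -> bool:
    """Red node in the map: talk page has >= `threshold` comments.
    The threshold value here is an assumption for illustration."""
    return comment_count(talk_wikitext) >= threshold
```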
The bias button aims to tackle precisely this time-consuming process of
locating biased content. The feedback buttons work in a similar way to the
Facebook Like button, but instead of implying ‘I like this content’ with
their reaction, users can imply ‘I think this content is biased’ because
it is Islamophobic, or sexist, or ageist, or ableist, or homophobic, etc.
This particular feature may prove useful first to gather quantitative
evidence regarding biased content in Wikipedia, because the community is
pointing out where we have issues, and what those issues are. It is also
expected that this information, along with other features for collaborative
working still to be developed, will be of assistance to the community of
editors, especially for online collaboration during the edit-a-thons that
Wikimedia organizes yearly around Women’s Day. The long-term goal of the
application is to give feminist communities an alternative space, away
from sexist environments, that allows them to keep focused on
women-centred work.
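To make the Like-button analogy concrete, a reaction could be stored as a small record and tallied per page for the weekly reports. The schema and labels below simply mirror the description above; the prototype's actual data model may differ.

```python
# Sketch of how feedback-button reactions might be stored and tallied.
from collections import Counter
from dataclasses import dataclass

# The bias labels named in the text above.
BIAS_LABELS = {"islamophobic", "sexist", "ageist", "ableist", "homophobic"}

@dataclass(frozen=True)
class Reaction:
    user: str   # registered user who pressed the button
    page: str   # Wikipedia article title
    label: str  # which kind of bias was flagged

    def __post_init__(self):
        if self.label not in BIAS_LABELS:
            raise ValueError(f"unknown bias label: {self.label}")

def tally(reactions):
    """Per-page counts of each bias label, as quantitative evidence."""
    counts = {}
    for r in reactions:
        counts.setdefault(r.page, Counter())[r.label] += 1
    return counts
```

A weekly report would then just be the tally for each category's pages, sorted by total reactions.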
Nevertheless, there is still some work to do on the app, such as
redesigning the backend and the frontend with a focus on usability
issues, and improving some of the functionalities by adding a text
selector and charts to visualize the information about user reactions
gathered by the app.
We are looking for a volunteer developer willing to help us during this
last stage, and I am wondering if we could find them within this community.
Please find attached a more comprehensive report for the proposal, and
please don't hesitate to reach out if you have any doubt or comment.
Many thanks in advance,
creative director and co-founder
Media & Memory Research Initiative
University of Hull (UK)
+44 (0) 7923 528128
Firstly, apologies for any cross-posting.
Over the next week I'll be finalising a piece for a new MuseumsEtc.
publication, *Feminism and Museums: Intervention, Disruption and Change.* The
title of my chapter will be 'Closing the Gender Gap on Wikimedia' and the
focus will be on the work of the global Wikimedia movement to develop
responses to the exclusion or misrepresentation of women and their creative
works, in relation to museums. I hope to be able to draw on case studies of
good practice across the movement, showcasing partnership projects with
museums or the wider cultural sector including galleries, archives and
libraries, including those where the gender gap is compounded by
geographic, linguistic and racial bias.
I'd love to hear about any projects you've been involved with, however big
or small, that might fit within the broad remit of feminism, culture and
open knowledge. I have reached out to a number of different groups and
individuals and have got a great response so far, but thought this would be
a good place to post this request.
With thanks and best wishes
+44 (0) 207 065 0991
Wikimedia UK is a Company Limited by Guarantee registered in England and
Wales, Registered No. 6741827. Registered Charity No.1144513. Registered
Office 4th Floor, Development House, 56-64 Leonard Street, London EC2A 4LT.
Wikimedia UK is the UK chapter of a global Wikimedia movement. The
Wikimedia projects are run by the Wikimedia Foundation (who operate
Wikipedia, amongst other projects). *Wikimedia UK is an independent
non-profit charity with no legal control over Wikipedia nor responsibility
for its contents.*
Christophe, Carol and Fae's notes have set me thinking about what we
could do with these funds.
One of the areas that I understand has been a problem is email harassment,
particularly of women, and I believe particularly from throwaway accounts.
I was wondering what people on this list would think of some possible
changes we could make to the "email this user" system.
The first would be to allow editors to set their email to only receive from
confirmed or even extended-confirmed accounts. This would be invisible to
new editors; they'd just not see the *email this user* option for people
they weren't entitled to email.
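That first idea amounts to a simple permission check before showing the option. A minimal sketch, assuming English Wikipedia's group names; the preference values ("anyone", "confirmed", "extended-confirmed") are invented for illustration:

```python
# Sketch: gate "email this user" on the sender's user groups,
# according to a recipient-chosen preference.

# Which sender groups satisfy each (hypothetical) recipient preference.
ALLOWED = {
    "anyone": set(),  # empty set = no restriction
    "confirmed": {"confirmed", "autoconfirmed", "extendedconfirmed"},
    "extended-confirmed": {"extendedconfirmed"},
}

def can_email(sender_groups, recipient_pref="anyone"):
    """True if the sender should see the 'email this user' option."""
    required = ALLOWED[recipient_pref]
    if not required:
        return True
    return bool(required & set(sender_groups))
```

New editors would simply never see the option when `can_email` is false, as described above.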
The second would be an opt-in email moderation service. Similarly to only
receiving email from confirmed or extended-confirmed accounts, this would
enable editors to opt all or part of their email via the "email this user"
function into a moderated stream. Much as with moderated posts to lists
like this one, a list admin would see the email and either approve it or
take other action. You'd presumably need something on the send-email
screen saying "this editor has opted into email moderation and your
email will be delayed slightly before being screened and forwarded". You'd
also need a group of volunteers to do the moderation, spot abusive emails
and block abusers.
The third would be an AI-driven filter that people could opt into, which
would screen emails going through this system and put high-risk ones into
a moderation queue.
What do people think? If this existed, would it help? Would anyone have
used any of it?
Wow! When I think of the 2-plus hours a week, times some 385 weeks, I
spent dealing with guys who just didn't like the idea that a "female"
dared to edit - or worse, change their edit - I still tear my hair out.
I just hope it helps!!
I'd like to go back in a few years, when I hopefully have accomplished
other goals. Or ENCOURAGE women to edit, as opposed to now having to
warn them all the time about what they have to do to edit safely!