Chris Schilling gave a talk on harassment, with regard to June's Inspire Campaign [1], at yesterday's Metrics & Activities meeting [2]. In it, he discussed an idea I had about reducing/preventing user page harassment [3], which we turned into an RfC [4] and which is now being worked out on Phabricator [5].
- Pax aka Funcrunch
[1] https://youtu.be/4GHy3BIx3JM?t=16m29s
[2] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_metrics_and_activities_...
[3] https://meta.wikimedia.org/wiki/Grants:IdeaLab/Protect_user_space_by_default
[4] https://en.wikipedia.org/wiki/Wikipedia:Requests_for_comment/Protect_user_pa...
[5] https://phabricator.wikimedia.org/T149445
On 11/18/16 11:35 AM, C. Scott Ananian wrote:
A few weeks ago our Executive Director gave a talk on "Privacy and Harassment on the Internet" at MozFest 2016 in London. I encourage you to read the transcript:
https://en.wikisource.org/wiki/Privacy_and_Harassment_on_the_Internet
Katherine argued that the Wikimedia project can take a lead role in creating a culture of respect and inclusion online. I wholeheartedly agree, and I hope you all do too. She concluded with:
"We have a lot of work to do. I know that. We know that. As Molly’s story
illustrates, we are not there yet."
I'd like to open a broader discussion on how we get "there": how to build/maintain places where we can get work done and control abuse and vandalism while still remaining wide open to the universe of differing viewpoints present in our projects. We can't afford to create filter bubbles, but we must be able to provide users safe(r) spaces to work.
By habit I would propose that this be a technical discussion, on specific tools or features that our platform is currently missing to facilitate healthy discussions. But the "filter bubble" is a social problem, not a technical one. Our project isn't just a collection of code; it's a community, a set of norms and habits, and a reflection of the social process of collaboration. A graph algorithm might be able to identify a filter bubble and good UX can make countervailing opinions no more than a click away, but it takes human will to seek out uncomfortable truth.
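(To make the "graph algorithm" aside slightly more concrete, here is a rough, purely hypothetical sketch in Python. None of these names exist in our codebase, and how the interest clusters would be derived is hand-waved; it only shows one way to score how "bubbled" an editor's interactions are, given interaction data and a cluster assignment.)

from collections import defaultdict
from typing import Dict, Iterable, Tuple

def bubble_scores(
    interactions: Iterable[Tuple[str, str]],
    cluster_of: Dict[str, str],
) -> Dict[str, float]:
    """Fraction of each editor's interactions that stay inside their own cluster.

    A score near 1.0 suggests the editor rarely encounters differing viewpoints;
    a score near 0.0 suggests most of their interactions cross cluster lines.
    """
    same = defaultdict(int)
    total = defaultdict(int)
    for a, b in interactions:
        if a not in cluster_of or b not in cluster_of:
            continue  # skip editors we cannot place in any cluster
        for user, other in ((a, b), (b, a)):
            total[user] += 1
            if cluster_of[user] == cluster_of[other]:
                same[user] += 1
    return {user: same[user] / total[user] for user in total}

# Example: editors who only ever interact within their own cluster score 1.0.
interactions = [("Alice", "Bob"), ("Alice", "Bob"), ("Carol", "Dave")]
clusters = {"Alice": "topic-A", "Bob": "topic-A", "Carol": "topic-B", "Dave": "topic-B"}
print(bubble_scores(interactions, clusters))
# {'Alice': 1.0, 'Bob': 1.0, 'Carol': 1.0, 'Dave': 1.0}

(Again: that is only the measurement half, and the easy half at that. The UX and the human will are the hard parts.)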
So although my endgame is specific engineering tasks, we need to start with a broader conversation about our work as social creatures. How do we work in the projects, how do we communicate among ourselves, and how do we balance openness and the pursuit of truth with the fight against abuse, harassment, and bias?
Let's discuss discussions!
Here are some jumping off points; feel free to contribute your own:
We currently use a mixture of Talk pages, Echo, mailing lists, IRC, Phabricator, OTRS, Slack, Conpherence, and Google Docs on our projects, each with different logging, publication, privacy/identity, and other characteristics. I tried to start cataloging them here:
https://lists.wikimedia.org/pipermail/wikimedia-l/2016-November/085542.html
Because of this diversity, we lack a unified code of conduct or mechanism to report/combat harassment and vandalism.
Matt Flaschen replied in the above thread with an update on the Code of Conduct for technical spaces:
https://lists.wikimedia.org/pipermail/wikimedia-l/2016-November/085542.html
...which should definitely help! The creation of a centralized reporting mechanism, in particular, would be most welcome.
I created a proposal for the Wikimedia Developer Summit in January discussing "safe spaces" on our projects:
https://phabricator.wikimedia.org/T149665
Subscribe/comment/click "award token" to support its inclusion in the dev summit or to start a conversation there.
I have another, broader, proposal as well, on the "future of chat" on our projects:
https://phabricator.wikimedia.org/T149661
Subscribe/comment/click "award token" there if that angle piques your interest.
It seems that "a group of users" arises repeatedly as an architectural meta-concept, whether it's a group of collaborators you want to invite to an editing session, a group of users you want to block or ban, a group of users who belong to a particular WikiProject, or a group who watch a certain page. We don't really have a first-class representation of that concept in our code right now. In previous conversations I've heard that people "don't want <their wiki project> to turn into another Facebook" and so have pushed back strongly on the idea of "friend lists" (one type of group of users) -- but inverting the concept to let WikiProjects maintain a list of "members of the WikiProject" is more palatable, more focused on the editing task. From a computer science perspective, "friend list" and "member of a WikiProject" might seem identical -- they are both lists of users -- but from a social perspective the connotations and focus are significantly different. But who administers that list of users?
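To illustrate what a "first-class representation" could look like -- this is a sketch only, nothing like it exists in MediaWiki today, and every name below is invented -- something as simple as:

from dataclasses import dataclass, field
from typing import Set

@dataclass
class UserGroup:
    name: str                     # e.g. "WikiProject Medicine members"
    purpose: str                  # e.g. "wikiproject", "block-list", "collaborators"
    members: Set[str] = field(default_factory=set)
    administrators: Set[str] = field(default_factory=set)  # who may change membership

    def add_member(self, actor: str, user: str) -> None:
        # Membership changes are themselves an access-controlled action.
        if actor not in self.administrators:
            raise PermissionError(f"{actor} may not modify {self.name}")
        self.members.add(user)

# The same structure carries two very different social meanings:
wikiproject = UserGroup("WikiProject Medicine members", "wikiproject",
                        administrators={"ProjectCoordinator"})
collaborators = UserGroup("Alice's collaborators", "collaborators",
                          administrators={"Alice"})

captures the shape of the idea: a friend list, a WikiProject roster, and a block list are structurally the same object, and "who administers it?" becomes an explicit field rather than an afterthought.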
Perhaps we can build a system which avoids grappling with user groups entirely. It was suggested that we might use an ORES-like system to automatically suggest collaborators on an editing project based on some criteria (such as editing history), rather than force you or the WikiProject to maintain an explicit list. Perhaps we could implement block lists to combat harassment based entirely on keywords, not users. Do we trust the machine to be more fair and less abusive than us mere mortals? Additional ideas are welcome! (I don't have a dedicated phab task for this, but https://phabricator.wikimedia.org/T149665 might be appropriate if you want to contribute off-list.)
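For the keyword angle, here is the rough shape of the idea, again as a hypothetical sketch rather than a real feature: a per-user list of blocked terms, applied to incoming messages instead of to particular senders.

import re
from typing import Iterable

def is_blocked(message: str, blocked_keywords: Iterable[str]) -> bool:
    """Return True if the message matches any term on the user's block list."""
    for keyword in blocked_keywords:
        # Whole-word, case-insensitive match so "cat" does not block "category".
        if re.search(rf"\b{re.escape(keyword)}\b", message, re.IGNORECASE):
            return True
    return False

my_blocklist = ["slur1", "slur2"]   # maintained per-user, not per-sender
print(is_blocked("An innocuous comment", my_blocklist))       # False
print(is_blocked("A comment containing slur1", my_blocklist)) # True

The interesting trade-off is exactly the one above: a keyword list needs no notion of "who" is harassing, but it also can't catch harassment that avoids the listed words.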
Hopefully this has been enough to prime the pump.
Let's discuss discussions.
Let's live up to the hope placed in us by the Washington Post:
https://www.washingtonpost.com/news/wonk/wp/2016/10/25/somethings-terribly-w...
Let's retake the lead on building and renewing a healthy collaborative community. We can't afford to be complacent or content with the status quo. Let's come up with new ideas, build them, find the problems, and try again. It starts with deciding that we can do better. --scott
-- (http://cscott.net)