Perhaps it’s blindingly obvious, but the best solution is to prevent conflict from happening in the first place. OK, that sounds a little trite, but what it means is designing your platform so that it’s harder for conflict to occur. We routinely design safety into engineering projects. Think about roads. Having everyone drive on the left (or right, depending on which country you are in) avoids a lot of crashes. Traffic lights control busy intersections. There are many things we “design into” roads to ensure that people can co-exist on them without killing each other. Yet we seem to completely ignore such principles in the social aspects of projects.
For example, if a message in a forum contains words likely to cause offence or indicate aggression (“you stupid moron, you dumb twat”), point this out to the user and ask if they really mean to send it, possibly reminding them that your Code of Conduct does not allow personal attacks, racist language or whatever else the words suggest is going on, and suggesting they rewrite it or cool off for a while. At least then, if they proceed to send it and there are complaints, it’s clear that they used those words deliberately, aware of their potential to offend and knowing that they were contrary to the Code of Conduct. No more “I didn’t mean to offend, because I tell everyone, even my best friends, that they are stupid morons”. Or you can just quietly divert that message to a moderator for consideration before it’s posted.
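To make that concrete, here is a rough sketch of what such a pre-send check might look like. The word list, function names and moderation-queue hook are all invented for illustration; this is not any real forum’s code.

# Hypothetical pre-send check: flag likely personal attacks before a post goes live.
FLAGGED_WORDS = {"moron", "idiot", "stupid"}  # placeholder list, not a real policy

def flagged_words(text):
    """Return the set of flagged words found in the message, if any."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return words & FLAGGED_WORDS

def submit_post(text, confirmed=False):
    """Ask the author to confirm (or cool off) before a flagged post goes live."""
    hits = flagged_words(text)
    if hits and not confirmed:
        return {
            "status": "needs_confirmation",
            "message": ("Your post contains language that may read as a personal attack ("
                        + ", ".join(sorted(hits))
                        + "). Our Code of Conduct does not allow personal attacks. "
                          "Send anyway, rewrite, or come back later?"),
        }
    if hits and confirmed:
        # Alternative: quietly divert to a moderation queue instead of posting.
        return {"status": "queued_for_moderation", "text": text}
    return {"status": "posted", "text": text}

The point is that the friction sits with the author before the post goes live, not with moderators after the damage is done.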
On Facebook, we have the protocol of confirming each other as friends, which then allows us to see one another’s postings and so on, as well as the ability to accept someone as a friend (avoiding the offence of refusing friend status) while still being able to avoid reading their posts and to restrict which of yours they see. Again, this is an example of a system allowing you to be “friends” but putting some boundaries on that relationship. “Good fences make good neighbours”, as they say.
Obviously, the way to “design in” conflict avoidance depends a bit on the nature of the platform; forums are different to wikis, for example. Newspapers and blogs increasingly disable comments because of the conflicts that can arise there. You could do something like limiting people to one comment per topic; this prevents the to-ing and fro-ing between a couple of people escalating their dispute, and forces people to use their one comment wisely, hopefully to address the substantive topic and not to criticise others. Indeed, if you restricted the ability to mention another user’s name in a comment, it would become much harder to make personal attacks (this works best if you don’t allow anyone to register the username “the” or other common words, though :-)).
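Roughly, those two rules might look something like this sketch (the function names and data structures are made up for illustration):

# Hypothetical comment-submission rules: one comment per user per topic,
# and no mentions of other participants' usernames.
def can_comment(user, topic_comments):
    """Allow a comment only if this user has not already commented on this topic."""
    return all(c["author"] != user for c in topic_comments)

def mentions_participant(text, topic_comments, author):
    """True if the comment names any other participant in the topic."""
    others = {c["author"].lower() for c in topic_comments} - {author.lower()}
    words = {w.strip(".,!?@").lower() for w in text.split()}
    # Note: this naive word match is exactly why letting someone register the
    # username "the" would be a problem.
    return bool(words & others)

def post_comment(author, text, topic_comments):
    if not can_comment(author, topic_comments):
        return "rejected: you have already used your one comment on this topic"
    if mentions_participant(text, topic_comments, author):
        return "rejected: please address the topic, not other commenters"
    topic_comments.append({"author": author, "text": text})
    return "posted"

Crude, certainly, but it shifts the effort from policing disputes after the fact to making them harder to start.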
Now on Wikipedia, we seem to like to do the exact opposite; we seem to “design in” conflict. Think about “undo”. It’s a quick and easy solution for dealing with vandalism, but it’s a terrible tool for dealing with a difference of opinion between good-faith editors. By the time one of them gets blocked by the three-revert rule, you’ve got a well-escalated conflict. So long as it is more work for people to “talk” about the issue than to “undo”, what’s going to happen? Like water, people take the path of least resistance. So make “undo” just that little bit harder. Pop up a box that asks them to tick one of “vandalism”, “spam”, “patent nonsense” (and whichever other reasons are appropriate, e.g. unsourced material on a BLP if applicable), with the last option being “discuss on the talk page”. This means that someone cannot undo unless they are prepared to declare it’s vandalism, etc. It takes away “undo because I don’t agree with you” or “undo because I think my source is better than yours”. That box could also highlight “This is a new contributor; remember not to bite”. The NOBITE rule is easily overlooked because it is not immediately visible that the person being reverted is a new user.
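As a rough sketch of that undo flow (the reason codes, edit-count threshold and function names are assumptions for illustration, not how MediaWiki actually works):

# Hypothetical "undo with a declared reason" flow.
REVERT_REASONS = {"vandalism", "spam", "patent nonsense", "unsourced material on a BLP"}
NEW_USER_EDIT_THRESHOLD = 50  # assumed cut-off for "new contributor"

def undo_edit(reverter, edit, reason):
    """Undo an edit only if the reverter declares an allowed reason."""
    warnings = []
    if edit["author_edit_count"] < NEW_USER_EDIT_THRESHOLD:
        warnings.append("This is a new contributor; remember not to bite.")
    if reason == "discuss on the talk page":
        # No revert happens; the disagreement goes to the talk page instead.
        return {"action": "opened_talk_page_discussion", "warnings": warnings}
    if reason not in REVERT_REASONS:
        return {"action": "refused", "note": "a declared reason is required to undo"}
    return {"action": "reverted", "declared_reason": reason, "warnings": warnings}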
Also, think about the old saying “praise in public, criticise in private”, and then remember that a user talk page is visible to everyone and watchlists bring changes to the attention of goodness-knows-how-many people. Can you think of a worse place for someone to come and say something critical? Is there a better way to handle this? Have some private channel? Allow users to decide which posts on their user talk page will be visible to all? Allow users to redact anything in their user space? And while Barnstars and other WikiLove are public (good!), thanks are not. Why not at least include that information on a user’s page or talk page in some way (“SmellyJockStraps gave thanks 234 times and received thanks 432 times”)? We need to build a platform that makes positive behaviours and positive interactions easier and more visible than negative behaviours and negative interactions.
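The thanks tally, at least, would be a simple aggregation over a log of who thanked whom, something like this sketch (the log format here is assumed; the real thanks log is structured differently):

# Hypothetical tally of thanks given and received, for display on a user page.
from collections import Counter

def thanks_summary(thanks_log, user):
    """thanks_log is a list of (giver, receiver) pairs."""
    given = Counter(giver for giver, receiver in thanks_log)
    received = Counter(receiver for giver, receiver in thanks_log)
    return (user + " gave thanks " + str(given[user])
            + " times and received thanks " + str(received[user]) + " times")

# e.g. thanks_summary([("Alice", "Bob"), ("Alice", "Carol"), ("Bob", "Alice"),
#                      ("Carol", "Alice"), ("Dana", "Alice")], "Alice")
# -> "Alice gave thanks 2 times and received thanks 3 times"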
But start with a Code of Conduct. I would stick to the principles, not the specifics. I would have an all-encompassing “a panel of administrators/moderators can ban/block/suppress a user whose repeated behaviour violates these principles” to avoid wikilawyering leading to a never-ending proliferation of rules, e.g. Rule 1235: not only is “moron” disallowed, so is “m0r0n” and “m-o-r-o-n” or …
Kerry
From: gendergap-bounces@lists.wikimedia.org [mailto:gendergap-bounces@lists.wikimedia.org] On Behalf Of Vicky Knox
Sent: Tuesday, 25 November 2014 7:13 AM
To: gendergap@lists.wikimedia.org
Subject: Re: [Gendergap] Conflict resolution resources for online communities?
Here is the very basic first draft of the article so far: https://localwiki.org/main/Conflict_resolution
If you want to add anything directly into the article, please do so. :]
2014-11-24 12:20 GMT-08:00 Vicky Knox <vknoxsironi@gmail.com>:
Hi gendergap folks!
I hope you're well. :]
I'm writing conflict resolution documentation for LocalWiki (https://localwiki.org/main/Front_Page),
a global local knowledge commons. Do you have any conflict resolution resources
for online communities, or conflict resolution examples from Wikimedia projects
you'd like to recommend? I'm particularly interested in examples of online
nonviolent communication modalities, and intersectional feminist perspectives
on online conflict resolution in communities of mixed real name and *nym
identities. (This all said, I'm open to all suggestions--I've lurked this list
for a while and highly value the perspectives I've found on it.)
Thank you!
Vicky