Online communities can allow anyone to "report" problem posts or PMs. Only the moderators see these reports, not the general membership or public. For example, Simple Machines Forum has a report link on every post.

I've been part of the moderation team at scienceforums.net for the past 6-7 years, and I can speak to this from the "other side" (as a moderation team member). It really depends on how well the moderation team handles these reports, but in my experience the system has great advantages:

1. It allows users to complain about anything from bias to bad attitude to stalking *privately* and without repercussion: no one other than the moderators knows that something was reported, and when we take action, we take care not to imply that anyone reported the post. (A minimal sketch of this privacy property follows this list.)

2. It also allows users the freedom to flag their concerns before they become disasters. We sometimes get reports about a thread that isn't a problem *yet* but might very well get there without intervention, so we keep that thread in sight and try to step in when possible to steer things back to normal.
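To make that privacy property concrete, here is a minimal sketch in Python of what a private report queue might look like. Every name here is hypothetical, and real forum software (Simple Machines Forum included) implements this differently; the point is only that the reporter's identity is stored for the moderators and never exposed anywhere else.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    # Hypothetical sketch of a private report queue. The key property:
    # reporter identity is visible to the moderation team only.

    @dataclass
    class Report:
        post_id: int
        reporter: str   # kept for moderator follow-up, never shown publicly
        reason: str
        created: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc))

    class ReportQueue:
        def __init__(self) -> None:
            self._reports: list[Report] = []

        def file_report(self, post_id: int, reporter: str, reason: str) -> None:
            """Anyone may report; nothing visible changes on the post."""
            self._reports.append(Report(post_id, reporter, reason))

        def moderator_view(self) -> list[Report]:
            """Full details, including who reported, for the mod team only."""
            return list(self._reports)

        def was_reported(self, post_id: int, asking_user: str) -> bool:
            """What a regular member can learn: nothing. This always
            answers False, so reported and unreported posts are
            indistinguishable from the outside."""
            return False

The design point is that the public-facing code path never branches on report state, so there is nothing for a would-be harasser to infer about who reported them, or whether anyone did.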

However, these reports and moderation actions can also have some negative side effects:

1. It can look like "Big Brother is watching" when moderators respond to a report, since no one else knows that there even was a report.

2. It requires that there *is* some sort of moderation team and that people know who the moderators are. It also requires that people are able to complain *about* the moderation team in the reports, so the team has to have internal rules for how to investigate one of its own members.

3. Some (not all) forums and moderation-driven systems also keep a "history" of troublesome users. This is extremely helpful for spotting users who are borderline trolling or harassing; they can very easily fly under the radar while hurting others. However, such records can easily devolve, especially when/if they are public, in which case they can trigger worse behavior. (A small sketch of such a history follows.)
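For illustration, here is a small Python sketch of the kind of per-user history that helps surface borderline patterns. The names and the threshold are invented for this example; real systems record far more context, and the caution above about keeping such records private applies in full.

    from collections import defaultdict

    # Hypothetical per-user moderation history. Each note is minor on
    # its own; the value is in seeing the pattern. Kept private to the
    # moderation team, since a public record can trigger pile-ons.

    class ModerationHistory:
        def __init__(self) -> None:
            # username -> list of (date, note) entries
            self._notes = defaultdict(list)

        def add_note(self, username: str, date: str, note: str) -> None:
            self._notes[username].append((date, note))

        def borderline_users(self, threshold: int = 3) -> list[str]:
            """Users with several minor incidents on record: none
            serious alone, but together a pattern worth a closer look."""
            return [user for user, notes in self._notes.items()
                    if len(notes) >= threshold]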

The entire idea of reporting posts can be tricky to get right and make effective. I don't know how this could be implemented in a project like Wikipedia, where the idea of some "moderation authority" is generally frowned upon (and justly so).

Maybe we can have a faux moderation team: a team that can receive (private!) reports and then go and intervene.
Even if it has no "teeth" or authority for actual action, it can show users that they have support and that they're not alone, which seems to be one of the main issues with the gender gap and the participation of minorities in general.

 
Simple Machines Forum: http://www.simplemachines.org/community/index.php

Now, in many cases the harasser will blame the victim, but that happens whether it is the truth or not.

I have run into the problem of neutral parties feeling as though reporting is "being a snitch." I haven't figured out a way around that yet.

Janine






--
No trees were harmed in the creation of this post. 
But billions of electrons, photons, and electromagnetic waves were terribly inconvenienced during its transmission!