Yay! Erik replied. Seriously, I was beginning to think no one from the Foundation read this mailing list anymore aside from me and Kaldari (and we read it as volunteers!). See comments below. 

On Thu, May 9, 2013 at 12:19 AM, Erik Moeller <erik@wikimedia.org> wrote:
On Wed, May 8, 2013 at 10:43 PM, Sarah <slimvirgin@gmail.com> wrote:

> A similar statement from the Foundation about the need to reject racism,
> sexism and homophobia among editors -- and to remember that this is an
> educational project -- might go a long way to adjusting attitudes. 
Most egregious examples of these behaviors are already in violation of
site terms of use and community policies, but I agree that a strong
reinforcement of core values could help. Agendas unrelated to the


First, I agree with Sarah. While we can say "oh, the Foundation has this policy, and that policy as approved by the board," having a formal, plain-language statement that is shared with the press, the world, etc. - one stating that we don't tolerate any kind of crap: racism, sexism, homophobia, and so on - could make quite an impact. 

It might even cut down on vandalism (see the recent "faggot" blow-up involving that basketball player, where his team's page was vandalized and the community failed to strike the changes and the edit summary until the press caught wind of it, at which point they struck it - fail) and step up administrative duties so that "crap" is removed from the permanent record more quickly. 

And it's not that hard to draft a no-tolerance statement. I signed one when I started my job at WMF, but no one reads terms of service, right? (Except maybe 2% of community members and the people who write them :) ) This would serve as a reminder. 



The Terms of Use prohibit harassment, which is the same word that's
used to characterize the behaviors the friendly space policy
prohibits. So at least in that respect the two are already somewhat
analogous.

https://wikimediafoundation.org/wiki/Terms_of_Use#4._Refraining_from_Certain_Activities

Here's a question: if someone harasses me on Wikipedia (it happened recently, and it'll happen again), I can point to this TOS - and then what happens to that user? It took days of admins going back and forth with the editor who harassed me (and another editor) before they finally blocked the guy forever (he had harassed other people before and had been blocked temporarily). So what can *I* do with this TOS the next time something happens to me? 

What actions can I take to make sure things move quickly and the situation gets resolved as fast as possible? Because, as we know, it often doesn't get resolved quickly. And as an admin who is frequently a target of harassment, I can't do much about it myself because of "conflict of interest." So what can/should I do? 

That's one thing I see missing from that section: a "here's what you do if you're harassed" part. But, thankfully, there is a section on what to do if you fear copyright violations (DMCA compliance). (And yes, I'm being sarcastic.)

 

In response to issues with the ethical management of photographs the
WMF Board did in fact pass a resolution specifically about photographs
of identifiable people:

https://wikimediafoundation.org/wiki/Resolution:Images_of_identifiable_people

Erring on the side of conservatism, the Board used language about
"private situations / places". But it calls explicitly for
strengthening and developing the relevant policy on Commons:

https://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people


Yes, and this was the last policy matter "this" mailing list had some influence on. And it still awaits serious implementation, in my humble opinion. And with so much crap on Commons, and that whole drama with the material pulled in from Flickr (i.e. the "Russian porn" that mysteriously gets deleted from Flickr after being uploaded, but is still kept on Commons because people refuse to delete it) and so forth, with little to no knowledge of what "contract" a person signed - Commons has found ways to get around this, I'm sure, in many ways. 
 


There _are_ thoughtful people on Commons who could be engaged
individually to help further develop and refine this policy to
elaborate on ethical issues like the one which started this thread.
And there are thoughtful people on this list who could help drive that
conversation.


Yes, that's what I'm saying. What more can we do than just sit here on this list and complain about it? 

 
Similarly, on things like acceptable content in user space, en.wp has
a pretty sophisticated and carefully considered policy which already
prohibits needlessly provocative content, and which could be developed
further to explain how such content can be seen as harassing and
damage an environment where people can work together productively.

https://en.wikipedia.org/wiki/Wikipedia:User_pages


Yes, but that's a user page. Remember the pregnancy article drama? Talk about an unsafe place - especially the talk page. Sexy naked pregnant model, or a clothed "normal" pregnant woman? Hmm, tough call. We'd better make a bunch of sexist and sexualized jokes to help figure it all out, and then post photos... hmmm....

 
It's also worth noting on the subject of Commons that WMF did _not_
withdraw the Controversial Content resolution from May 2011, only the
personal image hiding feature component thereof. The resolution also
contained other recommendations consistent with reinforcing the
educational scope of Wikimedia Commons:

https://wikimediafoundation.org/wiki/Resolution:Controversial_content


Yes, and time and time again people like me, Kaldari, and others have said "YOU CAN HELP CURATE, PLEASE COME TO COMMONS AND HELP," and it's so rare (uh... almost never?) that anyone actually comes to help that it still feels like a hopeless cause. Everyone wants to edit Wikipedia, but no one wants to help clean up Commons. If Wikipedia needs more voices... imagine what's happening on Commons. Bad things. 

And the community lives and breathes that "NOT CENSORED" clause. If I had a dollar for every time it was thrown in my face in Commons deletion discussions - followed by an appalling breakdown of why a photo of yet another penis is important because of its shape and size and where it could be used on projects, and "We can't prove it's someone under 18 even though the description says so, we don't know for sure" - I'd be rich enough to invest all my money in something less demoralizing, retire in Paris, and help the French fight their freedom of panorama fight. 

That "not censored" thing needs to die a big dramatic death. It's so stupid. 


On the last point, it's not dropped off our radar. Better media
patrolling and review tools are on the agenda for the new multimedia
engineering team which we're currently hiring for. Lowering the
barrier to flag media that have no realistic educational value (for
whatever reason) may help create a greater culture of shared
responsibility for curating Commons and keeping it useful, rather than
allowing personal interests to dominate small group discussions.
Thoughts on how software design could positively affect user behavior
and lead to increased diversity in decision-making are greatly
appreciated.


I had no clue about this, and most people here probably didn't. Thanks for letting us know. I really hope that when this team starts, they don't forget about the gender gap list, like many people often do :) 

 
Is there a page on Meta already where we're coordinating overall
policy reform issues relating to the gender gap (whether WMF or
community policies) that should be considered?

Erik


http://meta.wikimedia.org/wiki/Gender_gap/Policy_revolution

There is now. Folks need to remember: Wikipedia is where Wikipedia policy is developed, and Meta is where larger-scale policy is developed. So it's the best place for this type of work right now. 

Sarah

--
Sarah Stierch
Museumist, open culture advocate, and Wikimedian
www.sarahstierch.com