I'll go through your mail piece by piece, Erik.
*The intro and footer of Sue's post say: "The purpose of this post is not to
talk specifically about the referendum results or the image hiding feature" (...) So it's perhaps not surprising that she doesn't mention the de.wp poll regarding the filter in a post that she says is not about the filter. ;-)
It is quite surprising, yes, since she gave half of the post to the de.wiki main page "issue"[1]. And also, if we decide to ABF (http://en.wikipedia.org/wiki/Wikipedia:ABF) the other side (like that post pretty much does), I would say that she doesn't mention it because it would not help her case.
*Now, it's completely fair to say that the filter issue remains the elephant
in the room until it's resolved what will actually be implemented and how.
You forgot the "*IF*": IF the elephant will be implemented or not.
*What Sue is saying is that we sometimes fail to take the needs and
expectations of our readers fully into account
Well, if we consider the "referendum" a good place to look for results[2], we can say that our readers are in doubt about that issue, pretty much split 50%-50% - with the difference that our German readers are not in doubt: they DON'T WANT it.
*Let me be specific. Let's take the good old autofellatio article (...) If
you visit http://en.wikipedia.org/wiki/Talk:Autofellatio , you'll notice that there are two big banners: "Wikipedia is not censored" and "If you find some images offensive you can configure your browser to mask them", with further instructions. (...) And yet, it's a deeply imperfect solution. The autofellatio page has been viewed 85,000 times in September. The associated discussion page has been viewed 400 times. The "options not to see an image" page, which is linked from many many of these pages, has been viewed 750 times. We can reasonably hypothesize without digging much further into the data that there's a significant number of people who are offended by images they see in Wikipedia but who don't know how to respond.
No, we can not. With 85,000 views, it would be childish to imagine that only 400 people could see the "Discussion" tab over the article. If they got to the article (and the article is not on the Main Page), we have to assume that:
1. They searched for "*autofellatio*" on Google - therefore they knew what they might find.
2. They typed it into the search box - therefore they know at least a bit about how Wikipedia works, and know what a discussion page is and how to get there.
3. They got to the article through links in another article. And according to the "What Links Here" feature (http://en.wikipedia.org/w/index.php?title=Special:WhatLinksHere/Autofellatio&namespace=0&limit=250), there is no article unrelated to sex and sexuality that links to this one, so that reader would know what they would find - like in 1. - and knows how Wikipedia works - like in 2.
In any of those cases, I can only imagine that readers in case 1 have any reason to be offended and not know how to find the talk page. Even so - if we divide the number of viewers by 3 (assuming here that cases 1, 2 and 3 contribute exactly equally), that is 28,333 people per case. Which means that, out of the other 56,667 people, only 400 decided to check the talk page - which is 0.7% of those readers. Of those, I can only see 3 people complaining, which is 0.75% of everyone who visits the talk page. Can you see the idea? Only ~0.7% of all the people who see that article are offended by it. So, no, we can't assume that people get offended.
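For what it's worth, the arithmetic above can be written out explicitly. This is only a back-of-the-envelope sketch of the same reasoning, and the equal three-way split between cases 1, 2 and 3 is an assumption of the argument, not measured data:

// Back-of-the-envelope check of the numbers in this thread (TypeScript).
// The equal three-way split between cases 1, 2 and 3 is an assumption,
// not something measured.
const articleViews = 85000; // autofellatio article views in September
const talkViews = 400;      // talk page views in the same period
const complaints = 3;       // complaints visible on the talk page

const perCase = articleViews / 3;        // ~28,333 readers per case
const knowTheTalkPage = 2 * perCase;     // cases 2 and 3: ~56,667 readers

const talkVisitRate = talkViews / knowTheTalkPage; // ~0.007, i.e. ~0.7%
const complaintRate = complaints / talkViews;      // 0.0075, i.e. 0.75%

console.log((talkVisitRate * 100).toFixed(1) + "% of readers who know how to find the talk page visit it");
console.log((complaintRate * 100).toFixed(2) + "% of talk page visitors complain");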
*An alternative would be, for example, to give Wikipedians a piece of wiki
syntax that they can use to selectively make images hideable on specific articles. Imagine visiting the article Autofellatio and seeing small print at the top that says:
"This article contains explicit images that some readers may find objectionable. [[Hide all images on this page]]."*
That would indeed be a better idea - to be implemented as a gadget for logged-in users, and in a way that prevents any kind of "censorship categories".
*Our core community is 91% male, and that does lead to obvious perception
biases (and yes, occasional sexism and other -isms). Polls and discussions in our community are typically not only dominated by that core group, they're sometimes in fact explicitly closed to people who aren't meeting sufficient edit count criteria, etc.*
Yes, it is. That does not mean girls get more offended by that. The 9% who are girls are not screaming to tear apart all images, are they? On the contrary, we can see the same 50%-50% support-oppose split in the female community as well. (As an example: the only 2 girls who commented here - phoebe and me - are on opposite sides. Having a vagina doesn't make us more or less offended by seeing one on the main page.)
[1]: Note that a page which was elected a featured article being on the main page is not an issue, whatever the subject is.
[2]: I don't, for the very simple reason that it was badly written, as several people have already said.

_____
*Béria Lima*
http://wikimedia.pt/
(351) 925 171 484
*Imagine a world where every person is given free access to the sum of all human knowledge. That's what we are doing. http://wikimediafoundation.org/wiki/Nossos_projetos*
On 30 September 2011 09:44, Erik Moeller erik@wikimedia.org wrote:
On Wed, Sep 28, 2011 at 11:45 PM, David Gerard dgerard@gmail.com wrote:
The complete absence of mentioning the de:wp poll that was 85% against any imposed filter is just *weird*.
The intro and footer of Sue's post say: "The purpose of this post is not to talk specifically about the referendum results or the image hiding feature"
She also wrote in the comments: "What I talk about in this post is completely independent of the filter, and it’s worth discussing (IMO) on its own merits"
So it's perhaps not surprising that she doesn't mention the de.wp poll regarding the filter in a post that she says is not about the filter. ;-)
Now, it's completely fair to say that the filter issue remains the elephant in the room until it's resolved what will actually be implemented and how. And it's understandable that lots of people are responding accordingly. But I think it's pretty clear that Sue was trying to start a broader conversation in good faith. I know that she's done lots of thinking about the conversations so far including the de.wp poll, and she's also summarized some of this in her report to the Board:
http://meta.wikimedia.org/wiki/Image_filter_referendum/Sue%27s_report_to_the...
The broader conversation she's seeking to kick off in her blog post _can_, IMO, usefully inform the filter conversation.
What Sue is saying is that we sometimes fail to take the needs and expectations of our readers fully into account. Whether you agree with her specific examples or not, this is certainly generally true in a community where decisions are generally made by whoever happens to show up, and sometimes the people who show up are biased, stupid or wrong. And even when the people who show up are thoughtful, intelligent and wise, the existing systems, processes and expectations may lead them to only be able to make imperfect decisions.
Let me be specific. Let's take the good old autofellatio article, which was one of the first examples of an article with a highly disputed explicit image on the English Wikipedia (cf. http://en.wikipedia.org/wiki/Talk:Autofellatio/Archive_1 ).
If you visit http://en.wikipedia.org/wiki/Talk:Autofellatio , you'll notice that there are two big banners: "Wikipedia is not censored" and "If you find some images offensive you can configure your browser to mask them", with further instructions.
Often, these kinds of banners come into being because people (readers and active editors) find their way to the talk page and complain about an image being offensive. They are intended to do two things: Explain our philosophy, but also give people support in making more informed choices.
This is, in other words, the result of reasonable discussion by thoughtful, intelligent and wise people about how to deal with offensive images (and in some cases, text).
And yet, it's a deeply imperfect solution. The autofellatio page has been viewed 85,000 times in September. The associated discussion page has been viewed 400 times. The "options not to see an image" page, which is linked from many many of these pages, has been viewed 750 times.
We can reasonably hypothesize without digging much further into the data that there's a significant number of people who are offended by images they see in Wikipedia but who don't know how to respond, and we can reasonably hypothesize that the responses that Wikipedians have conceived so far to help them have been overall insufficient in doing so. It would be great to have much more data -- but again, I think these are reasonable hypotheses.
The image filter in an incarnation similar to the one that's been discussed to-date is one possible response, but it's not the only one. Indeed, nothing in the Board resolution prescribes a complex system based on categories that exists adjacent to normal mechanisms of editorial control.
An alternative would be, for example, to give Wikipedians a piece of wiki syntax that they can use to selectively make images hideable on specific articles. Imagine visiting the article Autofellatio and seeing small print at the top that says:
"This article contains explicit images that some readers may find objectionable. [[Hide all images on this page]]."
As requested by the Board resolution, it could then be trivial to selectively unhide specific images.
If desired, it could be made easy to browse articles with that setting on-by-default, which would be similar to the way the Arabic Wikipedia handles some types of controversial content ( cf. http://ar.wikipedia.org/wiki/%D9%88%D8%B6%D8%B9_%D8%AC%D9%86%D8%B3%D9%8A ).
This could possibly be entirely implemented in JS and templates without any complex additional software support, but it would probably be nice to create a standardized tag for it and design the feature itself for maximum usability.
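To make that concrete, here is a minimal sketch (written in TypeScript, which would compile down to a plain JS gadget) of what such a gadget could look like. It is an illustration only, not an existing feature: the marker class "mw-hideable-images" is a hypothetical hook that a template would add to flagged articles, and the per-image unhiding requested by the Board resolution is only hinted at in a comment.

// Minimal sketch of an opt-in "hide images on this page" gadget.
// Assumptions: a template on flagged articles adds an element with the
// hypothetical class "mw-hideable-images", and article content lives in
// the standard #mw-content-text container.
function installImageToggle(): void {
  const marker = document.querySelector('.mw-hideable-images');
  if (!marker) {
    return; // page was not flagged by editors, do nothing
  }

  const notice = document.createElement('div');
  notice.textContent =
    'This article contains explicit images that some readers may find objectionable. ';

  const toggle = document.createElement('a');
  toggle.href = '#';
  toggle.textContent = 'Hide all images on this page';

  let hidden = false;
  toggle.addEventListener('click', (event) => {
    event.preventDefault();
    hidden = !hidden;
    // Hide or restore every content image; selective per-image unhiding
    // could be added in the same spirit with a per-image click handler.
    document.querySelectorAll<HTMLImageElement>('#mw-content-text img').forEach((img) => {
      img.style.visibility = hidden ? 'hidden' : 'visible';
    });
    toggle.textContent = hidden
      ? 'Show all images on this page'
      : 'Hide all images on this page';
  });

  notice.appendChild(toggle);
  marker.prepend(notice);
}

installImageToggle();

A real implementation would presumably also remember the reader's choice (e.g. in a cookie or preference) and offer the per-image unhide links, but the point stands that no category system is needed for it.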
Solutions of this type would have the advantage of giving Wiki[mp]edians full editorial judgment and responsibility to use them as they see fit, as opposed to being an imposition from WMF, with an image filter tool showing up on the page about tangential quadrilaterals, and with constant warfare about correct labeling of controversial content. They would also be so broad as to be not very useful for third party censorship.
Clearly, one wouldn't just want to tag all articles in this fashion if people complain -- some complaints should be discussed and resolved, not responded to by adding a "Hide it if you don't like it" tag; some should be ignored. By putting the control of when to add the tag fully in the hands of the community, one would also give communities the option to say "Why would we use this feature? We don't need it!" This could then lead to further internal and external conversations.
I don't think this would address all the concerns Sue expresses. For example, I think we need to do more to bring readers into conversations, and to treat them respectfully. Our core community is 91% male, and that does lead to obvious perception biases (and yes, occasional sexism and other -isms). Polls and discussions in our community are typically not only dominated by that core group, they're sometimes in fact explicitly closed to people who aren't meeting sufficient edit count criteria, etc. For good reasons, of course -- but we need to find ways to hear those voices as well.
Overall, I think Sue's post was an effort to move the conversation away from thinking of this issue purely in the terms of the debate as it's taken place so far. I think that's a very worthwhile thing to do. I would also point out that lots of good and thoughtful ideas have been collected at: http://meta.wikimedia.org/wiki/Image_filter_referendum/Next_steps/en
IMO the appropriate level of WMF attention to this issue is to 1) look for simple technical help that we can give the community, 2) use the resources that WMF and chapters have (in terms of dedicated, focused attention) to help host conversations in the communities, and bring new voices into the debate, to help us all be the best possible versions of ourselves. And as Sue said, we shouldn't demonize each other in the process. Everyone's trying to think about these topics in a serious fashion, balancing many complex interests, and bringing their own useful perspective.
Erik