[Foundation-l] Letter to the community on Controversial Content
Andreas K.
jayen466 at gmail.com
Tue Oct 18 23:03:51 UTC 2011
On Tue, Oct 18, 2011 at 11:10 PM, Tobias Oelgarte <
tobias.oelgarte at googlemail.com> wrote:
> On 18.10.2011 at 23:20, Andreas K. wrote:
> > On Tue, Oct 18, 2011 at 8:09 PM, Tobias Oelgarte<
> > tobias.oelgarte at googlemail.com> wrote:
> >
> >> You said that we should learn from Google and other top websites, but
> >> at the same time you want to introduce objective criteria, which none
> >> of these websites did?
> >
> >
> >
> > What I mean is that we should not classify media as offensive, but in
> > terms such as "photographic depictions of real-life sex and masturbation",
> > "images of Muhammad". If someone feels strongly that they do not want to
> > see these by default, they should not have to. In terms of what areas to
> > cover, we can look at what people like Google do (e.g. by comparing
> > "moderate safe search" and "safe search off" results), and at what our
> > readers request.
> >
> >
> The problem is that we never asked our readers before the whole thing
> was already running wild. It really is time to ask how our readers feel.
> That would mean surveying readers in very different regions to get a
> good overview of this topic.
I agree with you here, and in fact said so months ago. We should have
surveyed our readership (as well), rather than (just) our editorship.
> What Google and other commercial groups do shouldn't be a reference for
> us. They serve their core audience and ignore the rest, since their aim
> is profit, and only profit, no matter what "good reasons" they present.
> We are quite an exception to them. Not in popularity, but in concept.
> If we take the example of "futanari", then we surely agree that quite a
> lot of people would be surprised, especially if "safe search" is on. But
> now we have to ask: why is that? Why does it work so well for other
> terms that are more common for a Western audience?
>
I think we addressed this example previously.
> > I do not see this as the majority winning, and a minority losing. I see
> > it as everyone winning -- those who do not want to be confronted with
> > whatever media don't have to be, and those who want to see them can.
> >
> I guess you missed the point that a minority of offended people would
> just be ignored. Looking at the goal and Ting's examples, we would just
> strengthen the current position (the Western majority and its point of
> view) while doing little to nothing in the areas that were the main
> concern, or at least the strongest argument for starting the process. If
> it really comes down to the point that a majority does not find Muhammad
> caricatures offensive and it "wins", then we have no solution.
>
I am all in favour of taking minority concerns on board. Specifically, that
Muhammad images should be filterable is, to me, beyond question. The point
is that the more disparate filter wishes we accommodate, the more filter
attributes will be necessary, which is something that worries other
editors. I haven't really made my mind up on this one.
> > My mind is not made up; we are still in a brainstorming phase. Of the
> > alternatives presented so far, I like the opt-in version of Neitram's
> > proposal best:
> >
> >
> > http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#thumb.2Fhidden
> >
> > If something better were proposed, my views might change.
> >
> >
> > Best,
> > Andreas
> >
> I read this proposal and, on second thought, can't see a real difference.
> On the one hand, it is good that the decision stays related to the topic
> and is not separated from it as in the first proposals. But it also
> leaves a bad taste. We would directly deliver the tags that third parties
> (ISPs, local networks, institutions) need to remove content, no matter
> whether the reader chooses to view the image or not, and we are still the
> ones declaring what might be or is offensive to others, forcing our
> judgment onto the users of the feature.
>
> Overall it follows a good intention, but I'm very concerned about the
> side effects, which make me say "no way" to this proposal as it is.
>
The community will always be in charge, one way or the other. I think that's
unavoidable. And to many people, this is the exact opposite of a bad thing:
it's an absolute *must* that the community should be in charge, and
understandably so, as they are the ones doing the work. I disagree, though,
that this necessarily means the community declares what is offensive to
others.
We, as a community, can *listen* to what people are telling us, and take
their concerns on board.
I have no problem adding a filter attribute to a file that a reader tells me
offends him, and which he wishes to be able to filter out, even if I think
it is a perfectly fine image.
Nor would I feel the need to impose my view on them the other way round,
telling them they should just grow a thicker skin and get used to the image.
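To make that concrete, here is a rough sketch of the opt-in logic I have in
mind. The attribute names, file names, and the FILE_ATTRIBUTES / is_hidden
helpers are purely my own illustration, not anything specified in Neitram's
proposal; the point is only that the tag by itself decides nothing.

    # Sketch only: hypothetical attribute names and reader preferences.
    # A file may carry any number of filter attributes; nothing is hidden
    # unless the reader has explicitly opted in to filtering that attribute.

    FILE_ATTRIBUTES = {
        "File:Example-caricature.png": {"images-of-muhammad"},
        "File:Example-landscape.jpg": set(),
    }

    def is_hidden(filename: str, reader_optouts: set[str]) -> bool:
        """Collapse a thumbnail only if one of its attributes is filtered."""
        return bool(FILE_ATTRIBUTES.get(filename, set()) & reader_optouts)

    # The default reader filters nothing and sees everything:
    assert not is_hidden("File:Example-caricature.png", set())

    # A reader who opted out of Muhammad images gets that thumbnail
    # collapsed, with the option to reveal it on click:
    assert is_hidden("File:Example-caricature.png", {"images-of-muhammad"})

In other words, the attribute only becomes meaningful once a reader chooses
to act on it; by default it changes nothing about what anyone sees.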
Andreas