[Foundation-l] News from Germany: White Bags and thinking about a fork
Erik Moeller
erik at wikimedia.org
Sun Oct 23 00:00:44 UTC 2011
On Sat, Oct 22, 2011 at 4:14 PM, Tobias Oelgarte
<tobias.oelgarte at googlemail.com> wrote:
> Isn't that the same as putting some images inside the category
> "inappropriate content"? Will it not leave the impression on the reader
> that "we" think that this is something nobody should see? Can it be
> easily used by providers to filter out these images?
http://en.wikipedia.org/w/index.php?title=Special:WhatLinksHere/Template:Censor&namespace=1&limit=500
http://en.wikipedia.org/wiki/MediaWiki:Bad_image_list
http://commons.wikimedia.org/wiki/Category:Nude_men
Simply in the process of doing our normal editorial work, we're
already providing a number of ways to identify content in the broad
area of "someone might be upset by this", or even in specific
categories. Censorship also often relies on deriving characteristics
from the content itself, without any need for additional metadata
(keyword filters, ranging from simple to sophisticated; image pattern
matching; etc.).
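A "simple" keyword filter of the kind mentioned above can be sketched in a few lines of Python. The keyword set and URLs here are purely illustrative assumptions, not any real blocklist; the false positive on "Middlesex" shows how crude such substring matching is:

```python
# Illustrative sketch of a naive keyword filter, as a censor might deploy one.
# BLOCKED_KEYWORDS is an assumed example set, not an actual blocklist.
BLOCKED_KEYWORDS = {"sex", "nude"}

def is_blocked(url: str) -> bool:
    """Return True if the URL contains any blocked keyword (case-insensitive)."""
    lowered = url.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

print(is_blocked("https://en.wikipedia.org/wiki/Sex_education"))  # True: crude over-blocking
print(is_blocked("https://en.wikipedia.org/wiki/Middlesex"))      # True: false positive, "sex" in "Middlesex"
print(is_blocked("https://en.wikipedia.org/wiki/Chemistry"))      # False
```

A filter this blunt blocks entire topics wholesale, which is exactly the "reckless" behavior described below: it needs no metadata from us at all.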
It's not clear that a low-granularity identification of content that
some editors, in some projects, have identified as potentially
objectionable to some readers, for a wide variety of different
reasons, adds meaningfully to the existing toolset of censors. A
censor who's going to nuke all that content from orbit would probably
be equally happy to just block everything that has the word "sex" in
it; in other words, they are a reckless censor, and they will apply a
reckless degree of censorship irrespective of our own actions.
Erik