[Foundation-l] News from Germany: White Bags and thinking about a fork

Tobias Oelgarte tobias.oelgarte at googlemail.com
Sun Oct 23 00:53:43 UTC 2011


On 23.10.2011 02:00, Erik Moeller wrote:
> On Sat, Oct 22, 2011 at 4:14 PM, Tobias Oelgarte
> <tobias.oelgarte at googlemail.com>  wrote:
>> Isn't that the same as putting some images inside the category
>> "inappropriate content"? Will it not leave the reader with the
>> impression that "we" think this is something nobody should see? Can it
>> be easily used by providers to filter out these images?
> http://en.wikipedia.org/w/index.php?title=Special:WhatLinksHere/Template:Censor&namespace=1&limit=500
> http://en.wikipedia.org/wiki/MediaWiki:Bad_image_list
> http://commons.wikimedia.org/wiki/Category:Nude_men
>
> Simply in the process of doing our normal editorial work, we're
> already providing a number of ways to identify content in the broad
> area of "someone might be upset of this" or even in specific
> categories, and of course censorship also often relies on deriving
> characteristics from the content itself without any need for
> additional metadata (keyword filters, ranging from simple to
> sophisticated; image pattern matching, etc.).
>
> It's not clear that a low-granularity identification of content that
> some editors, in some projects, have identified as potentially
> objectionable to some readers, for a wide variety of different
> reasons, adds meaningfully to the existing toolset of censors. A
> censor who's going to nuke all that content from orbit would probably
> be equally happy to just block everything that has the word "sex" in
> it; in other words, they are a reckless censor, and they will apply a
> reckless degree of censorship irrespective of our own actions.
>
> Erik
That is a risky assumption. Nothing is easier than identifying flagged 
images and removing them, while the typical user still thinks that 
everything is fine. A simple stream filter could strip such images 
without the clever censor having to block the whole page. Shutting down 
Wikipedia as a whole is very unlikely, due to social pressure; 
suppressing only some parts of it is a completely different story. As 
is well known, this already happens in China, Vietnam, Taiwan and other 
countries: some pages are blocked, according to lists kept by the 
authorities. It would be a great help to them if we flagged content as 
controversial ourselves. The potential for misuse is relatively high.
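
To make this concrete, here is a minimal sketch (in Python, with a 
made-up blocklist; real HTML handling would be more robust than a 
regex) of the kind of stream filter I mean. A censor sitting on the 
wire needs only this much to hide flagged images while the page itself 
still loads:

import re

FLAGGED = {"Example_flagged_image.jpg"}  # hypothetical blocklist of flagged files

def strip_flagged_images(html: str) -> str:
    # Drop any <img> tag whose src ends in a flagged filename,
    # leaving the rest of the page untouched.
    def drop_if_flagged(match):
        filename = match.group(1).rsplit("/", 1)[-1]
        return "" if filename in FLAGGED else match.group(0)
    return re.sub(r'<img[^>]*\bsrc="([^"]+)"[^>]*>', drop_if_flagged, html)

page = '<p>Some text.</p><img src="/images/Example_flagged_image.jpg">'
print(strip_flagged_images(page))  # -> <p>Some text.</p>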

In comparison, the filter WereSpielChequers proposed is a much safer 
solution. It also does not interfere with the editorial process (for 
example, through edit wars over flagging). The only doubt is whether we 
could get it running, since it needs a relatively large number of users 
of the feature, who would also have to log in.
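
As a rough sketch of why that approach leaks nothing to a censor (all 
names here are hypothetical, not actual MediaWiki code): the decision 
lives entirely in the logged-in user's own preferences, so there is no 
shared flag on the content to harvest.

from typing import List, Optional, Set

class UserPreferences:
    def __init__(self, hidden_files: Set[str]):
        # Chosen privately by each logged-in user; never stored on
        # the image or article itself.
        self.hidden_files = hidden_files

def visible_images(images: List[str], prefs: Optional[UserPreferences]) -> List[str]:
    # Anonymous readers have no preferences and always see everything.
    if prefs is None:
        return images
    return [img for img in images if img not in prefs.hidden_files]

Anonymous readers always get the full list, which is also why the 
feature only becomes useful once enough people use it while logged in.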

Back to your proposal. How is it any different from putting all the 
so-called maybe-offensive images into a category like "Category:Maybe 
offensive image"? The only difference I can see is that the 
categorization is now done per project and per article.
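
And harvesting such a category would be trivial through the standard 
MediaWiki API. The endpoint and parameters below are real; the category 
name is the hypothetical one from above:

import requests

API = "https://commons.wikimedia.org/w/api.php"

def category_members(category: str) -> list:
    # Lists up to 500 members of a category in a single request;
    # a censor would page through the rest with "cmcontinue".
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": "500",
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=10).json()
    return [m["title"] for m in data["query"]["categorymembers"]]

# e.g. category_members("Category:Maybe offensive image")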

nya~
