[Foundation-l] 86% of German users disagree with the introduction of the personal image filter

Tobias Oelgarte tobias.oelgarte at googlemail.com
Mon Sep 19 14:47:25 UTC 2011


On 19.09.2011 15:33, me at marcusbuck.org wrote:
> Quoting Tobias Oelgarte <tobias.oelgarte at googlemail.com>:
>
>> The second problem will be the categorization process. We would be
>> categorizing the images for others, not for ourselves, and we also have
>> no sources to base such decisions on. But there is another problem. We
>> already discuss the inclusion of images on the related articles'
>> discussion pages. While an image might not be appropriate for inclusion
>> in one article, it might be the perfect, valuable, needed-for-understanding,
>> perhaps offensive illustration for another article.
>    From what I understood, the image filter will not have subjective
> criteria like "a little offensive", "very offensive" or "pornography",
> but neutrally decidable criteria like "depicts nude female breasts",
> "depicts the face of Muhammad" or "depicts a mutilated dead body". If you
> select these criteria carefully, there should be no need for any
> "sources" for your decision to put a file in a criterion's category.
> Either the image depicts the category topic or it doesn't.
>
> Marcus Buck
> User:Slomox
>
We discussed this already and came to the conclusion that you would 
need hundreds of these categories to filter out most of the 
"objectionable content". But that is neither manageable from our side 
nor manageable by the user. You run into a deadlock: either we end up 
with a few rather subjective categories, or we end up with a whole lot 
of them that we cannot manage (at least not while staying user-friendly, 
and not without wasting a whole lot of resources on a tiny group of 
readers).
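
To make the scale problem concrete, here is a minimal sketch of what a 
category-based filter would have to do (purely hypothetical Python; the 
file names, the image_filter_categories mapping and the should_hide 
helper are my own illustration, not anything from the actual proposal). 
An image gets hidden only if one of its assigned filter categories is on 
the reader's personal blocklist, which means the category vocabulary has 
to enumerate every kind of "objectionable" depiction in advance.

    # Hypothetical sketch of a category-based personal image filter.
    # Category names and file names are illustrative only; they do not
    # come from the actual Wikimedia proposal.

    # "Neutrally decidable" filter categories assigned to each file.
    image_filter_categories = {
        "File:Example1.jpg": {"nude female breasts"},
        "File:Example2.jpg": {"depiction of Muhammad"},
        "File:Example3.jpg": set(),  # no filter category assigned
    }

    def should_hide(filename, user_blocklist):
        """Hide the image if any of its filter categories is on the
        reader's personal blocklist."""
        return bool(image_filter_categories.get(filename, set()) & user_blocklist)

    # A reader who wants to avoid nudity but nothing else:
    blocklist = {"nude female breasts"}
    for name in image_filter_categories:
        print(name, "hidden" if should_hide(name, blocklist) else "shown")

The sketch only "works" to the extent that the category set covers 
everything any reader might want to block, and that is exactly where the 
hundreds of categories come from.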



