[Foundation-l] 86% of german users disagree with the introduction of the personal image filter
Tobias Oelgarte
tobias.oelgarte at googlemail.com
Fri Sep 16 22:32:23 UTC 2011
On 16.09.2011 22:37, Andre Engels wrote:
> There might be a difference because of the differences in voting
> requirements - those were very low for the 'referendum', so there would be a
> possibly large percentage of people who aren't hardcore Wikimedians, but
> people who are mostly readers and at most occasionally edit. On the other
> hand, this would also increase the chance of having sockpuppeting. Another
> reason could indeed be the questioning: Opponents of the plan could have not
> voted on the referendum because the whole issue seemed like it had been
> decided anyway. Then again, proponents might be less likely to vote in the
> German poll because it is non-anonymous in an environment which seemed
> opposed to their point of view.
The requirements were different, that's right. But I don't believe
that this would influence the overall outcome. I monitored some
articles about sexual topics for a while (especially Hentai, Lolicon,
Futanari, and so on) that had controversial images in them.
Additionally, I looked up the number of views per day and compared it
with the number of complaints over time. In the last year we had one
short request to remove the well-known image from Lolicon, because it
was considered porn (which it isn't). That was the first and only
complaint in the whole time, despite the fact that every one of these
articles gets more than 1000 views a day.
On EN the story is a little different. The articles get on average
two or three times as many viewers, but you will see at least one
complaint per month per article. This is still very, very low. It
means that roughly one out of 90,000 viewers complains (including the
constantly repeating complainers). Even assuming that only 1 in 100
offended viewers actually leaves a comment, that is still only about
1 in 1,000 (0.1%) who are offended.
This gives a vague idea of how many users the filter might actually
be a helpful tool for. Compared to the German Wikipedia it is still
relatively high. (There I did not even count the zeros after the
decimal point.)
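To make the back-of-the-envelope numbers explicit (the view counts
and the 1-in-100 assumption are my own rough guesses, not measured
statistics), the calculation looks roughly like this:

    # rough estimate of how many readers are offended enough to complain
    views_per_day = 3000        # assumed for an en.wp article on a sexual topic
    complaints_per_month = 1    # observed order of magnitude
    views_per_month = views_per_day * 30                      # ~90,000
    complaint_rate = complaints_per_month / views_per_month   # ~1 in 90,000
    # assume only 1 in 100 offended readers actually leaves a comment
    offended_rate = complaint_rate * 100                      # ~1 in 900, ~0.1 %
    print("complaint rate: 1 in", round(1 / complaint_rate))
    print("estimated offended readers: %.2f %%" % (offended_rate * 100))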
Given that, can we really assume that the filter is necessary, or
that the different voting requirements for the two polls are a valid
explanation for the difference in results?
If you compare my simple (not watertight) calculation with the result
of the German poll, you might think that this isn't true: only 86%
opposed the filter. That's right, but you also have to consider the
public comments. Only a few found it important that the filter be
implemented because they see a use in it. Most of the remaining votes
(the 14%) amount to "let them do it, it won't hurt", rather than
being strictly in favor of the filter itself.
>> It was just an example (a literal allegation). The current proposal
>> (as represented inside the referendum) did not assume any cultural
>> difference. My thought on this is: how do we want to create filter
>> categories that are culturally neutral? One common (easy to describe)
>> example is nudity. What will be considered nude by a Catholic priest
>> and a common atheist, both from Germany? Will they come to the same
>> conclusion if they look at swimsuits? I guess we can assume that they
>> would have different opinions and a need for discussion.
>>
> As said before, just get different categories, and let people choose among
> them. The priest could then choose to block "full nudity", "female
> toplessness", "people in underwear" and "people in swimwear", but not
> "images containing naked bellies" or "unveiled women", whereas the atheist
> could for example choose to only block "photographs of sexual organs" and
> watch the rest.
And again (now I must repeat myself) I have to say that we are not
able to manage such a number of categories. Every additional category
drastically increases and complicates the work. But even if we were
able to categorize millions of images, we would still have to present
the categories to the reader who wants to filter the content. A
dialog asking readers to pick from hundreds or more categories does
not seem like a viable option. Who would really take the time to go
through all of them as an IP user, who loses their settings when
switching the browser or the workplace? That would be anything but
user friendly.
So you will have to make a compromise (I assume 10-20 categories), in
which you no longer have this finely graded model. You alone proposed
7 categories for nudity. What about other topics? You will easily
reach high numbers. That means a high effort for categorization and
labeling as well as for the user. It would simply not work that way,
and I don't believe we can proceed under the assumption of having an
infinite number of labels.
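Just to illustrate the scale problem for anonymous readers (this is a
purely hypothetical sketch of how such per-reader preferences might
be stored, not how the proposed filter is actually specified), the
choice of blocked categories would have to live in something like a
browser cookie, one flag per category:

    # hypothetical sketch: pack an anonymous reader's blocked categories
    # into a cookie value, one bit per category (list taken from the
    # nudity example above, already 7 categories for a single topic)
    CATEGORIES = [
        "full nudity", "female toplessness", "people in underwear",
        "people in swimwear", "naked bellies", "unveiled women",
        "photographs of sexual organs",
    ]

    def encode_preferences(blocked):
        bits = sum(1 << i for i, c in enumerate(CATEGORIES) if c in blocked)
        return "imgfilter=%x" % bits

    # the priest's and the atheist's settings from the example:
    print(encode_preferences({"full nudity", "female toplessness",
                              "people in underwear", "people in swimwear"}))
    print(encode_preferences({"photographs of sexual organs"}))

Whatever value such a cookie holds, it exists in one browser only;
switch the machine or clear the cookies and the reader starts from
scratch, facing the whole category list again.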
>
>> Did we need this discussion until now, and for all images? No, we
>> did not. We discussed the articles and what would be a good
>> illustration for the subject. But now we are not talking about
>> whether something is a good illustration. We are talking about
>> whether it is objectionable to someone else. We judge for others
>> what they would see as objectionable. That is inherently against the
>> rule of NPOV. That isn't our job as an encyclopedia. We present the
>> facts with a neutral attitude toward the topic. We state the
>> arguments of both or multiple sides. A filter only knows a yes or no
>> to this question. We make a "final" decision about what people don't
>> want to see. That is not our job!
>>
> I find it strange that you consider this an objection to a filter. Surely,
> giving someone an imperfect choice of what they consider objectionable is
> _less_ making a decision for them than judging in advance that nothing is
> objectionable?
Judging that nothing is objectionable is our job as the writers of an
encyclopedia. We do not hide facts based on the view that they might
be objectionable to some users. This is the most basic rule that
every encyclopedia has followed, unless it was under the control of a
dictatorial system. That is exactly why library organizations and the
editorial staffs of encyclopedias are opposed to labeling content
with non-neutral categories. For example, have a look at
http://www.ala.org/ala/issuesadvocacy/intfreedom/librarybill/interpretations/labelingrating.cfm
which has held for more than 50 years that labeling content by how
objectionable it is violates the basic rules of intellectual freedom.
We have the same rules. One of them is NPOV, which covers this topic
as well.
>
>> I don't know where you got this information, but I would not be
>> surprised if it is as you present it. At least in the case of Ting
>> and Jimbo you are probably right. Over time I have learned about
>> Jimbo, his attitude towards topics and his understanding of them. So
>> I have no doubt that he would trade intellectual freedom for some
>> more donations.
>>
> How are we giving away intellectual freedom with this?
Follow the link above and you will hopefully understand what this is
about. Intellectual freedom means representing knowledge in a way
that respects the facts but does not defer to the audience and their
wishes or likings. It means representing knowledge without bending
before people who think that something might be objectionable. Not
censoring yourself, and making knowledge available to everybody: that
is the definition of intellectual freedom.
>
>> That is my main personal issue with the whole filter thing: it is
>> based on arbitrary, non-neutral labeling of content, with POV as the
>> measure for judgment.
>>
> What is POV about labelling something as being an image containing a nude
> human or an illustration supposed to represent a religious figure?
>
The POV starts when you draw a line between what is nude and what is
not nude. Is someone wearing a half-transparent swimsuit nude?
(yes/no) Is it still (not) nude if the fabric is a tiny bit
thinner/thicker? (yes/no)
You see that even this simple question can't be answered with a
simple yes or no. Now think about what you use to separate nude
content from non-nude content. Do you have a strict catalog of
criteria to follow? Isn't it your point of view where nudity begins
and where it ends? Do you think that your neighbor would come to
exactly the same conclusion? He won't. The two of you might not end
up far apart, but this is also the simplest question.
What about violence? Is a boxing fight inside a ring violence? Does
an image from after the fight depict violence, even though there is
blood on the floor and both boxers are shaking hands? Now it gets a
little more complicated.
Now we could move on to the next topic, and I can assure you that it
won't get easier, only much harder. You could play the judge, but I
can guarantee that you won't be able to claim neutrality on these
matters anymore.
To be honest with yourself, you will have to admit that you chose
"nudity" as the example because it is much easier to handle than any
other topic. And even so, you found a lot of categories to
distinguish between different grades of nudity.