[Wikimedia-l] Who invoked "principle of least surprise" for the image filter?

Thomas Morton morton.thomas at googlemail.com
Mon Jun 18 14:31:59 UTC 2012


On 18 June 2012 15:16, Tobias Oelgarte <tobias.oelgarte at googlemail.com> wrote:

> On 18.06.2012 15:06, Thomas Morton wrote:
>
>
>>> It is not convincing since it interferes with the work of our editors
>>> who aren't interested in such a feature.
>>
>> Seems unlikely. Although please feel free to expand on this with specifics.
>>
> Any tagging by non-neutral definitions would interfere with the project.
> It would be like creating categories named "bad images", "uninteresting
> topics" or "not for ethnic minority X".


Of course; but that is predicated on a bad process design. Solution: design
an appropriate process.


>
>>> If we tag images inside the project itself then we impose our judgment
>>> onto it, while ignoring or separating it from the context it is used in.
>>>
>>
>> And yet you allow that we use editorial judgment in articles. This is no
>> different; it gives a further tool for editorial decisions to be made.
>>
> Editorial judgment is based on how to wrap up a topic in a nice way
> without making one's own judgment about the topic. A hard job to do, but
> that is the goal.
>
> If I wrote the article "pornography", I would have to think about what
> should be mentioned in the article because it is important, and which
> parts are not relevant enough or should be put in separate sections to be
> elaborated in further detail. This is entirely different from saying
> "pornography is good or evil" or "this pornographic practice is good or
> evil and that's why it should be mentioned or excluded".
>
> There is a difference between the relevance of a topic and the attitude
> toward a topic. The whole image filter idea is based on the latter and
> should not be confused with editorial judgment.


Pornography articles, as it stands, already have a community-implemented
"filter": the tradition that those articles are illustrated with graphics,
not photographs. So the example is a poor one, because we already have a
poor man's filter :)

Similarly, the decision "does this image represent hardcore porn, softcore
porn, nudity or none of the above" is an editorial one. A bad design
process would introduce POV issues, but we are plagued with those anyway.
If anything, this gives us an opportunity to design and trial a process
without those issues (or at least one that minimises them).



>>> The first proposal (referendum) mentioned various tagging
>>> options/categories that would have to be maintained by the community,
>>> despite already existing, huge backlogs.
>>
>> A reasonable argument; but almost everything adds to our backlog anyway.
>>
> I would have nothing against additional work if I saw the benefits. But
> in this case I see some good points and I also see a list of bad points.
> At best it might be a very tiny improvement that comes along with a huge
> load of additional work, while other parts could be improved with little
> extra work and be a true improvement. If we had nothing better to do,
> then I would say "yes, let's try it". But at the moment it is a plain
> "no, other things have to come first".
>
>
>>> Additionally we are a multi-cultural project with quite different
>>> viewpoints, and one which accepts different viewpoints (the main
>>> difference between Flickr and co.).
>>
>> This is an argument for an opt-in filter.
>>
> Don't confuse opt-in and opt-out if a filter is implemented on an
> external platform. There is no opt-in or opt-out for Wikipedia unless WP
> is blocked and the filter is the only access to Wikipedia. <contains some
> irony>We have the long story that parents want their children to visit
> Wikipedia without coming across controversial content, which they
> apparently do every time they search for something entirely
> unrelated.</contains some irony> In this case an opt-in (to view) filter
> actually makes sense. Otherwise it doesn't.


We may be confusing opt-in and opt-out between us. The filter I would like
to see is optional to enable (and then stays enabled) and gives a robust
method of customising the level and type of filtering.


>
>>> The result will be a huge amount of discussion about whether to tag an
>>> image or not.
>>>
>>
>> Not if well designed. And at the moment we have big discussions about
>> whether to include images or not.
>>
> We have such discussions. But I'm afraid that most of them do not revolve
> around the benefits of the image for the article, but around the latter
> part that I mentioned above (editorial judgment vs. attitude judgment).
>

Filtering images would resolve most of these issues.


>
> Believe me or not. If we introduce such tagging then the discussions will
> only be about personal attitudes towards an image, ignoring the context
> and its educational benefits entirely.


We already tag images as pornographic, successfully and apparently without
drama. So I find this scenario unlikely.


>
>>> This leads me to the simple conclusion that it isn't worth the effort,
>>> especially if the filter is advertised to make Wikipedia a safe place
>>> for children, while everyone (including children) can disable it at any
>>> time.
>>
>> "Think of the children" is not really an argument I subscribe to. And
>> not really one that other proponents of the filter, by my observation,
>> subscribe to either.
>>
>> It mostly seems to be brought up by opponents to try and invalidate
>> arguments.
>>
> I don't think that we need this argument, since the filter can't replace
> parents anyway. But it is a constant part of the discussions, with
> various exaggerated examples that can be seen in bold on Jimmy's talk
> page even right at this moment. For example:
>
> "Wikipedia helps me teach my children about the world in a safe, clean and
> trustworthy manner. Free from bias, banter, commercial interests and risky
> content."[1]
>
> [1] http://en.wikipedia.org/wiki/User_talk:Jimbo_Wales#UK_law
>
>>> Separate projects that only focus on one task (providing a whitelisted
>>> view, an automatically updated subset of Wikipedia) would not be a
>>> burden for the community, or at least not for everyone not interested
>>> in or opposed to filtering. Additionally, such a project could define
>>> its own strict rules and could even hide images and articles entirely,
>>> depending on its goal.
>>
>> Please note we define "community" in significantly different ways. My
>> "community" includes a minority, us, who edit and maintain the project,
>> and also the vast majority who merely read and use the project.
>>
>> Our goal as maintainers for this main community should be:
>> * Maximise the ability of individuals to access content by...
>> * ...minimising the roadblocks (social, political, etc.) to accessing
>> content.
>>
>> A significant portion of the filter discussion is predicated on our
>> internal prejudices and POV (basically navel-gazing), with a wide
>> rejection of the idea that a multi-cultural society exists.
>>
>> A non-WMF filtering project would not be useful to our community due to
>> the chicken/egg seeding problem.
>>
> It is a chicken/egg problem. One part of our community (including
> readers) dislikes tagging/filtering and sees it as the creation of (or
> the tool for creating) roadblocks that don't exist at the moment. A
> second part of our community wants it to be more conservative, for fear
> that it might be the deciding factor that could create roadblocks. I
> already mentioned this above in the "benefits vs effort" section.
>
>
We don't have much data on what our readers want; but a not insignificant
portion of them, at least, is concerned with controversial images (nudity,
Mohammed, etc.). I fully advocate finding out what the community thinks;
but when I raised this issue before, it was snorted at with something
along the lines of "the readers aren't the driving force here".

*sigh*

Tom

