[Foundation-l] 86% of german users disagree with the introduction of the personal image filter
Bjoern Hoehrmann
derhoermi at gmx.net
Sat Sep 17 23:33:19 UTC 2011
* Andre Engels wrote:
>Thereby giving those who have objections nothing just because there are
>others whom we can't give what they want. If we had the same attitude towards
>article creation, we would not have published Wikipedia until we had
>articles on all subjects we could think of.
They are given plenty; in fact, there are all sorts of filters already in
place that lead to people not being exposed to media they do not wish to
be exposed to, starting with laws, what people regard as appropriate for
encyclopedic content, and local cultural norms affecting the latter, to
name just a few examples. They also get to see images they do not object
to without additional effort, and they have the option to hide them all,
and they can be careful which articles they load, avoiding those likely
to feature media they do not want. Exposure to things you are
uncomfortable with, where the feeling is not overwhelmingly shared, like
when you click a link to a foreign Wikipedia, is also giving them
something, most likely something good (consider images that cannot
legally be displayed publicly in some jurisdiction; they will be so
displayed in others, including on Wikipedias where they are relevant,
with no filtering for users from the jurisdiction that banned them, up
to the point where a court that Wikimedia cares about is likely to order
the image taken down).
You can consider the last point from the other direction: if you do not
like to see media depicting human suffering, horror, or graphic
violence, and so filter them out, what should you see when reading about
Nazi concentration camps? Profile pictures of Nazi leaders,
architecture, maps maybe, but
please, not the prisoners? What about people who find it wrong to make
it easy for others to scroll down a Wikipedia article without the reader
being visually confronted with human suffering if there is a consensus
to display it at all in the article in this manner? Or, for that matter,
that in this context it is wrong to require the reader to tell the
computer "Yes, please show me piles of rotting corpses of starved
people!"?
Note that it may be the user of the filter who thinks this, in which
case not giving them a filter that would do this is addressing one of
their needs as well (while failing to address another need; giving them
a context-aware filter that avoids this problem would work, of course,
but then the system would be harder to use, making it worse, and so on).
So we already do plenty to ensure people are not overexposed to media
that they reasonably do not wish to be exposed to (note that I use
"wish" broadly; someone suffering from phobias, for instance, has more
of a need than a wish to avoid certain kinds of media). To a point,
really, where I am unsure what is left that can realistically be
optimized even further, and I am somewhat surprised the "referendum" had
so many people voting that this is of the highest priority (whatever
that means, given that this wasn't compared to other things that should
be of high priority), though since it was already decided to introduce a
filter because there is a problem, it can be assumed that some of the
votes are subject to confirmation bias.
(I do not know people who would say they frequently encounter articles
on Wikipedia featuring images that would disturb them no matter where
they appear, and would thus prefer to have those pictures hidden for
them. I do, however, know people who would prefer that medical articles
concerning humans place images that go beyond nudity, like those showing
the symptoms of a disease or a corpse that has been cut open, towards
the end in a specially labeled section, and people who do not like
insects much and thus do not browse around articles on insects. Neither
of the two examples leads me to think a filter as proposed is the
solution.)
>We don't say they're unreasonable, we say that we don't cater to it, or at
>least not yet. That may be non-neutral, but no more non-neutral than that
>one subject has an article and the other not, or one picture is used to
>describe an article and the other not, or one category is deemed important
>enough to be used to categorize our articles, books, words and images and
>another not.
>
>Or even clearer: that one language has a Wikipedia and another not. Did we
>make a non-neutral choice that only certain languages (the ones for which
>Wikipedias exist) are valid languages to use for spreading knowledge?
These analogies are all invalid, as individual preference is rejected as
an argument in all of these cases, while the filter is solely concerned
with individual preference (rather: aggregated individual preferences).
Tagging an image with tag X, knowing that this will lead to the image
being hidden from users who chose to hide all X images, is a matter of
whether these users, who are in a minority, want the image to be hidden;
if the majority agreed an image should be hidden, it would not be shown
at all, and there would be no need for a personal filter. Add to that
the problem that the people affected by the tagging are the least likely
to engage in discussions about the tagging, and the problem that this is
by its very nature a leaky system, as editors will consider whether some
image is filtered for some users when deciding whether to include it to
begin with (add an image that a minority considers "controversial" to an
article that lacks controversial images and likely get reverted, for
instance).
--
Björn Höhrmann · mailto:bjoern at hoehrmann.de · http://bjoern.hoehrmann.de
Am Badedeich 7 · Telefon: +49(0)160/4415681 · http://www.bjoernsworld.de
25899 Dagebüll · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/