On Fri, Apr 15, 2005 at 12:45:04PM +0100, Tony Sidaway wrote:
> We don't make the user do anything, but if he
> wants a censored Wikipedia
> he already has the capability to produce his own; we're hearing from
> people who for whatever reason are unwilling to do so and think that it is
> Wikipedia that should change to accommodate their intransigence. Thus a
> personal problem has been incorrectly reframed as a problem for Wikipedia.
I think it's been made tolerably obvious at this point that anyone does
have the option to disable images, *and* that users of well-designed
browsers can frequently even display or refuse images on a per-site
basis. That point doesn't need to be reiterated.
I'm curious about the proposals for free-form tagging systems, where
users would have the option to block out images or text matching
specific sets of tags of their choosing. I'm not sure that any such
system could actually meet Wikipedia's needs and at the same time
fulfill the desires of those who call for greater restriction on images.
To start with, it's not clear to me that such a system could do what
some people are asking for -- that is, creating a view of Wikipedia that
would be "work-safe" or otherwise fitting some particular content
censorship standard. That goal seems impossible unless a tagging system
suiting that standard were *mandatory* -- that is, any edit violating
the tagging system would have to be considered abusive. This is not a
rule that I would expect to garner community consensus, especially since
the judgment of what content merits a particular tag would be POV-laden.
(Consider: What is the point of a "violence" tag if people who post
images of violence are not required to use it? People can depend on
such a system only if it is mandatory, but making it mandatory violates
Wikipedia principles ... especially since people can be expected to
disagree on what deserves the label "violence".)
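The mandatory-tagging problem can be seen directly in code. Here is a minimal sketch of the filtering idea as I understand the proposals; the function and data are entirely hypothetical, not any real MediaWiki feature:

```python
# Hypothetical sketch of tag-based filtering (no such Wikipedia API exists).
# A user hides any item carrying one of their blocked tags.

def visible(item_tags: set[str], blocked_tags: set[str]) -> bool:
    """An item is hidden only if it carries at least one blocked tag."""
    return not (item_tags & blocked_tags)

blocked = {"violence", "nudity"}

# A correctly tagged image is filtered out as intended:
visible({"violence", "history"}, blocked)  # False: hidden

# But an identical image whose uploader simply omitted the tag
# sails straight through the filter:
visible({"history"}, blocked)              # True: shown
```

The second call is the whole problem: the filter is only as good as the tags, so a voluntary system gives no guarantee at all, and a mandatory one founders on POV disputes over what "violence" means.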
I also think it's pretty clear that *no* such system will suit the goals
of those who really do want to marginalize or exclude "evil" expression
from the public arena. Those who actually want to _remove_ nudity or
violence or sex education or "cult" content from Wikipedia, under the
belief that such material is harmful to the public morality, will not be
satisfied by mere tagging.
Another concern: Will the presence of a tagging system legitimize the
behavior of people who really _do_ simply want to post nudity (or
whatever) for its own sake? We already have the problem that people are
more ready to talk about an image's offensiveness than its relevance.
A tagging system might give an additional argument to people who simply
want to put more nudity or violence or what-have-you on Wikipedia, and
to hell with relevance: "If you don't want to see it, go turn on more
censorship in your filtering preferences." This would be unfortunate.
Yet another possible consequence is that people could use a tagging
system as a way of searching Wikipedia for all the "naughty images".
Let us imagine a curious adolescent who today browses Wikipedia looking
for Renaissance nude paintings, swimsuit images like those on
[[Bikini]], the sketches of sexual positions, and so on. By aggregating
all the nudity in an easy-to-find category, tagging would make it
_easier_, not harder, to use Wikipedia as a source of titillation.
This could even lend itself to greater calls for the deletion of such
material -- since it would make it trivial to construct a "nudes of
Wikipedia" view that would highlight just that particular content.
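The aggregation worry is the same mechanism run in reverse: any tag index that can power filtering can be queried to *find* the tagged content instead. A toy illustration, with made-up page data:

```python
# The inverse query: the same hypothetical tag index that hides content
# also collects it into one convenient list. Data is invented for illustration.

index = {
    "The Birth of Venus": {"nudity", "painting", "renaissance"},
    "Bikini": {"swimwear", "nudity"},
    "Peace of Westphalia": {"history", "treaty"},
}

def find_tagged(index: dict[str, set[str]], wanted: str) -> list[str]:
    """Return every page carrying the given tag, alphabetically."""
    return sorted(title for title, tags in index.items() if wanted in tags)

find_tagged(index, "nudity")  # -> ['Bikini', 'The Birth of Venus']
```

One line of code turns the "protection" scheme into exactly the "nudes of Wikipedia" view described above.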
--
Karl A. Krueger <kkrueger(a)whoi.edu>