On 7/1/2011 3:55 PM, Casey Brown wrote:
*Call for referendum*: The Wikimedia Foundation, at
the direction of
the Board of Trustees, will be holding a vote to determine whether
members of the community support the creation and usage of an opt-in
personal image filter, which would allow readers to voluntarily screen
particular types of images strictly for their own account.
This proposal is
just astonishingly vague. It sounds like something akin to
the "safe" flag in Flickr.
The software side of implementing such filters is simple, but the
real problem is maintaining the tags that keep track of what's offensive.
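To illustrate how simple the software side really is, here's a minimal sketch of an opt-in, per-account filter. Every name and tag here is my own invention, not anything the Foundation has proposed:

```python
# Hypothetical sketch of an opt-in personal image filter.
# Tag names and the function signature are illustrative only.

def visible(image_tags, user_blocked_tags, opted_in):
    """Hide an image only if the reader has opted in AND one of the
    image's tags appears on that reader's personal block list."""
    if not opted_in:
        return True  # readers who never opt in see everything
    return not (set(image_tags) & set(user_blocked_tags))

# A reader who opted in and blocks "nudity":
print(visible({"anatomy", "nudity"}, {"nudity"}, opted_in=True))  # False
print(visible({"botany"}, {"nudity"}, opted_in=True))             # True
# A reader who never opted in:
print(visible({"nudity"}, {"nudity"}, opted_in=False))            # True
```

Ten lines of code; the hard part is where the tags come from and who keeps them honest.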
I run a site that uses content from Wikimedia Commons and other
sources and I've recently had some problems with my partners after one
of them found a picture of a human anus on the site. I'd made some
effort to remove offensive content from the site before, but I
redoubled my efforts after this.
I did a lot of thinking about it and personally I decided I'd
rather take the chance of excluding a few good things if I can get rid
of some bad things. Currently my system thinks that about 0.15% of
images used in Wikipedia are "offensive", which roughly means
"connected with nudity, sexuality, pornography, or illegal drugs."
Now, I'm trying to make something that's useful for K-12 education, so
I'm probably more exclusionary than some people would be -- the site has
already gotten an endorsement from a board of education in a relatively
conservative state and frankly, I'd rather preserve relationships that
help students find the 99.85% of images that everyone will agree on.
Now somebody else can make a different decision for another site
and that's fine.
I had to make all sorts of decisions here: for instance, I wasn't
sure if I wanted to get rid of illegal drugs, because there's a
slippery slope here: a picture of a pot plant is relevant to botany,
people abuse uncontrolled drugs such as cough syrup, and there's a very
common mushroom that's possibly growing on your lawn that contains trace
quantities of psilocybin. On the other hand, I felt that 30% of drug
images were offensive, such as pictures of identifiable people using
cocaine. Since it would have been hard to make an operational
definition of what exactly is 'offensive', I decided to just remove all
of them.
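The blanket rule I ended up with amounts to something like the following sketch: rather than judging images one by one against an operational definition of "offensive", drop everything in certain categories outright. The category and file names below are made up for illustration:

```python
# Illustrative sketch of blanket category removal: no per-image
# judgment, just exclude any image carrying a blocked category.
# Category names and filenames are hypothetical.

BLOCKED_CATEGORIES = {"illegal drugs", "pornography"}

def keep(image_categories):
    """Keep an image only if none of its categories is blocked."""
    return not (set(image_categories) & BLOCKED_CATEGORIES)

images = {
    "pot_plant.jpg": {"botany", "illegal drugs"},
    "orchid.jpg":    {"botany"},
}
kept = [name for name, cats in images.items() if keep(cats)]
print(kept)  # ['orchid.jpg']
```

The cost is visible right there: a pot plant photo that's perfectly relevant to botany goes out with the rest.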
Now, Wikipedia is widely used in K-12 education, but people don't
often mention all of the things you can find in Wikipedia that aren't in
the Encyclopedia Britannica, such as the video and images that you'd
find on this page:
http://en.wikipedia.org/wiki/Ejaculation
In a consensus-based organization, I think it will be very
difficult to set tagging standards and get them consistently enforced.
Where I'm the king of my own domain I had a lot of agony about getting
things right -- add politics to the mix, and it all gets worse.
To take a specific example, the category "Gay Culture" in
Wikipedia is particularly problematic because "Gay" as a category is
related to sexuality (just as "Straight" is). Maybe 60% or so of "Gay
Culture" topics (in Category:LGBT_culture) could be said to be
offensive, such as
http://en.wikipedia.org/wiki/Glory_hole_(sexual_slang)
Now, the way I see it, most of the "offensive" acts related to
homosexuality can also be performed by heterosexuals and would be
equally offensive. On the other hand, there might be some people who'd
see an "offensive" tag on a gay-related topic and see that as some kind
of hate speech, even if an effort is being made to treat gay and
straight the same. If, however, a conservative school board
complained that I had pictures of the Stonewall or a gay pride parade
I'd tell them to go to hell.
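The underlying mechanical problem can be sketched too: a naive filter that blocks everything under one parent category sweeps up harmless material along with the objectionable. The tiny category graph below is entirely made up for illustration, not Commons' real category tree:

```python
# Why category-based tagging over-blocks: walking a parent category's
# subtree flags everything beneath it. Hypothetical category graph.

CHILD_CATEGORIES = {
    "LGBT culture": {"LGBT history", "Sexual slang"},
    "LGBT history": set(),
    "Sexual slang": set(),
}

def descendants(cat):
    """All categories reachable from cat, including cat itself."""
    seen = {cat}
    stack = [cat]
    while stack:
        for child in CHILD_CATEGORIES.get(stack.pop(), ()):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

# Blocking the whole "LGBT culture" subtree also hides the history
# material a school would presumably want students to find:
blocked = descendants("LGBT culture")
print("LGBT history" in blocked)  # True
```

So any workable scheme has to tag at a finer grain than the category tree, which is exactly the expensive human-maintenance problem I mentioned above.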
Other areas of "offensiveness" which may be problematic are
gambling and hate speech. Cards and dice, for instance, are used for
many non-gambling games and pictures of the exteriors of casinos on the
Vegas strip have a high relevance to post-modern architecture and aren't
likely to incite people to gamble illegally or destructively.
Similarly, there are reasons to suppress active hate speech, but you
can't flag every picture of Nazi Germany as "offensive:hate_speech" or,
going a bit further back in history where things are murkier and more
controversial, every picture that has a confederate flag in it.