I thought it might be useful here if I shared some of my
experiences with commons.
Like many people, I've had the experience of bumping into a human
sexuality related commons category or gallery and thinking "Holy crap!
That's a lot of [gallery name]. Freaking teenage pornofreaks!".
But unlike many other people, I am in a position to do something about
it: I'm a commons administrator and checkuser, reasonably well
respected in the commons community (when I'm not inactive, at least),
well connected to the commons star-chamber, and I've played a role in
many of the internal 'governance by fiat' events. I think it's likely
that a majority of my deletions have been technically "out of
process", but by keeping a good working relationship with the rest of
the commons community this hasn't been a problem at all.
To take action you have to understand a few things: "The problem",
"The lay of the land", and "The goal".
"The problem"
Why might a super-abundance of explicit images be a problem?
(1) They potentially bring the Wikimedia sites into ill repute (it's
just a big porn site!)
(2) They encourage the blocking of Wikimedia sites by schools and libraries
(3) Explicit photographs are a hot-bed of privacy issues and can even
risk bumping into the law (underage models)
I'm sure others can be listed but these are sufficient for now.
"The lay of the land"
Commons has a hard rule that for images to be in scope they must
potentially serve an educational purpose. The rule is followed pretty
strictly, but the definition of educational purpose is taken very
broadly. In particular the commons community expects the public to
also use commons as a form of "visual education", so having a great
big bucket of distinct pictures of the same subject generally furthers
the educational mission.
There are two major factors complicating every policy decision on commons:
Commons is also a service project. When commons policy changes, over
700 wikis feel the results. Often, language barriers inhibit effective
communication with these customers. Some Wikimedia projects rely on
commons exclusively for their images, so a prohibition on commons
means (for example) a prohibition on Es wiki, even though most
Eswikipedians are not active in the commons community. This
relationship works because of trust which the commons community has
built over the years. Part of that trust is that commons avoids making
major changes in great haste and works with projects to fix problems
when hasty action does cause them.
Commons itself is highly multi-cultural. Commons does have a strong
organizing principle (which is part of why it has been a fantastic
success on its own terms, where all other non-wikipedia WMF projects
are at best weakly successful), but that principle is strongly
inclusive: it mostly directs us to collect and curate, excluding only
on legal grounds and in a few common areas of basic human decency. It
is much harder to create any kind of cross-cultural agreement on
matters of taste. Avoiding issues of taste also makes us more
reliable as an image source for customer projects.
I think that a near-majority of commons users believe we could do
with some reduction in the quantity of redundant / low-quality human
sexuality content, having had the same experience I opened this
message with. Of that group I think there is roughly an even split
between people who believe the existing "educational purposes" policy
is sufficient and people who think we could probably strengthen the
policy somehow.
There are also people who are honestly offended that some people are
offended by human sexuality content, and some of them view efforts to
curtail this content as a threat to their own cultural values. If
this isn't your culture, please take a moment to ponder it. If your
personal culture believes in the open expression of sexuality, an
effort to remove "redundant / low quality" sexuality images while
leaving low-quality pictures of clay pots untouched, for example, is
effectively an attack on your beliefs. These people would tell you: If
you don't like it, don't look. _Understanding_ differences in opinion
is part of the commons way, so even if you do not embrace this view
you should at least stop to understand that it is not without merit.
In any case, while sometimes vocal, people from this end of the
spectrum don't appear to make up all that much of the community.
Of course, there are a few trolls here and there from time to time,
but I don't think anyone really pays them much attention. There are
lots of horny twenty-somethings, but while their presence might bias
the discussions towards permissiveness, I don't think it really has a
big effect beyond the basic youthful liberalism which exists
everywhere in our projects.
There are also a couple of occasional agitators calling for things
like a complete removal of sexuality content. Most of them fail to
sound reasonable at all— demanding the removal of old works of art,
basic anatomy photos... I think these complaints are mostly ignored.
... and a majority of people who either don't care or don't speak the
languages the discussions are held in.
"The goal"
Considering the landscape, how do we solve the problems?
Let's take a category of Penis images as an example. Load it up.
Hundreds of penii. Pretty shocking. We can obviously cut back on this,
right? How many penis images do we really need to meet the mission of
the Wikipedias? (and then we need to consider the more expansive
mission of commons in educating through media).
Well, we ought to have circumcised and uncircumcised, flaccid and
erect. An example of each kind of penis jewellery that has a WP
article in some language. An example of every disease with visible
penile symptoms... We're easily at 50-100 images already.
People seem to think we also need many of the prior samples from
multiple races to demonstrate the (lack of) differences. Add a little
further inflation because editorial preferences on the Wikipedias will
differ.
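To make that arithmetic concrete, here is a minimal back-of-envelope
sketch in Python; every count in it is my own illustrative assumption,
not an actual commons figure:

    # Rough estimate of "needed" images for one anatomy subject.
    # All counts below are illustrative assumptions, not commons data.
    base_states = 2 * 2     # circumcised/uncircumcised x flaccid/erect
    jewellery_kinds = 20    # assume ~20 kinds with a WP article somewhere
    diseases = 30           # assume ~30 diseases with visible symptoms

    core = base_states + jewellery_kinds + diseases  # 54: the 50-100 range

    racial_variants = 3     # assume each core subject wanted for ~3 groups
    editorial_slack = 1.5   # assume +50% for differing editorial preferences

    total = core * racial_variants * editorial_slack
    print(f"core: {core}, rough total: {round(total)}")
    # -> core: 54, rough total: 243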
So on the basis of meeting the Wikipedias' needs alone, we're up to
hundreds of pictures of penises. Now, commons' existing hundreds are
not so diverse: we need fewer of some kinds and more of others, but in
terms of sheer count, even before considering commons' own educational
remit, we still need a bunch.
Where does this place us in terms of our problem statements? Well,
with hundreds of pictures in the category it will be easy to cast
commons as a penis palace. Thus, in terms of this class of images—
problem (1) is probably unsolvable given our educational mission. If
someone wants to point to the category and inspire the "Oh my god;
it's full of cocks" response, they can...
Virtually all libraries and schools that block internet sites employ
categorical blocking software. They block broad categories like
"Drugs, weapons, nudity, pornography, and proxy evasion". All of the
Wikimedia projects could be blocked under all of these categories.
Even a highly educational penis is still nudity— these filtering
services are often criticized for blocking information on breast
exams, for example. Because of the way the blocking happens, reducing
the number of penis images to the educational minimum would not likely
reduce the incidence of blocking in any material way. So problem (2)
seems to be unsolvable given our educational mission.
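For intuition, here is a minimal sketch of how such category-based
filters behave; the domain classifications below are hypothetical, and
real vendors use their own databases:

    # Sketch of categorical blocking: the filter classifies whole domains
    # into categories; it never counts individual pages or images.
    BLOCKED_CATEGORIES = {"drugs", "weapons", "nudity", "pornography",
                          "proxy evasion"}

    # Hypothetical vendor database mapping domains to categories.
    DOMAIN_CATEGORIES = {
        "commons.wikimedia.org": {"education", "nudity"},
        "news.example.org": {"news"},
    }

    def is_blocked(domain: str) -> bool:
        """A domain is blocked if it carries any blocked category at all."""
        return bool(DOMAIN_CATEGORIES.get(domain, set()) & BLOCKED_CATEGORIES)

    # A single nude image is enough to earn the whole domain a "nudity"
    # tag, so deleting most such images leaves the block unchanged.
    print(is_blocked("commons.wikimedia.org"))  # True
    print(is_blocked("news.example.org"))       # False

The decision is made per domain and per category, so the number of
images never enters into it.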
I think we could make some improvement on problem (3). The privacy
issues can also be addressed by using images without visible faces
(which are often perceived to be more prurient, unfortunately).
Ironically— the commercial pornography industry has been pretty happy
to supply us with images which we are quite sure are legal and without
privacy problems. But accepting these images heightens the perception
that commons is promoting pornography rather than merely hosting
educational resources.
The prevalence of commercial sex images is largely the result of prior
attempts to avoid underage images and images created without the
model's consent, though I don't think that consequence was expected. As a
checkuser (with OTRS access) I can't say that I've seen evidence of
abuse by commercial porn providers: Wikimedians are going to them.
That said, _obviously_ problematic images are regularly and easily
deleted without dispute. I've nuked a few from orbit and never hit
the slightest bit of resistance, though the community also has no
reason to distrust my claims that an image is inappropriate; other
people may get different results.
Now, how would we draft such a policy to further improve things?
We need a policy which can be easily understood across many languages
and cultures, and which improves the situation without providing a
basis for other censorship (e.g. some would have us remove all
likenesses of Muhammad, images of women without veils, offensive
historical political cartoons and symbols, etc). It must actually be
enforceable in the face of incomplete information from uploaders,
without the risk of too much 'taste' and the resulting instability for
customers. I'm at a loss. I have no suggestions beyond preferring
illustrations over photographs (which we already do) and accepting
image contributions from commercial sources, which is bad for our
image. This seems really hard.
Now pull in the part of the landscape that I didn't mention: Commons
has almost five million images. The deletion spree, which operated
completely without regard to the community process and was described
as an "almost complete cleanup", removed fewer than 500 images, or
about 0.008% of the collection.
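(For scale: 0.008% of 5,000,000 is about 400 files.)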
At this point in my reasoning I inevitably conclude: (1) The problem
was far less bad than my initial impression. (2) At _best_ we can't
solve much of the problem without accepting aggressive censorship of
our coverage, both text and images. (3) The part we could improve is
pretty hard to improve. (4) There are more important things to work
on.
None of this really depends on any difficulty coming from governance.
Even as supreme ruler for a day I couldn't solve this one
satisfactorily.
The initial surprise is enough that I've gone through this cycle
several times now, but I keep reaching the same conclusion. I expect
the same is true for many other contributors.
... and outside of some agitation from people pushing for the
unachievable like "school safeness", and some popular troll-nest
message boards, troll-nest 'news' agencies, and a somewhat trollish
ex-nupedian, I haven't seen a lot of evidence that this 0.008% is
suddenly in need of a major effort. I can promise you that a far
greater proportion of our works are misleadingly labelled, outright
spam, egregious copyright violations, potentially carrying hidden
malware, etc.
Feedback from the board that such an effort is desired would
certainly help shift the priorities. It would also give us some
cover for the resulting disruption to our customer projects.
But this isn't what we got at all. The clear _consensus_ among the
commons community and many of our customers is that what we got was
disruptive, under-informed, and damaging to our internal governance.
Faction lines have now been drawn between the couple of commons users
aligned with Jimmy and the (literally) hundreds of users opposed to
the methodology used here and to the specifics of some of the
deletions. There is no active discussion about making an improvement;
instead, our customers are discussing creating chapter-operated forks
of commons, free from this kind of disruptive intervention, which many
perceive as overt values-based censorship. Many other messages have
expressed the complaints in greater detail.
I hope this has provided some useful background and that it will
foster improved communication on the subject.