[Foundation-l] Appropriate surprise (Commons stuff)
Samuel Klein
meta.sj at gmail.com
Sun May 9 08:58:25 UTC 2010
Thank you, Greg, for this brilliant and personal overview. Very helpful.
A few thoughts:
On Sun, May 9, 2010 at 4:17 AM, Gregory Maxwell <gmaxwell at gmail.com> wrote:
>
> Why might a super-abundance of explicit images be a problem?
> (1) They potentially bring the Wikimedia sites into ill repute (it's
> just a big porn site!)
This can be addressed in part by raising the quality standard for
our images. A well-ordered set of anatomy images, in a standard
prescribed frame and format, from an established cross-section of
races or backgrounds, would be excellent. It would also be a
useful model to follow for all sorts of anatomical images (you could
use the same models to get entire sets of images of the body).
Likewise, a well-ordered set of images of jewelry and piercings,
perhaps organized in partnership with a large piercing/jewelry parlor
in a multiethnic community, would also be easy enough to set up -- and
would quickly replace the many lazily-shot and casually curated images
we have today. (Note that I didn't specify genital jewelry and
piercings, though those would be part of the series.)
A gorgeous and professionally made encyclopedia of sexuality might not
be to some people's tastes, but wouldn't inspire them to say 'just a
big porn site!', just as the Museum of Sex has acquired a very
respectable following and media coverage in New York. That is
something we should aspire to.
(And if some people want to debate whether we want to host such a
specialized sub-encyclopedia on Foundation servers, or on servers
belonging to the Dutch chapter, for fear of overly strict laws in the
US - that's fine. The point is, this is a topic worth covering
beautifully and comprehensively, like all important topics, and we
should not shortchange it.)
> (2) They encourage the blocking of Wikimedia sites from schools and libraries
I think there are good solutions here, beginning with communicating
directly with schools and libraries and finding solutions that work
for them. For instance, we can make sure that they have access to
schools-wikipedia.org and similar snapshot sites until they find a
way to provide access to all of Wikipedia.
Working on these solutions may be a good way to recruit new teacher
editors, as well.
> (3) Explicit photographs are a hot-bed of privacy issues and can even
> risk bumping into the law (underage models)
This is the easiest one to address. Requiring proof of model
release, the way we require proof of copyright release, would be an
excellent start -- and doing this on general principle, not just in
cases where a face is recognizable: make sure you have the model's
permission. This is simply a philosophical question; we can afford to
be picky and host only images that we are sure the model was
comfortable seeing published.
SJ
> "The lay of the land"
>
>
> Commons has a hard rule that for images to be in scope they must
> potentially serve an educational purpose. The rule is followed pretty
> strictly, but the definition of educational purpose is taken very
> broadly. In particular the commons community expects the public to
> also use commons as a form of "visual education", so having a great
> big bucket of distinct pictures of the same subject generally furthers
> the educational mission.
>
> There are two major factors complicating every policy decision on commons:
>
> Commons is also a service project. When commons policy changes, over
> 700 wikis feel the results. Often, language barriers inhibit effective
> communication with these customers. Some Wikimedia projects rely on
> commons exclusively for their images, so a prohibition on commons
> means (for example) a prohibition on Es wiki, even though most
> Eswikipedians are not active in the commons community. This
> relationship works because of trust which the commons community has
> built over the years. Part of that trust is that commons avoids making
> major changes with great haste and works with projects to fix problems
> when hasty acts do cause them.
>
> Commons itself is highly multi-cultural. While commons does have a
> strong organizing principle (which is part of why it has been a
> fantastic success on its own terms where all other non-wikipedia WMF
> projects are at best weakly successful), that principle is strongly
> inclusive and mostly directs us to collect and curate while only
> excluding on legal grounds and a few common areas of basic human
> decency— it's harder to create any kind of cross cultural agreement on
> matters of taste. Avoiding issues of taste also makes us more
> reliable as an image source for customer projects.
>
>
> I think that a near majority of commons users believe that we could do
> with some reduction in the quantity of redundant / low quality human
> sexuality content, having had the same experience I started this
> message with. Of that group I think there is roughly an even split
> between people who believe the existing "educational purposes" policy
> is sufficient and people who think we could probably strengthen the
> policy somehow.
>
> There are also people who are honestly offended that some people are
> offended by human sexuality content— and some of them view efforts to
> curtail this content as a threat to their own cultural values. If
> this isn't your culture, please take a moment to ponder it. If your
> personal culture believes in the open expression of sexuality, an
> effort to remove "redundant / low quality" sexuality images while
> leaving low quality pictures of clay pots untouched, for example, is
> effectively an attack on your beliefs. These people would tell you: If
> you don't like it, don't look. _Understanding_ differences in opinion
> is part of the commons way, so even if you do not embrace this view
> you should at least stop to understand that it is not without merit.
> In any case, while sometimes vocal, people from this end of the
> spectrum don't appear to make up much of the community.
>
> Of course, there are a few trolls here and there from time to time,
> but I don't think anyone really pays them much attention. There are
> lots of horny twenty-somethings, but while that might bias the
> discussions towards permissiveness, I don't think that it really has a
> big effect beyond the basic youthful liberalism which exists
> everywhere in our projects.
>
> There are also a couple of occasional agitators calling for things
> like a complete removal of sexuality content. Most of them fail to
> sound reasonable at all— demanding the removal of old works of art,
> basic anatomy photos... I think these complaints are mostly ignored.
>
> ... and a majority of people who either don't care or don't speak the
> languages the discussions are held in.
>
> "The goal"
>
> Considering the landscape, how do we solve the problems?
>
> Let's take a category of Penis images as an example. Load it up.
> Hundreds of penii. Pretty shocking. We can obviously cut back on this,
> right? How many penis images do we really need to meet the mission of
> the Wikipedias? (and then we need to consider the more expansive
> mission of commons in educating through media).
>
> Well, we ought to have circumcised and uncircumcised. Flaccid and
> erect. An example of each kind of penis jewellery that has a WP
> article in some language. An example of every disease with
> penis-visible symptoms.... We're easily at 50-100 images already.
> People seem to think we also need many of the prior samples from
> multiple races to demonstrate the (lack of) differences. Add a little
> further inflation because editorial preferences on the Wikipedias will
> differ.
>
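A rough tally of that estimate, with every count hypothetical, shows
how it compounds into the "hundreds" of the next paragraph:

    # Python sketch; all figures are invented for illustration, only
    # the structure of the estimate comes from the paragraph above.
    base = 2 * 2        # circumcised/uncircumcised x flaccid/erect
    jewellery = 20      # hypothetical: one per jewellery WP article
    diseases = 40       # hypothetical: diseases with visible symptoms
    subtotal = base + jewellery + diseases       # 64: "50-100 images"
    races = 4           # hypothetical multiplier across races
    editorial = 1.5     # hypothetical inflation for editorial tastes
    print(int(subtotal * races * editorial))     # 384: "hundreds"
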
> So on the basis of meeting the Wikipedias' needs alone, we're up to
> hundreds of pictures of penises. Now, commons' hundreds are not so
> diverse; we need fewer of some kinds and more of others. But in terms
> of the sheer count, even before considering commons' own educational
> remit, we still need a bunch.
>
> Where does this place us in terms of our problem statements? Well,
> with hundreds of pictures in the category it will be easy to cast
> commons as a penis palace. Thus, in terms of this class of images—
> problem (1) is probably unsolvable given our educational mission. If
> someone wants to point to the category and inspire the "Oh my god;
> it's full of cocks" response, they can...
>
> Virtually all libraries and schools that block internet sites employ
> categorical blocking software. They block broad categories like
> "Drugs, weapons, nudity, pornography, and proxy evasion". All of the
> Wikimedia projects could be blocked under all of these categories.
> Even a highly educational penis is still nudity— these filtering
> services are often criticized for blocking information on breast
> exams, for example. Because of the way the blocking happens, reducing
> the number of penis images to the educational minimum would likely not
> reduce the incidence of blocking in any material way. So problem (2)
> seems to be unsolvable given our educational mission.
>
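Since the mechanics matter here, a minimal sketch of how categorical
blocking behaves; the labels and the domain-to-category table are
hypothetical, not taken from any real filtering product:

    # A categorical filter consults a vendor-supplied label database;
    # it never counts how many explicit images a site actually hosts.
    BLOCKED = {"nudity", "pornography", "proxy avoidance"}

    # Hypothetical vendor database: domain -> assigned categories.
    SITE_CATEGORIES = {
        "commons.wikimedia.org": {"reference", "nudity"},
        "en.wikipedia.org": {"reference", "nudity", "drugs"},
    }

    def is_blocked(domain: str) -> bool:
        # Blocked if any assigned label appears on the blocklist.
        return bool(SITE_CATEGORIES.get(domain, set()) & BLOCKED)

    # Cutting hundreds of images down to an educational minimum changes
    # nothing: the domain keeps its "nudity" label while any remain.
    assert is_blocked("commons.wikimedia.org")
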
> I think we could make some improvement with problem (3). The privacy
> issues can also be addressed by using images without visible faces
> (which are often perceived to be more prurient, unfortunately).
>
> Ironically, the commercial pornography industry has been pretty happy
> to supply us with images which we are quite sure are legal and without
> privacy problems. But accepting these images heightens the perception
> that commons is promoting pornography rather than merely hosting
> educational resources.
>
> The prevalence of commercial sex images reflects the result of prior
> attempts to avoid child images and images created without the model's
> consent, though I don't think the consequence was expected. As a
> checkuser (with OTRS access) I can't say that I've seen evidence of
> abuse by commercial porn providers: Wikimedians are going to them.
>
> That said, _obviously_ problematic images are regularly and easily
> deleted without dispute. I've nuked a few from orbit and never hit
> the slightest bit of resistance. Though the community has no
> reason to distrust my claims that an image is inappropriate, other
> people may get different results.
>
> Now how would we draft such a policy to further improve things?
>
> We need a policy which can be easily understood across many languages
> and cultures, which improves the situation but doesn't provide a basis
> for other censorship (e.g. some would have us remove all likenesses of
> Muhammad, images of women without veils, offensive historical
> political cartoons and symbols, etc.). It must also be enforceable in
> the face of incomplete information from uploaders, without the risk of
> too much 'taste' and the resulting instability for customers. I'm at a
> loss. I have no suggestion beyond preferring illustrations to
> photos (which we already do), and accepting image contributions from
> commercial sources, which is bad for our image. This seems really
> hard.
>
>
> Now pull in the part of the landscape that I didn't mention: Commons
> has almost five million images. The deletion spree, which operated
> completely without regard to the community process and was described
> as an "almost complete cleanup", removed fewer than 500 images, or
> about 0.008% of the collection.
>
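For scale, the arithmetic behind that figure; the inputs are the
approximate numbers quoted above, which is why the decimal wobbles:

    deleted = 500            # "fewer than 500 images"
    collection = 5_000_000   # "almost five million images"
    print(f"{deleted / collection:.3%}")   # 0.010%: the same order of
    # magnitude as the 0.008% cited, roughly one file in ten thousand.
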
> At this point in my reasoning I inevitably conclude: (1) the problem
> was far less bad than my initial impression; (2) at _best_ we can't
> solve much of the problem without accepting aggressive censorship of
> our coverage, both text and images; (3) the part we could improve is
> pretty hard to improve; (4) there are more important things to work
> on.
>
> None of this really depends on any difficulty coming from governance.
> Even as supreme ruler for a day I couldn't solve this one
> satisfactorily.
>
> The initial surprise is enough that I've gone through this cycle
> several times now, but I keep reaching the same conclusion. I expect
> the same is true for many other contributors.
>
> ... and outside of some agitation from people pushing for the
> unachievable like "school safeness", and some popular troll-nest
> message boards, troll-nest 'news' agencies, and a somewhat trollish
> ex-nupedian, I haven't seen a lot of evidence that these 0.008% are
> suddenly in need of a major effort. I can promise you that a far
> greater proportion of our works are misleadingly labelled, outright
> spam, egregious copyright violations, potentially carrying hidden
> malware, etc.
>
>
> Feedback from the board that such an effort is desired would
> certainly help shift the priorities; it would also give us some
> cover for the disruption to our customer projects.
>
> But this isn't what we got at all. The clear _consensus_ among the
> commons community and many of our customers is that what we got was
> disruptive, under-informed, and damaging to our internal governance.
> Faction lines have now been drawn between the couple of commons users
> aligned with Jimmy and the (literally) hundreds of users opposed to
> the methodology used here and to the specifics of some of the
> deletions. There is no active discussion about making an improvement;
> instead, our customers are discussing creating chapter-operated forks
> of commons, free from this kind of disruptive intervention, which is
> perceived by many to be overt values-based censorship. Many other
> messages have expressed the complaints in greater detail.
>
>
> I hope this has provided some useful background and that it will
> foster improved communication on the subject.
>
> _______________________________________________
> foundation-l mailing list
> foundation-l at lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
>