I thought it might be useful to people here if I shared some of my experiences with commons.
Like many people I've had the experience of bumping into a human sexuality related commons category or gallery and thinking "Holy crap! That's a lot of [gallery name]. Freaking teenage pornofreaks!".
But unlike many other people, I am in a position to do something about it: I'm a commons administrator and checkuser, reasonably well respected in the commons community (when I'm not inactive, at least), well connected to the commons star-chamber, and I've played a role in many of the internal 'governance by fiat' events. I think it's likely that a majority of my deletions have been technically "out of process", but by keeping a good working relationship with the rest of the commons community this hasn't been a problem at all.
To take action you have to understand a few things: "The problem", "The lay of the land", and "The goal".
Why might a super-abundance of explicit images be a problem?
(1) They potentially bring the Wikimedia sites into ill repute (it's just a big porn site!)
(2) They encourage the blocking of Wikimedia sites from schools and libraries
(3) Explicit photographs are a hot-bed of privacy issues and can even risk bumping into the law (underage models)
I'm sure others can be listed but these are sufficient for now.
"The lay of the land"
Commons has a hard rule that for images to be in scope they must potentially serve an educational purpose. The rule is followed pretty strictly, but the definition of educational purpose is taken very broadly. In particular the commons community expects the public to also use commons as a form of "visual education", so having a great big bucket of distinct pictures of the same subject generally furthers the educational mission.
There are two major factors complicating every policy decision on commons:
Commons is also a service project. When commons policy changes, over 700 wikis feel the results. Often, language barriers inhibit effective communication with these customers. Some Wikimedia projects rely on commons exclusively for their images, so a prohibition on commons means (for example) a prohibition on Es wiki, even though most Eswikipedians are not active in the commons community. This relationship works because of the trust which the commons community has built over the years. Part of that trust is that commons avoids making major changes with great haste and works with projects to fix problems when hasty acts do cause them.
Commons itself is highly multi-cultural. While commons does have a strong organizing principle (which is part of why it has been a fantastic success on its own terms where all other non-wikipedia WMF projects are at best weakly successful), that principle is strongly inclusive and mostly directs us to collect and curate while only excluding on legal grounds and a few common areas of basic human decency— it's harder to create any kind of cross-cultural agreement on matters of taste. Avoiding issues of taste also makes us more reliable as an image source for customer projects.
I think that a near majority of commons users think that we could do with some reductions in the quantity of redundant / low quality human sexuality content, due to having the same experience I started this message with. Of that group I think there is roughly an even split between people who believe the existing "educational purposes" policy is sufficient and people who think we could probably strengthen the policy somehow.
There are also people who are honestly offended that some people are offended by human sexuality content— and some of them view efforts to curtail this content as a threat to their own cultural values. If this isn't your culture, please take a moment to ponder it. If your personal culture believes in the open expression of sexuality, an effort to remove "redundant / low quality" sexuality images while not removing low quality pictures of clay pots, for example, is effectively an attack on your beliefs. These people would tell you: If you don't like it, don't look. _Understanding_ differences in opinion is part of the commons way, so even if you do not embrace this view you should at least stop to understand that it is not without merit. In any case, while sometimes vocal, people from this end of the spectrum don't appear to make up all that much of the community.
Of course, there are a few trolls here and there from time to time, but I don't think anyone really pays them much attention. There are lots of horny twenty-somethings, and while that might bias the discussions towards permissiveness, I don't think it really has a big effect beyond the basic youthful liberalism which exists everywhere in our projects.
There are also a couple of occasional agitators calling for things like a complete removal of sexuality content. Most of them fail to sound reasonable at all— demanding the removal of old works of art, basic anatomy photos... I think these complaints are mostly ignored.
... and a majority of people who either don't care or don't speak the languages the discussions are held in.
"The goal"
Considering the landscape, how do we solve the problems?
Let's take a category of Penis images as an example. Load it up. Hundreds of penii. Pretty shocking. We can obviously cut back on this, right? How many penis images do we really need to meet the mission of the Wikipedias? (And then we need to consider the more expansive mission of commons in educating through media.)
Well, we ought to have circumcised and uncircumcised. Flaccid and erect. An example of each kind of penis jewellery that has a WP article in some language. An example of every disease with penis-visible symptoms... We're easily at 50-100 images already. People seem to think we also need many of the prior samples from multiple races to demonstrate the (lack of) differences. Add a little further inflation because editorial preferences on the Wikipedias will differ.
So on the basis of meeting the Wikipedias' needs alone, we're up to hundreds of pictures of penises. Now— commons' hundreds are not so diverse; we need fewer of some kinds and more of others. But in terms of sheer count, even before considering commons' own educational remit, we still need a bunch.
Where does this place us in terms of our problem statements? Well, with hundreds of pictures in the category it will be easy to cast commons as a penis palace. Thus, in terms of this class of images— problem (1) is probably unsolvable given our educational mission. If someone wants to point to the category and inspire the "Oh my god; it's full of cocks" response, they can...
Virtually all libraries and schools that block internet sites employ categorical blocking software. They block broad categories like "Drugs, weapons, nudity, pornography, and proxy evasion". All of the Wikimedia projects could be blocked under all of these categories. Even a highly educational penis is still nudity— these filtering services are often criticized for blocking information on breast exams, for example. Because of the way the blocking happens, reducing the number of penis images to the educational minimum would not be likely to reduce the incidence of blocking in any material way. So problem (2) seems to be unsolvable given our educational mission.
I think we could make some improvement with problem (3). The privacy issues can also be addressed by using images without visible faces (which are often perceived to be more prurient, unfortunately).
Ironically— the commercial pornography industry has been pretty happy to supply us with images which we are quite sure are legal and without privacy problems. But accepting these images heightens the perception that commons is promoting pornography rather than merely hosting educational resources.
The prevalence of commercial sex images reflects the result of prior attempts to avoid child images and images created without the model's consent, though I don't think the consequence was expected. As a checkuser (with OTRS access) I can't say that I've seen evidence of abuse by commercial porn providers: Wikimedians are going to them.
_Obviously_ problematic images, though, are regularly and easily deleted without dispute. I've nuked a few from orbit and never hit the slightest bit of resistance. Then again, the community has no reason to distrust my claims that an image is inappropriate; other people may get different results.
Now, how would we draft a policy to further improve things?
We need a policy which can be easily understood across many languages and cultures, which improves the situation but doesn't provide a basis for other censorship (e.g. some would have us remove all likenesses of Muhammad, images of women without veils, offensive historical political cartoons and symbols, etc), and which is actually enforceable in the face of incomplete information from uploaders, without injecting too much 'taste' and the resulting instability for customers. I'm at a loss. I have no suggestions beyond preferring illustrations rather than photos (which we already do) and accepting image contributions from commercial sources, which is bad for our image. This seems really hard.
Now pull in the part of the landscape that I didn't mention: Commons has almost five million images. The deletion spree, which operated completely without regard to the community process and was described as an "almost complete cleanup", removed fewer than 500 images— or about 0.008% of the collection.
At this point in my reasoning I inevitably conclude: (1) the problem was far less bad than my initial impression; (2) at _best_ we can't solve much of the problem without accepting aggressive censorship of our coverage, both text and images; (3) the part we could improve is pretty hard to improve; (4) there are more important things to work on.
None of this really depends on any difficulty coming from governance. Even as supreme ruler for a day I couldn't solve this one satisfactorily.
The initial surprise is enough that I've gone through this cycle several times now, but I keep reaching the same conclusion. I expect the same is true for many other contributors.
... and outside of some agitation from people pushing for the unachievable like "school safeness", and some popular troll-nest message boards, troll-nest 'news' agencies, and a somewhat trollish ex-nupedian, I haven't seen a lot of evidence that these 0.008% are suddenly in need of a major effort. I can promise you that a far greater proportion of our works are misleadingly labelled, outright spam, egregious copyright violations, potentially carrying hidden malware, etc.
Feedback from the board that such an effort is desired would certainly help shift the priorities— it would also give us some excuse for the disruption to our customer projects.
But this isn't what we got at all. The clear _consensus_ among the commons community and many of our customers is that what we got was disruptive, under-informed, and damaging to our internal governance. Faction lines have now been drawn between the couple of commons users aligned with Jimmy and the (literally) hundreds of users opposed to the methodology used here and to the specifics of some of the deletions. There is no active discussion about making an improvement; instead, our customers are discussing creating chapter-operated forks of commons, free from this kind of disruptive intervention, which is perceived by many to be overt values-based censorship. Many other messages have expressed the complaints in greater detail.
I hope this has provided some useful background and that it will foster improved communication on the subject.
Thank you Greg, for this brilliant and personal overview. Very helpful.
A few thoughts:
On Sun, May 9, 2010 at 4:17 AM, Gregory Maxwell gmaxwell@gmail.com wrote:
Why might a super-abundance of explicit images be a problem? (1) They potentially bring the Wikimedia sites into ill repute (it's just a big porn site!)
This can be addressed in part by increasing the quality standard for our images. A well-ordered set of anatomy images, in a standard prescribed frame and format, from an established cross-section of races or backgrounds: this would be excellent. It would also be a useful model to follow for all sorts of anatomical images (you could use the same models to get entire sets of images of the body).
Likewise, a well-ordered set of images of jewelry and piercings, perhaps organized in partnership with a large piercing/jewelry parlor in a multiethnic community, would also be easy enough to set up -- and would quickly replace the many lazily-shot and casually curated images we have today. (Note that I didn't specify genital jewelry and piercings, though those would be part of the series.)
A gorgeous and professionally made encyclopedia of sexuality might not be to some people's tastes, but it wouldn't inspire them to say 'just a big porn site!', just as the Museum of Sex has acquired a very respectable following and media coverage in New York. That is something we should aspire to.
(And if some people want to debate whether we want to host such a specialized sub-encyclopedia on Foundation servers, or on servers belonging to the Dutch chapter, for fear of overly strict laws in the US - that's fine. The point is, this is a topic worth covering beautifully and comprehensively, like all important topics, and we should not shortchange it.)
(2) They encourage the blocking of Wikimedia sites from schools and libraries
I think there are good solutions here, beginning with communicating directly with schools and libraries and finding solutions that work for them. For instance, making sure that they have access to schools-wikipedia.org and similar snapshot sites until they can find a way to provide access to all of Wikipedia.
Working on these solutions may be a good way to recruit new teacher editors, as well.
(3) Explicit photographs are a hot-bed of privacy issues and can even risk bumping into the law (underage models)
This is the easiest one to address. Requiring proof of model release, the way we require proof of copyright release, would be an excellent start -- and doing this on general principle, not just in cases where a face is recognizable: make sure you have the model's permission. This is simply a philosophical question; we can afford to be picky and only host images that we are sure the model was comfortable with publishing.
SJ
"The lay of the land"
Commons has a hard rule that for images to be in scope they must potentially serve an educational purpose. The rule is followed pretty strictly, but the definition of educational purpose is taken very broadly. In particular the commons community expects the public to also use commons as a form of "visual education", so having a great big bucket of distinct pictures of the same subject generally furthers the educational mission.
There are two major factors complicating every policy decision on commons:
Commons is also a service project. When commons policy changes over 700 wikis feel the results. Often, language barriers inhibit effective communication with these customers. Some Wikimedia projects rely on commons exclusively for their images, so a prohibition on commons means (for example) a prohibition on Es wiki, even though most Eswikipedians are not active in the commons community. This relationship works because of trust which the commons community has built over the years. Part of that trust is that commons avoids making major changes with great haste and works with projects to fix issues when hasty acts do cause issues.
Commons itself is highly multi-cultural. While commons does have a strong organizing principle (which is part of why it has been a fantastic success on its own terms where all other non-wikipedia WMF projects are at best weakly successful), that principle is strongly inclusive and mostly directs us to collect and curate while only excluding on legal grounds and a few common areas of basic human decency— it's harder to create any kind of cross cultural agreement on matters of taste. Avoiding issues of taste also makes us more reliable as an image source for customer projects.
I think that a near majority of commons users think that we could do with some reductions in the quantity of redundant / low quality human sexuality content, due to having the same experience I started this message with. Of that group I think there is roughly an even split between people who believe the existing "educational purposes" policy is sufficient and people who think we could probably strengthen the policy somehow.
There are also people who are honestly offended that some people are offended by human sexuality content— and some of them view efforts to curtail this content to be a threat to their own cultural values. If this isn't your culture, please take a moment to ponder it. If your personal culture believes in the open expression of sexuality an effort to remove "redundant / low quality" sexuality images while we not removing low quality pictures of clay pots, for example, is effectively an attack on your beliefs. These people would tell you: If you don't like it, don't look. _Understanding_ differences in opinion is part of the commons way, so even if you do not embrace this view you should at least stop to understand that it is not without merit. In any case, while sometimes vocal, people from this end of the spectrum don't appear to be all that much of the community.
Of course, there are a few trolls here and there from time to time, but I don't think anyone really pays them much attention. There are lots of horny twenty somethings, but while it might bias the discussions towards permissiveness I don't think that it really has a big effect beyond the basic youthful liberalism which exists everywhere in our projects.
There are also a couple of occasional agitators calling for things like a complete removal of sexuality content. Most of them fail to sound reasonable at all— demanding the removal of old works of art, basic anatomy photos... I think these complaints are mostly ignored.
... and a majority of people who either don't care or don't speak the languages the discussions are held in.
"The goal"
Considering the landscape, how do we solve the problems?
Lets take a category of Penis images as an example. Load it up. Hundreds of penii. Pretty shocking. We can obviously cut back on this, right? How many penis images do we really need to meet the mission of the Wikipedias? (and then we need to consider the more expansive mission of commons in educating through media).
Well, we ought to have circumcised, and uncircumcised. Flaccid and erect. An example of each kind of penis jewellery that has a WP article in some language. An example of every disease with penis-visible symptoms.... We're easily at 50-100 images already. People seem to think we also need many of the prior samples from multiple races to demonstrate the (lack of) differences. Add a little further inflation because editorial preferences on the Wikipedias will differ.
So on the basis of meeting the Wikipedia's need alone, we're up to hundreds of pictures of penises. Now— commons' hundreds are not so diverse, we need fewer of some kinds and more of others, but in terms of the sheer count even before considering commons' own educational remit we still need a bunch.
Where does this place us in terms of our problem statements? Well, With hundreds of pictures in the category it will be easy to cast commons as a penis palace. Thus, in terms of this class of images— problem (1) is probably unsolvable given our educational mission. If someone wants to point to the category and inspire the "Oh my god; it's full of cocks" response, they can...
Virtually all libraries and schools that block internet sites employ categorical blocking software. They block broad categories like "Drugs, weapons, nudity, pornography, and proxy evasion". All of the Wikimedia projects could be blocked under all of these categories. Even a highly educational penis is still nudity— these filtering services are often criticized for blocking information on breast exams, for example. Because of the way the blocking happens reducing the number of penis images to the educational minimum would not likely reduce the incidence of blocking in any material way. So problem (2) seems to be unsolvable given our educational mission.
I think we could make some improvement with problem (3). The privacy issues can also be addressed by using images without visible faces (which are often perceived to be more prurient, unfortunately).
Ironically— the commercial pornography industry has been pretty happy to supply us with images which we are quite sure are legal and without privacy problems. But accepting these images heightens the perception that commons is promoting pornography rather than merely hosting educational resources.
The prevalence of commercial sex images reflects the result of prior attempts to avoid child images and images created without the model's consent, though I don't think the consequence was expected. As a checkuser (with OTRS access) I can't say that I've seen evidence of abuse by commercial porn providers: Wikimedians are going to them.
Although, _obviously_ problematic images are regularly and easily deleted without dispute. I've nuked a few from orbit and never hit the slightest bit of resistance. Though the community also has no reason to distrust my claims that an image is inappropriate, other people may get different results.
Now how would we draft such a policy to further improve things?
We need a policy which can be easily understood by many languages and cultures, which improves the situation but doesn't provide a basis for other censorship (e.g. some would have us remove all likenesses of Muhammad, images of women without veils, historical offensive political cartoons and symbols, etc). Actually be enforceable in the face of incomplete information from uploaders, without the risk of too much 'taste' and the resulting instability for customers. I'm at a loss. I have no suggestion beyond preferring illustrations rather than photos (which we already do), and accepting images contributions commercial sources, which is bad for our image. This seems really hard.
Now pull in the part of the landscape that I didn't mention: Commons has almost five million images. The deletion spree which was operated completely without regard to the community process was described as an "almost complete cleanup" removed fewer than 500 images— or about 0.008% of the collection.
At this point in my reasoning I inevitably conclude (1) The problem was far less bad than my initial impression. (2) At _best_ we can't solve much of the problem without accepting aggressive censorship of our coverage, both text and images (3) The part we could improve is pretty hard to improve. (4) There are more important things to work on.
None of this really depends on any difficulty coming from governance. Even as supreme ruler for a day I couldn't solve this one satisfactorily.
The initial surprise is enough that I've gone through this cycle several times now, but I keep reaching the same conclusion. I expect the same is true for many other contributors.
... and outside of some agitation from people pushing for the unachievable like "school safeness", and some popular troll-nest message boards, troll-nest 'news' agencies, and a somewhat trollish ex-nupedian, I haven't seen a lot of evidence that these 0.008% are suddenly in need of a major effort. I can promise you that a far greater proportion of our works are misleadingly labelled, outright spam, egregious copyright violations, potentially carrying hidden malware, etc.
Feedback from the board that such an effort is desired from the board would certainly help shift the priorities— it would also give us some excusability for disruption to our customer projects.
But this isn't what we got at all. The clear _consensus_ among the commons community and many of our customers is that what we what we got was disruptive, under-informed, and damaging to our internal governance. We now faction lines have been drawn between the couple of commons users aligned with Jimmy and the (literally) hundreds of users opposed the methodology used here and the specifics of some of the deletions. There is no active discussion about making an improvement, our customers are discussing creating chapter operated forks of commons free from this kind of disruptive intervention which is perceived by many to be overt values based censorship. Many other messages have expressed the complaints in greater detail.
I hope this has provided some useful background and that it will foster improved communication on the subject.
foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
I refuse to believe you could read that novel and respond intelligently in 41 minutes. I'm still waiting for the CliffsNotes version.
^_^
-Jon
Hello Gregory,
thank you very much for this mail; it is very informative and reasonable. May I forward it to the board mailing list? I think it is a piece of opinion and insight that the board needs, and not all board members read the Commons mailing list.
I also have a suggestion regarding the dilemmas you illustrated with your penis example. It is certainly a premature consideration, and maybe it is also flawed and not practicable. The suggestion is that we put a number limit on images in a category; I would suggest this for all categories, not only sexuality-related ones. If new images are added to a category that has already met its limit, it should be stated what value they bring that is not already there, for example that an image depicts a disease or has a higher resolution. The same goes for other categories: if, let's say, we already have hundreds of panoramas of the Rhine river in the city of Mainz, then I would add no value to our repository by uploading another panorama depicting the same scene. But if it was taken during a festival, or at a time of day that is not yet depicted, or during a particular weather phenomenon, then it would add value to the gallery.
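To make the idea concrete, here is a rough sketch of how such a limit check might look. This is purely illustrative: the function, the field names, and the limit value are all invented, and nothing like this exists in MediaWiki today.

CATEGORY_LIMIT = 200  # example cap; the real number would be set by policy

def may_add_to_category(new_image, existing_images):
    """Allow an upload into a full category only with a stated rationale."""
    if len(existing_images) < CATEGORY_LIMIT:
        return True  # the category has not yet reached its limit
    # Past the limit, the uploader must state what new value the image
    # brings: a disease not yet depicted, a higher resolution, a festival,
    # an unusual time of day, a weather phenomenon, and so on.
    return bool(new_image.get("added_value_rationale"))

# Example: a 201st Rhine panorama with no stated rationale would be rejected.
print(may_add_to_category({"title": "Mainz_panorama_201.jpg"}, ["img"] * 200))  # False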
Greetings Ting
On 9 May 2010, at 12:15, Ting Chen wrote:
If, let's say, we already have hundreds of panoramas of the Rhine river in the city of Mainz, then I would add no value to our repository by uploading another panorama depicting the same scene. But if it was taken during a festival, or at a time of day that is not yet depicted, or during a particular weather phenomenon, then it would add value to the gallery.
I've isolated this from the rest of the email, as it touches on a much broader issue, one which it seems I disagree with Ting on. Every image is different - at the very least it will have different lighting and a different perspective - which means that it may be more useful in any given situation than another (remember that it's not just Wikipedia using the images, or even just Wikimedia projects). It may also be of technically better quality, given that technology improves over time. There's rarely a downside* to posting an image onto Wikimedia Commons, even if the upside is not as great as for a picture showing something completely different from what we already have. There is always value.
There is also a big downside with this sort of exclusion: you have to turn around to people and say "thanks for releasing your reasonably good image, but we don't want it." That's not only somewhat unpleasant to do, but it also puts people off making other images available that might be more useful.
To me, this leads into the difference between categories and galleries/pages, though - the latter should be much more selective and shouldn't include very similar photographs.
* cultural sensitivities aside, there are also worries about the impact of an increased workload on the Wikimedia community and strain on the servers, but not large ones at the current time IMO.
Mike Peel
Michael Peel wrote:
There is also a big downside with this sort of exclusion: you have to turn around to people and say "thanks for releasing your reasonably good image, but we don't want it." That's not only somewhat unpleasant to do, but it also puts people off making other images available that might be more useful.
That's happening already: there are users who don't seem to understand the formalities of uploading files (including a description, source, author, and licence/permission) and whose uploads get tagged for lacking the required information. After a little while, they "quit" contributing out of frustration. Of course, failing to understand how to properly upload files may be your own fault, but the principle of frustration still applies. If any of the exclusion mentioned in the quote ever gets implemented, this user-frustration problem would likely grow much worse.
To me, this leads into the difference between categories and galleries/pages, though - the latter should be much more selective and shouldn't include very similar photographs.
That's how I've always known the system on Commons to be—and it should stay that way. I think this exclusion proposal stems from the fact that there are many more categories than galleries/pages. However, the lack of gallery pages in some content areas is justified where categories already do a good job of organisation.
* cultural sensitivities aside, there are also worries about the impact of an increased workload on the Wikimedia community and strain on the servers, but not large ones at the current time IMO.
For those who don't know or have forgotten: deletion only hides the content from regular and anonymous users.
-- C Li (User:O)
On Sun, May 9, 2010 at 1:15 PM, Ting Chen wing.philopp@gmx.de wrote:
I also have a suggestion regarding the dilemmas you illustrated with your penis example. It is certainly a premature consideration, and maybe it is also flawed and not practicable. The suggestion is that we put a number limit on images in a category; I would suggest this for all categories, not only sexuality-related ones. If new images are added to a category that has already met its limit, it should be stated what value they bring that is not already there, for example that an image depicts a disease or has a higher resolution. The same goes for other categories: if, let's say, we already have hundreds of panoramas of the Rhine river in the city of Mainz, then I would add no value to our repository by uploading another panorama depicting the same scene. But if it was taken during a festival, or at a time of day that is not yet depicted, or during a particular weather phenomenon, then it would add value to the gallery.
Definitely a point I agree with; in fact, it is something I have been saying since the beginning of Commons: we should not want yet another picture of the Tower of Pisa, but if it is a picture from an unusual angle, or a particularly good picture, or one at another time of day, then we do want it.
I think we should not restrict this to new images. Rather, I would like to see people go over (connected) groups of categories, recategorize the files that are there, and in this process also propose for deletion the images that add no additional value. Only files that are in use on a project should be fully exempt, but of course almost anything could be a valid objection against such a deletion.
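As an aside, the "in use" exemption is easy to check mechanically. Here is a minimal sketch, assuming the GlobalUsage extension's API on Commons (prop=globalusage); treat the details as illustrative rather than as a polished tool:

import json
import urllib.parse
import urllib.request

def is_in_use(filename):
    """Return True if a Commons file is used on any Wikimedia project."""
    url = ("https://commons.wikimedia.org/w/api.php"
           "?action=query&prop=globalusage&format=json&titles="
           + urllib.parse.quote("File:" + filename))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # Each page entry carries a (possibly empty) list of global uses.
    return any(page.get("globalusage")
               for page in data["query"]["pages"].values())

Files for which is_in_use() returns True would be exempt from any "no additional value" proposal.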
-- André Engels, andreengels@gmail.com
Any kind of limit on the number of files in one particular group of media (while not imposing it on another) would amount to censorship. However, other kinds of limits are mostly negotiable. For example, it is not the same if we have a poor-quality photo of some village in Papua New Guinea and a poor-quality photo of yet another penis.
On the other hand, as Jon mentioned in one thread and I in another, working on a software solution in accordance with Internet Content Rating Association [1] principles and methods should be fine: we should allow our users to choose which categories of images are offensive to them.
This is a perfectly flexible method, as it can deal with all kinds of taboos. We are dealing now just with "Abrahamic taboos" (sex and Muhammad images), but we haven't touched the taboos of many small cultures. By allowing users and editors to say "I don't want to see images of that sacral place", we would give all cultures an equal chance to protect their taboos. At the same time, Commons would be able to keep its own policies related to free content, image quality, etc.
[1] - http://en.wikipedia.org/wiki/Internet_Content_Rating_Association
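To illustrate the user side of such a preference system (only a sketch; no such feature exists in MediaWiki, and the names below are invented):

def visible_images(images, hidden_categories):
    """Hide every image tagged with a category the user has opted out of."""
    return [img for img in images
            if not set(img["categories"]) & hidden_categories]

# A user who opts out of sexuality images, but not of Muhammad images:
preferences = {"Sexuality"}
page_images = [
    {"title": "Example1.jpg", "categories": ["Sexuality"]},
    {"title": "Example2.jpg", "categories": ["Pottery"]},
]
print(visible_images(page_images, preferences))  # only Example2.jpg remains

Each culture's taboo then becomes just another category a user can hide, without Commons deleting anything.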