There have been a number of discussions on the English Wikipedia lately (sparked, of course, by the Virgin Killer image controversy) on the propriety of various images and the need for retaining them on Wikipedia. This is a problem that has a long history on Wikipedia, and a number of controls are in place - limited ability to post explicit images on new articles, some filtering of newly uploaded images to delete those that are obviously duplicative, exhibitionist, etc. Many comments we've had in the last few days concerned the legality of various images, particularly where consent is not demonstrated or verifiable. I've commented [1] that the legality issue shouldn't be a major concern for English Wikipedia editors, because the Foundation itself ought to have limited liability and the individual uploaders have primary culpability for any illegal images.
But I still think that there is a community issue here, and I wonder if someone can fill in the details on how we currently deal with it. How well is the Commons guideline COM:PEOPLE enforced with respect to sexual images? Do the many projects with separate image databases generally have similar guidelines? Does anyone know how well they are enforced? This past weekend, someone else and I were discussing examples of problem images, where the person in an explicit photograph is of questionable age.
I realized after a quick survey on Commons of image origins that many of the explicit images are sourced to a single Flickr account. The license of the images was verified closer to the time of upload, but since then the Flickr account has been deactivated. We have no knowledge of the consent of the photographed models, nor any mechanism for verifying their age, and many if not most of the images are unused on Wikipedia projects (which is true, I suspect, for many sexually explicit photographs in general). The whole category of images [3] was previously put up for deletion [2] but the discussion was closed in favor of individual image reviews, which I understand mostly closed as keep.
I don't think the Foundation itself can or should do anything about this issue in most cases, but I think the topic deserves some wider discussion and reconsideration - not necessarily as a response to the IWF debacle, but taking that as an opportunity to get a wider audience.
Of note is Jimmy's recommendation to the en.wp community (I assume, since it was posted there) for this sort of reconsideration. [4]
Nathan
[1] http://en.wikipedia.org/w/index.php?title=User_talk:Jimbo_Wales&curid=98...
[2] http://commons.wikimedia.org/wiki/Commons:Deletion_requests/Peter_Klashorst_...
[3] http://commons.wikimedia.org/wiki/Category:Peter_Klashorst
[4] http://en.wikipedia.org/w/index.php?title=User_talk:Jimbo_Wales&curid=98...
Hello Nathan,
Although I don't consider myself an active member of the Commons community, I am surely a heavy user of it :-), and I agree with you that we should re-evaluate these images.
As for other Wikipedia language versions: as far as I know, on my home version, zh-wp, there are no such images. We handle fair-use images there far more restrictively, and many people keep an eye on them. We also delete non-free images very quickly when they are not, or are no longer, in use, so I think this problem is smaller there. But we also have free images that are not in use, which may have been uploaded years ago and which no one knows about anymore. So I would encourage the administrators there to re-examine all images that are classified as free.
Ting
Hi,
I believe that we have a lot of images from Flickr with sexual content, and there is no way to make sure that the person in the photo agrees with it being on Commons or with the license it is under.
I have tried to nominate images like that for deletion. I can say they were all kept; the main reason given was that the image is free, so we can have it.
I believe that images with sexual content have to be checked: do we need it? Is it really free? Isn't there another option besides an image?
We have more than one category of nude male or female images, and most of them are not in use on any project. I don't think we need the images with very young people in them.
Huib
Oh boy in comes the political correctness brigade .....
Actually, I don't care whether the image has sexual content or not. There are some points we should consider:
First, I don't trust all the claims on Flickr. Second, there may be content that violates personality rights or raises other legal issues.
Some of the images were uploaded years ago, and at that time we had different review criteria than we do today, so I think a re-examination should be done, entirely unrelated to the content of the images.
Ting
I don't think it's helpful or useful to classify images that aren't currently being used in an article somewhere as second class, or more readily deletable. There are, I think it safe to say, TONS of images on Commons that aren't being used anywhere. So what if we have male nudes far in excess of what would ever need to be used in one article? The point of Commons isn't to be a hosting substitute for Wikipedia's articles; it is to be a repository of free images. For most purposes, people will only need one image out of a group, but offering a variety from which they can choose can only be beneficial.
If the free-ness of an image can be reasonably disputed, fine, go ahead and delete it, but don't start setting up separate standards for deletion based on an image's use.
FMF
Also, it's probably worth pointing out that most of the people here ultimately seem to be urging a re-examination of Flickr-licensed images in general, not so much specifically sexual ones.
FMF
It's also worth considering hypothetical books at Wikibooks or courses at Wikiversity that teach the art of nude portraits, for which a wealth of such images would be needed as examples. A simple search on Amazon for "nude photography" returns many such books [1]. Just because the nudity-related articles on Wikipedia can't use all of these types of images doesn't mean that they are useless to our projects.
Obviously non-free images are a different topic entirely, and if these images are unacceptable for other reasons then they should be handled accordingly. However, deleting an image just because it is not currently used at Wikipedia is awfully short-sighted.
[1] http://www.amazon.com/s/ref=nb_ss_gw?url=search-alias%3Daps&field-keywor...
--Andrew Whitworth
I can imagine free books and manuals about sex, too. Sexual education is a part of modern education, and it is a very important issue in the development of personality; actually, much more important than quantum mechanics.
Besides illustrations of sexual positions, such books should include illustrations of pornography, too, because it is an important part of contemporary sexuality.
On Wed, Dec 10, 2008 at 7:22 AM, David Moran fordmadoxfraud@gmail.com wrote:
I don't think it's helpful or useful to classify images that aren't currently being used in an article somewhere as second class, or more readily deletable. [...]
Considerations of personal privacy don't apply to pictures of fruit or airplanes. Images of identifiable people posing are intrinsically different and deserve to be treated with greater skepticism.
If you don't like a use standard, I'd be happy to accept an OTRS standard for identifiable nudes, but I do think we need to recognize that not all images have equal impact. Is it useful to have 500 poorly documented pictures of naked women? Maybe. Is it harmful to have one image inappropriately uploaded by an angry ex-boyfriend? Absolutely. If we can help prevent the latter circumstance by reducing the number of poorly documented (and often unused) nude photos on Commons, then I am all for it, regardless of how you want to approach it.
Perhaps because I suggested "use" as a limitation, you misunderstood my goal. My intent is to prevent the misuse of Commons to store and distribute images inappropriately, by which I mean images not authorized for distribution by all the parties involved. This is an area where I think we would lose little if we removed images we aren't using (speculations about sex manuals notwithstanding), but if you want to take different steps to minimize inappropriate use then by all means suggest what they should be.
-Robert Rohde
I think the first requirement would be a convincing demonstration that "inappropriate use" of sexual imagery on Commons is in fact a problem, before we start crafting deletion policies to deal with it.
FMF
On Wed, Dec 10, 2008 at 11:10 AM, Robert Rohde rarohde@gmail.com wrote: [snip]
Considerations of personal privacy don't apply to pictures of fruit or airplanes. Images of identifiable people posing are intrinsically different and deserve to be treated with greater skepticism.
If you don't like a use standard, I'd be happy to accept an OTRS standard for identifiable nudes, but I do think we need to recognize that not all images have equal impact.
We do: http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people
Though I'd generally support strengthening those terms with respect to nudity or sexual situations... in practice we already apply a stricter standard there, though it's not so clearly expressed on that page.
Is it useful to have 500 poorly documented pictures of naked women? Maybe. Is it harmful to have one image inappropriately uploaded by an angry ex-boyfriend? Absolutely.
LonelygirlUk. "Oh yes, that's me - I consent to being naked on the internet."
We're kind of screwed with respect to your hypothetical, but we should still do due diligence.
[snip]
Perhaps because I suggested "use" as a limitation, you misunderstood my goal. My intent is to prevent the misuse of Commons to store and distribute images inappropriately, by which I mean images not authorized for distribution by all the parties involved.
This case is particularly relevant because many other "upload whatever you want" sites won't accept nudity, or hide it behind account registration. The only site that comes to mind that will is eroshare, but your revenge image surrounded by hardcore porn may not be the effect you were trying to achieve.
[snip]
This is an area where I think we would lose little if we removed images we aren't using (speculations about sex manuals notwithstanding), but if you want to take different steps to minimize inappropriate use then by all means suggest what they should be.
The usefulness of most of our human sexuality images for such purposes is far from indisputable. A good sex manual would seek to maximize its educational value by minimizing the unnecessary shock and perception of prurience of its images. This is a major factor in why EnWP has historically used drawings for most sex position articles.
I'm all for aggressively defending material which serves an educational purpose. But Commons is not a porno gallery. Commons:Scope is quite clear on our mission. We may include sexually related imagery, but only in an effort to fulfill our mission.
On Wed, Dec 10, 2008 at 11:21 AM, David Moran fordmadoxfraud@gmail.com wrote:
I think the first requirement would be a convincing demonstration that "inappropriate use" of sexual imagery on Commons is in fact a problem, before we start crafting deletion policies to deal with it.
Proof made harder by the fact that we're already fairly aggressive in deleting the most inappropriate activity. :)
On Wed, Dec 10, 2008 at 8:25 AM, Gregory Maxwell gmaxwell@gmail.com wrote: <snip>
LonelygirlUk. "Oh yes, that's me - I consent to being naked on the internet."
We're kind of screwed with respect to your hypothetical, but we should still do due diligence.
<snip>
Of course, the LonelygirlUK images were eventually (significantly after the fact) identified as belonging to a particular porn star. So while they were copyvios (unless the porn star was actually the uploader?!), they probably can't be said to have caused any harm to the woman in question.
-Robert Rohde
On Wed, Dec 10, 2008 at 1:43 PM, Ting Chen wing.philopp@gmx.de wrote:
Actually, I don't care whether the image has sexual content or not. [...] First, I don't trust all the claims on Flickr. [...]
In the last year or so, Commons got at least two verifiable sources of photos of nudity via Flickr. One is by a known author; the other is by a group which gave permission via OTRS. If I remember correctly, those two sets make up most (or at least a relative majority) of the categorized nudity images on Commons.
Sorry, I wrote my last mail in haste and didn't explain it very well.
First of all, I am not very worried about images on Commons; I believe some re-examinations have already been done there. I am more worried about images on the local projects. Take the example of my home project, zh-wp. We have images that were uploaded in 2003, and at that time no one really cared about them. Later, User:Shizhao mostly did the examination alone for quite a long time. We only really began to watch image uploads and examine every uploaded image in the last one or two years. If someone labeled an image he uploaded as GFDL before that time, we normally didn't look at it again. When I say re-examine, I mean these images. I don't know about the other projects, but I can imagine that at least some of them are in a similar situation.
I mentioned unused images not to discriminate against them, but because these images are rarely seen after they are uploaded and are therefore less likely to have been re-examined since. Only in that respect did I mention the unused images.
Ting
I wouldn't mind a standard that said that identifiable, contemporary nudes (i.e. images with faces showing which aren't decades old) would be deleted if they aren't being used on any Wikimedia project. There is a non-trivial risk of harm if we simply allow unlimited inclusion of photos that under normal circumstances would usually be considered private, and when the photos aren't in actual use I think respect for that risk should usually outweigh the consideration that they might possibly be useful some day.
-Robert Rohde
I disagree that we should have different standards for media containing nudity and sexuality. Sexuality is an important educational subject. One of the most important, as another poster pointed out. On Wikipedia alone, one would expect a range of articles on different issues relating to sexuality and nudity which would be illustrated where possible. Commons isn't simply a dumping ground for Wikipedia articles though, and also functions as a free media repository.
To treat media differently because it contains nudity or sexuality is to allow our own biases and tastes to influence content. To exclude such media because it offends our tastes is not neutral or unbiased. These are legitimate topics that need to be illustrated and demonstrated as much as any other topic.
I don't think what we're discussing is taste. Quite apart from the issue of taste and values is the issue of doing harm to the subject of our content. The potential for harm in a sexually explicit photograph is much higher than for almost any other class of content that comes to mind. With these images the notions of consent and age become very important, and while the COM:PEOPLE guideline on Commons addresses this in a very broad way, there seems to be room for improvement and tightening in the control of this sort of content across Wikimedia projects.
Educational use is certainly to be allowed and encouraged - sexual manuals, artistic manuals, etc. are valid uses of Wikimedia projects and the accompanying images have their place on Commons. But there is no need to have unlimited images of the sort that theoretically could be attached to these projects, when these images present their subjects and our community with an array of problems.
In an ideal world, all nude images on Commons would require that the age and consent to publish of the model be verifiable. This would not be the same as barring nude images - indeed, it would explicitly permit the upload of these images to Commons while ensuring that we meet our responsibility to limit the potential for harm to living people.
David Moran mentions that we should be sure there is a current problem before working on a solution. This is a valid position, but there are problems with that approach. The Commons project is quite obscure to the wider world, so we simply can't rely on those who might be offended by images of themselves to contact Commons or OTRS. Commons, and the other projects, don't appear to have systematic procedures for requiring that the consent and age of models be verifiable. The lack of these procedures means that the extent of any current problem is unknown - we may have explicit images now that were published without the consent of the subject (I suspect it's quite likely that we do), or with a subject beneath the age of consent. But because we don't check, and we don't require uploaders to provide such information, we don't know.
Nathan