This is an NSFW photo: http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Korean_Vulv...
Five for deletion, two for keep. This is its third nomination.
An admin came in today and declared it kept because "No valid reason for deletion, per previous decisions. Person is not recognizable." It had been nominated twice before, by anon IPs who simply declared "porn" or "obscene" as the deletion reason (not enough of a reason).
I nominated it, like I do many things, because it was unused on any project since its upload in March of 2009, it's uneducational, and the poor description proves that. I also think it's poor quality - if we need an "educational photo of a vulva" we have two really fab ones on the [[vulva]] article. Which of course was argued (a nude photo of a headless woman blow drying her hair in heels with the blow dryer cord and shadow in the shot.. come...on...), and as FloNight noted, we can probably have some high quality photos of a nude woman using a blow dryer that aren't taken in the bedroom for the project..if it's that in demand. http://commons.wikimedia.org/wiki/File:Korean_Vulva2.jpg
I shouldn't even act surprised...I guess.. :-/
Were the reasons we provided not valid enough? Can you even challenge something like this? Did I miss something? Am I doing this wrong? Regardless of the subject, I don't understand why the admin would declare people's reasons invalid, based on my knowledge of the Commons policies: "Commons is not a porn site", "private location, lack of model release", etc...
(And yes, I was a little snappy on my nomination (this was my original rager when I nominated a bunch of stuff from the "high heels" category..)...so no need to reprimand me....I've curbed my 'tude!)
Any help would be great,
Sarah
On Sun, Sep 11, 2011 at 14:53, Sarah Stierch sarah.stierch@gmail.com wrote:
(And yes, I was a little snappy on my nomination (this was my original rager when I nominated a bunch of stuff from the "high heels" category..)...so no need to reprimand me....I've curbed my 'tude!)
I love your 'tude. Don't let them kill it. :)
Sarah
He is correct that a proposal to do something must cite a valid reason, as must comments if they are to be considered when closing a matter.
An image where the person is not identifiable doesn't require their permission. In that he is correct.
I agree it is a particularly poor image that shows nearly nothing of educational value. However, the reason you cite, pornographic, does not seem to apply; she is just drying her hair. He does admit "the amount of naked women on Commons is a bit ridiculous".
Perhaps that issue should be addressed as a policy discussion.
Fred
I left Yann a message on his talk page asking him to reconsider.
http://commons.wikimedia.org/wiki/User_talk:Yann#Korean_Vulva
I sincerely hope that she did give consent and knows that it is on Commons. Otherwise we are exploiting her.
I disagree that the person is not recognizable. It would be very unethical to upload this image without this person's consent. True exploitation of the person.
I feel very strongly about this point because of my knowledge of past exploitation of people in medical images in textbooks and medical journals, some of them nude. It was absolutely wrong when it was done in the name of education, and it is wrong for us to do it now.
Sydney Poore User:FloNight
-- GLAMWIKI Partnership Ambassador for the Wikimedia Foundation http://www.glamwiki.org Wikipedian-in-Residence, Archives of American Art http://en.wikipedia.org/wiki/User:SarahStierch and Sarah Stierch Consulting
*Historical, cultural & artistic research & advising.*
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
See the standard for medical images from the American College of Medical Genetics (ACMG):
http://www.acmg.net/resources/policies/pol-020.pdf
I worked with people with high-risk pregnancies, and sometimes we took pictures of the baby if it had a genetic disorder. But we always got consent first.
Sydney
In response to Sydney's post..
Having worked in the photography industry (and been forced in front of a camera a few times in my day..) as a consultant and a make-up artist (10 years in that industry), I've written, signed, had others sign, and dealt with model release forms a million times over. Here is a nice standard breakdown of that from the NYIP:
http://www.nyip.com/ezine/techtips/model-release.html
If we require permission for use via OTRS, I don't know why we can't have a "model release" requirement incorporated for sexual/nude photography, modeling photography, and studio photography. For materials used for educational purposes, as Commons is supposed to be, this shouldn't be too hard. I haven't thought too hard about it yet, but it is possible.
There of course comes the question of grandfathering in content, and Flickr. The strange thing about all this Creative Commons stuff on Flickr is that most people *don't* release photographs of their friends, naked partners, or themselves to be used freely by the world under CC-BY/CC-BY-SA. So it's always really hard for me to trust Flickr accounts that release photos of naked people for free use without some type of release statement on their page. I don't even release photographs of my friends under CC-BY (and if I did, I'd have permission), except at Wikimedia-related events, and even then I (generally) have to ask people if it's okay to post their photo.
There is also the idea of a warning that is more amplified. One could ask the uploader if it's questionable content they're uploading (or perhaps we can have some fancy Commons thing that "scans" the image for certain body parts, styles or actions) to make sure they really want to do that. We've had two "teenagers" (a 13- and a 14-year-old) recently request that photographs of their lower halves in their underwear be removed from Commons. These presumed children uploaded photos of themselves, probably to be sexy and voyeuristic (like so many of us growing up in the digital age have explored), and then went "OH GOD NOOOOOO" a few days later.
The age is bad enough, but...plenty of people go "Ok please delete my crotch from Commons" often enough.
This brainstorm features:
- Model release form combined with OTRS
- Commons nekkid parts sensor (i.e. like face recognition but for boobs, penises, vaginas, doggie style, whatever)
- Alert for uploaders with sexual content making sure they want to do it
- And I'll throw in a review of Flickr policy.
Sarah
On Sun, Sep 11, 2011 at 8:55 PM, Sarah Stierch sarah.stierch@gmail.com wrote:
If we require permission for use via OTRS, I don't know why we can't have a "model release" requirement incorporated for sexual/nude photography, modeling photography, and studio photography. For materials used for educational purposes, as Commons is supposed to be, this shouldn't be too hard. I haven't thought too hard about it yet, but it is possible.
I've been advocating for this for several years (check the archives of Foundation-l), but there's never been very much support - and none at all on Commons. Even the Board resolution only requires an "affirmation" from the uploader that the subject gave consent.
Nathan
Thanks, FloNight (Sydney) and Fred, for your thoughts.
Fred, when I posted the original deletion request it was based on Commons deletion policies - uneducational, poor quality, orphaned, personal photo, and unnecessary nudity (i.e. a tasteless nude shot that the guy "claims" he took of this woman). How can we even trust that the uploader didn't take it from an online personals website, Craigslist, or perhaps an email from a dating website - who knows. (Slippery slope..OoooOOOooh!!!)
Policy does need to be reexamined. I think Pete brought this up, and we've talked a bit about it. I do think that Commons has policy changes that need to be looked at - and I wholeheartedly believe this photo, and many others, qualify even under the current Commons policies, but I am "viewing them in my own way" just like those who support keeping those images view them their own way.
One thing Wikimedia as a whole *suffers* from is no "solidity" when it comes to policy and rules. Everything seems like it can be adapted, broken, changed, manipulated, etc. I think that's a problem.
Thanks Sydney for bringing it up with the admin. I have to admit, Commons does make me anxious (I'm so paranoid about backlash and harassment from Commons, after the last shit storm that I started that was forwarded to the Commons-L list) so I appreciate you speaking up about it!
Sarah who really needs to stop letting a bunch of dudes behind computers piss her off so much. ;)
On 12/09/2011 02:43, Sarah Stierch wrote:
One thing Wikimedia as a whole /suffers/ from is no "solidity" when it comes to policy and rules. Everything seems like it can be adapted, broken, changed, manipulated, etc. I think that's a problem.
Absolutely. I think in this case the real troublemaker is the admin, and the original poster is almost an innocent boy trying to post something he deems erotic or daring. By the admin's behaviour we see that the original poster is almost encouraged to behave like a bad little boy.
It is obvious that a photo of the vulva should show the vulva. If the admin doesn't understand that then he is hopeless and must go back to high school for several years. He is certainly not scientifically literate enough to hold a position on Wikipedia.
You don't have to discuss with an admin who doesn't understand that a photo of an organ must show the organ.
You don't have to discuss with an admin who doesn't understand that photos of anatomy should be as devoid of erotic content as possible.
Democracy should not go so far as to negotiate with total incompetence.
Either this admin is really stupid, and should never have made it to his position in WP, or he is being perverse with the vulva page.
I find it very difficult to believe that a person literate enough to make it to the position of admin on WP would be illiterate enough not to understand that a photo named vulva on the vulva page should show a vulva, and should avoid any evocation of private-life promiscuity.
On Mon, Sep 12, 2011 at 4:32 AM, Arnaud HERVE arnaudherve@x-mail.net wrote:
On 12/09/2011 02:43, Sarah Stierch wrote:
One thing Wikimedia as a whole *suffers* from is no "solidity" when it comes to policy and rules. Everything seems like it can be adapted, broken, changed, manipulated, etc. I think that's a problem.
Absolutely. I think in this case the real troublemaker is the admin, and the original poster is almost an innocent boy trying to post something he deems erotic or daring. By the admin's behaviour we see that the original poster is almost encouraged to behave like a bad little boy.
It is obvious that a photo of the vulva should show the vulva. If the admin doesn't understand that then he is hopeless and must go back to high school for several years. He is certainly not scientifically literate enough to hold a position on Wikipedia.
I agree that this image had many problems and keeping it does not really make sense. That is the reason that I asked the admin to review his decision.
You don't have to discuss with an admin who doesn't understand that a photo of an organ must show the organ.
You don't have to discuss with an admin who doesn't understand that photos of anatomy should be as devoid of erotic content as possible.
Democracy should not go so far as to negotiate with total incompetence.
Either this admin is really stupid, and should never have made it to his position in WP, or he is being perverse with the vulva page.
I find it very difficult to believe that a person literate enough to make it to the position of admin on WP would be illiterate enough not to understand that a photo named vulva on the vulva page should show a vulva, and should avoid any evocation of private-life promiscuity.
I know this administrator's work on several projects, and I don't think that is an accurate description of his work in general. He regularly closes deletion discussions, and will close discussions about sexual content as delete, as he did with some of the other images put up for deletion recently.
http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:April_after...
http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Hairpenis.j...
The reason that I see the issue with controversial content as a problem of systemic bias is that it has taken hold of WMF projects in general. If you look at the full body of his work, this admin truly is trying to follow policy and the customs of Commons and WMF projects in general. IMO, the policies need to be tweaked so that admins like him will have better policy to work with. And we need a broader group of people commenting in all deletion discussions so that we get a more globally representative view of what is appropriate for Commons to have on site.
Sydney
On 12/09/2011 12:18, Sydney Poore wrote:
If you look at the full body of his work, this admin truly is trying to follow policy and the customs of Commons and WMF projects in general.
Well, I might have been too quick in judging him, and besides idiocy or perversion, the reason for his behaviour might have been a complete lack of attention. To the point that he didn't even look at the photo, because if he did and still protected the photo, then I am back to the idiocy or perversity hypothesis.
Because, quite frankly, intentional or not, exceptional or not, what he has done here is an insult to plain common sense, and a clear, direct deterioration of WP content.
From the scientific point of view it is below the required level to even begin a discussion.
Imagine the page for Finger: should we even take time to discuss the propriety of a photo showing the forearm without the fingers? What would we think of an admin who protected a photo of the forearm without the fingers on the Finger page, after having been duly pointed to the obvious mistake by a user? Don't you think a user with normal self-respect would be right not to bother coming to Wikipedia any longer?
If you add the Asian-erotic content to that, you realize that the photo was totally inappropriate on so many levels that the problem doesn't lie in the photo anymore but on the admin.
IMO, the policies need to be tweaked so that admins like him will have better policy to work with. And we need a broader group of people commenting in all deletion discussions so that we get a more globally representative view of what is appropriate for Commons to have on site.
Yes, but as Sarah Stierch wrote today:
One thing Wikimedia as a whole /suffers/ from is no "solidity" when it comes to policy and rules. Everything seems like it can be adapted, broken, changed, manipulated, etc. I think that's a problem.
Adding rules or adding policies or adding commentators doesn't work if the admins don't show an adequate level of literacy, or if they use their position to manipulate the rules at their convenience.
In his discussion lock comment Yann says "Person is not recognizable". That is typical of illiteracy and bad faith: you add one correct detail to justify an otherwise totally and very obviously wrong decision. That is completely twisting the rules.
As a result we now have a scientifically totally irrelevant and plainly domestic-erotic photo on WP, which is explicitly protected by WP. The mistake is so obvious that no further rules will work if admins don't show a normal intention to respect the rules.
Re-read the discussion page. Is it normal that Sarah Stierch (Missvain) had to take time to write out the obvious in detail, and that she was not followed in the end? This is not fair; no grown-up literate person should be treated like that. Even if it is involuntary, Yann's decision is so wrong and so rude that it should seriously put his position as an admin in doubt.
http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Korean_Vulv...
On Mon, Sep 12, 2011 at 9:17 PM, Arnaud HERVE arnaudherve@x-mail.net wrote:
Well, I might have been too quick in judging him, and besides idiocy or perversion, the reason for his behaviour might have been a complete lack of attention. To the point that he didn't even look at the photo, because if he did and still protected the photo, then I am back to the idiocy or perversity hypothesis.
If I understand correctly, one of your concerns is the image name. An invalid name isn't a good reason to delete an image, as images can be renamed.
On 12/09/2011 13:25, John Vandenberg wrote:
On Mon, Sep 12, 2011 at 9:17 PM, Arnaud HERVE arnaudherve@x-mail.net wrote:
Well, I might have been too quick in judging him, and besides idiocy or perversion, the reason for his behaviour might have been a complete lack of attention. To the point that he didn't even look at the photo, because if he did and still protected the photo, then I am back to the idiocy or perversity hypothesis.
If I understand correctly, one of your concerns is the image name. An invalid name isn't a good reason to delete an image, as images can be renamed.
No you did not understand me correctly. You did not even begin to.
First, I did not mention the image name in the paragraph of mine that you quote as saying I am concerned about the image name.
I don't think I have mentioned it elsewhere either. Although obviously, to a person with a normal command of English, finding such a fishy name, which sounds so much like "get young Asian meat", would already trigger the red alarm, and the alarm would be fully confirmed by seeing the image.
Then, it is not only me here.
You understand there are thousands of highly trained professional women in the medical sector who could participate. Now, telling them that their contribution can be deleted at any time by a 14-year-old boy who happens to pass by is already - from their point of view - obnoxious enough. But then if their basic intention to keep images showing the appropriate organ is not even respected by the admins, then they MUST fly away from WP.
Come on, what could be simpler than keeping specific photos of the relevant organ on the dedicated organ's page?
What is difficult to understand about avoiding erotic content on anatomy pages?
Why mention renaming an image, when the image itself is so completely unacceptable?
PS: Sorry, I am angry today: I am trying to quit smoking.
On Mon, Sep 12, 2011 at 05:17, Arnaud HERVE arnaudherve@x-mail.net wrote:
IMO, the policies need to be tweaked so that admins like him will have better policy to work with.
Do we have specific Commons policies on voyeurism and invasion of privacy?
Sarah
No, not really. The assumption is toward the uploader having the appropriate permission if it appears to be an amateur image and it has no obvious signs of being a copyright violation. People have disagreed about whether images that are "controversial content" should be held to a higher level of scrutiny. Some people say that we are being biased if we require a higher level of scrutiny for images of naked people. I disagree, but I think that we really need to have a higher level of scrutiny for all images with identifiable people. By requiring model consent, we would solve a large part of the problems with the images on Commons.
Sydney Poore
I wonder whether it would be worth developing a guideline, or just writing an essay about it on Commons. Trouble is, I know so little about how the Commons works -- I don't even know how to find their list of policies.
My thinking is that voyeurism is increasingly becoming a criminal offence, and an essay about it might help to identify the kinds of images we should be wary of uploading. For example, in the UK, a person commits a criminal offence if:
"(a) he records another person (B) doing a private act,
"(b) he does so with the intention that he or a third person will, for the purpose of obtaining sexual gratification, look at an image of B doing the act, and
"(c) he knows that B does not consent to his recording the act with that intention."
http://www.legislation.gov.uk/ukpga/2003/42/section/67
The problem with all of this on Wikimedia is the anonymity factor. People could say "I am the model and I hereby give consent." I don't know how we get round that.
Sarah
On Mon, Sep 12, 2011 at 05:45, Sydney Poore sydney.poore@gmail.com wrote:
No, not really. The assumption is toward the uploader having the appropriate permission if it appears to be an amateur image and it has not obvious signs of being a copyright violation. People have been in disagreement about whether images that are "controversial content" should be be held to a higher level of scrutiny. Some people say that we are be biased if we require a higher level of scrutiny for images of naked people. I disagree, but think that we really need to have a higher level of scrutiny for all images with identifiable people. By requiring model consent, we would solve a large part of the problems with the images on Commons.
Sydney Poore
Sydney, I completely agree with your opinion and comments.
http://commons.wikimedia.org/wiki/Commons:IDENT
http://commons.wikimedia.org/wiki/Commons:SCOPE
Roberta
2011/9/12 Sarah slimvirgin@gmail.com
I wonder whether it would be worth developing a guideline, or just writing an essay about it on Commons. Trouble is, I know so little about how the Commons works -- I don't even know how to find their list of policies.
My thinking is that voyeurism is increasingly becoming a criminal offence, and an essay about it might help to identify the kinds of images we should be wary of uploading. For example, in the UK, a person commits a criminal offence if:
"(a) he records another person (B) doing a private act,
"(b) he does so with the intention that he or a third person will, for the purpose of obtaining sexual gratification, look at an image of B doing the act, and
"(c) he knows that B does not consent to his recording the act with that intention."
http://www.legislation.gov.uk/ukpga/2003/42/section/67
The problem with all of this on Wikimedia is the anonymity factor. People could say "I am the model and I hereby give consent." I don't know how we get round that.
Sarah
WMF projects should be a leader in assuring that people's human rights are enforced. Right now WMF policies do not reflect best practices. But the WMF Board and staff are moving in the right direction.
The problem is that a large part of the community holds the idea of free speech as a higher value than protecting the rights of people who might be harmed.
The solution is more discussion, where people can be educated about all the ramifications of hosting controversial content, and also bringing more people into the community who hold a more moderate view about the importance of free speech and who will be better able to make balanced decisions when we must weigh all the differing ideals and ethical considerations.
There are some essays around, I think. I read one recently about hosting images of people. Another one would be good on the topic of voyeurism.
Sydney
On Mon, Sep 12, 2011 at 7:57 AM, Sarah slimvirgin@gmail.com wrote:
I wonder whether it would be worth developing a guideline, or just writing an essay about it on Commons. Trouble is, I know so little about how the Commons works -- I don't even know how to find their list of policies.
My thinking is that voyeurism is increasingly becoming a criminal offence, and an essay about it might help to identify the kinds of images we should be wary of uploading. For example, in the UK, a person commits a criminal offence if:
"(a) he records another person (B) doing a private act,
"(b) he does so with the intention that he or a third person will, for the purpose of obtaining sexual gratification, look at an image of B doing the act, and
"(c) he knows that B does not consent to his recording the act with that intention."
http://www.legislation.gov.uk/ukpga/2003/42/section/67
The problem with all of this on Wikimedia is the anonymity factor. People could say "I am the model and I hereby give consent." I don't know how we get round that.
Sarah
Especially when the images are scraped off the CC-BY and CC-BY-SA Flickr streams.
While many American states have enacted similar statutes, there has been no effort to criminalize the distribution of media created through a violation of them, which has never quite made sense to me. So on Wikipedia, we are also in an ethically gray area.
Along those lines, I direct your attention to:
http://commons.wikimedia.org/wiki/Category:Upskirt and http://commons.wikimedia.org/wiki/Category:Downblouse
While hardly all of those images are what I feared they might be (most don't really seem to depict the unintentional exposure of an unaware subject's private parts or underwear from an angle that suggests intentional use for that purpose by the photographer), there are some that I strongly doubt were taken with the subject's awareness, much less consent (although they don't show that much):
http://commons.wikimedia.org/wiki/File:Marcia_Imperator_back.jpg http://commons.wikimedia.org/wiki/File:Marcia_Imperator_legs.jpg
I also really don't think it's fair to the subject to categorize this picture as "upskirt"
http://commons.wikimedia.org/wiki/File:US_Open_2009_4th_round_258.jpg
The greater problem is, what do we do about the potential problem here? I think there is a real problem already with Flickr images ... Flickr doesn't bother to affirmatively screen submissions for copyright infringement, much less for whether they were taken or uploaded with the subject's consent, even if the subjects are unidentifiable. The former problem long ago reached the point where we've had to publish a whole page of Flickr users not to reupload from (http://commons.wikimedia.org/wiki/Commons:Questionable_Flickr_images#Flickr_users). Why would we not have such a list of Flickr users who might have uploaded nude images without the consent or knowledge of the subject?
And perhaps we ought not to presume a Flickr-sourced nude is ethically OK. Perhaps Commons policy ought to require that any image of a nude person or parts thereof transferred to Commons from Flickr come with evidence of consent to be photographed and allow such a photograph to be distributed under a free license. Perhaps any such media uploaded directly to Commons ought to require an OTRS-verified permission with such stated on the image page.
As it is, some of the images in those categories, even those of clearly identifiable people, don't even have the {{personality rights}} tag, the little legal protection we do try to offer.
Daniel Case
On Mon, Sep 12, 2011 at 11:01, Daniel and Elizabeth Case dancase@frontiernet.net wrote:
The problem with all of this on Wikimedia is the anonymity factor. People could say "I am the model and I hereby give consent." I don't know how we get round that.
Sarah
Especially when the images are scraped off the CC-BY and CC-BY-SA Flickr streams.
That was something I noticed the other day. An anon replaced the infobox image on Veganism with a close-up shot of a woman's genitals and a vibrator. I looked to see who had uploaded it and it said Flickr upload bot. So is there a bot that uploads all cc images from Flickr indiscriminately?
Sarah
Especially when the images are scraped off the CC-BY and CC-BY-SA Flickr streams.
That was something I noticed the other day. An anon replaced the infobox image on Veganism with a close-up shot of a woman's genitals and a vibrator. I looked to see who had uploaded it and it said Flickr upload bot. So is there a bot that uploads all cc images from Flickr indiscriminately?
Apparently a bot does the work, but a human has to ask for them.
http://commons.wikimedia.org/wiki/User:Flickr_upload_bot
Chris
And as a note - when you review the content that users are uploading using Bryan's bot, the MAJORITY of it is educational content. Nothing "questionable" or too controversial.
It seems the biggest problems come from freedom-of-panorama issues, nudity/porn, and celebrity images uploaded to Flickr with incorrect permissions/not the user's own work/copyright infringement/blahblahblah. It does appear that there are some users who just upload every bit of free content they can find outside of family photos.
Like Commons, anyone can upload to Flickr, but, unlike Commons, no one there reviews content for appropriateness and copyright correctness. I suppose we are one step ahead; it's just irritating when you come across an image's source and this is what you get:
http://www.flickr.com/photos/22186088@N03/4038072177/
Sarah
On Mon, Sep 12, 2011 at 1:59 PM, Sarah slimvirgin@gmail.com wrote:
the images are scraped off the CC-BY and CC-BY-SA Flickr streams.
That was something I noticed the other day. An anon replaced the infobox image on Veganism with a close-up shot of a woman's genitals and a vibrator. I looked to see who had uploaded it and it said Flickr upload bot. So is there a bot that uploads all cc images from Flickr indiscriminately?
People can set up bots to scrape. I do use a tool (Bryan's tool, http://toolserver.org/~bryan/flickr/upload) when I upload images I choose from Flickr that are CC appropriate. I guess there are bots that do scrape things if you choose a topic, theme, or search? I know Aude (Katie) runs a bot that scrapes cultural institution content sometimes. Perhaps someone else can elaborate.
While I support the use of technology, I also fear that people put so much trust into this technology they aren't aware of the lame content being uploaded. They love to reiterate that if the "bot approves it" it's okay and fine to be on Commons, but so much content that is pornographic in nature is often uploaded, bot approved, then the Flickr account is deleted. This is a rather broken approval process or system, IMHO.
I have said this before, and I'll say it again: Automation is good to a point. It's destructive to the community in many ways though: it removes personality and human touch, it removes human connection, empathy and awareness from work, and it has this surreal ability to have people fully trust it. That's something that really disturbs me, and I think it's one reason why we have a hard time retaining editors. Everything is automated.
And Blade Runner is my favorite movie, and sometimes I wish I was a replicant. But I didn't know what Twinkle or Huggle were until Wikimania, because I rely on my own mind to produce what I need out of Wikimedia. Not all bots are bad, but it's a problem.
-Sarah
On Mon, Sep 12, 2011 at 12:06, Sarah Stierch sarah.stierch@gmail.com wrote:
While I support the use of technology, I also fear that people put so much trust into this technology they aren't aware of the lame content being uploaded. They love to reiterate that if the "bot approves it" it's okay and fine to be on Commons, but so much content that is pornographic in nature is often uploaded, bot approved, then the Flickr account is deleted. This is a rather broken approval process or system, IMHO.
It's a major problem with Flickr. I've often emailed people there who have CC images that I'd like to use in articles. They say "sure, I've released it, go ahead." I say, "can I just check that you took the photograph yourself?" Answer, "sure, it's fine for you to use it." Me: "Can you confirm that you actually took the photograph yourself?" No answer.
That's ignoring whether the subject of the photograph has given consent, which is another issue on top of who the author is.
Practically, what can we do about this? I feel we've become a magnet for pornography in a way that's bad for everyone -- project, editors, Foundation, the individual women depicted, women in general. Realistically, what can we do that doesn't involve simply having the occasional image deleted?
Sarah
Sarah Stierch: While I support the use of technology, I also fear that people put so much trust into this technology they aren't aware of the lame content being uploaded. They love to reiterate that if the "bot approves it" it's okay and fine to be on Commons, but so much content that is pornographic in nature is often uploaded, bot approved, then the Flickr account is deleted. This is a rather broken approval process or system, IMHO.
Myself:
My problem with Flickrbot is that it encourages the transfer of images without editing. So many Flickr images could be improved with even minimal Photoshop skill, or even a few judicious crops. There are very few of the many Flickr images I've transferred that I didn't do a little work on.
Sarah again:
I have said this before, and I'll say it again: Automation is good to a point. It's destructive to the community in many ways though: it removes personality and human touch, it removes human connection, empathy and awareness from work, and it has this surreal ability to have people fully trust it. That's something that really disturbs me, and I think it's one reason why we have a hard time retaining editors. Everything is automated.
Myself:
I do wonder if automating some tasks cost us a few editors. As opposed to how it worked in the real world, editors who specialized in these things, and who may not have felt up to any editorial task involving extensive writing, rewriting, or image creation, were certainly welcome to stay, but just didn't find anywhere else they could fit in when the bots took over.
Daniel Case
On 12/09/2011 13:45, Sydney Poore wrote:
No, not really. The assumption is toward the uploader having the appropriate permission if it appears to be an amateur image and it has not obvious signs of being a copyright violation. People have been in disagreement about whether images that are "controversial content" should be be held to a higher level of scrutiny. Some people say that we are be biased if we require a higher level of scrutiny for images of naked people. I disagree, but think that we really need to have a higher level of scrutiny for all images with identifiable people. By requiring model consent, we would solve a large part of the problems with the images on Commons.
Let us not forget that for many males the decision to post an image of their nude girlfriend online is a deliberate insult. Precisely, that is their revenge after they have been dumped by said girlfriend.
In such a context, any domestic nude photo should be deemed suspicious, unless there is the explicit consent of the woman; and I assume I am not very far from the truth in assuming all those photos are posted by males.
Less frequently, such tolerance might also encourage the publication of incestuous images. More concretely, in this "Korean" case today, a small but not negligible probability exists that the subject is the uploader's daughter. It can happen.
On Mon, Sep 12, 2011 at 8:00 AM, Arnaud HERVE arnaudherve@x-mail.netwrote:
Let us not forget that for many males the decision to post an image of their nude girl-friend online is a deliberate insult. Precisely, that is their revenge after they have been dumped by said girl-friend.
In such a context, any domestic nude photo should be deemed suspicious. Unless there is the explicit consent of the girl, but I assume I am not very far from truth in assuming all those photos are posted by males.
Less frequently, such a tolerance might also encourage the publication of incest. More concretely, in this "Korean" case today, a small but not neglectable probability exists that it is the author's daughter. It can happen.
I agree that this is a potential problem. We know that people upload images of people to disparage, embarrass, or harass them. People in most societies do not go nude in public or publicly publish images of themselves performing sex acts. In fact most people don't anywhere in the world, although it does happen in some places more than others.
Right now, I think that we should assume that most people value their privacy, and unless they have actively given consent for a nude image we should delete it to protect their right to privacy, and the right to control the way that their photograph is used.
Sydney
User:FloNight
Right now, I think that we should assume that most people value their privacy, and unless they have actively given consent for a nude image we should delete it to protect their right to privacy, and the right to control the way that their photograph is used.
Sydney
User:FloNight
Yes, the rule should be that personally identifying images should be routinely suppressed in the absence of permission from an identifiable person who has signed an informed release. An informed release would include a declaration that they understand how open source image licensing works.
There are privacy issues. Can a person remain anonymous and release an image of themselves? If so, how, in the case of Commons or Wikipedia?
We routinely use the oversight tool on the English Wikipedia to suppress phone numbers and home addresses as releases of personal information.
Fred
Yes, the rule should be that personally identifying images should be routinely suppressed in the absence of permission from an identifiable person who has signed an informed release. An informed release would include a declaration that they understand how open source image licensing works.
There are privacy issues. Can a person remain anonymous and release an image of themselves. If so, how, in the case of Commons or Wikipedia?
We routinely use the oversight tool on the English Wikipedia to suppress phone numbers and home addresses as releases of personal information.
Fred
If a person is identifiable to their social set, family, or even to just themselves, permission should be required if there is sexual or sexually suggestive content.
Fred
There are privacy issues. Can a person remain anonymous and release an image of themselves. If so, how, in the case of Commons or Wikipedia?
Yes, through trusted OTRS representatives who understand the need for privacy. I'd happily sign a contract with the Wikimedia Foundation "swearing" myself to secrecy. While transparency might be key for Wikimedia, anonymity is also an important feature and option for users. There could be a simple template affirming that an individual has verified their information - even if it requires the OTRS agent who signs up for the task to go above and beyond to verify it (a copy of identification, a phone call - I don't know, I'm just throwing out ideas) - and then we do our best to trust people and our work.
-Sarah
I'm both a long-time admin on Commons and an OTRS volunteer. I've been wanting to chime in on this thread, but haven't really had the time. I'm worried though that I'm about to see history repeat itself, so I want to quickly share a few thoughts...
First, the issue of consent on Commons has been passionately debated for years, and has a long and tortured history. Before proposing anything, please make yourself familiar with the previous discussions and their outcomes. Most notably the discussions surrounding these pages: http://commons.wikimedia.org/wiki/Commons:Sexual_content http://commons.wikimedia.org/wiki/Commons:Administrators%27_noticeboard/Arch... http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people http://commons.wikimedia.org/wiki/Commons:Nudity
The point I can't emphasize enough is that if you put forward any proposal on Commons that implies there is anything possibly problematic about sexual or nude images in any way, you will be completely shut down. The only way you have any chance to shape the policies and guidelines on Commons is if you approach the problem from a sex/nudity-agnostic point of view. Here's a good example of what NOT to do:
I think a general statement that permission of the subject is desirable / necessary for photos featuring nudity would be a good thing - thoughts? Privatemusings (talk) 00:49, 8 January 2009 (UTC) I think the horse is beyond dead by now. --Carnildo (talk) 22:46, 8 January 2009 (UTC)
If the horse was beyond dead in January 2009, imagine where it is now. That said, there is still lots of room for improvement. In particular...
Commons already requires consent for photos of identifiable people in private spaces. In addition, many countries require consent even for public spaces. (Take a look at http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_person....) The way this requirement works, however, is completely passive and reactive - there is no impetus to proactively assert consent, only to assert it when an image is challenged. This is a very inefficient system. There are no templates or categories or anything to deal with consent on Commons (apart from Template:Consent which is tied up with the tortured history of Commons:Sexual_content and can't be used currently).
I don't think it would be incredibly controversial to introduce a very simple consent template that was specifically tailored to the existing policies and laws. This would make things easier for Commons reusers, professional photographers who use model releases, and admins who have to constantly deal with these issues. In short, it would be a win for everyone and it would introduce the idea of thinking proactively about consent on Commons in a way that isn't threatening to people who are concerned about censorship.
As soon as I have some free time, I'll whip up such a template and throw it into the water. It'll be interesting to see how it is received.
Ryan Kaldari
--- On Mon, 12/9/11, Ryan Kaldari rkaldari@wikimedia.org wrote:
The point I can't emphasize enough is that if you put forward any proposal on Commons that implies there is anything possibly problematic about sexual or nude images in any way, you will be completely shut down.
And rightly so. After all, the idea
-- that people might feel aggrieved if a picture of them naked, or giving a blowjob, is hosted on Commons for global reuse, without their consent,
-- that their strength of feeling might be different if the matter concerned a picture showing them clothed, walking down the street,
-- and that the Foundation should bear that difference in strength of feeling in mind, by requiring more solid consent for the former type of image,
is really outré, isn't it. :))
Andreas
Here's something we might do though. Addition of any image which violates anyone's privacy to any article can be suppressed on the English Wikipedia. Using this policy: "Removal of non-public personal information, such as phone numbers, home addresses, workplaces or identities of pseudonymous or anonymous individuals who have not made their identity public. This includes hiding the IP data of editors who accidentally logged out and thus inadvertently revealed their own IP addresses. Suppression is a tool of first resort in removing this information."
https://secure.wikimedia.org/wikipedia/en/wiki/Wikipedia:Oversight
That would not resolve the situation on Commons, but would render the image useless and superfluous.
Fred
--- On Mon, 12/9/11, Ryan Kaldari rkaldari@wikimedia.org wrote: First, the issue of consent on Commons has been passionately debates for years, and has a long and tortured history. Before proposing anything, please make yourself familiar with the previous discussions and their outcomes. Most notably the discussions surrounding these pages: http://commons.wikimedia.org/wiki/Commons:Sexual_content http://commons.wikimedia.org/wiki/Commons:Administrators%27_noticeboard/Arch... http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people http://commons.wikimedia.org/wiki/Commons:Nudity
The point I can't emphasize enough is that if you put forward any proposal on Commons that implies there is anything possibly problematic about sexual or nude images in any way, you will be completely shut down.

And rightly so. After all, the idea -- that people might feel aggrieved if a picture of them naked, or giving a blowjob, is hosted on Commons for global reuse, without their consent -- that their strength of feeling might be different if the matter concerned a picture showing them clothed, walking down the street -- and that the Foundation should bear that difference in strength of feeling in mind, by requiring more solid consent for the former type of image -- is really outré, isn't it. :))

Andreas

The only way you have any chance to shape the policies and guidelines on Commons is if you approach the problem from a sex/nudity-agnostic point of view. Here's a good example of what NOT to do:

I think a general statement that permission of the subject is desirable / necessary for photos featuring nudity would be a good thing - thoughts? Privatemusings (talk) 00:49, 8 January 2009 (UTC)
I think the horse is beyond dead by now. --Carnildo (talk) 22:46, 8 January 2009 (UTC)
If the horse was beyond dead in January 2009, imagine where it is now. That said, there is still lots of room for improvement. In particular...
Commons already requires consent for photos of identifiable people in private spaces. In addition, many countries require consent even for public spaces. (Take a look at http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_person....) The way this requirement works, however, is completely passive and reactive - there is no impetus to proactively assert consent, only to assert it when an image is challenged. This is a very inefficient system. There are no templates or categories or anything to deal with consent on Commons (apart from Template:Consent which is tied up with the tortured history of Commons:Sexual_content and can't be used currently).
On Mon, Sep 12, 2011 at 7:00 PM, Fred Bauder fredbaud@fairpoint.net wrote:
Frankly, I'd consider that an abusive misinterpretation of the rules you cite.
Ok, sounds like a plan. I'll make a noise in the east; you strike in the west...
Fred
Hi Ryan,
A draft template was actually made to augment the most recently voted [[COM:SEX]] proposal: http://commons.wikimedia.org/wiki/Template:Consent
The proposal closed with no consensus*, but with a few modifications, the template could still be put to good use.
Toby / 99of9
*Mainly because it included a clause allowing admins to delete out-of-scope sexual content directly via speedy deletion rather than setting up a deletion request. There actually wasn't too much opposition to requiring a statement of consent for identifiable sexual images, although there was some.
On Tue, Sep 13, 2011 at 8:51 AM, Ryan Kaldari rkaldari@wikimedia.orgwrote:
I'm both a long-time admin on Commons and an OTRS volunteer. I've been wanting to chime in on this thread, but haven't really had the time. I'm worried though that I'm about to see history repeat itself, so I want to quickly share a few thoughts...
I have created the new consent template: http://commons.wikimedia.org/wiki/Template:Consent
Here is an example of it in use: http://commons.wikimedia.org/wiki/File:Splitting_logs_with_a_gas_powered_log...
I also added a new section to the Commons:Photographs_of_identifiable_persons guidelines encouraging people to use the new template.
The wording of the template and the guidelines deliberately doesn't mention nudity or sexualization. Hopefully, this will be a good first step toward increasing the value and visibility of consent on Commons (in a way that builds consensus rather than warring factions).
Ryan Kaldari
That looks good, Ryan. Would it make sense to add something about the release of the image? For example,
"I personally created this media. All identifiable persons shown specifically consented to this photograph or video being taken and released under a free licence."
Sarah
I agree. As you wrote this email, I was altering it to include the phrase "consent to publish". Your wording is better; I'll change to that.

Toby / 99of9
I like the template and added it to a few of my uploads to see how it works.
http://commons.wikimedia.org/wiki/File:Simon_Planting_%28bassist%29_of_the_J...
I like the change in wording Sarah suggests.
Sydney
I added a new parameter to the template for indicating full consent. If you use {{consent|full}} it outputs: "I personally created this media. All identifiable persons shown specifically consented to publication of this photograph or video under a free license, granting unrestricted rights to redistribute the media for any purpose."
If you use {{consent|basic}} it outputs: "I personally created this media. All identifiable persons shown specifically consented to this photograph or video."
I think it's important that both options are available, since we should allow people to indicate different degrees of consent. There's also a 'public' parameter for photographs of people in public places taken without their consent (it explains some of the issues involved in that case).
Ryan Kaldari
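[To illustrate how the parameters described above would be used: a sketch of a Commons file description page carrying the template. The file metadata and license shown are invented for illustration; only the {{consent|full}} call itself is from the template under discussion.]

```
== {{int:filedesc}} ==
{{Information
|description = Band photo taken at a private rehearsal
|author      = [[User:ExamplePhotographer|ExamplePhotographer]]
}}
<!-- {{consent|full}} asserts the uploader created the media and that all
     identifiable persons consented to publication under a free license -->
{{consent|full}}

== {{int:license-header}} ==
{{self|cc-by-sa-3.0}}
```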
On 9/13/11 3:28 PM, Sarah wrote:
That looks good, Ryan. Would it make sense to add something about the release of the image? For example,
"I personally created this media. All identifiable persons shown specifically consented to this photograph or video being taken and released under a free licence."
Sarah
On Tue, Sep 13, 2011 at 15:43, Ryan Kaldarirkaldari@wikimedia.org wrote:
I have created the new consent template: http://commons.wikimedia.org/wiki/Template:Consent
Here is an example of it in use: http://commons.wikimedia.org/wiki/File:Splitting_logs_with_a_gas_powered_log...
I also added a new section to the Commons:Photographs_of_identifiable_persons guidelines encouraging people to use the new template.
The wording of the template and guidelines don't mention anything about nudity or sexualization. This is on purpose. Hopefully, this will be a good first step to increasing the value and visibility of consent on Commons (in a way that builds consensus rather than warring factions).
Ryan Kaldari
On 9/12/11 5:49 PM, Toby Hudson wrote:
Hi Ryan,
A draft template was actually made to augment the mostly recently voted [[COM:SEX]] proposal: http://commons.wikimedia.org/wiki/Template:Consent
The proposal closed with no consensus*, but with a few modifications, the template could still be put to good use.
Toby / 99of9
*Mainly because it included a clause allowing admins to delete out of scope sexual content directly in a speedy deletion rather than setting up a deletion request. There actually wasn't too much opposition to requiring a statement of consent for identifiable sexual images, although there was some.
On Tue, Sep 13, 2011 at 8:51 AM, Ryan Kaldarirkaldari@wikimedia.org wrote:
I'm both a long-time admin on Commons and an OTRS volunteer. I've been wanting to chime in on this thread, but haven't really had the time. I'm worried though that I'm about to see history repeat itself, so I want to quickly share a few thoughts...
First, the issue of consent on Commons has been passionately debates for years, and has a long and tortured history. Before proposing anything, please make yourself familiar with the previous discussions and their outcomes. Most notably the discussions surrounding these pages: http://commons.wikimedia.org/wiki/Commons:Sexual_content
http://commons.wikimedia.org/wiki/Commons:Administrators%27_noticeboard/Arch...
http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people http://commons.wikimedia.org/wiki/Commons:Nudity
The point I can't emphasize enough is that if you put forward any proposal on Commons that implies there is anything possibly problematic about sexual or nude images in any way, you will be completely shut down. The only way you have any chance to shape the policies and guidelines on Commons is if you approach the problem from a sex/nudity-agnostic point of view. Here's a good example of what NOT to do:
I think a general statement that permission of the subject is desirable / necessary for photos featuring nudity would be a good thing - thoughts? Privatemusings (talk) 00:49, 8 January 2009 (UTC) I think the horse is beyond dead by now. --Carnildo (talk) 22:46, 8 January 2009 (UTC)
If the horse was beyond dead in January 2009, imagine where it is now. That said, there is still lots of room for improvement. In particular...
Commons already requires consent for photos of identifiable people in private spaces. In addition, many countries require consent even for public spaces. (Take a look at
http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_person....) The way this requirement works, however, is completely passive and reactive - there is no impetus to proactively assert consent, only to assert it when an image is challenged. This is a very inefficient system. There are no templates or categories or anything to deal with consent on Commons (apart from Template:Consent which is tied up with the tortured history of Commons:Sexual_content and can't be used currently).
I don't think it would be incredibly controversial to introduce a very simple consent template that was specifically tailored to the existing policies and laws. This would make things easier for Commons reusers, professional photographers who use model releases, and admins who have to constantly deal with these issues. In short, it would be a win for everyone and it would introduce the idea of thinking proactively about consent on Commons in a way that isn't threatening to people who are concerned about censorship.
As soon as I have some free time, I'll whip up such a template and throw it into the water. It'll be interesting to see how it is received.
Ryan Kaldari
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
I think this is really great, thank you Kaldari for taking the time to create this.
The n00b in me asks:
1) Is this trackable? That is, a hidden category or anything?
and the rest:
2) I think we should solidify the policy documentation (i.e., the recent Board resolution, etc.) and complete that work before we promote this template.
3) I advise those who can, or who want to, to follow the use of this template, if that is possible. My trust has been shaken by the ease with which Commons uploaders can choose whatever templates, author, and permissions they desire. I *want* to see this work.
Thanks again Kaldari =) and Sarah too!
-Sarah (the other Sarah!)
On Tue, Sep 13, 2011 at 7:39 PM, Ryan Kaldari rkaldari@wikimedia.org wrote:
I added a new parameter to the template for indicating full consent. If you use {{consent|full}} it outputs: "I personally created this media. All identifiable persons shown specifically consented to publication of this photograph or video under a free license, granting unrestricted rights to redistribute the media for any purpose."
If you use {{consent|basic}} it outputs: "I personally created this media. All identifiable persons shown specifically consented to this photograph or video."
I think it's important that both options are available, since we should allow people to indicate different degrees of consent. There's also a parameter, 'public', for photographs of people in public places taken without consent (which explains some of the issues involved in that particular case).
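For concreteness, here is a sketch of how the template might sit on a file description page. The {{Information}} block, file details, and license shown are hypothetical illustrations; only the {{Consent|full}} / {{Consent|basic}} / {{Consent|public}} parameters come from this thread, and the exact placement is an assumption:

```wikitext
== {{int:filedesc}} ==
{{Information
 |description = {{en|1=Portrait of a volunteer at a workshop (hypothetical example)}}
 |date        = 2011-09-13
 |source      = {{own}}
 |author      = [[User:ExampleUploader|ExampleUploader]]
}}

<!-- Statement of subject consent (hypothetical placement on the page): -->
{{Consent|full}}

== {{int:license-header}} ==
{{self|cc-by-sa-3.0}}
```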
Ryan Kaldari
On 9/13/11 3:28 PM, Sarah wrote:
That looks good, Ryan. Would it make sense to add something about the release of the image? For example,
"I personally created this media. All identifiable persons shown specifically consented to this photograph or video being taken and released under a free licence."
Sarah
On Tue, Sep 13, 2011 at 15:43, Ryan Kaldari rkaldari@wikimedia.org wrote:
I have created the new consent template: http://commons.wikimedia.org/wiki/Template:Consent
Here is an example of it in use:
http://commons.wikimedia.org/wiki/File:Splitting_logs_with_a_gas_powered_log...
I also added a new section to the Commons:Photographs_of_identifiable_persons guidelines encouraging people to use the new template.
The wording of the template and guidelines doesn't mention anything about nudity or sexualization. This is on purpose. Hopefully, this will be a good first step to increasing the value and visibility of consent on Commons (in a way that builds consensus rather than warring factions).
Ryan Kaldari
On 9/12/11 5:49 PM, Toby Hudson wrote:
[snip]
On 9/13/11 8:03 PM, Sarah Stierch wrote:
I think this is really great, thank you Kaldari for taking the time to create this.
The n00b in me asks :
- Is this trackable? That is, a hidden category or anything?
There aren't any categories currently, but I could add some. Right now, you can view all the images it is used on at http://commons.wikimedia.org/wiki/Special:WhatLinksHere/Template:Consent
- I think we should solidify the policy documentation (i.e. the recent board passing, etc), and complete that work before we promote this template.
Right now there is no actual policy as far as I know, just a guideline and a resolution. Getting something encoded as policy might be a good goal to work towards.
Ryan Kaldari
On Wed, Sep 14, 2011 at 4:16 PM, Ryan Kaldari rkaldari@wikimedia.org wrote:
[snip]
It makes no real difference what it is called, guideline or policy, as long as everyone is singing from the same page in the hymnal. The deletion discussions over the past week (and there have been many of them using lack of model consent as a reason) have gone well: friendly, with no conduct issues that I can see, and no extreme hyperbole. Largely people are discussing the images by citing policy, and admins are closing them with consensus.
I know that there has been division about this in the past, but if people use the consent template as you have written it, I think that everything will be fine. Sydney
User:FloNight
FloNight said:
[snip]
Nice to hear about civility being done!
From, Emily
On Tue, Sep 13, 2011 at 7:39 PM, Ryan Kaldari rkaldari@wikimedia.org wrote:
I added a new parameter to the template for indicating full consent. If you use {{consent|full}} it outputs: "I personally created this media. All identifiable persons shown specifically consented to publication of this photograph or video under a free license, granting unrestricted rights to redistribute the media for any purpose."
Not commenting on the merit of including such a variation:
Consenting to the release of something under a free license emphatically *does not* mean "granting unrestricted rights to redistribute the media for any purpose." In particular, under many licenses, the redistribution is restricted by the requirement that the new copies also be provided under a free license. They also usually require that the creator be attributed.
You may wish to rethink the wording of that, a bit.
-- Tracy Poff
A "full" release should signify that the subject should have been fully informed and understand the broad nature of an unrestricted license; things like the right to modify the image, substitute a pig's body for yours, and use it in an ad during televising of the Super Bowl.
Fred
On Tue, Sep 13, 2011 at 17:39, Ryan Kaldari rkaldari@wikimedia.org wrote:
[snip]
Hi Ryan, I think it's a good idea to have two templates (and thank you for creating them), but is it not important in the basic one to signal its limitations? For example, we could say: "I personally created this media. All identifiable persons shown specifically consented to this photograph or video being taken, but may not have consented to its publication or release."
We have three basic scenarios we are dealing with:
1. Someone takes a photograph of a person without their knowledge. This is voyeurism if done for the purpose of sexual gratification, and that's something we should never allow to be uploaded in my view, because it's a criminal offence in some jurisdictions, and always unethical.
2. Someone takes a photograph of a person with their knowledge, but publishes it without their knowledge. This is almost as bad as (1) if it's in a private space and there's a sexual element.
3. Someone takes a photograph of a person with their knowledge, but releases it under a free licence without their knowledge. This means the author can't easily withdraw the image, or control how it's used.
So the problem with the basic consent template as written -- "I personally created this media. All identifiable persons shown specifically consented to this photograph or video" -- is that it implies to the unsuspecting that consent has been given to take the image, publish the image, and release the image.
Sarah
On 14/09/2011 16:56, Sarah wrote:
So the problem with the basic consent template as written -- "I personally created this media. All identifiable persons shown specifically consented to this photograph or video" -- is that it implies to the unsuspecting that consent has been given to take the image, publish the image, and release the image.
Well, yes, and it is also legally invalid: you don't sign an agreement stating the consent of another person.
Arnaud
On Wed, Sep 14, 2011 at 09:25, Arnaud HERVE arnaudherve@x-mail.net wrote:
[snip]
You also don't sign an agreement using a pseudonym. So, yes, the whole thing is problematic, but I think the idea is to make a start by getting uploaders to bear the issue of the subject's consent in mind.
Sarah
Three more things that I want to state clearly based on these conversations:
- Commons bases "identifiability" on the face of an individual. While in many situations that may be the only way to identify an individual, when it comes to nudity, etc., there is more to identify than just a face. Any sexually active person can often remember specific features of current and past lovers (birth marks, hair patterns, piercings, whatever), porn stars they watch, models they like (I can pick out Bettie Page's sucked-in stomach, Tempest Storm's legendary "moneymakers" and my favorite Suicide Girls tattooed back from a mile away without heads..), etc. As SlimVirgin stated, the "model" is identifiable to those who know her. (And yes, slippery slope again..) However, I really doubt that we'd have much weight with this argument, but perhaps I'm wrong in that.
- We must stress that objectification goes beyond women on Commons. Men are objectified too, though generally in a different manner, through "self-imposed objectification" - uploading photographs of their own body parts and self-indulgent photographs - while it appears others upload images of women "on their behalf".
- Objectification of culture is a major problem, especially when it comes to Asian women, whether it's anime pornography (which we have plenty of, and which people argue is educational because of the tools or techniques used to create it) or photographs of "Korean vulvas" which feature "hot Korean girls" (or whatever). I notice there is a similar situation with Eastern/Eurasian women as well. Something has to change - while these women might not be active on Commons, someone has to have a voice for them.
-Sarah
On Mon, Sep 12, 2011 at 7:35 AM, Sarah slimvirgin@gmail.com wrote:
On Mon, Sep 12, 2011 at 05:17, Arnaud HERVE arnaudherve@x-mail.net wrote:
IMO, the policies need to be tweaked so that admins like him will have better policy to work with.
Do we have specific Commons policies on voyeurism and invasion of privacy?
Sarah
On Mon, Sep 12, 2011 at 06:50, Sarah Stierch sarah.stierch@gmail.com wrote:
[snip]
The expertise about what it means for a photograph to identify a person is out there, so it's just a question of accessing it. In journalism, when a court orders a publication ban on identifying someone, you can't argue that your description of them did not identify them to the general reader. If you write about them in a way that allows their local circle to recognize them that's often sufficient to trigger contempt of court proceedings.
This Commons guideline -- http://commons.wikimedia.org/wiki/Commons:IDENT -- discusses what's meant by "private place," but doesn't say how the law defines "identifiable."
Sarah
On Mon, Sep 12, 2011 at 7:17 AM, Arnaud HERVE arnaudherve@x-mail.net wrote:
On 12/09/2011 12:18, Sydney Poore wrote:
If you look at the full body of his work, this admin truly is trying to follow policy and the customs of Commons and WMF projects in general.
Well, I might have been too quick in judging him, and besides idiocy or perversion, the reason for his behaviour might have been a complete lack of attention, to the point that he didn't even look at the photo. Because if he did look and still protected the photo, then I am back to the idiocy or perversity hypothesis.
Because, quite frankly, deliberate or not, exceptional or not, what he has done here is an insult to plain common sense, and a clear, direct deterioration of WP content.
From the scientific point of view it is below the required level to even begin a discussion.
Imagine the page for Finger: should we even take time to discuss the propriety of a photo showing the forearm without the fingers? What would we think of an admin who protected a photo of the forearm without the fingers on the Finger page, after having been duly pointed to the obvious mistake by a user? Don't you think a user with normal self-respect would be right not to bother coming to Wikipedia any longer?
If you add the Asian-erotic content to that, you realize that the photo was totally inappropriate on so many levels that the problem doesn't lie with the photo anymore but with the admin.
IMO, the policies need to be tweaked so that admins like him will have better policy to work with. And we need a broader group of people commenting in all deletion discussions so that we get a more globally representative view of what is appropriate for Commons to have on site.
Yes, but as Sarah Stierch wrote today:
One thing Wikimedia as a whole *suffers* from is no "solidity" when it comes to policy and rules. Everything seems that it can be adapted, broken, changed, manipulated..etc. I think that's a problem.
Adding rules, policies or commentators doesn't work if the admins don't show an adequate level of literacy, or if they use their position to manipulate the rules at their convenience.
In his discussion-lock comment, Yann says "Person is not recognizable". That is typical of illiteracy and bad faith: you add one correct detail to justify an otherwise totally, and very obviously, wrong decision. That is totally twisting the rules.
As a result we now have a scientifically totally irrelevant and plainly domestic-erotic photo on WP, which is explicitly protected by WP. The mistake is so obvious that no further rules will work if admins don't show a normal intention to respect the rules.
Re-read the discussion page. Is it normal that Sarah Stierch (Missvain) had to take time to write out the obvious in detail, and that in the end she was not heeded? This is not fair; no grown-up, literate person should be treated like that. Even if it is involuntary, Yann's decision is so wrong and so rude that it should seriously put his position as an admin in doubt.
http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Korean_Vulv...
He reconsidered and deleted the image. Approaching an admin to reconsider is always okay. They close dozens of deletion discussions and will sometimes get something wrong.
This is a good outcome.
Sydney Poore
http://commons.wikimedia.org/w/index.php?title=Commons:Deletion_requests/Fil...
On Mon, Sep 12, 2011 at 05:52, Sydney Poore sydney.poore@gmail.com wrote:
[snip]
Thanks for asking him to reconsider. It would be worth identifying a set of Commons admins who are used to dealing with these issues -- privacy concerns, lack of model consent, possible voyeurism.
Sarah
On Sun, Sep 11, 2011 at 4:53 PM, Sarah Stierch sarah.stierch@gmail.com wrote:
This is a NSFW photo.... http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Korean_Vulv...
Five for deletion, two for keep. This is its third nomination.
An admin came in today and declared it being kept because "No valid reason for deletion, per previous decisions. Person is not recognizable." It has been nominated twice, by anon IP's who have simply declared "porn" or "obscene" as the deletion reason (not enough of a reason).
I nominated it, like I do many things, because it was unused on any project since its upload in March of 2009, it's uneducational, and the poor description proves that. I also think it's poor quality - if we need an "educational photo of a vulva" we have two really fab ones on the [[vulva]] article. Which of course was argued (a nude photo of a headless woman blow drying her hair in heels with the blow dryer cord and shadow in the shot.. come...on...), and as FloNight noted, we can probably have some high quality photos of a nude woman using a blow dryer that aren't taken in the bedroom for the project..if it's that in demand. http://commons.wikimedia.org/wiki/File:Korean_Vulva2.jpg
I'd be concerned about this user's track record of uploads; this is the only one not deleted:
http://commons.wikimedia.org/wiki/User_talk:Jonghap
Presumably, if there is/was File:Korean Vulva3.jpg and File:Korean Vulva2.jpg, then there was a File:Korean Vulva1.jpg, which is gone now.
On copyright issues alone, I am concerned about this image, as well as regarding consent, given the private location of the photo.
Cheers, Katie
I shouldn't even act surprised...I guess.. :-/
Were the reasons we provided not valid enough? Can you even challenge something like this? Did I miss something? Am I doing this wrong? Regardless of the subject, I don't understand why the admin would declare people's reasons invalid, based on my knowledge of the Commons policies: "Commons is not a porn site", "private location, lack of model release", etc...
(And yes, I was a little snappy on my nomination (this was my original rager when I nominated a bunch of stuff from the "high heels" category..)...so no need to reprimand me....I've curbed my 'tude!)
Any help would be great,
Sarah
-- GLAMWIKI Partnership Ambassador for the Wikimedia Foundation http://www.glamwiki.org Wikipedian-in-Residence, Archives of American Art http://en.wikipedia.org/wiki/User:SarahStierch and Sarah Stierch Consulting
*Historical, cultural & artistic research & advising.*
Yes, the context is being ignored, particularly the choice of name for the image, transmogrifying an innocent image of a nude woman into an oriental sex image.
Fred