Category:High-heeled shoes is an excellent example of the current problem WMF projects have with creating and disseminating unbiased content.
http://commons.wikimedia.org/wiki/Category:High-heeled_shoes
This category differs from most other categories about footwear because it contains many images that are not primarily examples of high-heeled shoes. Most other footwear categories consist mostly of images of shoes, or of the lower leg(s) wearing a shoe or shoes.
The number of images in Category:High-heeled shoes is higher than in most categories about footwear. Approximately one-third of the images are full-body shots of attractive women wearing high-heeled shoes, and a significant number of them are nude or posed in sexually provocative positions.
Ordinary women who simply happen to be wearing the shoes are mixed in with the porn stars and striptease dancers. These women are being objectified and sexualized without their consent because of the way the images are displayed in the category. See the Wikipedia article on sexualization (http://en.wikipedia.org/wiki/Sexualization) for a description of the term.
In every language, the Wikipedia articles about high-heeled shoes treat the subject as a type of footwear, so the links from those articles to Commons direct readers to nudity or sexual content they would not anticipate. There are other problems with some of the images as well, including unclear consent from the subject for the image to be uploaded.
I see this category as a concrete example of the systemic bias that comes from having a male-dominated editing community.
Leather boots is the only other category I found that also has a large number of images of people, and it also contains a disproportionate number of images of women who are nude or in sexually provocative poses.
I think it is important to continue to talk about these issues in the hope that more people will become educated about the problems with our current methods of collecting, categorizing, and disseminating content.
Sydney Poore User:FloNight
The number of images in Category:High-heeled shoes is higher than in most categories about footwear. Approximately one-third of the images are full-body shots of attractive women wearing high-heeled shoes, and a significant number of them are nude or posed in sexually provocative positions.
Hmm, yes, it's very different from all the other categories about types of shoes.
I was just having a look at that category and this image caught my eye: http://commons.wikimedia.org/wiki/File:Young_girl_with_see-through_tops_and_...
It was bot-transferred over from Flickr with a description of "Hello, My name is Amber and I'm 5' 1" and very petite. I like to meet new people and I'm very easy to get to know. Just ad me as a friend to see all my pics and any comments, notes, and favorites are appreciated too. bye for"
The Flickr accounts originally involved have since been deleted.
This makes me very suspicious that we've basically taken an image from a porn-spammer and unquestioningly put it on Commons....
Chris (The Land)
On Sun, Sep 4, 2011 at 5:05 AM, Chris Keating chriskeatingwiki@gmail.com wrote:
Chris, that is a good point.
Commons has other images of "Amber", including one with the original description "You guys can get as ruff as you like. I promise I won't break!! bye for now, Amber"
http://commons.wikimedia.org/wiki/File:Teen_in_tank_top_and_cut-offs.jpg
Sydney
I've made a start on refactoring this category according to the principle of least surprise. Feel free to do this whenever you notice a "surprising" image in a mundane category.
Regarding consent, if any of the identifiable women are in private locations, http://commons.wikimedia.org/wiki/COM:PEOPLE applies, and the uploader should state that permission was obtained to take & publish the image. If this has not been done, please either contact the uploader or propose deletion.
Toby Hudson / 99of9
Hi Toby -
Sorry to be a n00b, but can you explain what you mean by "refactoring this category according to the principle of least surprise"?
For anyone else: if you find an image that has been uploaded by a Flickr bot and the Flickr account has since been deleted, what do you do? I notice that a large portion of images like this are uneducational snapshots (here is an example: http://commons.wikimedia.org/wiki/File:Labace_%2824%29.jpg). I was going to nominate it simply for being out of scope, because Commons is not a repository for snapshots.
;)
Asking questions like this on Commons-L isn't very pleasant, so thanks for helping!
Thanks,
Sarah
Hi Sarah,
The principle of least surprise is roughly the following: People who go to a category/gallery/encyclopedia-article expecting something (shoes) should not be surprised by something they may find offensive (naked women wearing shoes).
One way to ensure this is to make clearly labelled subcategories for the potentially offensive material. In this case, I made a subcategory: http://commons.wikimedia.org/wiki/Category:Women_wearing_high-heeled_shoes and within that http://commons.wikimedia.org/wiki/Category:Nude_women_wearing_high-heeled_sh...
so everyone who visits that category knows exactly what they're going to see in advance.
Regarding your Flickr question: Whether the account is deleted or not doesn't usually change whether or not the picture is in scope. But deleted accounts do make the copyright status more questionable. At the time of upload, the bot would check that the license is correct, but that doesn't eliminate the possibility that the Flickr user is uploading copyright violations to their Flickr account ("Flickrwashing"). If there are other likely signs of copyright violation, I would nominate for deletion (as I did for the other image mentioned in this thread http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Young_girl_...). When the account is still active, you can also check the rest of the Flickr user's contributions to get a good sense of whether they are really the author of the photos they're uploading.
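For illustration only, here is a minimal Python sketch of what an import-time license check against the Flickr API might look like. This is not the actual bot's code; the API key is a placeholder and the license-ID mapping is an assumption to verify against the flickr.photos.licenses.getInfo documentation. As noted above, a check like this only confirms the license shown at that moment and cannot catch Flickrwashing.

# Hedged sketch: verify that a Flickr photo's *current* license is one
# Commons could accept. Not the real import bot; details are assumptions.
import requests

FLICKR_API = "https://api.flickr.com/services/rest/"
FLICKR_API_KEY = "YOUR_API_KEY"  # placeholder; obtain your own key

# Assumed Flickr license IDs: 4 = CC BY, 5 = CC BY-SA,
# 7 = no known copyright restrictions. Verify against Flickr's docs.
ACCEPTABLE_LICENSES = {"4", "5", "7"}

def flickr_license_ok(photo_id: str) -> bool:
    """Return True if the photo currently carries a Commons-acceptable license.

    This cannot detect "Flickrwashing" (someone posting another person's
    work to Flickr under a free license), and it fails closed when the
    photo or account has been deleted, since the license can no longer
    be re-verified.
    """
    resp = requests.get(FLICKR_API, params={
        "method": "flickr.photos.getInfo",
        "api_key": FLICKR_API_KEY,
        "photo_id": photo_id,
        "format": "json",
        "nojsoncallback": "1",
    }, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    if data.get("stat") != "ok":
        # Photo gone or private: exactly the "deleted account" case above.
        return False
    return data["photo"].get("license") in ACCEPTABLE_LICENSES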
Snapshots aren't necessarily out of scope just because they're snapshots; they're sometimes realistically useful for an educational purpose.
Toby
Great, thank you :)
This was left on my talk page after I nominated a photo by this user for deletion:
http://commons.wikimedia.org/wiki/User_talk:Missvain#You_seem_to_be_unaware_......
This is the photo I tagged. It no longer has a Flickr account behind it, it's not used anywhere, and there isn't a single bit of educational merit in this image (I mean, come on..). I also love how this person tells me that licenses aren't revocable, yet doesn't link to any exact, easily found statement explaining that if something is deleted from Flickr and the account gets deleted, we no longer have any proof of copyright from the original creator, so what does that mean?
(Did that last statement make sense? LOL)
I also love that when you tag something like this for deletion someone basically passive-aggressively calls you a censor. Talk about offensive.
Two Wings also has this on the top of his talk page: "After the Wikimedia/Wales/FoxNews "porn purge" affair, I'm vigilant towards possible drifts, notably in terms of abuse of power from Jimbo Wales and/or admins, but also in terms of extreme censorship and/or puritanism against sex illustrations and nude pictures."
(adds this to her list of being a prude, racist, sexist, homophobe, anti-sex and high-brow editor.)
-Sarah (Part of me wants to be snappy and tell him that I can afford to actually buy my porn. :P )
Just a follow up...
It doesn't even matter anymore. Some of these images have been nominated before and kept. They all just keep stating that I don't know the policies and that the images are in scope. Perhaps they are, and perhaps I really am an idiot who just can't comprehend the policies, despite reading them multiple times.
I think the policy that it doesn't matter when a Flickr account has been deleted is one of the stupidest ideas. Two of the images I nominated have incorrect licenses and were still uploaded from Flickr and "okayed" by a bot, despite the Flickr account stating they are all rights reserved. I also don't get how a deleted Flickr account can still be considered a "source."
Commons is really good at making a smart person feel stupid and like a gnat.
-Sarah
On Sun, Sep 4, 2011 at 9:59 AM, Sarah Stierch sarah.stierch@gmail.com wrote:
Sarah
I know that some of the images have been nominated before and kept, and some of the images have to be repeatedly re-categorized, too. I get frustrated and at times feel that it is a time sink with no end in sight.
That is the reason I wrote to the mailing list to discuss the matter as a community issue. I have come to believe that it is rooted in the cultural values of the WMF editors who add loads of these images to Commons.
We can't walk away from the issue because it is too important. We need to discuss it so that we can better understand why we are having trouble addressing the issue in a way that promotes an inclusive editing environment.
Sydney
I would tackle this at the level of deletion templates.
Flickrwashing is a known, widespread source of copyvios.
1. There should be a template specifically for that class of deletion.
2. This should be added as a new reason for deletion to the appropriate policy page.
A Flickr-imported image whose original uploader has had their account removed, and which has no other indication of copyright status, should be eligible for deletion. This can be counterweighted by:
* significant educational value, e.g., active use (as the best available image) in multiple articles
* significant reason to believe the image was originally posted to Flickr by its author (based on metadata or descriptions on the Flickr account at the time of import, or other online sleuthing)
If either of these is true, we can take a risk and wait for a takedown notice. But we should be as harsh on getting copyright confirmation for these images as we are for images obviously uploaded by their creator or someone who knows the creator, who fail to choose the right license template.
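To make that rule concrete, here is a rough, purely hypothetical Python sketch of the triage logic; the field names are invented for illustration, and the real mechanism would of course be a Commons deletion template and policy text, not a script.

# Hypothetical sketch of the deletion-eligibility rule proposed above.
from dataclasses import dataclass

@dataclass
class FlickrImport:
    flickr_account_deleted: bool            # original Flickr account gone
    other_evidence_of_free_license: bool    # e.g. an archived page or OTRS ticket
    active_article_uses: int                # uses as the best available image
    likely_author_evidence: bool            # metadata, descriptions, other sleuthing

def eligible_for_flickrwashing_deletion(img: FlickrImport) -> bool:
    # If the account still exists or there is other proof of the license,
    # the copyright status can still be verified.
    if not img.flickr_account_deleted or img.other_evidence_of_free_license:
        return False
    # Counterweights: significant educational value, or significant reason
    # to believe the Flickr uploader was really the author.
    if img.active_article_uses >= 2 or img.likely_author_evidence:
        return False  # take the risk and wait for a takedown notice
    return True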
SJ
Thanks for the support SJ.
Does anyone know if there is a template for this?
This is what they claim allows images like that to stay on Commons:
http://www.crucialthought.com/2009/03/03/creative-commons-licenses-cannot-be...
Someone else has jumped in and is arguing that some of this content shouldn't be here.
...fighting the good fight,
Sarah
On Mon, Sep 5, 2011 at 2:10 AM, Samuel Klein meta.sj@gmail.com wrote:
I would tackle this at the level of deletion templates.
Flickrwashing is a known widespread source of copyvios.
- There should be a template specifically for that class of deletion.
- This should be added as a new reason for deletion to the
appropriate policy page.
A Flickr-imported image whose original uploader has had their account removed, and which has no other indication of copyright status, should be eligible for deletion. This can be counterweighted by
- significant educational value, e.g., active use (as the best
available image) in multiple articles
- significant reason to believe the image was originally posted to
Flickr by its author [based on metadata or descriptions on the Flickr account at the time of import, or other online sleuthing]
If either of these is true, we can take a risk and wait for a takedown notice. But we should be as harsh on getting copyright confirmation for these images as we are for images obviously uploaded by their creator or someone who knows the creator, who fail to choose the right license template.
SJ
On Sun, Sep 4, 2011 at 12:34 PM, Sydney Poore sydney.poore@gmail.com wrote:
On Sun, Sep 4, 2011 at 9:59 AM, Sarah Stierch sarah.stierch@gmail.com wrote:
Just a follow up...
It doesn't even matter, anymore. Some of these images have been
nominated
before, and been kept. They all just keep stating I don't know the
policies
and that they are in scope. Perhaps it all is and perhaps I really am an idiot who just can't comprehend the policies, despite reading things multiple times.
I think the policy about Flickr accounts being deleted and it doesn't matter is one of the stupidest ideas. Two of the images I nominated have incorrect licenses and were still uploaded from Flickr and "okayed" by a bot, despite the Flickr account stating they are all rights reserved. I
also
don't get how a deleted Flickr account can still be considered a
"source."
Commons is really good at making a smart person feel stupid and like a gnat.
-Sarah
Sarah
I know that some of the images have been nominated before and kept, and
some
of the images have to be repeatedly re-categorized, too. I get frustrated and at times feel that it is a time sink with no end in sight.
That is the reason that I wrote to the mailing list to discuss the matter
as
an community issue. I have come to believe that is rooted in the culture values of the WMF editors who add loads of these images to commons.
We can't walk away from the issue because it is too important. We need to discuss it so that we can better understand why that we are having
trouble
addressing the issue in a way that is promotes an inclusive editing environment.
Sydney
On Sun, Sep 4, 2011 at 9:20 AM, Toby Hudson tobyyy@gmail.com wrote:
Hi Sarah,
The principle of least surprise is roughly the following: People who go to a category/gallery/encyclopedia-article expecting something (shoes) should not be surprised by something they may find offensive (naked women wearing shoes).
One way to ensure this is to make clearly labelled subcategories for the potentially offensive material. In this case, I made a subcategory:
http://commons.wikimedia.org/wiki/Category:Women_wearing_high-heeled_shoes
and within that
http://commons.wikimedia.org/wiki/Category:Nude_women_wearing_high-heeled_sh...
so everyone who visits that category knows exactly what they're going to see in advance.
Regarding your Flickr question: Whether the account is deleted or not doesn't usually change whether or not the picture is in scope. But deleted accounts do make the copyright status more questionable. At the time of upload, the bot would check that the license is correct, but that doesn't eliminate the possibility that the Flickr user is uploading copyright violations to their Flickr account ("Flickrwashing"). If there are other likely signs of copyright violation, I would nominate it for deletion (as I did for the other image mentioned in this thread: http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Young_girl_... ). When the account is still active, you can also check the rest of the Flickr user's contributions to get a good sense of whether they are really the author of the photos they're uploading.
Snapshots aren't necessarily out of scope just because they're snapshots; they're sometimes realistically useful for an educational purpose.
Toby
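Toby's description of what the bot checks at upload time can also be spot-checked by hand while a Flickr account still exists. Below is a rough sketch of such a check against Flickr's public flickr.photos.getInfo method; the placeholder API key, the example photo ID, and the license-ID mapping are assumptions to verify against Flickr's own documentation, and a "photo not found" error is what you would expect when the source photo or account has been deleted.

import requests

API_KEY = "YOUR_FLICKR_API_KEY"  # placeholder; a real key is needed
# Assumed mapping of the Flickr license IDs that Commons normally accepts.
FREE_LICENSES = {"4": "CC BY 2.0", "5": "CC BY-SA 2.0"}

def check_flickr_photo(photo_id: str) -> None:
    """Print the current owner and license of a Flickr photo, if it still exists."""
    resp = requests.get(
        "https://api.flickr.com/services/rest/",
        params={
            "method": "flickr.photos.getInfo",
            "api_key": API_KEY,
            "photo_id": photo_id,
            "format": "json",
            "nojsoncallback": 1,
        },
        timeout=30,
    )
    data = resp.json()
    if data.get("stat") != "ok":
        # Deleted photos/accounts come back as an error, which is exactly the
        # case where the copyright status becomes questionable.
        print("Photo unavailable:", data.get("message"))
        return
    photo = data["photo"]
    owner = photo["owner"].get("username", "(unknown)")
    license_id = photo.get("license")
    print("Owner:", owner)
    print("License:", FREE_LICENSES.get(license_id, "not obviously free (id %s)" % license_id))

# check_flickr_photo("1234567890")  # hypothetical photo ID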
On Sun, Sep 4, 2011 at 10:55 PM, Sarah Stierch <sarah.stierch@gmail.com> wrote:
Hi Toby -
Sorry to be a n00b, but can you explain what you mean by "refactoring this category according to the principle of least surprise"?
For anyone else: if you find an image that has been uploaded by a Flickr bot, and the Flickr account has been deleted, what do you do? I notice that a large portion of images like this are uneducational snapshots (here is an example: http://commons.wikimedia.org/wiki/File:Labace_%2824%29.jpg). I was going to nominate it for being out of scope, because Commons is not a repository for snapshots.
;)
Asking questions like this on Commons-L isn't very pleasant, so thanks for helping!
Thanks,
Sarah
On Sun, Sep 4, 2011 at 6:48 AM, Toby Hudson tobyyy@gmail.com wrote:
I've made a start on refactoring this category according to the principle of least surprise. Feel free to do this whenever you notice a "surprising" image in a mundane category.
Regarding consent, if any of the identifiable women are in private locations, http://commons.wikimedia.org/wiki/COM:PEOPLE applies, and the uploader should state that permission was obtained to take & publish the image. If this has not been done, please either contact the uploader or propose deletion.
Toby Hudson / 99of9
LOL, at least he realizes I'm on a vendetta against crappy personal profile photos too:
http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Bio_picture...
The more people speak out against crap on Commons, the more our voices will be heard.
;)
-- GLAMWIKI Partnership Ambassador for the Wikimedia Foundation (http://www.glamwiki.org), Wikipedian-in-Residence, Archives of American Art (http://en.wikipedia.org/wiki/User:SarahStierch), and Sarah Stierch Consulting
*Historical, cultural & artistic research & advising.*
On Sun, Sep 4, 2011 at 9:20 AM, Toby Hudson tobyyy@gmail.com wrote:
Hi Sarah,
The principle of least surprise is roughly the following: People who go to a category/gallery/encyclopedia-article expecting something (shoes) should not be surprised by something they may find offensive (naked women wearing shoes).
One way to ensure this is to make clearly labelled subcategories for the potentially offensive material. In this case, I made a subcategory: http://commons.wikimedia.org/wiki/Category:Women_wearing_high-heeled_shoes and within that
http://commons.wikimedia.org/wiki/Category:Nude_women_wearing_high-heeled_sh...
so everyone who visits that category knows exactly what they're going to see in advance.
Sarah,
If you look through my contributions on Commons, you will see the way that I usually handle changing categories on images with nudity or sexually provocative content: I use a descriptive edit summary, and if I get reverted I remove the category again and leave a talk page message.
http://commons.wikimedia.org/w/index.php?title=Special:Contributions&lim...
I don't add a different category back unless I think that someone would truly want to look for the image there. Often you would need to create a new category, because one that specific does not exist.
The problem with adding back categories whose titles make it obvious that they contain sexual content is that these categories will show up in searches for the non-controversial term. The same is true of file names that combine sexual terms with household objects. For example, a search for toothbrush brings up a woman masturbating with an electric toothbrush toward the top of the search results, even though the image no longer appears in Category:Toothbrushes.
http://commons.wikimedia.org/w/index.php?title=Special:Search&search=too...
(At least the Commons link on Wikipedia will no longer take you to the image, since the image was taken out of Category:Toothbrushes.)
The categories on this image have been changed half a dozen times since then, because there are differences of opinion about how to categorize this type of image.
http://commons.wikimedia.org/w/index.php?title=File:Masturbating_with_a_toot...
In my view, the over-categorization of sexual content makes Commons look like it has more sexual content than it really has. Some of the categories that people have created border on being ridiculous. Many of them eventually get merged into a more general category. For example, File:Masturbating_with_a_toothbrush.jpg now has only two categories, and both seem appropriate.
This is getting off-topic from the core mission of the email list, so if you want more detailed help, drop me an email.
Sydney
User:FloNight
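Sydney's point about sexual file names and categories surfacing in searches for neutral terms can be checked directly against the Commons search API. The sketch below is only a quick spot-check of that effect using the standard MediaWiki API; the search term is just the example from the message above, and the script is not any existing tool.

import requests

def commons_file_search(term: str, limit: int = 20) -> list:
    """Return the first File: titles that Commons search gives for a term."""
    resp = requests.get(
        "https://commons.wikimedia.org/w/api.php",
        params={
            "action": "query",
            "list": "search",
            "srsearch": term,
            "srnamespace": 6,   # the File: namespace
            "srlimit": limit,
            "format": "json",
        },
        headers={"User-Agent": "commons-search-spot-check/0.1 (example)"},
        timeout=30,
    )
    return [hit["title"] for hit in resp.json()["query"]["search"]]

# Eyeball what actually comes back for a mundane term.
for title in commons_file_search("toothbrush"):
    print(title)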
Hi Toby :-)
You made my day by cleaning up this category!
I thought I was going to have to make room in my schedule to do it today. If you look through my contributions on Commons, you will see that for the last 18 months the bulk of my edits has been cleaning up mis-categorized controversial images. I usually only do the most outrageous ones, because there are just too many to do.
So I greatly appreciate you doing it, and I invite other people to give us a hand.
Warmest regards,
Sydney Poore User:FloNight