This was left on my talk page after I nominated a photo by this user for deletion:

http://commons.wikimedia.org/wiki/User_talk:Missvain#You_seem_to_be_unaware_of_some_few_things...

This is the photo I tagged. The Flickr account it came from no longer exists, the image isn't used anywhere, and there isn't a single bit of educational merit in it (I mean, come on..). I also love how this person tells me that licenses aren't revocable, yet doesn't link to any easily found statement explaining that if something is deleted from Flickr and the account gets deleted, we no longer have any proof of licensing from the original creator. So what does that mean?

(Did that last statement make sense? LOL)

I also love that when you tag something like this for deletion, someone basically passive-aggressively calls you a censor. Talk about offensive.

Two Wings also has this at the top of his talk page:
"After the Wikimedia/Wales/FoxNews "porn purge" affair, I'm vigilant towards possible drifts, notably in terms of abuse of power from Jimbo Wales and/or admins, but also in terms of extreme censorship and/or puritanism against sex illustrations and nude pictures."

(Adds this to her list of labels: prude, racist, sexist, homophobe, anti-sex, and high-brow editor.)

-Sarah
(Part of me wants to be snappy and tell him: enough, I can afford to actually buy my porn. :P )



On Sun, Sep 4, 2011 at 9:21 AM, Sarah Stierch <sarah.stierch@gmail.com> wrote:
Great, thank you :)

On Sun, Sep 4, 2011 at 9:20 AM, Toby Hudson <tobyyy@gmail.com> wrote:
Hi Sarah,

The principle of least surprise is roughly the following:
People who go to a category/gallery/encyclopedia-article expecting something (shoes) should not be surprised by something they may find offensive (naked women wearing shoes).


One way to ensure this is to make clearly labelled subcategories for the potentially offensive material.  In this case, I made a subcategory:
http://commons.wikimedia.org/wiki/Category:Women_wearing_high-heeled_shoes
and within that
http://commons.wikimedia.org/wiki/Category:Nude_women_wearing_high-heeled_shoes

so everyone who visits that category knows exactly what they're going to see in advance.


Regarding your Flickr question: whether the account is deleted or not doesn't usually change whether the picture is in scope, but deleted accounts do make the copyright status more questionable. At the time of upload, the bot would check that the license is correct, but that doesn't eliminate the possibility that the Flickr user is uploading copyright violations to their Flickr account ("Flickrwashing"). If there are other likely signs of copyright violation, I would nominate it for deletion (as I did for the other image mentioned in this thread: http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Young_girl_with_see-through_tops_and_shorts.jpg). When the account is still active, you can also check the rest of the Flickr user's contributions to get a good sense of whether they are really the author of the photos they're uploading.
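If you're curious what that license check amounts to, here is a rough sketch against the public Flickr API. To be clear, this is only an illustration I put together, not the actual bot's code; the API key and photo ID are placeholders, and the set of "free" license IDs reflects my assumption about which Flickr licenses Commons accepts.

import requests

FLICKR_API = "https://api.flickr.com/services/rest/"
API_KEY = "YOUR_FLICKR_API_KEY"   # placeholder - use your own Flickr API key

# Assumed Flickr license IDs acceptable on Commons
# (CC BY, CC BY-SA, no known restrictions, US Gov work, CC0, PD Mark).
FREE_LICENSE_IDS = {"4", "5", "7", "8", "9", "10"}

def flickr_license_ok(photo_id):
    """Return True if the photo still exists on Flickr with a free license."""
    resp = requests.get(FLICKR_API, params={
        "method": "flickr.photos.getInfo",
        "api_key": API_KEY,
        "photo_id": photo_id,
        "format": "json",
        "nojsoncallback": 1,
    })
    data = resp.json()
    if data.get("stat") != "ok":
        # Photo (or the whole account) is gone, so there is nothing to verify.
        return False
    return str(data["photo"]["license"]) in FREE_LICENSE_IDS

print(flickr_license_ok("1234567890"))   # hypothetical photo ID

Of course, this only tells you what the license is right now; it can't prove the Flickr user was the real author, which is why looking at the rest of their photostream matters.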

Snapshots aren't necessarily out of scope just because they're snapshots; they're sometimes realistically useful for an educational purpose.

Toby



On Sun, Sep 4, 2011 at 10:55 PM, Sarah Stierch <sarah.stierch@gmail.com> wrote:
Hi Toby -

Sorry to be a n00b, but can you explain what you mean by "refactoring this category according to the principle of least surprise"?

For anyone else: if you find an image that has been uploaded by a Flickr bot and the Flickr account has been deleted, what do you do? I notice that a large portion of images like this are uneducational snapshots (here is an example: http://commons.wikimedia.org/wiki/File:Labace_%2824%29.jpg). I was going to nominate it simply as out of scope, because Commons is not a repository for snapshots.

;)

Asking questions like this on Commons-L isn't very pleasant, so thanks for helping!

Thanks,

Sarah


On Sun, Sep 4, 2011 at 6:48 AM, Toby Hudson <tobyyy@gmail.com> wrote:
I've made a start on refactoring this category according to the principle of least surprise.  Feel free to do this whenever you notice a "surprising" image in a mundane category.

Regarding consent, if any of the identifiable women are in private locations, http://commons.wikimedia.org/wiki/COM:PEOPLE applies, and the uploader should state that permission was obtained to take & publish the image.  If this has not been done, please either contact the uploader or propose deletion.

Toby Hudson  /  99of9


On Sun, Sep 4, 2011 at 8:05 AM, Sydney Poore <sydney.poore@gmail.com> wrote:
Category:High-heeled shoes is an excellent example of the current problem WMF projects are having with creating and disseminating unbiased content.

http://commons.wikimedia.org/wiki/Category:High-heeled_shoes

This category is different from almost all the other categories about footwear because it contains many images that are not primarily examples of high-heeled shoes. Most other footwear categories contain mostly images of shoes, or of the lower leg(s) with a shoe or shoes.

The number of images in Category:High-heeled shoes is higher than in most categories about footwear. Approximately one-third of the images are full-body shots of attractive women wearing high-heeled shoes, and a significant number of them are nude or posed in sexually provocative positions.

Random women who are simply wearing shoes are mixed in with the porn stars and striptease dancers. These women are being objectified and sexualized without their consent because of the way the images are displayed in the category. See the Wikipedia article on sexualization (http://en.wikipedia.org/wiki/Sexualization) for a description of the term.

In each language that has a Wikipedia article about high-heeled shoes, the content is about a type of footwear, so the links in those articles that lead to Commons are directing people to nudity or sexual content that they would not anticipate. There are other problems with some of the images, including unclear consent from the subject for the image to be uploaded.

I see this category as a concrete example of systemic bias coming from having a male-dominated editing community.

Leather boots is the only other category I found that also has a large number of images of people. It also contains a disproportionate number of images of women who are nude or in sexually provocative poses.

I think it is important to continue to talk about these issues in the hope that more people will become educated about the problems with our current methods of collecting, categorizing, and disseminating content.

Sydney Poore
User:FloNight


_______________________________________________
Gendergap mailing list
Gendergap@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/gendergap



--
GLAMWIKI Partnership Ambassador for the Wikimedia Foundation
Wikipedian-in-Residence, Archives of American Art
and
Sarah Stierch Consulting
Historical, cultural & artistic research & advising.
------------------------------------------------------
http://www.sarahstierch.com/