On Thu, May 31, 2012 at 4:44 AM, Laura Hale laura@fanhistory.com wrote:
On Thu, May 31, 2012 at 1:38 PM, Kim Osman kim.osman@qut.edu.au wrote:
My first thought was that this indeed is a red herring in terms of addressing the gendergap, however in my limited editing experience I do at times feel like Wikipedia is a boys' club, and perhaps the prevalence of pornography goes some way to an imagining of what is hanging on the clubhouse walls
Hi,
I edit Wikipedia a lot. I probably spend more time than I should editing Wikipedia. Can I ask where there is a prevalence of pornography on Wikipedia? I honestly can't think of a single time I have come across it when I wasn't directly looking for it. Misogyny to a degree, yes. Discrimination against women's topics and topics outside the United States, you betcha. But pornography? Maybe I just don't edit articles where pornography is very prevalent?
Here are results of a multimedia search for "human female" in Wikipedia (NSFW):
http://en.wikipedia.org/w/index.php?title=Special:Search&limit=250&o...
Did you look at the examples Larry mentioned in his post?
There are many more: e.g.
http://en.wikipedia.org/wiki/Deep-throating, viewed more than 50,000 times this month (this actually had three rather than two images until a couple of days ago: http://en.wikipedia.org/w/index.php?title=Deep-throating&oldid=494580914)
http://en.wikipedia.org/wiki/Tit_torture (16,000+ views this month)
http://en.wikipedia.org/wiki/Bukkake (120,000+ views this month)
Basically, if you go through the articles listed in en:WP templates like the sexual slang template, the Outline of BDSM template etc. you will come across many such articles, all with high viewing figures.
An example from de:WP: http://de.wikipedia.org/w/index.php?title=Vaginalverkehr&oldid=97830340
Source: http://www.flickr.com/photos/46879013@N03/4414846436/ http://www.flickr.com/people/46879013@N03
The Flickr account has been closed down (usually for breach of Flickr's terms of service). Note that there are no 18 USC 2257 records demonstrating that the persons depicted were 18 or over. According to my understanding of US law, any Wikimedian who uploads or inserts such an image without having documentation of model age, name, and publication consent is in breach of US law; see discussion at http://meta.wikimedia.org/wiki/User_talk:Philippe_(WMF)#Implications_of_2257...
Andreas
On 31 May 2012 09:23, Caroline Becker carobecker54@gmail.com wrote:
Does it also apply to artwork of nude minors, such as http://commons.wikimedia.org/wiki/Category:Boy_playing_jonchets_by_Julien-Ch... or the trillion paintings with nude babies?
No, because that's not a sexual image (which is what 2257 relates to).
Tom
No. Record-keeping is required by law for images whose production involved actual people engaged in sexually explicit conduct, meaning "actual or simulated—(i) sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; (ii) bestiality; (iii) masturbation; (iv) sadistic or masochistic abuse; or (v) lascivious exhibition of the genitals or pubic area of any person."
http://www.law.cornell.edu/uscode/text/18/2256
If creation of the image did not involve real people engaged in such conduct, no record-keeping requirements apply.
Note that while the Wikimedia Foundation, due to Section 230(c) safe harbor provisions, does not have a record-keeping duty here, my layman's reading of http://www.law.cornell.edu/uscode/text/18/2257 is that every *individual contributor* who
– uploads an image depicting real people engaged in sexually explicit conduct, or
– inserts such an image in Wikipedia, or
– manages such content on Wikimedia sites,
thereby becomes a "secondary producer" required to keep and maintain records documenting the performers' age, name, and consent, with failure to do so punishable by up to five years in prison.
Note that this includes anyone, say, inserting an image or video of masturbation in a Wikipedia article or categorising it in Commons without having a written record of the name, age and consent of the person shown on file.
I've asked Philippe Beaudette to confirm that this reading is correct. He has said that while they cannot provide legal advice to individual editors, they will put someone to work on that, and that it will be a month or so before they can come back to us.
Andreas
Let me try and give the whole context here. Actually, the Wikipedia article[1] on this subject explains the situation much better. I'm sure finer legal minds reading this can correct me where I go wrong. I am a layman too, and this is my inference from reading about the subject.
The law you are speaking of is part of the Child Protection and Obscenity Enforcement Act of 1988, and the regulations enforcing it are the 2257 Regulations. It placed the burden of record-keeping on primary producers, that is, anyone "involved in hiring, contracting for, managing, or otherwise arranging for, the participation of the performers depicted". In its original form, it only placed the burden of record-keeping on producers of pornographic material.
Now, things got complicated when the DOJ added the entirely new class of producers you speak of, "secondary producers": anyone who "publishes, reproduces, or reissues" explicit material. What followed was a circuit court decision, and other proceedings, ruling that these requirements were facially invalid because they imposed an overbroad burden on legitimate, constitutionally protected speech.
The real question now becomes enforcement. Much of the sexual material on the internet, even depictions of works of art several hundred years old, and any form of nudity even for educational or anatomical purposes, might fall under this law ("lascivious exhibition of the genitals or pubic area of any person"). The burden on service providers and hosting websites would be massive: consider the implications for Facebook, for example, or Flickr, or even Google, which would be responsible for every single image linked in its results; they don't possess the proper records for the depicted subjects, who might very well number in the tens of millions. Maybe that's why it has been enforced in only one prominent case based primarily on the new 2257 law and related legislation: the case against Joe Francis, the originator of the "Girls Gone Wild" series. Also of relevance might be that the series in question only depicted nudity, not any sexual act. Even those charges were for the most part dropped later on.
Regards Theo
That's pretty important then, right? Because IIRC circuit court decisions inform judgement in later such cases - and the only way the legal interpretation can be rejudged is in a full appeals court?
Tom
That can be true, but there are 13 circuits and a decision in one has force only within its own jurisdiction. In any case, it's clear that Wikimedia is not held to these rules, but that's rather beside the point. We should *want* this information, whether we are required to have it or not.
Nathan
Tom, the Wikipedia article continues as follows (my emphases):
---o0o---
On October 23, 2007, the 6th Circuit U.S. Court of Appeals ruled the federal record-keeping statute unconstitutional, holding that the law is overly broad and facially invalid.[1] The Sixth Circuit subsequently *reheard the case en banc and issued an opinion on February 20, 2009, upholding the constitutionality* of the record-keeping requirements, albeit with some dissents.[3]
Proposed regulations
On July 12, 2007, the Department of Justice issued a preliminary set of addendum record-keeping regulations based on the Walsh Act amendments onto the existing regulations at 25 C.F.R. pt. 75.[16] These new regulations are meant to encompass the inclusion of simulated sexual actions that do not actually show explicit sexual contact or fulfillment that were included by the Adam Walsh Act that was signed into law in 2007.
These new regulations were allowed in actual legal enforcement by the *dismissal of its constitutionality challenges* by U.S. District Judge Michael Baylson on July 28, 2010,[17] as the US Supreme Court had already refused to hear the same challenge in 2009.
Court affirmation of 2257 and 2257A
After the July 2010 decision by U.S. District Judge Michael Baylson to dismiss the FSC's lawsuit per the request of US Attorney Eric Holder's DOJ, agreeing that USC 2257 and 2257A regulations are constitutional,[18] the FSC then filed an additional appeal to amend their original challenge to the constitutionality challenge.[19]
On Monday, September 20, 2010, Judge Baylson rejected FSC's amended appeal, allowing the government record-keeping inspections to be restarted.[20]
The FSC stated that they would appeal the case to the Third Circuit of Appeals if needed.
---o0o---
Also, remember that this is a Wikipedia article. We'd be better off looking at the cited sources.
Nathan said, "That can be true, but there are 13 circuits and a decision in one has force only within its own jurisdiction. In any case, it's clear that Wikimedia is not held to these rules, but that's rather beside the point. We should *want* this information, whether we are required to have it or not."
Please understand that there is a difference between what the Wikimedia Foundation is held to, and what contributors are held to. Editors are not the Foundation, unless they are employees.
To give two examples which will hopefully make this clear:
1. If Joe Smith uploads child pornography on Facebook, Facebook and individual Facebook employees are in the clear. They do not need age and consent records to host this material on their servers. They are protected by Section 230(c). But Joe Smith goes to jail.
2. If Joe Smith posts defamatory statements in Jack Smith's Wikipedia biography, the Wikimedia Foundation is not legally liable for defamation. Joe Smith, however, is and can be sued.
It's, potentially at least, exactly the same with 2257 record-keeping requirements.
Of course I agree with Nathan's main point: the Wikimedia Foundation should not accept sexually explicit material without the uploader providing a copy of the required documentation. It's the professional, best-practice thing to do. Not least to protect its volunteers and third-party reusers from potential legal liability.
And if anonymous uploads dry up, then support a photography project with professional porn performers to create high-quality media for sex education: media with proper lighting, and with the proper records, made available to all reusers. The Foundation took $20m last year, ten times as much as just a few years ago. There should be money for a grant for such a project.
From that point onwards, anonymous uploads of revenge porn or people wanking in their bathrooms can just be deleted on sight.
On Thu, May 31, 2012 at 2:10 PM, Theo10011 de10011@gmail.com wrote:
Now, things got complicated when the DOJ added the entirely new class of producers you speak of, "secondary producers": anyone who "publishes, reproduces, or reissues" explicit material. What followed was a circuit court decision, and other proceedings, ruling that these requirements were facially invalid because they imposed an overbroad burden on legitimate, constitutionally protected speech.
The real question now becomes enforcement. Much of the sexual material on the internet, even depictions of works of art several hundred years old, and any form of nudity even for educational or anatomical purposes, might fall under this law ("lascivious exhibition of the genitals or pubic area of any person").
Theo, that is completely wrong. Record-keeping requirements only apply to images in which models actually engaged in sexually explicit conduct, and moreover, they only apply to images created from 1990 onward.
The burden on service providers and hosting websites would be massive: consider the implications for Facebook, for example, or Flickr, or even Google, which would be responsible for every single image linked in its results; they don't possess the proper records for the depicted subjects, who might very well number in the tens of millions.
Again, that is completely wrong. Facebook and the Wikimedia Foundation are already protected by 230(c) safe harbor provisions. Responsibility lies with the individual uploader or editor, who enjoys no such protection but is fully liable for their own actions.
Maybe that's why it has been enforced in only one prominent case based primarily on the new 2257 law and related legislation: the case against Joe Francis, the originator of the "Girls Gone Wild" series. Also of relevance might be that the series in question only depicted nudity, not any sexual act. Even those charges were for the most part dropped later on.
The thing is: Wikimedia keeps edit histories and contributions lists for decades. We have no idea what implementation of US law will look like in five or ten years' time, given political vagaries.
On Thu, May 31, 2012 at 6:48 PM, Andreas Kolbe jayen466@gmail.com wrote:
Theo, that is completely wrong. Record-keeping requirements only apply to images in which models actually engaged in sexually explicit conduct, and moreover, they only apply to images created from 1990 onward.
I reread the information I based my conclusion on. I must be missing some part that you are referring to. Even the definition of "secondary producers", and its distinction from "primary producers", is linked and explained at length in articles related to the 2257 legislation.
You gave the definition of sexually explicit content from Cornell Law that I based my understanding on; "(v) lascivious exhibition of the genitals or pubic area of any person" is a huge definition that covers a lot of content, regardless of its purpose.
The single prosecution based on that law has been against the "Girls Gone Wild" producer, and he wasn't charged for any form of sexually explicit content beyond depiction of "genitals or pubic area" and a lack of proper record-keeping.
Again, that is completely wrong. Facebook and the Wikimedia Foundation are
already protected by 230(c) safe harbor provisions. Responsibility lies with the individual uploader or editor, who enjoys no such protection but is fully liable for their own actions.
Herein lies the burden on the service providers. The complete record of the uploader/editor/individual would lie with the service provider. Edit histories and contribution lists provide next to no information on the real-world identities of editors, not even where they are located. The only possible link, even to the country of residence, would be information available only to the service provider in the form of an IP address.
Outside prosecutors cannot prosecute or charge any editor based on their username, whether it's User:someguy542 or User:Ladiesman232; there is no real-world link without the IP records. That is where the burden falls on the service provider. In order to prosecute and get that information legally, the burden passes to the provider. This actually happens all the time: requests are made by prosecutors and law enforcement to legally obtain the information needed to prosecute, just not in cases of record-keeping violations. There have been new developments related to this (IP addresses not directly establishing culpability for actions, or not implicating the individual paying for the connection), but that's another discussion.
The thing is: Wikimedia keeps edit histories and contributions lists for decades. We have no idea what implementation of US law will look like in five or ten years' time, given political vagaries.
Yes, but those edit histories and contribution lists are useless in their public form, more so in Wikimedia's case than, say, Facebook's. The only relevant information is the IP address, which again opens up a whole new door of privacy and how far Wikimedia would go to defend or infringe on it. I really don't know how long CU information is retained vs. edit histories, but if they are indeed kept for decades, it might turn into a liability. Given the speed at which IP addresses change, and even entire blocks move, that history might become unusable in one or two years.
Regards Theo
*I really don't know how long CU information is retained *
3 months. All CU logs are kept for 3 months. _____ *Béria Lima* http://wikimedia.pt *Imagine a world in which every single person is given free access to the sum of all human knowledge. Help us build that dream. http://wikimedia.pt/Donativos*
On Thu, May 31, 2012 at 2:43 PM, Theo10011 de10011@gmail.com wrote:
Outside prosecutors cannot prosecute or charge any editor based on their username, whether it's User:someguy542 or User:Ladiesman232; there is no real-world link without the IP records.
Firstly, that's not the sort of reasoning a charitable foundation should rely on. It makes for bad PR.
Secondly, it is often relatively trivial to identify people. You'll remember that the person who posted the Seigenthaler hoax was identified from his IP, and lost his job (I think he got it back afterwards, when Seigenthaler took pity on him and spoke to his employer). Furthermore, many established Commonists and Wikipedians either disclose their real names on mailing lists and/or their user pages, have pictures of themselves on Commons from Wikimania or other Wikimedia events, or are otherwise trivially identifiable. Take the recent Beta M case, for example.
Yes, an anonymous uploader who made only one edit from an Internet café may escape scrutiny. Although the other day I came across one uploader who had inadvertently uploaded geolocation data from his mobile phone along with his image, identifying the precise street address of the bedroom in Germany where the image was taken ... many mobile phones these days include geolocation in their metadata.
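For anyone curious how such metadata is found, here is a minimal, illustrative sketch in Python, assuming the Pillow library; "photo.jpg" is just a placeholder file name, and it only reads the standard EXIF GPSInfo block of a JPEG. Commons itself shows the same EXIF fields in the metadata table on a file's description page.

    from PIL import Image
    from PIL.ExifTags import GPSTAGS

    GPSINFO_TAG = 34853  # standard EXIF tag id for the GPSInfo block

    img = Image.open("photo.jpg")        # placeholder file name
    exif = img._getexif() or {}          # classic JPEG EXIF accessor; None if no EXIF
    gps_raw = exif.get(GPSINFO_TAG)
    if gps_raw:
        # map numeric GPS tag ids to readable names
        gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}
        print("Embedded GPS data:", gps.get("GPSLatitude"), gps.get("GPSLongitude"))
    else:
        print("No GPS metadata found.")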
On Thu, May 31, 2012 at 7:43 PM, Andreas Kolbe jayen466@gmail.com wrote:
Firstly, that's not the sort of reasoning a charitable foundation should rely on. It makes for bad PR.
A charitable foundation? Bad PR? Every OSP relies on this, and it is an actual, often-used legal defense. YouTube uploaders don't use their real names, their social security numbers, or even their country of residence. Same with Facebook: in a legal proceeding, an FB page alone is not enough to link an identity for an entire case. There are posters on this list not using their real names; is it bad PR to say you cannot legally link their real identities to their email addresses without going through their email provider?
Secondly, it is often relatively trivial to identify people. You'll remember that the person who posted the Seigenthaler hoax was identified from his IP, and lost his job (I think he got it back afterwards, when Seigenthaler took pity on him and spoke to his employer). Furthermore, many established Commonists and Wikipedians either disclose their real names on mailing lists and/or their user pages, have pictures of themselves on Commons from Wikimania or other Wikimedia events, or are otherwise trivially identifiable. Take the recent Beta M case, for example.
You must have some superpowers to identify people on Commons without their IP info, then; most of us need CU, and even that is not conclusive or linked to any real identity. I don't know a single thing about any editor that they don't choose to mention on their user page. A chain of identification that weak, yet strong enough to prosecute someone and stand up to legal scrutiny, is something totally different.
If you want, I can list 100 users on Commons and en.wp; please identify them for me if it is that trivial. I have talked to some of them for a couple of years, but I still can't tell you anything about them: no names, gender, or location. I'm usually surprised to learn those things when they are actually revealed; I never thought of someone's real identity and personal information as trivial to figure out.
Even in the case you mention, the only identifiable information was his IP, and the OSP is usually the only one with access to that information.
Yes, an anonymous uploader who made only one edit from an Internet café may escape scrutiny. Although the other day I came across one uploader who had inadvertently uploaded geolocation data from his mobile phone along with his image, identifying the precise street address of the bedroom in Germany where the image was taken ... many mobile phones these days include geolocation in their metadata.
Actually, most smartphones have the option to add geolocation data to the metadata; users can turn it on or off in their settings. Most people already upload geolocation data along with their smartphone images without knowing about it. It is actually very, very common.
Regards Theo
This may be an interesting tangent, but it doesn't really bear on the responsibility of Wikimedia or its projects. While others may have both legal and moral obligations, Wikimedia certainly has moral obligations with or without potential legal liability. The legal arguments are just a smokescreen.
~Nathan
Okay, I'm going to try to redirect this thread a bit from the long, drawn-out discussion about legal requirements for model releases of explicit images (and the related record-keeping), because I think that is only one small aspect of the issues.
I agree with those who say there is a low risk of people accidentally finding images of an explicit nature in Wikipedia articles that are not directly related to those subjects. I do agree that at least some Wikipedia projects seem to have a disproportionately large collection of such articles, and that some of them are poorly named or identified, so that someone looking up a term that is used both in relation to a non-sexual topic and a sexual topic may get a bit of a surprise, and that needs to be addressed.
On the Commons side of things, I think there has been an over-aggressive campaign to extract "license compliant" images from Flickr and other non-WMF repositories that include subjects who were very unlikely to know that their image was going to be made available on Commons. I believe that whoever uploads those images to Commons has a personal responsibility to verify that all of the subjects in those images were aware of, and agreed to, the licensing terms. I also believe that it should become part of the process that prior to uploading such images, the person uploading to Commons confirms with the Flickr uploader that the terms of the license are correct, and that there are suitable model releases where applicable.
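The license part of that check can at least be partly automated. Here is a minimal, illustrative sketch in Python, assuming the requests library and a placeholder API key; it simply asks the Flickr API (flickr.photos.getInfo) which license id it currently reports for a photo, and says nothing about model releases or consent, which would still have to be confirmed with the uploader.

    import requests

    FLICKR_API = "https://api.flickr.com/services/rest/"
    API_KEY = "YOUR_FLICKR_API_KEY"   # placeholder; a real key is required

    def flickr_license_id(photo_id):
        """Return the license id Flickr currently reports for a photo."""
        r = requests.get(FLICKR_API, params={
            "method": "flickr.photos.getInfo",
            "api_key": API_KEY,
            "photo_id": photo_id,
            "format": "json",
            "nojsoncallback": 1,
        })
        r.raise_for_status()
        return r.json()["photo"]["license"]

    # Example usage (hypothetical photo id):
    # print(flickr_license_id("1234567890"))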
Let's not worry so much about what courts have decided, and pay more attention to developing best practices within our own projects.
Risker/Anne
Agreed. Most of these are from Flickr, for example:
http://commons.wikimedia.org/wiki/Special:ListFiles/Handcuffed
Going by past experience, the Flickr account holders are quite likely unaware of these uploads.
This has always been one of my concerns about the superordination of free licensing in our image policy, both on Commons and enwiki. Any other issues with the image are downplayed in favor of archiving all the free images possible. I am not sure, for instance, that many of the Flickr users whose pictures have been used are quite aware of what the CC license means. Some of them seemed to think at one point that it was the only way to make their pictures publicly viewable, or did so because of peer pressure to do this good and cool thing without really understanding the legal implications.
I have often wondered what we would do if confronted with a situation where there was a notable person with plenty of good-quality copyrighted images, but the only free one was rather unintentionally revealing (upskirt, say) while still showing their face. Could some editors insist on using one of the copyrighted images in that case, even though the NFCC would not allow it because an equivalent free image was available?
Daniel Case
I think this comment completely misses the point. Yes, if you go to articles on "deep throating" or "tit torture", you will, surprise surprise, see images of those things. I don't see this as a big problem. The problem would be if the same images were showing up on articles unrelated to sexuality, but that does not appear to be the case at all from the examples that you and Larry Sanger have put forward.
I'm not convinced that sexual images are a gender gap issue. But my non-expert opinion is that there is, or ought to be, a degree of feminist interest in the problems of model releases and age verification. I've always thought it strange that Andreas, and privatemusings before him, focused primarily on the very low probability that someone might accidentally stumble onto sexual images... to the near exclusion of the far more important problem, to me, of hosting potentially thousands of images where the subject is unknown, unaware of the publication of the image, and did not give (and would not have given) permission for such publication. For most images on Commons of a sexual nature there is no model release and no age verification, but despite the Board resolution and the lip-service paid to personality rights on Commons, there have been only minimal efforts to rectify this problem.
Nathan, I agree with you that the consent issue is a huge problem. Wikimedia is allowing people to upload revenge porn (= sexual images of ex-partners) anonymously, without models' knowledge or consent, and editors then use this kind of material to illustrate articles.
Editors are pinching hundreds of private sexual images off Flickr and uploading them to Wikimedia sites without asking Flickr account owners for consent, in violation of the board resolution.
The March/April thread on personality rights I started on the Commons list was exactly about that:
http://lists.wikimedia.org/pipermail/commons-l/2012-March/006409.html
Even after that post it took over a month to get these images deleted, after a total of six or seven deletion nominations, even though Commons *knew all along* that the models did not want these images on Wikimedia.
Commons has images here
http://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Lesbic_use_...
from an uploader who has written on Commons,
First of all, I am the photographer of this photo [dianaoftripoli]. I'm not even sure WHY this photo is on Wikimedia. the photo was posted on my Flickr account. This is in violation of how I want the photo to be used, so I do want it to be taken DOWN. For the record, no one involved in that project was underage. This conversation is completely idiotic. It was a college final project and of course it was taken with a high quality camera and of course it doesn't match my normal life because it is ART. You're all crazy. REMOVE this photo from this site and all others that I have taken. If you need to contact me, contact me directly via Flickr. Do NOT publish any more of my photos on another site WITHOUT my consent. PERIOD. FURTHERMORE, your posting of my photography AND COMMENTARY are in VIOLATION of my PRIVATE life and those who are in the photographs. You all should be ASHAMED. Bunch of speculative meddlers. Find something better to do and respect other people's privacy.
http://commons.wikimedia.org/w/index.php?title=Commons%3ADeletion_requests%2...
and Commons is STILL refusing to delete her images. I did my best to get them deleted, bringing them to the attention of the Wikimedia UK chair, who nominated them for deletion. To no avail (well, one of the images was deleted; it was a simulated image of a naked woman having her throat cut in a bathtub). Any help on consent issues is very much appreciated, Nathan.
For a list of current nudity and sexuality-related Commons deletion requests, see http://commons.wikimedia.org/wiki/Category:Nudity_and_sexuality-related_dele...
I am sorry – this thread may now actually be in danger of derailing the discussion. If people want to debate this further, but consensus is that the discussion should take place elsewhere, we could perhaps create a page on Meta.
Andreas
On Wed, May 30, 2012 at 11:44 PM, Laura Hale laura@fanhistory.com wrote:
I edit Wikipedia a lot. I probably spend more time than I should editing Wikipedia. Can I ask where there is a prevalence of pornography on Wikipedia? I honestly can't think of a single time I have come across it when I wasn't directly looking for it.
Encyclopedias (Wikipedia language editions) have increasingly comprehensive articles /about/ pornography, with illustrations. But as you note, you won't come across it unless you are looking for it.
Commons has a large collection of orphaned images which are not used in any articles or on any other Wikimedia project, but are well-categorized. This long tail of images shows up in many searches (sometimes at the top, as in Andreas's examples), and can be surprising: macabre or pornographic or sacrilegious or circus images that happen to include a trout may show up in a search for 'trout'. This is how most people run across undesired images of all kinds: a media search on their home wiki, which happens to search Commons media as well.
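For anyone who wants to see that long tail for themselves, here is a minimal, illustrative sketch in Python, assuming the requests library. It runs a search restricted to the File: namespace directly against the Commons API (a home-wiki multimedia search behaves similarly, but this queries Commons itself); "trout" is just the example term from above.

    import requests

    COMMONS_API = "https://commons.wikimedia.org/w/api.php"

    def media_search(term, limit=20):
        """Return file titles matching a search term on Commons (namespace 6 = File)."""
        r = requests.get(COMMONS_API, params={
            "action": "query",
            "list": "search",
            "srsearch": term,
            "srnamespace": 6,      # restrict the search to the File: namespace
            "srlimit": limit,
            "format": "json",
        })
        r.raise_for_status()
        return [hit["title"] for hit in r.json()["query"]["search"]]

    # Example usage:
    # print(media_search("trout"))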
SJ, coulrophobic
I wanted to ask a question of the members of the list:
Is all pornography inherently bad and against women, perhaps anti-feminist? Does it degrade women just by its sheer existence? Are there women who either a) don't have strong opinions on it, or b) are supportive of some form of it?
For the record, most forms of nudity, erotica, paintings, books, even video games, are capable of being classified as pornographic.
I agree with Laura.
Even in pornography-related articles, I've rarely seen discussion that was characteristic of a "boys' club", degrading or objectifying women in any shape or form. The impression here might be that it's all teenagers working out their fantasies on less visible pages, but that is hardly the case. There are few active editors that only edit a single topic or interact with only one subset of the ecosystem; the idea that they constantly mask and carry around their hateful misogynistic tendencies, only to let loose on pornography articles, is just plain wrong.
Pornography has always had three critics: law, religion and feminism. In this age, coloring all three with the same generalized brush-stroke would be a mistake; opinions mature and change over time, and tolerance increases in all three spheres. Law had its problem with pornography, mostly descended from centuries-old common law, until people started realizing they don't have to be bound by the morality of old dead white men from 300 years ago and could decide for themselves. The same law, in its vague interpretation, outlawed homosexuality and the existence of homosexuals in half of the world, and it still does. Religion had its problem with pornography, but then we came out of the dark ages; art, even iconic religious art, flirted with the boundaries of morality. The Renaissance happened, with an explosion of culture and light and beauty. Would Michelangelo's David have been pornographic in its age, or does it speak to more tolerance than you might find even today? Would it have mattered if it were Aphrodite or The Birth of Venus being ridiculed today? Adherence to certain practices decreased and cultural tolerance increased; we just seem to be moving backwards in some cases. Then there is the feminist movement, where once all pornography was characterized as harmful and as objectification of women, until there were dissenting voices, the sex-positive feminist movement for example. I once heard a plausible argument about the role pornography played in the sexual revolution for women that led to women's lib in the US. I've also heard that the strongest critics within the feminist movement would be equally if not more critical of censorship, which, incidentally, is often suggested on this list as a solution.
Regards Theo
I've found this line of dialogue interesting but have hesitated to participate. When I first started editing Wikipedia, I arrived with a goal to bring some balance to many of the articles pertaining to domestic and international human trafficking and pornography. I soon realized that pornography and closely aligned topics were very heated. I encountered vulgar language, gender discrimination, objectification of women, and a less than hospitable environment that taught everybody to refrain from being dicks. I left for three years with no plans to return.
My professional background includes speaking before local, state, and national legislative commissions and government houses on these issues, in addition to obscenity and the secondary harmful effects of pornography. I come from a long line of preachers, judges, and family members that are serving as city mayors, county commissioners, a US Senator, and state legislators. At the same time, I have many close friends that currently write, produce, and star in adult films. Then there are my stripper and hooker friends. I also work with global agencies and government officials to assist individuals escaping human trafficking situations throughout Southeast Asia, Western Europe, and North America. This is my area of expertise. And the area of my life that I have long maintained separately from Wikipedia.
While I say this hesitantly, I am one example of an editor that left due to the divide between the genders represented on Wikipedia.
All that said, there is a lack of knowledge and ability on Wikipedia to differentiate between pornography and obscenity. Pornography is defined as erotic content or material that is intended or created to cause sexual arousal or excitement. That said, erotic content that depicts or displays sexual organs, sexual intercourse, or sexual acts *may not* always be defined as pornography. This is the case with content and materials presented for *educational purposes*.
(In the US, outside of child pornography, pornography may only be regulated based on the identified secondary harmful effects on the community in which it is created and/or distributed.)
In the US, obscenity can be legislated according to local, regional, state laws. It is up to each community to determine what constitutes obscenity. And these laws can often change over the years, based on the norms of the individuals that vote to pass or fail the proposed regulations. At the same time, obscenity is defined differently throughout the world from one country and culture to the next.
Due to the global nature of Wikipedia, I doubt that we will ever be able to establish guidelines regarding the presence of pornography. The rule of thumb is that which is determined to be educational. This differs from one person and one culture to the next. What one Wikipedian may find obscene, another may not. This can only be determined by the community. Is an image merely presented to bring shock and awe? Entice? Arouse? Or is it presented for educational purposes? Heck, even an image of arousal may be presented for educational purposes. The issue of pornography can really only be determined on a case by case basis.
As I earlier stated, I left Wikipedia for three years due to the vulgarity and discrimination against women. I returned because I enjoy writing during my spare time. Wikipedia is reflective of our global culture, no matter where you choose to spend your time. When it comes right down to it, if I don't want to see it, as in my daily life, all I have to do is stay out of the Wikipedia red light district.
Cindy
On Thu, May 31, 2012 at 4:56 PM, Kim Osman kim.osman@qut.edu.au wrote:
I totally agree with you - I have never come across anything remotely offensive in the course of editing or browsing. What I was trying to say is that rather than being a reason more females don't edit Wikipedia (and perhaps here my use of the word prevalence was wrong), the presence of certain types of pornography on Wikipedia contributes to the culture which results in the instances of misogyny and discrimination you note. So I do see the editorial decisions made around the type of content Larry Sanger referenced as being part of a wider conversation about female participation.
Cheers, Kim
Cindy and Kim, thank you both for your messages and perspectives -- it's always nice to hear from newer voices on the list, especially in discussions that tend to get heated and dominated by just a few people. And Kim, welcome to Wikipedia, and Cindy, welcome back -- and happy writing :)
-- phoebe