If anyone ever needs a good example of the locker-room environment on Wikimedia Commons, I just came across this old deletion discussion: https://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Radio_butt...
The last two keep votes are especially interesting. One need look no farther than the current Main Page talk page for more of the same (search for "premature ejaculation").
Kaldari
Ryan, thanks for bringing this up for discussion. I've put a lot of thought into the series of photos this comes from over the years, and it's well worth some discussion. I'd like to hear what others think about this. Here is a link to the category for the larger collection; warning, there's lots of nudity and sexual objectification here, so don't click if you don't want to see that: https://commons.wikimedia.org/wiki/Category:Nude_portrayals_of_computer_tech...
First, I agree with Ryan that in the (various) deletion discussions I've seen around this and similar topics, there is often a toxic level of childish and offensive comments. I think that's a significant problem, and I don't know what can be done to improve it. Scolding people in those discussions often backfires, and serves only to amplify the offensive commentary. But silence can imply tacit consent. How should one participate in the discussion, promoting an outcome one believes in, without contributing to or enabling the toxic nature of the discourse? I think I've done a decent job of walking that line in similar discussions, but I'm sure there's a lot of room for better approaches. I would love to hear what has worked for others, here and/or privately.
Also, my initial reaction to these images is that they are inherently offensive; my gut reaction is to keep them off Commons.
But after thinking it through and reading through a number of deletion discussions, the conclusion I've come to (at least so far) is that the decision to keep them (in spite of the childish and offensive commentary along the way) is the right decision. These strike me as the important points:

* We have a collection of more than 20 million images, intended to support a wide diversity of educational projects. Among those 20 million files are a great many that would be offensive to some audience. (For instance, if I understand correctly, *all images portraying people* are offensive to at least some devout Muslims.)
* Were these images originally intended to promote objectification of women? To support insightful commentary on objectification of women? Something else? I can't see into the minds of their creators, but I *can* imagine them being put to all kinds of uses, some of which would be worthwhile. The intent of the photographer and models, I've come to believe, is not relevant to the decision (apart from the basic issue of consent, covered in the next point).
* Unlike many images on Commons, I see no reason to doubt that these were produced by consenting adults, and intended for public distribution.
If they are to be deleted, what is the principle under which we would delete them? To me, that's the key question. If it's simply the fact that we as individuals find them offensive, I don't think that's sufficient. If it's out of a belief that they inherently cause more harm than good, I think the reasons for that would need to be fleshed out before they could be persuasive.
Art is often meant to be provocative, to challenge our assumptions and sensibilities, to prompt discussion. We host a lot of art on Commons. On what basis would we delete these, but keep other controversial works of art? Of course it would be terrible to use these in, for instance, a Wikipedia article about HTML syntax. But overall, does it cause harm to simply have them exist in an image repository? My own conclusion with regard to this photo series is that the net value of maintaining a large and diverse collection of media, without endorsing its contents per se, outweighs other considerations.
(For anybody interested in the deletion process on Commons, the kinds of things that are deliberated, and the way the discussions go, you might be interested in my related blog post from a couple months ago: http://wikistrategies.net/wikimedia-commons-is-far-from-ethically-broken/ )
-Pete [[User:Peteforsyth]]
Personally, I don't think it's worth having a discussion here about the merits of deleting these images. There's no chance in hell they are going to be deleted from Commons. What I'm more interested in is the locker-room nature of the discussions and how/if this can be addressed, as I think that is actually more likely to dissuade female contributors than the images themselves.
Ryan Kaldari
Totally reasonable, and I agree that would be a useful discussion. Not that anybody needs my permission, but please feel free to disregard the parts of my message that don't relate to this -- and sorry if it was an unwanted distraction.
For the discussion you're suggesting, it might be worthwhile to review the behavior-related policies and guidelines on Commons. It might be fruitful to develop, seek consensus around, and begin enforcing one or more new guidelines related to this: https://commons.wikimedia.org/wiki/Template:Commons_policies_and_guidelines In my experience, it tends to be a small number of users who engage in this sort of thing, and if the behavior can be clearly and dispassionately described, it might be possible to chip away at the culture that makes it seem acceptable.
A big project, but a worthy one.
Pete [[User:Peteforsyth]]
As long as they aren't in articles (or at least not in the articles most women are likely to end up at), most women are unlikely to see them and be dissuaded by that aspect of editing.
Constantly reminding women they exist through this list or the Gender Gap Task Force probably would be more of a turn off.
On the other hand, having a separate list which will, among other things, post notices of all such AfDs for those likely to want to AfD them might help get rid of some of the worse ones. And it might raise the consciousness of at least a few guys as to just how tacky they are. (I might join it for a while, but there's only so much one can take!)
Another idea is to start a "Stupid sexist Wikicommons upload of the week (or day)" page - or, more likely, an off-wiki blog - and make sure Wikicommons people all know about it. At least it would be evidence that some in the wiki community are fed up with it, and it would make it generally easy to AfD the most gratuitous images. It could even be a Facebook page, with text making it clear that LIKE means you think it's stupid and should be the "Stupid sexist upload of the Day/Week" - or whatever it might be called...
Who knows, it might make a lot more women interested in Wikimedia projects (or not?)
Finally, let's try to post only things from the past year. Who knows, maybe all those guys' consciousnesses have been raised 3% since we all started talking about these issues and the media started covering them, and we might actually have improved things a bit since that 2011 posting :-) https://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Radio_butt...
CM
I agree that offensiveness is in the eye of the beholder. And while there may be all manner of very niche groups who find strange things offensive - maybe some people object to seeing refrigerators or reading about cakes - nonetheless we know that there are a lot of widespread categories of offensiveness that generate the bulk of discussions about the inclusion of items on Wikipedia or Commons.
What we could do is have some system of classification (like the movies have) for articles, images, and/or categories, indicating that they are potentially offensive for various reasons. Perhaps along similar lines to the "content advisories" in IMDB, e.g.
http://www.imdb.com/title/tt0295297/parentalguide?ref_=tt_stry_pg
People could then put in their profiles either that all classifications are acceptable to them, or which classifications they don't want to see (e.g. Sex and Nudity, Gore and Violence, Profanity, etc - obviously our classifications might not be identical to IMDB's, as we are dealing with different kinds of content, but you get the idea). When that person searches Wikipedia or Commons, the articles, images and categories that they would find offensive are not returned. When a person reads an article containing an image categorised as offensive-to-them, it is simply not displayed, or is replaced by a placeholder saying "Suppressed at your request (Sex and Nudity)". We could possibly bundle these finer classifications into common collections, e.g. Inappropriate for Children, Suitable for Muslims, or whatever, so for many people it's a simple tick-one-box.
For anonymous users or users who have not explicitly set their preferences, rendering of an article or image could first ask "This article/image has been tagged as potentially offensive for SuchAndSuch reason, click OK to confirm you want to view it". If they are a logged-in user, it could also offer a link to set their preferences for future use.
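To make the mechanics concrete, here is a very rough sketch in Python of what I have in mind. The names (Advisory, MediaItem, UserPreferences) are made up purely for illustration - nothing like this exists in MediaWiki today, and the real thing would obviously live in the search and rendering layers rather than in a standalone script:

from dataclasses import dataclass, field
from enum import Enum, auto

class Advisory(Enum):
    # Illustrative advisory categories, loosely modelled on IMDB's content advisories.
    SEX_AND_NUDITY = auto()
    GORE_AND_VIOLENCE = auto()
    PROFANITY = auto()

@dataclass
class MediaItem:
    title: str
    advisories: set = field(default_factory=set)  # advisories tagged by the community

@dataclass
class UserPreferences:
    suppressed: set = field(default_factory=set)  # empty set means "show me everything"

def filter_search_results(results, prefs):
    # Drop any item tagged with an advisory the user has opted out of.
    return [item for item in results if not (item.advisories & prefs.suppressed)]

def render(item, prefs):
    # Show the item, or a placeholder naming the advisory that suppressed it.
    blocked = item.advisories & prefs.suppressed
    if blocked:
        reasons = ", ".join(sorted(a.name.replace("_", " ").title() for a in blocked))
        return "Suppressed at your request (%s)" % reasons
    return item.title

prefs = UserPreferences(suppressed={Advisory.SEX_AND_NUDITY})
items = [MediaItem("Radio button diagram"),
         MediaItem("Nude study", {Advisory.SEX_AND_NUDITY})]
print([i.title for i in filter_search_results(items, prefs)])  # ['Radio button diagram']
print(render(items[1], prefs))  # Suppressed at your request (Sex And Nudity)

The same check could drive the click-to-confirm prompt for readers with no stored preferences.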
I note that movies are often made with variants for different countries. Sometimes that's simply a matter of being dubbed into another language but it can also include the deletion (or replacement) of certain scenes or language that would be offensive in those countries. So it is not as if we are reinventing the wheel here, just customising it to Wikipedia.
Kerry
Past experience is that such systems are inevitably used to facilitate censorship.
While this can work in some situations, in a wiki run by volunteers you rely on people to accurately self-classify their work, which many would not, or you rely on other volunteers changing the rating. Whether up or down, it probably will lead to a big debate. That means dozens or even hundreds of debates a day, which would be quite time-consuming. Too many people already try to AfD photos for phony reasons. ("I don't like that person; I don't believe you took the picture!" being one I encountered myself.)
I presume that uploaders only upload images they are personally comfortable with, so it is almost axiomatic that it would be others who add such classifications, just as occurs with movies. I have no idea how IMDB make it work, but they do, and they are using volunteers too. I note that IMDB use a 1-to-10 scale for the classifications. Maybe they just let people vote and the result is the average.
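If it really is just averaging, the aggregation side is trivial. A rough Python sketch, again with made-up names and purely to show the idea (I don't claim this is how IMDB actually do it):

from collections import defaultdict
from statistics import mean

class ClassificationVotes:
    # Collect 1-to-10 severity votes per advisory category and report the average.
    def __init__(self):
        self._votes = defaultdict(list)

    def vote(self, category, score):
        if not 1 <= score <= 10:
            raise ValueError("score must be between 1 and 10")
        self._votes[category].append(score)

    def rating(self, category):
        # Average score for a category, or None if nobody has voted yet.
        scores = self._votes.get(category)
        return round(mean(scores), 1) if scores else None

votes = ClassificationVotes()
for s in (7, 9, 6):
    votes.vote("Sex and Nudity", s)
print(votes.rating("Sex and Nudity"))  # 7.3

The hard part, as Carol says, is the social side - who votes, and what happens when people disagree - not the arithmetic.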
But, whether or not my proposal can work, I think we have to use this list to put forward ideas with a view to rolling out some kind of trial/pilot/experiment. The gender gap is of long standing and is unlikely to spontaneously disappear by just talking about it.
Kerry
Hi Kerry,
Sad as it is to be the bearer of dispiriting news...
A proposal more or less similar to this was made by the Board in 2011 (some kind of image filtering on a user-selected basis) - http://wikimediafoundation.org/wiki/Resolution:Controversial_content
The debate about whether (and/or how) to implement it was pretty vicious, pretty angry, and went on for the best part of a year. A September 2011 community poll gave interestingly mixed results - https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-09-05/News_a... and the development of any software was suspended pending further discussion. In mid-2012, the Board then formally rescinded the "develop a filter system" request - http://wikimediafoundation.org/wiki/Resolution:_Personal_Image_Hiding_Featur... - and it has more or less been dead in the water since then.
There's been no significant attempt to revive it, but I think this is in part because the wounds are still fresh - I think were it to be reopened now you'd get much the same result, a lot of heat which eventually stalls.
It's worth noting that a very small-scale version of this is in use for some wikis - it's been pointed out that some sexual topics on Arabic Wikipedia have a "click to expand" field which conceals an image - but this is pretty rare and done on a page-by-page, not image-by-image, basis; it also has no user-level customisability.
Andrew.
Well, I am unsurprised that it has been considered before, as it's the obvious solution. Sad that the Board lacked the will to see it through.
But it doesn't mean that it could not or should not be raised again. Social justice issues rarely succeed on their first attempt. If we took that attitude, women still wouldn't have the vote.
The group we should be most concerned about is younger children. With many children increasingly having smartphones, it is far harder for parents to supervise the content they are viewing (unlike a desktop that can be positioned where the parent can keep an eye on things). At the same time, WMF is putting increasing effort into the mobile platforms and the WMF metrics show consistent uptrends in mobile access. The two trends suggest that Wikipedia and Commons are now a lot more likely to be accessed by children in an unsupervised context.
Kerry
Thanks to Andrew Gray for covering some of the history.
Kerry, there is further material that you might find of interest in a recent (May 2014) discussion on the Wikimedia-l mailing list:
http://www.gossamer-threads.com/lists/engine?do=post_view_flat;post=466380;p...
Best, Andreas
On Fri, Jul 25, 2014 at 10:53 PM, Kerry Raymond kerry.raymond@gmail.com wrote:
Well, I am unsurprised that it has been considered before, as it's the obvious solution. Sad that the Board lacked the will to see it through.
But it doesn't mean that it could not or should not be raised again. Social justice issues rarely succeed on their first attempt. If we took that attitude, women still wouldn't have the vote.
The group we should be most concerned about is younger children. With many children increasingly having smartphones, it is far harder for parents to supervise the content they are viewing (unlike a desktop that can be positioned where the parent can keep an eye on things). At the same time, WMF is putting increasing effort into the mobile platforms and the WMF metrics show consistent uptrends in mobile access. The two trends suggest that Wikipedia and Commons are now a lot more likely to be accessed by children in an unsupervised context.
Kerry
-----Original Message----- From: shimgray@gmail.com [mailto:shimgray@gmail.com] On Behalf Of Andrew Gray Sent: Saturday, 26 July 2014 4:08 AM To: kerry.raymond@gmail.com; Addressing gender equity and exploring ways to increase the participation of women within Wikimedia projects. Subject: Re: [Spam] Re: [Gendergap] Sexualized environment on Commons
Hi Kerry,
Sad as it is to be the bearer of dispiriting news...
A proposal more or less similar to this was made by the Board in 2011 (some kind of image filtering on a user-selected basis) - http://wikimediafoundation.org/wiki/Resolution:Controversial_content
The debate about whether (and/or how) to implement it was pretty vicious, pretty angry, and went on for the best part of a year. A September 2011 community poll gave interestingly mixed results - https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-09-05/News_and_notes and the development of any software was suspended pending further discussion. In mid-2012, the Board then formally rescinded the "develop a filter system" request - http://wikimediafoundation.org/wiki/Resolution:_Personal_Image_Hiding_Feature - and it has more or less been dead in the water since then.
There's been no significant attempt to revive it, but I think this is in part because the wounds are still fresh - I think were it to be reopened now you'd get much the same result, a lot of heat which eventually stalls.
It's worth noting that a very small-scale version of this is in use for some wikis - it's been pointed out that some sexual topics on Arabic Wikipedia have a "click to expand" field which conceals an image - but this is pretty rare and done on a page-by-page, not image-by-image, basis; it also has no user-level customisability.
Andrew.
On 24 July 2014 02:51, Kerry Raymond kerry.raymond@gmail.com wrote:
I agree that offensiveness is in the eye of the beholder. And while there may be all manner of very niche groups who find strange things offensive - maybe some people object to seeing refrigerators or reading about cakes - nonetheless we know that there are a lot of widespread categories of offensiveness that generate the bulk of discussions about the inclusion of items on Wikipedia or Commons.
What we could do is to have some system of classification (like the movies) for articles, images, and/or categories indicating that they are potentially offensive for various reasons. Perhaps along similar lines to the "content advisories" in IMDB, e.g.
http://www.imdb.com/title/tt0295297/parentalguide?ref_=tt_stry_pg
People could then put in their profiles that all classifications are acceptable to them, or that these are the classifications they don't want to see (e.g. Sex and Nudity, Gore and Violence, Profanity, etc. - obviously our classifications might not be identical to IMDB's as we are dealing with different kinds of content, but you get the idea). When that person searches Wikipedia or Commons, then those articles, images and categories that they would find offensive are not returned. When a person reads an article containing an offensive-to-them categorised image, it is simply not displayed, or is replaced by some image saying "Suppressed at your request (Sex and Nudity)". We could possibly bundle these finer classifications into common collections, e.g. Inappropriate for Children, Suitable for Muslims, or whatever, so for many people it's a simple tick-one-box.
For anonymous users or users who have not explicitly set their preferences, rendering of an article or image could first ask "This article/image has been tagged as potentially offensive for SuchAndSuch reason, click OK to confirm you want to view it". If they are a logged-in user, it could also offer a link to set their preferences for future use.
I note that movies are often made with variants for different countries. Sometimes that's simply a matter of being dubbed into another language, but it can also include the deletion (or replacement) of certain scenes or language that would be offensive in those countries. So it is not as if we are reinventing the wheel here, just customising it to Wikipedia.
Kerry
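(A minimal sketch in Python of the preference-based filtering Kerry outlines above - purely illustrative, with hypothetical names such as ContentItem, UserPrefs and filter_search_results; this is not an existing MediaWiki feature or API. Content items carry advisory classifications, a user lists the classifications they have opted out of, and search results and inline images are suppressed or replaced with a notice accordingly.)

from dataclasses import dataclass, field

# Hypothetical advisory vocabulary, loosely modelled on IMDB-style content advisories.
CLASSIFICATIONS = {"Sex and Nudity", "Gore and Violence", "Profanity"}

@dataclass
class ContentItem:
    title: str
    classifications: set = field(default_factory=set)  # advisory tags applied to this item

@dataclass
class UserPrefs:
    hidden: set = field(default_factory=set)  # classifications the user has opted out of

def filter_search_results(results, prefs):
    """Drop items carrying any classification the user has chosen not to see."""
    return [item for item in results if not (item.classifications & prefs.hidden)]

def render_image(item, prefs):
    """Show the image, or a placeholder notice if it matches a hidden classification."""
    blocked = item.classifications & prefs.hidden
    if blocked:
        return "Suppressed at your request (%s)" % ", ".join(sorted(blocked))
    return "[image: %s]" % item.title

# Example: a logged-in user who ticked "Sex and Nudity" as hidden.
prefs = UserPrefs(hidden={"Sex and Nudity"})
results = [ContentItem("Refrigerator"), ContentItem("Nude study", {"Sex and Nudity"})]
print([i.title for i in filter_search_results(results, prefs)])  # ['Refrigerator']
print(render_image(results[1], prefs))  # Suppressed at your request (Sex and Nudity)

(The filtering logic itself is simple once tags and per-user preferences exist; the hard part, as the 2011-2012 filter debate described above showed, would be agreeing on the classification scheme.)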
From: gendergap-bounces@lists.wikimedia.org [mailto:gendergap-bounces@lists.wikimedia.org] On Behalf Of Ryan Kaldari Sent: Thursday, 24 July 2014 7:11 AM To: Addressing gender equity and exploring ways to increase the participation of women within Wikimedia projects. Subject: Re: [Gendergap] Sexualized environment on Commons
Personally, I don't think it's worth having a discussion here about the merits of deleting these images. There's no chance in hell they are going to be deleted from Commons. What I'm more interested in is the locker-room nature of the discussions and how/if this can be addressed, as I think that is actually more likely to dissuade female contributors than the images themselves.
Ryan Kaldari
The new hovercards (which I otherwise love) have created another problem, in that lead images show up when your cursor hovers over a wikilink.
You would have to be reading an article where potentially offensive images are in linked pages, so this won't be a problem across the board. But it's easy to find yourself hovering over a link without intending to, so if you're in an article that contains such links, you can suddenly have images of genitalia on your screen without having clicked on the links that contain them.
Sarah
On Fri, Jul 25, 2014 at 3:19 PM, Sarah slimvirgin@gmail.com wrote:
The new hovercards (which I otherwise love) have created another problem, in that lead images show up when your cursor hovers over a wikilink.
Good point. In general, it would be good to have a more thorough process for exploring difficult-to-anticipate side effects before new features are broadly released -- something there's been a lot of discussion about lately.
Back to Ryan's original topic -- the sometimes inappropriate nature of discussions on Commons -- I started a draft of an essay (which, at least theoretically, could eventually become a guideline if there is enough support for it). I think it might be a decent start, but it could use more input and fleshing out. Please take a look, and feel free to edit as you see fit:
https://commons.wikimedia.org/wiki/User:Peteforsyth/Provocative_behavior
-Pete
What's interesting to me about this discussion, and Gender Gap generally, is the discrepancy between what is perceived as driving women editors away (and if you really want to see a classic example then the 'drop the sticks' closed section of this discussion: https://en.wikipedia.org/wiki/Wikipedia:Administrators%27_noticeboard/Archive263#Topic_ban_proposal_for_Gibson_Flying_V ) and the things that I have actually found difficult on Wikipedia. These are my bullet points about my first few months of joining Wikipedia.
1. Was reading something on WP and, out of curiosity, clicked on the other tabs 'edit', 'history' and 'discussion' just to see what they were about.
2. Realized they were discussions about editing WP and decided to look further & considered editing WP myself.
3. One tab open with daunting-looking amounts of code that I could make absolutely nothing of, and another tab open next to it with a thing called 'Sandbox'.
4. Almost gave up there and then due to the mistaken idea that if I wanted to write an article then I would have nothing but a completely blank canvas and have to write all the code from scratch by myself.
5. Came back to it the next day thinking, "That can't be it.", created an account and started making small edits, single lines with a citation, obvious copy edit errors, and asked for help on noticeboards when I was stuck.
6. I had some stuff seized on, deleted as 'unimportant' or tagged for 'not enough refs', 'orphan', as well as some curt / abrasive comments but nice and helpful ones too. I should say something more about this - Wikipedia does not exist in a vacuum, either online or in the world; if nasty comments were the reason that women don't edit Wikipedia then they wouldn't use social media either - but they do. Did I think that my edits were being treated disproportionately to male editors? Yes, but I am female and the off-line world that I inhabit is also sexist - so what else is new?
7. So what did have me tearing my hair out early on? I would say that it was what I would call 'the washing machine effect'. I would have saved myself a lot of time and trouble if I had had a quick-start guide that explained Help:XXXX, Template:XXXX, WP:XXXX. I would click 'Help' and be taken to the help homepage, search 'X', be taken to Help:'X', click on 'Y' - and here was the bit I didn't realize - when I clicked on 'Y' I was also, by default, leaving 'Help'. I regarded clicking the Help button as walking into the lobby of Hotel Help; I would go through 2-3 links and then think, "Wait a minute, this is just ordinary Wikipedia, and this is just a definition of [word]. When did I leave Help?" Back button, back button, back button. "Okay, start over..." I would go around and around like this for ages, either stumbling across what I was looking for, finding another way of doing what I wanted to do, or asking at the Teahouse (not New Users House? Why?).
8. I only ever visited the Commons when I needed a picture for something, used the search engine to see if the Commons had what I wanted and then went back to Wikipedia. I didn't stick around to read the conversations so I didn't even know much about that side of it until I joined Gender Gap.
Things that I think might help:
1. A culture of irresponsible behaviour stems from bad people. A culture of responsible behaviour stems from good people. The way to really make a difference is to crowd out the bad with the good so the bad get bored and go and find a new place to play. An increased number of sexist images will then be deleted by the improved culture of the community.
2. The greatest form of outreach is Wikipedia itself. When I was a student what was valuable to me was a way of accessing resources on topics. I recently went through Amartya Sen's page and fixed the bibliography / referencing including author / editor links. This is what his bibliography looked like before: https://en.wikipedia.org/w/index.php?title=Amartya_Sen&oldid=611115580#P... and this is it now: https://en.wikipedia.org/wiki/Amartya_Sen#Bibliography The same with the referencing section, before: https://en.wikipedia.org/w/index.php?title=Amartya_Sen&oldid=611115580#R... and after: https://en.wikipedia.org/wiki/Amartya_Sen#References Similar clean-ups / new articles on other academics from the world of feminist economics / political science / political psychology / sociology / care work / human development etc. will increasingly gain Wikipedia a reputation amongst students and scholars as a useful reference tool, and recruiting new editors from that pool of visitors would change the culture. A similar thing needs to happen with articles like: https://en.wikipedia.org/wiki/Feminist_movements_and_ideologies
3. I recently added the biography of the political theorist Jane Bennett https://en.wikipedia.org/wiki/Jane_Bennett_(political_theorist). I had it in draft for a long time; I took her bibliography from her CV and worked through it item by item. As I did this I checked to see if any co-authors had biographies so I could author-link them. Michael J. Shapiro was one; I went through his bibliography and cleaned it up. The co-authors of his books include James Der Derian, Hayward Alker, David Campbell (academic) - I added author-links on Shapiro's bio, but all three of them need their bibliographies sorting out in a similar way, and their pages need checking for infoboxes, authority control boxes and LCCN ref no. in authority control boxes.
4. Where dates of birth are known on biographies they should be added to WP's calendar; I've added Jane Bennett's, 31 July 1957 https://en.wikipedia.org/wiki/31_July#Births
5. When you first 'land' on Wikipedia what are the key pages beyond the main page:
Contents: https://en.wikipedia.org/wiki/Portal:Contents
Outlines: https://en.wikipedia.org/wiki/Portal:Contents/Outlines
Portals: https://en.wikipedia.org/wiki/Portal:Contents/Portals
Lists: https://en.wikipedia.org/wiki/Portal:Contents/Lists
Glossaries: https://en.wikipedia.org/wiki/Portal:Contents/Glossaries
Indexes: https://en.wikipedia.org/wiki/Portal:Contents/Indexes
6. The 12 groupings are: General reference / Culture and the arts / Geography and places / Health and fitness / History and events / Mathematics and logic / Natural and physical sciences / People and self / Philosophy and thinking / Religion and belief systems / Society and social sciences / Technology and applied sciences. A quick glance at the Outline for Philosophy and thinking https://en.wikipedia.org/wiki/Portal:Contents/Outlines#Philosophy_and_thinking shows 'Ethics' with 'Sexual ethics' singled out for special mention - why? Under Indexes for Society and social sciences https://en.wikipedia.org/wiki/Portal:Contents/Indexes#Society_and_social_sci... we have an index for BDSM but red links for Social Policy, Political Science and Development Studies.
7. There are no Portals of the following names: Pro-life portal / Pro-choice portal / Abortion debates portal / Same-sex marriage debates portal... so why is there an, equally contentious, 'Pornography portal'? Shouldn't it at least be a 'Pornography debates portal'?
8. For me, issues like particular pictures making it onto the Commons only matter if they are put into articles or if they become featured / POTD. If there is a debate then fine, mention it on Gender Gap and give a link, the same with other debates where a 'support' or 'oppose' may be needed. Taking on sexist editors and trying to find new systems of dealing with them and the images they want to put up is admirable, but there is an element of fiddling while Rome burns. For instance, this is a video on how to edit Wikipedia - which new editors is this likely to attract? http://www.youtube.com/watch?v=BhvsVaTymzM Recruitment of better editors = better content = attracting better editors = crowding out the bad.
Marie
Marie,
Thanks very much for this overview of your early experience as an editor. Would you mind sending this email to the editor growth team so that they can look at your experience for ideas about what they can improve? Their email list is called "Editor Engagement" and you can find it on lists.wikimedia.org.
I'm also pinging Mssemantics who may be interested in your experience for her research.
Pine
Thanks Pine!
~ A.
On 7/30/2014 5:51 AM, Marie Earley wrote:
Things that I think might help:
Help pages wise, I'm sure they'd love to see you at: https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Help
I know I wasted a couple years learning the hard way because the Help pages didn't seem intuitive enough.
However, one trick we have to remember is to go to the search box and type WP:_____ for whatever the topic of interest is. One often gets a search return that gets one just where one wants to go.
A "cheat sheet" of editing and conflict resolution tips for women would be a great addition to: https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Countering_systemic_bias...
Which is slowly but surely coming along.
CM
Nice idea in principle, but there are still two hurdles to be overcome:
1. How do you get the cheatsheet to the new female editor? How do you spot new female editors? By what mechanism do you communicate with them? Can you assume they know about User Talk? (My almost entirely unsuccessful attempts to communicate with new users in a friendly way to offer help suggest many don't see the message.)
2. People don't read user manuals, cheatsheets, etc. Every new Wikipedia user already gets one of those "Welcome to Wikipedia" messages on their User Talk page, which points them to a morass of information (which is admittedly written in the language of the expert Wikipedian, not the new user), and I think these days they are also offered the "onboarding experience" (or whatever precisely it is called) which aims to teach them to do basic editing. However, generally what people (men and women) really want is "the answer to the question I have here and now" to get them past the immediate barrier to achieving their mission (whatever it was that motivated them to click that Edit button), not a set of lessons nor a set of documentation. Part of the problem we have created for ourselves is that all the policies and processes and technologies have set the bar far too high for many new editors to get started on their own. :-(
Kerry
Nope, and I get consistent messages on and off wiki from women saying cheat sheets are poorly designed or people are too busy... But I don't think surveys are being done about workshops and the guides they pass out (I believe in throwing people into the pool to learn how to swim).
I still stand by hand-holding... the personal outweighs what we attempt...
But perhaps I am old school in the world of wiki. I also lost a job to trolls who coincidentally also disagreed with my beliefs on Commons... so I am particularly sensitive. Commons is a terrible and demoralizing place.
The women's Commons revolution won't happen anytime soon...
Sarah
Twice during my short discussion about how to start a civility board, which turned into a long discussion about the word c*nt, an Admin gave the link to the Commons search results for that word, saying that showed that the "text" of the word isn't very offensive. WTF?!
what a joke...
I'm sorry you were "exposed" to such a search..
-Sarah
On 7/30/2014 11:39 PM, LB wrote:
Twice during my short discussion about how to start a civility board, which turned into a long discussion about the word c*nt, an Admin gave the link to the Commons search results for that word, saying that showed that the "text" of the word isn't very offensive. WTF?!
Actually, I just searched for the first time, saw that all the photos were about "Courageous Cunts", and had a whole rant written on a talk page thinking it was some pervert thing.
Then I looked at this political poster image https://commons.wikimedia.org/wiki/File:Courageous_Cunts.jpg which leads to this site http://courageouscunts.com/
Which says: This is a protest page! We're a group of girls that got quite angry about the growing propaganda to surgically "improve" the female genitalia. Don't get us wrong: we're not blaming any woman for her conscious, informed decision. If you really want labiaplasty, go ahead. It's the alliance between porn and the medical industry we're opposed to. It's about their campaign to sell us the perfect labia. Here we try to raise a voice against it!
Also, CC's are at: https://www.flickr.com/people/76200162@N06 and all the photos I looked at were uploaded by user: courageousC*nts
So I assume it is a woman or women who were really ticked off about this in 2012? Unless it is a guy who used this evidently real issue as an excuse to get his jollies taking photos of shaved women. All that shaving does make me a bit suspicious...
Also, I noticed there are all sorts of photos under both male and female genitalia, probably excessive in number and/or detail, but that's not an issue I have the energy to do much about.
CM
On Thu, Jul 31, 2014 at 7:10 PM, Carol Moore dc carolmooredc@verizon.net wrote:
Then I looked at this political poster image https://commons.wikimedia.org/wiki/File:Courageous_Cunts.jpg which leads to this site http://courageouscunts.com/
I think nobody has bothered to write much on the movement.
https://en.wikipedia.org/wiki/Courageous_Cunts
https://en.wikipedia.org/wiki/Labia_pride_movement
https://en.wikipedia.org/wiki/Large_Labia_Project has no content
Contrast that with the content on this site: http://largelabiaproject.org
Best
A. Mani
A. Mani [Last_Name. First_Name Format] CU, ASL, AMS, ISRS, CLC, CMS HomePage: http://www.logicamani.in Blog: http://logicamani.blogspot.in/
To briefly go back to what Sarah and Marie have said, I do find that in person hand-holding and social support are the most effective factors in getting women to stick around. I don't know how to translate that from the real-world environment I teach newbies in to the virtual environment of new users' talk pages. I'd love to brainstorm something in that vein, though. :)
-Emily
Hi Keilana,
J-mo discussed similar thoughts in the presentation videos that I just sent to some of the other email lists. Perhaps you could brainstorm ideas with him and other interested people at Wikimania?
Pine On Aug 1, 2014 10:37 PM, "Keilana" keilanawiki@gmail.com wrote:
To briefly go back to what Sarah and Marie have said, I do find that in person hand-holding and social support are the most effective factors in getting women to stick around. I don't know how to translate that from the real-world environment I teach newbies in to the virtual environment of new users' talk pages. I'd love to brainstorm something in that vein, though. :)
-Emily
Getting women to stick around...
I had a class of 25: about 20 women and 5 men. The women, especially the Latina women, expressed a great deal of anxiety over the treatment they received, which was all over the map: especially the comments and tone on talk pages, and the lack of politeness and common courtesies. Today, a year later, only 2 of the 20 women are editing, but the men have become engaged contributors.
Should we tell people at the outset that this is sort of an Outward Bound experience? Survival only of the thick-skinned? I do believe this is too bad. As a teacher, if I treated students the way Wikipedians treat new contributors, I would be sent for continuing education. Most people want to edit to contribute, not to be wrangled to the mat. Too many women who begin to edit are made to feel belittled, stupid and small. This is crazy. So much talent and good will is frittered away.
--Kathleen.
Kathleen de la Peña McCook Distinguished University Professor of Librarianship USF/SI: http://si.usf.edu/faculty/kmccook/ Academia.edu: https://usf.academia.edu/KathleendelaPe%C3%B1aMcCook Library Thing:: http://www.librarything.com/catalog/klmccook/allcollections
https://en.wikipedia.org/wiki/User_talk:Jimbo_Wales#Rebooted_discussion This whole topic is going hot and heavy on his talk page, starting with his proposal, which I mention in my response below:
What if it was far more limited: "WMF hires mediators to do mediation and to train and monitor volunteer mediators. Mediation would be voluntary, but it is likely Admins and Arbitrators would not look well on those who refused to engage in mediation, or who obviously did not take it seriously once they agreed to it." I was in one mediation around 2007-8 on a really controversial topic. The mediator was inexperienced and had to start over at one point, but it still was extremely effective and greatly diminished edit warring among a few editors over several articles. However, after that I couldn't find mediators for one or two issues that had been accepted for mediation because none were available, so I didn't try again for a few years. When I did, four people wanted it; two refused on questionable grounds. The issue went to arbitration, but the Arbitrators didn't take the mediation issue seriously, perhaps because it was known that there aren't many mediators or that they aren't effective.
Of course it's been ignored, but there are some thoughtful comments there. And a lot of drama, with a couple of guys who defend their right to be "uncivil" quitting.
While I was on my best behavior with constructive comments throughout, I did have to say at one point that those who support incivility should at least not have a double standard against women being equally uncivil. "What's good for the goose is good for the gander."
Later my roommate explained to me that the gander is the MALE, not the female! So it took me 66 years to figure that out. Maybe others are similarly confused?? I guess from now on, just to make myself perfectly clear, I'll say "What's good for the male gander is good for the female goose."
Ai, yi, yi!!
CM
On 8/2/2014 1:37 AM, Keilana wrote:
To briefly go back to what Sarah and Marie have said, I do find that in person hand-holding and social support are the most effective factors in getting women to stick around. I don't know how to translate that from the real-world environment I teach newbies in to the virtual environment of new users' talk pages. I'd love to brainstorm something in that vein, though. :)
-Emily
Lots of Skype mini-seminars!!! (Women only.)
I'm just blue-skying here, but wouldn't it be great if we could have a little window pop up when someone clicks the edit button for the first time that says "Hey! Thanks for looking into editing Wiki____! It's easy, but it can be confusing sometimes. Would you like to chat with a fellow volunteer who can answer your questions?"
And then there could be a little chat window allowing real-time communication while the editor walks through her first edit.
Powers &8^]
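A minimal sketch of what such a first-edit offer could look like, written in TypeScript for a plain browser context. It is not an existing MediaWiki feature: the "#ca-edit" selector, the live-help URL, and the localStorage key are all placeholders for illustration.

// Hypothetical sketch of the idea above: on a user's first click of the "Edit"
// tab, offer a link to a live help chat. Selector, URL, and storage key are
// assumptions, not an existing MediaWiki gadget or API.

const CHAT_URL = 'https://example.org/live-help'; // hypothetical live-help endpoint
const SEEN_KEY = 'firstEditHelpOffered';          // remembers that we already asked

function offerLiveHelp(): void {
  // Only show the offer once per browser.
  if (window.localStorage.getItem(SEEN_KEY)) {
    return;
  }
  window.localStorage.setItem(SEEN_KEY, '1');

  const box = document.createElement('div');
  box.setAttribute(
    'style',
    'position:fixed;bottom:1em;right:1em;padding:1em;background:#fff;' +
      'border:1px solid #a2a9b1;box-shadow:0 2px 4px rgba(0,0,0,0.2);max-width:20em;'
  );
  box.innerHTML =
    '<p>Hey! Thanks for looking into editing. It can be confusing sometimes. ' +
    'Would you like to chat with a fellow volunteer who can answer your questions?</p>';

  const yes = document.createElement('a');
  yes.href = CHAT_URL;
  yes.target = '_blank';
  yes.textContent = 'Yes, open a chat';

  const no = document.createElement('button');
  no.textContent = 'No thanks';
  no.addEventListener('click', () => box.remove());

  box.append(yes, document.createTextNode(' '), no);
  document.body.appendChild(box);
}

// "#ca-edit" is assumed to be the edit tab in the default desktop skin; adjust as needed.
document.querySelector('#ca-edit')?.addEventListener('click', offerLiveHelp);

The hard parts the sketch ignores (pairing the newcomer with a volunteer who is actually available, logging, opt-out) are exactly the staffing and privacy questions raised further down the thread.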
On Aug 2, 2014 11:01 AM, "LtPowers" LtPowers_Wiki@rochester.rr.com wrote:
And then there could be a little chat window allowing real-time communication while the editor walks through her first edit.
[originally didn't realize who you were replying to… also haven't read the whole thread yet]
That is technically feasible. It might have new implications for privacy (including the WMF privacy policy), unless the real-time chats were publicly logged (in which case the privacy would be the same as the existing Teahouse, etc.).
Essentially it would be a more interactive version of the Teahouse (i.e., a shorter wait for a reply, and you're paired with someone who is known to be available at that moment)? Would it be a part of the Teahouse?
How would you staff it? Shifts?
Anyway, that does nothing for the case Kathleen describes: 25 people (20 women, 5 men) in a class, everyone getting the same introduction to all things wiki, and then 7 staying active for a year, including all 5 of the men and only 2 of the 20 women.
I'm leaning towards thinking we as a community should (for now) focus more on the retention gap than the recruitment gap, so that we're not recruiting people just to (mostly) lose them in a month or two. But I would be interested to hear thoughts on that from someone with a more rigorous analysis.
-Jeremy (jeremyb)
P.S. http://www.onthemedia.org/story/31-race-swap-experiment/
We already have #wikipedia-en-help which is remarkably good for a volunteer help project. Links to join that IRC channel could be offered in multiple places. Other languages may have similar channels.
Pine
On Aug 2, 2014 11:01 AM, "LtPowers" LtPowers_Wiki@rochester.rr.com wrote:
And then there could be a little chat window allowing real-time
communication while the editor walks through her first edit.
[originally didn't realize who you were replying to… also haven't read the whole thread yet]
That is technically feasible. Maybe would have new implications for privacy (including WMF privacy policy). Unless the realtime chats were publicly logged. (then same privacy as existing teahouse, etc)
Essentially would be a more interactive version of teahouse? (i.e. shorter wait for a reply and you're paired with someone that's known to be available at that moment) would be a part of teahouse?
How would you staff it? Shifts?
Anyway, that does nothing for the case Kathleen describes. 25 people (20f:5m) in a class and everyone getting that introduction to all things wiki. Then 7 stay active for a year including all the men. (and only 2 of the 20 women)
I'm leaning towards thinking we as a community should (for now) focus more on the retention gap than the recruitment gap. Then we're not recruiting people just to (mostly) lose them in a month or two. But would be interested to hear thoughts on that from someone with a more rigorous analysis.
-Jeremy (jeremyb)
P.S. http://www.onthemedia.org/story/31-race-swap-experiment/
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
IRC is almost embarrassingly old technology; Wikimedia Foundation projects are the only place I've seen it mentioned in the last five years or more.
On Sat, Aug 2, 2014 at 7:29 PM, Pine W wiki.pine@gmail.com wrote:
We already have #wikipedia-en-help which is remarkably good for a volunteer help project. Links to join that IRC channel could be offered in multiple places. Other languages may have similar channels.
Pine
On Aug 2, 2014 8:42 AM, "Jeremy Baron" jeremy@tuxmachine.com wrote:
On Aug 2, 2014 11:01 AM, "LtPowers" LtPowers_Wiki@rochester.rr.com wrote:
And then there could be a little chat window allowing real-time communication while the editor walks through her first edit.
[originally didn't realize who you were replying to… also haven't read the whole thread yet]
That is technically feasible. Maybe would have new implications for privacy (including WMF privacy policy). Unless the realtime chats were publicly logged. (then same privacy as existing teahouse, etc)
Essentially would be a more interactive version of teahouse? (i.e. shorter wait for a reply and you're paired with someone that's known to be available at that moment) would be a part of teahouse?
How would you staff it? Shifts?
Anyway, that does nothing for the case Kathleen describes. 25 people (20f:5m) in a class and everyone getting that introduction to all things wiki. Then 7 stay active for a year including all the men. (and only 2 of the 20 women)
I'm leaning towards thinking we as a community should (for now) focus more on the retention gap than the recruitment gap. Then we're not recruiting people just to (mostly) lose them in a month or two. But would be interested to hear thoughts on that from someone with a more rigorous analysis.
-Jeremy (jeremyb)
P.S. http://www.onthemedia.org/story/31-race-swap-experiment/
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
You might be surprised how widely and how much Freenode is used for open source projects. The Blender main and dev channels were even more active than English Wikipedia's equivalents when I visited a few days ago. Pine On Aug 2, 2014 6:38 PM, "Michael J. Lowrey" orangemike@gmail.com wrote:
IRC is almost embarrassingly old technology; Wikimedia Foundation projects are the only place I've seen it mentioned in the last five years or more.
That's exactly my point, Pine. This kind of inside-baseball geekery is so much Choctaw to the ordinary new editor we are trying to recruit and retain, people more likely to be using Pinterest or Skype or Ravelry to communicate with peers and mentors.
On Sat, Aug 2, 2014 at 8:54 PM, Pine W wiki.pine@gmail.com wrote:
You might be surprised how widely and how much Freenode is used for open source projects. The Blender main and dev channels were even more active than English Wikipedia's equivalents when I visited a few days ago. Pine
On Aug 2, 2014 6:38 PM, "Michael J. Lowrey" orangemike@gmail.com wrote:
IRC is almost embarrassingly old technology; Wikimedia Foundation projects are the only place I've seen it mentioned in the last five years or more.
On Sat, Aug 2, 2014 at 7:29 PM, Pine W wiki.pine@gmail.com wrote:
We already have #wikipedia-en-help which is remarkably good for a volunteer help project. Links to join that IRC channel could be offered in multiple places. Other languages may have similar channels.
Pine
On Aug 2, 2014 8:42 AM, "Jeremy Baron" jeremy@tuxmachine.com wrote:
On Aug 2, 2014 11:01 AM, "LtPowers" LtPowers_Wiki@rochester.rr.com wrote:
And then there could be a little chat window allowing real-time communication while the editor walks through her first edit.
[originally didn't realize who you were replying to… also haven't read the whole thread yet]
That is technically feasible. Maybe would have new implications for privacy (including WMF privacy policy). Unless the realtime chats were publicly logged. (then same privacy as existing teahouse, etc)
Essentially would be a more interactive version of teahouse? (i.e. shorter wait for a reply and you're paired with someone that's known to be available at that moment) would be a part of teahouse?
How would you staff it? Shifts?
Anyway, that does nothing for the case Kathleen describes. 25 people (20f:5m) in a class and everyone getting that introduction to all things wiki. Then 7 stay active for a year including all the men. (and only 2 of the 20 women)
I'm leaning towards thinking we as a community should (for now) focus more on the retention gap than the recruitment gap. Then we're not recruiting people just to (mostly) lose them in a month or two. But would be interested to hear thoughts on that from someone with a more rigorous analysis.
-Jeremy (jeremyb)
P.S. http://www.onthemedia.org/story/31-race-swap-experiment/
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
-- Michael J. "Orange Mike" Lowrey
"When I get a little money I buy books; and if any is left, I buy food and clothes." -- Desiderius Erasmus
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
Gendergap mailing list Gendergap@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/gendergap
Exactly. IRC is for the old school and the ubergeek. And as Sue has said in the past, we're only going to "retain" specific types of people as long-term editors (ubergeeks like us), but if we can figure out a solution to help out the "average Joe/Josephine" editor...
then huzzah. That's what the Teahouse helped do, but what is the next step in supporting people who haven't quite passed the barrier to "editing" Wikipedia?
And expecting people to want to "join the ranks" through OTRS emails surely isn't the ultimate goal...
-Sarah
On Sat, Aug 2, 2014 at 7:02 PM, Michael J. Lowrey orangemike@gmail.com wrote:
That's exactly my point, Pine. This kind of inside-baseball geekery is so much Choctaw to the ordinary new editor we are trying to recruit and retain, people more likely to be using Pinterest or Skype or Ravelry to communicate with peers and mentors.
I think we are talking past each other. The issue I responded to was about live help, which we offer, which is used extensively for English Wikipedia, and which should be respected. Advertising the existing service to more editors is surely better than not doing so. If we are talking about longer-term alternative help systems, then I agree that we should explore options like Pinterest, which seem to be popular with less technical audiences.
Pine
Thank you, Sarah. I hope that subjects like this will be part of the discussion in Washington, whether I get to go or not. (I have applied, but I'm an old white male so….)
There are plenty of people using IRC, but many of them don't know it. There are chatroom/IRC hybrids, generally on forum sites. You embed the chat window in a web page, and anyone can join in. Those who want to can use any IRC client to get to the same channel, but with more features.
http://www.irchighway.net/ http://mibbit.com/
Janine
Sarah Stierch wrote:
Exactly. IRC is for the old school and the ubergeek. And as Sue has said in the past, we're only going to "retain" specific types of people as long-term editors (ubergeeks like us), but if we can figure out a solution to help out the "average Joe/Josephine" editor...
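A rough sketch of the embed idea in TypeScript: the widget.mibbit.com URL and its parameters are assumptions about how Mibbit's embeddable widget is typically configured (check Mibbit's own documentation), and the container element id is made up for the example. The channel is #wikipedia-en-help on Freenode, as mentioned earlier in the thread.

// Sketch: embed a web-based IRC widget in a help page so newcomers never have
// to see or install an IRC client. The widget endpoint and parameter names are
// assumptions, not a verified API.

function embedHelpChat(container: HTMLElement): void {
  const params = new URLSearchParams({
    server: 'irc.freenode.net',                              // network hosting the help channel
    channel: '#wikipedia-en-help',                           // the existing live-help channel
    nick: 'Helpme' + String(Math.floor(Math.random() * 1000)), // simple placeholder nickname
  });

  const frame = document.createElement('iframe');
  frame.src = `https://widget.mibbit.com/?${params.toString()}`; // assumed widget endpoint
  frame.width = '640';
  frame.height = '400';
  frame.setAttribute('style', 'border:1px solid #a2a9b1;');

  container.appendChild(frame);
}

// Usage: drop the chat into any element on the help page (id is hypothetical).
const target = document.getElementById('live-help-chat');
if (target) {
  embedHelpChat(target);
}

The point of Janine's suggestion survives even if the specific provider changes: the newcomer just sees a chat box on a web page, while helpers can keep using whatever IRC client they prefer on the other end.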
That sounds workable and hopefully friendly.
Pine
One of the sites I've found them on is more technical, but another is definitely not. The embedded version is very newbie friendly.
Janine
Janine, can you share links to the sites? I'm seriously interested in this idea of a friendlier interface for IRC.
Pine
Hi Janine,
Of the links that you mentioned, I was only able to get one to work, but I searched for friendlier IRC clients and I think I've found one. It's called Kiwi IRC. I'll ask the Freenode people what they think about changing their default web client to Kiwi. If they want to keep their current client, it may still be possible for Wikimedia to change the default chat client used when people connect directly from English Wikipedia to #wikipedia-en-help.
Pine
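A small sketch of that idea in TypeScript: point the on-wiki "live help" link at a friendlier web client prefilled with the channel. The kiwiirc.com hosted-client URL format and the "get-live-help" element id are assumptions for illustration, not a documented interface, and would need to be checked against Kiwi IRC's documentation before use.

// Build a link to a hosted web IRC client prefilled with the help channel.
// URL format is an assumption; verify against the client's docs.

function buildLiveHelpLink(channel: string): string {
  const nick = 'Helpseeker' + String(Math.floor(Math.random() * 10000));
  // Assumed hosted-client format: /client/<irc server>/?nick=...#<channel>
  return (
    'https://kiwiirc.com/client/irc.freenode.net/' +
    `?nick=${encodeURIComponent(nick)}` +
    `#${encodeURIComponent(channel.replace(/^#/, ''))}`
  );
}

// Example: repoint an existing "Get live help" link (element id is hypothetical).
const helpLink = document.getElementById('get-live-help') as HTMLAnchorElement | null;
if (helpLink) {
  helpLink.href = buildLiveHelpLink('#wikipedia-en-help');
}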
I have spent years trying to figure all of this out. I feel like I rehash this question over and over again. Every year. Perhaps if I had a grant from the government I could sit around and figure it out finally ;-)
1. Training women to be trainers is important. After I did that with some folks after the first WikiWomen's History Month event, those women now run their own events.
2. I did an evaluation of the events I've done. No, they don't retain people, except the "experienced usual suspects" - newbies rarely edit after the event and generally do it AT events. I've seen it in the 20+ events I have now facilitated internationally. I use Wikidata to track their contributions and surveys. No dice. People edit at events. Read it: https://meta.wikimedia.org/wiki/Grants:Evaluation/Library/Case_studies/WWHM Oh, and you can read the proof in the puddin' re: edit-a-thons and workshops here: https://meta.wikimedia.org/wiki/Grants:Evaluation/Library/Edit-a-thons AND https://meta.wikimedia.org/wiki/Grants:Evaluation/Library/Editing_workshops
3. Capturing people through "IRC" is a silly old-school way of thinking. Sorry, dudes. When I first got "hardcore" into the community here in Wikimedia I was SHOCKED that people were still using IRC. I used IRC in 1991. Not in 2011. Only uber geeks use that stuff - the average person doesn't. Seriously.
4. Pop-up windows - an interesting idea for an experiment, even though I hit the "x" every time one of those things pops up when I'm using the AT&T website, among a million others. I ignore them; they look like spam. But that's just me; maybe others do use them.
5. Better cheat sheets are needed. People complain about how cluttered and overwhelming they are. Just like our online help pages, they're full of Wikipediababblespeak and not "to the point."
6. More guides on how to do events. I have developed checklists and so forth for people. I know how much Wikimedians hate writing documentation, but honestly, I know for a fact that Wikipedian in Residence positions have started because of the case study I wrote, I know for a fact GLAMs have done content donations because of the case studies I write, and I know for a fact that people have read the case study I wrote about edit-a-thons and learned from it and done it. I make PowerPoints and post them and encourage people to reuse them, and they do.
So making more shit for people to use that is awesome, usable, quality, and not full of babblespeak is helpful.
Those books that the education folks made were a great start, but they appear to be specifically for education - I've never seen them at edit-a-thons; but then, I don't go to them very often these days.
Sarah Stierch
On Sun, Aug 3, 2014 at 7:34 AM, Sarah Stierch sarah.stierch@gmail.com wrote:
- Better cheat sheets are needed. People complain about how cluttered and
overwhelming they are. Just like our online help pages. They're full of Wikipediababblespeak and not "to the point."
- More guides on how to do events. I have developed checklists and so forth
for people. I know how much Wikimedians hate writing documentation, but honestly, I know for a fact that Wikipedians in Residency's have started because of the case study I wrote, I know for a fact GLAMs have done content donations because of the case studies I write, and I know for a fact that people have ready the case study I wrote about edit-a-thons and learned from it and done it. I make powerpoints and post them and encourage people to reuse them, and they do.
It is also important to automate help. I have not seen much progress along those lines. 'Computing with words' is a mature subject.
Best
A. Mani
A. Mani [Last_Name. First_Name Format] CU, ASL, AMS, ISRS, CLC, CMS HomePage: http://www.logicamani.in Blog: http://logicamani.blogspot.in/
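As a purely illustrative sketch of what "automating help" could mean at its simplest, here is a keyword-based FAQ matcher in TypeScript. The FAQ entries, answers, and scoring are invented for the example and are nothing like a full "computing with words" approach or any existing Wikimedia tool.

// Illustrative only: match a newcomer's free-text question against a tiny FAQ
// and suggest the closest entry before routing them to a human helper.

interface FaqEntry {
  keywords: string[];
  answer: string;
}

const FAQ: FaqEntry[] = [
  { keywords: ['cite', 'citation', 'reference', 'source'],
    answer: 'To add a citation, ...' },
  { keywords: ['image', 'photo', 'upload', 'picture'],
    answer: 'Images are uploaded to Wikimedia Commons, ...' },
  { keywords: ['revert', 'undo', 'undid', 'reverted'],
    answer: 'If your edit was reverted, check the edit summary and the talk page, ...' },
];

// Score each entry by how many of its keywords appear in the question.
function suggestAnswer(question: string): string | null {
  const words = new Set(question.toLowerCase().split(/\W+/));
  let best: FaqEntry | null = null;
  let bestScore = 0;
  for (const entry of FAQ) {
    const score = entry.keywords.filter((k) => words.has(k)).length;
    if (score > bestScore) {
      best = entry;
      bestScore = score;
    }
  }
  return best ? best.answer : null;
}

// Example: console.log(suggestAnswer('Why was my edit reverted?'));

A real system would need much richer language handling, but even something this crude could triage common questions and free the human helpers for the cases where hand-holding actually matters.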
I totally agree, and no offense to the people who have contributed to help pages, but I find them very unhelpful and sometimes downright wrong.
Sent from my iPad
On Jul 31, 2014, at 4:55 AM, Sarah Stierch <sarah.stierch@gmail.com> wrote:
I still stand by hand-holding... the personal outweighs what we attempt...