Please see the discussion on [[Image:KateWinsletTitanic.jpg]] at http://en.wikipedia.org/wiki/Wikipedia:Images_and_media_for_deletion#April_11.
Here's my reply to the whole deal:
Delete -- For the same reason as Nunh-huh and DreamGuy. This adds nothing to the article except controversy. So the only reason to keep it in the article is to reduce WikiLove and to increase Wikistress. I hate all these stupid debates about nudity. We exist to be an encyclopedia, not a bastion of freedom of speech. We are here to provide information, NOT to make political statements about censorship. Having nude and/or sexually explicit pictures in our encyclopedia upsets enough people that it detracts significantly from the task of making an encyclopedia, and that is reason enough to delete this picture (and others like it). There is a definite agenda by the nude-picture-pushers here, and it has nothing to do with creating an NPOV encyclopedia. Kevin Rector (talk) 22:04, Apr 12, 2005 (UTC)
Now to you the mailing list:
I've tried to stay out of the fray with all the nude/porn picture debates that have gone on, but I've come to the realization that people are searching out nude pictures to put in the 'pedia. They are looking to stir up trouble, mostly to make a point.
The fact that the Wikipedia "community" won't resolve these problems, and the fact that the community so proudly proclaims that "Wikipedia is not censored" while ignoring the fact that "Wikipedia is quite dysfunctional" has made me realize that I need to re-evaluate if this is a project that I want to be a part of.
Plus the politics and partisanship are really annoying.
I'm going on WikiVacation - I may or may not return.
-Kevin Rector
I was expecting to find a nude photo in the article about [[Kate Winslet]], which would be inappropriate (out of context, misleading). However, the photo is a very tasteful nude shot from the film ''Titanic'', and is used to enrich that article's narrative about the movie - not to mention that it's a partial nude, with only a left breast visible.
I don't know the uploader's motivation, and I share your suspicion that many people might try to cause disruption by uploading such photos. However, the best response to that, in my opinion, is to have clear policies on the matter. Then the trolls can do whatever they want, and we just decide quickly how to deal with the matter. Such policies might evolve out of precedents like Autofellatio and this photo.
Erik
On 4/12/05, Erik Moeller erik_moeller@gmx.de wrote:
Frankly, I see nothing particularly offensive about the photo. I see no need to have it deleted on the spot as pornographic or anything like that.
However, I also believe that there is room for constructive discussion about its use on any particular article. I'm one of the three people in the US who have not seen this movie, so I cannot comment on this particular scene's importance to the overall movie.
What does concern me is the tone used by some people on each side of these conversations. On one hand we have people who want no nudity in Wikipedia at all, and who seem to consider almost any nudity offensive. On the other hand, we have those who consider almost any discussion about such images to be extreme prudishness or attempted censorship. From both sides there is name-calling and labeling.
I take it as a given that we don't want to encourage censorship on Wikipedia. There is significant weight to the argument that "when you go to an encyclopedia article about the penis, you should expect to see a penis," though some "decency" advocates will insist otherwise.
On the other hand, I also take it as a given that we want to encourage good editorial judgment on Wikipedia. If I go to an encyclopedia article about a kneecap, I don't expect to see a penis along with the kneecap. I fear that, if some "free-speech advocates" have their way, removing such a photo from the Kneecap article would be dubbed censorship.
Somewhere in between those two examples lie situations such as the Winslet (partial) nude being used on the [[Titanic (1997 movie)]] article. People of integrity, intelligence and good intention can disagree about it.
Can we do so without hurling vitriol at one another?
-- Rich Holton
[[W:en:User:Rholton]]
On 4/13/05, Richard Holton richholton@gmail.com wrote:
Hear, hear. You worded it exactly right. I totally agree. --Mgm
For me, the largest part of the problem with the Kate Winslet nude is that it renders all of Wikipedia non-work-safe. You can defend the autofellatio picture by asking "why are you looking at an article about sucking your own dick while at work?" but if any article that mentions any person can have a nude picture of that person, I don't dare use Wikipedia anywhere but in the privacy of my own bedroom, with the shades drawn.
Right now, anyone at my place of work who happens to look up Kate Winslet will be disciplined. If he then inadvertently trips over another Wikinude, he'll be fired. In other words, Wikipedia is now unusable in a buttoned-down professional environment.
Some of you, of course, think that makes Wikipedia a better thing.
Sean Barrett wrote:
For me, the largest part of the problem with the Kate Winslet nude is that it renders all of Wikipedia non-work-safe. You can defend the autofellatio picture by asking "why are you looking at an article about sucking your own dick while at work?" but if any article that mentions any person can have a nude picture of that person, I don't dare use Wikipedia anywhere but in the privacy of my own bedroom, with the shades drawn.
Er, the Kate Winslet nude is only marginally "nude". If it's such a problem, use a tabbed browser or disable images.
Right now, anyone at my place of work who happens to look up Kate Winslet will be disciplined. If he then inadvertently trips over another Wikinude, he'll be fired. In other words, Wikipedia is now unusable in a buttoned-down professional environment.
Why are you looking up Kate Winslet at work anyway? Why aren't you working?
Some of you, of course, think that makes Wikipedia a better thing.
Better go around that bridge when the Billy Goats Gruff get to it.
Er, the Kate Winslet nude is only marginally "nude". If it's such a problem, use a tabbed browser or disable images.
How does using a tabbed browser save me from having a nude appear on my screen?
Why are you looking up Kate Winslet at work anyway? Why aren't you working?
Believe it or not, we have good reasons to look things up while at work, including articles about people. If any biographical article can have a nude of that person, however, we won't be using Wikipedia to do so.
Some of you, of course, think that makes Wikipedia a better thing.
Better go around that bridge when the Billy Goats Gruff get to it.
I agree that many of these non-work-safe pictures are trolling efforts, but I think we're already on the bridge, jumping up and down.
Sean Barrett said:
Why are you looking up Kate Winslet at work anyway? Why aren't you working?
Believe it or not, we have good reasons to look things up while at work, including articles about people. If any biographical article can have a nude of that person, however, we won't be using Wikipedia to do so.
Why not?
Corporate policy prohibits all images of nudity, out of fear of sexual harassment lawsuits. Repeat offenses result in termination. I love Wikipedia, but I'm not going to lose my job because the Wikipedia community feels it necessary to show bare boobies.
Sean Barrett said:
Corporate policy prohibits all images of nudity, out of fear of sexual harassment lawsuits. Repeat offenses result in termination. I love Wikipedia, but I'm not going to lose my job because the Wikipedia community feels it necessary to show bare boobies.
The chance of seeing a picture containing nudity applies to the whole of the WWW. Surely the logical thing to do is to suppress image downloads. That way you only see the pictures you want to see, and you get all the text.
On 4/13/05, Sean Barrett sean@epoptic.org wrote:
Corporate policy prohibits all images of nudity, out of fear of sexual harassment lawsuits. Repeat offenses result in termination. I love Wikipedia, but I'm not going to lose my job because the Wikipedia community feels it necessary to show bare boobies.
Sean brings up a good point here. As long as MediaWiki is without a content filtering system, the possibility of problems arising from images or other content exists. It's quite plausible that someone could, say, lose his or her job because of viewing a Wikipedia article and then sue the Foundation for damages, asserting that the Foundation did not give fair warning about such content. We do have a disclaimer, but that may not be sufficient, especially since it's linked at the bottom of articles.
--Slowking Man
Believe it or not, we have good reasons to look things up while at work, including articles about people. If any biographical article can have a nude of that person, however, we won't be using Wikipedia to do so.
The above alone, I think, is good enough reason to ban nude pictures when sufficient non-nude imagery is present.
Well, the uploader was [[User:1337]]; let's look at his contributions and see what we can find out.
His first two edits were to upload that picture and put it into the article. A new user knowing how to find Special:Upload is not unheard of, but...
To continue: the user created [[11th millennium]], which is currently listed on VFD and looks like it is going to be deleted.
Next there is a very POV edit to [[0s BC]], which gets reverted.
Then the user blanks his talk page.
Finally, the user uploads [[Image:Msspyware.png]], a faked image, with no mention that it is faked (it is a low-grade bit of anti-Microsoft propaganda) and inserts it into an article, again with no mention that it is faked.
Now I suppose he could be going for the title of most clueless newbie ever, but I'm not so sure.
geni said:
Finally, the user uploads [[Image:Msspyware.png]], a faked image, with no mention that it is faked (it is a low-grade bit of anti-Microsoft propaganda) and inserts it into an article, again with no mention that it is faked.
It's probable that he thought it was real. This has entered anti-Microsoft folklore.
Now I suppose he could be going for the title of most clueless newbie ever, but I'm not so sure.
Not the most clueful of guys, but not an obvious troll. I don't think you can seriously hope to discredit the picture on these grounds.
It's probable that he thought it was real. This has entered anti-Microsoft folklore.
Source? Whoever it was was able to find [[Special:Upload]] fast, so they have at least a couple of functioning brain cells.
Not the most clueful of guys, but not an obvious troll. I don't think you can seriously hope to discredit the picture on these grounds.
See above
Finally, the user uploads [[Image:Msspyware.png]], a faked image, with no mention that it is faked (it is a low-grade bit of anti-Microsoft propaganda) and inserts it into an article, again with no mention that it is faked.
It's probable that he thought it was real. This has entered anti-Microsoft folklore.
A few weeks ago that image, along with a story like "MS thinks Firefox is spyware", showed up on Slashdot. That doesn't mean anything of course, since the story poster may very well have been trolling. Either way, judging by the comments, quite a few people believed it to be true.
Alphax wrote:
Er, the Kate Winslet nude is only marginally "nude". If it's such a problem, use a tabbed browser or disable images.
"Marginally nude" is still nudity. One breast counts as nudity. You can argue that nudity should be allowed on Wikipedia, and I might even agree with you, but a spade's a spade.
And you can't seriously mean that disabling images is a real solution to the problem. We'd have to have a notice on the main page, e.g. "Note: Nudity may appear in any entry. If you are offended by nudity, please disable images in your browser." I don't think that would go over well.
Right now, anyone at my place of work who happens to look up Kate Winslet will be disciplined. If he then inadvertently trips over another Wikinude, he'll be fired. In other words, Wikipedia is now unusable in a buttoned-down professional environment.
Why are you looking up Kate Winslet at work anyway? Why aren't you working?
I agree that this is a very good question. But the workplace isn't the only context where being seen using a site with random nudity on it has social consequences. So I think there's still some need for policies.
How hard would it be to add to MediaWiki the option to tag explicit images, and an option in the preferences to not see them? A la Google SafeSearch. This might be a win-win solution (aside from the programming work).
Zach
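For illustration, the SafeSearch-style preference described above could work roughly like the sketch below. Every name here is hypothetical; MediaWiki had no such feature at the time, and this is only one possible shape for it.

```python
def render_image(filename, tags, prefs):
    """Return markup for an image, honoring an opt-in 'hide tagged images' preference.

    `tags` is a set of editor-applied content tags; `prefs` is the user's
    preference dict. Both are invented stand-ins for MediaWiki data.
    """
    # Opt-in: users who never touch their preferences see every image as normal.
    if prefs.get("hide_tagged_images", False) and tags & {"explicit"}:
        # Replace the image with a click-through placeholder link.
        return f'<a href="/image/{filename}">[tagged image: click to view]</a>'
    return f'<img src="/image/{filename}">'
```

With default preferences, `render_image("Example.jpg", {"explicit"}, {})` returns the ordinary `<img>` tag; only a user who has enabled the preference gets the click-through placeholder instead.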
On Wed, Apr 13, 2005 at 04:34:20PM +0100, Zach Alexander wrote:
Alphax wrote:
Er, the Kate Winslet nude is only marginally "nude". If it's such a problem, use a tabbed browser or disable images.
"Marginally nude" is still nudity. One breast counts as nudity.
In some cultures, exposure of the female *face* in public is "nudity".
You can argue that nudity should be allowed on Wikipedia, and I might even agree with you, but a spade's a spade.
Of course nudity should be allowed on Wikipedia, just as explicit pictures of tomatoes should be allowed. After all, the world contains both nudity and tomatoes, and Wikipedia's job is to describe the world.
And you can't seriously mean that disabling images is a real solution to the problem. We'd have to have a notice on the main page, e.g. "Note: Nudity may appear in any entry. If you are offended by nudity, please disable images in your browser." I don't think that would go over well.
We already have a [[Wikipedia:Content disclaimer]] linked from the main page. It is at least as prominent as other quasi-legal notices posted on other Web sites, which purport to advise the reader of site policies -- for instance, privacy policies on many commercial Web sites.
How hard would it be to add to Mediawiki the option to tag explicit images, and have an option in the preferences to not see them? A la Google SafeSearch. This might be a win-win (aside from the programming work) solution.
This has been brought up time and again here.
Censorware tagging on Wikipedia is a little bit like "email postage" as an anti-spam solution -- it can't work; there are reasonably clear reasons it can't work; but it keeps getting proposed anyway.
(The chief reason "email postage" can't work as an anti-spam solution is that spammers already steal other people's computing resources to send spam, e.g. using virus-infected computers, and would quite readily adapt to foist the "postage" cost off on innocents similarly. Ordinary mail users would be stuck with the "postage" while spammers would keep screwing over everyone else, just as they have always done. And all non-commercial mailing lists, such as this one, would be destroyed.)
What's wrong with censorware tagging? Where to start? Here's the biggie: tagging is incompatible with Wikipedia's existing commitments.
No system of tags is compatible with Wikipedia's commitment to neutrality. The dimensions, biases, and extremes of any system of tags are created from a particular non-neutral point of view. Wikipedia is categorically forbidden from taking on such a point of view as its own.
By "dimensions" I mean the types of material that are considered worth tagging -- e.g. nudity; violence; religion. The reason that commercial censorware products have an "explicit nudity" dimension and not an "explicit Christianity" dimension should be tolerably obvious in the marketplace -- but Wikipedia does not have any business deciding for its readers that "nudity" is problematic and needs to be a filtering option but "Christianity" doesn't.
By "biases" I mean the inherent bigotries that will be encoded into any particular category. A system which considers female breasts to be "nudity" but male chests not to be is sexist by nature. (And anyone who thinks that female breasts are "sexual" but male chests are not simply has not asked enough straight women or gay men for their opinion on the issue.)
By "extremes" I mean the judgment as to what is the "mildest" category of "objectionable content" which still merits tagging -- and what constitutes "mild" or "extreme". A system which tags exposure of the female breast, but not of the female ankle or face expresses a POV as to what is "acceptable" exposure. A system which tags explicit sexual activity but does not consider a homosexual kiss to be "explicit sexual activity" expresses a bias about an issue that many people apparently find offensive.
The people or organizations likely to want content tagging would require complete tagging, or else it would be useless to them, right? Yet being a wiki and volunteer, Wikipedia could never be "safeguarded" to their satisfaction. Is this not a useless discussion?
No, Wikipedia will never be completely safeguarded, just as it will never be completely finished. But if we use a keyword system (either instead of or complementary to a rating scheme), then images can have metadata that users can filter out. So if people are offended by images of women's faces, then they can block "girls, women, etc."
The best thing about this is that it won't interfere with normal users, since people will have to opt IN to it.
A tagging system shouldn't claim to be 100% complete; it's just an attempt to help people avoid viewing offensive images. It doesn't have to be complete; barely anything on Wikipedia is ever truly 'complete.' A tagging system especially should never advertise itself as complete, because there will always be the possibility of somebody uploading a new image that isn't tagged as offensive when it may be.
-- Blog: http://frazzydee.ca
-----BEGIN GEEK CODE BLOCK----- Version: 3.1 GCS d? s:- a--- C+++ UL++ P+ L+ E---- W++ N+ o+ K+ w+ O? M-- V? PS++ PE Y PGP++ t 5-- X+ R tv b++ DI++ D+ G++ e- h! !r !z ------END GEEK CODE BLOCK------
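The keyword scheme suggested above amounts to a simple set intersection: each image carries free-form metadata keywords, each user keeps a personal blocklist (empty by default, so the filter is strictly opt-in), and an image is hidden only when the two overlap. A rough sketch, with invented names:

```python
def is_hidden(image_keywords, blocklist):
    """True if any of the image's keywords appears in the user's personal blocklist.

    An image with no keywords is never hidden: the system fails open,
    matching the point that tagging need not be complete to be useful.
    """
    return not set(image_keywords).isdisjoint(blocklist)
```

For example, `is_hidden(["nudity", "film still"], ["nudity"])` hides the image, while any reader with an empty blocklist is never affected.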
Jack Lutz wrote:
The people or organizations likely to want content tagging would require complete tagging, or else it would be useless to them, right? Yet being a wiki and volunteer, Wikipedia could never be "safeguarded" to their satisfaction. Is this not a useless discussion?
WikiEN-l mailing list WikiEN-l@Wikipedia.org http://mail.wikipedia.org/mailman/listinfo/wikien-l
And while I haven't given a whole lot of thought to this, a keyword scheme in addition to a rating categorization scheme could also have other benefits, like being able to search through images better.
I don't see why a rating categorization scheme (and maybe a keyword system, not too sure about that one) should be avoided if it doesn't harm anybody, and only offers optional positive enhancements to users who want it.
Faraaz Damji wrote:
No, Wikipedia will never be completely safeguarded, just as it will never be completely finished. But if we use a keyword system (either instead of or complementary to a rating scheme), then images can have metadata that users can filter out. So if people are offended by images of women's faces, then they can block "girls, women, etc."
The best thing about this is that it won't interfere with normal users, since people will have to opt IN to it.
Exactly, this is the way to go. I just wrote pretty much the same thing in another email.
Zach
On 4/14/05, Zach Alexander zdalexander@gmail.com wrote:
Exactly, this is the way to go. I just wrote pretty much the same thing in another email.
I don't think there's anything wrong with allowing arbitrary tagging. There may be something wrong with any particular filtering system, and I'm not sure we should support any one in particular. Offering people the ability to edit metadata tags, without making some Official Policy concerning how they will be used, would be a useful addition to the wiki-toolbox.
These tags/keywords could be optionally shown at the top of each page and at the bottom of each image, for editors who wanted to see it... or just shown in a separate section linked to at the bottom of each article/image page.
SJ
ps - there was a similar outcry about the possible introduction of Categories, with some people insisting that since there was no One Good Way to categorize things, cats would just be a mess, would be POV, etc.
I really think this comes down to an issue of accessibility. If we're aiming to build a large, open and accessible encyclopedia, then I don't think it helps that cause to effectively limit its readership to the most extremely liberal people on the planet - which rules out a vast amount of our potential readers and contributors. By definition, it particularly rules out a rather large proportion of people in non-Western countries, which can't be good for our content.
I'm really tired of feeling like a conservative on Wikipedia. Somewhere, it seems we lost a bit of basic common sense. If there's a good reason an image is likely to offend people (and there's justification for it being there at all), then make people use a second click to access the image. In that instance, no information is lost, it's still just as accessible, and the number of people likely to be offended or to stop using Wikipedia drops markedly.
-- ambi
Rebecca said:
I really think this comes down to an issue of accessibility. If we're aiming to build a large, open and accessible encyclopedia, then I don't think it helps that cause to effectively limit its readership to the most extremely liberal people on the planet - which rules out a vast amount of our potential readers and contributors.
There are no a priori access restrictions on Wikipedia. If the pictures offend you, don't download them.
Tony Sidaway wrote:
There are no a priori access restrictions on Wikipedia. If the pictures offend you, don't download them.
But often a person has no way of knowing whether a picture will be offensive until he has seen it. We can only suggest that a picture *may* be offensive to some, perhaps with some offensiveness rating. That can still give the user the option to set his own tolerance level.
Ec
Ray Saintonge said:
But often a person has no way of knowing whether a picture will be offensive until he has seen it. We can only suggest that a picture *may* be offensive to some, perhaps with some offensiveness rating. That can still give the user the option to set his own tolerance level.
I don't think we can say whether a given picture is or is not likely to be offensive to anyone--for instance I was utterly unprepared for the apparently genuine expressions of distress at the presence of a smudgy nipple, and I don't think that's an untypical attitude. I'm not going to stand in the way of those who want to give it a go, though. We use consensus for judging a lot of stuff on Wikipedia and although it isn't infallible it seems to work better than you'd think it oughta.
Indeed, that's what I'm asking for. A choice as to whether to download them - an acknowledgement that the image is likely to be offensive, and then a *choice*. You, on the other hand, seem to be demanding that I either be forced to download them, or to go the ridiculous extreme of turning off all images.
-- ambi
Rebecca said:
Indeed, that's what I'm asking for. A choice as to whether to download them - an acknowledgement that the image is likely to be offensive, and then a *choice*. You, on the other hand, seem to be demanding that I either be forced to download them, or to go the ridiculous extreme of turning off all images.
Not at all. Do you use Internet Explorer? 95% of people do. The other day I gave a step-by-step demo of how to make that browser replace all pictures by placeholders, so you have fine control over the download of images on an image-by-image basis. This is the way I normally browse WWW when I'm in a situation where I either am not interested in looking at pictures or know that some images may be unwanted for other reasons. It's much nicer than having your browser download everything, especially if you're on dial-up. Most other browsers can also do this.
Tony Sidaway wrote:
Rebecca said:
Indeed, that's what I'm asking for. A choice as to whether to download them - an acknowledgement that the image is likely to be offensive, and then a *choice*. You, on the other hand, seem to be demanding that I either be forced to download them, or to go the ridiculous extreme of turning off all images.
Not at all. Do you use Internet Explorer? 95% of people do. The other day I gave a step-by-step demo of how to make that browser replace all pictures by placeholders, so you have fine control over the download of images on an image-by-image basis. This is the way I normally browse WWW when I'm in a situation where I either am not interested in looking at pictures or know that some images may be unwanted for other reasons. It's much nicer than having your browser download everything, especially if you're on dial-up. Most other browsers can also do this.
Our policy should not depend on the user's understanding of browser settings. Many who do not understand these settings have also not figured out how to set the clock on their VCR. We cannot base policy on a presumption of what others understand about any outside technology. As much as possible we need to accept responsibility for what WE do. Any kind of restrictions about access to images should be based in our own software.
Ec
Please do not jump to conclusions concerning my understanding of browser settings. I know perfectly well how to disable images in my browser, and for the record, I use Firefox. However, I have no wish to disable them for every site - or to just disable them every time I use Wikipedia - because a handful of people want to use Wikipedia to push a liberal POV.
And for God's sake, Tony, stop throwing around the personal attacks. I'm sick and tired of being accused of being computer illiterate just because I do not wish to go to the patently absurd (and patently unnecessary) length of disabling all images.
-- ambi
On 4/17/05, Ray Saintonge saintonge@telus.net wrote:
Our policy should not depend on the user's understanding of browser settings. Many who do not understand these settings have also not figured out how to set the clock on their VCR. We cannot base policy on a presumption of what others understand about any outside technology. As much as possible we need to accept responsibility for what WE do. Any kind of restrictions about access to images should be based in our own software.
Ec
There was nothing in what I wrote drawing conclusions about your or any other specific person's understanding of browser settings. The ones who are likely to have problems with the settings are probably not editors, but simple passive users of the material. I, in fact, support your view that users should not need to disable all pictures just to avoid seeing a few. I'm afraid that you're the one jumping to erroneous conclusions about what I said.
Ec
Rebecca wrote:
Please do not jump to conclusions concerning my understanding of browser settings. I know perfectly well how to disable images in my browser, and for the record, I use Firefox. However, I have no wish to disable them for every site - or to just disable them every time I use Wikipedia - because a handful of people want to use Wikipedia to push a liberal POV.
-- ambi
On 4/17/05, Ray Saintonge saintonge@telus.net wrote:
Our policy should not depend on the user's understanding of browser settings. Many who do not understand these settings have also not figured out how to set the clock on their VCR. We cannot base policy on a presumption of what others understand about any outside technology. As much as possible we need to accept responsibility for what WE do. Any kind of restrictions about access to images should be based in our own software.
Ec
Jack Lutz said:
The people or organizations likely to want content tagging would require complete tagging, or else it would be useless to them, right? Yet being a wiki and volunteer, Wikipedia could never be "safeguarded" to their satisfaction. Is this not a useless discussion?
I think it may be, ultimately, because I suspect that the people making a clamor about the odd nipple or whatever are probably not going to be satisfied until the offending object is removed. I could be wrong, though, so I don't object to people trying this in an effort to make Wikipedia better.
Karl A. Krueger wrote:
What's wrong with censorware tagging? Where to start? Here's the biggie: tagging is incompatible with Wikipedia's existing commitments.
No system of tags is compatible with Wikipedia's commitment to neutrality. The dimensions, biases, and extremes of any system of tags are created from a particular non-neutral point of view. Wikipedia is categorically forbidden from taking on such a point of view as its own.
By "dimensions" I mean the types of material that are considered worth tagging -- e.g. nudity; violence; religion. The reason that commercial censorware products have an "explicit nudity" dimension and not an "explicit Christianity" dimension should be tolerably obvious in the marketplace -- but Wikipedia does not have any business deciding for its readers that "nudity" is problematic and needs to be a filtering option but "Christianity" doesn't.
By "biases" I mean the inherent bigotries that will be encoded into any particular category. A system which considers female breasts to be "nudity" but male chests not to be is sexist by nature. (And anyone who thinks that female breasts are "sexual" but male chests are not simply has not asked enough straight women or gay men for their opinion on the issue.)
You're absolutely right, but you're also wrong.
First of all, on a minor note, you overstate the NPOVness of Wikipedia. There is no "view from nowhere." Every edit is POV -- so-called NPOV just means arriving at POVs by consensus if possible and majority if not. It would be no different for breasts/Christianity/homosexuality tags than it is for anything else that we argue about.
But you are right about it being problematic, or excessively POV-laden, to have any simple criteria like I was suggesting, or just suggested on the other thread. (E.g. having "sexual content/nudity" being the "dimension" and only two or three options on that dimension.) As you point out there are many kinds of "objectionable" subject matter and many degrees within each.
But the solution is not to throw up our hands and sacrifice Wikipedia's success because NPOV supposedly ties our hands -- the solution would be to **leave the choice of objectionable content in the hands of the end user**. We create [[Special:Censorship]]. If a user blanks it, Wikipedia is uncensored. If a user puts the tags {{img.genitals}}, {{img.femalebreasts}}, and {{paganism}} on it, he doesn't see genitals or Kate Winslet, or even see articles about pagans. If another user puts {{img.malechests}}, {{img.ankles}}, and {{christianity}}, she doesn't see pictures of those things or articles on Christianity.
A setup more or less like this would make things more complicated, and like all things Wikipedia would never be completely finished, but it would solve the problem, and it is immune to Karl's objection. People would generally put {{img.femalebreasts}} tags on things like Kate Winslet, and people at work, etc. could set up their [[Special:Censorship]] page and that would be that.
Of course, most people would realize it's not worth the trouble trying to protect yourself/kids from homo kisses and pagans and boobies. But until everyone grows up, it's probably either "fork" (so to speak, via what I just described) or be forked (eventually).
Zach
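Zach's [[Special:Censorship]] idea can be sketched in a few lines. This is a hypothetical illustration only: the tag names ({{img.genitals}}, {{paganism}}, etc.), the special page, and the functions below are all proposals from this thread, not existing MediaWiki features, and MediaWiki itself is written in PHP, not Python.

```python
import re

def parse_censorship_page(wikitext: str) -> set[str]:
    """Extract the {{...}} tags a reader has listed on a hypothetical
    [[Special:Censorship]] page."""
    return set(re.findall(r"\{\{([^}]+)\}\}", wikitext))

def visible(item_tags: set[str], user_filter: set[str]) -> bool:
    """An image or article is shown unless it carries a tag the reader
    has chosen to filter."""
    return not (item_tags & user_filter)

# A blanked censorship page means the reader sees everything.
user_a = parse_censorship_page("")
user_b = parse_censorship_page("{{img.genitals}} {{paganism}}")

assert visible({"img.femalebreasts"}, user_a)
assert visible({"img.femalebreasts"}, user_b)  # tag not in user_b's filter
assert not visible({"paganism"}, user_b)       # user_b filters pagan articles
```

The key property Zach is arguing for is visible here: the server stores only tags, and every filtering decision reduces to a set intersection chosen entirely by the end user.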
Zach Alexander said:
the solution would be to **leave the choice of objectionable content in the hands of the end user**. We create [[Special:Censorship]]. If a user blanks it, Wikipedia is uncensored. If a user puts the tags {{img.genitals}}, {{img.femalebreasts}}, and {{paganism}} on it, he doesn't see genitals or Kate Winslet, or even see articles about pagans. If another user puts {{img.malechests}}, {{img.ankles}}, and {{christianity}}, she doesn't see pictures of those things or articles on Christianity.
This seems reasonable, though I'm not sure if it would be workable in practice. I still lean heavily towards the view that the reader knows best precisely which images he prefers, and if it matters to him he can choose to download those images and precisely those images. But I think the best and simplest server-side support for suppressing downloads that we could give to users would be one that turned all inlines into links. The best browsers already provide a facility to do something similar and I use it all the time on Internet Explorer and Firefox. There should also be an option on links (all links, not just the ones that are suppressed inlines) that renders the image as part of the article. Incidentally anyone with IE can try this out:
Go to Tools/Internet Options
Click the Advanced tab
Under Multimedia make the following changes:
Ensure that Show Image Download Placeholders is checked
Ensure that Show Pictures is unchecked
If you now click OK and reload the current page, any pictures will be replaced by a placeholder. To view any of these pictures in the page, hover over the placeholder and select "Show Picture" in the right mouse button menu. It's that simple. No server-side support is required.
One possible way of making server-side filtering scale is for motivated editors to create their own ratings for images. [[User:Example/Classification/NotSuitableForKids]] might contain a list of links to images that, in the opinion of User:Example, are unsuitable for kids. Someone for whom this kind of thing matters can then pop this into his Censorship file, with the result that he won't be able to download anything that gets put into that article. The system would be based on trust. People with religious objections and whatnot could club together and maintain their censor lists without bothering anybody else.
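The trust-based list scheme Tony describes could work roughly as follows. A sketch under stated assumptions: the classification page names, image titles, and data layout are all hypothetical, and the logic is shown in Python purely for illustration even though MediaWiki is PHP.

```python
# Hypothetical store of editor-maintained classification pages, each
# listing images that editor considers unsuitable for some audience.
classification_pages = {
    "User:Example/Classification/NotSuitableForKids": {
        "Image:KateWinsletTitanic.jpg",
    },
    "User:Other/Classification/Violence": {
        "Image:BattleScene.jpg",
    },
}

def blocked_images(subscribed_lists):
    """Union of every image named on the lists this reader trusts."""
    blocked = set()
    for page in subscribed_lists:
        blocked |= classification_pages.get(page, set())
    return blocked

# A reader subscribes only to the lists whose maintainers they trust.
reader = ["User:Example/Classification/NotSuitableForKids"]
assert "Image:KateWinsletTitanic.jpg" in blocked_images(reader)
assert "Image:BattleScene.jpg" not in blocked_images(reader)
```

This keeps the classification work entirely on the readers' side of the fence: editors who care maintain lists, readers who care subscribe to them, and nobody else is affected.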
Tony Sidaway (minorityreport@bluebottle.com) [050415 22:58]:
One possible way of making server-side filtering scale is for motivated editors to create their own ratings for images. [[User:Example/Classification/NotSuitableForKids]] might contain a list of links to images that, in the opinion of User:Example, are unsuitable for kids. Someone for whom this kind of thing matters can then pop this into his Censorship file, with the result that he won't be able to download anything that gets put into that article. The system would be based on trust. People with religious objections and whatnot could club together and maintain their censor lists without bothering anybody else.
That could be a very good and workable solution, allowing classifications to grow organically without a chilling effect on the production of raw material.
[ ] Block all images on pages linked to from the page: [________]
(that's a ticky-box at the start and a text box at the end.)
How's your PHP coding?
- d.
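The preference David sketches (a ticky-box plus a page name) would amount to a small change in how images are rendered. A minimal sketch, assuming a blocklist already parsed from the named page; the function and field names are made up for illustration, and the real implementation would be in MediaWiki's PHP, not Python.

```python
def render_image(title, src, blocklist, blocking_enabled):
    """Return an inline <img>, or a plain link when the image is blocked."""
    if blocking_enabled and title in blocklist:
        # Placeholder link: the reader can still click through deliberately.
        return '<a href="%s">%s</a>' % (src, title)
    return '<img src="%s" alt="%s">' % (src, title)

# Blocklist parsed from the page named in the preference text box.
blocklist = {"Image:KateWinsletTitanic.jpg"}

assert render_image("Image:KateWinsletTitanic.jpg", "/img/kw.jpg",
                    blocklist, True).startswith("<a ")
assert render_image("Image:Tomato.jpg", "/img/t.jpg",
                    blocklist, True).startswith("<img ")
```

Rendering a link rather than suppressing the image entirely mirrors the IE placeholder behaviour Tony described, but moves the choice server-side so it works in any browser.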
David Gerard wrote:
Tony Sidaway (minorityreport@bluebottle.com) [050415 22:58]:
pop this into his Censorship file, with the result that he won't be able to download anything that gets put into that article.
That could be a very good and workable solution, allowing classifications to grow organically without a chilling effect on the production of raw material.
Thumbs up, Tony and David. Minimal coding; maximum Wiki.
Tom Haws
Karl A. Krueger wrote:
On Wed, Apr 13, 2005 at 04:34:20PM +0100, Zach Alexander wrote:
Alphax wrote:
Er, the Kate Winslet nude is only marginally "nude". If it's such a problem, use a tabbed browser or disable images.
"Marginally nude" is still nudity. One breast counts as nudity.
In some cultures, exposure of the female *face* in public is "nudity".
As I jokingly said earlier, but yes. In some cultures, the face of a prophet cannot be shown.
You can argue that nudity should be allowed on Wikipedia, and I might even agree with you, but a spade's a spade.
Of course nudity should be allowed on Wikipedia, just as explicit pictures of tomatoes should be allowed. After all, the world contains both nudity and tomatoes, and Wikipedia's job is to describe the world.
The reality of NPOV is that we can't disallow anything that isn't illegal.
And you can't seriously mean that disabling images is a real solution to the problem. We'd have to have a notice on the main page, e.g. "Note: Nudity may appear in any entry. If you are offended by nudity, please disable images in your browser." I don't think that would go over well.
We already have a [[Wikipedia:Content disclaimer]] linked from the main page. It is at least as prominent as other quasi-legal notices posted on other Web sites, which purport to advise the reader of site policies -- for instance, privacy policies on many commercial Web sites.
We could have {{images}} at the top of pages with images, saying something like "This page contains images, which you may or may not be offended by".
How hard would it be to add to Mediawiki the option to tag explicit images, and have an option in the preferences to not see them? A la Google SafeSearch. This might be a win-win (aside from the programming work) solution.
This has been brought up time and again here.
Censorware tagging on Wikipedia is a little bit like "email postage" as an anti-spam solution -- it can't work; there are reasonably clear reasons it can't work; but it keeps getting proposed anyway.
And yet Google SafeSearch appears to work...
(The chief reason "email postage" can't work as an anti-spam solution is that spammers already steal other people's computing resources to send spam, e.g. using virus-infected computers, and would quite readily adapt to foist the "postage" cost off on innocents similarly. Ordinary mail users would be stuck with the "postage" while spammers would keep screwing over everyone else, just as they have always done. And all non-commercial mailing lists, such as this one, would be destroyed.)
What's wrong with censorware tagging? Where to start? Here's the biggie: tagging is incompatible with Wikipedia's existing commitments.
No system of tags is compatible with Wikipedia's commitment to neutrality. The dimensions, biases, and extremes of any system of tags are created from a particular non-neutral point of view. Wikipedia is categorically forbidden from taking on such a point of view as its own.
By "dimensions" I mean the types of material that are considered worth tagging -- e.g. nudity; violence; religion. The reason that commercial censorware products have an "explicit nudity" dimension and not an "explicit Christianity" dimension should be tolerably obvious in the marketplace -- but Wikipedia does not have any business deciding for its readers that "nudity" is problematic and needs to be a filtering option but "Christianity" doesn't.
By "biases" I mean the inherent bigotries that will be encoded into any particular category. A system which considers female breasts to be "nudity" but male chests not to be is sexist by nature. (And anyone who thinks that female breasts are "sexual" but male chests are not simply has not asked enough straight women or gay men for their opinion on the issue.)
By "extremes" I mean the judgment as to what is the "mildest" category of "objectionable content" which still merits tagging -- and what constitutes "mild" or "extreme". A system which tags exposure of the female breast, but not of the female ankle or face expresses a POV as to what is "acceptable" exposure. A system which tags explicit sexual activity but does not consider a homosexual kiss to be "explicit sexual activity" expresses a bias about an issue that many people apparently find offensive.
So are we prepared to sacrifice a tiny bit of NPOVness to make something of a higher quality? If we were truly neutral, we would prohibit deletion of material from articles, and let the article build up as a series of assertions. Readers would decide for themselves.
Karl A. Krueger said:
On Wed, Apr 13, 2005 at 04:34:20PM +0100, Zach Alexander wrote:
And you can't seriously mean that disabling images is a real solution to the problem.
It works for me, especially if I'm browsing from work.
We'd have to have a notice on the main page, e.g.
"Note: Nudity may appear in any entry. If you are offended by nudity, please disable images in your browser." I don't think that would go over well.
We already have a [[Wikipedia:Content disclaimer]] linked from the main page.
It is linked from every single page on Wikipedia.
geni said:
It is linked from every single page on Wikipedia.
The relevant part isn't (or wasn't when I last checked).
It's at the top of the disclaimer. The only text prior to that is the article title (Wikipedia:General disclaimer) and the attribution (From Wikipedia, the free encyclopedia.) It reads as follows:
"General disclaimer - Use Wikipedia at your own risk - Wikipedia does not give medical advice - Wikipedia does not give legal opinions - Wikipedia contains spoilers and content you may find objectionable"
The latter phrase is linked to the content disclaimer which expands further on the kind of stuff to expect.
Sean Barrett said:
For me, the largest part of the problem with the Kate Winslet nude is that it renders all of Wikipedia non-work-safe.
What does "work-safe" mean in this context? If it's only the pictures, why do you browse wikipedia with image downloads turned on?
Perhaps we can identify some reasonable notion of "work-safeness" and work to make the user experience work-safe for all of our readers. Our primary goal should be to make the encyclopedia a useful and reliable resource, for as many people as possible, in as many environments as possible. This includes making it reliably safe to let one's children use the encyclopedia, to refer to articles for illustration in the middle of a discussion or presentation, regardless of audience, to browse the encyclopedia at work.
I am not sure how to address the first case above - providing a site that parents would feel comfortable letting their 6-year-old browse. But I think we can deal with the last two scenarios pretty well without a great deal of work.
At each step along the way, we can assume that the reader is '''doing his or her best''' to avoid looking up undesired pages or content. Editors should avoid surprising users by showing them content they did not expect, and do not want, to see on a particular subject. For instance, we have {{spoiler}} messages. We should have the same kinds of messages for potentially offensive content [somehow I thought we did by now, but it seems this is not so.]
I've added a few templates ('obscenity', 'graphicviolence', and 'nonworksafe') to the end of the General templates page; please see whether the text is appropriate, and add them to articles where needed. http://en.wikipedia.org/wiki/Wikipedia:Template_messages/General
I think it is fair to assume that readers are doing all they can /not/ to come across content that they or their employers find inappropriate. It is also polite to warn readers when they are about to be shocked.
1. If the title of the article does *not* make it clear that it is not worksafe, or about violent or obscene or nude subjects; or 2. If the content of one section, despite its relevance to the article as a whole, is unusually graphic, or potentially offensive/startling;
then a spoiler-type warning is probably appropriate.
On 4/13/05, Tony Sidaway minorityreport@bluebottle.com wrote:
Sean Barrett said:
For me, the largest part of the problem with the Kate Winslet nude is that it renders all of Wikipedia non-work-safe.
What does "work-safe" mean in this context? If it's only the pictures, why do you browse wikipedia with image downloads turned on?
Images are a significant portion of the encyclopedia. No-one should have to turn images off simply because of a few poorly-announced and surprisingly-placed "non-work-safe" images.
No. It's a dumb term only used by people who can't operate a web browser.
Would you like to withdraw that statement? If not, can you provide any sources to back it up (i.e. that every person who uses it doesn't know how to use a web browser)?
geni wrote:
No. It's a dumb term only used by people who can't operate a web browser.
Would you like to withdraw that statement? If not, can you provide any sources to back it up (i.e. that every person who uses it doesn't know how to use a web browser)?
It's not so much whether people know how to operate the web browser, it's that it makes referring to WP an unduly risky activity. For instance, suppose I'm in the office of a less-clueful boss, and am trying to get the boss to understand a detail, and I know WP has a good explanation; it's not going to help my case if I have to ask the boss to turn off image display before I have him/her bring up a WP page.
This is another variant on the child-safe debate, and maybe the filtering needs to be done by a downstream organization (who could make a nice bit of money from a subscription service I bet), but if WP gets a reputation as risky to look at, and companies feel compelled to forbid its use at work, that's going to cut us off from a large population of professionals that we would really like to have participating, and during the day, not just nights and weekends.
Stan
Stan Shebs said:
It's not so much whether people know how to operate the web browser, it's that it makes referring to WP an unduly risky activity. For instance, suppose I'm in the office of a less-clueful boss, and am trying to get the boss to understand a detail, and I know WP has a good explanation; it's not going to help my case if I have to ask the boss to turn off image display before I have him/her bring up a WP page.
Precisely. You're blaming Wikipedia for your boss's cluelessness.
This is another variant on the child-safe debate
Indeed. We could make Wikipedia "child safe" (another utterly pointless term when applied to anything a three-year-old couldn't choke on), we could make it "granny safe", but it would be silly because all the people demanding that we make Wikipedia "such-and-such safe" are blaming their communication problems on Wikipedia and saying Wikipedia must be changed. Why do they do that? Presumably because it would involve less effort on their part if Wikipedia were to change to suit *their* personal requirements, and forget all that stuff about making a great encyclopedia. They wouldn't have to go about their difficult personal responsibilities, those of educating their clueless boss, or their children, or their granny, or failing that, ensuring that their non-Wikipedia-safe boss, child or granny is not operated in a manner that could cause damage to Wikipedia.
, and maybe the filtering needs to be done by a downstream organization
Or a downstream brain cell?
Precisely. You're blaming Wikipedia for your boss's cluelessness.
IT departments exist for a reason. I suppose that you are going to try and use your "learn to use your browser" argument again. Unfortunately, you don't appear to have thought about what we would have to do in order to allow people to do this. Suppose I turn off images. Now suppose I view an article and want to see the pictures. How do I know if the picture I am about to allow through is work safe? Well, I can hope it is mentioned in the caption, but if we take, say, this version of the article, it clearly isn't: http://en.wikipedia.org/w/index.php?title=Titanic_%281997_movie%29&oldid...
So YOU are denying me the ability to make an informed decision.
Indeed. We could make Wikipedia "child safe" (another utterly pointless term when applied to anything a three-year-old couldn't choke on), we could make it "granny safe", but it would be silly because all the people demanding that we make Wikipedia "such-and-such safe" are blaming their communication problems on Wikipedia and saying Wikipedia must be changed. Why do they do that? Presumably because it would involve less effort on their part if Wikipedia were to change to suit *their* personal requirements, and forget all that stuff about making a great encyclopedia. They wouldn't have to go about their difficult personal responsibilities, those of educating their clueless boss, or their children, or their granny, or failing that, ensuring that their non-Wikipedia-safe boss, child or granny is not operated in a manner that could cause damage to Wikipedia.
Do you have any evidence of the above?
BTW since when did I have a responsibility to educate people?
Or a downstream brain cell?
Downstream brain cells can only react passively. We can act actively.
On Thu, Apr 14, 2005 at 05:30:08PM +0100, geni wrote:
Precisely. You're blaming Wikipedia for your boss's cluelessness.
IT departments exist for a reason. I suppose that you are going to try and use your "learn to use your browser" argument again. Unfortunately
Let's step back a moment and look at this from a more "principles of the discussion" perspective. I think you're both responding based on being too close to the details of the landscape, and missing the "big picture" panorama before you.
1. Tony Sidaway supports the idea that Wikipedia should be technically effective, and to heck with social factors that interfere with that in any way.
2. geni supports the idea that Wikipedia, to maximize opportunity for survival and growth, must pay homage to social conventions that demand certain (effectively arbitrary) limits on technical implementation.
My personal sympathies lie pretty well perfectly aligned with Tony's stance. I'm very much a "technical accuracy first, social pandering if we have time later" sort of person. Yes, I phrased that pejoratively, because that's how I tend to think of it. That doesn't mean I don't see the point of geni's stance.
Both are absolutely correct within the realm of their primary concerns in this debate. There are two constructive approaches that come immediately to my mind for resolving the matter:
1. We can take the organic growth approach, and try to just balance the two as well as we can by (very) rough consensus. This is what we're already doing. The big downside is that these arguments will continue indefinitely. I'll leave upside(s) as an exercise for the reader.
2. We can draft a set of guiding principles by which such matters will be judged, beginning with a logical progression from the core purpose of the project as a whole. My knee-jerk sympathies lie with this option, for much the same reasons that they lie as well with the "technical accuracy" approach above, but the big downside to this approach is the fact that, ultimately, such principles have to come from SOMEwhere, and in a very democratic ("mob rule") and consensus-driven environment there is little hope that the guiding principles will be decided for the right reasons, regardless of whether the right principles are chosen for the most part. Again, I'll leave upside(s) as an exercise for the reader.
So. Armed with this semipretentious breakdown of the conditions of the argument, do you think we might take a more productive approach to discussing the matter than oblique sniping?
-- Chad Perrin [ CCD CopyWrite | http://ccd.apotheon.org ]
Chad Perrin said:
- Tony Sidaway supports the idea that Wikipedia should be technically
effective, and to heck with social factors that interfere with that in any way.
More or less correct in this case, but only because the problem is not a technical one but a social one and my social solution is geared towards minimizing technical kludges. Only the end-user can possibly know what he does and does not want to see. If he visits an encyclopedia article about a PG13 movie he should expect it to contain PG13 illustrations, and if he is likely to have a problem with images from the movies he may want to read about he should certainly take appropriate precautions.
On 4/15/05, Tony Sidaway minorityreport@bluebottle.com wrote:
Chad Perrin said:
- Tony Sidaway supports the idea that Wikipedia should be technically
effective, and to heck with social factors that interfere with that in any way.
More or less correct in this case, but only because the problem is not a technical one but a social one and my social solution is geared towards minimizing technical kludges. Only the end-user can possibly know what he does and does not want to see. If he visits an encyclopedia article about a PG13 movie he should expect it to contain PG13 illustrations, and if he is likely to have a problem with images from the movies he may want to read about he should certainly take appropriate precautions.
I wasn't aware that social engineering was the aim of this project.
geni said:
On 4/15/05, Tony Sidaway minorityreport@bluebottle.com wrote:
Chad Perrin said:
- Tony Sidaway supports the idea that Wikipedia should be
technically effective, and to heck with social factors that interfere with that in any way.
More or less correct in this case, but only because the problem is not a technical one but a social one and my social solution is geared towards minimizing technical kludges. Only the end-user can possibly know what he does and does not want to see. If he visits an encyclopedia article about a PG13 movie he should expect it to contain PG13 illustrations, and if he is likely to have a problem with images from the movies he may want to read about he should certainly take appropriate precautions.
I wasn't aware that social engineering was the aim of this project.
Nor I. There are social considerations implicit in the design of a website, but ultimately one should seek to leave all decisions with the user. Commercial websites seldom adopt this stance, but it's an essential element of good design.
picture I am about to allow through is work safe? Well I can hope it is mention in the caption but if we take say this version of the article it clearly isn't: http://en.wikipedia.org/w/index.php?title=Titanic_%281997_movie%29&oldid...
Is this a purely theoretical discussion? Naming that page as an example of a page that isn't "work safe" is laughable. Surely, in most work-places it isn't work safe to waste your salaried time reading up about Titanic, but THAT PARTICULAR IMAGE does not make the page less "work safe" than it already is.
Björn Lindqvist said:
picture I am about to allow through is work safe? Well I can hope it is mention in the caption but if we take say this version of the article it clearly isn't: http://en.wikipedia.org/w/index.php?title=Titanic_%281997_movie%29&oldid...
Is this a purely theoretical discussion? Naming that page as an example of a page that isn't "work safe" is laughable. Surely, in most work-places it isn't work safe to waste your salaried time reading up about Titanic, but THAT PARTICULAR IMAGE does not make the page less "work safe" than it already is.
Geni has also unwittingly demonstrated that the Wikipedia site will never be "worksafe" unless we delete the history files.
geni said:
Geni has also unwittingly demonstrated that the Wikipedia site will never be "worksafe" unless we delete the history files.
How many people look through the history files?
It only takes one link on an external website placed by a joker. "Hey, look what they have here!" Every single version of every article on Wikipedia is URL-accessible. If the picture in question still exists, it will be shown in the disposition suggested by the version being viewed, not in the supposedly "work-safe" disposition of the current version. The only solution is to let the users take control of their own browsers. They know what they do and do not want to see.
It only takes one link on an external website placed by a joker. "Hey, look what they have here!" Every single version of every article on Wikipedia is URL-accessible.
We are not responsible for the actions of people offsite. The fact is that at present Wikipedia does not make a very good shock site. I would like to keep things this way.
If the picture in question still exists, it will be shown in the disposition suggested by the version being viewed, not in the supposedly "work-safe" disposition of the current version. The only solution is to let the users take control of their own browsers. They know what they do and do not want to see.
As I showed, no, they don't. In that version there was no way to know what they were going to see if they allowed the image through. Uninformed choice is meaningless.
geni said:
It only takes one link on an external website placed by a joker. "Hey, look what they have here!" Every single version of every article on Wikipedia is URL-accessible.
We are not responsible for the actions of people offsite. The fact is that at present Wikipedia does not make a very good shock site. I would like to keep things this way.
Well I've just shown you that Wikipedia *is* a very good shocksite. A concrete example: fred1245 makes a userpage and pops the clitoris picture, enlarged to fill the browser page, into the page, saves that and then blanks it. Then he posts the innocent-looking Wikipedia URL http://en.wikipedia.org/w/index.php?title=User:Fred1245&oldid=12302327 on the Rapture Ready forum.
The only solution is to let the users take control of their own browsers. They know what they do and do not want to see.
As I showed, no, they don't. In that version there was no way to know what they were going to see if they allowed the image through. Uninformed choice is meaningless.
The picture in question is a still from a PG13 movie, placed on an article about that movie. If people go to that article, presumably they expect to see stills from the movie. If they don't, there's nothing we can do to help them.
Well I've just shown you that Wikipedia *is* a very good shocksite. A concrete example: fred1245 makes a userpage and pops the clitoris picture, enlarged to fill the browser page, into the page, saves that and then blanks it. Then he posts the innocent-looking Wikipedia URL http://en.wikipedia.org/w/index.php?title=User:Fred1245&oldid=12302327 on the Rapture Ready forum.
And that is a bad thing how? If I wanted to cause trouble at RR I would post some stats showing that Christians in the US are in the majority or something. Anyway, alternatively he just rehosts the goatse and links to that (although linking to the Skeptic's Bible would probably result in a bigger fuss).
The picture in question is a still from a PG13 movie, placed on an article about that movie. If people go to that article, presumably they expect to see stills from the movie. If they don't, there's nothing we can do to help them.
And how am I meant to know that Titanic is PG13 (a rating that doesn't even exist in my country)?
geni said:
Well I've just shown you that Wikipedia *is* a very good shocksite. A concrete example: fred1245 makes a userpage and pops the clitoris picture, enlarged to fill the browser page, into the page, saves that and then blanks it. Then he posts the innocent-looking Wikipedia URL http://en.wikipedia.org/w/index.php?title=User:Fred1245&oldid=12302327 on the Rapture Ready forum.
And that is a bad thing how?
I'm not going to argue about whether it's a bad thing, I'm just showing you that Wikipedia is easy to use as a shocksite in a manner that would be hard to detect prior to first use.
The picture in question is a still from a PG13 movie, placed on an article about that movie. If people go to that article, presumably they expect to see stills from the movie. If they don't, there's nothing we can do to help them.
And how am I meant to know that Titanic is PG13 (a rating that doesn't even exist in my country)?
It's probably best to assume that *all* movies may contain things that some very sensitive people will find upsetting to look at. Fast cars, violence, explosions, extreme cruelty, glorification of stupidity, and last but not least the occasional bare-chested young lady. People who are likely to be upset by movies know who they are and precisely what it is that is likely to upset them. We don't--hence this display of mutual incomprehension between those of us who don't know what the fuss is about and those who think it's blatantly obvious.
I'm not going to argue about whether it's a bad thing, I'm just showing you that Wikipedia is easy to use as a shocksite in a manner that would be hard to detect prior to first use.
Not really. RR is not quite as uptight as you seem to think. There is even a discussion on pubic hair shaving in there somewhere.
It's probably best to assume that *all* movies may contain things that some very sensitive people will find upsetting to look at. Fast cars, violence, explosions, extreme cruelty, glorification of stupidity, and last but not least the occasional bare-chested young lady. People who are likely to be upset by movies know who they are and precisely what it is that is likely to upset them. We don't--hence this display of mutual incomprehension between those of us who don't know what the fuss is about and those who think it's blatantly obvious.
Snow White and the Seven Dwarfs? Most of the stuff listed as U in the UK?
geni said:
I'm not going to argue about whether it's a bad thing, I'm just showing you that Wikipedia is easy to use as a shocksite in a manner that would be hard to detect prior to first use.
Not really. RR is not quite as uptight as you seem to think. There is even a discussion on pubic hair shaving in there somewhere.
I've posted there a lot in the past (nicely, and they liked me a lot). I know what they're like.
It's probably best to assume that *all* movies may contain things that some very sensitive people will find upsetting to look at. Fast cars, violence, explosions, extreme cruelty, glorification of stupidity, and last but not least the occasional bare-chested young lady. People who are likely to be upset by movies know who they are and precisely what it is that is likely to upset them. We don't--hence this display of mutual incomprehension between those of us who don't know what the fuss is about and those who think it's blatantly obvious.
Snow White and the Seven Dwarfs? Most of the stuff listed as U in the UK?
People who are that easily shocked may get upset at movies, period. It is notoriously difficult to tell the content of a movie from its name. And you just pointed out yourself that movie ratings in one country may mean nothing in another.
I've posted there a lot in the past (nicely, and they liked me a lot). I know what they're like.
Appeal to authority logical fallacy. You haven't explained why they shouldn't just link to a mirror of a shock site to much greater effect.
People who are that easily shocked may get upset at movies, period. It is notoriously difficult to tell the content of a movie from its name. And you just pointed out yourself that movie ratings in one country may mean nothing in another.
They may get shocked at movies, but they will not expect to get shocked by encyclopedia articles.
geni said:
I've posted there a lot in the past (nicely, and they liked me a lot). I know what they're like.
Appeal to authority logical fallacy.
What? I'm just explaining that I know what they're like. I was once a regular there.
You haven't explained why they shouldn't just link to a mirror of a shock site to much greater effect.
It isn't my argument that Wikipedia is the most effective shock site. I just demonstrated how it is, nevertheless, very easy to use as one in a manner that would be undetectable prior to first use--the user page wouldn't show in the "What links here" list for more than the few seconds it took for the miscreant to blank it.
People who are that easily shocked may get upset at movies, period. It is notoriously difficult to tell the content of a movie from its name. And you just pointed out yourself that movie ratings in one country may mean nothing in another.
They may get shocked at movies, but they will not expect to get shocked by encyclopedia articles.
If they're shocked by a boob or two, I think it's fair to say that they'll be used to being shocked by encyclopedias. There is nothing we can do about this. My 1950s Britannica has boobs; I don't expect things have changed much on that score in the past 50 years.
geni wrote:
If the picture in question still exists, it will be shown in the disposition suggested by the version being viewed, not in the supposedly "work-safe" disposition of the current version. The only solution is to let the users take control of their own browsers. They know what they do and do not want to see.
As I showed no they don't. In that version there was no way to know what they are going to see if they allow the image through. Uniformed choice is meaningless.
Uninformed choice is also meaningless.
Ec
BJörn Lindqvist wrote:
picture I am about to allow through is work safe? Well I can hope it is mentioned in the caption, but if we take, say, this version of the article it clearly isn't: http://en.wikipedia.org/w/index.php?title=Titanic_%281997_movie%29&oldid...
Is this a purely theoretical discussion? Naming that page as an example of a page that isn't "work safe" is laughable. Surely, in most work-places it isn't work safe to waste your salaried time reading up about Titanic, but THAT PARTICULAR IMAGE does not make the page less "work safe" than it already is.
The long-forgotten genesis of this thread was that Sean Barrett was observing that WP's current stance on images meant that a WP user could run afoul of companies' anti-sexual-harassment policies (images of nudes being considered to create a "hostile work environment"); some of those policies require that employees be terminated for violating them. So it's not at all theoretical; I would advise any person who works for such a company not to use WP for any purpose at all during work hours, even if the intended use of WP is work-related.
Stan
Tony Sidaway wrote:
Stan Shebs said:
It's not so much whether people know how to operate the web browser, it's that it makes referring to WP an unduly risky activity. For instance, suppose I'm in the office of a less-clueful boss, and am trying to get the boss to understand a detail, and I know WP has a good explanation; it's not going to help my case if I have to ask the boss to turn off image display before I have him/her bring up a WP page.
Precisely. You're blaming Wikipedia for your boss's cluelessness.
Are you fifteen years old or what? I'm describing the realities of the workplace - responsible adults with mortgages and families depending on them don't get themselves fired just to make a free speech point. You can fulminate about it all you want, but that's not going to change anything.
A WP with nudity on every page (which sounds great to me actually, much more interesting than what we have now :-) - I can contribute scans of my "nudes on postage stamps") is certainly doable, and will no doubt have a contingent characterizing the result as a "great encyclopedia", but its readership will be rather limited. You can expect to see a great many forks at that point, and most editors will likely migrate to one of them, because they're going to go where the readers are to be found. So the situation will resolve itself in favor of a somewhat restrictive image policy sooner or later; just a question of whether you want to do it now, or after a painful forking process.
Stan
Stan Shebs wrote:
I'm describing the realities of the workplace - responsible adults with mortgages and families depending on them don't get themselves fired just to make a free speech point. You can fulminate about it all you want, but that's not going to change anything.
A WP with nudity on every page (which sounds great to me actually, much more interesting than what we have now :-) - I can contribute scans of my "nudes on postage stamps") is certainly doable, and will no doubt have a contingent characterizing the result as a "great encyclopedia", but its readership will be rather limited. You can expect to see a great many forks at that point, and most editors will likely migrate to one of them, because they're going to go where the readers are to be found. So the situation will resolve itself in favor of a somewhat restrictive image policy sooner or later; just a question of whether you want to do it now, or after a painful forking process.
Tony,
Do what I did here. Take away the personal attack from Stan's message, and then read the rest. Stan is simply telling the truth.
Tom Haws
Tom Haws said:
Stan Shebs wrote:
I'm describing the realities of the workplace - responsible adults with mortgages and families depending on them don't get themselves fired just to make a free speech point.
Tony,
Do what I did here. Take away the personal attack from Stan's message, and then read the rest. Stan is simply telling the truth.
I don't accept that you have removed the personal attack. There is still the blatantly false claim that people who disagree with X or Y's deletionist argument are implicitly "trying to make a free speech point". I do not argue for free speech, ever--the concept is virtually unknown in my country and not one that I consciously espouse. This is simply a matter of employees falsely claiming that Wikipedia must change in order to solve their own personal communication and web navigation problems.
Tony Sidaway wrote:
This is simply a matter of employees falsely claiming that Wikipedia must change in order to solve their own personal communication and web navigation problems.
I'm pretty sure nobody has said that exactly. It is, however, the case that many employees will choose not to put their jobs at risk, even if WP would be helpful in their day-to-day work activities. Maybe that's OK; maybe we don't care if those people participate or not.
However, I think it would be really helpful to the goal of encyclopedia creation if it were possible for, say, the USGS to assign one of their technical writers to the task of improving WP's info on mountain ranges, or seafloor vents, or volcano prediction, plus adding pictures that are not currently available on the net. That kind of win can't happen if we don't come up with some way to better accommodate organizational policies on sexual harassment.
Stan
Stan Shebs said:
I think it would be really helpful to the goal of encyclopedia creation if it were possible for, say, the USGS to assign one of their technical writers to the task of improving WP's info on mountain ranges, or seafloor vents, or volcano prediction, plus adding pictures that are not currently available on the net. That kind of win can't happen if we don't come up with some way to better accommodate organizational policies on sexual harassment.
So the situation is this. You think that USGS would be willing to assign one of its technical writers to improve geological data, but this chap is unlikely to want to use Wikipedia in its current state because he might be falsely accused, by person or persons unknown, of sexual harassment? Curiouser and curiouser!
I'm afraid you'll have to sort that one out for yourself. Wikipedia really shouldn't be involved in accommodating or condoning this kind of corporate idiocy.
Tony Sidaway wrote:
Stan Shebs said:
I think it would be really helpful to the goal of encyclopedia creation if it were possible for, say, the USGS to assign one of their technical writers to the task of improving WP's info on mountain ranges, or seafloor vents, or volcano prediction, plus adding pictures that are not currently available on the net. That kind of win can't happen if we don't come up with some way to better accommodate organizational policies on sexual harassment.
So the situation is this. You think that USGS would be willing to assign one of its technical writers to improve geological data, but this chap is unlikely to want to use Wikipedia in its current state because he might be falsely accused, by person or persons unknown, of sexual harassment? Curiouser and curiouser!
Welcome to the 21st century. The US Geological Survey is actually a pretty liberal crowd. I've been to their offices in California, and they're the types who would be totally in favor of large-scale contribution to WP; I don't think it's at all far-fetched to have some of their scientists and writers contribute as part of their day jobs.
However, the USGS is also an agency of the US govt, which as we know has all kinds of rules and policies that we and the USGS people might find idiotic, but it's not something they can do anything about. So prudent USGS managers will not ask their minions to do anything for WP during work, because the managers could potentially be held liable as well as the minions, especially should the boobie pictures happen to pop up just as the Congressional oversight committee happens to be walking through. (Note that turning off images is counterproductive if part of your work is to upload images.) We likely wouldn't ever hear about such a decision, at most one might notice a scarcity of experts participating in WP.
Ironically, organizations' rules forbidding images of nudity were originally pushed by feminist and other groups combating workplace sexism...
Stan
Stan Shebs said:
Welcome to the 21st century.
Most ironic.
So prudent USGS managers will not ask their minions to do anything for WP during work, because the managers could potentially be held liable as well as the minions, especially should the boobie pictures happen to pop up just as the Congressional oversight committee happens to be walking through.
This isn't something we can do anything about. Perhaps you could try voting for some sane congresscritters.
Tony Sidaway wrote:
So prudent USGS managers will not ask their minions to do anything for WP during work, because the managers could potentially be held liable as well as the minions, especially should the boobie pictures happen to pop up just as the Congressional oversight committee happens to be walking through.
This isn't something we can do anything about. Perhaps you could try voting for some sane congresscritters.
I *knew* I shouldn't have mentioned Congress, but I was hoping you wouldn't focus on that and ignore the key point, which is that this happens at all kinds of companies and government agencies around the world whether you like it or not. It's just not as simple as declaring everybody who disagrees with your attitude "idiotic" and "not sane".
Stan
This isn't something we can do anything about. Perhaps you could try voting for some sane congresscritters.
I *knew* I shouldn't have mentioned Congress, but I was hoping you wouldn't focus on that and ignore the key point, which is that this happens at all kinds of companies and government agencies around the world whether you like it or not. It's just
We would also like for the Saudi Geological Survey's professionals to contribute data to Wikipedia. I've heard that they are a very open-minded bunch. However, SGS is attached to the Ministry of Petroleum and Mineral Resources and it would be very unfortunate if something like http://en.wikipedia.org/wiki/Woman popped up while Fahd bin Abdul Aziz visits the office.
Piggybacking because for some reason I missed Stan's reply.
BJörn Lindqvist said:
Stan Shebs said:
This isn't something we can do anything about. Perhaps you could try voting for some sane congresscritters.
I *knew* I shouldn't have mentioned Congress, but I was hoping you wouldn't focus on that and ignore the key point, which is that this happens at all kinds of companies and government agencies around the world whether you like it or not. It's just
I can't imagine the Ordnance Survey adopting the attitude you describe, somehow. If they wanted to contribute stuff I expect they'd just get on with it. However, yours is a very strange argument. Wouldn't the Chinese government just love to contribute some geographical data? Well, I guess they may have a shopping list of things they would like removed from Wikipedia first. I know, let's guess what bits they wouldn't like, and remove them first, to encourage their geographers to contribute to Wikipedia...
BJörn Lindqvist wrote:
This isn't something we can do anything about. Perhaps you could try voting for some sane congresscritters.
I *knew* I shouldn't have mentioned Congress, but I was hoping you wouldn't focus on that and ignore the key point, which is that this happens at all kinds of companies and government agencies around the world whether you like it or not. It's just
We would also like for the Saudi Geological Survey's professionals to contribute data to Wikipedia. I've heard that they are a very open-minded bunch. However, SGS is attached to the Ministry of Petroleum and Mineral Resources and it would be very unfortunate if something like http://en.wikipedia.org/wiki/Woman popped up while Fahd bin Abdul Aziz visits the office.
Indeed that's a very good point. The way I look at it is purely utilitarian; would the Saudis be able to contribute a large amount of good new content? If so, I would look for ways to accommodate their sensitivities. After all, WP is supposed to be an encyclopedia for everybody, not just another vehicle for Europeans and Americans to impose their cultural standards on the rest of the world, right?
To put it another way, if you had to make a choice between a Kate Winslet picture and a thousand subject matter experts working on WP fulltime, which would you go for? I'd delete the image in a second.
Stan
Stan Shebs wrote:
Tony Sidaway wrote:
Stan Shebs said:
It's not so much whether people know how to operate the web browser, it's that it makes referring to WP an unduly risky activity. For instance, suppose I'm in the office of a less-clueful boss, and am trying to get the boss to understand a detail, and I know WP has a good explanation; it's not going to help my case if I have to ask the boss to turn off image display before I have him/her bring up a WP page.
Precisely. You're blaming Wikipedia for your boss's cluelessness.
Are you fifteen years old or what? I'm describing the realities of the workplace - responsible adults with mortgages and families depending on them don't get themselves fired just to make a free speech point. You can fulminate about it all you want, but that's not going to change anything.
The only workable solution is to get a text-only version.
A WP with nudity on every page (which sounds great to me actually, much more interesting than what we have now :-) - I can contribute scans of my "nudes on postage stamps") is certainly doable, and will no doubt have a contingent characterizing the result as a "great encyclopedia", but its readership will be rather limited. You can expect to see a great many forks at that point, and most editors will likely migrate to one of them, because they're going to go where the readers are to be found. So the situation will resolve itself in favor of a somewhat restrictive image policy sooner or later; just a question of whether you want to do it now, or after a painful forking process.
Forking for nudity would essentially create pornopedia - and then that'd fork to...
The reality of the stupid, hypocritical world some people live in - anyone remember the Super Bowl thing? Imagine if that'd been (any image discussed on this mailing list). Now stop it, before you raise my wikistress any further.
Sj said:
Perhaps we can identify some reasonable notion of "work-safeness"
Let me be clear that this is not limited to images, or to nudity (hemi-semi or otherwise).
An all-text page about the last decade of FCUK ads from the French Connection, which included a 600px version of their text FCUK logo, would also not be work-safe.
As we start to improve the inclusion of sound and video, both linked and streaming, we will have to find ways to warn people of all manner of media that are not work-safe; pages which will automatically play loud sounds (the sound of a car crash as you open [[auto accident]], perhaps?) would fall into the same category.
As we start to improve the inclusion of sound and video, both linked and streaming, we will have to find ways to warn people of all manner of media that are not work-safe; pages which will automatically play loud sounds (the sound of a car crash as you open [[auto accident]], perhaps?) would fall into the same category.
Pages that automatically play sounds are a really bad idea. Think of the poor kids in Africa on dialup.
geni wrote:
As we start to improve the inclusion of sound and video, both linked and streaming, we will have to find ways to warn people of all manner of media that are not work-safe; pages which will automatically play loud sounds (the sound of a car crash as you open [[auto accident]], perhaps?) would fall into the same category.
Pages that automatically play sounds are a really bad idea. Think of the poor kids in Africa on dialup.
Even many of us on broadband are likely to find this unappealing. Automatically playing embedded sounds in a page ranks up there with automatically opening popup windows on my list of internet annoyances...
-Mark
On 4/14/05, Delirium delirium@hackish.org wrote:
geni wrote:
As we start to improve the inclusion of sound and video, both linked and streaming, we will have to find ways to warn people of all manner of media that are not work-safe; pages which will automatically play loud sounds (the sound of a car crash as you open [[auto accident]], perhaps?) would fall into the same category.
Even many of us on broadband are likely to find this unappealing. Automatically playing embedded sounds in a page ranks up there with automatically opening popup windows on my list of internet annoyances...
Oh, you must be one of those people who can't operate a web browser...
On Thu, Apr 14, 2005 at 04:44:32AM -0400, Sj wrote:
Sj said:
Perhaps we can identify some reasonable notion of "work-safeness"
Let me be clear that this is not limited to images, or to nudity (hemi-semi or otherwise).
Whose workplace? I know a couple who work in a sex-toy shop. Their work involves using words like "clitoris" and "fellatio" (though they'd almost certainly use earthier terms) all day.
Attempting to reduce Wikipedia to some kind of lowest common denominator of inoffensiveness is just as absurd on the grounds of "being work-safe" as it is on the grounds of "being child-safe". It is an impossible goal; but attempting to pursue it can cause untold harm to the project in the form of chilling effects, irreconcilable disputes, and biased coverage.
Wikipedia doesn't need censorious categorization. We are getting on _just fine_ without it. Speculations of the sort, "If we don't put up some kind of censorship, we could get sued!" are the "Niger yellowcake" of this discussion -- a nonexistent threat that is being pounded on to justify a preexisting agenda.
Whose workplace? I know a couple who work in a sex-toy shop. Their work involves using words like "clitoris" and "fellatio" (though they'd almost certainly use earthier terms) all day.
I think we all know what is being referred to.
Attempting to reduce Wikipedia to some kind of lowest common denominator of inoffensiveness is just as absurd on the grounds of "being work-safe" as it is on the grounds of "being child-safe". It is an impossible goal; but attempting to pursue it can cause untold harm to the project in the form of chilling effects, irreconcilable disputes, and biased coverage.
Evidence?
Wikipedia doesn't need censorious categorization. We are getting on _just fine_ without it. Speculations of the sort, "If we don't put up some kind of censorship, we could get sued!" are the "Niger yellowcake" of this discussion -- a nonexistent threat that is being pounded on to justify a preexisting agenda.
Evidence? Are you a lawyer? Are you giving your formal legal opinion?
On Thu, Apr 14, 2005 at 05:18:07PM +0100, geni wrote:
Attempting to reduce Wikipedia to some kind of lowest common denominator of inoffensiveness is just as absurd on the grounds of "being work-safe" as it is on the grounds of "being child-safe". It is an impossible goal; but attempting to pursue it can cause untold harm to the project in the form of chilling effects, irreconcilable disputes, and biased coverage.
Evidence?
It's been rehashed here time and again. The evidence of the massive conflicts over [[Clitoris]] and other articles should make it clear that giving people more stuff to fight over is NOT a recipe for a peaceful Wikipedia.
Hell, Wikipedia contributors manage to have flamewars over whether an article should have an {{NPOV}} dispute tag on it. If we can have a dispute over *whether a dispute exists* then I don't think we can be trusted to label articles _obscene_ or not -- that's just an invitation for worse conflicts.
What's more, these conflicts are basically POV in nature. People have fundamentally different opinions about what is appropriate for children, or for office workers for that matter. Inviting people to fuss over whether a given article should be tagged "child-safe" or "work-safe" is just not a very good idea for the civility and advancement of the project.
Chilling effects are a known problem with any restriction on expression. Because people do not want to run afoul of the restriction, they self- censor expression that comes close to the boundary. On Wikipedia, this would mean that people would tend to avoid contributing particular material to articles because they didn't want the article to be recategorized as "not child-safe" -- even though the added material might be highly informative and useful. Our purpose here is to produce an informative encyclopedia -- and since a "child-safe" attitude endangers that goal by dint of chilling effects, "child-safety" is not a compatible goal to seek.
Wikipedia doesn't need censorious categorization. We are getting on _just fine_ without it. Speculations of the sort, "If we don't put up some kind of censorship, we could get sued!" are the "Niger yellowcake" of this discussion -- a nonexistent threat that is being pounded on to justify a preexisting agenda.
Evidence? Are you a lawyer? Are you giving your formal legal opinion?
Of course not. But we shouldn't be taking actions "because Wikipedia could get sued" without advice from the Foundation's lawyers for that very reason -- it isn't our job; we're likely to have it wrong; and we may even expose the project to *more* risk thereby.
Uninformed speculation about legal matters leads to all kinds of moronic conclusions. Just take a look at some of the nonsensical speculation about open-source software licensing out there. My point is that we should *NOT* be basing our actions on speculations of risks that have simply not been demonstrated.
Karl A. Krueger wrote:
[...]
My point is that we should *NOT* be basing our actions on speculations of risks that have simply not been demonstrated.
So you're saying Sean actually has to lose his job before you'll believe he's taking a risk by using WP at work?
Stan
On Thu, Apr 14, 2005 at 11:41:07AM -0700, Stan Shebs wrote:
Karl A. Krueger wrote:
My point is that we should *NOT* be basing our actions on speculations of risks that have simply not been demonstrated.
So you're saying Sean actually has to lose his job before you'll believe he's taking a risk by using WP at work?
No.
The statement under doubt was not whether people might lose their jobs for reading Wikipedia. It was whether Wikimedia could be found liable in court for people's losing their jobs for reading Wikipedia; and also whether posting "not safe for work" notices would ameliorate this situation.
There's no doubt that people might lose their job for reading Wikipedia when they should be working. Nudity doesn't even enter into it.
However, I do not think we should believe that Wikimedia would be found liable for anyone losing their job over Wikipedia content -- or even that anyone would be likely to sue Wikimedia over such an issue -- unless we hear so from the Board or the Foundation's lawyers.
I have another related concern. People seem to occasionally bring up these purported legal risks as an attempt to get their way on an issue of Wikipedia governance or policy. While this is clearly not a violation of the _letter_ of the [[Wikipedia:No legal threats]] policy, it seems to me to border on infringement of the _spirit_ of that policy.
In these cases, the contributor is not *themselves* threatening to sue if they don't get their way. Rather, they are claiming that if they do not get their way, that *someone else* is likely to sue. This seems to me to be a "legal threat by proxy" or some such, and it does not seem like a very honorable way to go about a policy discussion.
Karl A. Krueger (kkrueger@whoi.edu) [050415 05:47]:
I have another related concern. People seem to occasionally bring up these purported legal risks as an attempt to get their way on an issue of Wikipedia governance or policy. While this is clearly not a violation of the _letter_ of the [[Wikipedia:No legal threats]] policy, it seems to me to border on infringement of the _spirit_ of that policy.
In these cases, the contributor is not *themselves* threatening to sue if they don't get their way. Rather, they are claiming that if they do not get their way, that *someone else* is likely to sue. This seems to me to be a "legal threat by proxy" or some such, and it does not seem like a very honorable way to go about a policy discussion.
Seconded. Cut this crap unless it has some really quite solid backing.
- d.
It's been rehashed here time and again. The evidence of the massive conflicts over [[Clitoris]] and other articles should make it clear that giving people more stuff to fight over is NOT a recipe for a peaceful Wikipedia.
Hell, Wikipedia contributors manage to have flamewars over whether an article should have an {{NPOV}} dispute tag on it. If we can have a dispute over *whether a dispute exists* then I don't think we can be trusted to label articles _obscene_ or not -- that's just an invitation for worse conflicts.
No. You have shown that the images already cause conflicts; you have failed to show that any method of labeling would increase them.
What's more, these conflicts are basically POV in nature. People have fundamentally different opinions about what is appropriate for children, or for office workers for that matter. Inviting people to fuss over whether a given article should be tagged "child-safe" or "work-safe" is just not a very good idea for the civility and advancement of the project.
So what? {{cleanup}}, {{wikify}}, {{stub}}, {{substub}}: they are all POV.
Chilling effects are a known problem with any restriction on expression. Because people do not want to run afoul of the restriction, they self-censor expression that comes close to the boundary. On Wikipedia, this would mean that people would tend to avoid contributing particular material to articles because they didn't want the article to be recategorized as "not child-safe" -- even though the added material might be highly informative and useful. Our purpose here is to produce an informative encyclopedia -- and since a "child-safe" attitude endangers that goal by dint of chilling effects, "child-safety" is not a compatible goal to seek.
In my experience people on Wikipedia play right up to any boundary. If you don't believe me, see [[WP:AN/3RR]].
Of course not. But we shouldn't be taking actions "because Wikipedia could get sued" without advice from the Foundation's lawyers for that very reason -- it isn't our job; we're likely to have it wrong; and we may even expose the project to *more* risk thereby.
But you went beyond that. You went so far as to say the person was wrong.
Uninformed speculation about legal matters leads to all kinds of moronic conclusions. Just take a look at some of the nonsensical speculation about open-source software licensing out there. My point is that we should *NOT* be basing our actions on speculations of risks that have simply not been demonstrated.
No, we should find out whether those risks exist or not. You made an absolute statement that they did not.
Perhaps we can identify some reasonable notion of "work-safeness"
No. It's a dumb term only used by people who can't operate a web browser.
Wow, that's an amazingly helpful comment, Tony. I'm so glad you're here; you're such a valuable contributor to this project.
Sean Barrett wrote:
Perhaps we can identify some reasonable notion of "work-safeness"
No. It's a dumb term only used by people who can't operate a web browser.
Wow, that's an amazingly helpful comment, Tony. I'm so glad you're here; you're such a valuable contributor to this project.
That's enough, both of you. Sollog is winning. I'm not calling either of you Sollog; but this is *exactly* the kind of thing that the detractors of the Wikipedia want - for the project to fall apart through petty flame wars. We've already lost enough valuable contributors - we don't need you to go as well. Now take a deep breath, and forget about this whole thing. It's not worth it. The topic of content suitability has been raised before, will be raised again, and I suggest that we all ignore it until a suitable proposal from the Wikimedia foundation is put forth.
That's enough, both of you. Sollog is winning. I'm not calling either of you Sollog; but this is *exactly* the kind of thing that the detractors of the Wikipedia want - for the project to fall apart through petty flame wars. We've already lost enough valuable contributors - we don't need you to go as well. Now take a deep breath, and forget about this whole thing. It's not worth it. The topic of content suitability has been raised before, will be raised again, and I suggest that we all ignore it until a suitable proposal from the Wikimedia foundation is put forth.
Forget about it? Are you seriously suggesting that the [[NSFW]] article is acceptable? It is barely even a stub.
geni wrote:
That's enough, both of you. Sollog is winning. I'm not calling either of you Sollog; but this is *exactly* the kind of thing that the detractors of the Wikipedia want - for the project to fall apart through petty flame wars. We've already lost enough valuable contributors - we don't need you to go as well. Now take a deep breath, and forget about this whole thing. It's not worth it. The topic of content suitability has been raised before, will be raised again, and I suggest that we all ignore it until a suitable proposal from the Wikimedia foundation is put forth.
Forget about it? Are you seriously suggesting that the [[NSFW]] article is acceptable? It is barely even a stub.
I meant the argument about people knowing how to use their browsers - possibly the argument about the image as well - but articles always need improving, and that is the point of Wikipedia, *not* flame wars about whose pictures are showing what. Since [[NSFW]] is a stub, I invite everyone to improve it - heck, nominate it as COTW even.
On 4/14/05, Alphax alphasigmax@gmail.com wrote:
Sean Barrett wrote:
Perhaps we can identify some reasonable notion of "work-safeness"
No. It's a dumb term only used by people who can't operate a web browser.
Wow, that's an amazingly helpful comment, Tony. I'm so glad you're here; you're such a valuable contributor to this project.
That's enough, both of you. Sollog is winning. I'm not calling either of you Sollog; but this is *exactly* the kind of thing that the detractors of the Wikipedia want - for the project to fall apart through petty flame wars. We've already lost enough valuable contributors - we don't need you to go as well. Now take a deep breath, and forget about this whole thing. It's not worth it. The topic of content suitability has been raised before, will be raised again, and I suggest that we all ignore it until a suitable proposal from the Wikimedia foundation is put forth.
--
Alphax,
I agree that the petty flame-wars -- including bickering and thinly-disguised name-calling -- need to stop. It seems that some email messages have little purpose beyond one-upmanship; they don't seem to contribute anything to the goal of creating a great encyclopedia.
But I don't know that we can simply ignore this issue until the foundation makes a proposal. I'm not sure that the foundation is inclined to do that. Nor am I certain that it would be desirable for them to do so.
I believe that the Wikipedia community still has enough thoughtful people that a workable consensus can be achieved on controversial issues such as this. It will take time and effort, and it will take some of the same skills that we use to create NPOV articles. Isn't the ability to see and understand POVs other than your own important, if not essential, to our task?
However, that particular skill seems to be quite lacking in some recent conversations. I'm guessing that many involved Wikipedians who have that skill in abundance are not interested in engaging in the type of dialog that has been common of late. I suspect they're busily working on generating NPOV content. Perhaps when the flames die down, some real progress can be made.
-- Rich Holton
[[W:en:User:Rholton]]
It seems that there are some who want to keep Wikipedia work-safe, and others who want to avoid censorship. I think it would be a good compromise if we tagged the images individually by putting them in a category, and somehow allowed users to filter out certain images in their preferences. This would require a patch in MediaWiki, but I think it would be a good compromise, since:
a) It keeps Wikipedia work-safe/child-safe, for those who want it to be
b) It's not censoring Wikipedia, since people are opting IN to it
c) People won't have to browse with images off to avoid being surprised by offensive images
d) It doesn't affect people who don't mind these images, so we're still providing information the same way it would be otherwise; we're just giving people more options
e) It's easier, more precise, and more effective than turning off all images
The old argument is true: if you don't want to see a clitoris, then don't go to [[clitoris]]. However, I think that especially if there are semi-nude and other potentially offensive pictures in seemingly random places, it's important to give people the liberty to view any images they want. Also, if I want to learn more about what something like [[autofellatio]] is, I should be able to choose to display images as a link if I want. Even if I want to learn about [[autofellatio]], I might not want to have a picture of a guy masturbating appear right next to the information, and I should be given the liberty to choose to omit the image without turning all images off (which many people don't know how to do).
This would also solve the problem of not being able to censor images out of the fear of being forced to censor images that offend very small groups. I'm referring a bit to the [[Owl]]/Navajo argument here. While most see it as ridiculous to hide images of owls on [[Owl]] just because it offends a small minority, I think that it would be beneficial if we allow people to individually decide whether they want these types of images or not. If you think that's censorship, I disagree, because it's not banning any content, it's just giving users more power over the display of the images than they previously had. Since it doesn't disrupt 'normal' users, I think it would be a great compromise to this never-ending problem.
--Faraaz Damji (Frazzydee)
Blog: http://frazzydee.ca
-----BEGIN GEEK CODE BLOCK----- Version: 3.1 GCS d? s:- a--- C+++ UL++ P+ L+ E---- W++ N+ o+ K+ w+ O? M-- V? PS++ PE Y PGP++ t 5-- X+ R tv b++ DI++ D+ G++ e- h! !r !z ------END GEEK CODE BLOCK------
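[Editor's note: Faraaz's opt-in proposal above is concrete enough to sketch. The toy function below is purely illustrative -- it is not actual MediaWiki code, and the category names and markup are invented -- but it shows the intended behaviour: an image whose categories intersect the reader's opted-out set is rendered as a plain link instead of inline, so nothing is removed from the encyclopedia.]

```python
# Toy sketch of the opt-in image filter Faraaz describes: images carry
# descriptive categories, readers opt in to hiding some, and a hidden
# image renders as a click-through link rather than inline. Hypothetical
# code; not MediaWiki's real rendering pipeline.

def render_image(filename, image_categories, user_hidden_categories):
    """Return an inline <img> tag, or a bare link if the reader opted out."""
    if set(image_categories) & set(user_hidden_categories):
        # Opted-out image: the reader can still click through to see it.
        return '<a href="/wiki/Image:%s">%s (image hidden by your preferences)</a>' % (
            filename, filename)
    return '<img src="/images/%s" alt="%s">' % (filename, filename)

# A reader who opted out of nudity gets a link, not the picture:
print(render_image("Example.jpg", ["nudity"], ["nudity", "violence"]))
# A reader with nothing hidden gets the image inline:
print(render_image("Example.jpg", ["nudity"], []))
```

Note that this is opt-in at the reader's end, which is point (b) above: the default view is unchanged, and no content is deleted.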
Sj wrote:
Perhaps we can identify some reasonable notion of "work-safeness" and work to make the user experience work-safe for all of our readers. Our primary goal should be to make the encyclopedia a useful and reliable resource, for as many people as possible, in as many environments as possible. This includes making it reliably safe to let one's children use the encyclopedia, to refer to articles for illustration in the middle of a discussion or presentation, regardless of audience, and to browse the encyclopedia at work.
I am not sure how to address the first case above - providing a site that parents would feel comfortable letting their 6-year-old browse. But I think we can deal with the last two scenarios pretty well without a great deal of work.
At each step along the way, we can assume that the reader is '''doing his or her best''' to avoid looking up undesired pages or content. Editors should avoid surprising users by showing them content they did not expect, and do not want, to see on a particular subject. For instance, we have {{spoiler}} messages. We should have the same kinds of messages for potentially offensive content [somehow I thought we did by now, but it seems this is not so.]
I've added a few templates ('obscenity', 'graphicviolence', and 'nonworksafe') to the end of the General templates page; please see whether the text is appropriate, and add them to articles where needed. http://en.wikipedia.org/wiki/Wikipedia:Template_messages/General
I think it is fair to assume that readers are doing all they can /not/ to come across content that they or their employers find inappropriate. It is also polite to warn readers when they are about to be shocked.
1. If the title of the article does *not* make it clear that it is not worksafe, or about violent or obscene or nude subjects; or
2. If the content of one section, despite its relevance to the article as a whole, is unusually graphic, or potentially offensive/startling;
then a spoiler-type warning is probably appropriate.
On 4/13/05, Tony Sidaway minorityreport@bluebottle.com wrote:
Sean Barrett said:
For me, the largest part of the problem with the Kate Winslet nude is that it renders all of Wikipedia non-work-safe.
What does "work-safe" mean in this context? If it's only the pictures, why do you browse wikipedia with image downloads turned on?
Images are a significant portion of the encyclopedia. No-one should have to turn images off simply because of a few poorly-announced and surprisingly-placed "non-work-safe" images.
Faraaz Damji said:
It seems that there are some who want to keep Wikipedia work-safe, and others who want to avoid censorship.
Not really. There are those who promote the concept of "work safe" and those who use their browsers in a manner that does not conflict with their employers' standards. Censorship has nothing to do with it.
Not really. There are those who promote the concept of "work safe" and those who use their browsers in a manner that does not conflict with their employers' standards. Censorship has nothing to do with it.
Prove it. You have already claimed that they don't know how to use their browsers without providing any evidence.
On Friday, April 15, 2005 12:31 AM, geni geniice@gmail.com wrote:
Not really. There are those who promote the concept of "work safe" and those who use their browsers in a manner that does not conflict with their employers' standards. Censorship has nothing to do with it.
Prove it. You have already claimed that they don't know how to use their browsers without providing any evidence.
Geni, dear, the point that he is so obviously making with a subtlety that even surpasses a horde of rhinos charging and engaging with an ocean liner perched on a precipice, but that you seem fixated on requiring him to spell out in big flashing shiny letters, is that, were one able to "know how to use their browsers" (by which it is meant, were one aware of and able to use some form of functionality in their browser that prevented images from being downloaded unless specifically requested), then one desirous of not being surprised by images being loaded that one did not want would, indeed, engage such a device.
Happy?
Please stop this tiresome pettiness in demanding "answers" to rhetorical questions.
Yours,
Tony Sidaway wrote:
Not really. There are those who promote the concept of "work safe" and those who use their browsers in a manner that does not conflict with their employers' standards.
But wouldn't it be better if we let people choose which types of images they wanted to display? Turning off all images removes all the images in the article, while a content-filtering system would be specific to images deemed offensive by the reader. Also, they won't have to browse with all images off, but can instead choose to see some images as links.
If this is more widely used, it might even allow us *more* flexibility in displaying images, like in [[autofellatio]] (just an example!), since we can now use the argument that if you don't want to see it just turn the blocking feature on.
And let's get real here: Not everybody knows how to turn images off. Sure, you can say they 'should' know, and we shouldn't cater to their needs, but isn't it a step ahead if we can make the site more accessible to people?
-- Blog: http://frazzydee.ca
Tony Sidaway wrote:
But wouldn't it be better if we let people choose which types of images they wanted to display? Turning off all images removes all the images in the article, while a content-filtering system would be specific to images deemed offensive by the reader.
This really is the only long term solution. We add some buttons in Special:Preferences, and have two or three options -- e.g. uncensored/moderate/strict. Explicit content would automatically be filtered from then on, or not.
It would involve more work coding, and more work maintaining tags on explicit content, and arguments about whether [[Image:Y]] should get tagged as explicit or not, and so on. But there's no other long-term solution I can think of, because we do need to have boobs and clitorises in the appropriate places, but people do need to have a way not to see them if they'd rather not. (Making the user disable images doesn't count as a solution.) A fork on the basis of censorship would be unlikely after that.
Zach
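[Editor's note: Zach's two-or-three-setting idea can be modelled as a mapping from a preference level to a set of content tags to suppress. The sketch below is a rough illustration only; the level names come from his message, but the tag vocabulary is invented.]

```python
# Rough sketch of Zach's uncensored/moderate/strict preference levels.
# Each level names the set of content tags it suppresses; "uncensored"
# suppresses nothing. The tag names here are invented for illustration.
FILTER_LEVELS = {
    "uncensored": set(),
    "moderate": {"explicit-sexual"},
    "strict": {"explicit-sexual", "nudity", "graphic-violence"},
}

def is_suppressed(image_tags, level):
    """True if an image carrying image_tags is hidden at this preference level."""
    return bool(set(image_tags) & FILTER_LEVELS[level])

print(is_suppressed({"nudity"}, "uncensored"))  # False
print(is_suppressed({"nudity"}, "strict"))      # True
```

As SPUI points out just below, a one-word level name like "strict" hides a POV judgement; descriptive tags such as "bare female breasts" push the POV out of the software and into the reader's own choice of which tags to suppress.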
Zach Alexander wrote:
Tony Sidaway wrote:
But wouldn't it be better if we let people choose which types of images they wanted to display? Turning off all images removes all the images in the article, while a content-filtering system would be specific to images deemed offensive by the reader.
This really is the only long term solution. We add some buttons in Special:Preferences, and have two or three options -- e.g. uncensored/moderate/strict. Explicit content would automatically be filtered from then on, or not.
It would involve more work coding, and more work maintaining tags on explicit content, and arguments about whether [[Image:Y]] should get tagged as explicit or not, and so on. But there's no other long term solution I can think of, because we do need to have boobs and clitorises in the appropriate places, but people do need to have a way not to see them if they'd rather not. (Making the user disable images doesn't count as a solution.) A fork on the basis of censorship would be unlikely after then.
"Explicit" is majorly POV. However, it would be possible to have NPOV labels like "bare female breasts" and "human blood". There's been some discussion at http://meta.wikimedia.org/wiki/End-user_image_suppression .
SPUI wrote:
Zach Alexander wrote:
We add some buttons in Special:Preferences, and have two or three options -- e.g. uncensored/moderate/strict. Explicit content would automatically be filtered from then on, or not. It would involve more work coding, and more work maintaining tags on explicit content, and arguments about whether [[Image:Y]] should get tagged as explicit or not, and so on. But there's no other long term solution I can think of, because we do need to have boobs and clitorises in the appropriate places, but people do need to have a way not to see them if they'd rather not. (Making the user disable images doesn't count as a solution.) A fork on the basis of censorship would be unlikely after then.
"Explicit" is majorly POV. However, it would be possible to have NPOV labels like "bare female breasts" and "human blood". There's been some discussion at http://meta.wikimedia.org/wiki/End-user_image_suppression .
You're right -- I realized there was some POV there, but didn't realize just how much, until Karl explained why on the other thread. In my reply to him I said something along similar lines to what you seem to be suggesting.
Zach
Zach Alexander said:
SPUI wrote:
"Explicit" is majorly POV. However, it would be possible to have NPOV labels like "bare female breasts" and "human blood". There's been some discussion at http://meta.wikimedia.org/wiki/End-user_image_suppression .
You're right
I don't get this. Explicit is just a synonym for "overt". In the picture we've been discussing, Kate Winslet's left breast is explicitly bare, whereas her right breast is only bare by implication. It's a completely neutral word, though its meaning in the phrase "explicit content" is difficult to decipher. This reader is left scratching his head and wondering if the phrase has something to do with dark matter.
Tony Sidaway wrote:
Zach Alexander said:
SPUI wrote:
"Explicit" is majorly POV. However, it would be possible to have NPOV labels like "bare female breasts" and "human blood". There's been some discussion at http://meta.wikimedia.org/wiki/End-user_image_suppression .
You're right
I don't get this. Explicit is just a synonym for "overt". In the picture we've been discussing, Kate Winslet's left breast is explicitly bare, whereas her right breast is only bare by implication. It's a completely neutral word, though its meaning in the phrase "explicit content" is difficult to decipher. This reader is left scratching his head and wondering if the phrase has something to do with dark matter.
I'm all for good use of language, and it would seem that SPUI has let "explicit" run adrift. Even to say that it is a synonym for "overt" does not tell the whole story, since there is also the meaning of something being clearly stated or demonstrated. If the picture clearly shows Miss Winslet's left breast, it is explicitly demonstrated. The picture, however, does not show her right breast, which would be in a darkened part of the picture. The absence of clothing and her general posture might lead one to conclude that her right breast is implicitly bare.
Ec
Zach Alexander said:
But there's no other long term solution I can think of, because we do need to have boobs and clitorises in the appropriate places, but people do need to have a way not to see them if they'd rather not. (Making the user disable images doesn't count as a solution.)
We don't make the user do anything, but if he wants a censored Wikipedia he already has the capability to produce his own; we're hearing from people who for whatever reason are unwilling to do so and think that it is Wikipedia that should change to accommodate their intransigence. Thus a personal problem has been incorrectly reframed as a problem for Wikipedia.
On Fri, Apr 15, 2005 at 12:45:04PM +0100, Tony Sidaway wrote:
We don't make the user do anything, but if he wants a censored Wikipedia he already has the capability to produce his own; we're hearing from people who for whatever reason are unwilling to do so and think that it is Wikipedia that should change to accommodate their intransigence. Thus a personal problem has been incorrectly reframed as a problem for Wikipedia.
I think it's been made tolerably obvious at this point that anyone does have the option to disable images, *and* that users of well-designed browsers are frequently even capable of displaying or refusing images on a per-site basis. This likely doesn't need to be reiterated.
I'm curious about the proposals for free-form tagging systems, where users would have the option to block out images or text matching specific sets of tags that they chose. I'm not sure that any such system could actually meet Wikipedia's needs and at the same time fulfill the desires of those who call for greater restriction on images.
To start with, it's not clear to me that such a system could do what some people are asking for -- that is, creating a view of Wikipedia that would be "work-safe" or otherwise fitting some particular content censorship standard. That goal seems impossible unless a tagging system suiting that standard were *mandatory* -- that is, any edit violating the tagging system would have to be considered abusive. This is not a rule that I would expect to garner community consensus, especially since the judgment of what content merits a particular tag would be POV-laden.
(Consider: What is the point of a "violence" tag if people who post images of violence are not required to use it? People can depend on such a system only if it is mandatory, but making it mandatory violates Wikipedia principles ... especially since people can be expected to disagree on what deserves the label "violence".)
I also think it's pretty clear that *no* such system will suit the goals of those who really do want to marginalize or exclude "evil" expression from the public arena. Those who actually want to _remove_ nudity or violence or sex education or "cult" content from Wikipedia, under the belief that such material is harmful to the public morality, will not be satisfied by mere tagging.
Another concern: Will the presence of a tagging system legitimize the behavior of people who really _do_ simply want to post nudity (or whatever) for its own sake? We already have the problem that people are more ready to talk about an image's offensiveness than its relevance. A tagging system might give an additional argument to people who simply want to put more nudity or violence or what-have-you on Wikipedia, and to hell with relevance: "If you don't want to see it, go turn on more censorship in your filtering preferences." This would be unfortunate.
Yet another possible consequence is that people could use a tagging system as a way of searching Wikipedia for all the "naughty images". Let us imagine a curious adolescent who today browses Wikipedia looking for Renaissance nude paintings, swimsuit images like those on [[Bikini]], the sketches of sexual positions, and so on. By aggregating all the nudity in an easy-to-find category, tagging would make it _easier_, not harder, to use Wikipedia as a source of titillation.
This could even lend itself to greater calls for the deletion of such material -- since it would make it trivial to construct a "nudes of Wikipedia" view that would highlight just that particular content.
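[Editor's note: Karl's last point is easy to illustrate. Any machine-readable tag that lets software *hide* images also lets software *list* them: the same index powers both queries. The toy sketch below uses invented image names and tags.]

```python
# Toy illustration of Karl's point: the tag index that powers opt-out
# filtering also answers "show me everything tagged nudity" for free.
# Image names and tags here are invented for the example.
TAG_INDEX = {
    "KateWinsletTitanic.jpg": {"nudity"},
    "Owl.jpg": set(),
    "Bikini.jpg": {"swimwear"},
}

def images_with_tag(tag):
    """Aggregate every image carrying a given tag -- trivial once tags exist."""
    return sorted(name for name, tags in TAG_INDEX.items() if tag in tags)

print(images_with_tag("nudity"))  # ['KateWinsletTitanic.jpg']
```

In other words, tagging cannot be a one-way valve: whatever category makes suppression possible also makes aggregation, and hence the "nudes of Wikipedia" view Karl worries about, a one-line query.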
Faraaz Damji said:
Tony Sidaway wrote:
Not really. There are those who promote the concept of "work safe" and those who use their browsers in a manner that does not conflict with their employers' standards.
But wouldn't it be better if we let people choose which types of images they wanted to display?
We could not stop them doing so if we tried.
Faraaz Damji wrote:
The old argument is true: if you don't want to see a clitoris, then don't go to [[clitoris]].
<snip>
Also, if I want to learn more about what something like [[autofellatio]] is, I should be able to choose to display images as a link if I want. Even if I want to learn about [[autofellatio]], I might not want to have a picture of a guy masturbating appear right next to the information
Suppose I want to learn about something. I look it up in an encyclopedia. The paper kind. I might happen to see something offensive. But how do I know I could see something offensive unless I know what it is?
Your argument "if you don't want to see a clitoris, don't go to clitoris" doesn't help people who don't know what a clitoris is. Therefore, saying "I might not want to have a picture of a guy masturbating appear right next to the information" falls into the same category - you can't object to something if you don't know what it is. It's like goatse - highly offensive, but unless you know what it is, you can't avoid it.
On 4/15/05, Andrew Cranwell andrew.cranwell@student.adelaide.edu.au wrote:
the information" falls into the same category - you can't object to something if you don't know what it is. It's like goatse - highly offensive, but unless you know what it is, you can't avoid it.
This is why anything which might reasonably be considered offensive to a significant portion of our readership should be prefaced by a modest warning. Note that traditional paper encyclopedias are careful not to have *any* images or text that could be considered deeply offensive; this is the definition of dealing "tastefully" with a difficult subject.
We can offer more choices than that; we can offer less tasteful and more explicit descriptions of things; but we should not force them on unsuspecting readers.
SJ
Andrew Cranwell said:
Your argument "if you don't want to see a clitoris, don't go to clitoris" doesn't help people who don't know what a clitoris is. Therefore, saying "I might not want to have a picture of a guy masturbating appear right next to the information" falls into the same category - you can't object to something if you don't know what it is. It's like goatse - highly offensive, but unless you know what it is, you can't avoid it.
Unless you see it and react to it, how can you know that you'll be offended by it? Don't ask me to second-guess your reaction if you don't know yourself.
On 4/16/05, Tony Sidaway minorityreport@bluebottle.com wrote:
Andrew Cranwell said: Unless you see it and react to it, how can you know that you'll be offended by it? Don't ask me to second-guess your reaction if you don't know yourself.
Fallacious reasoning. There are some things which are reasonably likely to offend the vast majority of people on this Earth. Goatse is a fine example. Autofellatio is, most probably, another. It's hardly extreme to note that these are thus quite likely to offend people, and to link them accordingly.
-- ambi
Rebecca said:
On 4/16/05, Tony Sidaway minorityreport@bluebottle.com wrote:
Andrew Cranwell said: Unless you see it and react to it, how can you know that you'll be offended by it? Don't ask me to second-guess your reaction if you don't know yourself.
Fallacious reasoning. There are some things which are reasonably likely to offend the vast majority of people on this Earth. Goatse is a fine example. Autofellatio is, most probably, another. It's hardly extreme to note that these are thus quite likely to offend people, and to link them accordingly.
That may be true of goatse and even possibly autofellatio. But a nude scene from a PG-13 movie? I could not have guessed at the extreme reactions it has provoked from people. You have not demonstrated any fallacy.
That may be true of goatse and even possibly autofellatio. But a nude scene from a PG-13 movie? I could not have guessed at the extreme reactions it has provoked from people. You have not demonstrated any fallacy.
It doesn't surprise me (although it doesn't offend me in the slightest personally). Still, I think people have a point here - there *are* concerns about nudity coming up where it's unexpected. In many cases, I think linking solves the problem fairly easily. In cases such as these, though, one has to wonder if the image is really that important. While it doesn't offend me in the least, in this particular case, I'm inclined to say no.
-- ambi
Rebecca said:
In many cases, I think linking solves the problem fairly easily.
It's awkward and clunky. It may be appropriate with images that many people may find shocking, but it still defeats one of the image's major functions, which is to adorn the page and break up the text, so it's clearly inappropriate for most cases where you get piddling objections about boobs and whatnot.
Rebecca wrote:
There are some things which are reasonably likely to offend the vast majority of people on this Earth. Goatse is a fine example. Autofellatio is, most probably, another.
I'm sorry, but what a load of bullocks.
Christiaan
On 17 Apr 2005, at 9:32 pm, geni wrote:
I'm sorry, but what a load of bullocks.
Christiaan
are you seriously trying to suggest that there is a significant percentage of the earth's population that would not be offended by the goatse?
I'm saying that Rebecca's comment that, "There are some things which are reasonably likely to offend the vast majority of people on this Earth. Goatse is a fine example. Autofellatio is, most probably, another" is a load of bullocks. When someone projects their view onto 4-5 billion people, mine is a pretty rational response, I would have thought.
Christiaan
Christiaan Briggs wrote:
On 17 Apr 2005, at 9:32 pm, geni wrote:
I'm sorry, but what a load of bullocks.
Christiaan
are you seriously trying to suggest that there is a significant percentage of the earth's population that would not be offended by the goatse?
I'm saying that Rebecca's comment that, "There are some things which are reasonably likely to offend the vast majority of people on this Earth. Goatse is a fine example. Autofellatio is, most probably, another" is a load of bullocks. When someone projects their view onto 4-5 billion people, mine is a pretty rational response, I would have thought.
The expression is really "a load of ballocks", not "bullocks". That might better satisfy autofellatio-man's hunger. :-)
Ec
geni said:
I'm sorry, but what a load of bullocks.
Christiaan
are you seriously trying to suggest that there is a significant percentage of the earth's population that would not be offended by the goatse?
I think they might snicker a bit and show it to their friends.
geni said:
I think they might snicker a bit and show it to their friends.
You now appear to be trying to speak on behalf of the world's population.
The words "I think" and "might" may tip you off that I'm expressing my personal opinion--as are you.
geni said:
The words "I think" and "might" may tip you off that I'm expressing my personal opinion--as are you.
So you have nothing more than your personal opinion to base your argument on?
No. Recall that you were expressing a personal opinion of yours as if it was fact. I simply demonstrated that a different opinion could be expressed. Thus I refuted your argument by assertion, to the effect that Kate Winslet's left breast is such a revolting sight that the world's population must be protected from it.
No. Recall that you were expressing a personal opinion of yours as if it was fact. I simply demonstrated that a different opinion could be expressed. Thus I refuted your argument by assertion, to the effect that Kate Winslet's left breast is such a revolting sight that the world's population must be protected from it.
No, I was asking a question. Now if you read [[Nudity]] you will find that most of the world's population has some form of nudity taboo.
geni said:
No. Recall that you were expressing a personal opinion of yours as if it was fact. I simply demonstrated that a different opinion could be expressed. Thus I refuted your argument by assertion, to the effect that Kate Winslet's left breast is such a revolting sight that the world's population must be protected from it.
No, I was asking a question.
Ah, so we're agreed, then. I could have sworn that you said at least once that the world's population needed you to protect it from Ms Winslet's dangerous mammaries.
Now if you read [[Nudity]] you will find that most of the world's population has some form of nudity taboo.
That would be the article illustrated by Manet's Olympia, Michelangelo's David, and Goya's nude and clothed Maja? I must say I popped down to that article expecting to see something supporting your claim, but instead found an account of very diverse views and practices.
That would be the article illustrated by Manet's Olympia, Michelangelo's David, and Goya's nude and clothed Maja?
Yup, for some reason artistic nudes seem to have more acceptance
I must say I popped down to that article expecting to see something supporting your claim, but instead found an account of very diverse views and practices.
Yup, but they all seem to have some form of nudity taboo (even if it is only to limit nudity to certain areas).
geni said:
That would be the article illustrated by Manet's Olympia, Michelangelo's David, and Goya's nude and clothed Maja?
Yup, for some reason artistic nudes seem to have more acceptance
And did you happen to look at Ms Winslet's pose?
I must say I popped down to that article expecting to see something supporting your claim, but instead found an account of very diverse views and practices.
Yup, but they all seem to have some form of nudity taboo (even if it is only to limit nudity to certain areas).
Actually this is one thing that I can state with confidence is *NOT* in the article you referenced. I think we've been going around in circles for a while now. To summarise, we differ strongly about whether Ms Winslet's mammary glands are so horrific that the world needs to be protected from them. I'm confident that yours is the view of a tiny minority of people, so I won't waste any more time on it.
On 4/18/05, Tony Sidaway minorityreport@bluebottle.com wrote:
geni said:
That would be the article illustrated by Manet's Olympia, Michelangelo's David, and Goya's nude and clothed Maja?
Yup, for some reason artistic nudes seem to have more acceptance
And did you happen to look at Ms Winslet's pose?
Bit hard to tell, really; it was mostly hidden.
Actually this is one thing that I can state with confidence is *NOT* in the article you referenced. I think we've been going around in circles for a while now. To summarise, we differ strongly about whether Ms Winslet's mammary glands are so horrific that the world needs to be protected from them. I'm confident that yours is the view of a tiny minority of people, so I won't waste any more time on it.
Personally I would not have described South America as a tiny minority.
On 4/18/05, Tony Sidaway minorityreport@bluebottle.com wrote:
geni said:
I'm sorry, but what a load of bullocks.
Christiaan
are you seriously trying to suggest that there is a significant percentage of the earth's population that would not be offended by the goatse?
I think they might snicker a bit and show it to their friends.
Excellent case for editorial judgment. Unless we want to create a site that appeals to, or looks like it was created by, snickering juveniles.
Puddl Duk said:
On 4/18/05, Tony Sidaway minorityreport@bluebottle.com wrote:
geni said:
I'm sorry, but what a load of bullocks.
Christiaan
are you seriously trying to suggest that there is a significant percentage of the earth's population that would not be offended by the goatse?
I think they might snicker a bit and show it to their friends.
Excellent case for editorial judgment. Unless we want to create a site that appeals to, or looks like it was created by, snickering juveniles.
I agree. I don't see much encyclopedic value in putting up Mr goatse. I just challenge the idea that it's such a shocking picture. It's a bit pathetic, the poor man must be in considerable discomfort, and it's very demeaning. But please, let's not pretend that it's as shocking as many encyclopedic pictures that we do carry with good reason. We've even got a picture of a chap about to have his head blown off by a uniformed nutter with a revolver, and I shouldn't be surprised if we had a picture of that poor napalmed little girl (Kim Phuc Phan Thi, who fortunately was taken to hospital by the photographer and now works as a UN ambassador for children). Why do people go out of their way to denounce this or that picture as shocking, when there are photographs like that around to remind us of what is really shocking about human behavior?
I take this very seriously. What's your supporting evidence?
The suspicious speed with which they have been turning up since the clitoris incident. Not that I really care. Sooner or later this will be settled because everyone will be totally fed up (the power of apathy to settle disputes)
geni said:
Tony Sidaway said:
Kevin Rector said: I've come to the realization that people are searching out nude pictures to put in the 'pedia. They are looking to stir up trouble, mostly to make a point.
I take this very seriously. What's your supporting evidence?
The suspicious speed with which they have been turning up since the clitoris incident. Not that I really care. Sooner or later this will be settled because everyone will be totally fed up (the power of apathy to settle disputes)
Which clitoris incident are you referring to? According to evidence I prepared for Arbcom for the Dr Zen case, there has been an inline photograph on Clitoris ever since Anthere replaced the linked image by an inline in May, 2004.
Could you give examples from the upload log? Something demonstrating that there is a person or a number of persons uploading nude images in a way that suggests that people are actually "searching out nude pictures to put in the 'pedia?" I'd expect to see a pattern of certain people uploading more than one such picture and trying to sneak it into an article in an inappropriate way. Do we in fact have good evidence of this? You mention a "suspicious speed"? What is your estimate for the rate at which nude photographs are being uploaded to Wikipedia?
Which clitoris incident are you referring to? According to evidence I prepared for Arbcom for the Dr Zen case, there has been an inline photograph on Clitoris ever since Anthere replaced the linked image by an inline in May, 2004.
Sounds right. The long-running fight to keep it there is, I suspect, the reason for Arbcom involvement.
Could you give examples from the upload log? Something demonstrating that there is a person or a number of persons uploading nude images in a way that suggests that people are actually "searching out nude pictures to put in the 'pedia?"
The appearance of goatse was very close to the autofellatio incident; too close.
I'd expect to see a pattern of certain people uploading more than one such picture and trying to sneak it into an article in an inappropriate way. Do we in fact have good evidence of this?
No. I don't think anyone has uploaded more than one.
You mention a "suspicious speed"? What is your estimate for the rate at which nude photographs are being uploaded to Wikipedia?
Hard to say; one every 1-2 months, which, since there is very little PD stuff, is fast. More significant is the change in rate. For a long time uploads were very slow, with the only photos I am aware of being either paintings or the ones on the articles about the relevant areas of the human body. Do you remember how much trouble we had finding a picture with acceptable copyright for the clitoris article? It took ages. It doesn't seem to be taking that long any more.
Kevin Rector wrote
I've tried to stay out of the fray with all the nude/porn picture debates that have gone on, but I've come to the realization that people are searching out nude pictures to put in the 'pedia. They are looking to stir up trouble, mostly to make a point.
That might be correct. Try [[Mull of Kintyre rule]], created not long ago.
Charles
Charles Matthews said:
Kevin Rector wrote
I've tried to stay out of the fray with all the nude/porn picture debates that have gone on, but I've come to the realization that people are searching out nude pictures to put in the 'pedia. They are looking to stir up trouble, mostly to make a point.
That might be correct. Try [[Mull of Kintyre rule]], created not long ago.
Actually I've heard of that test. It's a venerable old thing. The article is encyclopedic and the redeployment of an existing illustration is absolutely inspired. It just goes to show that it's easy to misread intention if one doesn't know the context.
Example:
http://homepage.mac.com/kiltedpride/blogwavestudio/
The Herald October 14th 1999 - It's a Brief Encounter
(article about a famous Scottish lawyer who specializes in entertainment law, including obscenity) 'Findlay's fate was destined to be brighter. He started with small theatre contracts and one deal led to another. Today, he heads up Tods Murray's entertainment and media law team and, in his spare time, he sits on the board of the Lyceum. He has worked on films such as Trainspotting, waded through the logistical legal nightmares of organising the Edinburgh Military Tattoo, and negotiated the more delicate matters of obscenity. On the latter point, Findlay admits life was once simple. "It used to be the case that as long as it was the same as the Mull of Kintyre, so to speak, all was well with the world, but with certain relaxations on male nudity we can no longer rely on that old adage," explains Findlay.'
On 4/13/05, Charles Matthews charles.r.matthews@ntlworld.com wrote:
Kevin Rector wrote
I've tried to stay out of the fray with all the nude/porn picture debates that have gone on, but I've come to the realization that people are searching out nude pictures to put in the 'pedia. They are looking to stir up trouble, mostly to make a point.
That might be correct. Try [[Mull of Kintyre rule]], created not long ago.
Charles
Pretty good; could do with a reference or two, and the position of the Mull of Kintyre being made a bit clearer on the map.