On Wed, Feb 23, 2005 at 08:17:56PM +0800, John Lee wrote:
> Karl A. Krueger wrote:
> > Therefore, any labeling of selected categories as "potentially offensive" is an NPOV violation and unacceptable in that regard.
> I refer you to the argument regarding disambiguation pages that has already been discussed by others on this list.
> [snip]
> It's not POV, and it's actually an inconvenience for those of us looking for other articles (imagine ...), but it's accepted because the cost-to-benefit ratio is not that severe.
You're right: disambiguation isn't an NPOV problem. It also isn't terribly relevant to the issue at hand, even if you repeat that claim five times in one post. I'll respond to it just once.
Here's the difference: Disambiguation doesn't involve anyone trying to hide away material, or "protect" me from things "for my own good". It also does not involve any judgment that some material is "less suitable" or "offensive" -- only that it is less likely to be looked-for.
In contrast, any kind of censorship / filtering regime necessarily contains within it the assumption that some material is _lesser_ than others, by dint of offending the protected group ... and moreover, that some people's offense is unimportant, while other people's offense is important. It is not presented as a guess, but as a moral injunction ("this material IS offensive, it MUST BE turned off by default") which it is not within Wikipedia's purview to make.
> However, I have also pointed out before that photographs have more psychological impact than illustrations, which have more psychological impact than text (just try asking a psychologist). Therefore, a good number of people offended (or at least annoyed) by the photo formerly at [[autofellatio]] probably wouldn't feel the same way about text.
I don't think we actually have sufficient evidence on that. Again, consider _Ulysses_, a work of text (no images) that was once banned in the U.S. for obscenity.
One of the past discussions related to this issue had to do with whether Wikipedia should link to sexually explicit material offsite, rather than keeping copies of it in the database. The idea was that people who are judgmental about such material would thus not be able to decry Wikipedia as a collection of pornography. However, it seemed to me (and still does) that such folks would just as easily call Wikipedia a Web directory of pornography.
> I have never argued for catering to the whim of extremists. That's stupid. There is no reason why we shouldn't filter certain images, though.
It seems to me that when challenged, you rather readily lean on what you call "common sense". I'm not sure what you mean by "common sense", since some of the points we're discussing are actually pretty deep political ideas, not everyday issues at all. "Common sense" usually means people's knowledge of the everyday, like "things fall when you drop them".
It sounds to me kind of like you're using "common sense" to mean that everyone reasonable must agree with your assumptions, and anyone who disagrees is an "extremist" and their views should be ignored.
I don't think that's a very convincing mode of argument.
> It's the cost-to-benefit ratio we're talking about here: You're far more likely to find someone damn pissed to be looking at a photograph of a clitoris than to find someone damn pissed to be looking at a photograph of a caterpillar.
That position generalizes very poorly to other classes of articles, though. (And generalizability is a must for rules.) The fact that people get "damn pissed" about something is no reason not to keep it in an encyclopedia. There are an awful lot of _historical facts_ it's worth being damn pissed about.
Moreover, it isn't clear to me that we can do a cost-benefit analysis of the principles on which Wikipedia was founded, such as NPOV. Neutrality is not just a benefit, but a basic ground-rule of Wikipedia.
(NPOV for Wikipedia is not just a "benefit", but rather a "boundary". We don't ask, "Is this violation of NPOV worth what it gets us?" Rather, we take it as read that any violation of NPOV is a bad thing, and we try to avoid such violations wherever possible rather than making excuses for them. We don't always _succeed_, but we don't give up.)
> Besides, it's not really an imposition of someone's POV as to what is offensive. It's an editorial decision about what images would be potentially offensive or greatly distracting to a substantial number of people.
Sure, and I'll just go make an "editorial decision" that homeopathy is bunkum, and put that all over the article [[Homeopathy]]. When someone accuses me of an NPOV violation, I'll just tell them it was nothing of the sort, it was an "editorial decision". :)
Calling a tail a leg doesn't make it one. An NPOV violation is still against Wikipedia policy, even if you call it "editorial decision".
> > - All images are treated equally by default -- either displayed, or not displayed. Thus, the system does not incorporate any POV biases such as the idea that nudity is "offensive" and caterpillars are not.
> I do not entertain the idea of such extremist pandering to NPOV. By virtue of that logic, we might as well host a disambiguation page at [[Chicago]] even though almost everyone will be looking for [[Chicago, Illinois]].
If you think NPOV is "extremist" and that that's bad, you're on the wrong project. This is Wikipedia, where NPOV is a basic ground-rule.
Or, in another sense, there's nothing wrong with being extreme about NPOV. It's what we do here. Wikipedia _is_, in this sense, an extremist project; that's why there are so many people from Britannica, mainstream media, and other "normal", "middle-of-the-road" publications who can't figure us out and assume that we're doomed to failure.
> > - Existing NPOV policy is applied in full force to categories. This means that it is a violation of Wikipedia policy to create categories such as "images appropriate for children" or "images of sexual perversions". Editors who do so persistently are treated the same as any other systematic POV-pushers.
> Agreed.
Great! That's one of the biggest issues right there!
Now, read on for an implication of that ...
> The trouble is that as Wikipedia grows, we seem to have a penchant for legalistic solutions. Instead of having a general policy of "reverting except for vandalism is bad", we have to have the 3RR, which is easily gamed. Instead of having a policy on personal attacks, we have to go through the whole merry-go-round of RFC, RFM and RFAr.
Sounds to me like your proposal would be more of this, rather than less.
It would end up with nasty, unresolvable arguments over what should go in the "off by default" categories, because of what those categories would really mean:
If the category "Nudity" is off by default, then the meaning of that category would really be not just "these images contain nudity" but rather "these images contain nudity AND Wikipedia believes nudity is offensive".
And that is a judgment that Wikipedia still has no business making; and it is a judgment that would cause endless dispute. Just as Wikipedia does not have an opinion on the issue of whether homeopathy is bunkum, Wikipedia does not have an opinion on whether nudity is offensive.
Trying to place that opinion in Wikipedia's mouth is precisely an NPOV violation.
> Problems like these are going to crop up more often over the years, and I am pessimistic that more than a very few editors will learn the lessons of this debate. When the community is divided like hell on this (in other words, you would need hell to freeze over before obtaining anything even resembling consensus), we need something to defuse the situation.
I don't see how you get the idea that your proposal is defusing, rather than escalating. Defusing means that people quit fighting and go find something else to do ... not that you give them a whole brand new arena to fight in, and make the issues over which they're fighting all the more important.