On 3/28/06, Karl A. Krueger <karl(a)simons-rock.edu> wrote:

> Right! That means your censorship proposal fails to represent the
> views of people who _do_ want religious censorship. That means it's
> non-neutral: it represents only one POV about what should be censored
> (the anti-sex view) and not another (the anti-blasphemy view).

I'm not sure I understand or agree with this viewpoint. Banning guns
but not prostitution is not somehow an endorsement of prostitution.
And I haven't actually heard of "anti-blasphemy" content filtering. My
proposal was made in the context of teachers refusing to let kids use
Wikipedia because of sexual photos or material being too easily
accessible.

> Right! Each of those censorship bodies expresses particular views
> about what is "bad" or "explicit". Those groups are not bound by an
> NPOV policy; indeed, they are frequently commercial firms hired for the
> purpose of enforcing particular religious and moral points of view.
> However, Wikipedia is supposed to be neutral. It isn't up to us to say
> that nipples are "explicit" and elbows are not.

I don't think "Wikipedia" is supposed to be neutral. *Content* in
Wikipedia is supposed to be balanced. So I don't really see that as an
argument for anything.
But anyway, as for nipples... almost everyone, in almost every
culture, recognises them as controversial. And there are people not
using Wikipedia because of the risk of encountering them. If and when
elbows become a problem, we could cross that bridge too.

> Our policy against self-censorship is still a consensus policy,
> though. Censorship proposals (such as [[WP:TOBY]]) have been
> consistently and roundly rejected.

I don't think Toby was the be-all and end-all of all possible content
filtering policies. And you're calling it "self-censorship", which I
think refers to excluding material from Wikipedia, so that *no one*
could see it. I'm only proposing preventing people already using
content filtering software from seeing it.
I'm really puzzled that my suggestion could be so controversial, but
there you go.
The "material" I was referring to was
censorship tags. Adding those
tags to Wikipedia articles, for the purpose of getting those articles
censored, is against the policy that Wikipedia doesn't do censorship.
I'm proposing changing that policy.

> No. Writers will be discouraged from writing on topics which won't be
> seen because of censorship measures.

"Won't be seen" - I suspect that readership of any given page would
diminish by less than 5% even with content filtering tags. Perhaps
much less than that.

> No. The coverage of [[sex]] and [[breast cancer]] and [[testicle]]
> and [[abortion]] and [[mastectomy]] and [[Playboy magazine]] and
> [[nude]] will be worse because people will be less interested in
> writing if they think the audience is smaller.

Those people should ask themselves which audience members they are
losing. Who would find themselves blocked from such a page? Think
about it.

> Or, to pick another popular censorship topic -- violence -- the
> coverage of [[war]] and [[murder]] and [[AK-47]] and [[crushing by
> elephant]] and [[electrocution]] and [[Quake III Arena]] and
> [[terrorism]] will be worse, because people will be less interested
> in writing if they think the audience is smaller.

Same argument. Why write for people whose parents/teachers don't want
them to see this material?

> People write to be read. The effect of censorship measures is to
> reduce the size of the audience. (If a "censorship" measure does not
> prevent anyone from reading the "censored" material, then it is
> _ineffective_.) If the audience size is reduced, then the incentive
> to write is reduced.

I honestly feel this is a pretty weak argument. You're bringing
economics into it, but you aren't considering whether a given user
would simply go and read another page instead, for example.

> No. Labelling it as "unsuitable for reading" amounts to a personal
> attack on the contributors. Any time you enable censorship of
> particular material, you are making a claim that it is unfit to be
> read.

Well, again, you're not really attacking my proposal, which is to
allow end users to filter content themselves, based on tags we supply.
It's up to the end user (or their supervisor) to determine what is
"unfit" to read.

> To put it another way: Why do we have a rule against personal
> attacks? Because personal attacks make people feel less welcome and
> less willing to collaborate. Calling someone's work smutty or harmful
> to minors will have that very same effect.

I don't think anyone working on [[autofellatio]] will be terribly
offended by it being called "smutty", to be perfectly frank. But see
my previous remark.

> Not really. How would you like it if someone went around to _your_
> contributions and marked them up in ways you found insulting and
> derogatory? That would be a bad thing, and nobody should do it. It

Sure. But that's not what I was proposing. If I were working on a porn
article and someone labelled it "porn", I'd be hard pressed to argue.
Steve