-----Original Message-----
From: William Pietri [mailto:william@scissor.com]
Sent: Tuesday, June 5, 2007 07:19 AM
To: 'English Wikipedia'
Subject: [WikiEN-l] Wikipedia as moral tool?
My understanding of what Wikipedia does at the core has always been pretty simple: We take factual material from elsewhere and summarize it neutrally and clearly.
We have always included material which is not factual in the slightest. If a subject attracts human attention, it is considered worthy of inclusion.
I feel like the service we provide to readers is pretty simple: instead of making them dig through all the stuff out there on some topic just to get an overview, we do the first pass at that for them.
But lately I hear a different thing. Now that we've become so prominent, I hear people saying that we should be using Wikipedia as a moral instrument.
We have a duty to act responsibly. Although we are a corporation, we are not necessarily nihilistic.
If we don't like how sites treat our editors, we should disappear them.
Sites which cause serious harm to our users may incur such a penalty.
If we don't like that the media reports certain things, we should prune that information. It doesn't matter if it was in multiple reliable sources: if we don't trust our readers with the facts, we should cut them out.
Media reports which abuse the privacy of other persons, even those in the Washington Post, need not be repeated on Wikipedia as though we were mindless, nihilistic robots.
What worries me about this isn't so much the current uses, although they bother me a little. Instead, I worry about two things:
- Once we cross the line away from "just the NPOV facts, ma'am" to Wikipedia-as-moral-tool, will it really be limited to these two things? Won't people find more ways to improve the world by restricting what we print?
We must work to strike a moral balance between the good of knowledge being available and the possible evil of harming others.
- Don't we risk eternal contention? It seems like getting people to agree on the facts is hard enough. Can we ever come up with a shared morality?
We can move toward it, learning from experience and observing the results of our actions.
That's not to say that we shouldn't suppress facts for moral purposes. There are good arguments for it. I'm just wondering what the long-term cost is.
There are two costs: the cost of doing nothing and the cost of attempting to be responsible. Both are substantial.
William
Responses by Fred
On 6/5/07, Fred Bauder fredbaud@waterwiki.info wrote:
We must work to strike a moral balance between the good of knowledge being available and the possible evil of harming others.
It is impossible to know which bits of information cause damage; it is impossible to quantify the damage, and again impossible to quantify the good.
The field of science has been dealing with this problem for some time. The position that information is neither intrinsically good nor evil appears to be the only sustainable one.
On 6/5/07, geni geniice@gmail.com wrote:
On 6/5/07, Fred Bauder fredbaud@waterwiki.info wrote:
We must work to strike a moral balance between the good of knowledge being available and the possible evil of harming others.
It is impossible to know which bits of information cause damage; it is impossible to quantify the damage, and again impossible to quantify the good.
The field of science has been dealing with this problem for some time. The position that information is neither intrinsically good nor evil appears to be the only sustainable one.
I have to say I completely disagree with that. Maybe these things can't be quantified numerically, but I don't think we'll ever live in a society which can handle giving all information to all people. I forget who it was that said it, but someone in here mentioned that Wikipedia is fortunate to be constrained by certain external forces (mostly laws) which discourage us from truly printing every fact known to man. I'll add that Wikipedia's policy against original research and in favor of using reliable sources saves us from considering a lot more bad ideas. I'd hate to see some of the arguments that would go on without these constraints.
On 6/6/07, Anthony wikimail@inbox.org wrote:
I have to say I completely disagree with that. Maybe these things can't be quantified numerically, but I don't think we'll ever live in a society which can handle giving all information to all people. I forget who it was that said it, but someone in here mentioned that Wikipedia is fortunate to be constrained by certain external forces (mostly laws) which discourage us from truly printing every fact known to man. I'll add that Wikipedia's policy against original research and in favor of using reliable sources saves us from considering a lot more bad ideas. I'd hate to see some of the arguments that would go on without these constraints.
NOR and RS have to do with whether the information is correct, not its relation to good or evil.
True information is generally fairly safe under US law. Exceptions would be:
- Stuff that is obscene
- Stuff that violates someone's IP
- Perhaps born secret stuff
- Under certain conditions, information the Foundation itself collected
- Under certain conditions, trade secrets
- Stuff that can't be released due to court orders?
- Probably a couple of other things I've missed
For the most part the law has little effect on what we want to do.
On 6/5/07, geni geniice@gmail.com wrote:
On 6/6/07, Anthony wikimail@inbox.org wrote:
I have to say I completely disagree with that. Maybe these things can't be quantified numerically, but I don't think we'll ever live in a society which can handle giving all information to all people. I forget who it was that said it, but someone in here mentioned that Wikipedia is fortunate to be constrained by certain external forces (mostly laws) which discourage us from truly printing every fact known to man. I'll add that Wikipedia's policy against original research and in favor of using reliable sources saves us from considering a lot more bad ideas. I'd hate to see some of the arguments that would go on without these constraints.
NOR and RS have to do with whether the information is correct, not its relation to good or evil.
Right, but you're missing my point. NOR and RS aren't rules that we follow to protect people's privacy, but they do have that positive side-effect.
True information is generally fairly safe under US law. Exceptions would be:
- Stuff that is obscene
- Stuff that violates someone's IP
- Perhaps born secret stuff
- Under certain conditions, information the Foundation itself collected
- Under certain conditions, trade secrets
- Stuff that can't be released due to court orders?
- Probably a couple of other things I've missed
For the most part the law has little effect on what we want to do.
That's a pretty long list in itself. I think U.S. law combined with the NOR and reliable source rules adequately eliminates the vast majority of the undesirable information that might otherwise be in Wikipedia. But I'm much more on the open access side than most people.
On 6/6/07, Anthony wikimail@inbox.org wrote:
Right, but you're missing my point. NOR and RS aren't rules that we follow to protect people's privacy, but they do have that positive side-effect.
That is not their objective.
That's a pretty long list in itself.
It's padded though.
For non-fiction we can rely on the "you can't copyright facts" principle to get around "stuff that violates someone's IP."
"Born secret" may not be constitutional.
The next two would run into NOR.
I think U.S. law combined with the NOR and reliable source rules adequately eliminates the vast majority of the undesirable information that might otherwise be in Wikipedia. But I'm much more on the open access side than most people.
There are a few other bits and pieces (copyright, although I'm not sure that is going to work in the case of the Diana crash pics), but yes.
On 6/5/07, geni geniice@gmail.com wrote:
"Born secret" may not be constitutional.
For those that don't know what this means: http://en.wikipedia.org/wiki/Born_secret
"*Born secret*" and "*born classified*" are both terms which refer to a
policy of information being classifiedhttp://en.wikipedia.org/wiki/Classified_Informationfrom the moment of its inception, usually regardless of where it was being created, usually in reference to specific laws in the United Stateshttp://en.wikipedia.org/wiki/United_States .
This is a whole other can of worms that Geni touched on, directly involved with Wikipedia ethics and morality. Something can be illegal, unconstitutional, and/or culturally immoral or unethical in certain locales and jurisdictions. Who gets to decide which is appropriate for Wikipedia? Florida and United States law? The ethics, morality, and legalities of the most vocal Wikipedians? Or the ethics, morality, and legalities of whichever group of Wikipedians has the highest number of users? If something is blatantly illegal, say, in Turkey or the United Kingdom, but not in the United States, what happens? What if something is considered patently immoral if not depraved in large sections of the United States, but editors from other countries (or parts of the US) want to include it?
Based on this, I agree with the idea that applying ethical or moral standards without neutrality in them--ethics and morality that favor no ideological, religious, or political standpoint over any other--is not tenable.
Regards,
Joe
http://www.joeszilagyi.com
Who gets to decide which is appropriate for Wikipedia? Florida and United States law?
If the Wikimedia Foundation is incorporated in the US, then whether or not you believe you should be restricted by US law, the US may have a differing opinion. :)
That said, this is not the US wiki; it is the English wiki - big difference there.
If the offensive portion of a topic is really required to give it proper coverage, we should not censor - but if it isn't essential to the content, at least take it to the talk page to discuss removing it. We should strive to be inoffensive unless offense is absolutely needed.
Now, if you go intentionally looking at a topic you find offensive, I don't think you should have the right to complain. You knew what you were going to look at before you did it.