-----Original Message----- From: William Pietri [mailto:william@scissor.com] Sent: Tuesday, June 5, 2007 07:19 AM To: 'English Wikipedia' Subject: [WikiEN-l] Wikipedia as moral tool?
My understanding of what Wikipedia does at the core has always been pretty simple: We take factual material elsewhere and summarize it neutrally and clearly.
We have always included material which is not factual in the slightest. If a subject attracts human attention, it is considered worthy of inclusion.
I feel like the service we provide to readers is pretty simple: instead of making them dig through all the stuff out there on some topic just to get an overview, we do the first pass at that for them.
But lately I hear a different thing. Now that we've become so prominent, I hear people saying that we should be using Wikipedia as a moral instrument.
We have a duty to act responsibly. Although we are a corporation, we are not necessarily nihilistic.
If we don't like how sites treat our editors, we should disappear them.
Sites which cause serious harm to our users may incur such a penalty.
If we don't like that the media reports certain things, we should prune that information. It doesn't matter if it was in multiple reliable sources: if we don't trust our readers with the facts, we should cut them out.
Media reports which abuse the privacy of other persons, even those in the Washington Post, need not be repeated on Wikipedia as though we were mindless, nihilistic robots.
What worries me about this isn't so much the current uses, although they bother me a little. Instead, I worry about two things:
- Once we cross the line away from "just the NPOV facts, ma'am" to Wikipedia-as-moral-tool, will it really be limited to these two things? Won't people find more ways to improve the world by restricting what we print?
We must work to strike a moral balance between the good of knowledge being available and the possible evil of harming others.
- Don't we risk eternal contention? It seems like getting people to agree on the facts is hard enough. Can we ever come up with a shared morality?
We can move toward it, learning from experience and observing the results of our actions.
That's not to say that we shouldn't suppress facts for moral purposes. There are good arguments for it. I'm just wondering what the long-term cost is.
There are two costs: the cost of doing nothing and the cost of attempting to be responsible. Both are substantial.
William
Responses by Fred