Ray Saintonge wrote:
Ryan Delaney wrote:
My big problem with this is that very frequently, especially in
fields like science and philosophy, commonly held beliefs might be
very different from the "correct" beliefs, or the consensus among
learned experts. But because of the format of Wikipedia, some
extremely wrong beliefs are inserted into articles because they are
commonly held, even if they wildly contradict the research that
professionals in the field are doing -- and I mean this is just as
bad as saying the Earth is flat. The only difference is that the
roundness of the Earth is common knowledge, but there are some things
in science that are just as obvious to professionals but completely
unknown to the general public.
"Commonly held beliefs", "'correct' beliefs", and expert
consensus are
three different frames of mind, and all three can still be wrong.
Often, but certainly not always, the professionals have it right;
that's reason enough to leave open the avenues for criticizing
science. "Correct" beliefs are often promulgated by people with a
political end in mind, and they have no qualms about bending facts if
it will help their cause. The general public, and thus our editors, are
often in the difficult position of having a limited basis for making a
decision. The need to cite sources should apply equally to the
scientists and to those who express commonly held beliefs; the
scientists have an advantage here because that practice has been a
part of their experience.
Science is very poorly reported to the general public. An
understanding of what's going on is incompatible with the 15-second
sound bite. Look at the evolution of a long established publication
like "Popular Science". In its early days, shortly after the US Civil
War, it had a lot of articles designed to get everybody to think about
science; since then it has managed to evolve into something far more
gadgety. Much of science has retreated into the ivory tower. This is
great for the protection of scientific sinecures, but is terrible for
the promotion of scientific understanding by the general public.
One of the greatest things that Wikipedia could accomplish would be to
produce a generation of critical thinkers with both the tools and the
confidence to question any kind of established truth wherever they can
find it.
The [[Race and intelligence]] article is a perfect
example of this
phenomenon. People who know nothing about the research done in this
field have many times gone into that article and edited it mercilessly
in the name of NPOV because the established scientific opinion
presented (and extensively referenced) in the article is very
contradictory to the "politically correct" opinion. In my view,
Wikipedians need to have more respect for references and experts to
prevent this kind of thing from happening. The usual wiki philosophy
works in other cases, but in a subject as emotional as
[[Race and intelligence]], people tend to go way overboard, and the
NPOV and "everyone's equal" policies only make them more convinced of
their right to push their POV over that of the academic consensus.
It's a challenge. The statistics say what they
say - nothing more,
nothing less. The statistics themselves are unconcerned about how
anyone misinterprets them. People don't usually understand what
statistics are all about, and are quick to draw conclusions that are
not warranted. This subject matter is a good example where we can
look for creative ways to build consensus. Simply telling the public
that they are wrong and that they should pay attention to their
academic betters will get us nowhere except into a never ending flame
war. Somewhere along the way the scientists dropped the ball.
Generally speaking (and I am speaking from my personal experience here)
whenever what I am calling a "learned expert" -- be it a graduate
student in a subject or a PhD or a professional with years of time on
the job -- makes contributions, they are well argued and highly
referenced. As you say, these people have a lot of experience making
these kinds of arguments and I think their method of discovering truth
is the best one we have. So no, I don't want to tell the public that
"they are wrong and that they should pay attention to their academic
betters". I don't think it should even have to be said.
I think people on Wikipedia should be more humble and conscious of the
limitations of their knowledge and expertise, and willing to admit that
if another person knows more about a subject than they do, his or her
opinions are more likely to be closer to the truth. If an expert says
something you think you can disprove with adequate research and
referencing, then go ahead and bring that up. But I've seen some
astonishing arrogance in Wikipedians who think their common knowledge
should weigh just as heavily as the opinions of eminent and established
thinkers in an advanced field.
That doesn't promote the spread of academic-level scientific knowledge
that you want -- it obscures it, by breeding a culture that only
perpetuates the commonly held beliefs over the rigorous testing of
scientific method.
[snip]
In these
cases, I don't think that any amount of vociferous
objecting and arguing should be considered relevant. I think that
even if the consensus of Wikipedians editing the article disagrees
with it, that consensus should lose, unless they can find some
evidence that the article is wrong. This obsession with consensus has
a real possibility of going terribly wrong. I think the emphasis
should be on having Wikipedia advance _correct_ beliefs, not popular
ones.
"Correct" too easily becomes "politically correct". It's too
easy to
become emotionally attached to one's "correct" beliefs. There is
great normalizing power to effective consensus building. Scientists
would do better by judiciously planting seeds in Wikipedia's great
Mandelbrot fractal.
I don't think that Wikipedia should take a position that truth is
subjective. I know that's not precisely what you're saying, but I do
think this is a dangerous slippery slope.
Some beliefs are wrong despite being widely held. Yes, if you dig deep
enough into the philosophy of correspondence theory and epistemology
you'll discover that we only know things with a limited degree of
certainty. But as you know, that doesn't mean that Wikipedia should
include caveats that the moon might actually be made of Norwegian beaver
cheese. Science has a way of shifting the burden of proof, and that also
happens in academic subjects where the discoveries of a few people are
not yet common knowledge despite the availability of the facts to anyone
who wants to look at them.
Human civilization works as well as it does because our skills are
diversified and we can each specialize to great degrees in individual
tasks. But Wikipedia has abandoned that format, and I think that in
exchange for freedom of information exchange, we're paying a price we
don't have to pay, in the form of scaring away the people who have
specialized the most in a single field. I don't want Wikipedia to be an
academic journal with all its red tape, politics, and anal-retentive
peer review. I just want Wikipedians to be encouraged to understand
their limitations, and stick to the things they know.
- Ryan