Thanks for initiating this interesting conversation with your paper, Darius.
As a retired professor and researcher and now active Wikipedian, I have a foot in both
camps.
Wearing my academic hat, my concerns are the ease of vandalism, the risk of subtle
vandalism (I agree obvious vandalism will be recognised as such by the reader), and how
quickly a Wikipedia article can change from good to bad, from neutral to biased, etc. As
an insider to Wikipedia, I know about ClueBot, the Recent Changes Patrol, watchlists,
etc., but to the outside world there does not appear to be any system of review, and I
would have to admit that our methods of detecting vandalism are far from
perfect. When I go away on holidays, particularly if I don't take my laptop, I stop
watching my watchlist. Then when I get home and try to catch up on my watchlist (an
enormous task), I am stunned to find vandalism some weeks old in articles. Am I the only
active user watching that article? It would seem so. We have a tool (in the left-hand
toolbar when you are looking at any article in desktop mode) that reports how many users
(but not which users) are watching an article, but for privacy no value is reported if
there are fewer than 30 watchers (it just says "less than 30"). Yet what difference does
it make whether there are 51 or 61 watchers or "less than 30" if those users are inactive,
or active but not checking their watchlists? Since none of us (except developers) can
gain access to the list of users watching a page, we have no way of measuring how many
articles are being checked by others following changes, how quickly they are checked, or
whether they are checked at all. So I think we need a better "reviewing" system, one
more visible to the reader if we want to gain respectability in academic circles. We also
need to prevent as much vandalism as we can (why do we have a "5 strikes until you are
blocked" policy? Let's make it zero tolerance: one obvious act of vandalism and you're
blocked).
My 2nd point of difference is this. When I publish an academic paper, I put my real name
and my institution name on it, and with that I am risking my real world reputation and
also that of my institution. That's a powerful motivator to get it right. What risk
does User:Blogwort432 take to their real world reputation? Generally none. The user name
is not their real name. Even if blocked or banned, we know they can pop up again with a
new user name or be one of the myriad IP addresses who contribute. One of the reasons I
edit with my real name is precisely because I put my real world reputation on the line
(assuming you believe my user name is my real name of course) and that's a powerful
motivator for me to write good content AND to be civil in discussions. It's easy to be
the opposite when you hide behind the cloak of a randomly-chosen user name or IP address.
Also, real-world identities are easier to check for conflicts of interest or paid
editing ("so you work for XYZ Corp and you've just added some lavish praise to
the XYZ article, hmm"). I think we would have a lot more credibility if we moved to
having real world user names (optionally verified) and were encouraged to add a short CV
(which is currently discouraged) so your credibility as a contributor could be assessed by
readers.
3rd point. Many academics have attempted to edit Wikipedia articles and had their edits
reverted, with the usual unfriendly warnings on their User Talk pages. When they reply
(often stating that they are an expert in this field or whatever claim they make), they
usually get a very unfriendly reaction to such statements. I can't imagine that
academics who have tried to contribute to Wikipedia and experienced hostility or seen
their edits reverted for reasons they did not understand or did not agree with are likely
to run around saying that Wikipedia is as good as the academic literature.
I think if we want to turn around academic perception, we need to:
1. make academics welcome on Wikipedia (apart from the usual conflict-of-interest
situations)
2. get as many contributors as possible real-world verified, and invite them to upload
their CV or link to one on another site (if we don't want them on Wikipedia User
pages)
3. demonstrate that we have a comprehensive, fast and effective review of changed/new
content -- wouldn't it be good if we could point to an edit in the article history and
see who reviewed it and how quickly that happened (and have gross statistics on how many
edits were reviewed and how quickly, and tools that tell us which articles aren't being
properly reviewed, etc.)
4. eliminate vandalism (well, reduce it substantially)
Or at least demonstrate we are moving towards these goals.
Personally I think some of the "norms" of Wikipedia may have served us well in
the early 2000s but don't serve us so well today. To my mind, moving towards
real-world-named accounts, and then real-world-verified accounts, as a "norm" would
make us better contributors; and if we rate-limited pseudonymous and IP accounts, we
would at least reduce the amount of vandalism we currently have to deal with from IP
addresses and new user accounts, and make it harder for sockpuppets to return. I think we can
find ways to do this without eliminating the privacy needed by a small number of
contributors with legitimate fears about persecution.
Kerry