From: Daniel P. B. Smith [mailto:dpbsmith@verizon.net]
Back in June, I complained that a little particle of misinformation from Wikipedia had gotten lodged in my brain and might have affected my car-purchasing decision. Specifically, I was referring to an article that characterized the Toyota Echo as a "flop" in the U.S., whose sales had tanked in 2004 and which was due to be discontinued, despite continuing success in many other countries, including Canada.
When Wikipedia fails, this is the typical way it fails: someone makes insufficient effort to balance the article.
Now that summer is over, I will renew my Quarterly Qall for Quality (oops, typo! ;-) at Wikipedia.
We need a kind of rating scheme or quality assurance system or certification mechanism. It's not a problem of designing the software for this. Tim and Erik and Magnus and Brion and all the rest are superbly capable. But *they* are not going to take the lead in this. *We* must decide that we want some means of assuring readers that they are getting reliable information.
Once again, I suggest that we let users "review" a given "version" of an article and (using a "new" software feature) "mark" that version as:
* "patrolled" (as in Recent Changes "simple vandalism" patrol)
* "accurate" (i.e., "I am personally convinced that everything this article says is true and correct.")
* "balanced" (i.e., nothing has been left out or downplayed)
For problems:
* "graffiti" (or "vandalism" = someone has messed up this version, but I don't have the time, inclination, or ability to undo the damage)
* "inaccurate" (contains mistakes, which we *hope* they'll mention on the talk page)
* "bias" (tells one side of a story, especially in a raging controversy)
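To make the proposal concrete, here is a minimal sketch in Python of what marking a version might look like. This is purely illustrative; the class and function names are my own invention, not an existing MediaWiki feature:

```python
from dataclasses import dataclass, field

# The two tag vocabularies proposed above.
APPROVAL_TAGS = {"patrolled", "accurate", "balanced"}
PROBLEM_TAGS = {"graffiti", "inaccurate", "bias"}

@dataclass
class Review:
    reviewer: str  # username of the person certifying this version
    tag: str       # one of APPROVAL_TAGS or PROBLEM_TAGS

@dataclass
class Version:
    version_id: int
    reviews: list = field(default_factory=list)

    def mark(self, reviewer: str, tag: str) -> None:
        """Record that `reviewer` has applied `tag` to this version."""
        if tag not in APPROVAL_TAGS | PROBLEM_TAGS:
            raise ValueError(f"unknown tag: {tag}")
        self.reviews.append(Review(reviewer, tag))

    def tags(self) -> set:
        """All tags anyone has applied to this version."""
        return {r.tag for r in self.reviews}

    def reviewers_for(self, tag: str) -> set:
        """Who applied a given tag -- so you can ignore reviewers you distrust."""
        return {r.reviewer for r in self.reviews if r.tag == tag}
```

Keeping the reviewer's name on each mark is what makes the "who reviewed this?" use below possible.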
Now where we take it from here is really up for grabs. Some people won't even care about these tags. We can set the default to ignore quality tags unless you "opt in".
RC patrollers might like to know WHO has reviewed a version. If Mav says he's checked the diff for "simple vandalism", I wouldn't give it a second thought. He's the champ. He used to check EVERY change (!) when traffic was slow enough. Now that there are often hundreds of edits per minute, this work needs to be split up. I would be happy to put in an hour, from time to time, if only I knew who else had certified a certain article version as "patrolled". I'd ignore known troublemakers, double-check newbies, and not even bother reviewing the work of people I come to trust.
Knowing that a particular VERSION of an article had been certified as "balanced" would help Administrators if an edit war flares up. If they need to protect the article, they could go back to the last version which had been marked "balanced" by someone they trust. To be fair, they would almost certainly have to pick a version certified by someone other than themselves - unless they weren't involved in the dispute.
For the looming print version (or CD / DVD version), we could automatically choose article versions which have a suitable combination of Approval Tags and Problem tags. (My own preference would be "patrolled" and no "graffiti".) A library might insist on "accurate" and "balanced" with no "inaccurate" or "bias".
We can let people use the tags as a filtering system. When browsing or looking up information, you might want to see:
* the most recent version which HAS the tags you like;
or
* the most recent version which DOES NOT HAVE any of the tags you hate;
or
* the most recent version which has all the tags you like and none of the tags you hate.
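All three filtering options above reduce to one lookup over the version history: walk backward from the newest version and stop at the first one that matches. A hypothetical sketch (the tag sets and version IDs are made up for illustration):

```python
def pick_version(history, want=frozenset(), avoid=frozenset()):
    """Return the ID of the most recent version whose tags include all of
    `want` and none of `avoid`, or None if no version qualifies.
    `history` is a list of (version_id, tag_set) pairs, oldest first."""
    for version_id, tags in reversed(history):
        if set(want) <= tags and not (set(avoid) & tags):
            return version_id
    return None

# A made-up edit history for one article:
history = [
    (1, {"patrolled"}),
    (2, {"patrolled", "accurate", "balanced"}),
    (3, {"graffiti"}),  # latest version was vandalized
]
```

With no constraints you get the latest version (3); asking for "patrolled" and no "graffiti" skips the vandalized version and returns 2. The same call would serve the print/CD selection: a library could pass `want={"accurate", "balanced"}, avoid={"inaccurate", "bias"}`.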
But when you go to edit, you'd see the latest version, same as before. Perhaps you'd get a notice saying "This is the latest version. You were looking at yesterday's / last week's version."
What do you all think?