I wouldn't lose sleep over critics "having a field day" on any particular weak article. Sad to say, if people really want to find problematic material in Wikipedia, they won't have to look very hard, regardless of the quality of any individual article like [[Opus Dei]]. Wikipedia is, effectively, permanently "under construction" -- although the increasingly large set of core articles is becoming pretty solid, it's always going to be easy to find an embarrassingly naff entry somewhere.
This is like the people who test "machine translation" by tossing odd phrases at it until they find some idiom that trips it up:
* "blood, sweat and tears" into Russian (and back) produces something like "bleeding, bile and body water"
Then, like the vultures they are, they pounce: "See? It's inaccurate!"
Remember, these are the same journalists who play gotcha with presidents they don't like (see [[Bushisms]]). If you look hard enough, you can always find some embarrassing phrase or incident to help make your target look bad (so you can discredit him).
This is the sort of thing that drove away Larry Sanger (in part): lack of respect for accomplishment, diligence and solid scholarship. You try to find some small thing to pick on, and then (illegitimately) imply that it's representative of the whole. (I started to write an article on [[damaging quotation]]s one time.)
To make Wikipedia really solid, SOMEBODY has to start verifying and endorsing Article Versions. I still credit Larry Sanger as the originator of the "sifter project", and I eagerly await the integration of Magnus's software updates.
We need to be able to identify stable versions of articles - especially important articles. I want to see tags such as:
* copy-edited by Vicki R.
* vandalism-patrolled by maveric149
* NPOV-checked by Anthere
Sure, multiple people can add their endorsements. I don't want to see a "tag war" start, where one person adds the NPOV-dispute tag, and another removes it. If someone *I* respect says the article passes or fails the NPOV test, then that's all I care about. If someone *you* respect tags it a certain way, that's all *you* care about.
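To make that concrete, here's a toy Python sketch of what I have in mind (the data layout and all function names are invented for illustration; nothing like this exists in MediaWiki today). Each endorsement attaches to one specific revision, and each reader filters by the endorsers they personally trust:

# Hypothetical sketch: per-revision endorsements, filtered by whom
# each reader personally trusts. Invented layout, not a real schema.

endorsements = {
    # (article title, revision id) -> list of (check, endorser)
    ("Opus Dei", 1234): [
        ("copy-edited", "Vicki R."),
        ("vandalism-patrolled", "maveric149"),
        ("NPOV-checked", "Anthere"),
    ],
}

def checks_by_trusted(article, rev, trusted):
    """Return only the endorsements made by people *you* trust."""
    return [(check, who)
            for check, who in endorsements.get((article, rev), [])
            if who in trusted]

print(checks_by_trusted("Opus Dei", 1234, {"Anthere"}))
# -> [('NPOV-checked', 'Anthere')]

The point being that endorsements only accumulate: nobody removes anyone else's tag, and each reader applies their own filter, so there's nothing to war over.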
Uncle Ed
Ed Poor wrote
> This is the sort of thing that drove away Larry Sanger (in part): lack
> of respect for accomplishment, diligence and solid scholarship.
I have tried to edit some of the "Larry's Text" articles. Please don't give me that. Larry wanted/wants people to defer to authority. It was the wrong way to go then, and it still is now.
Charles
On 4/20/05, Poor, Edmund W <Edmund.W.Poor@abc.com> wrote:
> This is the sort of thing that drove away Larry Sanger (in part): lack of respect for accomplishment, diligence and solid scholarship. You try to find some small thing to pick on, and then (illegitimately) imply that it's representative of the whole. (I started to write an article on [[damaging quotation]]s one time.)
>
> To make Wikipedia really solid, SOMEBODY has to start verifying and endorsing Article Versions. I still credit Larry Sanger as the originator of the "sifter project", and I eagerly await the integration of Magnus's software updates.
I've been thinking about this a few days and I've decided that I disagree with it completely.
The best defense against claims that Wikipedia is inaccurate is to say that it is constantly in flux and should be taken with a grain of salt. It's a good starting point, but it can never be authoritative. This is not really a criticism -- one should have such a critical eye with *all* accounts of knowledge to some extent; it is just a lull in our critical faculties that causes us to think that EB articles come with guaranteed authority.
The sifter seems like an attempt to put on a veneer of authority -- to say, "this has been checked, it is more accurate or reliable than the dynamic version." I think this is misleading and dangerous.
First of all, unless we have real, verified "experts" writing/editing these sifted articles, we can't in any way pretend to paste on any veneer of authority based in traditional models of it. And to require that degree of authentication seems antithetical to the Wikipedia collective philosophy, at the very least. Wikipedia's greatest claim to authority is that each piece has been looked at by dozens of different eyes from different backgrounds. It is a different model of authority.
Secondly, slapping an "approved" label on something removes our only recourse against accusations of inaccuracy. At the moment we can say, "look, we don't guarantee anything." The second we start labeling some content as "fixed" is the second we implicitly lose that (even if we write "no guarantees" at the bottom of things -- even EB writes that in its disclaimers, but nobody takes legalese seriously when it comes to questions of factual accuracy).
I've seen dozens of articles that would no doubt be considered "sifter"-worthy but that, I think, contained gross inaccuracies or poor representations. It is very easy to make something look complete if it is not a topic most people know all that much about and it is written in an apparently sensible way. However, I don't think an expertocracy is the way to go on this. The bonus of a "free" encyclopedia (like "free" software) is that you are also "free" from liability -- no guarantees provided, of course (who provides guarantees anymore?), but also no guarantees *expected*. Adobe does not guarantee that Photoshop will work correctly, but it is expected that it ought to. I think people are more liberal in their assessments of open source solutions like GIMP -- it is no surprise, and no real sin, that it doesn't always function as expected.
Just some thoughts I had on this while thinking about free licenses, dynamic content, and what they imply. (I believe Jimbo will be giving a talk 'roundabouts my neck of the woods on free licensing tomorrow, maybe I'll check it out...)
FF
Fastfission wrote:
> The sifter seems like an attempt to put on a veneer of authority -- to say, "this has been checked, it is more accurate or reliable than the dynamic version." I think this is misleading and dangerous.
Personally, all I really want out of a sifter-type process is "this has been checked and is not blatantly vandalized or currently an active battleground, and the spelling looks okay to me." IMO a sifter like this would take a lot of stress off of editors who, rightly or wrongly, feel the need to keep a constant watch over articles and "fix" them instantly when problems crop up. It would also allow us to feel comfortable stamping ten thousand CDs without the fear that the database dump was taken at the exact moment a vandal stuck in something dreadful that is now immortalized in dimpled aluminium. All the standard Wikipedia disclaimers should still apply.
Once we have that, then maybe we can start looking at ways to produce an even more rigorously proofed version that includes fact-checking. I'm not in a hurry. :)
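For the CD-stamping case, the selection rule could be as simple as this Python sketch (the data shapes here are mine, invented for illustration; this isn't MediaWiki code): take the newest revision somebody has marked as checked, and fall back to the current one only if nothing was ever checked.

# Invented illustration: prefer the newest revision somebody marked
# as checked; fall back to the current one if nothing was checked.

revisions = {
    # title -> list of (rev_id, checked) in chronological order
    "Opus Dei": [(100, True), (101, False), (102, True), (103, False)],
}

def dump_revision(title):
    revs = revisions[title]
    checked = [rev_id for rev_id, ok in revs if ok]
    return checked[-1] if checked else revs[-1][0]

print(dump_revision("Opus Dei"))  # -> 102, not the possibly-vandalized 103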
Bryan Derksen wrote:
> Personally, all I really want out of a sifter-type process is "this has been checked and is not blatantly vandalized or currently an active battleground, and the spelling looks okay to me." IMO a sifter like this would take a lot of stress off of editors who, rightly or wrongly, feel the need to keep a constant watch over articles and "fix" them instantly when problems crop up. It would also allow us to feel comfortable stamping ten thousand CDs without the fear that the database dump was taken at the exact moment a vandal stuck in something dreadful that is now immortalized in dimpled aluminium. All the standard Wikipedia disclaimers should still apply.
>
> Once we have that, then maybe we can start looking at ways to produce an even more rigorously proofed version that includes fact-checking. I'm not in a hurry. :)
This sounds like exactly the right approach to me. What's more, it fits nicely with the incremental-improvement process we've used so far. Writing top-quality articles from scratch a la Nupedia turned out to be fairly hard, but incrementally improving articles has worked quite well. Incrementally verifying articles might likewise have some advantages over trying to make a leap from "this could be anything, including random vandalism" to "this has been verified by experts in the field as the Perfect Article". There's always a risk of *too* much complication, but a few simple levels of verification, starting with "this revision is not vandalized, not in the midst of an edit war, and not obviously glaringly bad", would be nice.
-Mark
> Personally, all I really want out of a sifter-type process is "this has been checked and is not blatantly vandalized or currently an active battleground, and the spelling looks okay to me." IMO a sifter like this would take a lot of stress off of editors who, rightly or wrongly, feel
But we already have that! We have the NPOV warning, the Cleanup warning and two dozen more tags editors slap onto articles. It is only in a few areas that Wikipedia suffers - the articles about the Israeli-Palestinian conflict are particularly atrocious. But compare those articles against articles on the same subject in Britannica and you'll see that Wikipedia isn't any worse. Wikipedia is good enough as it is, IMHO.
BJörn Lindqvist wrote:
> > Personally, all I really want out of a sifter-type process is "this has been checked and is not blatantly vandalized or currently an active battleground, and the spelling looks okay to me." IMO a sifter like this would take a lot of stress off of editors who, rightly or wrongly, feel
>
> But we already have that! We have the NPOV warning, the Cleanup warning and two dozen more tags editors slap onto articles. It is only in a few areas that Wikipedia suffers - the articles about the Israeli-Palestinian conflict are particularly atrocious. But compare those articles against articles on the same subject in Britannica and you'll see that Wikipedia isn't any worse. Wikipedia is good enough as it is, IMHO.
Wikipedia is good, no doubt about that :-)
The (main) difference between a "sifter" function and warning templates is that with the template system, the *absence* of a template means either
* this is OK, or
* no one saw this, or cared enough to put a warning tag there.
A "sifter" (or as I call it, "validation") function allows to actively mark a revision as "good" or "bad". Warning templates can only mark it as "bad".
Magnus
Jack Lutz (jack-lutz@comcast.net) [050502 08:40]:
> David Gerard wrote:
> > Jack Lutz (jack-lutz@comcast.net) [050502 01:37]:
> > > How do validation ratings age and handle sudden, mass replacement of content?
> > [[m:Article validation feature]]
> This is not covered on the Meta page.
Sorry, the page does (or should) mention that the planned feature is to be per revision, not per page - so sudden mass replacement of content creates a new version, which gets a different rating.
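In code terms (a sketch of my own to illustrate the idea, not the feature's actual implementation), ratings hang off the revision id, so a wholesale rewrite lands in a fresh, unrated revision while the old ratings stay with the old text:

# Invented illustration of per-revision (not per-page) ratings.

ratings = {}  # (page, rev_id) -> list of ratings

def rate(page, rev_id, score):
    ratings.setdefault((page, rev_id), []).append(score)

rate("Example", 1, "good")
rate("Example", 1, "good")
# A mass replacement is saved as revision 2, which starts unrated;
# the old ratings stay attached to the old text.
print(ratings.get(("Example", 2), []))  # -> []
print(ratings[("Example", 1)])          # -> ['good', 'good']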
(Expect vote spamming and calling out the vote on contentious articles and voting teams on less popular articles.)
- d.
On 5/1/05, David Gerard <fun@thingy.apana.org.au> wrote:
> Jack Lutz (jack-lutz@comcast.net) [050502 08:40]:
> > David Gerard wrote:
> > > Jack Lutz (jack-lutz@comcast.net) [050502 01:37]:
> > > > How do validation ratings age and handle sudden, mass replacement of content?
> > > [[m:Article validation feature]]
> > This is not covered on the Meta page.
> Sorry, the page does (or should) mention that the planned feature is to be per revision, not per page - so sudden mass replacement of content creates a new version, which gets a different rating.
>
> (Expect vote spamming and calling out the vote on contentious articles and voting teams on less popular articles.)
Articles on schools are going to be fun.
geni wrote:
> On 5/1/05, David Gerard <fun@thingy.apana.org.au> wrote:
> > Jack Lutz (jack-lutz@comcast.net) [050502 08:40]:
> > > David Gerard wrote:
> > > > Jack Lutz (jack-lutz@comcast.net) [050502 01:37]:
> > > > > How do validation ratings age and handle sudden, mass replacement of content?
> > > > [[m:Article validation feature]]
> > > This is not covered on the Meta page.
> > Sorry, the page does (or should) mention that the planned feature is to be per revision, not per page - so sudden mass replacement of content creates a new version, which gets a different rating.
> >
> > (Expect vote spamming and calling out the vote on contentious articles and voting teams on less popular articles.)
>
> Articles on schools are going to be fun.
So that means that *every single edit* resets the counter? I can see edit wars resulting from that.
-[[en:User:Humblefool]]
To clear this up: Currently,
- for each revision
- of each article,
- each user
can "rate" different properties.
Once MediaWiki 1.5 is released, Jimbo wants to run this for a while to gather data on how people rate things, then we decide on a final (meaning: this week :-) schema to handle the ratings.
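To illustrate what data that would produce, here's a Python sketch of mine (the property names and the layout are invented, not the real schema):

# Invented illustration: each user rates properties of each revision.

ratings = {
    # (article, rev_id, user) -> {property: score}
    ("Opus Dei", 1234, "Anthere"):    {"NPOV": 4, "accuracy": 3},
    ("Opus Dei", 1234, "maveric149"): {"NPOV": 5},
}

def average(article, rev_id, prop):
    scores = [props[prop]
              for (a, r, _u), props in ratings.items()
              if (a, r) == (article, rev_id) and prop in props]
    return sum(scores) / len(scores) if scores else None

print(average("Opus Dei", 1234, "NPOV"))  # -> 4.5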
Magnus
On 5/3/05, Magnus Manske <magnus.manske@web.de> wrote:
> To clear this up: Currently,
> - for each revision
> - of each article,
> - each user
> can "rate" different properties.
>
> Once MediaWiki 1.5 is released, Jimbo wants to run this for a while to gather data on how people rate things, then we decide on a final (meaning: this week :-) schema to handle the ratings.
This week? Yikes!

SJ
Magnus Manske (magnus.manske@web.de) [050504 00:54]:
> To clear this up: Currently,
> - for each revision
> - of each article,
> - each user
> can "rate" different properties.
>
> Once MediaWiki 1.5 is released, Jimbo wants to run this for a while to gather data on how people rate things, then we decide on a final (meaning: this week :-) schema to handle the ratings.
I asked Brion to switch this on for test.leuksman.com last night and, er, everything broke :-) So it's off again. But it would be good to have whatever's missing back on so we can beat it around on the test wiki.
- d.
David Gerard wrote:
> I asked Brion to switch this on for test.leuksman.com last night and, er, everything broke :-) So it's off again. But it would be good to have whatever's missing back on so we can beat it around on the test wiki.
Strange... I just set up a brand-new test wiki from CVS HEAD, and validation works just fine...
It is obviously turned off at the test site now, so I can't see what *might* be wrong.
Maybe the database update malfunctioned. Brion, can you check that with patch-validate.sql?
Thanks, Magnus
Magnus Manske wrote:
> David Gerard wrote:
> > I asked Brion to switch this on for test.leuksman.com last night and, er, everything broke :-) So it's off again. But it would be good to have whatever's missing back on so we can beat it around on the test wiki.
>
> Strange... I just set up a brand-new test wiki from CVS HEAD, and validation works just fine...
Well, he mentioned somewhere else something about database table prefixes. Maybe your brand-new test wiki didn't use table prefixes and therefore didn't run into the problem?
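If that's it, the failure mode would be something like this toy sketch (pure speculation on my part; the table name and query are invented for illustration):

# Speculative illustration: an update script that hard-codes a table
# name breaks on any wiki configured with a table prefix.

def validate_query(prefix=""):
    # A prefixed install stores its data in e.g. "wiki_validate", so a
    # query against plain "validate" would hit a nonexistent table.
    return "SELECT * FROM %svalidate;" % prefix

print(validate_query())         # what a script with no prefix runs
print(validate_query("wiki_"))  # what a prefixed wiki actually needs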