On 8/6/07, David Goodman dgoodmanny@gmail.com wrote:
> Their Wikimania slides have an interesting example, where a really problematic edit becomes gradually accepted simply from being there and other people editing the article but not fixing it.
Was the gradual change due to an increasing level of trust in the text of an individual edit, or in the user in general?
To illustrate the problem with trust being based on the user (U:a is a new anon and U:r is a regular fact checker):
1. U:a changes [[T1]], [[T2]] and [[T3]]
2. U:r reverts the changes to [[T1]]
3. U:a waits a while and then restores the added text to [[T1]]
The text in [[T1]] should stay orange indefinitely, while [[T2]] and [[T3]] could gradually become normal.
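To make that concrete, here is a minimal Python sketch (not the actual WikiTrust algorithm; the Fragment class, the reputation numbers and the 0.3 step are made up for illustration) of trust attached to the text fragment, with the revert remembered, rather than to the user alone:

    # Illustrative sketch only: fragment-level trust with a "was reverted" flag.
    class Fragment:
        def __init__(self, text, author_rep):
            self.text = text
            self.trust = author_rep        # new text starts at its author's reputation
            self.ever_reverted = False     # remembers that a fact checker removed it

        def endorse(self, editor_rep):
            """Another editor saved the page without touching this fragment."""
            if self.ever_reverted:
                return                     # reverted-and-restored text stays "orange"
            # move trust part of the way toward the endorsing editor's reputation
            self.trust += 0.3 * max(0.0, editor_rep - self.trust)

        def revert(self):
            self.trust = 0.0
            self.ever_reverted = True

    # U:a (new anon, reputation 0.1) adds the same sentence to T1, T2 and T3.
    t1 = Fragment("claim added to [[T1]]", author_rep=0.1)
    t2 = Fragment("claim added to [[T2]]", author_rep=0.1)
    t3 = Fragment("claim added to [[T3]]", author_rep=0.1)

    # U:r (regular fact checker, reputation 0.9) reverts the change on T1 only.
    t1.revert()

    # U:a waits and restores the text on T1; restoring does not clear the flag.
    # Later edits by others implicitly endorse the fragments they leave untouched.
    for frag in (t1, t2, t3):
        frag.endorse(editor_rep=0.9)

    print(t1.trust, t2.trust, t3.trust)   # T1 stays at 0.0; T2 and T3 climb toward 0.9

The point is only that restoring previously reverted text should not let it inherit whatever reputation U:a accumulates elsewhere, whereas the untouched copies can earn trust from later editors leaving them in place.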
I have concerns about displaying this on the live version of Wikipedia. Currently all text is black on white, and we are drumming home the message that it should never be implicitly trusted. If we introduced something like this, many readers (and contributors) would trust black-on-white text more than they trust red-on-orange; i.e. some parts of the text would become "more" trusted due to a software change. Ultimately, for this to be appropriate on an encyclopedia, the algorithm needs to be either a very accurate measure or kept out of sight.
Also, the blame map for disambiguation (dab) pages is often largely orange, for example:
http://enwiki-trust.cse.ucsc.edu/index.php/CRM
http://enwiki-trust.cse.ucsc.edu/index.php/Cortex
http://enwiki-trust.cse.ucsc.edu/index.php/Cortez
-- John