[WikiEN-l] Wired: Wikipedia to Color Code Untrustworthy Text
Emily Monroe
bluecaliocean at me.com
Mon Aug 31 01:34:57 UTC 2009
> Or perhaps it is a reputation score - my memory is fuzzy.
Either way, I would like the score to NOT be published. I'd hate to
have the community divided over a piece of software.
Emily
On Aug 30, 2009, at 8:32 PM, Brian wrote:
> On Sun, Aug 30, 2009 at 7:31 PM, Brian <Brian.Mingus at colorado.edu>
> wrote:
>
>>
>>
>> On Sun, Aug 30, 2009 at 6:24 PM, Keith Old <keithold at gmail.com>
>> wrote:
>>
>>> Folks,
>>>
>>> http://www.wired.com/wiredscience/2009/08/wikitrust/
>>>
>>> Wired reports:
>>>
>>>
>>> *"Starting this fall, you’ll have a new reason to trust the
>>> information you find on Wikipedia: An optional feature called
>>> “WikiTrust” will color code every word of the encyclopedia based on
>>> the reliability of its author and the length of time it has
>>> persisted on the page.*
>>>
>>> *More than 60 million people visit the free, open-access
>>> encyclopedia each month, searching for knowledge on 12 million pages
>>> in 260 languages. But despite its popularity, Wikipedia
>>> <http://www.wikipedia.org> has long suffered criticism from those
>>> who say it’s not reliable. Because anyone with an internet
>>> connection can contribute, the site is subject to vandalism, bias
>>> and misinformation. And edits are anonymous, so there’s no easy way
>>> to separate credible information from fake content created by
>>> vandals.*
>>>
>>> *Now, researchers from the Wiki Lab <http://trust.cse.ucsc.edu/> at
>>> the University of California, Santa Cruz have created a system to
>>> help users know when to trust Wikipedia—and when to reach for that
>>> dusty Encyclopedia Britannica on the shelf. Called WikiTrust
>>> <http://wikitrust.soe.ucsc.edu/index.php/Main_Page>, the program
>>> assigns a color code to newly edited text using an algorithm that
>>> calculates author reputation from the lifespan of their past
>>> contributions. It’s based on a simple concept: The longer
>>> information persists on the page, the more accurate it’s likely to
>>> be.*
>>>
>>> *Text from questionable sources starts out with a bright orange
>>> background, while text from trusted authors gets a lighter shade. As
>>> more people view and edit the new text, it gradually gains more
>>> “trust” and turns from orange to white."*
>>>
>>> More in the story.
>>>
>>> *Regards*
>>>
>>> *Keith*
>>>
>>
>>
>> What's interesting about WikiTrust is that a trust score is computed
>> for each individual. I wonder if these will be made public, and if
>> so, how they will change the community of editors. It seems likely
>> that they will not be made public. However, since the algorithm is
>> published (and, I believe, the source code as well), anyone with the
>> hardware could compute and publish how trusted each community member
>> is.
>>
>
>
> Or perhaps it is a reputation score - my memory is fuzzy.
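For the curious, the persistence-based coloring the article describes could be sketched roughly as follows. This is purely illustrative Python of my own, not the published WikiTrust algorithm; the trust scale, update rule, and `gain` parameter are all assumptions made for the example:

```python
# Illustrative sketch only: map a trust score in [0, 1] to a background
# color fading from bright orange (untrusted) to white (trusted), and
# nudge a text fragment's trust upward each time it survives a revision.
# This is NOT the actual WikiTrust code; scale and update rule are
# assumptions for illustration.

def trust_to_color(trust):
    """Linearly interpolate from orange (trust=0) to white (trust=1)."""
    trust = max(0.0, min(1.0, trust))
    orange = (255, 165, 0)
    white = (255, 255, 255)
    r, g, b = (round(o + (w - o) * trust) for o, w in zip(orange, white))
    return f"#{r:02x}{g:02x}{b:02x}"

def updated_trust(trust, author_reputation, gain=0.1):
    """Each revision that leaves the text intact raises its trust a
    little, weighted by the revising author's reputation (both in
    [0, 1]); 'gain' is a made-up tuning constant."""
    return min(1.0, trust + gain * author_reputation)

# Brand-new text from an unknown author starts near zero trust:
t = 0.0
print(trust_to_color(t))  # "#ffa500" (bright orange)

# ...and whitens gradually as reputable editors leave it untouched.
for _ in range(10):
    t = updated_trust(t, author_reputation=0.8)
print(trust_to_color(t))  # a much lighter shade, approaching white
```

The point of the sketch is just the article's core idea: color is a pure function of accumulated trust, and trust accumulates only through survival under later edits.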
> _______________________________________________
> WikiEN-l mailing list
> WikiEN-l at lists.wikimedia.org
> To unsubscribe from this mailing list, visit:
> https://lists.wikimedia.org/mailman/listinfo/wikien-l