Folks,
http://www.wired.com/wiredscience/2009/08/wikitrust/
Wired reports:
*"Starting this fall, you’ll have a new reason to trust the information you find on Wikipedia: An optional feature called “WikiTrust” will color code every word of the encyclopedia based on the reliability of its author and the length of time it has persisted on the page.*
*More than 60 million people visit the free, open-access encyclopedia each month, searching for knowledge on 12 million pages in 260 languages. But despite its popularity, **Wikipedia*http://www.wired.com/wiredscience/2009/08/wikitrust/www.wikipedia.org * has long suffered criticism from those who say it’s not reliable. Because anyone with an internet connection can contribute, the site is subject to vandalism, bias and misinformation. And edits are anonymous, so there’s no easy way to separate credible information from fake content created by vandals.*
*Now, researchers from the **Wiki Lab* http://trust.cse.ucsc.edu/* at the University of California, Santa Cruz have created a system to help users know when to trust Wikipedia—and when to reach for that dusty Encyclopedia Britannica on the shelf. Called **WikiTrust*http://wikitrust.soe.ucsc.edu/index.php/Main_Page *, the program assigns a color code to newly edited text using an algorithm that calculates author reputation from the lifespan of their past contributions. It’s based on a simple concept: The longer information persists on the page, the more accurate it’s likely to be.*
*Text from questionable sources starts out with a bright orange background, while text from trusted authors gets a lighter shade. As more people view and edit the new text, it gradually gains more “trust” and turns from orange to white."*
More in the story.
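For anyone curious how the mechanism in that last quoted paragraph could work at all, here is a deliberately tiny illustration of the idea: reputation earned from how long an author's past text has survived, and a per-word trust value rendered as a shade running from orange to white. This is a toy under my own assumptions, not the actual WikiTrust algorithm from the UCSC papers.

    # Toy illustration of the idea described above -- NOT the real WikiTrust algorithm.
    # Assumption: an edit's "survival" is the number of later revisions it outlived.

    def author_reputation(survivals):
        """Reputation grows with how long an author's past edits survived."""
        return sum(survivals) / (len(survivals) + 1)  # +1 keeps brand-new authors near zero

    def word_trust(author_rep, revisions_persisted):
        """New text starts near the author's reputation and gains trust as it persists."""
        return min(1.0, author_rep / 10.0 + 0.1 * revisions_persisted)

    def shade(trust):
        """Map trust in [0, 1] to a background colour from bright orange to white."""
        r, g, b = 255, int(165 + 90 * trust), int(255 * trust)
        return f"#{r:02x}{g:02x}{b:02x}"

    # A low-reputation author's fresh text shows up orange; it whitens as it persists.
    print(shade(word_trust(author_reputation([0, 1]), 0)))    # bright orange
    print(shade(word_trust(author_reputation([30, 45]), 8)))  # white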
Regards
Keith
I'll just say I'm a bit surprised to be hearing it from Wired first.
Pakaran
On Sun, Aug 30, 2009 at 8:24 PM, Keith Old keithold@gmail.com wrote:
<snip>
2009/8/31 Nathan Russell windrunner@gmail.com:
I'll just say I'm a bit surprised to be hearing it from Wired first.
The tool has been around for a while. I guess the news is that it is being added to the site itself, rather than being run by a third party. I think that is news to me if it is true (I may have just forgotten about it, though).
On Sun, Aug 30, 2009 at 9:19 PM, Nathan Russell windrunner@gmail.com wrote:
I'll just say I'm a bit surprised to be hearing it from Wired first.
Pakaran
WikiTrust itself has been announced and then mentioned on this list multiple times; in the absence of quotes in the article from Wikimedia Foundation staff members, I'm not 100% convinced that a decision has been made to roll it out to all users of the English Wikipedia. If I remember correctly, there were significant technical issues associated with a real-time trust analysis on the entire encyclopedia. Perhaps the mentions of "gadgets" were intended to signify that it would be an optional gadget available in the preferences. It's a really interesting method of looking at the project; I'm in favor of adding it as a gadget if a way can be found to make it technically manageable.
Nathan
WikiTrust itself has been announced and then mentioned on this list multiple times; in the absence of quotes in the article from Wikimedia Foundation staff members, I'm not 100% convinced that a decision has been made to roll it out to all users of the English Wikipedia.
Which is what I meant, sorry.
If I remember correctly, there were significant technical issues associated with a real-time trust analysis on the entire encyclopedia. Perhaps the mentions of "gadgets" were intended to signify that it would be an optional gadget available in the preferences. It's a really interesting method of looking at the project; I'm in favor of adding it as a gadget if a way can be found to make it technically manageable.
It would also be useful for catching vandalism interspersed with good content (or at least telling which sentences to look closely at, without reading a 30 kb article).
Pakaran
2009/8/31 Nathan Russell windrunner@gmail.com:
WikiTrust itself has been announced and then mentioned on this list multiple times; in the absence of quotes in the article from Wikimedia Foundation staff members, I'm not 100% convinced that a decision has been made to roll it out to all users of the English Wikipedia.
Which is what I meant, sorry.
The Wired article says it was derived from a UCSC story, which seems to be the one here:
http://scicom.ucsc.edu/SciNotes/0901/pages/wiki/wiki.html
"After years of collaboration, WikiMedia bigwigs finally decided in April 2009 to make WikiTrust available for all registered Wikipedia users. The launch date for the new gadget has not been set, but de Alfaro thinks it will go live in September or October. "
I cannot for the life of me find any reference to this on the wikitrust website, on the mailing lists, etc - so, I dunno. "Available for" and "gadget" make it sound like an additional preferences thing, which seems plausible - those tend to get installed pretty quietly.
I've copied this mail to Luca de Alfaro, who's posted here before, and hopefully he can shed some light on what's actually going on! :-)
In the absence of any actual validation that this measures "trust" or "reliability" or "quality", I am very skeptical; it would be highly inappropriate to integrate it into our gadgets.
David Goodman, Ph.D, M.L.S. http://en.wikipedia.org/wiki/User_talk:DGG
On Sun, Aug 30, 2009 at 9:44 PM, Andrew Gray andrew.gray@dunelm.org.uk wrote:
<snip>
Color coding to show the aging of text (WikiTrust) has been around for ages -- I think since shortly after the Seigenthaler incident, some 2006 incident, or some research from around 2006.
Maybe this means the owners will run it live or something. I don't know.
FT2
On Mon, Aug 31, 2009 at 2:19 AM, Nathan Russell windrunner@gmail.com wrote:
<snip>
On Sun, Aug 30, 2009 at 6:24 PM, Keith Old keithold@gmail.com wrote:
<snip>
What's interesting about WikiTrust is that a trust score is computed for each individual. I wonder if these will be made public, and if so, how they will change the community of editors. It seems likely that they will not be made public. However, since the algorithm is published, and I believe the source code as well, anyone with the hardware could compute and publish how trusted each community member is.
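As a very rough illustration of what "anyone with the hardware" could compute from the public history dump, the sketch below credits each author for how many later revisions of a page still contain words they added. The pre-parsed per-page revision lists are an assumption of the sketch, and this is nowhere near the published WikiTrust algorithm.

    from collections import defaultdict

    def longevity_scores(page_histories):
        """Crude per-author score: for each revision, count how many of the later
        revisions on the same page still contain words that revision added.
        `page_histories` maps page title -> ordered list of (author, text) tuples;
        parsing those out of a real history dump is left out of this sketch."""
        scores = defaultdict(float)
        for revisions in page_histories.values():
            seen_words = set()
            for i, (author, text) in enumerate(revisions):
                words = set(text.split())
                added = words - seen_words
                seen_words |= words
                if not added:
                    continue
                later = revisions[i + 1:]
                survived = sum(1 for _, later_text in later
                               if added & set(later_text.split()))
                scores[author] += survived / max(len(later), 1)
        return dict(scores)

    history = {"Example": [("Alice", "cats are mammals"),
                           ("Vandal", "cats are mammals lol fake"),
                           ("Bob", "cats are mammals")]}
    print(longevity_scores(history))  # {'Alice': 1.0, 'Vandal': 0.0}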
On Sun, Aug 30, 2009 at 7:31 PM, Brian Brian.Mingus@colorado.edu wrote:
<snip>
What's interesting about WikiTrust is that a trust score is computed for each individual. I wonder if these will be made public, and if so, how they will change the community of editors. It seems likely that they will not be made public. However, since the algorithm is published and I believe the source code as well anyone with the hardware could compute and publish how trusted each community member is.
Or perhaps it is a reputation score - my memory is fuzzy.
Or perhaps it is a reputation score - my memory is fuzzy.
Either way, I would like the score to NOT be published. I'd hate to have the community divided over a piece of software.
Emily

On Aug 30, 2009, at 8:32 PM, Brian wrote:
<snip>
On Sun, Aug 30, 2009 at 9:34 PM, Emily Monroe bluecaliocean@me.com wrote:
Or perhaps it is a reputation score - my memory is fuzzy.
Either way, I would like the score to NOT be published. I'd hate to have the community divided over a piece of software.
Emily
There's also the possibility of "gaming the system" by, e.g., making subtle expansions to articles that don't get much attention, where they are very unlikely to be reverted. Unless the algorithm is more complex than I thought.
Pakaran
Agree - trust scores are likely to be divisive and easily gamed. I do not think "trust score league tables" will help the project.
However, as they are also good ways to spot problems and see the "reliability profile" of an article on review, perhaps some way might be found to make some of their results available, in some limited manner? Admin only??
On the assumption that admins are trusted anyway, they don't have such a vested interest in the numbers, but they might be interested in problem editorship.
The other view is that if you can see the aging or trust profile of the article, that's all you need. Low trust-score users may simply be legitimate but inexperienced, bold and reverted, etc. There are other ways to ID problem editors, and if you need to know who wrote a specific sentence you can always use WikiBlame to check the history.
So overall I would say you don't need to publish trust scores of users, and even telling a user their own trust score is merely a toehold into self-promotion/gaming at best. People should edit, not be encouraged to keep scorecards...
FT2
On Mon, Aug 31, 2009 at 2:37 AM, Nathan Russell windrunner@gmail.com wrote:
<snip>
2009/8/31 FT2 ft2.wiki@gmail.com:
Agree - trust scores are likely to be divisive and easily gamed. I do not think "trust score league tables" will help the project.
However as they are also good ways to spot problems and see the "reliability profile" of an article on review, perhaps some way might be found to make some of their results available, in some limited manner? Admin only??
Perhaps the trust scores could be released in the form of categories. You can't find out an individual's actual score, but you can find out if they are "untrustworthy", "average" or "trustworthy" (with dividing lines that we have spent at least a gigabyte arguing over, of course). I can't see any real use for the exact scores - the precision will be so low that the rough categories are all you can conclude from them.
I'm not convinced there is sufficient use for even such categories, though. They might be useful for prioritising recent changes in vandal fighting tools, that's about it. (Perhaps the vandal fighting tools could have access to the scores without their users having such access?)
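If something along those lines were ever done, the bucketing itself is trivial; the whole argument would be about where the dividing lines sit. A minimal sketch, with invented threshold values:

    # Invented thresholds -- the dividing lines are exactly what would get argued over.
    CATEGORY_THRESHOLDS = [(0.75, "trustworthy"), (0.40, "average"), (0.0, "untrustworthy")]

    def category(score):
        """Collapse a continuous 0..1 reputation score into a coarse label."""
        for threshold, label in CATEGORY_THRESHOLDS:
            if score >= threshold:
                return label
        return "untrustworthy"

    # A vandal-fighting tool could still sort its queue by the raw score internally
    # while only ever displaying the coarse label to its users.
    queue = [("edit A", 0.12), ("edit B", 0.55), ("edit C", 0.91)]
    for edit, score in sorted(queue, key=lambda e: e[1]):
        print(edit, category(score))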
(Perhaps the vandal fighting tools could have access to the scores without their users having such access?)
Being a user of vandal fighting tools, I like that idea.
Emily

On Aug 30, 2009, at 8:48 PM, Thomas Dalton wrote:
<snip>
On Sun, Aug 30, 2009 at 7:42 PM, FT2 ft2.wiki@gmail.com wrote:
<snip>
Playing devil's advocate, isn't there far too little information available about your average editor? How do you determine at a glance the reputation of an editor whose edits you are reviewing, or with whom you are having a conversation? Further, since the full history dump is publicly available and the given algorithm is just one of many related measures that could be computed, is it pointless to try to stop the information from being released? Lastly, in the interest of transparency, should the information not be made available? Shouldn't the goal be to create an algorithm that can't be gamed? It may actually be the case that this one is not very subject to manipulation. The authors are very astute, and it would take an awful lot of effort.
On Sun, Aug 30, 2009 at 7:48 PM, Brian Brian.Mingus@colorado.edu wrote:
<snip>
I would also point out that competition can be a very healthy thing, and it could very well be a motivating tool. Assuming an algorithm that is difficult to game, editors might well be very interested in improving their reputation scores. It could even give some credibility to the encyclopedia.
2009/8/31 Brian Brian.Mingus@colorado.edu:
I would also point out that competition can be a very healthy thing and it could very well be a motivating tool. Assuming an algorithm that is difficult to game editors might well be very interested in improving their reputation scores. It could even give some credibility to the encyclopedia.
Yes, competition is a good motivator, but that is only useful if it is motivating people to do something desirable. We don't actually want people to try and avoid being reverted - WP:BOLD is still widely accepted as a good guideline, isn't it?
On Sun, Aug 30, 2009 at 8:28 PM, Thomas Dalton thomas.dalton@gmail.com wrote:
<snip>
Yes, competition is a good motivator, but that is only useful if it is motivating people to do something desirable. We don't actually want people to try and avoid being reverted - WP:BOLD is still widely accepted as a good guideline, isn't it?
From the perspective of building an excellent encyclopedia you might want people to be bold. This is an inherently inclusionist perspective where we assume that bold editors who write awful, inaccurate or mediocre stuff are still making valuable contributions. They are either contributing cruft which is easy to get rid of, or they are contributing seeds for some future editor to improve, or seeds for conversations on the talk page that will in time result in high quality content. Or if we're lucky, they are not only bold but really smart and only capable of producing brilliant prose. In short, in the limit of time any contribution is a good contribution. Even the worst contribution you can think of (which is probably engineered to stick but blatantly false) is going to eventually be tagged as vandalism and will help contribute to future intelligent algorithms that automatically weed out vandalism.
From the perspective of an editor whose reputation is at stake, they are going to want to think more carefully about their contribution. On average they want all of their edits to remain in the encyclopedia for a long time. They might not want to be bold and thoughtless, because that means they are simply planting a seed for another editor to improve on, making it easier for that other editor to improve their reputation at the expense of your own. You might want to start your seed of an edit as a draft and improve it over time, only finally submitting it to the encyclopedia after it is already high quality and likely to stick.
I tend to think that the latter version is healthier than encouraging everyone to contribute every thought that they have. Similar to the [[Foot-in-the-door technique]], first we convinced you to edit this page, now we'd like to ask you to spend some time thinking about your edit before you submit it. If you do, your reputation will improve and your peers will respect your edits more in the future.
Since the analysis is over a period of time, it's easy to trial it offline by statically calculating results for a past period or certain editors, then seeing if those mean anything. Overall my suspicion is 1/ it'll be so poorly correlated with quality as to be unhelpful compared to other guides, 2/ we don't want to encourage a move to that kind of user evaluation metric anyway for the many reasons given.
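One hedged sketch of what such an offline trial could look like: score each editor on pre-cutoff history only, then check whether low scorers really are reverted more often afterwards. The score_up_to and was_reverted helpers below are placeholders for whichever reputation implementation and revert detector is being trialled.

    from datetime import datetime

    def trial_offline(editors, edits, score_up_to, was_reverted,
                      cutoff=datetime(2009, 1, 1)):
        """Static, offline evaluation: score editors on pre-cutoff history only,
        then see whether low scorers really are reverted more often afterwards.
        `score_up_to(editor, cutoff)` and `was_reverted(edit)` are placeholders
        for whatever reputation implementation and revert detector is on trial."""
        results = []
        for editor in editors:
            score = score_up_to(editor, cutoff)
            later = [e for e in edits if e["editor"] == editor and e["when"] > cutoff]
            if later:
                revert_rate = sum(was_reverted(e) for e in later) / len(later)
                results.append((editor, score, revert_rate))
        return results  # eyeball (or correlate) score vs. later revert rate

    # If score and later revert rate turn out to be essentially uncorrelated,
    # suspicion 1/ in the paragraph above is confirmed for that implementation.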
FT2
On Mon, Aug 31, 2009 at 3:57 AM, Brian Brian.Mingus@colorado.edu wrote:
<snip>
On Mon, Aug 31, 2009 at 3:28 AM, Thomas Dalton thomas.dalton@gmail.com wrote:
2009/8/31 Brian Brian.Mingus@colorado.edu:
I would also point out that competition can be a very healthy thing and it could very well be a motivating tool. Assuming an algorithm that is difficult to game editors might well be very interested in improving their reputation scores. It could even give some credibility to the encyclopedia.
Yes, competition is a good motivator, but that is only useful if it is motivating people to do something desirable. We don't actually want people to try and avoid being reverted - WP:BOLD is still widely accepted as a good guideline, isn't it?
Is it not more likely that most long-term editors who have been active for years have had most of their text mercilessly edited into oblivion and have very low average "trust" levels? And more recent editors may have higher trust levels?
Carcharoth
On Mon, Aug 31, 2009 at 4:08 AM, Carcharoth carcharothwp@googlemail.com wrote:
Is it not more likely that most long-term editors who have been active for years have had most of their text mercilessly edited into oblivion and have very low average "trust" levels? And more recent editors may have higher trust levels?
Carcharoth
That at least, no - I gather it's based on evaluation of edits' longevity, not whether they still exist now.
On Sun, Aug 30, 2009 at 9:08 PM, Carcharoth carcharothwp@googlemail.com wrote:
<snip>
Is it not more likely that most long-term editors who have been active for years have had most of their text mercilessly edited into oblivion and have very low average "trust" levels? And more recent editors may have higher trust levels?
Carcharoth
With the disclaimer that I haven't read the paper since the 2006 Wikimania, no, the algorithm is smarter than that. Simply having your edits overwritten at some point in the future is not going to detract from the period of time that your edit lasted. Additionally, if some but not all of your words persist through rewrites, that would contribute to your reputation.
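Roughly speaking (and only roughly, since the published algorithm works at the level of words and reputations rather than whole edits), the credit is for the span an edit survived, not for whether it is still present today:

    def longevity_credit(edit_revision, removal_revision, current_revision):
        """Credit an edit for the number of revisions it survived.
        Later removal caps the credit; it does not claw back what was earned."""
        survived_until = removal_revision if removal_revision is not None else current_revision
        return max(0, survived_until - edit_revision)

    # A 2006 edit that lasted 4,000 revisions before being rewritten still earns
    # plenty of credit, so long-term editors aren't zeroed out by later rewrites.
    print(longevity_credit(edit_revision=100, removal_revision=4100, current_revision=9000))   # 4000
    print(longevity_credit(edit_revision=8990, removal_revision=None, current_revision=9000))  # 10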
On Mon, Aug 31, 2009 at 4:10 AM, Brian Brian.Mingus@colorado.edu wrote:
On Sun, Aug 30, 2009 at 9:08 PM, Carcharoth carcharothwp@googlemail.com wrote:
<snip>
Is it not more likely that most long-term editors who have been active for years have had most of their text mercilessly edited into oblivion and have very low average "trust" levels? And more recent editors may have higher trust levels?
With the disclaimer that I haven't read the paper since the 2006 Wikimania, no, the algorithm is smarter than that. Simply having your edits overwritten at some point in the future is not going to detract from the period of time that your edit lasted. Additionally, if some but not all of your words persist through rewrites that would contribute to your reputation.
If you merely revert vandalism that removes a persistent piece of text, doesn't that unfairly contribute to your reputation, as the text continues to persist and the algorithm thinks that anyone who added it was doing so independently?
Carcharoth
On Sun, Aug 30, 2009 at 9:31 PM, Carcharoth carcharothwp@googlemail.com wrote:
<snip>
If you merely revert vandalism that removes a persistent piece of text, doesn't that unfairly contribute to your reputation as the text continues to persist and the algorithm thinks that anyone who added it was doing so independently?
Carcharoth
If you have questions like that, you should probably look into the website and the paper. I think you'll find they realized most of these issues and incorporated them into the algo. They already detect reverts, so it doesn't make sense to punish the reverter.
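For what it's worth, the simplest form of revert detection is content identity: if a revision's text exactly matches an earlier revision, it is treated as a revert and the restored text keeps its original author. The hash comparison below is my simplification; the real system tracks text provenance word by word.

    import hashlib

    def attribute_revisions(revisions):
        """Attribute each revision's text to its original author.
        `revisions` is an ordered list of (editor, text). If a text hash matches an
        earlier revision, the edit is treated as a revert: the reverter is recorded
        as such, and authorship of the restored text stays with the original editor."""
        seen = {}          # text hash -> original author
        attribution = []   # (editor, original_author, is_revert)
        for editor, text in revisions:
            digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
            if digest in seen:
                attribution.append((editor, seen[digest], True))
            else:
                seen[digest] = editor
                attribution.append((editor, editor, False))
        return attribution

    history = [("Alice",  "Paris is the capital of France."),
               ("Vandal", "Paris is the capital of SPAM."),
               ("Bob",    "Paris is the capital of France.")]
    print(attribute_revisions(history))
    # Bob's revert restores Alice's text; Alice, not Bob, remains its author.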
On Sun, Aug 30, 2009 at 8:31 PM, Carcharoth carcharothwp@googlemail.com wrote:
<snip>
If you merely revert vandalism that removes a persistent piece of text, doesn't that unfairly contribute to your reputation as the text continues to persist and the algorithm thinks that anyone who added it was doing so independently?
Carcharoth
Why would it matter? If you did the right thing, that's all there is to care about. This is what I'm worried about: Wikipedia: The RPG getting even more ingrained.
Is it not more likely that most long-term editors who have been active for years have had most of their text mercilessly edited into oblivion and have very low average "trust" levels?
Sometimes. However, on new page patrol, I'll sometimes completely rewrite a page, both for practice and because I see an inkling of potential in a page that would normally be speedily deleted via SNOW via AfD in a heartbeat. In other words, a well-meaning contributor ALREADY can't be trusted...according to a piece of software.
Emily

On Aug 30, 2009, at 10:08 PM, Carcharoth wrote:
<snip>
Yes, competition is a good motivator, but that is only useful if it is motivating people to do something desirable. We don't actually want people to try and avoid being reverted - WP:BOLD is still widely accepted as a good guideline, isn't it?
Well, that's what I'm worried about, mostly.
Emily

On Aug 30, 2009, at 9:28 PM, Thomas Dalton wrote:
<snip>
A bit like cryptography? If it needs obscurity to withstand gaming it's worthless?
A metric like "this user's edits are routinely reverted" or "routinely reverted on topic X" might be useful. Ditto a study of words used in the revert edit's summary.
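A crude version of that revert-rate metric, working purely from edit summaries in a history dump (keyword matching on summaries is a rough heuristic, and the field names here are assumed, but it shows the shape of the measurement):

    import re

    REVERT_SUMMARY = re.compile(r"\b(revert(ed)?|rv|rvv|undid|undo|rollback)\b", re.I)

    def revert_rates(revisions):
        """Fraction of each editor's edits that the *next* revision reverted,
        judged only by keywords in the next revision's edit summary.
        `revisions` is an ordered list of dicts with 'editor' and 'summary' keys."""
        edits, reverted = {}, {}
        for current, following in zip(revisions, revisions[1:]):
            editor = current["editor"]
            edits[editor] = edits.get(editor, 0) + 1
            if REVERT_SUMMARY.search(following["summary"]):
                reverted[editor] = reverted.get(editor, 0) + 1
        return {e: reverted.get(e, 0) / n for e, n in edits.items()}

    history = [{"editor": "Alice",  "summary": "expand history section"},
               {"editor": "Vandal", "summary": "lol"},
               {"editor": "Bob",    "summary": "Reverted edits by Vandal"}]
    print(revert_rates(history))  # {'Alice': 0.0, 'Vandal': 1.0}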
Beyond that, I'm not convinced it's feasible to calculate a score for trust, just because editors can edit in many different areas and ways. As an extreme example, an FA editor or project-page developer who uses BRD to get more done faster will score very differently from a POV warrior who writes obscure but slightly skewed pages, or a sock user. The page text will show reversion, recreation or aging, which is useful... but the author's trust rating will be very variable.
FT2
On Mon, Aug 31, 2009 at 2:48 AM, Brian Brian.Mingus@colorado.edu wrote:
Playing devils advocate, isn't there far too little information available about your average editor? How do you determine at a glance the reputation of an editor whose edits you are reviewing, or with whom you are having a conversation? Further, since the full history dump is publicly available and the given algorithm is just one of many related measures that could be computed, is it pointless to try and stop the information from being released? Lastly, in the interest of transparency should the information not be made available? Shouldn't the goal be to create an algorithm that can't be gamed? It may actually be the case that this one is not very subject to manipulation. The authors are very astute and it would take an awful lot of effort.
On Sun, Aug 30, 2009 at 9:31 PM, Brian Brian.Mingus@colorado.edu wrote:
What's interesting about WikiTrust is that a trust score is computed for each individual. I wonder if these will be made public, and if so, how they will change the community of editors. It seems likely that they will not be made public. However, since the algorithm is published and I believe the source code as well anyone with the hardware could compute and publish how trusted each community member is.
I don't think the trust scores, if they're based on what the article describes, would really do much to show "trust" in a conventional sense.
Consider that someone doing Michael-style subtle vandalism (release dates, etc) may not get reverted for some time, while a good editor who likes to edit in political topics may get reverted frequently even when her contributions are good (or, perhaps, just have subtle PoV issues).
Pakaran