Hello everyone,
At this year's WikiSym, there was an interesting discussion on wiki measurement and evaluation. Wiki research often involves the measurement of pages to identify various editing patterns, such as highly concentrated editing activity, the development of cliques, or the emergence of highly active and inactive users.
Because some of the quantities that researchers desire to measure (such as "coordination", "concentration", and "quality") are necessarily vague, choosing a formula or metric that acts as a surrogate for the desired measurement requires some thought and discretion.
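As a concrete illustration of a surrogate metric, concentration of editing activity on a page is often summarized with a Gini coefficient over per-editor edit counts. This is only a sketch (the edit counts below are invented), not any particular tool's implementation:

```python
def gini(counts):
    """Gini coefficient of non-negative counts: 0 means perfectly even
    activity, values near 1 mean one contributor dominates."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Rank-weighted formulation: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

# Hypothetical edits-per-editor on two pages:
print(gini([50, 3, 2, 1]))    # one editor dominates: high concentration
print(gini([10, 10, 10, 10])) # 0.0, perfectly even
```

The same function works for any per-contributor tally (edits, bytes added, talk-page posts), which is part of why Gini-style measures turn up in wiki research.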
Because I was not able to find an existing compilation of metrics for wikis, I created several pages outlining some wiki usage patterns and the metrics that could identify them. Although the pages are not specific to Wikipedia (they were written with corporate wiki practitioners in mind), I think they would also be of interest to the Wikipedia research community. The pages can be found here: http://www.wikisym.org/ws2009/tiki-index.php?page=Corporate+Wiki+Metrics
I invite all interested researchers to add more metrics to the pages, or to use the pages as a reference. Also, if there are any suggestions for a more appropriate wiki to host this information (other than the WikiSym '09 wiki), please let me know. (I do not know of any wikis that act as a repository for wiki research information -- does anybody know of one?)
Jeff
On Thu, Dec 3, 2009 at 3:29 PM, Jeff Stuckman stuckman@umd.edu wrote:
The best known predictor of quality is featured article status. Next up is a readability metric such as the Automated Readability Index. See my website for my research into quality. As far as I know, I've done the most thorough analyses of predictors of quality, although I haven't been keeping completely up to date. The later paper is the more interesting in terms of predictors.
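For reference, the Automated Readability Index is a simple formula: 4.71*(characters/words) + 0.5*(words/sentences) - 21.43, yielding an approximate US grade level. A rough sketch follows (the tokenization is naive and the sample text is invented; real implementations handle abbreviations, punctuation, and markup more carefully):

```python
import re

def automated_readability_index(text):
    """ARI = 4.71*(chars/words) + 0.5*(words/sentences) - 21.43.
    Higher scores indicate harder text (roughly a US grade level)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = text.split()
    # Count letters/digits only, stripping surrounding punctuation.
    chars = sum(len(w.strip(".,;:!?\"'()")) for w in words)
    if not words or not sentences:
        return 0.0
    return 4.71 * (chars / len(words)) + 0.5 * (len(words) / len(sentences)) - 21.43

sample = "Kermit the Frog is a Muppet. He first appeared in 1955."
print(round(automated_readability_index(sample), 2))
```

Short words and short sentences drive the score down, which is why ARI (and similar surface metrics) can only ever be a crude surrogate for quality.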
http://grey.colorado.edu/mingus
Cheers, Brian
Featured status on wikis might not be a reliable indicator, even on those wikis that have the process.
Case in point: Muppet Wiki. They started an FA process and received a burst of nominations and approvals for a few months... after a while, though, it became a "meh": there was no real reason to take time acknowledging successes, and it took away from working on the project itself. Many of the highest-quality articles go unacknowledged, because no one wants to bother nominating them.
(Granted, I'm only guessing at the reasons; I'm a regular contributor, but I know that the process faded without discussion.)
Nick Moreau/Zanimum
2009/12/3 Brian J Mingus Brian.Mingus@colorado.edu:
On 12/05/09 08:13, Nicholas Moreau wrote:
I vaguely remember someone studying this question on Wikipedia (Ed Chi?). I believe the conclusion was that FA status had high precision but poor recall: i.e., FA articles were reliably high-quality, but there were many high-quality articles that were not FA.
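In classifier terms, that framing treats FA status as a predicted label for "high quality". A toy sketch with made-up article sets (not data from any actual study):

```python
def precision_recall(predicted, actual):
    """Precision and recall of a predicted label set against ground truth.

    precision = |predicted & actual| / |predicted|
    recall    = |predicted & actual| / |actual|
    """
    tp = len(predicted & actual)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(actual) if actual else 0.0
    return precision, recall

# Hypothetical: every FA is genuinely high-quality, but most
# high-quality articles were never nominated.
featured = {"Kermit", "Gonzo"}
high_quality = {"Kermit", "Gonzo", "Piggy", "Fozzie", "Scooter"}
p, r = precision_recall(featured, high_quality)
print(p, r)  # precision 1.0, recall 0.4
```

This is exactly the pattern described above: high precision (FAs are reliably good) with poor recall (many good articles never get the label), which matches the nomination-fatigue story from Muppet Wiki.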
I could very well be wrong or misremembering, so take with a grain of salt.
Reid
wiki-research-l@lists.wikimedia.org