FYI, thought Wikipedia researchers would find this interesting:
Wikipedia Weekly podcast interviews researchers Felipe Ortega and Ed
Chi about recent WSJ article re: volunteer departures
http://bit.ly/5aG6si
--
-Andrew Lih
USC Annenberg School of Communication and Journalism
Email: andrew(a)andrewlih.com
WEB: http://www.andrewlih.com
BOOK: The Wikipedia Revolution: http://www.wikipediarevolution.com
On Thu, Dec 3, 2009 at 2:13 PM, Charles Matthews
<charles.r.matthews(a)ntlworld.com> wrote:
> No, but I can think much better ways of framing the question than
> following up WSJ article would lead to. Studies and articles written by
> people not really aware of how our communities function are not really
> good places to start, if the issue is how to improve that functioning.
> It seems pretty clear that if you frame the question too loosely, you
> get a recital of some beefs that are brought up whatever the occasion.
Indeed.
Please read this page, starting at "Other kinds of errors are more
characteristic of poor science." down to "But not paying attention to
experiments like that is a characteristic of cargo cult science.":
http://www.lhup.edu/~DSIMANEK/cargocul.htm
If you care at all about research on Wikipedia, or about making policy
from that research, and you are only going to read one thing I suggest
this year, this should be it. (Well, I also recommend that everyone go
buy and read the Feynman biographies...)
I'm not suggesting that anyone is engaging in cargo cult science so
much as saying that we do not yet know much of anything about
researching Wikipedia. Unfortunately, people are being rewarded for
announcing bold conclusions, even though many can be dismissed as
uncertain because of confounding factors that are obvious to anyone
experienced with Wikipedia. This does not encourage the kind of careful
foundational research required before we can make real progress.
I'm interested in knowing which, if any, experienced Wikipedians have
ever participated in the peer review of an article studying Wikipedia.
I have not. Peer review isn't a magic bullet, but I'm surprised at
the number of obvious and easily correctable flawed claims (e.g. that
the dumps contain ALL edits, so no effort needs to be taken to correct
for any sampling bias), and I wonder whether anyone with significant
first-hand experience is providing input.
Hello everyone,
At this year's WikiSym, there was an interesting discussion on wiki
measurement and evaluation. Wiki research often involves the measurement of
pages to identify various editing patterns, such as highly concentrated
editing activity, the development of cliques, or the emergence of highly
active and inactive users.
Because some of the quantities that researchers desire to measure (such as
"coordination", "concentration", and "quality") are necessarily vague,
choosing a formula or metric that acts as a surrogate for the desired
measurement requires some thought and discretion.
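As a concrete illustration of such a surrogate metric (my own example, not
one taken from the WikiSym pages): "concentration" of editing activity on a
page is sometimes proxied by the Gini coefficient of per-editor edit counts.
The editor counts below are made-up illustrative data.

```python
def gini(counts):
    """Gini coefficient of edit counts: 0 means edits are spread evenly
    across editors; values near 1 mean one editor dominates the page."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard rank-weighted formula over the sorted counts.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

# Hypothetical pages: one dominated by a single editor, one edited evenly.
print(gini([1, 1, 2, 3, 40]))    # high: concentrated editing
print(gini([9, 10, 10, 9, 10]))  # near 0: evenly spread editing
```

The same caveat from above applies: the number is only as meaningful as the
match between the formula and the vague quantity it stands in for.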
Because I was not able to find an existing compilation of metrics for wikis,
I created several pages outlining some wiki usage patterns and the metrics
that could identify them. Although the pages are not specific to Wikipedia
(they were written with corporate wiki practitioners in mind), I think they
would also be of interest to the Wikipedia research community. The pages can
be found here:
http://www.wikisym.org/ws2009/tiki-index.php?page=Corporate+Wiki+Metrics
I invite all interested researchers to add more metrics to the pages, or use
the pages as a reference. Also, if there are any suggestions for a more
appropriate wiki to host this information (other than the WikiSym '09 wiki)
please let me know. (I do not know of any wikis that act as a repository for
wiki research information -- does anybody know of one?)
Jeff