Hi,
Just something that occurs to me as I write up my dissertation - I
keep on thinking it would be nice to be able to cite some basic
figures to back up a point I am making, e.g. how many times Wikipedia
is edited on a given day or how many pages link to this policy page -
as I asked in an email to the wikipedia-l list, which has mysteriously
vanished from the archives (August 11, entitled "What links here?"). I
realise these could be done by going to the recent changes or special
pages and counting them all, but I'm basically too lazy to do that -
we're talking about thousands of pages here, right? I'm also thinking
this is something that many people would be interested in finding out
and writing about. So what I'm asking is: to help researchers
generally, wouldn't it be an idea to identify some quick database
hacks that we could provide - almost like a Kate's Tools function? Or
are these available on the MediaWiki pages? If they are (and I have
looked at some database-related pages), they're certainly not very
understandable from the perspective of someone who just wants to use
basic functions. You might be thinking of sending me to a page like
http://meta.wikimedia.org/wiki/Links_table - but *what does it mean?*
Can someone either help me out, or suggest what we could do about this
in the future?
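To make it concrete, here is the sort of quick hack I have in mind,
sketched against a toy in-memory mock-up of two MediaWiki tables
(revision and pagelinks). The column names follow the schema as
documented on mediawiki.org, but the rows here are invented purely
for illustration - this is not the live database.

```python
import sqlite3

# Tiny in-memory stand-in for two MediaWiki tables; the real schema
# has many more columns, and the data below is made up.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE revision (rev_id INTEGER, rev_timestamp TEXT)")
cur.execute("CREATE TABLE pagelinks (pl_from INTEGER, pl_title TEXT)")
cur.executemany("INSERT INTO revision VALUES (?, ?)",
                [(1, "20050811093000"),
                 (2, "20050811170500"),
                 (3, "20050812080000")])
cur.executemany("INSERT INTO pagelinks VALUES (?, ?)",
                [(10, "Neutral_point_of_view"),
                 (11, "Neutral_point_of_view"),
                 (12, "Verifiability")])

# How many edits were made on 11 August 2005?
# (MediaWiki timestamps are strings of the form YYYYMMDDHHMMSS.)
edits_that_day = cur.execute(
    "SELECT COUNT(*) FROM revision "
    "WHERE rev_timestamp LIKE '20050811%'").fetchone()[0]

# How many pages link to a given policy page?
links_here = cur.execute(
    "SELECT COUNT(*) FROM pagelinks "
    "WHERE pl_title = 'Neutral_point_of_view'").fetchone()[0]

print(edits_that_day, links_here)  # 2 2
```

The point is that each of the figures I want is a one-line COUNT(*)
for someone who knows the schema - which is exactly why a small,
documented set of such queries would be so useful to researchers.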
Cheers,
Cormac
Colleagues,
A group of us had a rather interesting discussion on privacy, etc.,
at Wikimania. Not wanting to lose the momentum of that discussion, I
took a stab at starting an article on a research policy statement.
This is just a start; please feel free to have at it, and thank you
for your help!
http://meta.wikimedia.org/wiki/Wikimedia_Research_Network_Privacy_Policy
Kevin
Kevin J. Gamble, Ph.D.
Associate Director eXtension Initiative
Box 7641 NCSU
Raleigh, NC 27694-7641
v: 919.515.8447
c: 919.605.5815
AIM: k1v1n
Web: intranet.extension.org
Dear Andrew Lih,
dear scientific community,
I am a bit disappointed by the available material that tries to
measure the quality of Wikipedia articles. The quoted Wall Street
Journal article, for example, analyses only technical topics, but it
would be a dangerous claim to assume that quality is equally
distributed across the different fields and topics. Yet you need
exactly that claim as a precondition for the method of randomly
picking articles and drawing conclusions about the rest.
There was another attempt to compare the quality of Wikipedia with
other encyclopedias, in the German computer magazine c't, using the
same random approach.
(1) There is a problem with choosing the articles to compare or
    analyse at random. I see particular difficulties in
    non-technical fields such as the soft sciences (in social
    science, for example, every theory of society redefines all the
    concepts of society on its own terms: how can an encyclopedia
    claim to have a single definition?).
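The worry in point (1) can be made concrete with a toy simulation
(all field sizes and quality scores below are invented numbers,
purely for illustration, not real Wikipedia data): if quality differs
by field, an estimate based only on technical articles says little
about the site-wide average.

```python
import random

random.seed(42)

# Hypothetical field sizes and mean quality scores (invented).
fields = {
    "technical":      (0.80, 3000),
    "social_science": (0.50, 5000),
    "politics":       (0.40, 2000),
}

# Simulate one quality score per article, clustered around the
# field mean with a little noise.
articles = [(name, random.gauss(mean, 0.05))
            for name, (mean, count) in fields.items()
            for _ in range(count)]

# True site-wide average quality.
overall = sum(q for _, q in articles) / len(articles)

# Estimate from a random sample drawn only from technical articles,
# as a technical-topics-only study implicitly does.
tech_pool = [q for name, q in articles if name == "technical"]
tech_estimate = sum(random.sample(tech_pool, 100)) / 100

print(round(overall, 2), round(tech_estimate, 2))
```

With these invented numbers the technical-only estimate lands near
0.8 while the true site-wide average is near 0.57, so extrapolating
from technical articles overstates quality by a wide margin. Random
sampling over the whole article pool would avoid this particular
bias, but only under the very uniformity assumptions questioned
above.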
(2) Political terms are sometimes very complex topics where the
    NPOV may not work, because there is neither right nor wrong: it
    is often a question of opinion and majority, which sometimes
    changes reality. I observed a discussion and an edit war on the
    article about direct democracy (in the German Wikipedia: article
    "Direkte Demokratie") that led to a loss of quality: only a
    minimal and weak consensus survived the clash of opinions. The
    evolutionary process did not improve quality in that case.
(3) The third problem is the tendency of specific groups towards
    vandalism. There are groups that hold particular values or
    ideologies and reject a neutral or scientific view (moralists,
    religious groups, nationalists, neocapitalists, etc.). What
    about articles that are important to these groups? Are those
    articles tendentious?
My question: is there a scientific study of the quality of Wikipedia
articles? Does anyone work on these problems? What methods could be
used to analyse the quality?
Ingo Frost
(studies Wikipedia from a social systems science perspective)