On Thu, May 8, 2014 at 1:30 AM, Nathan
<nawrich(a)gmail.com> wrote:
On Wed, May 7, 2014 at 8:24 PM, Wil Sinclair
<wllm(a)wllm.com> wrote:
I'm a total newb here, and I know the grant system between WMF and the different chapters has been debated in the past. But I have a simple question: if WMF is funding these efforts through grants and the grant money is used to review and/or manage content, wouldn't it be indirectly getting involved with reviewing and managing content?
,Wil
Depends on the nature of the grant. In any case I think affiliates are better placed to perform this kind of work anyway, since we'd want it to be done in more than one language and using diverse panels with members from more than just the U.S. But I do think it would be really cool research and the results would certainly be very interesting. It also makes sense as complementary to automated efforts, and then the results of the different methods could be compared to assess effectiveness of the review processes.
I don't think this is an issue; as Erik has kindly pointed out in this
thread, the Foundation has funded at least one such study in the past.
(However, this study does not seem to have been based on a random sample –
at least I cannot find any mention of the sample selection method in the
study's write-up. The selection of a random sample is key to any such
effort, and the method used to select the sample should be described in
detail in any resulting report.)
To me, funding work that results in content quality feedback to the
community does not mean that the Foundation is getting involved in content
management. The expert panel would obviously have to have complete academic
freedom to publish whatever their findings are, without pre-publication
review by the Foundation. I would not expect the experts involved to end up
editing Wikipedia; if any of them did, this would be their private
initiative as individuals, and not covered by any grant.
I would consider such a research programme an important service to the community, just as the Foundation provides software, guidance through Board resolutions, and so forth.
It would be an equally vital service to the reading public that the
Foundation's projects serve.
In my view, any such programme of studies should begin with the English
Wikipedia, as it is the most comprehensive and most widely accessed
project, including by many non-native speakers looking for more detailed
information than their own language version of Wikipedia provides. Medical
content would be an excellent area to start with.
I think perhaps there is a lack of awareness of the extent of research already being done by independent, qualified third parties. Several examples are provided in the references of the study you posted, Andreas.
For example, this one in the Journal of Oncology Practice[1] compares
specific Wikipedia articles for patient-oriented cancer information against
the professionally edited PDQ database. It appears that the two were
comparable in most areas, except for readability, where the PDQ database
was considered significantly more readable. Now, again, this is a small
study and it has not been reproduced; however, it took me minutes to find
more information on the very subject you're interested in, created by
completely independent bodies who have "no pony in the race". There did
seem to be a fair number of studies related to medical topics. Now if only
we could learn from them - especially on the readability point, which I
think really is a very serious issue. Wikipedia isn't really intended to educate physicians about medical topics; it's intended to be a general reference for non-specialists.
Very few people are going to make life-and-death decisions based on our
math or physics topic areas, but I'll lay odds that any study would find a
significant readability issue with both of them, as well.
Risker/Anne
[1]