On 7 May 2014 18:14, Andreas Kolbe <jayen466@gmail.com> wrote:
Anne, there are really well-established systems of scholarly peer review. There is no need to reinvent the wheel, or add distractions such as infoboxes and other bells and whistles.
I find it extraordinary that, after 13 years, a project designed to make the sum of human knowledge available to humanity, with an annual budget of $50 million, has no clue how to measure the quality of the content it is providing, no apparent interest in doing so, and no apparent will to spend money on it.
For what it's worth, there was a recent external study of Wikipedia's medical content that came to unflattering conclusions:
http://www.jaoa.org/content/114/5/368.full
---o0o---
Most Wikipedia articles for the 10 costliest conditions in the United States contain errors compared with standard peer-reviewed sources. Health care professionals, trainees, and patients should use caution when using Wikipedia to answer questions regarding patient care.
Our findings reinforce the idea that physicians and medical students who currently use Wikipedia as a medical reference should be discouraged from doing so because of the potential for errors.
---o0o---
That doesn't help very much in assessing the quality of the article on [[Liancourt Rocks]] - depending on where in the world one is, the article can be reasonably accurate or completely inaccurate. This is one of the geographic issues of which I speak.
There are also issues with the study you reference - it is quite biased toward American information, and each article was assessed by only two reviewers. It perhaps shows how easy it is to get junk science published in peer-reviewed journals if the topic is "sexy" enough - their own study wouldn't meet our standards for inclusion.
Risker/Anne