[WikiEN-l] Thousands of *awful* articles on websites

charles.r.matthews at ntlworld.com charles.r.matthews at ntlworld.com
Tue Jan 9 12:19:46 UTC 2007


Robth wrote

>  Taking a laissez-faire approach to article quality (and
> specifically, reference quality) will ensure that our average article
> remains rather poorly referenced, and strikes me as a sure way to at
> best delay the development of a higher quality encyclopedia, and at
> worst lastingly sidetrack the project.

You have a point. However, the 'average article' is a somewhat problematic concept, for several reasons. One could weight any 'average' (mean) by the word count of articles, so that large, well-referenced articles count for more than stubby ones. Following up that idea, one realises that we do hypertext, not individual articles, and an unreferenced stub created to fill in a red link is still more to go on than the red link qua title. (It is a node, can gather up redirects, and so on.) What is more, under a hypothetical feature that hid stubs from readers who didn't want them, the picture might look rather better on a random sample.

In other words, the status quo says 'please judge us as a hypertext work in progress'. Adding references is in many cases (not all) a type of clean-up; we have always had clean-up to do.

Charles

