The General Notability Guideline is our friend here. Because we require
articles to be verifiable, that particular scenario doesn't apply: we
frequently have people try to add articles and content in situations as
unverifiable as the one the NY Times details, but we reject such content.
Where I believe our crowdsourcing model breaks down is when we don't have a
crowd, or when we work too quickly for a crowd to form:
- Speedy deletion, where an admin and maybe one other editor will summarily
delete material, in theory only if it meets some strict criteria.
- Our smaller wikis. We now have about a thousand, and the wisdom of crowds
is inherently vulnerable to the subdivision of crowds. A "one wiki per
language, plus one multilingual wiki for all the things we do across
languages" model would be better.
WSC
On 8 July 2012 00:18, ENWP Pine <deyntestiss(a)hotmail.com> wrote:
I thought this was interesting so I’m passing it along. This sentence
particularly caught my attention: “The answer, I think, is to take the best
of what both experts and markets have to offer, realizing that the
combination of the two offers a better window onto the future than either
alone.” Substitute the word “crowds” for “markets”, and perhaps there is
something here that could be applied to Wikipedia in our quest for quality,
mixing the best of expertise and crowdsourcing. I’d be very interested in
hearing comments from other Wikipedians.
https://www.nytimes.com/2012/07/08/sunday-review/when-the-crowd-isnt-wise.h…
Cheers,
Pine