Try this as a general approach:
- This is not a new idea.
- We've got vandalism down to under one article in 200, and a variety of
advanced tools and patrols by hundreds of users bring the usual fixing
time down to seconds or minutes when it does happen.
- But obviously we want to do even better.
- A lot of people gauge Wikipedia in terms of quantity of edits. For the
last two years the focus has been on improving the quality of edits and,
especially, finding even better ways to prevent deliberately harmful edits
such as vandalism.
- Our historical answer is "protection": everyone is prevented from
editing a page if it is being badly mis-edited. That's highly disruptive
and frustrates many editors, since one bad apple can hold up the process.
- A more recent addition is the Abuse Filter, a system that blocks
flagrantly bad edits while letting good ones through. Being an automated
program, though, it can't tell apart edits that appear good but really
aren't.
- Our newest answer is therefore this thing called "flagged revisions":
the requirement that when an edit is made to a sensitive article, someone
who's been around a while (one of thousands of such users) checks it and
signs off before letting it go "live".
- Our test bed has been the German Wikipedia, the second-largest language
edition after English among the Wikipedia websites.
- Our main target and test bed is articles about people, because those
are seen as more sensitive and of special importance to get right. The
most widely reported vandalism cases usually involve these articles,
simply because articles about people are so visible. So it makes sense to
apply possible solutions to them and see what effect that has on editing
quantity and quality.
That's how I'd explain it (condensed and simplified as needed for the media
concerned).
FT2
On Wed, Aug 26, 2009 at 12:26 AM, Thomas Dalton <thomas.dalton(a)gmail.com> wrote:
2009/8/26 David Gerard <dgerard(a)gmail.com>:
2009/8/25 Joseph Reagle <reagle(a)mit.edu>:
> In speaking to the press today, one of the things I believe I heard in an
intro segment on a live radio discussion was that WP would have
professional editors flagging trusted content. I didn't get a chance to
correct that, and I know who gets to review is still up in the air to some
extent [1], but that's a likely source of confusion to the public
apparently.
IME the problem is the use of the word "editor". We use it as in "tens
of thousands of volunteers", everyone else assumes we mean as in "the
boss who decides what goes in." It's a jargon versus English problem.
I think the problem comes from the fact that all our articles are
collaborative works. Normally there is a writer and an editor, and they
are distinct jobs. We have everyone as editors, since everyone can
change what other people have written. That is very unusual and the
English language hasn't had a chance to adapt to it.
_______________________________________________
WikiEN-l mailing list
WikiEN-l(a)lists.wikimedia.org
To unsubscribe from this mailing list, visit:
https://lists.wikimedia.org/mailman/listinfo/wikien-l