On 21/04/07, George Herbert george.herbert@gmail.com wrote:
I strongly disagree that we refuse to take any real steps to reduce the harm. There are a lot of good people who watch for bio article changes. We have additional steps and procedure and policy clearly defined for detection and handling of bio article problems.
The problem is that this mitigates the harm once it appears; it doesn't reduce the potential for it in the first place.
We can mitigate the harm, on the whole, not too shabbily. We have our patrollers and our processes, and we have a glorious technical innovation on the horizon which should catch another third or so, and we're pretty good at taking stuff out as soon as someone complains - in fact, we're sufficiently good that when I reply to a complaint about vandalism on a page, it actually seems embarrassingly bad unless I can say "it was only there for two minutes".
But let's examine the actual problem. It falls into two classes. One is routine defacement, where we are the victim, which we can handle pretty well - standard vandalism. The other is malicious editing, where the victim isn't us but is some third party. This is what Doc and I are concerned about.
We can't really reduce the former, other than by limiting editing - it can happen to any article at any time. But the latter tends to revolve around a (fuzzily-definable) set of articles, which are - broadly speaking - mostly "contemporary individuals", with some "current issues" as well. If not looked after by a reasonably competent editor, these often devolve into hatchet jobs by one or two people with an axe to grind; once a competent editor or three has a hand in the process, though, they're usually not too bad.
A large portion of these are safe - we have enough eyeballs on them that it's virtually impossible to seriously defame [[George W. Bush]] or [[Hillary Clinton]] or even anyone down to about the level of [[Nick Griffin]], for a random example. On the whole, most people with real first-class (or even third-class) importance get enough eyeballs this way; it's a truism to say that the more interesting the topic of an article to the world, the more likely it is to be maintained.
And even articles on trivially notable people - obscure individuals of limited importance, the fundamentally *unimportant* articles - can be made good; all it needs is one person with a sense of decency, some common sense, and the willingness to keep checking. I have dozens or hundreds of these on my watchlist, from an article on a porn starlet where someone keeps trying to add her personal background to a computer theorist who a crank decided is Really A Man And The World Needs To Know.* So do many others; we pick them up in our normal routine.
The problem is, not all of them get this care. We only have so many people willing to maintain things - this isn't a jab at anyone; I've pretty much reached my own limit - and the eternal turnover means that articles which were once "curated" will eventually be unloved again.
So we have these articles which are risky, and of those we have some which are looked after by the community and some which aren't. Of the latter, the subjects themselves - with admirable fortitude - keep some clean, but generally less successfully than we can; and we have a final class of unloved, risky articles.
These are the problem. They are targets for vandalism, and the community lets them sit there. We can mitigate the harm once we become wise to it, and add another article to the list of things our ever-patient users look after, but by then the harm has already been done - and then consider all the ones we never hear about.
So what we should be considering is some way of identifying these "risky" articles and doing something about them. Perhaps the ones at the greatest risk of hatchetry and of the least general importance should be prioritised, if we can figure out an evaluation system. I don't know how we can best deal with these articles. Deleting the extreme cases is one solution, and a tempting one, but deletion is strongly unpopular; merging back into a parent topic only a bit less so.
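
(As a purely illustrative aside: the "evaluation system" could be as simple as scoring each biography by how contentious and how unwatched it is, and ranking the obscure, unwatched ones first. The sketch below is hypothetical - the field names, weights, and proxies (watchers, page views, recent reverts) are invented for illustration, not anything we actually collect - but it shows the shape of the idea in Python:

    # Hypothetical ranking of biographies for review: riskiest and
    # least-watched first. All scoring choices here are assumptions.

    from dataclasses import dataclass

    @dataclass
    class Article:
        title: str
        watchers: int        # editors with the page on a watchlist
        monthly_views: int   # rough proxy for "general importance"
        recent_reverts: int  # reverts in the last 30 days (proxy for contention)
        is_living_person: bool

    def risk_score(a: Article) -> float:
        """Higher means more likely to sit defaced without anyone noticing."""
        if not a.is_living_person:
            return 0.0
        contention = min(a.recent_reverts, 10) / 10      # 0..1
        neglect = 1.0 / (1 + a.watchers)                  # fewer watchers -> higher
        obscurity = 1.0 / (1 + a.monthly_views / 1000)    # fewer readers -> higher
        return contention + neglect + obscurity

    def review_queue(articles: list[Article]) -> list[Article]:
        """Unloved, risky biographies first."""
        return sorted(articles, key=risk_score, reverse=True)

    if __name__ == "__main__":
        sample = [
            Article("Well-watched politician", watchers=400,
                    monthly_views=2_000_000, recent_reverts=8,
                    is_living_person=True),
            Article("Obscure local figure", watchers=1,
                    monthly_views=300, recent_reverts=3,
                    is_living_person=True),
        ]
        for a in review_queue(sample):
            print(f"{a.title}: {risk_score(a):.2f}")

Run against those two made-up records, the obscure, barely-watched biography sorts ahead of the heavily-watched one, which is exactly the ordering we'd want for a review queue. End of aside.)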
But in order to actively reduce the harm we are doing, these are the articles we need to take a good look at; we need to look at the culture we have which thinks "keep, cleanup" is a useful comment at an AFD, and then fails to do anything about it, leading to articles the community vaguely "wants kept" but which are unmaintainable. They are our most "dangerous" articles, and the least critical to our mission.