[WikiEN-l] Fwd: [Foundation-l] Board statement regarding biographies of l...
Delirium
delirium at hackish.org
Thu Apr 23 10:22:24 UTC 2009
David Gerard wrote:
> I think they do have meaning on an objective factual level. e.g. If
> the NYT gets a birthdate wrong and this error is perpetuated, that
> doesn't make it right however well cited it is.
>
> But that's a detail, not the point. Is there a formulation of what I
> said (the necessity of immediatism, the lack of the luxury of
> eventualism) that you'd agree with?
>
Not the person to whom you're replying, but it seems to me to depend on
*why* the article is problematic.
Your immediatism/eventualism distinction seems to me to apply better in the
easier case, which isn't the one you mention with the NYT example: it's
where the problem is clearly our fault, because we've mis-summarized the
sources or otherwise produced a misleading article, which could be
corrected by writing a better one from the same or better reliable
sources. In those cases, the basic problem with the article is something
that would be a problem with any Wikipedia article, and the only
difference with BLPs is that we need to be quicker and stricter about
fixing the problem.
But the particularly problematic cases, the ones you seem to be nodding
towards, are those where we *have*, at least more or less, correctly
summarized the available reliable sources, but it's alleged that the
sources themselves are wrong, despite there being no good contravening
sources. In those cases, it doesn't seem like a problem with the
*encyclopedia* per se, insofar as encyclopedias are just tertiary sources
collecting a bunch of stuff from elsewhere. It's rather a problem with the
rest of the world not having its story right. And it seems a lot trickier
for us to start deciding which things backed by external consensus we're
not going to summarize, on the suspicion that they might be wrong and
harmful if so. These cases are hardly limited to BLPs; I'd say propagating
inflammatory misinformation about, say, a recent massacre is a far more
virulent falsehood, with greater potential for harm, than a lot of the BLP
issues that come up. Is there a general solution to that? It seems to be
rapidly getting outside our competence to try to figure out, even for
cases where misinformation would be harmful, which parts of external
consensus are wrong, and how we could determine it.
-Mark