Ulrich Fuchs wrote:
I agree that stable and accepted articles are important
to have (for quoting
and so on). I do not agree that defining those versions must be done by
"people with baccalaureate degrees in the subject area". It would be far more
important to get these experts *writing* instead of editing.
I am not in favour of a system where a lot of people drive thousands of
articles to a certain (excellent) state, and a few experts take the credit
by selecting the articles, making a few minor copyedits and then calling
that the "real" encyclopaedia, implicitly stating that the Wikipedia is not
serious at all.
I agree that such a system would be a bad idea, but I think it's
possible to work on a "stable distribution" of the Wikipedia (so to
speak) without that. It could be done within the current Wikipedia
system, but we'd need either a lot of organization or some additional
software help. Basically "this article in its current form is quite
good, and has no major errors" needs to be identified, and included in
the "stable" set. Then updated versions can periodically be added to
the stable set. The difference from the normal Wikipedia will be that
the goal is for every article in the stable set to be "good" at any
particular time, while with the normal Wikipedia any article at any time
could be right in the middle of a half-done revision or move, or could
just have been vandalized, and so on. So if, say, you wanted to publish
a paper version of the Wikipedia, even if you had infinite amounts of
paper, you could never take a snapshot of the active version, because at
any given time a lot of stuff is in flux and not ready to be published.
So basically the Sifter-type idea would take stable snapshots of
articles now and then.
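The "stable set" idea above could be sketched as a simple registry that maps
each article to the specific revision that was judged good, independent of
the live (possibly in-flux) version. This is only an illustrative sketch;
all the names (StableSet, approve, stable_revision) and the revision numbers
are made up, not part of any actual Wikipedia software.

```python
class StableSet:
    """Registry of approved article snapshots for a 'stable distribution'."""

    def __init__(self):
        # title -> revision id of the last snapshot judged good
        self.approved = {}

    def approve(self, title, revision_id):
        """Mark this revision of the article as the current stable snapshot."""
        self.approved[title] = revision_id

    def stable_revision(self, title):
        """Return the approved revision id, or None if not yet in the stable set."""
        return self.approved.get(title)


# The live wiki keeps changing; revision ids here are invented for the example.
live = {"Berlin": 7, "Python": 12}

stable = StableSet()
stable.approve("Berlin", 5)  # revision 5 was reviewed and judged good

# A paper edition would publish revision 5 of "Berlin", not the live
# revision 7, and would skip "Python" entirely until it is approved.
print(stable.stable_revision("Berlin"))  # -> 5
print(stable.stable_revision("Python"))  # -> None
```

Periodically re-approving a newer revision of an article is just another
call to approve(), which matches the "updated versions can periodically be
added to the stable set" step.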
-Mark