Toby Bartels wrote:
Neil Harris wrote in part:
How about, as was suggested before by someone
else, a new namespace:
something like "PDresource:", which auto-uploaders could use to get
PD info into a Wikified format without polluting the main namespace,
and which human contributors could then use as source material for
the 'pedia?
Something like this would be a good idea.
There's a difference between a direct upload
and the ordinary draft status of most of Wikipedia,
where the contributor at least thinks that the article is good,
so far as it's been written
(and hopefully leaves notes about the gaps).
If we had a "review score" system, then we could say that
* a page generated by a human edit starts with a review score of 1:
that is to say, at least one human being (the original poster) thought
that it was good.
* a page generated by a bot starts with a review score of 0, since no
human being has reviewed it
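To make that concrete, here is a rough Python sketch; the field and
function names are invented for illustration, not anything in
MediaWiki:

    def initial_review_score(created_by_bot):
        # 1 means at least one human (the original poster) vouched
        # for the page; 0 means no human has looked at it yet.
        return 0 if created_by_bot else 1

    new_page = {"title": "Example",
                "review_score": initial_review_score(created_by_bot=True)}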
More than this, we could auto-detect bots by setting a minimum
inter-edit threshold of, say, 10 seconds, and marking all rapid
sequences of edits by a user, logged in or not, as bot-generated.
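Something like this, say (a sketch only; the 10-second figure is from
the proposal above, but the data layout is just an assumption):

    from datetime import timedelta

    MIN_HUMAN_GAP = timedelta(seconds=10)

    def flag_rapid_edits(edits):
        # `edits` is a list of (user, timestamp, record) tuples sorted
        # by timestamp, where record is a mutable dict; this schema is
        # invented for the sketch.
        last = {}  # user -> (timestamp, record) of their previous edit
        for user, ts, record in edits:
            prev = last.get(user)
            if prev is not None and ts - prev[0] < MIN_HUMAN_GAP:
                # both ends of a rapid pair belong to the bot-like run
                prev[1]["bot_generated"] = True
                record["bot_generated"] = True
            last[user] = (ts, record)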
Then, humans browsing the Wikipedia would normally operate at a review
cutoff level of 1, seeing all human-contributed pages and no
bot-contributed pages. This would also mean that casual visitors would
never edit bot-generated pages, since they would never see them: if
they created a page with the same title as a bot-created one, they
would supersede it without ever knowing. This follows the principle
that human contributions take precedence over mechanical article
creation. Only logged-in users who deliberately wanted to would set
their review cutoff to 0, and so be able to see and edit
bot-contributed pages.
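A sketch of the filtering and supersede logic, again with invented
names and a toy in-memory page table:

    def visible_pages(pages, cutoff=1):
        # `pages` maps title -> {"text": ..., "review_score": ...};
        # the default cutoff of 1 hides every never-reviewed bot page.
        return {t: p for t, p in pages.items()
                if p["review_score"] >= cutoff}

    def create_page(pages, title, text, author_is_bot=False):
        existing = pages.get(title)
        if author_is_bot and existing is not None \
                and existing["review_score"] >= 1:
            return  # a bot never overwrites a human-touched page
        # a human creation silently supersedes any score-0 bot draft:
        # the contributor never saw it, so no merge step is needed
        pages[title] = {"text": text,
                        "review_score": 0 if author_is_bot else 1}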
The scheme could be extended: the review score could be bumped by
counting page views and edits, and applying weighted or thresholded
increments.
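For instance (the weights and the cap here are placeholders, just to
show the shape of the idea, not part of the proposal itself):

    def bump_score(page, views=0, edits=0):
        # views count a little, human edits count a lot, and the cap
        # acts as a threshold on how far passive traffic alone can
        # push a score
        VIEW_WEIGHT, EDIT_WEIGHT, CAP = 0.01, 1.0, 5.0
        page["review_score"] = min(
            CAP,
            page["review_score"]
            + views * VIEW_WEIGHT
            + edits * EDIT_WEIGHT)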
Neil