On 25/02/2008, Gregory Maxwell <gmaxwell@gmail.com> wrote:
> Last time I did a big scrub I found plenty of overt nastiness (direct links to trojans, browser-crashing popup spam sites, etc.) that had been in articles for long spans of time.
> First there needs to come some level of recorded review/oversight. Simply trusting that the links will get seen, as is done today, is demonstrably failure-prone. Once that exists, teaching nofollow to follow it is 'just' technical details.
I think that first we need a working, protected whitelist.
Once there's a whitelist, it shouldn't be too hard to come up with a policy for adding links to it, and if necessary somebody could write a bot to help do that.
It seems reasonable that well-established users could place a comment mark next to a link that is good; a bot could then check that these marks were placed by trusted users and add the links to the whitelist, and if all the marks on a link were removed, it would be delisted.
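For concreteness, here's a minimal Python sketch of the kind of bot I mean. The <!-- linkok:Username --> comment syntax and the trusted-user roster are made up for illustration; neither is an existing MediaWiki feature, and a real bot would pull the roster from a protected page.

import re

TRUSTED_USERS = {"ExampleAdmin", "ExampleReviewer"}  # assumed roster

# An external URL immediately followed by a hypothetical approval comment.
MARK_RE = re.compile(r"(https?://\S+)\s*<!--\s*linkok:(\w+)\s*-->")

def collect_marks(pages):
    """Aggregate {url: set of approving users} across all page texts."""
    marks = {}
    for text in pages:
        for url, user in MARK_RE.findall(text):
            marks.setdefault(url, set()).add(user)
    return marks

def rebuild_whitelist(pages):
    """A link stays whitelisted while at least one trusted user's mark
    survives somewhere; removing every mark delists it automatically,
    since the whitelist is rebuilt from scratch on each run."""
    marks = collect_marks(pages)
    return {url for url, users in marks.items() if users & TRUSTED_USERS}

if __name__ == "__main__":
    pages = [
        "https://example.org/paper.pdf <!-- linkok:ExampleAdmin -->",
        "https://spam.example.net/ <!-- linkok:RandomNewUser -->",
    ]
    print(rebuild_whitelist(pages))
    # {'https://example.org/paper.pdf'}

Rebuilding the list on every run is the simplest way to get the delisting behaviour for free, since a link with no surviving trusted mark just never makes it back in.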
But the policy is less relevant than the whitelist; once there's a functioning whitelist, I'm sure we could rustle up a policy.