I just had an inspiration! Now that I'm thinking about it, if this
counting-url-posted-in-the-last-x-hours thing works out, maybe the best way
to implement it would be for that to be part of the captcha code - if you
try to post the same url anywhere in the wiki more than x times in y hours,
you have to answer a captcha. Perhaps then, at some higher threshold, it
either a) adds the URL to the spam blacklist or b) operates as a short-term
spam blacklist - you can't post it more than some number of times (10? 25?) in y
hours - this gives the community time to respond and blacklist the URL if it
is indeed spam. This still does not address the possibility that some
legitimate need may arise to post a URL onto 100 pages, but it is a much
"softer" yet still effective solution. I'm liking it better and better...
maybe set different thresholds for logged in users vs. anonymous users...
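Just to make the idea concrete, here's a rough sketch of the escalation logic - all names, thresholds, and the time window are made up for illustration, and a real version would live in the wiki's save path and persistent storage rather than in-process memory:

```python
import time
from collections import defaultdict, deque

# Illustrative numbers only - the real x, y, 10, 25 would be tunable,
# and could differ for logged-in vs. anonymous users.
CAPTCHA_THRESHOLD = 3      # more than x posts in the window -> captcha
BLOCK_THRESHOLD = 10       # higher threshold -> short-term "blacklist"
WINDOW_SECONDS = 4 * 3600  # the y-hour sliding window

# url -> timestamps of recent postings (in-memory stand-in for real storage)
_posts = defaultdict(deque)

def check_url(url, now=None):
    """Record one posting attempt of url and return the action to take."""
    now = time.time() if now is None else now
    times = _posts[url]
    # Forget postings that have aged out of the window.
    while times and now - times[0] > WINDOW_SECONDS:
        times.popleft()
    times.append(now)
    if len(times) > BLOCK_THRESHOLD:
        return "block"    # acts as a short-term spam blacklist
    if len(times) > CAPTCHA_THRESHOLD:
        return "captcha"  # poster must answer a captcha to save
    return "allow"
```

So the same URL sails through a few times, then starts costing a captcha, and only gets refused outright once it's clearly being sprayed - which is the window the community would use to decide whether to blacklist it for real.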
Best Regards,
Aerik