On Thu, Jul 26, 2012 at 5:00 AM, Derric Atzrott
<datzrott@alizeepathology.com> wrote:
This way, if people feel motivated to cheat at the captcha, they will end up
helping Wikipedia. It is up to us to try to balance things out.
I'm pretty sure users will be less annoyed at solving captchas that
actually contribute some value.
Obligatory XKCD:
https://xkcd.com/810/
;-)
The best CAPTCHAs are the kind that do this. Look at how hard it is to beat
reCAPTCHA because they have taken this approach. One must be careful, though,
that the CAPTCHA is constructed so that it cannot be defeated by a simple
lookup and will actually require some thought (so that probably eliminates
the noun, verb, adjective idea).
This idea has my support.
We should use fewer CAPTCHAs.
If the problem is spam, we should build better "new URL" review
systems. There are externally managed spam lists that we could use to
identify spammers.
'New URLs' could be defined as domain names that have not been in the
external links table for more than 24 hrs.
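As a rough sketch of that check (assuming we keep a first-seen timestamp
per domain; the names here are only illustrative, not the real MediaWiki
schema):

    import datetime

    NEW_URL_WINDOW = datetime.timedelta(hours=24)

    def is_new_url(domain, first_seen_by_domain, now=None):
        # first_seen_by_domain is a hypothetical mapping of
        # domain -> datetime of when the domain first appeared in the
        # external links table.  A domain counts as a "new URL" until
        # it has been in that table for more than 24 hours.
        now = now or datetime.datetime.utcnow()
        first_seen = first_seen_by_domain.get(domain)
        if first_seen is None:
            return True  # never linked before
        return (now - first_seen) < NEW_URL_WINDOW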
Addition of these new URLs could be smartly throttled.
Edits by un-autoconfirmed users which include 'new URLs' could be throttled
so that each new URL can only be added to a single article for the first 24
hours. That allows a new user to make use of a new domain name unimpeded;
however, they can only use it on one page for the first 24 hrs. If the new
URL was spam, it will hopefully be removed within 24 hrs, which resets the
clock for the spammer, i.e. they can only add the spam to one page every
24 hrs.
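A minimal sketch of that throttle, assuming a small per-domain record of
where and when the domain was first added (again, the names are
hypothetical):

    import datetime

    WINDOW = datetime.timedelta(hours=24)

    def allow_new_url_edit(domain, page, first_use_by_domain, now=None):
        # first_use_by_domain is a hypothetical store mapping
        # domain -> (first_page, first_added_at).  A new domain may only
        # be added to the page it was first used on during its first
        # 24 hours; after that it is no longer throttled.
        now = now or datetime.datetime.utcnow()
        entry = first_use_by_domain.get(domain)
        if entry is None:
            first_use_by_domain[domain] = (page, now)  # first use
            return True
        first_page, first_added_at = entry
        if now - first_added_at >= WINDOW:
            return True  # domain is no longer "new"
        return page == first_page  # within 24 hrs: only the original page

If the spam is reverted and the domain drops out of the external links
table, its record would be cleared, which is the "resets the clock"
behaviour above.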
Another idea is for the wiki to ask a user who adds new URLs to review three
recent edits that included new URLs, and to indicate whether or not each new
URL was spam and should be removed.
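Roughly (again with hypothetical names), picking the review tasks could be
as simple as:

    import random

    def pick_review_tasks(recent_new_url_edits, count=3):
        # recent_new_url_edits is a hypothetical list of (page, url)
        # pairs that added new URLs and still need a spam/not-spam
        # verdict from a reviewer.
        return random.sample(recent_new_url_edits,
                             min(count, len(recent_new_url_edits)))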
This may be unworkable because a spam-bot could use the linksearch
tool to check whether a link is good or not.
--
John Vandenberg