On 7/4/06, Steve Bennett stevage@gmail.com wrote:
Yay, I'm superhuman. And I wasn't even trying! Imagine if I trained a bit...
Never mind that human response time lives in the sub-one-second range. If we had a good enough interface and fast enough servers, you could easily see people moving through ~60 articles a minute, doing subsecond edits, etc.
I could especially see frequent edits from someone fixing spelling or punctuation - pedantic, yes, but definitely adding value to the wiki.
Wikis are designed to be self-healing. Anything that reduces the capacity of the wiki to be edited is contrary to the nature of the wiki. This implies that there will be uneasiness - we don't always WANT things to be in flux! - but that's what a wiki is all about. It's better to have tools that can deal with that - for instance, by mass reverting a user's contributions - rather than penalizing all the innocents out there.
Also - never, never, never make assumptions about what's humanly possible or not. Physically possible, maybe. If it's a real human doing the editing, then they may as well go as fast as they want to, and who are we to tell them it's wrong?
Maybe what would make sense here would be doing something like stochastically requesting captchas from people doing frequent edits. Give them credit the more they answer right, so they get them less frequently. Make sure it doesn't break on massively tabbed edits. ;) Post a safe threshold under which you'll never get a captcha, so bots can fly under the radar if they go slowly.
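To make the idea concrete, here's a minimal sketch of that scheme. All the names and numbers (the safe threshold, the base probability, the credit decay factor) are illustrative assumptions, not anything MediaWiki actually implements:

```python
import random
import time
from collections import defaultdict, deque

# Assumed tuning knobs -- pick whatever the wiki can tolerate.
SAFE_RATE = 2.0     # edits/minute at or under which you NEVER get a captcha
WINDOW = 60.0       # sliding window for measuring edit rate, in seconds
BASE_PROB = 0.5     # captcha probability for a fast editor with no credit
CREDIT_DECAY = 0.8  # each solved captcha multiplies future probability by this

class CaptchaThrottle:
    """Stochastically challenge fast editors; earned credit lowers the odds."""

    def __init__(self, rng=random.random, clock=time.time):
        self.rng = rng
        self.clock = clock
        self.edits = defaultdict(deque)  # user -> edit timestamps in window
        self.credit = defaultdict(int)   # user -> captchas answered correctly

    def record_edit(self, user):
        """Log one edit; return True if this edit should trigger a captcha."""
        now = self.clock()
        q = self.edits[user]
        q.append(now)
        while q and now - q[0] > WINDOW:
            q.popleft()
        rate = len(q) / (WINDOW / 60.0)  # edits per minute
        if rate <= SAFE_RATE:
            return False  # under the published safe threshold: bots can fly low
        # Above the threshold the challenge is random, and accumulated
        # credit shrinks the probability, so trusted humans see fewer.
        prob = BASE_PROB * (CREDIT_DECAY ** self.credit[user])
        return self.rng() < prob

    def captcha_solved(self, user):
        """Reward a correct answer so future challenges come less often."""
        self.credit[user] += 1
```

Because the check is probabilistic rather than a hard lockout, massively tabbed editing just faces an occasional challenge instead of an error page, and a published SAFE_RATE gives well-behaved bots a guaranteed quiet lane.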