On 22/02/07, Parker Peters <parkerpeters1002@gmail.com> wrote:
I'd agree we should be careful about this -- bots have the potential to be very helpful, but rogue or malfunctioning bots likewise have the potential to do some pretty nasty damage. Not so much to pages; sure, it'll be a chore, but we can revert all of that easily enough. I'd be especially concerned about bots that deal with newer users, and how a buggy bot can shape their first experiences on the wiki.
That's one of my concerns, but I'm also concerned about what happens when two bots start making the same change in slightly different ways. Say someone writes a bot to make all date codes uniform, and another bot starts treating those edits as vandalism, or starts "datecoding" text that's inside external links?
To be fair, these are issues pretty much independent of editing speed - doubling the speed just changes the amount of cleanup that needs doing, it doesn't prevent it - and they're one of the main reasons we have a bot approval system. If you want to do something nonstandard, they're going to go over it with a fine-toothed comb... and hopefully conflicts with other bots, or the potential to do Really Stupid Things (like altering source titles), will be spotted.
(In the example given, the obvious questions would be "why are you mass-changing date codes?" and "why this final version?", and if they both get approved with conflicting answers, the system's dropped the ball)
HagermanBot would be my specific example of a bot that doesn't need an artificial limit. Some people have suggested it should wait a little longer before signing (no idea where overall consensus on this is, just pointing it out), but even "wait x seconds before signing" is just a delay on each edit, rather than a limit on the overall editing rate.
HagermanBot is a special case because its task is so specialized. It still makes mistakes sometimes (such as when someone "signs" but then adds a P.S. or something afterward, or signs manually rather than using tildes).
Having it wait X seconds before editing - I'd suggest 60, to give the user time to realize, go back, and sign it themselves before the bot re-checks - is a good thing.
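Roughly, that re-check could look like this - a minimal Python sketch, where `fetch_comment`, `is_signed`, and `sign` are hypothetical stand-ins for whatever the bot actually uses to read and edit the page, and the 60-second value is just the suggestion above:

```python
import time

GRACE_PERIOD = 60  # seconds to wait before signing (illustrative value)

def maybe_sign(fetch_comment, is_signed, sign,
               delay=GRACE_PERIOD, sleep=time.sleep):
    """Wait out the grace period, re-fetch the comment, and sign it
    only if the author still hasn't signed it themselves."""
    sleep(delay)               # give the author time to go back and sign
    comment = fetch_comment()  # re-check the current state of the comment
    if is_signed(comment):
        return comment         # the author got there first; do nothing
    return sign(comment)       # still unsigned after the grace period
```

The `sleep` parameter is injected so the delay can be skipped when testing; on-wiki, `is_signed` would need to recognize an expanded signature (username plus timestamp), not just raw tildes.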
An elegant though computationally trickier approach would be to log all such changes and then apply them as a mass update in batches - every hour or half-hour on a heavily trafficked page, say.
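For what it's worth, that batching idea might be sketched like this (Python; the `BatchSigner` name and the half-hour interval are purely illustrative - a real bot would fold each flushed batch into one combined edit per page):

```python
import time
from collections import defaultdict

class BatchSigner:
    """Sketch: queue pending signatures per page and flush them at a
    fixed interval, so a busy page gets one combined edit rather than
    one edit per unsigned comment."""

    def __init__(self, flush_interval=1800, clock=time.monotonic):
        self.flush_interval = flush_interval  # e.g. 30 minutes
        self.clock = clock                    # injectable for testing
        self.pending = defaultdict(list)      # page -> queued changes
        self.last_flush = {}                  # page -> time of last batch

    def log(self, page, change):
        """Record an unsigned comment instead of editing immediately."""
        self.pending[page].append(change)

    def due(self, page):
        """True once the page has queued changes and the interval is up."""
        last = self.last_flush.get(page, 0.0)
        return bool(self.pending[page]) and \
            self.clock() - last >= self.flush_interval

    def flush(self, page):
        """Drain the queue; the caller applies it as one combined edit."""
        batch, self.pending[page] = self.pending[page], []
        self.last_flush[page] = self.clock()
        return batch
```

The clock is injected so the interval logic can be exercised without waiting; whatever actually applies the flushed batch would still have to handle edit conflicts on a page that busy.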