I think this is a good approach; for defeating a severe distributed
attack, it should also be possible to prevent both editing by *all* IPs
and the creation of new user accounts (to prevent a bot creating a sock
puppet, then spamming under that user name) for a limited time.
Additionally, during such an emergency, all prevented edits could be
cached and then manually approved. That would prevent having to clean up
a real flood, should the need ever arise.
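The emergency lockdown plus approval queue described above could be sketched roughly as follows (all names here are hypothetical, not existing MediaWiki code):

```python
# Hypothetical sketch: during an emergency lockdown, edits are not
# applied but held in a queue for manual approval by an admin.
class EmergencyQueue:
    def __init__(self):
        self.lockdown = False
        self.pending = []          # (user, page, text) tuples awaiting review

    def submit_edit(self, user, page, text):
        if self.lockdown:
            self.pending.append((user, page, text))
            return "queued"        # edit held back, nothing is published
        return "saved"             # normal operation: edit goes through

    def approve(self, index):
        # A human reviewer releases one queued edit for publication.
        return self.pending.pop(index)
```

Nothing here is lost during the attack; cleanup becomes a matter of discarding queued spam rather than reverting live pages.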
Magnus
Jimmy Wales wrote:
Evan Prodromou wrote:
http://meta.wikipedia.org/wiki/Edit_throttling
It's not too effective against distributed attacks, though, since it
only throttles a given user or IP address.
*nod*
Did you see my idea of a table like this?
user      page     throttle   expiration
jwales    DNA      2          (timestamps go here)
*         Israel   3
jwales    Turkey   0
plautus   *        0
wik       *        3
The idea is that for user/page combos, which could possibly include
wildcards (or even regexps, although simplicity is a virtue and the full
power of regexps might be overkill), there could be a throttle: the
number of edits permitted per unit of time, or something similar.
What interests me about this way of looking at it is that it is a
generalization of what we already do, i.e. site bans and page protection
are both just special cases.
plautus * 0 says he can't edit anything.
* Ham 0 says that [[Ham]] is protected.
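A minimal sketch of how such a table could be consulted on save, using glob-style wildcards for the user and page patterns (the table rows are Jimbo's examples above; the one-hour window and all function names are assumptions for illustration):

```python
import time
from fnmatch import fnmatch  # simple glob-style wildcard matching

# Rows: (user_pattern, page_pattern, max_edits_per_window).
# A throttle of 0 is a ban/protection; no matching row means unlimited.
THROTTLE_TABLE = [
    ("jwales",  "DNA",    2),
    ("*",       "Israel", 3),
    ("jwales",  "Turkey", 0),
    ("plautus", "*",      0),   # site ban as a special case
    ("*",       "Ham",    0),   # page protection as a special case
]

WINDOW = 3600                   # assumed unit of time: edits per hour
recent_edits = {}               # (user, page) -> list of edit timestamps

def may_edit(user, page, now=None):
    now = now if now is not None else time.time()
    for user_pat, page_pat, limit in THROTTLE_TABLE:
        if fnmatch(user, user_pat) and fnmatch(page, page_pat):
            # first matching row wins
            stamps = [t for t in recent_edits.get((user, page), [])
                      if now - t < WINDOW]
            return len(stamps) < limit
    return True                 # no row matches: unthrottled

def record_edit(user, page, now=None):
    now = now if now is not None else time.time()
    recent_edits.setdefault((user, page), []).append(now)
```

So `may_edit("plautus", "Anything")` is False (site ban), `may_edit("anyone", "Ham")` is False (protection), and jwales can edit [[DNA]] until two timestamps accumulate within the window.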
---
For a distributed spambot attack, I don't know if this is helpful. I
just wanted lots of people to see it and tell me what's wrong with it.
Perhaps a spambot attack could be taken care of with a "special case"
ad hoc technique, for example: if the body of the article contains
this string (the spammer's URL) on a save of an edit, then we enter
this IP number into the throttle table like this:
129.79.1.1 * 1
with an expiration of, say, 24 hours.
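That trigger could be sketched like this (the spam string is a placeholder, and the table format follows the example row above; none of this is existing code):

```python
import time

# Hypothetical sketch of the ad-hoc spambot response: on save, if the
# article body contains a known spammer URL, add a throttle row for the
# saving IP covering all pages, expiring after 24 hours.
SPAM_STRINGS = ["http://spammer.example/"]   # placeholder blacklist
throttle_table = []   # rows: (user_or_ip, page_pattern, limit, expires)

def check_save(ip, body, now=None):
    now = now if now is not None else time.time()
    for s in SPAM_STRINGS:
        if s in body:
            # e.g. the row "129.79.1.1  *  1" from the example above
            throttle_table.append((ip, "*", 1, now + 24 * 3600))
            return False        # reject this save
    return True                 # clean edit, allow it
```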
--Jimbo
_______________________________________________
Wikitech-l mailing list
Wikitech-l@Wikipedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l