I've whipped up a fairly basic rate limiter which can be used to provide a brake to mass-floods of edits or page moves.
It's experimental and probably still needs some work. Since it's relatively self-contained and I think some people would like to have it yesterday ;) I've gone ahead and checked it into the REL1_4 branch as well as HEAD.
This is not a comprehensive antispam or antivandalism solution; it's part of a soft security system to keep things from getting too far out of human control. For instance, you can specify that a new user account may perform at most 2 page moves in 90 seconds (or 1 in 3600 seconds ;), so a malicious script would not be able to flood things as easily at a rate of, say, one move per second.
Currently it requires using memcached, though that's not really necessary and will be fixed soon.
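To make the general technique concrete: a memcached-backed limiter of this kind typically just keeps a per-user counter with an expiry equal to the window, increments it on each action, and refuses the action once the counter exceeds the allowed count. Here's a minimal self-contained sketch in Python of that counter-with-expiry approach (an illustration only, not the actual MediaWiki code; the class and method names are made up, and a plain dict stands in for memcached):

```python
import time

class FixedWindowRateLimiter:
    """Illustrative sketch of a counter-with-expiry rate limiter.

    The store maps a key (e.g. a user or IP) to (count, window expiry).
    In the real setup, memcached would hold these counters with a TTL;
    here a plain dict plays that role so the example is self-contained.
    """

    def __init__(self, max_actions, window_seconds):
        self.max_actions = max_actions
        self.window = window_seconds
        self._store = {}  # key -> (count, expiry timestamp)

    def attempt(self, key, now=None):
        """Return True if the action is allowed, False if rate-limited."""
        now = time.time() if now is None else now
        count, expiry = self._store.get(key, (0, 0.0))
        if now >= expiry:
            # Window has lapsed: start a fresh counting window.
            count, expiry = 0, now + self.window
        if count >= self.max_actions:
            return False  # over the limit; caller shows an error page
        self._store[key] = (count + 1, expiry)
        return True
```

With `FixedWindowRateLimiter(2, 90)`, a new account's first two page moves go through, the third within the window is refused, and retrying after the window expires succeeds again; that retry-after-expiry behavior is what makes "keep reloading until the limit period expires" work.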
Over the next few days I'll also be working on improvements to the open proxy blacklist checking and the shared spam URL blacklist system.
-- brion vibber (brion @ pobox.com)
Brion Vibber wrote:
> I've whipped up a fairly basic rate limiter which can be used to provide a brake to mass-floods of edits or page moves.
That is great news. However, I have one little concern. Please could you try to make sure that, if a human editor hits the rate limit, they can just press "reload" to resubmit the same edit or move? You see, I will often move or edit several pages in several tabs at the same time, and this way I can very easily hit the rate limit.
Timwi
Timwi wrote:
> Brion Vibber wrote:
>> I've whipped up a fairly basic rate limiter which can be used to provide a brake to mass-floods of edits or page moves.
> That is great news. However, I have one little concern. Please could you try to make sure that, if a human editor hits the rate limit, they can just press "reload" to resubmit the same edit or move? You see, I will often move or edit several pages in several tabs at the same time, and this way I can very easily hit the rate limit.
That's exactly what I aimed for, and at least with Firefox that's just the way you can do it. I haven't tested IE etc but I _think_ it's got the same form resubmission behavior.
Of course if you're still over the limit you'll still get an error page, but you can keep reloading until the limit period expires. :)
-- brion vibber (brion @ pobox.com)
Brion Vibber wrote:
> Timwi wrote:
>> Brion Vibber wrote:
>>> I've whipped up a fairly basic rate limiter which can be used to provide a brake to mass-floods of edits or page moves.
>> That is great news. However, I have one little concern. Please could you try to make sure that, if a human editor hits the rate limit, they can just press "reload" to resubmit the same edit or move? You see, I will often move or edit several pages in several tabs at the same time, and this way I can very easily hit the rate limit.
> That's exactly what I aimed for, and at least with Firefox that's just the way you can do it. I haven't tested IE etc but I _think_ it's got the same form resubmission behavior.
Thanks, that's good to know.
As for IE, I know that you can use "refresh" to reload a POST request. You will get a confirmation message pretty similar to the one in Firefox. However, IE doesn't allow you to use the "back" button to return to the result page of a POST request: you get an error message telling you that the data is no longer available, and if you try to reload it, it resubmits the request as a GET request. (I know. I don't know why anyone uses this crap either. :-))
Timwi
wikitech-l@lists.wikimedia.org