But it's now being claimed (one might assume, in defense of the
new policy) that disallowing missing User-Agent strings is cutting
20-50% of the (presumably undesirable) load. Which sounds like a
pretty substantial effect. So which is it?
Check the CPU drop on Monday:
Network drop on the API:
You can be sure we didn't need to come up with anything to "defend a new policy".
Presumably some percentage of that 20-50% will come
back as the
spammers realize they have to supply the string. Presumably we
then start playing whack-a-mole.
Yes, we will ban all IPs participating in this.
Presumably there's a plan for what to do when the spammers start
supplying a new, random string every time.
Random strings are easy to identify, fixed strings are easy to verify.
(I do worry about where this is going, though.)
It's going where it always goes: toward proper operation of the website. Been there, done that.
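For context, the policy being debated (rejecting requests that send no User-Agent header) amounts to a single check at the edge of the request path. A minimal sketch, assuming a Python WSGI stack; the thread doesn't say what the site actually runs, and the middleware name is made up for illustration:

```python
# Hypothetical sketch: WSGI middleware that rejects requests with a
# missing or empty User-Agent header, as in the policy discussed above.
def require_user_agent(app):
    def middleware(environ, start_response):
        # WSGI exposes the User-Agent header as HTTP_USER_AGENT.
        if not environ.get("HTTP_USER_AGENT"):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Requests without a User-Agent are not accepted.\n"]
        # Header present: pass the request through unchanged.
        return app(environ, start_response)
    return middleware
```

As the thread notes, this only raises the bar slightly: any client that bothers to send a fixed or random string sails through, which is where the whack-a-mole concern comes from.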