Like many of us, I host my wiki on a shared server, so I have to be
careful about CPU usage. An attacker has recently been flooding my wiki
with malicious requests. His intention may be to increase CPU usage, slow
down the site, get it kicked off the server - or all of the above. The
last attack started right on the hour and the requests come as frequently
as 4 per second. It can go on for hours. I've gotten CPU usage alerts from
the server company. If I could afford it I would get a dedicated server,
but a malicious flood would be a problem for any server. Here's a sample
from the access logs:
---------------------------
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:47 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
----------------------------
For evidence that this is a malicious attack, you can see:
- the title of the non-existent page
- the requests started right on the hour (for example at 5:00AM or close
to it)
- there's no browser information. Normally the user agent is identified at
the end of each line, but here those quotes are empty.
- there are many requests from the same IP (2 per second on average) and
it goes on for hours.
- an IP whois lookup does not reveal a search engine IP, so it's not a
search engine spider. A spider would also request existing pages.
Given all these points, this is most likely a malicious script. CPU
consumption goes up to 100%, so if this happens frequently the company
could, in the worst case, shut off the site, or in the best case just slow
it down temporarily, which is what they've done in the past. Currently the
server company can manage these attacks somehow, but it means about four
hours of increased CPU usage, after which they start delaying the scripts
and block the attacker's IP address. Right now I'm doing this manually:
monitoring CPU usage, finding the IP in the log, and blocking it in the
.htaccess file.
Previously I've been successful in blocking malicious edit/move floods,
also done by scripts, so now I'm thinking I could add "view/request" flood
protection as well. The scheme would be something like:
- if there are more than X requests from a single IP in 10 minutes, block
that IP for 3 hours
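As an illustration only (this is not MediaWiki code - the function names
and thresholds are my own), the per-IP scheme could be sketched as a
sliding-window counter:

```python
import time
from collections import defaultdict, deque

WINDOW = 10 * 60      # look at the last 10 minutes
MAX_REQUESTS = 300    # "X" - would need tuning against real traffic
BLOCK_FOR = 3 * 3600  # block offenders for 3 hours

requests = defaultdict(deque)  # ip -> timestamps of recent requests
blocked_until = {}             # ip -> unix time the block expires

def allow(ip, now=None):
    """Return True if this request should be served."""
    now = time.time() if now is None else now
    if blocked_until.get(ip, 0) > now:
        return False
    q = requests[ip]
    q.append(now)
    while q and q[0] < now - WINDOW:   # drop timestamps outside the window
        q.popleft()
    if len(q) > MAX_REQUESTS:
        blocked_until[ip] = now + BLOCK_FOR
        return False
    return True
```

In practice the block would be written out to .htaccess (or a firewall
rule) rather than kept in memory, so Apache rejects the requests before
PHP ever runs.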
This may be simple to do, but here are two worst cases for CPU usage:
- if the hacker gets multiple IPs and starts flooding with requests to
existing pages (thus harder to detect the attack)
- if there's a genuine huge traffic surge, for example if a very popular
website linked to us for a day on their front page.
In that case I could count the total number of requests, and if they
exceed a certain number within 15 minutes, deny requests for the next 20
seconds. Genuine visitors in these traffic surges may be turned away, but
that's OK, since CPU usage has to be kept low at any cost. Logged-in
editors who are established users of the site (with a minimum number of
edits) would not be denied access. This would keep CPU consumption low
both in the malicious case and in the genuine case (a huge traffic surge).
That is the priority, because we don't want to make the server company
mad.
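The global throttle could be sketched the same way (again just an
illustration; the threshold and the "established editor" check are
placeholders I made up):

```python
import time

SURGE_WINDOW = 15 * 60   # 15-minute measuring window
SURGE_LIMIT = 20000      # total requests per window - a guess, tune it
DENY_FOR = 20            # seconds to refuse anonymous traffic

window_start = 0.0
window_count = 0
deny_until = 0.0

def is_established_editor(user):
    # placeholder: in MediaWiki this would check the user's edit count
    return user is not None and user.get("edits", 0) >= 50

def allow(user=None, now=None):
    global window_start, window_count, deny_until
    now = time.time() if now is None else now
    if now - window_start > SURGE_WINDOW:      # start a fresh window
        window_start, window_count = now, 0
    window_count += 1
    if window_count > SURGE_LIMIT:
        deny_until = now + DENY_FOR
    if now < deny_until:
        # established editors are exempt from the surge throttle
        return is_established_editor(user)
    return True
```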
Does anyone have any suggestions on how to do this:
- detect and block single IP attacks
- manage huge traffic surges or malicious attacks using multiple IPs
DDoS attacks have been around for a long time, so people have probably
created solutions for them. Maybe there's a server-level program that can
be installed? The server company may not install it, but I could try
asking them.
If there's no automatic solution at the server/OS level, another option is
for me to write a MediaWiki extension that detects the attacks and edits
.htaccess automatically, or denies the IPs some other way. The extension
would check every page view and keep records of IP addresses in tables.
Maybe there would be two tables: a bigger one for monitoring the traffic,
and a smaller one for blocked IPs.
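A sketch of the two-table idea, using SQLite here purely to illustrate
the schema and queries (a real extension would go through MediaWiki's own
database layer, and all the names are made up):

```python
import sqlite3, time

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE traffic_log (   -- big table: one row per request, pruned often
    ip TEXT NOT NULL,
    ts REAL NOT NULL
);
CREATE INDEX idx_traffic_ip_ts ON traffic_log (ip, ts);
CREATE TABLE blocked_ips (   -- small table: currently blocked addresses
    ip TEXT PRIMARY KEY,
    blocked_until REAL NOT NULL
);
""")

WINDOW, LIMIT, BLOCK_FOR = 600, 300, 3 * 3600

def record_and_check(ip, now=None):
    """Log one page view; return True if the IP may be served."""
    now = time.time() if now is None else now
    row = db.execute("SELECT blocked_until FROM blocked_ips WHERE ip = ?",
                     (ip,)).fetchone()
    if row and row[0] > now:
        return False
    db.execute("INSERT INTO traffic_log (ip, ts) VALUES (?, ?)", (ip, now))
    db.execute("DELETE FROM traffic_log WHERE ts < ?", (now - WINDOW,))
    (count,) = db.execute(
        "SELECT COUNT(*) FROM traffic_log WHERE ip = ? AND ts >= ?",
        (ip, now - WINDOW)).fetchone()
    if count > LIMIT:
        db.execute("INSERT OR REPLACE INTO blocked_ips VALUES (?, ?)",
                   (ip, now + BLOCK_FOR))
        return False
    return True
```

Pruning the big table on every request keeps it from growing without
bound; the small blocked-IPs table is what would get exported to
.htaccess.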
For genuine traffic surges I have some questions on how to make the site
faster, but I'll ask them in a later email. I've tried enabling caches and
so on.
If anyone has any suggestions for how to deal with these kinds of DDoS
attacks, I would be grateful.
thanks
Dan