> Like many of us, I have my wiki hosted on a shared server, so I have to be
> careful about CPU usage. There's a hacker/attacker who has recently been
> flooding my wiki with malicious requests. His intentions may be...
I feel your pain. Don't bother guessing intentions. Don't take it personally. It's most likely just a bot, following some algorithm. (Well, unless you've done something to piss someone off, or if your wiki is highly controversial.)
> ... somehow block the attacker's IP address. I'm doing this manually right
> now in the .htaccess file, by monitoring CPU usage, checking the IP in the
> log and blocking it in .htaccess.
Ugh. Not the best way to do it, as they've already caused significant CPU usage before you even figure out the IP.
Is this a virtual server? If so, the place to block it is the firewall, using ipfw(8). You can set it up so that no reply is sent to the attacker's packets at all, which has the significant advantage that you actually slow the attacker down, since they have to wait for a TCP lost-packet timeout.
Anything else you do slows you down while your defensive measures are executing.
This is probably not the best place to get advice on ipfw(8), but take my word for it, that's the place to do it, at the TCP/UDP level. I'd google around for things like "ipfw denial of service attack" and such.
I've had great (but temporary) success blocking spam that way, by using ipfw(8) to block port 25 access from huge address ranges in parts of the globe I don't expect email from -- like China. If your wiki is English-oriented and non-global in nature, perhaps you can block access from big foreign address ranges to ease the problem.
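For illustration only, such rules can look like this (ipfw(8) syntax; the rule numbers and addresses are placeholders, not from any real setup):
---------------------------
# Silently drop everything from one attacking address. No reply is sent,
# so the attacker is left waiting for a TCP timeout.
ipfw add 100 deny ip from 203.0.113.45 to any
# Drop inbound SMTP from a large address range you never expect mail from.
ipfw add 200 deny tcp from 58.0.0.0/8 to any 25
---------------------------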
Good luck!
----------------
:::: Entirely new ways of living are necessary, and if we don't adopt them voluntarily, we or our children will eventually adopt them involuntarily, and probably with great pain and difficulty in the process. -- Thom Hartmann
:::: Jan Steinman, EcoReality Co-op ::::
Recently I tried upgrading to MediaWiki 1.19 on a shared server (hosted by Dreamhost). After upgrading to 1.19, my site was so slow that it was barely operable. Pages would rarely load successfully, and most requests would eventually end in a 500 error.
Here is an excerpt from the log kept by our Process Watcher, the daemon that is
killing your troublesome php5.cgi processes:
Tue Aug 7 11:15:29 2012 procwatch3 INFO: PID 27983 (php5.cgi) kingkwon:pg505086 - 36.4MB ram, 2.73 sec cpu [idle php]: killed for uid ram
Tue Aug 7 11:15:49 2012 procwatch3 INFO: PID 29184 (php5.cgi) kingkwon:pg505086 - 36.7MB ram, 1.12 sec cpu: killed for uid ram
Tue Aug 7 11:15:49 2012 procwatch3 INFO: PID 29073 (php5.cgi) kingkwon:pg505086 - 36.6MB ram, 0.89 sec cpu: killed for uid ram
Tue Aug 7 11:15:59 2012 procwatch3 INFO: PID 28477 (php5.cgi) kingkwon:pg505086 - 35.4MB ram, 2.48 sec cpu [idle php]: killed for uid ram
Tue Aug 7 11:15:59 2012 procwatch3 INFO: PID 29079 (php5.cgi) kingkwon:pg505086 - 33.5MB ram, 1.40 sec cpu [idle php]: killed for uid ram
Kill counts from prior days:
/var/log/procwatch.log.1.gz:7102
/var/log/procwatch.log.2.gz:4569
/var/log/procwatch.log.3.gz:38
Let me give you a bit of background information:
-My site: Koreanwikiproject.com
-We get around 1,400 hits a day.
-I would occasionally have problems with slow loading and processes getting killed. I originally thought this was due to people using the PDF converter extension. However, when 1.19 was installed I had hardly any extensions at all, since I was doing a fresh install and then copying my image files over.
-No caching has been enabled (see the sketch after this list).
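(For concreteness, these are the sorts of LocalSettings.php cache settings I mean; none of them are set on my install, and the values here are just illustrative:)
---------------------------
$wgMainCacheType      = CACHE_ACCEL;   // use APC/XCache if the host provides it
$wgCacheDirectory     = "$IP/cache";   // keep the localisation cache on disk
$wgUseFileCache       = true;          // serve rendered pages to anonymous users
$wgFileCacheDirectory = "$IP/cache";
---------------------------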
Any help would be greatly appreciated. I will eventually be moving to a VPS, but I'd like to make sure this same problem doesn't carry over to the VPS.
Thank you,
Chris
On Sun, Aug 12, 2012 at 8:16 PM, Arthur Richards
<arichards(a)wikimedia.org> wrote:
> It would be awesome if we could put together a devroom at FOSDEM.
Agreed. If someone is going to do it, I'd love to be involved. I've helped with
the cross-desktop devrooms in previous years.
> Over the last couple of years, FOSDEM has become my favorite
> conference. The ethos of the conference is fantastic - totally
> grassroots, transparent, and open. It draws an unbelievable crowd. The
> technical breadth and depth of the talks is generally impressive. And
> the Wikimedia/Mediawiki-related talks pack the rooms - at least they
> did the last couple of years. We should have a much bigger presence at
> this event - from my perspective, it seems like it is a fantastic
> learning, community building, and recruiting opportunity - perhaps
> even more so than most of the other conferences at which we have a
> presence.
>
> If folks think this would be something cool to do, it might also be
> worth teaming with some other similarly-minded orgs with some overlap
> - like Mozilla, Creative Commons, OLPC, CiviCRM, etc. From the
> invitation for proposals, it sounds like this would increase our odds
> at securing a devroom, it would certainly help us further
> cross-pollinate, and ultimately strengthen the broader open source
> community.
Mozilla had their own room in previous years. In general teaming up
with other like-minded projects does increase the chances.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a non-profit by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Like many of us, I have my wiki hosted on a shared server, so I have to be
careful about CPU usage. There's a hacker/attacker who has recently been
flooding my wiki with malicious requests. His intentions may be to increase
CPU usage, slow down the site, get it kicked off the server - or all of
those. The last attack started right on the hour, and the requests come as
often as 4 per second. It can go on for hours. I've gotten CPU usage alerts
from the server company. If I could afford it I would get a dedicated
server, but a malicious flood would be a problem for any server. Here's a
sample from the access logs:
---------------------------
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:46 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
x.x.x.x - - [8/Aug/2012:05:02:47 -0500] "GET /wiki/NON_EXISTING_PAGE
HTTP/1.1" 301 - "thewikisite.net" ""
----------------------------
For evidence that this is a malicious attack, you can see:
- the title of the non-existing page
- the fact that the requests started right on the hour (for example 5:00AM
or close to it)
- there's no browser information. Usually the user agent is identified at
the end of each line, but those quotes are empty in this case.
- There are many requests from the same IP (2 per second on average) and it
goes on for hours.
- IP whois does not reveal a search engine IP, so it's not a search-engine
spider. A spider would also request existing pages.
Given all these points, this is most likely a malicious script. CPU
consumption goes up to 100%, so if this happens frequently, the company
could shut off the site in the worst case, or just slow it down temporarily
in the best case, which is what they've done in the past. Currently the
server company can manage these attacks somehow, but it means about four
hours of increased CPU usage, after which they start delaying the scripts
and somehow block the attacker's IP address. I'm doing this manually right
now in the .htaccess file, by monitoring CPU usage, checking the IP in the
log, and blocking it in .htaccess (sketch below).
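For reference, the manual block looks something like this in .htaccess (Apache 2.2 syntax; the address is a placeholder):
---------------------------
# Deny a single attacking IP; one "Deny from" line per address.
Order Allow,Deny
Allow from all
Deny from 203.0.113.45
---------------------------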
Previously I've been successful in blocking malicious edit/move floods, also
done by scripts, so now I'm thinking I could try adding "view/request" flood
protection as well. I'm thinking of a scheme like:
- If there are more than X requests from a single IP in 10 minutes, block
that IP for 3 hours (a sketch of this follows).
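A minimal sketch of that scheme in PHP, assuming the APC extension is available for counters (the thresholds and key names are invented for illustration):
---------------------------
$ip        = $_SERVER['REMOTE_ADDR'];
$window    = 600;        // the 10-minute counting window, in seconds
$threshold = 300;        // "X": requests allowed per window
$blockTime = 3 * 3600;   // block offenders for 3 hours

// Refuse already-blocked addresses as cheaply as possible.
if ( apc_fetch( "blocked:$ip" ) !== false ) {
    header( 'HTTP/1.1 403 Forbidden' );
    exit;
}

$count = apc_fetch( "hits:$ip" );
if ( $count === false ) {
    apc_store( "hits:$ip", 1, $window );       // first hit opens a new window
} elseif ( $count + 1 > $threshold ) {
    apc_store( "blocked:$ip", 1, $blockTime ); // over the limit: block the IP
} else {
    apc_inc( "hits:$ip" );
}
---------------------------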
This may be simple to do, but here are two worst cases for CPU usage:
- if the hacker gets multiple IPs and starts flooding with requests for
existing pages (making the attack harder to detect)
- if there's a genuine huge traffic surge, for example if a very popular
website linked to us for a day on their front page.
In those cases I could count the total number of requests, and if they
exceed a certain number within 15 minutes, I would deny requests for the
next 20 seconds. Genuine visitors in these traffic surges may be denied
access, but that's OK, since CPU usage has to be kept low at any cost.
Logged-in editors who are established users of the site (with a minimum
number of edits) would not be denied access. This would keep CPU consumption
low both in the malicious case and in the genuine case (a huge traffic
surge). That is the priority, because we don't want to make the server
company mad.
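The site-wide throttle could be sketched the same way (thresholds invented; $userIsEstablished stands in for a check of the logged-in user's edit count):
---------------------------
$limit   = 5000;  // max requests per 15-minute window, picked arbitrarily
$window  = 900;
$lockout = 20;    // deny everyone else for 20 seconds once tripped
$userIsEstablished = false; // assumption: true for editors with enough edits

if ( apc_fetch( 'global:locked' ) !== false && !$userIsEstablished ) {
    header( 'HTTP/1.1 503 Service Unavailable' );
    header( 'Retry-After: 20' );
    exit;
}
$total = apc_fetch( 'global:hits' );
if ( $total === false ) {
    apc_store( 'global:hits', 1, $window );
} elseif ( $total + 1 > $limit ) {
    apc_store( 'global:locked', 1, $lockout ); // trip the short lockout
} else {
    apc_inc( 'global:hits' );
}
---------------------------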
Does anyone have any suggestions on how to do this:
- detect and block single IP attacks
- manage huge traffic surges or malicious attacks using multiple IPs
DDoS attacks have been going on for a long time, so perhaps people have
already created solutions for them. Maybe there's a server-level program
that can be installed? The server company may not install it, but I could
try asking them.
If there's no automatic solution at the server/OS level, another option is
for me to write a MediaWiki extension to detect the attacks and edit
.htaccess automatically, or deny the IPs some other way. The extension would
check every page view and keep records of IP addresses in tables. Maybe
there would be two tables: a bigger one for monitoring the traffic, and a
smaller one for blocked IPs (see the sketch below).
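A rough sketch of that extension's entry point; the BeforeInitialize hook is real MediaWiki, but the class and table names here are invented:
---------------------------
$wgHooks['BeforeInitialize'][] = 'RequestGuard::onBeforeInitialize';

class RequestGuard {
    public static function onBeforeInitialize( &$title, &$article, &$output,
        &$user, $request, $mediaWiki
    ) {
        $dbw = wfGetDB( DB_MASTER );
        $ip  = $request->getIP();

        // The smaller table: blocked IPs, checked on every request.
        if ( $dbw->selectField( 'guard_blocked', '1',
            array( 'gb_ip' => $ip ), __METHOD__ ) ) {
            header( 'HTTP/1.1 403 Forbidden' );
            exit;
        }

        // The bigger monitoring table; a periodic job would aggregate it
        // and move offenders into guard_blocked (or write them to .htaccess).
        $dbw->insert( 'guard_hits',
            array( 'gh_ip' => $ip, 'gh_ts' => wfTimestampNow() ),
            __METHOD__ );
        return true;
    }
}
---------------------------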
For genuine traffic surges, I have some questions on how to make the site
faster, but I'll ask them in a later email. I've tried to enable caches and
so on.
If anyone has any suggestions for how to deal with these kinds of DDoS
attacks, I would be grateful.
thanks
Dan
Hi Tomas,
Since your project includes Semantic MediaWiki specifically, SMW actually
has a repository for just such a thing - see here:
http://smw.referata.com/wiki/Category:Packages
As you can see, only one such "package" has ever been created; its homepage
is here:
http://wiki.creativecommons.org/CcTeamspace
It was created in 2008 and is significantly out of date. Still, it was done
nicely, and it's a good example of how such things can be put together.
(And it did get a good amount of use in its first few years.)
-Yaron
--
WikiWorks · MediaWiki Consulting · http://wikiworks.com
Forwarded from FOSDEM(a)lists.fosdem.org. Subscribe at
https://lists.fosdem.org/listinfo/fosdem.
Siebrand
---------------------------- Original Message ----------------------------
Subject: [FOSDEM] FOSDEM calls for devroom organizers and main track speakers
From: "Tias Guns" <tias(a)fosdem.org>
Date: Sun, August 12, 2012 14:40
To: "Fosdem Announce" <fosdem(a)lists.fosdem.org>
--------------------------------------------------------------------------
<<< help spread the word and make FOSDEM awesome >>>
FOSDEM is a non-commercial event offering open source communities a
place to meet, share ideas and collaborate. It is renowned for being
highly developer-oriented and brings together 5000+ geeks from all over
the world. FOSDEM will take place in Brussels, Belgium on the 2nd and
3rd of February 2013.
We invite proposals for *devrooms* and *main track talks*:
*Main Track Talks*
The main tracks host high-quality seminars for a broad and technical
audience. Every track is organized around a theme (security, kernel,
collaboration, ...). They are held in the two biggest auditoria and last
50 minutes. Each of the talks is given by a speaker who gets their
travel and accommodation costs reimbursed.
To apply for a FOSDEM Main Track talk, visit
https://fosdem.org/2013/call_for_main_speakers.html
To suggest a main track speaker that we should invite, mail
program(a)fosdem.org
*Devrooms*
A devroom is a 'developer room' in which open source communities can
organize their own schedule, made of presentations, brainstorming and
hacking sessions. Our goal is to stimulate developer collaboration and
cross-pollination between projects.
Each year we receive more requests than we can host. To better achieve
our goals, preference will be given to *proposals involving multiple,
collaborating projects*. Projects with similar goals/domains that make
separate requests will be asked to co-organize a devroom under their
common theme.
To propose organizing a devroom, visit
https://fosdem.org/2013/call_for_devrooms.html
Note! Linux distributions should apply to the dedicated distribution
mini-conference:
https://fosdem.org/2013/distrominiconf.html
*Key Dates*
- 1 October: deadline for devroom proposals
- mid October: devroom announcements
- 1 November: deadline main track proposals
- mid November: main track announcements
- 2 and 3 February: FOSDEM 2013
> From: Tom Hutchison <tom(a)hutch4.us>
>
> ... this brings up a discussion about
> Extensions flagged as a security risk and why the extension's code is
> still available for download?
I've experienced the converse: an extension removed after someone flagged it as a security risk, merely because it COULD be used in an insecure fashion.
By that test, LocalPreference.php should be flagged as a security risk.
The end result is that an SQL access extension that I regularly use responsibly (editing limited to certain users, with page protection) is no longer receiving development support.
Isn't it better to have a known risk exposed, so that those who value the resource can fix it, than to ban it, leaving hapless prior users still vulnerable?
Flagging, good. Banning, bad.
----------------
:::: It is not possible to use enormous amounts of resources to address a resource shortage. -- Mike Ruppert
:::: Jan Steinman, EcoReality Co-op ::::
On Wed, 08 Aug 2012 02:37:39 -0700, Jens Albrecht <jens.alb(a)gmx.net> wrote:
> Hi,
>
>
> is there a way to show a page not by using the title-form
> "index.php/Main_Page" but instead by using the page id like
> "index.php?pageid=X" ?
>
> I hope you get what I mean! Sorry for my bad English.
>
>
> Regards
&curid= should work.
However, you should avoid it as much as possible: there are multiple
situations where the page id of a page can change.
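For example (the page id here is made up):
index.php?curid=12345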
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Hi,
I have a MediaWiki site handled by my IT department; I only have admin
authority to edit pages via the browser. Since the most frequently updated
parts of the site are structured data, such as publications, I'd like to
edit them locally and then paste the generated text into the edit area in
the browser, updating the wiki remotely that way. However, I have not found
a way or a tool to access the text-area edit function via URL. Is it even
possible to do that?
--
Regards,
On 08/08/12 12:30, Jens Albrecht wrote:
> Hi,
>
> thank you very much!
> Are there any other request parameters that can be passed to index.php?
> Or is there documentation of those parameters? I can't find this
> information in the wiki and help sites.
this page can be a starting point, but it says it's not complete:
https://www.mediawiki.org/wiki/Manual:Parameters_to_index.php
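A few commonly used ones, for illustration (titles and ids made up):
index.php?title=Main_Page&action=edit      (open the edit form)
index.php?title=Main_Page&action=history   (page history)
index.php?title=Main_Page&oldid=12345      (a specific old revision)
index.php?curid=12345                      (look a page up by its id)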
Alexis