This may be a bit extreme, but I added this to my LocalSettings.php
$agent= " " . $_SERVER['HTTP_USER_AGENT']; if ( strpos($agent,"msnbot") ) { exit; }
Found the snippet via Googling.

=====================================
Jim Hu
On Jan 2, 2007, at 6:31 PM, howard chen wrote:
Hello,
Can you share something about these areas?
Thanks.
Hello Howard,
I am not sure that blocking msnbot from your wiki would accomplish much. You should block robots in /robots.txt, not in your LocalSettings.php file. If you want to protect against DDoS, you cannot do it at the wiki level; you will need to buy special equipment that detects it at a lower level. (Some links: http://www.cisco.com/en/US/netsol/ns615/networking_solutions_sub_solution.html and http://www.juniper.net/solutions/service_provider/network_security/ .) If you want to protect against malicious bots, you will have to use something such as captchas to separate the humans from the bots.
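For instance, a minimal /robots.txt that shuts msnbot out entirely might look like the sketch below (paths and the Crawl-delay value are illustrative, and whether a given crawler honours Crawl-delay varies):

# Block msnbot completely ...
User-agent: msnbot
Disallow: /

# ... or, less drastically, ask it to slow down instead
# User-agent: msnbot
# Crawl-delay: 30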
I hope that this helps, Kasimir Gabert
On 1/3/07, Kasimir Gabert kasimir.g@gmail.com wrote:
If you want to protect against DDoS, you cannot do it at the wiki level.
That's correct.
I want to know what kind of software/hardware/techniques are currently used by Wikipedia to defend against DDoS (not just robots).
We think this kind of experience would be very useful for other web sites, too.
Thanks.
On 03/01/07, howard chen howachen@gmail.com wrote:
I want to know what kind of software/hardware/techniques are currently used by Wikipedia to defend against DDoS (not just robots).
Well, what we do is position Mark in front of the data centre with a big stick...
Rob Church
Rob Church wrote:
Well, what we do is position Mark in front of the data centre with a big stick...
How big is the stick? And does it conform to current industry standards for sticks? You may need to upgrade to a brickbat or sledge hammer.
r
On 03/01/07, Ron Hall ron.hall@mcgill.ca wrote:
How big is the stick? And does it conform to current industry standards for sticks? You may need to upgrade to a brickbat or sledge hammer.
We upgraded the stick from a 1.5m pole to a 2.5m ash stake last year, to scale appropriately to the needs of our user base. I believe we also trialled a distributed architecture whereby we had Mark, Domas and Kyle, all with sticks. Tests showed that if they kept the sticks moving, the effective spin-up time of the stick was eliminated when responding to attacks; however, their arms got tired more quickly, leaving them open to later attacks.
Rob Church
On 03/01/07, Rob Church robchur@gmail.com wrote:
We upgraded the stick from a 1.5m pole to a 2.5m ash stake last year, to scale appropriately to the needs of our user base. I believe we also trialled a distributed architecture whereby we had Mark, Domas and Kyle, all with sticks. Tests showed that if they kept the sticks moving, the effective spin-up time of the stick was eliminated when responding to attacks; however, their arms got tired more quickly, leaving them open to later attacks.
Uncyclopedia has implemented a fully automatic banstick:
http://uncyclopedia.org/wiki/Image:Deletion_Award.gif
One of these is also VERY useful on Special:newpages patrol:
http://uncyclopedia.org/wiki/Template:Burninator
And a packet of this:
http://uncyclopedia.org/wiki/Image:Admin-pms.png
- d.
I don't see why one should buy special equipment when a code snippet works. I added the snippet while an msnbot indexing swarm was in progress. Apparently modifying robots.txt while an msnbot swarm is in progress doesn't shut them off, and it was msnbot that was effectively causing a DDoS (if unintended) on my server. The wiki it was indexing has a lot of large category pages, and the maximum number of processes in the MySQL installation was being overwhelmed. The snippet exits before any queries are launched, so after I added the blocker I got back access to the wiki and to several other sites using the same MySQL instance. So it worked for me, because the DDoS was at the level of MySQL limits, not at the level of bandwidth. YMMV.
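For anyone who wants to copy the snippet, a slightly tidier variant (a sketch only, untested; the isset() check, the case-insensitive match and the 403 header are my own additions, not part of the snippet I found) would be:

# Refuse msnbot before MediaWiki does any database work.
# stripos() matches the agent string case-insensitively, and the explicit
# !== false comparison avoids needing the leading-space trick.
if ( isset( $_SERVER['HTTP_USER_AGENT'] )
    && stripos( $_SERVER['HTTP_USER_AGENT'], 'msnbot' ) !== false
) {
    header( 'HTTP/1.1 403 Forbidden' );
    exit;
}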
Even if you didn't want to do this at the wiki level, I don't think special equipment is needed. If you know where the DDoS is coming from, you can block at the Apache level or the firewall level, or use throttling scripts. Captchas are for blocking spambots; they are not effective against DDoS.
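To give a rough idea of the throttling-script approach (no more than a sketch: the window, limit and /tmp path are made up, and it never cleans up old counter files), something like this near the top of LocalSettings.php would turn away an IP that hammers the wiki, again before any database work happens:

# Crude fixed-window, per-IP request counter using files in /tmp.
$ip     = isset( $_SERVER['REMOTE_ADDR'] ) ? $_SERVER['REMOTE_ADDR'] : 'unknown';
$window = 60;    # seconds per counting window
$limit  = 120;   # max requests per IP per window
$bucket = '/tmp/throttle-' . md5( $ip ) . '-' . (int)( time() / $window );

$hits = is_file( $bucket ) ? (int)file_get_contents( $bucket ) : 0;
$hits++;
file_put_contents( $bucket, $hits );

if ( $hits > $limit ) {
    header( 'HTTP/1.1 503 Service Unavailable' );
    header( 'Retry-After: ' . $window );
    exit;
}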
JH
=====================================
Jim Hu
Jim Hu wrote:
Even if you didn't want to do this at the wiki level, I don't think special equipment is needed. If you know where the DDoS is coming from, you can block at the Apache level or the firewall level, or use throttling scripts.
Clearly you have no experience with real DDoS attacks, then. If a web page falls in the forest, and no one is around to hear it, does it make a sound?
(For the zen-challenged: functioning servers aren't very useful when no one can reach them.)