New Apache Module For Fending Off DoS Attacks
Network Dweebs Corporation writes "A new Apache DoS module, called mod_dosevasive (short for DoS evasive maneuvers), is now available for Apache 1.3. This new module gives Apache the ability to deny (403) web page retrieval from clients requesting more than one or two pages per second, and helps protect bandwidth and system resources in the event of a single-system or distributed request-based DoS attack. This freely distributable, open-source mod can be found at http://www.networkdweebs.com/stuff/security.html"
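For the curious, wiring a module like this into Apache 1.3 looks roughly as follows. The module and file names below are guesses based on the name mod_dosevasive, not taken from its documentation, so check the bundled README for the real ones:

# httpd.conf: load the module as a DSO (Apache 1.3 style).
# Names here are assumptions; see the module's README.
LoadModule dosevasive_module libexec/mod_dosevasive.so
AddModule mod_dosevasive.c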
Wget and pageview rate throttles (Score:1)
how's this going to affect my porn wgets?
From the GNU Wget help page:
-w seconds, --wait=seconds: Wait the specified number of seconds between the retrievals.
Thus, you can still wget as many images as you want. You'll just have to specify the -w option and (so you don't waste any online time) possibly read Slashdot while the image download proceeds.
Re:just one question... (Score:2)
Re:DSO? (Score:1)
How is this a troll? Clueless, maybe.
You wonder about moderation? You must be new here
Bandwidth still being used (Score:2, Insightful)
Re:Bandwidth still being used (Score:2, Insightful)
This is the same problem as with all filters that automagically cut off all requests from a given IP or netblock after spotting some abuse.
Think of a big LAN behind a masquerading firewall, or a caching proxy for a large organization: one person behind it can block access to the site for everyone else by tripping these automatic defenses.
The funny thing is that this broken-by-design approach has been known for years, its flaws have been known for years, and yet every once in a while we see another tool using the same scheme.
Robert
Re:Bandwidth still being used (Score:2, Insightful)
Think of a big LAN behind a masquerading firewall, or a caching proxy for a large organization: one person behind it can block access to the site for everyone else by tripping these automatic defenses.
Or think of an impostor sending requests with a forged source IP.
What? TCP sequence numbers? Impossible to impersonate a TCP session?
Think [bindview.com] again [coredump.cx].
Robert
Re:Bandwidth still being used (Score:2, Informative)
This tool wasn't designed as an end-all, be-all solution; it was designed as a starting point for cutting off extraneous requests (so you don't have a few thousand CGIs running on your server, or a few thousand page sends) and to provide a means of detection. You could easily take this code and have it talk to your firewalls to shut down the IP addresses that are being blacklisted. If you don't have decentralized content, or at the very least a distributed design, you're going to be DoS'd regardless, but this tool can at least make it take more power to do it.
Re:Bandwidth still being used (Score:1)
The hardware devices that you propose already exist. And they work, to some extent.
The problem is bigger than most would think. What differentiates an attack from legitimate access? How do you detect an attack and start to counter it? Do you have the bandwidth to withstand the attacking packets, even if they all go straight to the bit bucket?
And finally, how much money are you investing in DoS protection...
The Apache module has, as usual, a very interesting cost/effectiveness ratio... [even if there are other, more effective solutions to the DoS problem, they are also very expensive].
Cheers...
How clever is it? (Score:2, Insightful)
I wonder what you could do to make sure that this doesn't get in the way of normal browsing, but still catches DoS attacks. What sort of things does this module include to work intelligently? How tunable is it?
One thing that jumps to mind is that you could have some kind of ratio between images and HTML which has to be adhered to for any x-second period. This would hopefully mean that going to web pages with lots of images (which are all requested really quickly) wouldn't cause any problems. Also, more than one request can be made in a single HTTP session (I think; I don't really know anything about this), so I guess you could make use of that to assess whether the traffic fit the normal profile of a websurfer for that particular site.
Also, is there anything you can do to ensure that several people behind a NATing firewall all surfing to the same site don't trip the anti-DoS features?
Just thinking while I type really...
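On the NAT question, later public releases of this module (renamed mod_evasive) expose an address whitelist for exactly this case; whether the 1.x version described here already has the directive is an assumption, so a sketch:

# Exempt known proxy/NAT gateways from rate detection.
# DOSWhitelist is the mod_evasive spelling; wildcards are allowed.
DOSWhitelist 127.0.0.1
DOSWhitelist 192.168.1.*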
Re:How clever is it? (Score:4, Insightful)
Whilst not totally impossible
Re:How clever is it? (Score:1)
Setup: You are teaching classes to a lab full (let's say 30, for the sake of discussion) of kids in a school setting (gee, ya wonder where I work?). Let's say you instruct all your kids to go to some site with material for the astronomy class you teach. Let's assume that all the kids do as they are told and they all immediately type in the URL you gave them and request a page.
Let's assume your school district is behind a firewall that also uses a NAT/proxy setup, so all the requests come en masse from one "real" IP. Wouldn't this possibly be deemed a DoS attack by this plugin?
Re:How clever is it? (Score:1)
Re:How clever is it? (Score:2)
One thing that jumps to mind is that you could have some kind of ratio between images and HTML which has to be adhered to for any x-second period.
lynx users wouldn't be too impressed.
The "why" behind this.. (Score:5, Informative)
People suggested a JavaScript popup telling them the truth about what was going on, or an HTTP redirect to a very large file on the big guy's site, but Jonathan A. Zdziarski at the site linked above decided to write this patch as an ad-hoc solution.
I'd be very careful with this patch in production, as it is ad-hoc and not tested very much at all.
Re:The "why" behind this.. (Score:2, Interesting)
Re:The "why" behind this.. (Score:3, Insightful)
Or break out and redirect to a goatse-esque page or something similar... Since they're viewing his competitor's site, it would appear to be his content, right?
=tkk
Referer check revenge? (Score:1)
Re:The "why" behind this.. (Score:1)
Re:The "why" behind this.. (Score:1)
Re:The "why" behind this.. (Score:1)
Re:The "why" behind this.. (Score:1)
simple (Score:2, Interesting)
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://(.+\.)*bigguysite\.com/ [NC]
RewriteRule .* - [F]
I've also seen people who had bad domain names pointed at their IPs, where you can check the HTTP_HOST. I've seen recursive download programs totally crush web servers; mod_rewrite can check the HTTP_USER_AGENT for that. Of course, download programs can always change the specified user agent, which is, I guess, where this Apache module could come in handy. Good idea.
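A minimal sketch of that kind of User-Agent check ("Wget" is one real default agent string; which agents you refuse is your call):

RewriteEngine on
# Refuse known recursive downloaders outright.
# Trivially spoofable, as noted above.
RewriteCond %{HTTP_USER_AGENT} ^Wget [NC]
RewriteRule .* - [F]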
Re:simple (Score:1)
I also like to keep interesting multimedia files in a shared directory accessible from Apache running on my home computer, just so any of my friends can browse through and such. Eventually, I got listed on some warez search engines...
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://(.+\.)*warezsite\.com/ [NC]
RewriteRule .* - [F]
Teehee. I got removed pretty quickly.
In the case of the 1x1 frames on every page... I wonder what would happen if you redirected them back to the original page, which would have a frame that would redirect them back to the original page. I guess browsers probably protect against recursive frames.
You could at least redirect their browsers back to the most resource-intensive page or script on the big guy's site, at least doubling his resource use while barely using yours. Ah... sweet justice.
I like someone else's suggestion about frame-busting JavaScript; that'd be pretty interesting and would definitely get that frame removed right away. I sometimes wish my websites got these kinds of attacks; I'd have so much fun.
Re:simple (Score:1)
Sorta, though not deliberately; they are limited to something between 4-6 levels of nesting, I believe... Same with nested tables.
Too slow/too fast. (Score:3, Insightful)
I can easily request a couple of pages a second if I'm spawning off links to read in the background. On the other hand, wouldn't an automated attack be requesting much faster than 2 per second?
Re:Too slow/too fast. (Score:1)
Why would you spawn off links to the same page? Do you read the same content more than once? The key to the article is "the SAME page in the 2 second period".
Re:Too slow/too fast. (Score:2)
~GoRK
A possible problem? (Score:3, Interesting)
Re:A possible problem? (Score:1, Informative)
Re:A possible problem? (Score:2, Insightful)
Misunderstanding about Module (Score:5, Informative)
Just wanted to clear up a bit of misunderstanding about this module. First off, please forgive me for screwing up the story submission. What it *should* have said was "...This new module gives Apache the ability to deny (403) web page retrieval from clients requesting THE SAME FILES more than once or twice per second...". That's the way this tool works: if you request the same file more than once or twice per second, it adds you to a blacklist which prevents you from getting any web pages for 10 seconds; if you try to request more pages, it adds to that 10 seconds.
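In configuration terms, the thresholds described above would look something like this. The directive names follow the module's later public releases (as mod_evasive) and are an assumption for this 1.x version:

# Requesting the same URI more than twice within one second
# triggers the blacklist.
DOSPageCount 2
DOSPageInterval 1
# Blacklisted clients get 403s for 10 seconds, extended if they retry.
DOSBlockingPeriod 10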
Second, I'd like to address the idea that we designed this as the "ultimate solution to DoSes". This tool should help in the event of your average DoS attack; however, to be successful against heavy distributed attacks, you'll need an infrastructure capable of handling them. A web server can only handle so many 403s before it stops servicing valid requests (though it can handle far more 403s than full web page or script retrievals). It's our hope that anyone serious about circumventing a DoS attack will also have a distributed model and decentralized content, along with a network built for resisting DoS attacks.
This tool is not only useful for providing some initial frontline defense, but can (and should) also be adapted to talk directly to a company's border routers or firewalls so that the blacklisted IPs can be handled before any more requests reach the server; in other words, it's a great detection tool for web-based DoS attacks.
Anyhow, please enjoy the tool. I'd be very interested in hearing what kind of private adaptations people have made to get it talking to other equipment on the network.
Re:Misunderstanding about Module (Score:1, Interesting)
Re:Misunderstanding about Module (Score:3, Informative)
What about wget-style attacks? (Score:2)
What I'd like to see... (Score:1)
I currently use a scheme where I created the appropriate directories in my web document tree (/scripts, for example) and then set up 'deny from all' rules for them. This way, the Apache server doesn't even bother with a filesystem seek to tell that the file isn't there; it just denies the request.
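A minimal sketch of that rule (Apache 1.3 access-control syntax; /scripts is the poster's example path, and using <Location> keeps Apache from ever mapping the URL to the filesystem):

# Refuse worm-probe paths before any filesystem lookup.
<Location /scripts>
Order deny,allow
Deny from all
</Location>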
Dropping packets would be even better.
Re:Misunderstanding about Module (Score:1)
If your logo is at the top and the bottom of the page, that's two references within a second. But if the browser caches images, there will only be one request to the web server, so in practice that shouldn't be a problem... unless the browser revalidates the image (checks whether the file changed) for the second reference?
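If that revalidation is the worry, an explicit freshness lifetime stops well-behaved browsers from even asking again within the window; a sketch assuming mod_expires is compiled in (the type and lifetime are just examples):

# Clients may reuse the cached image for an hour without revalidating.
ExpiresActive On
ExpiresByType image/gif "access plus 1 hour"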
Border Router Blocking (Score:1)
If you're looking for an easy way to automate blocking at the border router, take a look at:
http://www.ipblocker.org [ipblocker.org]. With a simple command-line call to a Perl script, you can have the ACL on a Cisco router updated to deny traffic from the offending user.
But thats not the real problem right? (Score:1)
Re:But thats not the real problem right? (Score:2, Informative)
Re:But thats not the real problem right? (Score:2)
To stop a non-bandwidth bogus-request attack, you just turn on syncookies and that's that. This module is designed to stop a different kind of attack, wherein the clients are completing entire transactions too many times and thus consuming your bandwidth. There are other types of DOS attacks too -- reflection attacks (where you get a ton of ACK packets from all over the internet, using up all your bandwidth), for example, have to be stopped at the router level upstream, which prevents the server from completing any transactions as a client (over the internet; it can still get through over the LAN, of course).
Re:Terrible Idea... What about NAT? (Score:1)
Re:Terrible Idea... What about NAT? (Score:1)
Speaking of Security-related Apache Modules (Score:1, Interesting)
This is cool, but, mod_bandwidth already does it (Score:2)
--CTH
mod_slashdot? (Score:1)
DOS and Design of Websites (Score:2)
So many designers that I ran into in my travels still don't understand that when you put Flash animations (which I can't stand 99% of the time), large PNG files, or complex front pages on a site, especially on public pages, you increase your bandwidth costs.
It seems very simple to most, yet I am still surprised how many companies redesign their sites with gaudy graphics all over the place, and then find, ALL OF A SUDDEN after deployment, that their website goes down.
I can remember many customers I used to deal with, who had fixed contracts for hosting but maintained their own content, calling up and claiming our server was slow, or down, or experiencing technical difficulties.
I would usually say: "OH REALLY? I don't see any problems with the server per se. Did you happen to modify anything on the site lately?"
"Yes," they would reply: "We just put a Flash movie on the front page..."
Immediately I knew what the problem was: they had blown their bandwidth budget. At times I would see companies quadruple the size of their front pages, which cuts the number of users they can support at quality page download times to roughly a quarter, especially if they were already close to their bandwidth limit before the new pages.
The bigger the pages, the better the DoS, or rather, the easier the DoS is to perform.
In my design philosophy for my company's site, you can't get access to big pages without signing in first. If you sign in a zillion times at one or more pages, that obviously isn't normal behavior, and the software on my site is intelligent enough to figure that out and disable the login, after which you get pointed to a 2K error page.
In any case, if you are trying to protect your website and don't want to resort to highly technical and esoteric methods to minimize DoS attacks, you might want to start with the design of the website content.
The lighter the weight of the pages, the harder it is for an individual to amass enough machines to prevent legitimate users from using your site.
IMHO, Flash plugins, applets, and other such features should be available only to registered users, and logins should be strictly controlled.
Hack