Why You Should Overload Website Errors

I often have to explain to people that they need to overload error messages on their websites, and I occasionally get weird looks. There are three scenarios that come up regularly where default error responses can be used to an attacker's advantage.

The first scenario is where an error occurs in the application logic and produces a 500 error (Internal Server Error). These errors often include stack traces, which provide incredibly useful information for crafting SQL injection attacks, among others. There's no reason a user should ever see these.
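As a rough illustration, here is a minimal sketch using Flask (any framework with a custom error hook works the same way); the handler name and messages are made up for the example:

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(500)
def handle_internal_error(exc):
    # Keep the stack trace for yourself: log it server-side...
    app.logger.exception("unhandled application error: %s", exc)
    # ...and hand the user a generic page with no trace, no framework
    # banner, and no hints about the underlying query or code path.
    return "Something went wrong on our end. Please try again later.", 500
```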

The second is 400 errors (Bad Request). If an attacker can send overly large cookies, the server will respond with a 400 Bad Request that echoes the cookie payload in the response body. That response can be read with an XMLHttpRequest, even if the HttpOnly flag is set on the cookie. This allows an attacker who can set cookies (via XSS, for instance) to break the protection that HttpOnly provides, so it's critical that these errors are trapped.
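Continuing the same Flask sketch, a custom 400 handler might look like the following; the key point is that the body is fixed and never echoes the offending request back (the web server sitting in front of the application may need an equivalent custom error page as well):

```python
@app.errorhandler(400)
def handle_bad_request(exc):
    # A fixed body: never reflect request headers or cookies into the
    # response, so a script reading this page via XMLHttpRequest learns
    # nothing about HttpOnly cookies.
    return "Bad request.", 400
```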

The third scenario is that the difference between a 404 (File Not Found) and a 403 (Forbidden) is noticeable to a scanner, which helps directory-enumeration tools figure out which directories exist and which don't; once a directory is known to exist, the files inside it can be brute forced. Ideally, directories with Directory Indexing turned off, as well as files that aren't there, should all respond with 200 OK to confuse scanners. Disabling Directory Indexing is important and useful, but so is making it impossible to tell which directories exist. Overloading a 404 (File Not Found) error is also useful in its own right: it lets you tell users gracefully that they can go back to another page, offer a search box to find what they need, or whatever makes sense for your users.
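Again continuing the sketch, one way to blur the 404/403 distinction is to answer both with the same friendly 200 page; the search form target below is hypothetical:

```python
FRIENDLY_PAGE = """
<h1>We couldn't find that page.</h1>
<p><a href="/">Go back to the home page</a> or search for what you need:</p>
<form action="/search"><input name="q"><button>Search</button></form>
"""

@app.errorhandler(404)
@app.errorhandler(403)
def handle_missing_or_forbidden(exc):
    # Same page, same 200 OK for both cases, so directory-enumeration
    # tools can't tell real directories from ones that don't exist.
    return FRIENDLY_PAGE, 200
```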

There may be many other use cases as well, but this should give you some ideas as to why it's important to trap errors. Don't worry that you will lose visibility into the problems your site may have. Just because the user can't see it doesn't mean you can't. You can disable this type of error trapping in QA, and review your logs to identify pages that are problematic. This allows you to fix your issues without introducing additional security issues.
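If you want raw errors in QA but overloaded ones everywhere else, one approach (the APP_ENV variable name is just an example) is to register the handlers conditionally instead of with decorators:

```python
import os

# Hypothetical toggle: QA sees real errors, everyone else sees the
# overloaded versions; either way the details still land in your logs.
if os.environ.get("APP_ENV", "production") != "qa":
    app.register_error_handler(500, handle_internal_error)
    app.register_error_handler(400, handle_bad_request)
    app.register_error_handler(404, handle_missing_or_forbidden)
    app.register_error_handler(403, handle_missing_or_forbidden)
```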


About Robert Hansen

Robert Hansen is the Vice President of WhiteHat Labs at WhiteHat Security. He's the former Chief Executive of SecTheory and Falling Rock Networks, which focused on building a hardened OS. Mr. Hansen began his career in banner click fraud detection at ValueClick. Mr. Hansen has worked for Cable & Wireless doing managed security services, and for eBay as a Sr. Global Product Manager of Trust and Safety. Mr. Hansen contributes to and sits on the board of several startup companies. Mr. Hansen co-authored "XSS Exploits," published by Syngress, and wrote the eBook "Detecting Malice." Robert is a member of WASC, APWG, IACSP, and ISSA, and has contributed to several OWASP projects, including originating the XSS Cheat Sheet. He is also a mentor at TechStars. His passion is breaking web technologies to make them better. Robert can be found on Twitter @RSnake.