Weblog

29 December 2011, 21:58

As you probably already have heard, multiple languages appear to be vulnerable to hash collision DoS attacks. If you don't know what this is about, I'll explain it here. When a web browser sends variables (GET or POST) to a website, the web application places those variables in an indexed array. For example, with the URL http://www.website.com/index.php?test=123, PHP places the value '123' in the $_GET array with 'test' as the index key. For quick lookups, most languages use a hash algorithm, most commonly 'Daniel J. Bernstein, Times 33 with Addition' (DJBX33A).

In case of a collision (multiple keys give the same hash value), the hash alone no longer identifies the right entry. To look up a value in the array in such a case, a string compare has to be done for each colliding entry. Of course, this requires more CPU power than a direct hash lookup. So, if a hacker sends an HTTP request which deliberately uses several thousand keys that all give the same hash, the CPU of the webserver will be kept quite busy looking up variables. POST requests are more interesting for this attack, because a request body can contain more data than the URL.
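
To give an idea of how such collisions are found, here is a minimal sketch in Python of the DJBX33A hash (PHP's array hashing follows the same scheme, though it operates on larger integers). The strings 'Ez' and 'FY' hash to the same value, and because of the multiplicative structure, concatenations of colliding blocks collide as well, which is how arbitrarily many colliding keys can be generated:

def djbx33a(key):
    # Daniel J. Bernstein, Times 33 with Addition
    h = 5381
    for c in key:
        h = h * 33 + ord(c)
    return h

print(djbx33a("Ez"))   # 5862308
print(djbx33a("FY"))   # 5862308: a collision

# Concatenating colliding blocks yields longer colliding keys:
print(djbx33a("EzEz") == djbx33a("EzFY") == djbx33a("FYFY"))  # True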

Most vendors are already working on patches for this problem. But what to do in the meantime, or when you can't upgrade to the latest version of your web language? Well, the Hiawatha webserver wouldn't be the Hiawatha webserver if it didn't have a mechanism to protect you against such a DoS attack.

You can of course limit the request size, but it's more effective to limit the number of variables in the request. This can be done via the UrlToolkit and the DenyBody option. For example, to limit the number of variables to 10, use the following configuration:

UrlToolkit {
    ToolkitID = limit_params
    Match (&.*){10,} DenyAccess
}

VirtualHost {
    ...
    UseToolkit = limit_params
    DenyBody = (&.*){10,}
}

The value 10 is just an example. To see what value is right for your website, find the maximum number of input elements in the forms of your website (don't forget the hidden inputs) and the maximum number of URL parameters. Only use this if you think your website might be a possible target for hackers. Otherwise, it's just a waste of CPU power. Regular expressions don't come cheap...
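
If you want to check what the pattern does before deploying it, the following sketch tests it in Python. The re dialect is not identical to the regular expressions Hiawatha uses, but it behaves the same for this pattern: each repetition of (&.*) has to consume at least one '&', so a match requires ten or more '&' characters, i.e. eleven or more variables.

import re

pattern = re.compile(r"(&.*){10,}")

ok  = "&".join("v%d=%d" % (i, i) for i in range(10))  # 10 variables, 9 '&'
bad = "&".join("v%d=%d" % (i, i) for i in range(11))  # 11 variables, 10 '&'

print(pattern.search(ok))   # None: request is allowed
print(pattern.search(bad))  # a match: request is denied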

Palatinux
29 December 2011, 23:59
In addition to Hugo's solution, you could also upgrade to PHP 5.4 RC4 and protect yourself without wasting system resources.

This latest PHP version has a fix for these attacks. After installing PHP, be sure to set the following in php.ini:

max_input_time = 30
max_input_vars = 50

http://www.php.net/manual/en/info.configuration.php#ini.max-input-time
http://www.tcphp.org/aggregator/categories/4


You can even limit the resources of the PHP (FastCGI) processes by using the ulimit command:

http://linux.about.com/library/cmd/blcmdl1_ulimit.htm
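
The ulimit command is essentially a shell front-end for the setrlimit() system call. As a sketch of the same idea in Python, assuming you start your FastCGI workers from a small wrapper of your own (the php-cgi command line below is just an example, adjust it to your setup), you can set the limits programmatically:

import resource
import subprocess

def limit_resources():
    # Runs in the child process before exec: cap CPU time at 30
    # seconds and the address space at 256 MB (example values).
    resource.setrlimit(resource.RLIMIT_CPU, (30, 30))
    resource.setrlimit(resource.RLIMIT_AS, (256 * 1024 * 1024, 256 * 1024 * 1024))

subprocess.Popen(["php-cgi", "-b", "127.0.0.1:9000"],
                 preexec_fn=limit_resources)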
Rene
30 December 2011, 08:09
Hello,

thanks for the hint.

For PHP, post_max_size together with max_input_time can help, or suhosin.request.max_vars set to < 20.

Is there any information about successful #hashdos attacks?
Palatinux
31 December 2011, 04:11
Hi Rene,

Nowadays there is little need for the Suhosin patch, as the important changes are already in PHP itself. So you'd better upgrade to the latest PHP version. I've been using unstable versions for years in a row and never had any trouble with them.

Another option (yes, another one) is to install Suricata or Snort. With psmon you can monitor your PHP processes for (overloading) CPU usage.


I also wrote a small bash script which can send you an email when the CPU usage is too high for a longer period. Interested?
Xeross
6 January 2012, 11:39
@Palatinux: I can imagine that Suhosin might be an easier option if you're using official distro packages; a lot of distros ship Suhosin, but not PHP 5.4.0.

...

Also, as an additional detail about the attack, if I understand correctly: as soon as collisions occur, the colliding hash will have a list of items rather than a single item, and that entire list is traversed for a lookup? So only a large number of collisions on a single hash would be effective?
Hugo Leisink
6 January 2012, 13:00
All variables are stored in an array, no matter what.

When no collision has occurred, a hash for each variable is calculated. When a variable is needed, the hash is used to look up the value of the variable. The hash is an integer which points directly to the right position in the variable array.

When a collision has occurred, hashes alone can't be used. So, when a variable is needed, a string compare is done for each entry that shares that hash. This of course requires a lot more CPU power.

When an attacker makes the variables in the POST request collide, a string compare against every previously added colliding variable must be done for each new one. The CPU can thereby be kept busy when the attacker adds a lot of colliding variables to the request.
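
To illustrate, here is a minimal sketch in Python of a textbook chained hash table (the details differ per language, but the principle is the same). With a hash function that sends every key to the same bucket, insert number n has to compare against all n-1 earlier keys, so the total work grows quadratically:

def insert(buckets, key, value, hash_func):
    # Chained hash table: colliding keys share one bucket, and each
    # insert does a string compare per entry already in that bucket.
    bucket = buckets.setdefault(hash_func(key), [])
    for i, (k, _) in enumerate(bucket):
        if k == key:
            bucket[i] = (key, value)
            return
    bucket.append((key, value))

buckets = {}
for i in range(10000):
    # A constant 'hash' makes every key collide: inserting these
    # 10,000 keys costs roughly 50 million string compares.
    insert(buckets, "key%d" % i, i, lambda k: 0)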
SC
26 December 2013, 23:34
It's been a while, but is there a vulnerability identifier available?