Trickling through my RSS feeds this morning was an article with quite the topic: “PHP Vulnerability May Halt Millions of Servers”.
In a nutshell: a modest-sized POST puts almost all PHP versions in the wild (save 5.3.9+) in danger of an extremely simple DoS.
The vulnerability exploits PHP’s internal hash table implementation (responsible for managing data structures like arrays) – more specifically: the technique used to ‘hash’ the key in a key=>value relationship (i.e. generate its slot in the hash table).
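For the curious: PHP hashes string keys with DJBX33A (“times 33”). A minimal Python sketch of the algorithm (the 5381 seed and 32-bit truncation match the classic function; “Ez”/“FY” is a well-known colliding pair, and collisions compose into longer colliding keys – which is exactly what an attacker sends, forcing every key into one bucket and making n insertions cost O(n²)):

```python
def djbx33a(key: str) -> int:
    """DJBX33A: hash = hash * 33 + char, seeded with 5381, kept to 32 bits."""
    h = 5381
    for ch in key:
        h = (h * 33 + ord(ch)) & 0xFFFFFFFF
    return h

# "Ez" and "FY" collide: 69*33 + 122 == 70*33 + 89 == 2399
print(djbx33a("Ez") == djbx33a("FY"))          # True
# Collisions compose: any same-length string of "Ez"/"FY" blocks collides too
print(djbx33a("EzEzFY") == djbx33a("FYFYEz"))  # True
```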
Here’s the informative part regarding PHP’s problem in the security advisory for this:
Apache has a built-in limit of 8K on request length (that is, the maximum size of the request line, including the URL) by default.
Can the damage from an 8K request (this is what bounds GET) really cause the mentioned DoS on reasonable hardware?
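Some back-of-envelope arithmetic suggests probably not. Assuming (hypothetically) a 4-character colliding key plus `=&` costs 6 bytes per parameter, and that a chained hash table compares each new key against every earlier entry in its bucket:

```python
def worst_case_comparisons(request_bytes: int, bytes_per_param: int = 6) -> int:
    # n colliding keys -> 0 + 1 + ... + (n-1) key comparisons on insertion
    n = request_bytes // bytes_per_param
    return n * (n - 1) // 2

print(worst_case_comparisons(8 * 1024))         # 8K GET: 930930 -- trivial
print(worst_case_comparisons(8 * 1024 * 1024))  # 8M POST: ~9.8e11 -- minutes of CPU
```

An 8K GET yields under a million comparisons – noise. Multi-megabyte POST bodies are where the quadratic cost actually bites, which is why POST limits matter here.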
Additionally – PHP has a limiter on POST data too: post_max_size.
It’s this configuration variable in particular I think should be put in the limelight.
post_max_size is a per-directory (PHP_INI_PERDIR) directive – settable in php.ini, a vhost, or .htaccess – that maybe we don’t respect like we should.
Often, administrators (myself included) just tell php.ini to accept a large POST size to allow form based file uploads – It’s not uncommon to see:
upload_max_filesize = 20M
post_max_size = 21M ; or multitudes more!
– in almost any respectable setup.
Perhaps we should evaluate the underlying effects of this setting; maybe it should be something stupidly low by default (enough to allow a large WYSIWYG CMS article’s HTML and a bit more? 32K?) – and then delegate a higher limit using Apache configuration.
Caveat: these settings are PER DIRECTORY, meaning:
- .htaccess use is limited: you can’t set a php_value in a .htaccess with a URL match – you’re stuck using a context-sensitive .htaccess (within a directory) or the <Directory> directive – this won’t work for people routing everything through a single front-controller file on their websites/apps.
- Modifying the actual vhost/host configuration is a sounder bet – you can do Location/Files matching and set these at will; for suited web apps, it may be feasible to take a whitelist or blacklist approach on uploader destinations.
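A hypothetical vhost sketch of that whitelist approach (paths and sizes are made up; php_value is honoured in Location blocks under mod_php):

```apache
# Stupidly-low default for the whole vhost
php_value post_max_size 32K

# Raise the limits only where uploads actually happen
<Location "/admin/upload">
    php_value upload_max_filesize 20M
    php_value post_max_size 21M
</Location>
```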
- Here’s the video that thoroughly covers the vulnerability – I’ve shortcut it to their recommended mitigation (outside of randomized hashing):
- A full blown rundown, including proof of concept (USE AT YOUR OWN RISK!)
- A string of hash collisions targeting DJBX33A for vuln testing (PS: Firefox seems to struggle with this in a GET format, Chrome doesn’t, odd!)
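A sketch of how such a collision string can be generated: every string assembled from the blocks “Ez” and “FY” shares one DJBX33A hash, so n blocks give 2^n distinct colliding keys (the djbx33a helper below re-implements PHP’s string hash purely for verification):

```python
from itertools import islice, product

def djbx33a(key: str) -> int:
    """Re-implementation of PHP's DJBX33A string hash, for checking only."""
    h = 5381
    for ch in key:
        h = (h * 33 + ord(ch)) & 0xFFFFFFFF
    return h

def colliding_query(blocks: int, count: int) -> str:
    """Build a GET/POST body of `count` distinct keys that all collide."""
    keys = ("".join(p) for p in product(("Ez", "FY"), repeat=blocks))
    return "&".join(f"{k}=" for k in islice(keys, count))

body = colliding_query(blocks=8, count=256)  # 256 distinct 16-char keys
hashes = {djbx33a(pair.split("=")[0]) for pair in body.split("&")}
print(len(hashes))  # 1 -- every key lands in the same bucket
```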