Tales of an IT Nobody

devbox:~$ iptables -A OUTPUT -j DROP

Observations: Google’s new Terms of Service January 27, 2012

The new TOS and Privacy Policy documents from Google are a welcome change; condensing 60 individual policies into a single, global set makes them far easier to understand.

Observation 1:

We may review content to determine whether it is illegal or violates our policies, and we may remove or refuse to display content that we reasonably believe violates our policies or the law. But that does not necessarily mean that we review content, so please don’t assume that we do.

Using Our Services http://www.google.com/policies/terms/#toc-services

I get what they’re saying, but the wording seems a little humorous if you don’t home in on ‘necessarily’.

Observation 2:

We provide information to help copyright holders manage their intellectual property online.

Privacy and Copyright Protection http://www.google.com/policies/terms/#toc-protection

Odd – this leaves some open questions as to what information they provide… are they helping police things “SOPA style”?

Observation 3:

Some of our Services allow you to submit content. You retain ownership of any intellectual property rights that you hold in that content. In short, what belongs to you stays yours. 

 That’s neat… but wait, the next paragraph:

When you upload or otherwise submit content to our Services, you give Google (and those we work with) a worldwide license to use, host, store, reproduce, modify, create derivative works (such as those resulting from translations, adaptations or other changes we make so that your content works better with our Services), communicate, publish, publicly perform, publicly display and distribute such content.

Your Content in our Services http://www.google.com/policies/terms/#toc-content

Uh… ok…

I have no intention of infringing anyone’s copyright, so the first two don’t bother me, but the third one leaves me with some questions…

No Comments on Observations: Google’s new Terms of Service
Categories: google security

PHP Vulnerability – DJBX33A – Hash table collisions January 14, 2012

Trickling through my RSS feeds this morning was an article with quite the headline: “PHP Vulnerability May Halt Millions of Servers”.

In a nutshell: a modest-sized POST can knock over almost any PHP version in the wild (sans 5.3.9+) with an extremely simple DoS.

The vulnerability exploits PHP’s internal hash table implementation (responsible for managing data structures such as arrays) – more specifically, the technique used to ‘hash’ (generate a hash table index from) the key in a key=>value relationship.
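To make the collision mechanics concrete, here’s a quick PHP sketch of DJBX33A (the “times 33 and add” hash applied to string keys). The real implementation is C code inside the Zend engine, so treat this as an illustrative model, not the actual function:

<?php
// Illustrative model of DJBX33A; the mask keeps values 32-bit-ish (fine on 64-bit PHP).
function djbx33a($key) {
    $hash = 5381;
    for ($i = 0, $len = strlen($key); $i < $len; $i++) {
        $hash = ($hash * 33 + ord($key[$i])) & 0xFFFFFFFF;
    }
    return $hash;
}

var_dump(djbx33a('Ez') === djbx33a('FY')); // bool(true) - two different keys, one bucket

// Worse: any same-length concatenation of colliding pairs also collides
// ('EzEz', 'EzFY', 'FYEz', 'FYFY', ...), so an attacker can cheaply generate
// thousands of distinct keys that all land in the same bucket, turning the
// insertion of n keys into roughly n^2/2 string comparisons.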

Here’s the informative part regarding PHP’s problem in the security advisory for this:

Apache has a built-in limit of 8K on the request line (that is, the maximum length of the request URL) by default.
Can the damage from an 8K request (this limit is what constrains GET) really amount to the DoS attack being described, on reasonable hardware?
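One way to answer that on your own hardware is to benchmark the worst case directly – build an array from keys that all collide and compare it with ordinary keys. This is only a sketch (the key-generation trick relies on ‘Ez’ and ‘FY’ colliding, $n is arbitrary, and timings will vary wildly by machine and PHP build):

<?php
// Generate $n distinct keys that all collide under DJBX33A:
// 'Ez' and 'FY' hash identically, and so does every same-length
// concatenation of them, so the binary expansion of $i picks the blocks.
function colliding_keys($n) {
    $bits = (int) ceil(log(max($n, 2), 2));
    $keys = array();
    for ($i = 0; $i < $n; $i++) {
        $key = '';
        for ($b = 0; $b < $bits; $b++) {
            $key .= (($i >> $b) & 1) ? 'Ez' : 'FY';
        }
        $keys[] = $key;
    }
    return $keys;
}

// Time how long it takes to insert a list of keys into an array.
function time_inserts(array $keys) {
    $start = microtime(true);
    $a = array();
    foreach ($keys as $k) {
        $a[$k] = 1;
    }
    return microtime(true) - $start;
}

$n = 30000;
$ordinary = array();
for ($i = 0; $i < $n; $i++) {
    $ordinary[] = 'key' . $i;
}

printf("colliding keys: %.3f sec\n", time_inserts(colliding_keys($n)));
printf("ordinary keys:  %.3f sec\n", time_inserts($ordinary));
// Expect the colliding run to be orders of magnitude slower - the attack simply
// makes PHP's request parsing do this same work via GET/POST parameter names.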

Additionally, PHP has a limiter on POST data too: post_max_size.
It’s this configuration directive in particular that I think should be put in the limelight.

post_max_size is a per-directory (php.ini/.htaccess/vhost) configurable directive that maybe we don’t respect like we should.
Often, administrators (myself included) just tell php.ini to accept a large POST size to allow form-based file uploads – it’s not uncommon to see:
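(something along these lines – the exact values are illustrative:)

; php.ini - opened wide so form-based file uploads don't break
post_max_size = 64M
upload_max_filesize = 64M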

– in almost any respectable setup.

Perhaps we should evaluate the underlying effects of this setting; maybe it should be something stupidly low by default (enough to allow a large WYSIWYG CMS article’s HTML and a bit more? 32K?) – and then delegate a higher limit using Apache configuration.

Caveat: these settings are PER DIR meaning:

  • .htaccess use is limited – you can’t apply a php_value in .htaccess with a URL match; you’re stuck with a context-sensitive .htaccess (inside a specific directory) or a <Files>/<FilesMatch> block, which won’t work for people routing everything through a single front-controller file on their websites/apps.
  • Modifying the actual vhost/host configuration is a sound bet – you can do Location/Files matching and set these at will; for web apps you control, it may be feasible to take a whitelist or blacklist approach to uploader destinations (a rough vhost sketch follows this list).
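As a rough sketch of that whitelist idea (hostnames, paths and values here are made up, and the php_value lines assume mod_php – adapt to your own layout):

# Hypothetical vhost: keep the PHP POST limit stupidly low site-wide
# and only raise it for the one location that actually accepts uploads.
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example

    # conservative default, per the 32K idea above
    php_value post_max_size 32K

    <Location /admin/upload>
        # whitelisted uploader destination gets the generous limits
        php_value post_max_size 64M
        php_value upload_max_filesize 64M
    </Location>
</VirtualHost>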

More resources:

  • Here’s the video that thoroughly covers the vulnerability – I’ve shortcut it to their recommended mitigation (outside of polymorphic hashing):
  • A full blown rundown, including proof of concept (USE AT YOUR OWN RISK!)
  • A string of hash collisions targeting DJBX33A for vuln testing (PS: Firefox seems to struggle with this in a GET format, Chrome doesn’t, odd!)
2 Comments on PHP Vulnerability – DJBX33A – Hash table collisions
Categories: php security servers

Why I won’t (can’t) adopt Google Chrome yet… January 10, 2012

Privacy aside, simply put: in my role I do my fair share of design work, AJAX debugging, CSS, you name it – I need tools at my fingertips that quickly do more than just rip apart the DOM of a page. These are the deal-breaker extensions/capabilities that aren’t in Chrome:

Dealbreakers:

1. Web Developer Toolbar – Session toggle, disable/enable cache
Chrome has no way to turn the cache on/off at the click of a button. The closest thing I have found is to create a shortcut icon with a switch in the launch parameters. Another biggie for me is clearing a specific set of session cookies for a domain instead of all of them. The Chrome version of Web Developer Toolbar completely lacks these options.

2. Selenium IDE
Only Firefox has the Selenium IDE plugin. For those of us who perform automation or frequently check forms for SQL injection and the like, there are a few alternatives out there for Chrome, but none as extensive as Selenium (you can also reuse the IDE tests with Selenium RC).

3. S3Fox (or equiv.)

4. View image info, without a darn plugin… (’nuff said; even IE has it!)

Google Chrome IS the browser of the future; it’s still not quite there yet for me…

No Comments on Why I won’t (can’t) adopt Google Chrome yet…
Categories: google purdy tools