
Scanners, what are they good for?

August 24, 2010

The more I read, the less I can add. Zalewski covers all the technical reasons why web application security scanners suck. I’d go even further: in my experience, the most popular (and most expensive) scanners are pretty awful force multipliers even for experienced testers.

So, why on earth would you drop $30K and $5K/year on one of these things?

Years ago, I had the two “best” scanners in-house. My testers were free to use whichever they preferred. This worked out better than you’d expect — the two competing products would often catch each other in quality highs and lows. When Product A became unwieldy and more trouble than it was worth for an analyst’s time, Product B was usually in a quality upswing. At any given moment, one of the two would work.

I had a motive behind the scenes for keeping both. Together, these two products held around 85% of the market. A few of my clients used one or the other when they assessed my sites, and for a while, knowing ahead of time which false positives their tools would throw made it easy to address their “findings”. Some clients demanded the canned scan reports from Product A or B as a bureaucratic checkbox. In fact, they’d demand them even when handed client-facing documentation from a top-tier application testing company.

It is a testament to the marketing strength of these companies that a false-positive-ridden canned report would trump client-facing documentation from a well-regarded consultancy. I think, deep down, a significant slice of the information security workforce really does believe the hype that all of this can be automated and metric-ized.
