Vulnerability Scans Too Disruptive to Conduct Regularly

Thursday, July 12, 2012

Solutions provider Skybox Security has released the Vulnerability Management Survey for 2012, and the findings indicate that organizations generally regard regular vulnerability scanning as being too disruptive.

The survey, conducted in conjunction with Osterman Research, revealed a "major disconnect between the frequency and the breadth of vulnerability scanning actually conducted and the amount that the respondents felt was needed."

The respondents included more than one hundred information security leaders, managers, and systems engineers at companies ranging in size from 250 employees to more than 350,000.

The vast majority of the companies surveyed, over ninety percent, indicated they have vulnerability management programs, yet nearly half of the respondents believe their systems are “somewhat” to “extremely” vulnerable.

"Even more surprisingly, 49 percent of companies surveyed have experienced a cyber attack leading to a service outage, unauthorized access to information, data breach, or damage over the past six months," the survey revealed.

The frequency of vulnerability scans is an issue: critical systems are evaluated less than once per week, and nearly half of the companies perform network-wide scans only once per month or less.

"The coverage, or percent of hosts scanned, was also an issue: 27 percent of large organizations reported scanning less than half of hosts in the DMZ per cycle, while 60 percent of medium-sized companies scan less than half of the DMZ hosts. Yet, 49 percent of respondents said their organizations did not conduct vulnerability scanning as often or as in depth as they would like," the survey found.

Of the many reasons offered for the infrequency of the scans, more than half of the respondents indicated that the process is too disruptive to business functions, while one-third said some aspects of their networks were inaccessible.

"Fifty-seven percent of respondents reported that traditional active scanning often disrupts network services and vital business applications, 33 percent reported that parts of the network are not scannable, and 29 percent reported that they have difficulties gaining the system credentials required in order to conduct scans," the report notes.

Of particular concern is the finding that management considers the thorough vulnerability scanning required for effective risk management to be too complicated.

“Evidently, active vulnerability scanning can cause huge management headaches due to its disruptive nature and information overload, so scanners tend to be used primarily for ‘spot checks’ that aren’t effective at minimizing risks. Critical vulnerabilities have to be identified, prioritized, and remediated daily, across a significant portion of the infrastructure, in order to systematically shrink the risk window and prevent data breaches and attacks," said Gidi Cohen, CEO at Skybox Security. 
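Cohen's prescription of daily identification, prioritization, and remediation implies some ranking of findings. As an illustration only (the survey does not describe Skybox's actual scoring model, and the field names and weights below are assumptions), a minimal daily-batch prioritizer might look like:

```python
# Hypothetical sketch: select a daily remediation batch by ranking findings
# on severity and asset criticality. The scoring formula (CVSS score times
# an assumed 1-5 criticality weight) is an illustrative assumption, not
# Skybox's methodology.

def daily_batch(findings, batch_size=5):
    """Rank findings (dicts with numeric 'cvss' and 'asset_criticality'
    keys) and return the top batch_size for today's remediation queue."""
    ranked = sorted(
        findings,
        key=lambda f: f["cvss"] * f["asset_criticality"],
        reverse=True,
    )
    return ranked[:batch_size]

findings = [
    {"id": "CVE-A", "cvss": 9.8, "asset_criticality": 3},
    {"id": "CVE-B", "cvss": 7.5, "asset_criticality": 1},
    {"id": "CVE-C", "cvss": 5.0, "asset_criticality": 5},
]
print([f["id"] for f in daily_batch(findings, batch_size=2)])
# → ['CVE-A', 'CVE-C']
```

The point of a scheme like this is the one Cohen makes: a small, high-value batch processed every day shrinks the risk window steadily, where monthly "spot checks" leave it open.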

Key survey takeaways:

  • More than 90 percent of firms have a vulnerability management program and consider vulnerability management a priority
  • 49 percent of companies have experienced a cyber attack leading to a service outage, unauthorized access to information, data breach, or damage over the past six months
  • 40 percent of companies scan their DMZ monthly or less frequently
  • Internal networks and data centers get the top priority in terms of scanning frequency with 35 percent of organizations scanning these zones on a daily basis
  • Large organizations (more than 1,500 employees) tend to scan more frequently and with greater coverage of hosts compared to mid-size organizations (250-1,499 employees)
  • 73 percent of large organizations (more than 1,500 employees) scan at least 50 percent of hosts in their DMZ, while only 39 percent of mid-size organizations (250-1,499 employees) scan at least 50 percent of hosts in their DMZ
  • Both large and mid-size organizations cite “concerns about disruptions caused by active scanning” and “don’t have the resources to analyze more frequent scan data” as the top reasons for scanning less often than desired
  • Large organizations cite lack of patching resources and non-scannable hosts as a significantly greater issue than mid-size organizations

The full survey findings are available for download at: http://lp.skyboxsecurity.com/VMSurvey.html

Source:  http://www.skyboxsecurity.com/news%2526events/press%20releases/skybox-security-survey-reveals-traditional-vulnerability-scanners-not-wor

Jack Daniel Sure, if you do it wrong. Otherwise, some misinformed folks.
Ian Tibble Cases I've come across: there have been cases where Cisco switches/routers will reload under TCP port scanning (default options with nmap) - I did experience this a few times - still rarely seen though. The Ciscos were no particular model or IOS version, and were devices with plenty of memory to spare.
Also - ancient HP-UX 11.22 (!!) RPC services would die when subjected to nmap's -sV service scan, but this was a known issue, reported as a DoS bug - it's just that the three-year-old patch had not been applied (thereby making the scanning a raging success).
I recall the service provider with whom I was employed as an analyst had a list of "sensitive devices"...mostly exotic bits of kit, certain types of printers, etc.

Bad things can happen, albeit rarely. Unauthenticated scanning is by and large harmless (but you'll notice those tests with red markers and exclamation marks against them...they might be disruptive). Authenticated scanning with agents is deprecated, or at least should be. Authenticated scanning with dissolvable agents has a reputation for being disruptive...but lamentably very few orgs even bother with unauthenticated scanning of any kind - so I'm not sure where the idea comes from about de-stabilizing targets.

No - the unfortunate reality is that in many cases, when someone wants to do some vulnerability assessment, there will be reservations raised by network ops or some other type of ops (and these are not always invalid objections), app teams, and other BUs, and the objections won't be countered effectively. In fact, believe it or not, sometimes the objections won't even be addressed at all.

"The findings indicate that organizations generally regard regular vulnerability scanning as being too disruptive" - this is because many security departments are occupying valuable real estate only for compliance reasons.
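The low-impact, unauthenticated probing the comment above describes can be approximated without a full scanner. As a rough sketch only (this is not what nmap does internally, and the delay/timeout values are arbitrary assumptions), a deliberately slow TCP connect check looks like:

```python
import socket
import time

def gentle_scan(host, ports, delay=0.5, timeout=2.0):
    """Sequential TCP connect probe with a pause between attempts, to keep
    load on fragile targets low. Returns the ports that accepted a
    connection. A full connect() is used rather than half-open SYN probes,
    which some older devices handle poorly."""
    open_ports = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
        time.sleep(delay)
    return open_ports
```

With a real scanner, the equivalent posture is choosing conservative timing and excluding the known "sensitive devices" list, rather than running with aggressive defaults.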


Anthony M. Freed Seems this study is more indicative of management's continued lack of understanding where security is concerned than any real judgment on the effectiveness of penetration testing and vulnerability assessments. If a vulnerability is found and mitigated, that's a successful assessment. If a vulnerability that could have been mitigated is exploited and data is lost due to lack of assessments, that is a failure of corporate leadership.
The views expressed in this post are the opinions of the Infosec Island member that posted this content. Infosec Island is not responsible for the content or messaging of this post.

Unauthorized reproduction of this article (in part or in whole) is prohibited without the express written permission of Infosec Island and the Infosec Island member that posted this content--this includes using our RSS feed for any purpose other than personal use.