Website Security Statistics Report (2010) - Industry Benchmarks

Thursday, September 23, 2010

Originally Posted by Jeremiah Grossman here:

Full Report and Slides are available on my slideshare account. Enjoy!


"How are we doing?" That's the question on the mind of many executives and security practitioners whether they have recently implemented an application security program, or already have a well-established plan in place. The executives within those organizations want to know if the resources they have invested in source code reviews, threat modeling, developer training, security tools, etc. are making a measurable difference in reducing the risk of website compromise, which is by no means guaranteed. They want to know if their online business is truly more secure or less secure than industry peers. If above average, they may praise their team’s efforts and promote their continued success. On the other hand, if the organization is a security laggard, this is cause for concern and action.

Every organization needs to know where it stands, especially against its adversaries. Verizon Business' 2010 Data Breach Investigations Report (DBIR), a study conducted in cooperation with the United States Secret Service, provides insight. The report analyzes 141 confirmed data breaches from 2009 which resulted in the compromise of 143 million records. To be clear, this data set is restricted to incidents of a "data" breach, which is different from those only resulting in financial loss. Either way, the data is overwhelming. The majority of breaches, and almost all of the data stolen in 2009 (95%), were perpetrated by remote organized criminal groups hacking "servers and applications." That is, hacking Web servers and Web applications — "websites" for short. The attack vector of choice was SQL Injection, a vulnerability that typically cannot simply be "patched" the way server software can, and one commonly used to install customized malware.
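To make the SQL Injection point concrete, the sketch below (a hypothetical login check against an in-memory SQLite table; the schema and function names are illustrative, not from the report) shows why the flaw lives in application code rather than in anything a vendor patch could fix, and how parameterized queries close it:

```python
import sqlite3

# In-memory demo database; hypothetical schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # Vulnerable: attacker-controlled input is spliced directly into the SQL string.
    query = f"SELECT * FROM users WHERE name = '{name}' AND password = '{password}'"
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # Fixed: placeholders keep the input as data, never as SQL syntax.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

# A classic injection payload rewrites the WHERE clause and bypasses the password check.
payload = "' OR '1'='1"
print(len(login_unsafe("alice", payload)))  # 1 row: authentication bypassed
print(len(login_safe("alice", payload)))    # 0 rows: payload treated as a literal string
```

Because the defect is in how each application assembles its queries, remediation means changing and redeploying code, which is why the report treats SQL Injection differently from patchable server vulnerabilities.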

As the Verizon DBIR describes, the majority of breach victims are targets of opportunity, as opposed to targets of choice. Directly in the crosshairs are the Financial Services, Hospitality, and Retail industries. Victimized organizations are selected because their security posture is weaker than others' and the data they possess can be converted into cash, namely payment card data and intellectual property. As such, organizations are strongly encouraged to determine whether they, too, are potential targets of opportunity: whether they operate in these industries, have a relatively weak or unknown security posture, and hold similarly attractive data. This is a key point, because perfect security may not be necessary to avoid becoming another Verizon DBIR statistical data point.

There are, of course, many published examples in Web security where the victim was a target of choice. Currently, Clickjacking attacks targeting social networks, more specifically Facebook, are rampant. In these attacks, visitors are tricked into posting unwanted messages to friends and installing malware. There has also been a rise in targeted Cross-Site Scripting attacks, including a notable incident in which passwords were compromised. A Content Spoofing attack was aimed at Wired to fabricate a Steve Jobs health scare. Sears suffered a similar embarrassment when a fake product listing appeared on the company's website. In an Insufficient Authorization incident involving Anthem Blue Cross Blue Shield, customers' personally identifiable information was exposed.
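Clickjacking works by rendering the victim site inside an invisible iframe and tricking the user into clicking it. The standard server-side defense is to refuse framing outright. The sketch below is a minimal WSGI handler (a hypothetical app of my own construction, not from the report) that sets the `X-Frame-Options` header, the defense available at the time, alongside the newer Content-Security-Policy `frame-ancestors` directive:

```python
# Minimal WSGI sketch: a page that refuses to be framed.
def app(environ, start_response):
    headers = [
        ("Content-Type", "text/html"),
        # Tells the browser never to render this page inside a frame,
        # defeating the invisible-overlay trick used in Clickjacking.
        ("X-Frame-Options", "DENY"),
        # Modern equivalent expressed via Content Security Policy.
        ("Content-Security-Policy", "frame-ancestors 'none'"),
    ]
    start_response("200 OK", headers)
    return [b"<html><body>Protected page</body></html>"]
```

Any WSGI server (e.g. `wsgiref.simple_server`) can serve this app; the point is simply that both headers must be emitted by the application, so the fix again lives in code, not infrastructure.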

The bottom line is that no matter how mature the software development lifecycle, there are always ways to break into and defraud computer systems. A goal of reducing the number of vulnerabilities to zero is an unrealistic, perhaps impossible pursuit, and, as we've learned, likely an unnecessary one. Instead, organizations should place greater emphasis on improving responsiveness when vulnerabilities are eventually identified. The risk management question then becomes, "How secure is secure enough?" If the organization is a target of opportunity, a goal of being at or above average among its peers may be good enough. If a target of choice, a considerably stronger posture is likely required.

Until now, no metrics have been published that organizations can use as a benchmark to compare themselves against their industry peers. These benchmarks may help answer the questions "How are we doing?" and "Are we secure enough?" WhiteHat Security's 10th Website Security Statistics Report presents a statistical picture of the vulnerability assessment results from over 2,000 websites across 350 organizations under WhiteHat Sentinel management. For the first time, we've broken down the numbers by industry and size of organization. The data provides a unique perspective on the state of website security that may begin answering some of these pressing questions.

Key Findings
• The average website had nearly 13 serious vulnerabilities.
• The Banking, Insurance, and Healthcare industries performed the best overall in average number of serious vulnerabilities, with 5, 6, and 8 respectively. The worst were the IT, Retail, and Education sectors, with averages of 24, 17, and 17.
• Large organizations (over 2,500 employees) had the highest average number of serious vulnerabilities at 13, followed by medium (150 - 2,500 employees) at 12, and small (up to 150 employees) at 11.
• Cross-Site Request Forgery moved up to 4th place as one of the most prevalent vulnerability classes. Also new on the list, in 10th place, is Brute Force affecting 10% of websites.
• The Banking industry has eliminated SQL Injection from its list of most prevalent issues, while all the other industries are still grappling with it. Similarly, Cross-Site Scripting within the Insurance industry is about half as likely to occur as in the other industries, where the likelihood is 36%.
• Industries with a greater average number of serious vulnerabilities tend to have worse remediation rates.
• Small organizations fix the highest percentage of their serious vulnerabilities (62%), followed by medium (58%) and large (54%).
• 64% of Banking and Telecommunications websites, the industries leading in remediation, have fixed more than 60% of their reported serious vulnerabilities. The laggards are Insurance and IT where only 26% and 33% respectively have fixed more than 60% of their outstanding serious issues.
• It does not appear organization size significantly impacts an Industry’s average number of serious vulnerabilities, the type or degree of specific vulnerability classes, or their time-to-fix metrics. However, remediation rate does seem to correlate. Typically the larger the organization the fewer vulnerabilities they resolve by percentage.
• With respect to the average number of serious vulnerabilities within large organizations, Social Networking, Banking, and Healthcare had the best marks with 4.38, 5.18, and 3.68 respectively. The three worst were IT, Retail, and Financial Services, with 29.55, 18.44, and 10.34.
• Among large organizations, the Banking, Financial Services, Healthcare and Education industries turned in the best time-to-fix metrics with 2 weeks, 3 weeks, 4 weeks, and 4 weeks respectively. The worst were Telecommunications, Insurance, Retail and Social Networking with 26 weeks, 10 weeks, 8 weeks, and 8 weeks.
• Telecommunications, Retail, and Healthcare industries had the three best remediation rates of large organizations with 67%, 60% and 58% respectively. The three worst were IT, Banking and Insurance with 32%, 35%, and 35%.

WhiteHat Security is a leading provider of website security services.
