30. December 2011 · Comments Off on XSS and Verizon DBIR; PCI DSS and anti-malware · Categories: blog

Alex’s post, Web Application Security – from the start: XSS and Verizon DBIR, suggests that because the Verizon 2010 DBIR, released in April 2011, shows that only 1% of breaches are a result of XSS, OWASP is putting too high a priority on XSS.

Here are my thoughts based on my review of the Verizon 2010 DBIR:

  1. Table 2 shows that of the 761 analyzed breaches, only 163 were from companies with 1,001 or more employees. Over 70% (522 of 761) were from companies with fewer than 101 employees or an unknown number of employees. It’s been my experience that there is a huge disparity in deployed security controls between small and large companies, which, it seems to me, might alter the conclusions you could draw from the report.
  2. Figure 33 shows that the number of records stolen in the report is only 3.9 million. In the previous five years, the numbers ranged from 104 million to 361 million. I find this odd; it may reflect the high number of small companies in the report. Also, the number of records lost may not be the best indicator of breach severity. If Coca-Cola lost only one record, but it was the Coke formula, the breach would be severe indeed.
  3. This report is heavily tied to Verizon’s PCI DSS practice. Table 15 shows that 96% of stolen records are payment card numbers/data. Yet we have seen very serious breaches where email addresses were the main data lost. See the Epsilon breach, where some estimate that 250 million email addresses were exposed.
  4. Another indicator of the heavy PCI DSS orientation is that a PCI DSS analysis is performed for each company examined. Table 16 shows the low percentage of these 761 companies that met basic PCI DSS security requirements. These percentages are not surprising given the large number of small companies in the report.

Of course, the conclusion Verizon draws is that PCI DSS compliance has significant value in reducing breaches.

However, there is something else in the report that is worth noting that might refute the value of limiting your security goals to complying with PCI DSS. Figure 15 shows that 49% of the breaches involved Malware, representing 79% of the records breached. Of the malware analyzed, 63% (Figure 21) was custom! Could one conclude then that traditional anti-virus controls are not sufficient?

So what does the PCI DSS standard have to say about this? Requirement 5 is all about anti-virus. In fact, the recommended testing procedures are simply to “verify that anti-virus software is deployed,” and “verify that automatic updates and periodic scans are enabled.” So, based on PCI DSS, one might conclude that as long as you have anti-virus deployed, you are safe from malware. However, since most of the malware that results in breaches is custom, and traditional anti-virus is not sufficient against it, one could conclude that PCI DSS compliance is not a sufficient goal for mitigating malware risk.
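The weakness of signature-based detection against custom malware can be illustrated with a minimal sketch (hypothetical; the sample strings and hash set are illustrative, not any vendor's actual detection logic). Signature scanning is essentially a lookup against fingerprints of known samples, so a one-off custom binary is invisible to it:

```python
import hashlib

# Signature database: hashes of malware samples the AV vendor has
# already seen and analyzed. (Illustrative values only.)
KNOWN_MALWARE_HASHES = {
    hashlib.sha256(b"mass-market trojan v1").hexdigest(),
    hashlib.sha256(b"mass-market trojan v2").hexdigest(),
}

def signature_scan(sample: bytes) -> bool:
    """Return True if the sample matches a known signature."""
    return hashlib.sha256(sample).hexdigest() in KNOWN_MALWARE_HASHES

# A previously catalogued sample is caught...
print(signature_scan(b"mass-market trojan v1"))   # True
# ...but custom malware written for one target has never been seen,
# so no signature exists and the scan passes it.
print(signature_scan(b"custom one-off implant"))  # False
```

Real AV engines use more than exact hashes (heuristics, emulation), but the core limitation the DBIR numbers expose is the same: detection keyed to known samples cannot flag code written specifically for one victim.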

I am not saying that PCI DSS does not have any value in risk reduction. But I am saying that in the all-important anti-malware area, PCI DSS is insufficient. Cymbel’s 12 Best Practices for mitigating the risks of modern malware is much more comprehensive and is aimed at larger organizations with more to protect than just credit card data.

29. December 2011 · Comments Off on Troy Hunt: 5 website security lessons courtesy of Stratfor · Categories: blog

Troy Hunt: 5 website security lessons courtesy of Stratfor.

This wasn’t intended to be a Stratfor-bashing post, rather it’s an opportunity to see the fate which awaits those who don’t take website security seriously. Call it a quick reality check if you will.

Insightful lessons to be learned from analyzing the Stratfor breach:

  1. There doesn’t need to be a reason for you to be hacked
  2. The financial abuse of your customers will extend long and far
  3. Your customers’ other online services will be compromised
  4. Saltless password hashes are a thin veneer of security
  5. Your dirty software laundry will be aired quickly
Regarding #3 above, Bellovin’s article about passwords is relevant.
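Regarding #4, the fix for saltless hashes is well understood; here is a minimal sketch using only Python's standard library (function names are mine, and the iteration count is illustrative). A unique random salt per user defeats precomputed lookup tables, and an iterated hash like PBKDF2 slows brute force:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Salted, iterated hash: the random per-user salt means two users
    with the same password store different digests, so one precomputed
    table cannot crack every account at once."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking match position via timing.
    return hmac.compare_digest(candidate, digest)

# A saltless hash, by contrast, always maps the same password to the
# same digest -- the "thin veneer" Hunt describes.
weak = hashlib.md5(b"letmein").hexdigest()

salt, stored = hash_password("letmein")
print(verify_password("letmein", salt, stored))  # True
print(verify_password("wrong", salt, stored))    # False
```

Note that hashing the same password twice yields different stored digests (different salts), which is exactly the property Stratfor's leaked hashes lacked.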

As I look over my experience in Information Security since 1999, I see three distinct eras with respect to the motivation driving technical control purchases:

  • Basic (mid-1990s to early 2000s) – Organizations implemented basic host-based and network-based technical security controls, i.e. anti-virus and firewalls respectively.
  • Compliance (early 2000s to mid-2000s) – Compliance regulations such as Sarbanes-Oxley and PCI drove major improvements in security.
  • Breach Prevention and Incident Detection & Response (BPIDR) (late 2000s to present) – Organizations realize that regulatory compliance represents a minimum level of security and is not sufficient to cope with the fast-changing methods used by cyber predators. Meeting compliance requirements will not effectively reduce the likelihood of a breach by more skilled and aggressive adversaries, nor detect their malicious activity.

I have three examples to support the shift from the Compliance era to the Breach Prevention and Incident Detection & Response (BPIDR) era. The first is the increasing popularity of Palo Alto Networks. No compliance regulation I am aware of makes the distinction between a traditional stateful inspection firewall and a Next Generation Firewall (NGFW) as defined by Gartner in their 2009 research report. Yet in the last four years, 6,000 companies have selected Palo Alto Networks because their NGFWs enable organizations to regain control of traffic at points in their networks where trust levels change or ought to change.

The second example is the evolution of Log Management/SIEM. One can safely say that the driving force for most Log/SIEM purchases in the early to mid-2000s was compliance. The fastest growing vendors of that period had the best compliance reporting capabilities. However, by the late 2000s, many organizations began to realize they needed better detection controls. We began to see a shift in the SIEM market to those solutions which not only provided the necessary compliance reports, but could also function satisfactorily as the primary detection control within limited budgets. Hence the ascendancy of Q1 Labs, which actually passed ArcSight in number of installations prior to being acquired by IBM.

The third example is email security. From a compliance perspective, Requirement 5 of PCI DSS, for example, is very comprehensive regarding anti-virus software. However, it is silent regarding phishing. The popularity of products from Proofpoint and FireEye shows that organizations have determined that blocking email-borne viruses is simply not adequate. Phishing, and particularly spear-phishing, must be addressed.

Rather than simply call the third era “Breach Prevention,” I chose to add “Incident Detection & Response” because preventing all system compromises that could lead to a breach is not possible. You must assume that Prevention controls will have failures. Therefore you must invest in Detection controls as well. Too often, I have seen budget imbalances in favor of Prevention controls.

The goal of a defense-in-depth architecture is to (1) prevent breaches by minimizing attack surfaces, controlling access to assets, and preventing threats and malicious behavior on allowed traffic, and (2) to detect malicious activity missed by prevention controls and detect compromised systems more quickly to minimize the risk of disclosure of confidential data.

18. December 2011 · Comments Off on Gartner December 2011 Firewall Magic Quadrant Comments · Categories: blog

Just days before Christmas, and 21 months after the previous edition, Gartner released their 2011 Enterprise Firewall Magic Quadrant. Via distribution from one of the firewall manufacturers, I received a copy today. Here are the key highlights:

  • Palo Alto Networks moved up from the Visionary to Leader quadrant
  • Juniper slid back from the Leader to the Challenger quadrant
  • Cisco remained in the Challenger quadrant
  • There are no manufacturers in the Visionary quadrant

In fact, there are only two manufacturers in the Leader quadrant – the aforementioned Palo Alto Networks and Check Point. And these two manufacturers are the only ones to the right of center!!

Given Gartner’s strong belief in the value of Next Generation Firewalls, one might conclude that both of these companies actually meet the NGFW criteria Gartner outlined in its 2009 research paper. Unfortunately, that is not the case today. Check Point’s latest generally available release simply does not meet Gartner’s NGFW requirements.

So the question is, why did Gartner include Check Point in the Leader quadrant? The only explanation I can think of is that Check Point’s next release meets Gartner’s NGFW criteria. Gartner alludes to Project Gaia, which is in beta at a few sites, but says only that it is a blending of Check Point’s three different operating systems. So let’s follow through on this thought experiment. First, it would mean that none of the other vendors will meet Gartner’s NGFW criteria in their next release. If any of them did, why wouldn’t they too be placed to the right of center?

Before I go on, let’s review what a NGFW is. Start with a basic definition of a firewall – a network security device that enables you to define a “Positive Control Model” for what traffic is allowed to pass between two network segments of different trust levels. By Positive Control Model I mean you define what is allowed and deny everything else. Another term for this is “default deny.”

Traditional stateful firewalls enable this Positive Control Model at the port and protocol levels. NGFWs do this also, but most importantly do it at the application level. In fact, an NGFW enables policies that combine port, protocol, and application (and more). Stateful inspection firewalls have no ability to control applications sharing open ports. Some have added application identification and blocking to their IPS modules, but this is a negative enforcement model. In other words, block what I tell you to block and allow everything else. Some have called this the “Whack-A-Mole” approach to application control.
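The difference between the two enforcement models can be sketched in a few lines of Python (hypothetical; the rule format and names are mine, not any vendor's actual policy syntax):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Session:
    port: int
    protocol: str
    application: str  # as identified by deep packet inspection

# Positive control model (NGFW, "default deny"): enumerate what is
# allowed, combining port, protocol, AND application; deny all else.
ALLOW_RULES = [
    {"port": 443, "protocol": "tcp", "application": "salesforce"},
    {"port": 53,  "protocol": "udp", "application": "dns"},
]

def ngfw_allows(s: Session) -> bool:
    return any(s.port == r["port"] and s.protocol == r["protocol"]
               and s.application == r["application"]
               for r in ALLOW_RULES)

# Negative enforcement model (IPS-style app blocking): enumerate what
# is blocked and allow everything else -- the "Whack-A-Mole" approach.
BLOCKED_APPS = {"bittorrent"}

def ips_allows(s: Session) -> bool:
    return s.application not in BLOCKED_APPS

# An unknown application tunneled over the open HTTPS port is denied
# by the positive model but sails through the negative one.
tunnel = Session(port=443, protocol="tcp", application="unknown-tunnel")
print(ngfw_allows(tunnel))  # False
print(ips_allows(tunnel))   # True
```

The tunnel example is the crux: under default deny, every new evasive application is blocked until explicitly permitted, whereas the negative model must be updated after each new "mole" appears.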

In order then to qualify as a NGFW, the core traffic analysis engine has to be built from the ground up to perform deep packet inspection and application detection at the beginning of the analysis/decision process to allow or deny the session. Since that was Palo Alto Networks’ vision when they were founded in 2005, that’s what they did. All the other firewall manufacturers have to start from scratch and build an entirely new platform.

So let’s pick up where I left off three paragraphs ago, i.e. the only traditional stateful inspection firewall manufacturer that might have a technically true NGFW coming in its next release is Check Point. Since Palo Alto Networks shipped its first NGFW in mid-2007, this would mean that Check Point is, at best, four and a half years, four major releases, and six thousand customers behind Palo Alto Networks.

On the other hand, if Check Point is in the Leader quadrant because it’s Palo Alto Networks’ toughest competitor, then Palo Alto Networks is in an even better position in the firewall market.