30. August 2011 · TaoSecurity: TaoSecurity Security Effectiveness Model · Categories: blog

TaoSecurity: TaoSecurity Security Effectiveness Model.

I like Richard Bejtlich’s Security Effectiveness Model because it highlights the key notion that information security must start with (my words) an understanding of your organization’s adversaries’ motives and methods. Richard calls these “Threat Actions.” From there, you would develop a “Defensive Plan,” and implement “Live Defenses.”

This is represented as a Venn diagram made up of three circles: the more overlap you have, the more effective your information security program is.

Bejtlich calls this “threat-centric” security.
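
To make the overlap idea concrete, here is a toy sketch of my own (the set contents and the scoring are hypothetical, not Bejtlich's formalism): treat each circle as a set of threat actions and score effectiveness as the share of the adversary's actual actions that are both planned for and covered by a live defense.

```python
# Toy illustration of the "overlap" intuition (my interpretation, not
# Bejtlich's model itself). Each circle is modeled as a set of threat
# actions; effectiveness is the share of actual threat actions that
# appear in both the defensive plan and the live defenses.
threat_actions = {"phishing", "sql_injection", "credential_theft", "web_app_exploit"}
defensive_plan = {"phishing", "sql_injection", "malware", "web_app_exploit"}
live_defenses  = {"phishing", "web_app_exploit", "malware"}

covered = threat_actions & defensive_plan & live_defenses
effectiveness = len(covered) / len(threat_actions)

print("Covered threat actions:", sorted(covered))
print(f"Effectiveness: {effectiveness:.0%}")  # 2 of 4 threat actions -> 50%
```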

So the first question that needs to be addressed in making this approach operational is: how do you get the visibility needed to understand the Threat Actions?

I see this visibility coming from two sources:

  1. Third-party, generally available research. One such source would be SANS. In fact, SANS developed the SANS 20 Critical Security Controls specifically in response to its understanding of threat actions, and the latest version provides a list of “Attack Types” in Appendix C on page 72.
  2. Organizational assessment. At the organizational level, it seems to me you are faced with an evaluation problem: selecting controls that are good at finding Threat Actions. Based on my experience, there is agreement that the primary attack vector today is at the application level. If this is correct, then the organizational assessment would focus on (a) a black-box vulnerability assessment of the organization’s customer-facing web applications and (b) an assessment of the web applications (and related threats) the organization’s employees and contractors are using. (A minimal sketch of one black-box check follows this list.)
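
As the smallest possible illustration of the black-box piece of (a), here is a hedged sketch (the URL is a placeholder and the header list is only a sample; a real assessment would go far beyond response headers):

```python
# Minimal sketch of one black-box check: flag missing security response
# headers on a customer-facing web application. The URL is a placeholder
# and a real assessment would cover far more than headers.
import urllib.request

EXPECTED_HEADERS = [
    "Strict-Transport-Security",
    "X-Frame-Options",
    "Content-Security-Policy",
]

def missing_security_headers(url):
    """Return the expected security headers the response does not set."""
    with urllib.request.urlopen(url) as response:
        present = {name.lower() for name in response.headers.keys()}
    return [h for h in EXPECTED_HEADERS if h.lower() not in present]

if __name__ == "__main__":
    print(missing_security_headers("https://www.example.com/"))  # placeholder URL
```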

I am looking forward to Richard and others expanding on his ideas. Could be another book is coming. 🙂

30. August 2011 · Compliance Is Not Security – Busted! « PCI Guru · Categories: blog

Compliance Is Not Security – Busted! « PCI Guru.

The PCI Guru defends the PCI standard as a good framework for security in general, arguing against the refrain that compliance is not security.

My view is that the PCI Guru is missing the point. PCI DSS is a decent enough security framework, although personally I feel the SANS 20 Critical Security Controls are more comprehensive and include a maturity model to help organizations build a prioritized plan.

The issue is the approach management teams of organizations take to mitigate the risks of information technology. COSO has called this “Tone at the Top.”

A quote that rings true to me is, “In theory, there is no difference between theory and practice. But in practice there is.”

Applying that here, I would say that in theory there should be no difference between compliance and security. But in practice there often is, when an organization’s management team does not take an earnest approach to mitigating the risks of information technology. Rather, they adopt a “check-box” mentality, i.e., going for the absolute minimum on which the QSA will sign off. It is for this reason that many in our industry say that compliance does not equal security.

14. August 2011 · The Six Dumbest Ideas in Computer Security – Revisited · Categories: blog

Marcus Ranum’s The Six Dumbest Ideas in Computer Security, written in 2006, is still regularly cited as gospel. I think it should be revisited in the context of the 2011 threat landscape.

  1. Default Permit – I agree. This still ranks as number one. In this age of application-level threats, it’s more important than ever to implement “default deny” policies at the application level in order to reduce the organization’s attack surface. The objective must be “Unknown applications – Deny” (see the sketch after this list).
  2. Enumerating Badness – While I understand that in theory it’s far better to enumerate goodness and deny all other actions (see #1 above), in practice I have not yet seen a host or network intrusion prevention product built on this model that is reliable enough to replace traditional IPS technology using vulnerability-based signatures. If you know of such a product, I would surely be interested in learning about it.
  3. Penetrate and Patch – I’m not sure I would call this dumb. Practically speaking, I would say it’s necessary but not sufficient.
  4. Hacking is Cool – I agree, although this is no different than “analog hacking.” Movies about criminals are still being made because the public is fascinated by them.
  5. Educating Users – I strongly disagree here. The issue is that our methods for educating users have left out the idea of “incentives.” In other words, in most organizations, users’ inappropriate or merely thoughtless behavior does not cost them anything. Employee and contractor behavior will change if their compensation is affected by their actions. Of course, you have to have the right technical controls in place to ensure you can properly attribute observed network activity to people. Because these controls are relatively new, we are only at the beginning of using economic incentives to influence employee behavior. I wrote about Security Awareness Training and Incentives last week.
  6. Action is better than Inaction – Somewhat valid. The management team of each organization will have to decide for themselves whether they are “early adopters” or what I would call “fast followers.” Ranum uses the term “pause and thinkers,” clearly indicating his view. However, if there are no early adopters, there will be no innovation. And as has been shown regularly, there are only a small number of early adopters anyway.
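
On #1 (Default Permit) above, here is a minimal sketch of an “Unknown applications – Deny” policy (the application names and the policy structure are hypothetical and do not reflect any particular product’s syntax):

```python
# Minimal sketch of default deny at the application level: anything not
# explicitly allowed, including unknown applications, is denied.
ALLOWED_APPS = {"dns", "ssl", "corporate-email"}  # hypothetical allow list

def policy_decision(app_id: str) -> str:
    """Default deny: permit only explicitly allowed applications."""
    return "allow" if app_id in ALLOWED_APPS else "deny"

for app in ("corporate-email", "bittorrent", "unknown"):
    print(app, "->", policy_decision(app))
```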

Of the “Minor Dumbs,” I agree with all except the last one – “We can’t stop the occasional problem.” Ranum says “yes you can.” Not possible. You must assume you will suffer successful attacks. Good security means budget is allocated to Detection and Response in addition to Prevention.

07. August 2011 · Security Awareness Training and Incentives · Categories: blog

I had an interesting conversation last week about the importance of security awareness training. I know this is a controversial topic, with many in the industry believing that it’s a waste of time. Ben Tomhave makes a really important point about getting users to pay attention to security policies.

The problem is this: people are once again falling into that rut of blaming the users for making bad security decisions, all the while having created, sustained, and grown an enablement culture that drastically abstracts users from the impact of those decisions. Plainly put: if the users don’t feel the pain of their bad decisions, then they have no incentive to make a change. This is basic psychology.

It’s time to quit trying the same old stupid donkey tricks. What we’re doing has failed, and will continue to fail. The rules of this game mean we lose – every. single. time. We need to change those rules, and fast. Specifically, we need to:

  1. Include security responsibilities in all job descriptions.
  2. Tie security performance into employee performance reviews.
  3. Include disciplinary actions for all security incidents.

Tomhave calls this psychology. I relate it to the “economics of security” as described by Tyler Moore and Ross Anderson.