30. August 2011 · Comments Off on TaoSecurity: TaoSecurity Security Effectiveness Model · Categories: blog

TaoSecurity: TaoSecurity Security Effectiveness Model.

I like Richard Bejtlich’s Security Effectiveness Model because it highlights the key notion that information security must start with (my words) an understanding of your organization’s adversaries’ motives and methods. Richard calls these “Threat Actions.” From there, you would develop a “Defensive Plan,” and implement “Live Defenses.”

This is represented as a Venn diagram made up of three circles. The more overlap you have, the more effective your information security program is. Here is the diagram:

Bejtlich calls this "threat-centric" security.
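One way to make the three-circle model concrete is to treat each circle as a set and the overlap as a set intersection. This is my own minimal sketch, not Bejtlich's formulation, and the threat-action names below are purely illustrative:

```python
# Hypothetical sketch of the three-circle model as set intersections.
# The category names are illustrative, not Bejtlich's taxonomy.
threat_actions = {"phishing", "sql_injection", "credential_theft", "ddos"}
defensive_plan = {"phishing", "sql_injection", "ddos", "insider_misuse"}
live_defenses  = {"phishing", "sql_injection", "insider_misuse"}

# Effective coverage: threat actions that are both planned for and
# actually defended against in production.
covered = threat_actions & defensive_plan & live_defenses
effectiveness = len(covered) / len(threat_actions)

print(sorted(covered))         # ['phishing', 'sql_injection']
print(f"{effectiveness:.0%}")  # 50%
```

The point of the exercise is that planned-but-not-deployed defenses (ddos here) and deployed-but-unneeded defenses (insider_misuse) both fall outside the effective overlap.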

So the first question that needs to be addressed in making this approach operational is, how do you get the needed visibility to understand the Threat Actions?

I see this visibility coming from two sources:

  1. Third-party, generally available research. One such source is SANS, which developed the SANS 20 Critical Security Controls specifically in response to its understanding of threat actions. The latest version provides a list of "Attack Types" in Appendix C on page 72.
  2. Organizational assessment. At the organizational level, it seems to me you are faced with an evaluation problem: selecting controls that are good at finding Threat Actions. Based on my experience, there is broad agreement that the primary attack vector today is at the application level. If this is correct, then the organizational assessment would focus on (a) a black-box vulnerability assessment of the organization's customer-facing web applications and (b) an assessment of the web applications (and related threats) the organization's employees and contractors are using.
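To give a flavor of what the black-box side of such an assessment can look like at its simplest, here is an illustrative sketch that checks a web application's responses for common security headers. The list of expected headers and the helper names are my own assumptions, and a real assessment would of course go far beyond header inspection:

```python
# Illustrative fragment of a black-box check: inspect a customer-facing
# web application's response headers for common security headers.
# EXPECTED and the function names are assumptions for this sketch.
from urllib.request import urlopen

EXPECTED = ["Content-Security-Policy", "X-Frame-Options",
            "Strict-Transport-Security"]

def missing_headers(present):
    """Return the expected security headers absent from `present`."""
    have = {h.title() for h in present}
    return [h for h in EXPECTED if h.title() not in have]

def check_site(url):
    # Black-box probe: fetch the page and inspect its response headers.
    with urlopen(url) as resp:
        return missing_headers(resp.headers.keys())

print(missing_headers(["X-Frame-Options"]))
# ['Content-Security-Policy', 'Strict-Transport-Security']
```

A check like `check_site("https://example.com")` needs no knowledge of the application's internals, which is exactly the black-box property the assessment relies on.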

I am looking forward to Richard and others expanding on his ideas. Could be another book is coming. 🙂


30. October 2010 · Comments Off on TaoSecurity: What Do You Investigate First? · Categories: blog

TaoSecurity: What Do You Investigate First?.

Richard Bejtlich offers an obvious, but usually difficult-to-implement, answer to the following question:

Let’s say for example, there is a cesspool of internal suspicious activity from netflow, log and host data. You have a limited number of resources who must have some criteria they use to grab the worst stuff first. What criteria would you use to prioritize your investigation activities?

Bejtlich offers two answers, which generally converge into one: focus on the most critical assets in your organization.

Ideally, the log, flow, and event collection and analysis system you are using can discover all network-attached assets and enable you to group them into IT/Business Services. Then you can prioritize your focus based on the criticality of each IT/Business Service.
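The prioritization step reduces to a simple sort once assets are mapped to services. This is a minimal sketch of that idea; the asset-to-service mapping, criticality scores, and event shapes are all invented for illustration:

```python
# Hypothetical sketch: rank suspicious events by the criticality of the
# IT/Business Service that owns the affected asset. The mapping and
# scores below are illustrative assumptions.
asset_service = {
    "10.0.0.5": "payments",
    "10.0.0.9": "hr-portal",
    "10.0.1.2": "dev-lab",
}
service_criticality = {"payments": 3, "hr-portal": 2, "dev-lab": 1}

events = [
    {"host": "10.0.1.2", "alert": "port scan"},
    {"host": "10.0.0.5", "alert": "beaconing"},
    {"host": "10.0.0.9", "alert": "odd login"},
]

def priority(event):
    # Unknown assets fall to the bottom of the queue (score 0).
    service = asset_service.get(event["host"], "unknown")
    return service_criticality.get(service, 0)

# Investigate the highest-criticality services first.
queue = sorted(events, key=priority, reverse=True)
print([e["host"] for e in queue])  # ['10.0.0.5', '10.0.0.9', '10.0.1.2']
```

The hard part in practice is not the sort but keeping the asset-to-service mapping accurate, which is why the discovery capability matters.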

01. October 2010 · Comments Off on The Big Picture of the Security Incident Cycle · Categories: Security-Compliance

The Big Picture of the Security Incident Cycle.

Via Lenny Zeltser: Richard Bejtlich, a well-known Computer Incident Response Team (CIRT) practitioner, has an interesting view of IT Security, pictured here:

What are normally considered the major functions of IT Security are simply the first two phases of Bejtlich's Incident Response cycle: Plan and Resist.

Note the use of the word "Resist" rather than "Prevent," which forces the recognition that incidents will happen. In other words, if you are not detecting incidents, it's because you don't have the right tools in place.

The whole post is well worth reading. It also links to Bejtlich's complete presentation.

04. December 2009 · Comments Off on Two views on Cyber War: Rand/Libicki vs. Bejtlich · Categories: Cyberwar

Martin Libicki recently published a book, Cyberdeterrence and Cyberwar (also available here as a free PDF), which, in his words, "presents the results of a fiscal year 2008 study [performed by the Rand Corporation and funded by the US Air Force], 'Defining and Implementing Cyber Command and Cyber Warfare.' It discusses the use and limits of power in cyberspace, which has been likened to a medium of potential conflict, much as air and space domains are."

Libicki's key conclusion is that the Air Force should not invest heavily in cyber warfare because (1) it is difficult to be sure of the source of a cyber attack and (2) the losses due to cyber attacks are not severe enough to warrant strong offensive capabilities. In his own words:

"It is thus hard to argue that the ability to wage strategic cyberwar should be a priority area for U.S. investment and, by extension, for U.S. Air Force investment. It is not even clear whether there should be an intelligence effort of the intensity required to enable strategic cyberwar."

I am uncomfortable with Libicki's conclusions, in part because he makes assertions that cause me to question his understanding of cyberspace in general and cyberattacks in particular. Note this paragraph on page 143:

Cyberattacks are about deception, and the essence of deception is the difference between what you expect and what you get: surprise. This is why operational cyberwar is tailor-made for surprise attack and a poor choice for repeated attacks: It is difficult to surprise the same sysadmin twice in the same way.

In other words, according to Libicki, since the United States does not believe in surprise attacks and cyber war is oriented toward surprise attacks, cyber war is not appropriate for the US. But just because we do not launch surprise attacks like Pearl Harbor does not mean that our military does not attempt to use surprise and misdirection in its attacks.

Also, there is no reason to assume that an attacker would have only one method of attack or that it couldn't be used repeatedly. First, the breadth of attack vectors is huge and has the same sort of asymmetry as terrorist attacks. I think by now it's accepted wisdom that anti-terrorism must be proactive and have major offensive components. (The current anti-terrorism debate is more about degree and tactics.) By analogy, the asymmetric nature of cyber attacks due to the same difficulty of defending every inch of the attack surface leads one to conclude that offensive capabilities are needed.

Second, as to repeating the same attack over and over, one only needs to look at the history of the Conficker worm, which is now over a year old and still infecting systems.

Richard Bejtlich, an IT Security practitioner at a Fortune 5 company and a former member of the Air Force CERT, wrote a much more comprehensive review of Libicki's book on Amazon and on his blog. Bejtlich argues that Libicki's analysis contains five key flaws, which I have quoted as directly as possible from his review:

  1. Libicki is wrong when he says, "cyberattacks are possible only because systems have flaws."
  2. Libicki's fatal misunderstanding of digital vulnerability is compounded by his ignorance of the role of vendors and service providers.
  3. The "blame the victim" mentality is compounded by the completely misguided notions that defense is easy and recovery from intrusion is simple.
  4. Libicki makes no distinction between "core" and "peripheral" systems, with the former controlled by users and the later [sic] by sys admins.
  5. In addition to not understanding defense, Libicki doesn't understand offense.

Here are the final two paragraphs of Bejtlich's review:

Furthermore, by avoiding offense, Libicki makes a critical mistake: if cyberwar has only a "niche role," how is a state supposed to protect itself from cyberwar? In Libicki's world, defense is cheap and easy. In the real world, the best defense is 1) informed by offense, and 2) coordinated with offensive actions to target and disrupt adversary offensive activity. Libicki also focuses far too much on cyberwar in isolation, while real-world cyberwar has historically accompanied kinetic actions.

Of course, like any good consultant, Libicki leaves himself an out on p. 177 by stating "cyberweapons come relatively cheap. Because a devastating cyberattack may facilitate or amplify physical operations and because an operational cyberwar capability is relatively inexpensive (especially if the Air Force can leverage investments in CNE), an offensive cyberwar capability is worth developing." The danger of this misguided tract is that policy makers will be swayed by Libicki's misinformed assumptions, arguments, and conclusions, and believe that defense alone is a sufficient focus for 21st century digital security. In reality, a kinetically weaker opponent can leverage a cyber attack to weaken a kinetically superior yet net-centric adversary. History shows, in all theatres, that defense does not win wars, and that the best defense is a good offense.

My final comment on the book is that its analysis is too static, given the constantly evolving technologies, government and business uses of cyberspace, threats, and economics. It seems to ignore the importance and impact of research and the resulting game-changing breakthroughs that could affect the feasibility, strategy, and tactics of cyber warfare and cyber deterrence.