Risk Management and Critical Infrastructure Protection: Assessing, Integrating, and Managing Threats, Vulnerabilities and Consequences

July 17, 2007 (RL32561)

Summary

The Homeland Security Act of 2002 (P.L. 107-296) and other Administration documents have assigned the Department of Homeland Security specific duties associated with coordinating the nation's efforts to protect its critical infrastructure, including using a risk management approach to set priorities. Many of these duties have been delegated to what is now called the National Protection and Programs Directorate.

Risk assessment involves the integration of threat, vulnerability, and consequence information. Risk management involves deciding which risk reduction measures to take based on an agreed upon risk reduction strategy. Many models/methodologies have been developed by which threats, vulnerabilities, and consequences are integrated to determine risks and then used to inform the allocation of resources to reduce those risks. For the most part, these methodologies consist of the following elements, performed, more or less, in the following order.

  • identify assets and identify which are most critical
  • identify, characterize, and assess threats
  • assess the vulnerability of critical assets to specific threats
  • determine the risk (i.e., the expected consequences of specific types of attacks on specific assets)
  • identify ways to reduce those risks
  • prioritize risk reduction measures based on a strategy

Beginning in 2003, the Department of Homeland Security has been accumulating a list of infrastructure assets (specific sites and facilities). From this list the Department selects high-priority assets that it judges to be critical from a national point of view, based on the potential consequences associated with their loss. The Department intends to assess the vulnerability of all the high-priority assets it has identified. Department officials have described, in very general terms, how these vulnerability and consequence assessments are used to determine the risk each asset poses to the nation. This risk assessment is then used to prioritize subsequent protection activities. While these statements allude to some of the steps mentioned above, they do so only in the most general way. With its release of the National Infrastructure Protection Plan in June 2006, the Department laid out a much more detailed discussion of the risk management methodology it intends to use (or is using). The Department's efforts, to date, still raise several questions, ranging from the process and criteria used to populate its lists of assets, to its prioritization strategy, to the extent to which the Department is coordinating its efforts with the intelligence community and other agencies both internal and external to the Department. This report will be updated as needed.


Risk Management and Critical Infrastructure Protection: Assessing, Integrating, and Managing Threats, Vulnerabilities and Consequences

Introduction

The Homeland Security Act of 2002 and other Administration documents have assigned the Department of Homeland Security specific duties associated with coordinating the nation's efforts to protect its critical infrastructure. Many of these duties were delegated to the Information Analysis and Infrastructure Protection Directorate.1 In particular, the Directorate was charged with integrating threat assessments with vulnerability assessments in an effort to identify and manage the risk associated with possible terrorist attacks on the nation's critical infrastructure. By doing so, the Directorate is to help the nation set priorities and take cost-effective protective measures.

This report is meant to support congressional oversight by discussing, in more detail, what this task entails and issues that need to be addressed. In particular, the report defines terms (e.g., threat, vulnerability, and risk), discusses how they fit together in a systematic analysis, describes processes and techniques that have been used to assess them, and discusses how the results of that analysis can inform resource allocation and policy.

While the Directorate was given this task as one of its primary missions, similar activities are being undertaken by other agencies under other authorities, as well as by the private sector and by state and local governments. Therefore, this report also discusses, to some extent, the Directorate's role in coordinating and/or integrating these activities.

Background

The Directorate's Responsibilities

The National Strategy for Homeland Security,2 anticipating the establishment of the Department of Homeland Security, stated:

  • "... the Department would build and maintain a complete, current, and accurate assessment of vulnerabilities and preparedness of critical targets across critical infrastructure sectors...[This assessment will] guide the rational long-term investment of effort and resources."3
  • "... we must carefully weigh the benefit of each homeland security endeavor and only allocate resources where the benefit of reducing risk is worth the amount of additional cost."4

Among the specific tasks delegated to the Directorate's Undersecretary by Section 201(d) of the Homeland Security Act of 2002 (P.L. 107-296, enacted November 25, 2002) were:

  • "... identify and assess the nature and scope of terrorist threats to the homeland;"
  • "... understand such threats in light of actual and potential vulnerabilities of the homeland;"
  • "... carry out comprehensive assessments of the vulnerabilities of the key resources and critical infrastructures of the United States, including the performance of risk assessments to determine the risk posed by particular types of terrorist attacks within the United States ...."
  • "... integrate relevant information, analyses, and vulnerability assessments ... in order to identify priorities for protective and support measures ...."
  • "... develop a comprehensive national plan for securing the key resources and critical infrastructure of the United States ...."
  • "... recommend measures necessary to protect the key resources and critical infrastructure of the United States ...."

The National Strategy for the Physical Protection of Critical Infrastructure and Key Assets5 stated:

  • "DHS, in collaboration with other key stakeholders, will develop a uniform methodology for identifying facilities, systems, and functions with national-level criticality to help establish federal, state, and local government, and the private-sector protection priorities. Using this methodology, DHS will build a comprehensive database to catalog these critical facility, systems, and functions."6

Homeland Security Presidential Directive Number 7 (HSPD-7)7 stated that the Secretary of Homeland Security was responsible for coordinating the overall national effort to identify, prioritize, and protect critical infrastructure and key resources. The Directive assigned Sector Specific Agencies8 the responsibility of conducting or facilitating vulnerability assessments of their sector, and encouraging the use of risk management strategies to protect against or mitigate the effects of attacks against critical infrastructures or key resources. It also required the Secretary to produce a comprehensive, integrated National Plan for Critical Infrastructure and Key Resources Protection.9 That National Plan was to include a strategy and a summary of activities to be undertaken to: define and prioritize, reduce the vulnerability of, and coordinate the protection of critical infrastructure and key resources.

The terms "vulnerabilities," "threats," "risk," "integrated," and "prioritize" are used repeatedly in the documents cited above. However, none of the documents defined these terms or discussed how they were to be integrated and used. Also, in hearings, articles in the press, and other public discourse these terms are used loosely, clouding the intent of what is being proposed or discussed.10 What might seem trivial differences in definitions can make a big difference in policy and implementation. The following section provides definitions and a generic model for integrating them in a systematic way.

A Generic Model for Assessing and Integrating Threat, Vulnerability, and Risk

Many models/methodologies have been developed by which threats, vulnerabilities, and consequences are assessed and then used to inform the cost-effective allocation of resources to reduce risks. For this report, CRS reviewed vulnerability and risk assessment models or methodologies, including some developed and used, to varying degrees, in certain selected sectors (electric power, ports, oil and gas).11 These are listed in the "References" section of this report. In addition, this report draws upon information contained in a book by Carl Roper entitled Risk Management for Security Professionals.12 Essential elements of these models/methods have been distilled and are presented below. They may provide some guidance in overseeing DHS's methodology as it is developed and employed.

For the most part, each of the methodologies reviewed consists of certain elements. These elements can be divided into two groups: the assessments themselves, and the use of those assessments to make decisions. The elements are performed, more or less, in the following sequence (a simplified illustrative sketch of these elements follows the lists below):

Assessments

  • identify assets and identify which are most critical
  • identify, characterize, and assess threats
  • assess the vulnerability of critical assets to specific threats
  • determine the risk (i.e., the expected consequences of specific types of attacks on specific assets)

Using Assessments to Identify and Prioritize Risk Reduction Activities

  • identify and characterize ways to reduce those risks
  • prioritize risk reduction activities based on a risk reduction strategy
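The following sketch is a minimal, purely illustrative rendering of these elements in Python; the asset names, attack modes, probabilities, and consequence figures are hypothetical assumptions, not drawn from DHS or any of the reviewed methodologies.

```python
# Illustrative only: hypothetical scenarios showing how the elements fit together.
from dataclasses import dataclass

@dataclass
class Scenario:
    asset: str                 # step 1: a critical asset
    attack_mode: str           # step 2: a characterized threat
    threat_likelihood: float   # step 2: probability the attack is attempted
    vulnerability: float       # step 3: probability the attack succeeds if attempted
    consequence: float         # step 4 input: loss (here, dollars) if it succeeds

def risk(s: Scenario) -> float:
    """Step 4: expected loss for one asset/attack scenario."""
    return s.consequence * s.threat_likelihood * s.vulnerability

scenarios = [
    Scenario("substation A", "truck bomb", 0.30, 0.60, 200e6),
    Scenario("control center", "cyber intrusion", 0.50, 0.20, 500e6),
]

# Steps 5 and 6: rank scenarios so that risk reduction measures can be
# identified and prioritized against the highest risks first.
for s in sorted(scenarios, key=risk, reverse=True):
    print(f"{s.asset:15s} {s.attack_mode:16s} expected loss = ${risk(s):,.0f}")
```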

Assessments

Identifying Assets and Determining Criticality

The infrastructure of a facility, a company, or an economic sector consists of an array of assets that are necessary for the production and/or delivery of a good or service. Similarly, the infrastructure of a city, state, or nation consists of an array of assets necessary for the economic and social activity of the city and region, and for the public health and welfare of its citizens. The first step in the process is to determine which infrastructure assets to include in the study. The American Chemistry Council, the Chlorine Institute, and the Synthetic Organic Chemical Manufacturers Association, in their Site Security Guidelines for the U.S. Chemistry Industry, broadly define assets as people, property, and information. Roper's Risk Management for Security Professionals (and DOE's Energy Infrastructure Risk Management Checklists for Small and Medium Sized Energy Facilities) broadly defines assets as people, activities and operations, information, facilities (installations), and equipment and materials.

The methodologies reviewed do not provide a definitive list of such assets but suggest which ones might be considered. For example, people assets may include employees, customers, and/or the surrounding community. Property usually includes a long list of physical assets like buildings, vehicles, production equipment, storage tanks, control equipment, raw materials, power, water, communication systems, information systems, office equipment, supplies, etc. Information could include product designs, formulae, process data, operational data, business strategies, financial data, employee data, etc. Roper's examples of activities and operations assets include such things as intelligence gathering and special training programs. Many methodologies suggest considering, initially, as broad a set of assets as is reasonable.

However, not all assets are equally important. To focus assessment resources, all of the methodologies reviewed suggest concentrating on those assets judged to be most critical. Criticality is typically defined as a measure of the consequences associated with the loss or degradation of a particular asset. The more the loss of an asset threatens the survival or viability of its owners, of those located nearby, or of others who depend on it (including the nation as a whole), the more critical it becomes.

Consequences can be categorized in a number of ways: economic; financial; environmental; health and safety; technological; operational; and, time. For example, a process control center may be essential for the safe production of a particular product. Its loss, or inability to function properly, could result not only in a disruption of production (with its concomitant loss of revenue and additional costs associated with replacing the lost capability), but it might also result in the loss of life, property damage, or environmental damage, if the process being controlled involves hazardous materials. The loss of an asset might also reduce a firm's competitive advantage, not only because of the financial costs associated with its loss, but also because of the loss of technological advantage or loss of unique knowledge or information that would be difficult to replace or reproduce. Individual firms, too, have to worry about loss of reputation. The American Petroleum Institute and the National Petrochemical and Refiners Association (API/NPRA) in their Security Vulnerability Assessment Methodology for the Petroleum and Petrochemical Industries also suggested considering the possibility of "excessive media exposure and resulting public hysteria that may affect people that may be far removed from the actual event location."13

While the immediate impact is important, so, too, is the amount of time and resources required to replace the lost capability. If losing the asset results in a large immediate disruption, but the asset can be replaced quickly and cheaply, or there are cost-effective substitutes, the total consequence may not be so great. Alternatively, the loss of an asset resulting in a small immediate consequence, but which continues for a long period of time because of the difficulty in reconstituting the lost capability, may result in a much greater total loss.

Another issue decision makers may consider is whether the loss of a particular asset could lead to cascading effects, not only within the facility or the company, but also across other infrastructures. For example, the loss of electric power can lead to problems in the supply of safe drinking water. The loss of a key communications node can impair the function of automated teller machines.

The initial set of assets is categorized by degree of criticality. Typically the degree of criticality is assessed qualitatively as high, medium, or low, or some variation of this type of measure. However, even if the assessment is qualitative, a number of methodologies suggest being specific about what kind of consequence qualifies an asset for each category. For example, the electric utility sector methodology suggests that a highly critical asset might be one whose loss would require an immediate response by a company's board of directors, or whose loss carries with it the possibility of off-site fatalities, property damage in excess of a specified dollar amount, or the interruption of operations for more than a specified period of time. Alternatively, an asset whose loss results in no injuries, or shuts down operations for only a few days, may be designated as having low criticality.
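As a concrete illustration of such category definitions, the short function below assigns a criticality rating from a few explicit thresholds. The particular thresholds and input fields are hypothetical assumptions made for this sketch, not values from the electric utility methodology or from DHS.

```python
# Hypothetical criticality screen; the thresholds below are illustrative only.
def criticality(offsite_fatalities_possible: bool,
                property_damage_usd: float,
                outage_days: float) -> str:
    """Assign a qualitative criticality rating to a single asset."""
    if offsite_fatalities_possible or property_damage_usd > 50e6 or outage_days > 30:
        return "high"
    if property_damage_usd > 5e6 or outage_days > 7:
        return "medium"
    return "low"   # e.g., no injuries and operations down only a few days

print(criticality(False, 80e6, 3))   # -> high (damage threshold exceeded)
print(criticality(False, 1e6, 2))    # -> low
```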

For those sectors that are not vertically integrated, ownership of infrastructure assets may span a number of firms or industries. Whoever is doing the analysis may feel constrained to consider only those assets owned and operated by the analyst or the analyst's client. For example, transmission assets (whether pipeline, electric, or communication) may not be owned or operated by the same firms that produce the commodity being transmitted. Both the production assets and the transmission assets, however, are key elements of the overall infrastructure. Also, a firm may rely on the output from a specific asset owned and operated by someone else. The user may consider that asset critical, but the owner and operator may not. Some of the methodologies reviewed encourage the analyst to also consider (or at least account for) the vulnerability of those assets owned or operated by someone else that provide critical input into the system being analyzed. These "interdependency" problems are often characterized in terms of inter-sector dependencies (e.g., the reliance of water systems on electric power), but they may also exist intra-sector. The interdependency issue is both a technical one (i.e., identifying the interdependencies) and a political/legal one (i.e., how entity A can induce entity B to protect an asset).

Identify, Characterize, and Assess Threat

Roper and the API/NPRA define threat as "any indication, circumstance or event with the potential to cause loss or damage to an asset."14 Roper includes an additional definition: "The intention and capability of an adversary to undertake actions that would be detrimental to U.S. interests."15

To be helpful in assessing vulnerability and risk, threats need to be characterized in some detail. Important characteristics include type (e.g., insider, terrorist, military, or environmental (e.g., hurricane, tornado)); intent or motivation; triggers (i.e., events that might initiate an attack); capability (e.g., skills, specific knowledge, access to materials or equipment); methods (e.g., use of individual suicide bombers, truck bombs, assault, cyber); and trends (what techniques have groups used in the past or have experimented with, etc.).

Information useful to characterizing the threat can come from the intelligence community, law enforcement, specialists, news reports, analysis and investigations of past incidents, received threats, or "red teams" whose purpose is to "think" like a terrorist. Threat assessment typically also involves assumptions and speculation since information on specific threats may be scant, incomplete, or vague.

Once potential threats have been identified, both generically (e.g., terrorists) and specifically (e.g., Al Qaeda), and characterized, a threat assessment estimates the "likelihood of adversary activity against a given asset or group of assets."16 The likelihood of an attack is a function of at least two parameters: (a) whether or not the asset represents a tempting target based on the goals and motivation of the adversary (i.e., would a successful attack on that asset further the goals and objectives of the attacker); and (b) whether the adversary has the capability to attack the asset by various methods. Other parameters to consider include the past history of such attacks against such targets by the same adversary or by others, and the availability of the asset as a target (e.g., is the location of the target fixed or does it change, and how would the adversary know of the target's existence or movement). The asset's vulnerability to various methods of attack (determined in the next step) may also affect the attractiveness of the asset as a target.

As an example of a threat assessment technique, the U.S. Coast Guard, using an expert panel made up of Coast Guard subject matter and risk experts, evaluated the likelihood of 12 different attack modes against 50 different potential targets (i.e., 600 scenarios). Attack modes included "... boat loaded with explosives exploding along side a docked tank vessel," or "... tank vessel being commandeered and intentionally damaged." The Coast Guard also considered scenarios where port assets could be stolen or commandeered and used as a weapon or used to transport terrorists or terrorism materials. Potential targets included various types of vessels (including ferries), container facilities, water intakes, utility pipelines, hazardous materials barges, etc. The panel of experts judged the credibility of each scenario. For example, using a military vessel for transporting terrorists or terrorism materials was judged not to be credible given the inherent security measures in place, but an external attack on a military target was considered credible. Each credible scenario was assigned one of 5 threat levels representing the perceived probability (likelihood) of it occurring, after considering the hostile group's intent, its capabilities, prior incidents, and any existing intelligence.
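In outline, this kind of assessment enumerates every attack-mode/target pairing, screens out combinations judged not credible, and assigns a threat level to the remainder. The sketch below illustrates only that bookkeeping; the attack modes, targets, screening rule, and threat levels are hypothetical placeholders, not the Coast Guard's actual lists or judgments.

```python
# Placeholder enumeration of attack-mode/target scenarios (illustrative only).
from itertools import product

attack_modes = ["explosive-laden boat", "commandeered tank vessel", "truck bomb"]
targets = ["ferry", "container facility", "water intake", "military vessel"]

def credible(mode: str, target: str) -> bool:
    # A real assessment relies on expert judgment; this rule is a stand-in.
    return not (target == "military vessel" and mode == "commandeered tank vessel")

scenarios = [(m, t) for m, t in product(attack_modes, targets) if credible(m, t)]

# Each credible scenario would then be assigned one of five threat levels,
# reflecting intent, capability, prior incidents, and current intelligence.
threat_level = {scenario: 3 for scenario in scenarios}   # placeholder levels

print(f"{len(scenarios)} credible scenarios out of {len(attack_modes) * len(targets)}")
```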

The Electricity Sector's methodology used a checklist that asked for the specific attack mode (such as the use of explosives, a truck bomb, or a cyber attack) and whether it was likely that such an attack would be carried out by: (a) an individual; or (b) an assault team of up to five members. In this case, the analyst is to identify likely targets for each type of attack scenario and the objective that the adversary would achieve by such an attack.

Likelihood can be measured quantitatively, by assigning it a probability (e.g., an 85% chance of occurring), or qualitatively, such as "Very High Threat Level," which might mean there is a credible threat, with a demonstrated capability, and it has happened before. As with criticality, a number of methodologies suggested specific criteria be used to define what would constitute varying threat levels.

A threat assessment need not be static in time. Threats (i.e., the likelihood that an adversary may attack) may rise and fall over time, depending on events, anniversary dates, an increase in capability, or the need for the adversary to reassert itself. Intelligence may detect activity that indicates pre-attack activity or a lull in such activity, or an explicit threat may be made.

Assess Vulnerability

Roper defines vulnerability as a "weakness that can be exploited to gain access to a given asset."17 The API/NPRA expands this definition to include "... and subsequent destruction or theft of [the] ... asset."18 The Coast Guard defines vulnerability as "the conditional probability of success given that a threat scenario occurs."19

Weaknesses, like criticality, can be categorized in a number of ways: physical (accessibility, relative locations, visibility, toughness, strength, etc.), technical (susceptible to cyber attack, energy surges, contamination, eavesdropping, etc.), operational (policies, procedures, personal habits), organizational (e.g., would taking out headquarters severely disrupt operations), etc.

Countermeasures may already exist to address these weaknesses. A vulnerability assessment must evaluate the reliability and effectiveness of those existing countermeasures in detail. For example, security guards may provide a certain degree of deterrence against unauthorized access to a certain asset. However, to assess their effectiveness, a number of additional questions may need to be asked. For example, how many security guards are on duty? Do they patrol or monitor surveillance equipment? How well equipped and well trained are they to delay or repulse an attempt to gain access? Have they successfully repulsed any attempt to gain unauthorized access?

Vulnerabilities are assessed by the analyst against specific attacks. API/NPRA identifies three steps in assessing vulnerabilities: (1) determine how an adversary could carry out a specific kind of attack against a specific asset (or group of assets); (2) evaluate existing countermeasures for their reliability and their effectiveness to deter, detect, or delay the specific attack; and (3) estimate the current state of vulnerability and assign it a value. The specific types of attacks considered can be informed by the preceding threat assessment.

The Coast Guard measured the vulnerability of potential targets for each attack scenario in four areas: (1) is the target available (i.e., is it present and/or predictable as it relates to the adversary's ability to plan and operate); (2) is it accessible (i.e., how easily can the adversary get to or near the target); (3) what are the "organic" countermeasures in place (i.e., what is the existing security plan, communication capabilities, intrusion detection systems, guard force, etc.); and (4) is the target hard (i.e., based on the target's design complexity and material construction characteristics, how effectively can it withstand the attack). Each of these four factors was evaluated on a scale of 1 to 5, with each level corresponding to an assigned probability of a successful attack. By comparison, the electricity sector process measured vulnerability as the probability that existing countermeasures can mitigate specific attack scenarios (e.g., probability of surviving an attack = 80%).
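One way scores like these could be translated into a single vulnerability estimate is sketched below. The score-to-probability mapping and the averaging rule are assumptions made for illustration; the Coast Guard's actual tables and combination rule may differ.

```python
# Hypothetical mapping from 1-5 vulnerability scores to probabilities of success.
# Scores are assumed to be oriented so that 5 is most favorable to the attacker.
SCORE_TO_PROB = {1: 0.1, 2: 0.3, 3: 0.5, 4: 0.7, 5: 0.9}   # assumed values

def vulnerability(availability: int, accessibility: int,
                  organic_security: int, hardness: int) -> float:
    """Combine the four factor scores into one probability of a successful attack.
    Averaging the mapped probabilities is one simple choice, used here only
    for illustration."""
    scores = (availability, accessibility, organic_security, hardness)
    return sum(SCORE_TO_PROB[s] for s in scores) / len(scores)

print(vulnerability(4, 3, 2, 3))   # -> 0.5 under these assumed values
```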

Alternatively, the analyst can value vulnerability qualitatively. For example, a "highly vulnerable" asset might be one that is highly attractive as a target, for which no countermeasures currently exist against a highly credible threat. An asset with low vulnerability might be one that has multiple effective countermeasures.

Assess Risk

Risk implies uncertain consequences. Roper defines risk as the "... probability of loss or damage, and its impact..."20 The Coast Guard refers to a risk assessment as "essentially an estimate of the expected losses should a specific target/attack scenario occur."21 "Expected" loss is determined by multiplying the estimated adverse impact caused by a successful threat/attack scenario by the probabilities associated with threat and vulnerability. API/NPRA defines risk as "a function of: consequences of a successful attack against an asset; and, likelihood of a successful attack against an asset."22 "Likelihood" is defined as "a function of: the attractiveness of the target to the adversary [based on the adversary's intent and the target's perceived value to the adversary], degree of threat [based on adversary's capabilities], and degree of vulnerability of the asset."23 An important point is that risk, as defined here, is a discounted measure of consequence; i.e., discounted by the uncertainty of what might happen (see the example given below).

As noted in the first step, impact can be categorized in a number of ways. Impact or consequences may be measured more precisely at this point in the process, however, to better inform the prioritization of risk reduction steps that follows.

The Coast Guard considered six categories of impact: death/injury; economic; environmental; national defense; symbolic effect; and secondary national security issues. For each target/attack scenario, the potential impact in each of these categories was measured on a severity scale from 1 to 5 (from low to catastrophic). The assigned scale value was based on benchmarks. The API/NPRA, which used a similar construct, suggested the following benchmarks for its severity scale. The severity of death and injury varied from high to low depending on whether they occurred off-site or on-site, and whether they were certain or possible. The severity of environmental damage again varied from high to low depending on whether it was large scale (spreading off-site) or small scale (staying on-site). The severity of financial losses or economic disruptions was based on threshold dollar amounts and time frames.

The analyst can also try to measure risk quantitatively. For example, for a specific target/attack scenario, the analysis may determine that there is a 50/50 chance (i.e., we don't know) that the adversary will try to attack a particular government building. But, if they did, there is a 75% chance that they would use a truck bomb (i.e., we are pretty sure that if they attack they would try to use a truck bomb). If they try to use a truck bomb, the vulnerability assessment determined that they would have a 30% chance of succeeding (i.e., if they try, there is a good chance that the current protective measures will prevent them from getting close enough to the building to bring it down). The consequences of a successful attack (bringing the building down) could be 500 people killed and $300 million in property damage.24 The risk associated with this scenario would be:

expected loss = (consequence) x (probability that an attack will occur) x (conditional probability that the attacker uses a truck bomb) x (the conditional probability that they would be successful),25 or

(500 people killed + $300 million in damage) x (.5) x (.75) x (.3), or

risk = 56 expected deaths and $33.8 million in expected damages.26
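The arithmetic in this example is simple enough to reproduce directly, as in the short script below, which uses the hypothetical probabilities and consequence figures from the example so that the individual factors can be varied.

```python
# Reproduces the illustrative expected-loss calculation in the example above.
p_attack = 0.5          # chance the adversary attempts an attack
p_truck_bomb = 0.75     # conditional chance a truck bomb is the method used
p_success = 0.30        # conditional chance the attack succeeds
deaths, damage = 500, 300e6   # consequences of a successful attack

expected_deaths = deaths * p_attack * p_truck_bomb * p_success
expected_damage = damage * p_attack * p_truck_bomb * p_success

print(round(expected_deaths), "expected deaths,",
      f"${expected_damage:,.0f} in expected damage")
# -> 56 expected deaths, $33,750,000 (about $33.8 million) in expected damage
```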

Risk is often measured qualitatively (e.g., high, medium, low). Since consequences may be measured along a number of different vectors, and threat and vulnerability have been measured separately, a qualitative measure of risk must rest on some criteria for integrating these different qualitative measures. For example, how should the assessment decide what risk rating to give a medium threat against a highly vulnerable target that would have a low death/injury impact, a medium environmental impact, but a high short-term financial impact? Does this scenario equal a high, medium, or low level of risk?
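A common way to answer such questions is an agreed risk matrix or scoring rule that maps combinations of qualitative ratings to an overall rating. The sketch below shows one hypothetical rule of that kind; it is not a matrix used by DHS or any of the reviewed methodologies, and in practice decision makers would need to agree on the criteria behind every cell.

```python
# Hypothetical rule for combining qualitative ratings into an overall risk level.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def qualitative_risk(threat: str, vulnerability: str, consequences: dict) -> str:
    """Combine ratings; here the worst single consequence dominates the consequence side."""
    worst_consequence = max(LEVELS[rating] for rating in consequences.values())
    score = LEVELS[threat] + LEVELS[vulnerability] + worst_consequence  # one possible rule
    if score >= 8:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# The case posed in the text: medium threat, highly vulnerable target, low
# death/injury impact, medium environmental impact, high short-term financial impact.
print(qualitative_risk("medium", "high",
                       {"death/injury": "low",
                        "environmental": "medium",
                        "financial": "high"}))   # -> high under this particular rule
```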

Using Assessments to Identify and Prioritize Risk Reduction Activities

Identify Ways to Reduce Risk

Risks can be reduced in a number of ways: by reducing threats (e.g., by eliminating or intercepting the adversary before an attack occurs); by reducing vulnerabilities (e.g., hardening or toughening the asset to withstand the attack); or by reducing the impact or consequences (e.g., building back-up systems or isolating facilities from major populations). For each potential countermeasure, the benefit in risk reduction should also be determined.27 More than one countermeasure may exist for a particular asset, or one countermeasure may reduce the risk for a number of assets. Multiple countermeasures should be assessed together to determine their net effects. The analyst should also assess the feasibility of each countermeasure.

The cost of each countermeasure must also be determined. Costs, too, are multidimensional. There may be up-front financial costs associated with materials, equipment, installation, and training. There are also longer-term operational costs of the new protective measures, including maintenance and repair. There may also be operational costs associated with changes to overall operations. Costs also include time and the impact on staff, customers, vendors, etc. Expenditures on the protection of assets also result in opportunity costs (i.e., costs associated with not being able to invest those resources in something else).

Prioritize and Decide In What to Invest

Once a set of countermeasures has been assessed and characterized by impact on risk, feasibility, and cost, priorities may be set. Decision makers would have to come to a consensus on which risk reduction strategy to use to set priorities.

Most of the methods reviewed suggest a cost-effective selection process (i.e., implementation of the risk-reduction measure(s) should not cost more than the benefit derived from the reduced risk). Cost-effectiveness could also imply that the country invest in risk reduction up to the point where the marginal cost to society equals the marginal benefit. Alternatively, given a fixed budget, cost-effectiveness might imply investing in protections that maximize the benefits for that investment. Countermeasures that lower the risk to a number of assets may prove to be most cost-effective. Also, focusing attention on those assets associated with the highest risks may yield the greatest risk reduction and be one way to implement a cost-effective approach.
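Under a fixed budget, one simple way to approximate this strategy is to rank candidate countermeasures by estimated risk reduction per dollar and fund down the list while the budget lasts. The sketch below illustrates that greedy approach with hypothetical countermeasures and figures; it deliberately ignores interactions among countermeasures, which, as noted above, should be assessed together in practice.

```python
# Hypothetical countermeasures: (name, estimated risk reduction in $, cost in $).
candidates = [
    ("harden control center",    40e6, 5e6),
    ("add buffer-zone patrols",  10e6, 1e6),
    ("backup transformer stock", 25e6, 8e6),
]
budget = 9e6

# Greedy selection by risk reduction per dollar; also require benefit > cost.
selected, spent = [], 0.0
for name, benefit, cost in sorted(candidates, key=lambda c: c[1] / c[2], reverse=True):
    if spent + cost <= budget and benefit > cost:
        selected.append(name)
        spent += cost

print(selected, f"(total spent ${spent:,.0f})")
```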

While cost-effectiveness is usually the recommended measure for setting priorities, decision makers may use others. For example, decision makers may be risk averse. In other words, even if the chance of an attack is small, or the potential target is not particularly vulnerable, the consequences may be too adverse to contemplate. In this case, decision makers may wish to bear the costs of additional protection that exceed the "expected" reduction in risk. Roper notes, however, that, in general, protection costs should not exceed a reasonable percentage of the total value of the asset.28

Another measure by which to select protective actions might be to favor maximizing the number or geographical distribution of assets for which risks are reduced. Alternatively, decision makers might want to focus efforts on reducing a specific threat scenario (e.g., dirty bombs) or protecting specific targets (e.g., events where large numbers of people attend).

The electric utility checklist states that the ultimate goal of risk management is to select and implement security improvements to achieve an "acceptable level of risk" at an acceptable cost. The concept of acceptable risk is mentioned in a number of methodologies, and it needs to be determined by decision makers.

After selecting which protective measures to pursue, programs, responsibilities, and mechanisms for implementing them must be established. Many of the reviewed methodologies conclude with the recommendation to revisit the analysis on a regular basis.

Status of Directorate's Risk Management Efforts

Following September 11, 2001, owners/operators of critical infrastructure assets, to varying degrees, began identifying critical assets, assessing their vulnerabilities to attack, and developing security plans or increased protections. For example, the Federal Transit Administration assessed the vulnerabilities of the nation's largest mass transit systems. The freight rail companies developed additional security measures to coincide with the level of threat identified by DHS's color-coded National Alert System. The Public Health Security and Bioterrorism Preparedness Act (P.L. 107-188) required drinking water authorities to conduct vulnerability assessments and to develop security plans based on those assessments. The Maritime Transportation Security Act (P.L. 107-295) required port facilities and maritime vessels to do the same. The American Petroleum Institute, the North American Electric Reliability Council, and other industry associations offered guidance to their respective members on how to conduct vulnerability assessments and how to manage their risk from possible attack. DHS's ability to coordinate this activity developed more slowly. Only in June 2006 did it release its National Infrastructure Protection Plan, which details a uniform risk management methodology that could allow DHS to generate a set of national priorities across all sectors (see below).

Directorate's Internal Activity

While it has been developing the NIPP, the Directorate has been engaged in its own risk management activities. Shortly before the beginning of Operation Iraqi Freedom in 2003, as part of Operation Liberty Shield,29 the Directorate identified a list of 160 assets or sites, including chemical and hazardous materials sites, nuclear power plants, energy facilities, business and finance centers, and more, that it considered critical to the nation based on their vulnerability to attack and potential consequences. Over time this list grew. In testimony before the House Appropriations Committee on April 1, 2004, then-Undersecretary for Information Analysis and Infrastructure Protection Frank Libutti stated that DHS had identified 1,700 sites as high-priority sites.30

According to the testimony, DHS intended to visit each of these sites. These Site Assistance Visits (SAVs) are conducted with owners and operators, on a voluntary basis, to discuss vulnerabilities and protective measures that can be taken "inside the fence." In addition, DHS meets with law enforcement officials of jurisdiction to assist them in developing Buffer Zone Protection Plans (BZPPs). BZPPs focus on protections that can be taken "outside the fence," including how to identify threatening surveillance, patrolling techniques, and how to assert command and control if an incident should occur. DHS has provided training and technical assistance to help state and local law enforcement entities develop their own BZPPs.

It is not clear how many sites DHS officials have visited, how many vulnerability assessments have been conducted, how many security plans have been developed, and how many have been implemented.31 Nor has the Directorate been transparent about the processes or methodology that it uses to identify and prioritize these high-priority sites or to select the recommended protective measures. It is not even clear how many assets or sites DHS still considers to be high-priority. The original list of 1,700 or more sites received some criticism for including sites that were no longer in use or whose criticality was questioned. According to the Department's Inspector General, DHS itself found its original list unreliable.32 The Assistant Secretary for Infrastructure Protection, Robert Stephan, wrote in July 2006 that DHS had a list of more than 600 high-priority sites that it uses to focus its efforts.33 What relationship, if any, this list of 600 has to the original list of 1,700 was not explained. In a more recent statement relating to the process used to allocate federal grants in the Urban Area Security Initiative Program (see below), Secretary Chertoff said that DHS has a list of approximately 2,000 sites or assets that it considers to be of national or regional importance.34 How these sites relate to the 600 mentioned by the Assistant Secretary, or to the original 1,700 sites, was not mentioned.

Supporting State and Local Efforts

In addition to the activity discussed above, DHS also has been supporting state and local efforts to protect assets critical to them and the nation. DHS grants support a wide range of counter-terrorism activities. These include funds for law enforcement, fire fighters, emergency response and management, medical providers, citizen corps, etc. Some also include funding for critical infrastructure protection. For example, the State Homeland Security Grant Program and the Urban Areas Security Initiative grants, while primarily focused on the needs of first responders, also allow funding for critical infrastructure protection, such as the purchase of surveillance equipment, detectors, fences, cybersecurity hardware and software, etc.35 DHS also funds grants more specific to critical infrastructure protection. These include port, rail, mass transit, trucking, and inter-city bus security grants.

Allocation of funds through the State Homeland Security Grant program is based partially on a formula determined by Congress.36 Initial allocation of funds for the Urban Areas Security Initiative grants (and the more specific port and transportation-related grants mentioned above) is based on risk assessments performed by what is now called Grants and Training within DHS (formerly the Office of Domestic Preparedness), and states must justify their proposals based, in part, on a risk management process they perform.

The guidelines for the FY2006 Urban Areas Security Initiative grant program provide a glimpse into the risk assessment process used by Grants and Training, which has evolved over the last few years. DHS considered all cities with a population greater than 100,000 and any city with reported threat data during the past fiscal year. Cities on this list with shared boundaries were combined into a single candidate urban area. A 10-mile buffer was then drawn around the candidate area or city to define a geographic area in which data was evaluated. This area could transcend state boundaries, leading to a regional approach. All candidate areas with a combined population greater than 200,000 were then considered for the final analysis.
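Stripped of its geographic mechanics (merging cities with shared boundaries and drawing the 10-mile buffer requires mapping data), the screening amounts to a sequence of filters. The sketch below illustrates that logic with hypothetical city data; the population figures and adjacency groupings are made up for the example.

```python
# Simplified, hypothetical sketch of the FY2006 candidate screening logic.
from collections import defaultdict

cities = [
    {"name": "City A", "population": 450_000, "threat_reports": True,  "group": 1},
    {"name": "City B", "population": 120_000, "threat_reports": False, "group": 1},
    {"name": "City C", "population":  90_000, "threat_reports": True,  "group": 2},
]

# Step 1: keep cities with population over 100,000 or with reported threat data.
candidates = [c for c in cities if c["population"] > 100_000 or c["threat_reports"]]

# Step 2: combine candidates that share boundaries (represented here by a
# pre-assigned adjacency group; the real process also adds a 10-mile buffer).
combined = defaultdict(int)
for c in candidates:
    combined[c["group"]] += c["population"]

# Step 3: keep candidate areas with a combined population over 200,000.
eligible = {group: pop for group, pop in combined.items() if pop > 200_000}
print(eligible)   # -> {1: 570000}
```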

The FY2006 guidance made a distinction between asset-based risk and geographically-based risk, both of which were considered when making the final selection of those urban areas eligible for the FY2006 grants. Asset-based risk as described in the guidance basically follows the processes discussed in this report. It considered specific types of attacks against potential targets within the urban area, combining the risks for an overall risk estimate. Consequences included human health, economic, strategic mission, and psychological impact, but focused on human and economic impact. Threat was defined as the likelihood that an attack might be attempted and included specific types of attacks as well as strategic intent, "chatter," attractiveness of the targets within the urban area, and capabilities. Vulnerability was defined as the likelihood that an attack might succeed (although "succeed" was not defined nor were the parameters that were considered). Geographically-based risk expanded upon this by considering certain prevailing attributes intrinsic to the area that may further contribute to the level of risk; for example, proximity to national boundaries, population density and the number of visitors and commuters that pass through the urban area. Threat calculations included such things as total number of FBI investigations in the area, number of suspicious incidents that have occurred within the area, and the total number of visitors that come from countries of special interest.

Grants and Training considered the process described above to be more rigorous than previous analyses. The increased rigor is due, in part, to the more quantitative nature of the data being used and its specificity in terms of specific assets, specific attack scenarios, etc. The analysis included over 120,000 specific assets in 38 different asset types.37 Following this new methodology, not all of the urban areas that received funding in previous years were considered eligible for FY2006 funds.

To receive a grant, urban areas also must have developed an urban area security strategy, a needs assessment tied to that strategy, and an investment plan that addresses those needs. The needs assessment considers a set of capabilities that DHS has determined are necessary to prevent, protect against, and respond to various types of events. Urban areas assessed their current capabilities against these to determine where they fell short. This defined their needs. Grants were made to fund programs that DHS determined would yield the highest rate of return in meeting those needs. In the past, urban areas were allocated, a priori, a certain amount of funding for which they could apply. In FY2006, allocations were made competitively based on the investment programs submitted.

A number of urban areas saw their FY2006 grant awards decline from the previous year's, while others saw theirs increase. Those whose allocations declined (including New York City and Washington, DC) adamantly voiced their concern that DHS's methodology, or its data, was flawed. DHS, at the time, defended its allocations, saying they were based not just on risk, but on need, and on the alignment of the investment strategies with identified needs.

Since then, Secretary Chertoff has stated that the FY2007 process has introduced some "common sense" into the process. For the first time, urban areas have been divided into two tiers. Six urban areas categorized as tier 1 (i.e., areas associated with the highest risks) will receive 55% of the Urban Areas Security Initiative funds; the remaining 39 urban areas will receive the balance. Also, the number and types of infrastructure assets that figure into the analysis have been reduced (from over 120,000 to approximately 2,000). Only those assets whose loss would have a national or regional economic impact (or an impact on military readiness) are being considered. Assets such as office buildings and monuments (and presumably stadiums, casinos, theme parks, etc.), which were considered specifically in FY2006, will not be considered specifically in FY2007. The rationale for not including these assets is that concerns about them in the past were primarily casualty related, which will be captured instead by criteria related to population: total population, population density, and numbers of commuters and tourists. Even with these changes, however, the allocation of funds within each tier will still be competitive, based, again, on the ability of urban areas to align their proposals with identified needs and return on investment.

The National Infrastructure Protection Plan

The National Infrastructure Protection Plan (NIPP) is meant to provide a unifying structure for integrating critical infrastructure protection efforts, including those already underway, and to guide protection investments both within each sector and among sectors. The NIPP plans to use sector-level plans, to be developed cooperatively by Sector Specific Agencies and representatives of their sector, as its foundation. The NIPP outlines what would become a common framework by which each sector could identify critical assets, conduct risk assessments (by integrating threat, vulnerability and consequences), and, then, use the results to help direct resources toward those activities that can most reduce the risks for a given investment.

The risk management process described in the National Infrastructure Protection Plan (NIPP) contains all of the elements described above. It calls for the setting of specific goals in terms of the security and recovery posture that the sectors wish to attain. It calls for the identification of the assets that constitute each infrastructure and for screening these for criticality based on potential consequences. Factors to consider include the asset's function, proximity to significant populations or other critical assets, and relative importance to the national economy.

The NIPP defines risk as being a function of consequences, vulnerability, and threat. It defines consequences as the negative effects on public health and safety, the economy, public confidence, and the functioning of government that can be expected if an asset is damaged, destroyed, or disrupted by a terrorist attack or natural disaster. Consequences include impacts on human life and physical well-being; both direct and indirect economic impacts (e.g., the cost to respond, the cost to rebuild, downstream costs resulting from disruption of a product or service, and long-term environmental costs); impact on public confidence; and impact on governments' ability to maintain order and provide minimum essential services. It states that consequence estimates should consider the worst-reasonable-case scenario.

Vulnerability is defined as the likelihood that a characteristic of, or flaw in, an asset's design, location, security posture, process or operation renders it susceptible to destruction, incapacitation, or exploitation. Vulnerability assessments are to be scenario based, including specific attack tactics and weapons. Vulnerability assessments should consider operational, people, cyber, as well as physical issues.

The NIPP defines threat as the likelihood that a particular asset will suffer an attack or incident, based on the intent and capability of an adversary or the probability of a natural event. Threat should consider methods and tactics, including physical and cyber, and should also consider insider threats as well as external threats.

The NIPP calls for these elements to be measured quantitatively if possible, or on a numeric scale if necessary, and to be combined mathematically to calculate a numerical risk score. Consideration of risk reduction measures is to follow a two-step process. The first step is to focus on those assets with the highest risk scores. The second step is to identify protective measures expected to result in the greatest reduction of risk for any given investment in these high-priority assets. Protective measures should include actions that can prevent, deter, or mitigate a threat, reduce a vulnerability, minimize consequences, or enable timely and efficient response and recovery. According to the NIPP, some issues to consider when estimating cost-effectiveness are: lowering of coordination costs; long lead-time investments; appropriate roles for stakeholders; existing market incentives; and public interests.
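A highly simplified sketch of what the NIPP's numerical approach might look like follows: each element is expressed on a numeric scale, combined mathematically into a risk score, and the highest-scoring assets are examined first for the measures that buy the most risk reduction per dollar. The scales, the multiplication rule, and the asset names and values are assumptions made for illustration, not values or methods taken from the NIPP.

```python
# Illustrative NIPP-style risk scoring; all scales and figures are assumed.
assets = {
    # name: (consequence, vulnerability, threat), each scored here on a 1-10 scale
    "regional refinery": (9, 6, 5),
    "rail bridge":       (6, 7, 3),
    "data center":       (4, 3, 6),
}

def risk_score(consequence: int, vulnerability: int, threat: int) -> int:
    """Combine the three elements mathematically; multiplication is one simple option."""
    return consequence * vulnerability * threat

# Step 1: focus on the assets with the highest risk scores.
ranked = sorted(assets.items(), key=lambda item: risk_score(*item[1]), reverse=True)
for name, scores in ranked[:2]:
    print(name, "risk score =", risk_score(*scores))

# Step 2 (not modeled here): for those assets, identify the protective measures
# expected to yield the greatest risk reduction for a given investment.
```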

Finally, metrics should be developed that can track the performance of the protective measures being implemented and which can be used to provide feedback to the risk management process.

Questions and Issues

While the statements and documents referenced above allude to many of the steps outlined in the first part of this report, many questions still remain regarding process, methodology, criteria, etc.

Identifying Assets

According to the DHS Inspector General, the list of high-priority sites begun by DHS as part of Operation Liberty Shield eventually morphed into a much larger and broader list of infrastructure assets now called the National Asset Database. According to the Inspector General, as of January 2006, the Database included over 77,000 entries covering all the critical infrastructure sectors. DHS continues to refine and populate the Database.

The Database has generated considerable debate.38 A primary concern is that it includes thousands of entries that many consider not to be of national significance. The Inspector General also opined that it did not include some assets that many might consider to be of national significance. Other concerns include the accuracy and quality of the data included on each entry and the inconsistency of data from state to state and locality to locality (for example, some regional mass transit system assets were characterized en masse, while others were characterized station by station).

While conceding that the quality and consistency of data were a problem early in the development of the Database, DHS has taken a number of steps to correct these problems. However, in response to concerns that the Database includes assets that are hard to imagine being nationally significant, DHS asserts that the Database is an inventory of assets and not a list of critical assets. In other words, it represents a list of assets, supplied by states and localities, commercial and private databases, and other sources, from which critical assets can be identified. This would appear to correspond with the initial step of a risk management process: identifying assets. Even so, critics feel that the Database should be purged of those assets found not to be of national significance. DHS has rejected this idea.

Selecting High Priority Assets

On what basis did (or does) the Directorate select the 1,700 (or 600, or approximately 2,000) high-priority assets? According to the Undersecretary, in his testimony referenced above, the 1,700 assets were ones with a credible potential for loss of life and loss of citizen confidence, impacts that would be felt nationally. He described these assets as "ones we cannot afford to lose."

Roper, and other methodologies reviewed for this report, recommended that the criteria for assessing the level of criticality be specific. For example, at what point is the impact of an attack felt nationally rather than primarily locally or regionally? How many casualties rise to the level of having a national impact? What level of economic impact or what measure of reduced confidence would rank an asset as nationally critical? Again, the answers to these questions would probably require a consensus among decision makers.

An example of an analysis that provides more detail as to what might be considered nationally critical can be found in a white paper entitled Interagency Paper on Sound Practices to Strengthen the Resilience of the U.S. Financial System. The authors of the white paper, the Federal Reserve Board, the U.S. Securities and Exchange Commission, and the Office of the Comptroller of the Currency, determined that a disruption in the services of certain "core clearing and settlement" organizations could, by virtue of their market share, present a systemic risk to the smooth operation of the financial markets they service. The paper defined "systemic risk" as the risk that the failure of one participant to meet its required obligations will cause other participants to be unable to meet their obligations when due, causing significant liquidity or credit problems and threatening the stability of financial markets. The white paper identified a threshold market share, above which a firm's plans associated with back-up capacity, geographic location, and recovery would be subject to review by the appropriate agency.

According to the DHS Inspector General, in a second more detailed request to states for data to populate the National Asset Database, DHS offered more specific guidance for identifying "national level" critical infrastructure. For example:

  • producers with herds of more than 20,000 bovine, 30,000 swine, or 500,000 poultry, or with distribution to more than 10 states, or with production of 50,001-250,000 bushels of crops;
  • chemical sites that could cause death or serious injury in the event of a chemical release and have greater than 300,000 persons within a 25-mile radius of the facility;
  • major power generation facilities that exceed 2,000 MW and, if successfully attacked, would disrupt the regional electric grid;
  • refineries with refining capacity in excess of 225,000 barrels per day;
  • cruise ports/terminals located within urban centers with a population of greater than 500,000 or servicing greater than 10,000 passengers daily;
  • seaports and facilities that service the Strategic Petroleum Reserve.

These criteria are similar (but not necessarily identical) to those offered in the guidelines for the State Homeland Security Grants and the Urban Areas Security Initiative Grants. It is not known whether the Directorate's internal activity uses these criteria.

Assessing Threat

The Homeland Security Act assigned to the Directorate the responsibility of integrating all-source information in order to identify and assess the nature and scope of terrorist threats against the homeland and to detect and identify threats of terrorism against the United States. However, shortly after the act was passed, the Bush Administration, in January 2003, established the Terrorist Threat Integration Center (TTIC) and placed it within the Central Intelligence Agency. Many observers felt that the TTIC assumed many of the same responsibilities as the Information Analysis (IA) function of the Directorate. The Homeland Security Act designated DHS a member of the intelligence community, and, as such, DHS was given a seat at the TTIC. Issues and concerns associated with the division of labor between TTIC and the Directorate, expressed at the time, are beyond the scope of this report.39 Passage of the Intelligence Reform and Terrorism Prevention Act (P.L. 108-458), which created the position of Director of National Intelligence and established a National Counterterrorism Center within that office, has raised additional questions.40 The 2005 reorganization of DHS moved the IA function out of the new Preparedness Directorate and put it directly under the Secretary.

Regardless of the organizational changes that have occurred, there are two key questions that are relevant to this report. Is there a consistent characterization of the threat used throughout the intelligence community and made available to the Directorate and beyond to other stakeholders? Is that characterization used consistently to inform the teams sent out to do vulnerability assessments or those agencies and other stakeholders tasked with assessing the vulnerabilities of the sectors for which they are responsible?

According to the National Infrastructure Protection Plan (NIPP), the Homeland Infrastructure Threat and Risk Analysis Center (HITRAC) will develop three types of threat analysis that can be used by each sector in its risk assessments. These products are: Common Threat Scenarios, General Threat Environment, and Specific Threat Information. The Common Threat Scenarios are descriptions ("detailed vignettes") of potential terrorist attack methods, based on known or desired capabilities of specific terrorist groups. The General Threat Environment analysis will be more sector- and subsector-specific. According to the NIPP, each potential attack method will be cross-referenced with each potential set of targets across all sectors, based on whether that attack scenario could achieve the goals and objectives of the attack. The resulting Terrorist Strategic Target Selection Matrix will help sectors narrow the range of threats they need to consider in their subsequent vulnerability, consequence, and risk assessments. In other words, a blank cell in the matrix indicates that the intelligence analysts do not think that particular attack scenario would likely be used or be successful against a particular target set. The Specific Threat Information is based on real-time intelligence information of explicit threats that could cause the nation's (or a sector's) alert level to rise. The General Threat Environment will be updated as needed based on Specific Threat Information. It is unlikely that earlier risk management activities benefited from this analysis.
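To illustrate the kind of cross-referencing the NIPP describes, the sketch below encodes a few hypothetical attack methods and target sets (none drawn from the NIPP itself) and shows how a blank cell screens a scenario out of a sector's subsequent assessments.

```python
# Hypothetical illustration of a target-selection matrix. Rows are attack
# methods, columns are target sets; an entry of True means analysts judge the
# scenario worth carrying into vulnerability, consequence, and risk
# assessments, and an absent ("blank") cell screens it out.
matrix = {
    "vehicle bomb":    {"chemical facility": True, "dam": True},
    "cyber intrusion": {"electric substation": True},
    # no ("cyber intrusion", "dam") entry: that scenario is screened out
}

def scenarios_for(target_set: str) -> list[str]:
    """Return the attack methods a sector responsible for this target set
    would still need to assess."""
    return [method for method, targets in matrix.items() if targets.get(target_set)]

print(scenarios_for("dam"))  # ['vehicle bomb']
```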

Another issue is whether the Directorate values all threats equally. For example, Al Qaeda has demonstrated capabilities in a number of attack modes (e.g., bombs, hijacking and piloting planes). But its capabilities in other attack modes are not necessarily as well developed. How does the Directorate consider this in its threat assessments? According to the Government Accountability Office (GAO),41 the Directorate has developed what are called "benchmark scenarios," but was not yet able to assess the relative likelihood of one over the other. The Matrix referred to in the NIPP, to the extent it exists, could suggest that this may no longer be an issue.42

Assessing Vulnerabilities

The testimony and statements of the Directorate officials cited above implied that the Directorate will either perform or lead vulnerability assessments in the field. However, many of the early efforts were performed by contractors or detailees from other agencies until the Directorate was more fully staffed. Also, it is not clear if the Directorate used the vulnerability assessments performed by other agencies or stakeholders in lieu of doing its own. A key question is whether contractors, detailees, or other agencies and stakeholders follow a similar protocol in doing their vulnerability assessments. The NIPP is supposed to supply that standardization. DHS will accept vulnerability assessments made with alternative methodologies, if they meet certain baseline criteria identified in the NIPP (see Appendix 3A). For example, at a minimum, a sector's vulnerability assessment should consider not only physical security, but also personnel, cyber, and operational security. Dependencies and interdependencies are supposed to be considered. Also, current abilities to deter, detect, and delay attacks are to be considered. However, Congress might want to ensure that certain general considerations are included.

Assessing Consequences

What consequences does the Directorate consider when assessing risk? The testimony of the then-Under Secretary mentioned that the criticality of an asset was measured in part by loss of life and loss of citizen confidence, and the Directorate's budget justification alludes to forecasting national security, economic, and public safety implications.

HSPD-7 lists the types of attacks that animate national critical infrastructure policy. These are attacks that could: cause catastrophic health effects or mass casualties; impair federal agencies' ability to perform essential missions; undermine the ability of state and local governments to maintain order and provide essential services; damage the orderly functioning of the economy; or undermine the public's morale or confidence. One could assume that the Directorate has considered these factors in its internal assessments of risk. The NIPP states that, at a minimum, assessments should focus on the two most fundamental impacts: the human impact and the most relevant direct economic impacts (e.g., cost to rebuild, cost to respond and recover, clearly identified costs resulting from the unavailability of a product or service, and long-term environmental costs). But are they all considered together? How are different consequences integrated into an overall risk rating for a given scenario?43 Does the Directorate weigh each category of consequence equally? HSPD-7 stated that the Secretary of Homeland Security, when identifying, prioritizing, and coordinating the protection of critical infrastructures, should emphasize those infrastructures that could be exploited to cause catastrophic health effects or mass casualties comparable to those from the use of a weapon of mass destruction. In that case, might preventing an attack on the Super Bowl take precedence over preventing an attack on one of the financial "core clearing and settlement" facilities mentioned above, the destruction of which might significantly disrupt national financial markets but not necessarily lead to loss of life? To what extent, if any, is the Directorate risk averse?
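These questions can be made concrete with a small sketch. The code below is purely illustrative: the consequence categories, weights, and 0-10 scores are assumptions made for the example, not values drawn from the NIPP or from DHS practice. It shows how the choice of weights is precisely where questions of emphasis and risk aversion enter.

```python
# Illustrative only: categories, weights, and scores are assumptions for this
# sketch, not DHS values. Each asset's consequence categories are scored 0-10
# and combined into a single rating by a weighted sum.
weights = {"human": 0.5, "economic": 0.3, "mission": 0.1, "confidence": 0.1}

def consequence_rating(scores: dict[str, float]) -> float:
    """Weighted sum of category scores; setting all weights equal would mimic
    the simple additive scheme the Coast Guard used (see footnote 43)."""
    return sum(weights[c] * scores.get(c, 0.0) for c in weights)

# A stadium-type event versus a core clearing and settlement facility:
stadium = {"human": 9, "economic": 3, "mission": 1, "confidence": 8}
clearing_facility = {"human": 1, "economic": 9, "mission": 4, "confidence": 6}

print(round(consequence_rating(stadium), 2))            # 6.3
print(round(consequence_rating(clearing_facility), 2))  # 4.2
```

Whether the weights should favor human impact, as HSPD-7's emphasis on mass casualties suggests, or treat the categories equally is exactly the kind of tradeoff the Directorate has not made explicit.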

Another question is how these consequences are measured. Are potential deaths based on experiential data, models, or best estimates? How are confidence and morale, and the impact on them, measured? Are economic models used to determine economic impact? How are cascading effects due to interdependencies determined? How far down the chain of reactions does the Directorate's analysis extend?44

Recognizing the complexity of estimating some of these consequences, the NIPP states that assessment methodologies are required and that some standards for estimating consequences need to be developed. However, aside from referencing the modeling capabilities developed at the National Infrastructure Simulation and Analysis Center, the NIPP offers little in the way of setting standards for what measures to use and what assumptions need to be made.

Risk Reduction

The risk associated with a specific attack on an asset can be reduced by reducing the level of threat to it, by reducing its vulnerability to that threat, or by reducing the consequences or impact of an attack should it happen. This parallels the Bush Administration's overall strategy for homeland security: (1) prevent terrorist attacks, (2) reduce America's vulnerability to terrorism, and (3) minimize the damage and recover from attacks that do occur.45 The Department of Defense, the Central Intelligence Agency, the Federal Bureau of Investigation, elements of DHS's Border and Transportation Directorate, and other law enforcement and intelligence agencies have the primary role of reducing threat, by disrupting, finding, detaining, or eliminating individuals that threaten the United States. DHS's emergency preparedness and response activities address mitigating the consequences of an attack, through rapid response and quick recovery. The Directorate's critical infrastructure protection activities primarily address reducing an asset's vulnerability. As discussed above, it does so mainly by hardening the asset against attack and by improving the ability of those protecting the asset to deny access to it and to repulse an attack.46

This raises the question, however, of whether or not, and by what mechanism, the various efforts to reduce threat (prevent), vulnerability (protect), and consequences (prepare) are coordinated, both within DHS and between DHS and other agencies. It also raises the question of to what extent, and by what mechanism, the allocation of federal resources to these three areas is influenced by comparing the risk reduction achieved by each. For example, effective screening of people entering the country likely contributes greatly to reducing the risks associated with an attack on critical infrastructure. To what extent is the marginal risk reduction associated with an additional investment in the Department's border screening effort balanced against the marginal risk reduction associated with an additional investment in hardening assets? This would likely require a level of risk management currently beyond the Directorate's mandate.47

Prioritizing Protection Activities

According to the NIPP, prioritizing protection activities should be a two-step process. First, the critical assets that pose the greatest risks are addressed: protective measures for these assets are identified and their potential for reducing risk is determined. Second, the amount of resources available is divided among these measures in a way that maximizes the reduction in risk. Presumably, according to the NIPP, DHS will use a similar approach in recommending budget levels for these and other federal programs that address infrastructure security needs as part of a National Critical Infrastructure and Key Resources Protection Annual Report to the Office of Management and Budget.
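As a rough illustration of the NIPP's second step, the sketch below allocates a fixed budget among candidate protective measures by their estimated risk reduction per dollar. The measures, costs, and risk-reduction estimates are hypothetical, and the greedy rule shown is only one possible allocation heuristic (an exact formulation is a knapsack-style optimization).

```python
# Hypothetical sketch: divide a fixed budget among protective measures so as
# to maximize total risk reduction. All names and numbers are illustrative.
from typing import NamedTuple

class Measure(NamedTuple):
    name: str
    cost: float            # dollars
    risk_reduction: float  # estimated reduction in a risk score

def allocate(measures: list[Measure], budget: float) -> list[Measure]:
    """Greedy allocation by risk reduction per dollar spent."""
    funded = []
    for m in sorted(measures, key=lambda m: m.risk_reduction / m.cost, reverse=True):
        if m.cost <= budget:
            funded.append(m)
            budget -= m.cost
    return funded

candidates = [
    Measure("harden chemical plant perimeter", 4_000_000, 120),
    Measure("add port radiation screening", 6_000_000, 150),
    Measure("backup control center for grid operator", 3_000_000, 90),
]

for m in allocate(candidates, budget=8_000_000):
    print(m.name)
# -> harden chemical plant perimeter
# -> backup control center for grid operator
```

As the report notes below, a rule of this kind can concentrate funding in a handful of assets or leave some with nothing, which is part of what makes a purely cost-effectiveness-driven allocation politically difficult.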

In allocating funds in its Homeland Security Grants, its Urban Areas Security Initiative, and some of its more infrastructure-specific grants, DHS has resorted to ranking assets or geographic areas into tiers, based on the level of risk (or at least the potential consequences) associated with them. Funds are then allocated to each tier, and entities within each tier compete for those funds. DHS then ranks proposals based on a variety of factors, including the proposal's contribution to risk reduction or the degree to which identified needs or vulnerabilities are addressed.

While allocating resources primarily on the basis of risk-oriented cost-effectiveness seems relatively straightforward, it may not be easy to implement, and it may lead to a distribution of resources that is politically unpalatable. For example, depending on the budget, the protective measures proposed, and their expected degree of effectiveness in lowering risk, it is conceivable that most of a given budget could go to a few areas or assets, or that some areas or assets would not receive any funding. Alternatively, if proposals are only partially funded, it may be difficult to prorate the associated risk reduction. As Secretary Chertoff suggested, such a strategy may have to be modified by "common sense," something less than objective, and probably in need of explanation, if not consensus.

Conclusion

DHS and the Directorate have been tasked with a very complex problem. Security-oriented risk management is typically done at the site or facility level or at the corporate level. The Directorate is being asked to do this at the national level, assessing and comparing perhaps thousands of disparate sites and facilities it has judged as being nationally important.

The Directorate is to consider not only economic impacts and loss of life, but also the possible impact on national morale and on the ability of state and local governments to maintain order and deliver essential services. None of these is easy to measure, and all are difficult to trade off against one another, should the analysis come down to that. Determining the economic impact of the loss of an asset is more difficult than determining the effect on a company's bottom line. The Directorate has been instructed to determine economic impacts two to three levels through the supply chain. It is not clear how the Directorate can or intends to measure the impact on national morale associated with the loss of an asset, especially a cultural icon. Comparing the potential loss of life in one scenario with the potential loss of life in another scenario, while sensitive, presents a direct comparison. However, comparing the importance of an asset whose loss may result in a relatively small loss of life with another asset whose loss might result in a large economic impact is much harder.

The exercise will be less than perfect and probably less than objective. The Bush Administration and Congress are allocating resources in any event, so these choices are being made implicitly, if not explicitly. If such processes were more transparent, Congress could better oversee them and offer guidance if necessary.

The 9/11 Commission, in discussing a need for a layered security system for public transportation systems, stated that the Transportation Security Administration should be able to identify for Congress the array of potential terrorist attacks, the layers of security in place, and the reliability provided by each layer.48 Expanding on this, the Directorate should be able to tell Congress what criteria it has used to select assets of national importance, the basic strategy it uses to determine which assets warrant additional protective measures, by how much these measures could reduce the risk to the nation, and how much these additional measures might cost.

It is not clear that the Directorate has had a consistent, systematic approach for identifying nationally critical assets, assessing the risks they pose, and using that information to inform the cost-effective allocation of resources to protective action, especially in its early efforts. The NIPP appears to provide a framework for a written protocol that outlines specifically the steps taken in the risk assessment and risk management process and the assumptions, criteria, and tradeoffs that are made. While the NIPP lays out a clear process, it is not clear how transparent the implementation of the plan will be. DHS has stated that Sector-Specific Plans and their integration into a set of national priorities could be classified.49

Finally, Congress may choose to offer its guidance to the Directorate on some of these criteria or tradeoffs. To do so with the same systematic approach that the Directorate has been asked to use, the different committees with jurisdiction over different infrastructures may want to consider coordinating their advice.

References

Carl Roper, Risk Management for Security Professionals, Butterworth-Heinemann, 1999.

U.S. Coast Guard, Implementation of National Maritime Security Initiatives, Federal Register, Vol. 68, No. 126, July 1, 2003, pp. 39240-39250.

American Petroleum Institute and the National Petrochemical & Refiners Association, Security Vulnerability Assessment Methodology for the Petroleum and Petrochemical Industries, May 2003.

U.S. Department of Energy, Office of Energy Assurance, Vulnerability Assessment Methodology, Electric Power Infrastructure (Draft), September 30, 2002.

National Communications Systems, Office of the Manager, Public Switched Network Security Assessment Guidelines, September 2000.

Association of Metropolitan Sewerage Agencies, Protecting Wastewater Infrastructure Assets: Asset Based Vulnerability Checklist for Wastewater Utilities, 2002.

Government Accountability Office, Homeland Security: Key Elements of a Risk Management Approach, GAO-02-150T, October 12, 2001.

American Chemistry Council, the Chlorine Institute, and the Synthetic Organic Chemical Manufacturers Association, Site Security Guidelines for the U.S. Chemistry Industry.

Argonne National Laboratory, et al., prepared for the Office of Energy Assurance, U.S. Department of Energy, Energy Infrastructure Vulnerability Survey Checklists, February 22, 2002.

Department of Homeland Security. National Infrastructure Protection Plan. June 2006.

Footnotes

1.

The Information Analysis and Infrastructure Protection Directorate was established in the Homeland Security Act, but has since undergone two reorganizations, evolving first into the Preparedness Directorate, then subsequently into the National Protection and Programs Directorate, which currently has these responsibilities. The term "Directorate" used throughout this report refers interchangeably to these Directorates.

2.

Office of Homeland Security, National Strategy for Homeland Security, July 2002.

3.

Ibid., p. 33.

4.

Ibid., p. 64.

5.

Office of Homeland Security, The National Strategy for the Physical Protection of Critical Infrastructures and Key Assets, February 2003.

6.

Ibid., p. 23.

7.

Homeland Security Presidential Directive Number 7, Critical Infrastructure Identification, Prioritization, and Protection, December 17, 2003.

8.

The Clinton Administration referred to these as Lead Agencies in its Presidential Decision Directive Number 63 (PDD-63, May 1998). HSPD-7 supersedes PDD-63 in those instances where the two disagree.

9.

The Directive required that the National Plan be developed by the end of calendar year 2004. A completed National Infrastructure Protection Plan was released in June 2006. See http://www.dhs.gov/xlibrary/assets/NIPP_Plan.pdf.

10.

Just as one example, the 9/11 Commission Report when discussing the basis upon which federal resources should be allocated to states and localities, stated that such assistance should be based "strictly on an assessment of risks and vulnerabilities." Later, in the next paragraph, it stated "the allocation of funds should be based on an assessment of threats and vulnerabilities." In the next paragraph it stated that resources "must be allocated according to vulnerabilities." The National Commission on Terrorist Attacks Upon the United States, The 9/11 Commission Report, W.W. Norton and Company, 2004, p. 396.

11.

These models and methodologies, like the original version of this report, predate the National Infrastructure Protection Plan. As is discussed later in this report, the National Plan incorporates many of the processes and addresses many of the issues identified and discussed here. Some of these methodologies may have been superseded since the original writing of this report.

12.

Carl A. Roper, Risk Management for Security Professionals, Butterworth-Heinemann, 1999.

13.

American Petroleum Institute and the National Petrochemical and Refiners Association, Security Vulnerability Assessment Methodology for the Petroleum and Petrochemical Industries, May 2003, p. 4.

14.

American Petroleum Institute, op. cit., p. 5.

15.

Roper, op. cit., p. 43.

16.

This quote is taken from the Government Accountability Office testimony, Homeland Security: Key Elements of a Risk Management Approach, GAO-02-150T, before the Subcommittee on National Security, Veterans Affairs, and International Relations, House Committee on Government Reform, October 21, 2001. It is used in several of the other methodologies reviewed.

17.

Roper, op. cit., p. 63.

18.

American Petroleum Institute, op. cit., p. 5.

19.

Federal Register, Department of Homeland Security, Coast Guard, Implementation of National Maritime Security Initiatives, vol. 68, no. 126, July 1, 2003, p. 39245.

20.

Roper, op. cit., p. 73.

21.

Federal Register, op. cit., p. 39245.

22.

American Petroleum Institute, op. cit., p. 3.

23.

Ibid.

24.

Consequences, too, could be uncertain. For example, it may be determined that, in the above scenario, a successful attack could result in anywhere from zero to 500 deaths.

25.

This formulation assumes that the uncertainties in this case are independent, which in many cases is not accurate. The attractiveness of a target (an element in determining threat) may very much depend on its vulnerability. Likewise, the consequence of an attack may also depend on a target's vulnerability. This complicates the calculation.
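Under that independence assumption, the calculation reduces to a simple product (a minimal formulation for illustration, not a statement of the Directorate's actual method):

\[
\text{Risk} \;=\; \underbrace{P(\text{attack})}_{\text{threat}} \times \underbrace{P(\text{success}\mid\text{attack})}_{\text{vulnerability}} \times \underbrace{C}_{\text{consequence}} .
\]

If vulnerability or consequence depends on the attractiveness of the target, the probabilities no longer factor this cleanly.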

26.

Note: the risk in this scenario is not 500 people dead, but 56 expected deaths. That is not to say that if an attack were actually successfully carried out only 56 people might die. In fact, in this scenario, it has been judged that 500 people would likely die. Choosing to use the 500 potential deaths in subsequent decisions, essentially assuming an attack will occur and be successful, would be called risk averse in this construct. Taking a risk averse position is a legitimate policy option. See further discussion on risk aversion below.
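The arithmetic implied by the footnote, with the combined likelihood of a successful attack backed out from the numbers given:

\[
E[\text{deaths}] \;=\; p \times C \;=\; 0.112 \times 500 \;=\; 56,
\qquad p \;=\; \frac{56}{500} \;=\; 0.112 .
\]

A risk-averse decision maker would instead carry the conditional figure of 500 deaths forward into subsequent decisions.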

27.

Again, dependencies among threat, vulnerability, and consequences need to be considered.

28.

Roper, op. cit., p. 88.

29.

Operation Liberty Shield was a comprehensive national plan to protect the homeland during operations in Iraq.

30.

According to the Department's Inspector General, the number reached 1,849 assets. Department of Homeland Security. Office of Inspector General. Progress in Developing the National Asset Database. OIG-06-04. June 2006.

31.

The Directorate, in its FY2007 budget request, stated that 200 Site Assistance Visits were made in FY2005, and that 150 more were expected to be made each year in FY2006 and FY2007. According to the Directorate's Performance Budget Overview for FY2007, which matches specific programs with specific performance measures, vulnerability assessments had been conducted at 14% of DHS's high-priority sites in FY2005. In addition, the Directorate had set a goal of assessing vulnerabilities at 25% of its high-priority sites and of having at least two suitable protective actions implemented at 20% of its high-priority sites by FY2007. It is not clear if this refers to the 600 or more sites mentioned by the Assistant Secretary.

32.

Department of Homeland Security. Office of Inspector General, op. cit., p. 16.

33.

USA Today. "Database is Just the 1st Step," by Robert Stephan. July 21, 2006. p. 8A.

34.

Department of Homeland Security. News Release. Remarks by Secretary Michael Chertoff at a Press Conference on the Fiscal Year 2007 Homeland Security Grant Program. January 5, 2007.

35.

Fifteen percent of the Urban Areas Security Initiative grants go toward infrastructure protection. Conversation with Assistant Secretary Stephan, July 12, 2007.

36.

The formulae have generated some debate among states. For a discussion of these issues and the debate that took place within Congress in 2005, see CRS Report RL33050, Risk-Based Funding in Homeland Security Grant Legislation: Issues for the 109th Congress, by [author name scrubbed]. The formulae remain an issue for the 110th Congress.

37.

The 38 asset types were: chemical manufacturing facilities, city road bridges, colleges and universities, commercial airports, commercial overnight shipping facilities, convention centers, dams, electricity generation facilities, electricity substations, enclosed shopping malls, ferry terminals/buildings, financial facilities, hospitals, hotel casinos, levees, liquid natural gas terminals, maritime port facilities, mass transit commuter rail and subway stations, national monuments and icons, national health stockpile sites, natural gas compressor stations, non-power nuclear reactors, nuclear power plants, nuclear research labs, petroleum pumping stations, petroleum refineries, petroleum storage tank farms, potable water treatment facilities, primary and secondary schools, railroad bridges, railroad passenger stations, railroad tunnels, road commuter tunnels, stadiums, tall commercial buildings, telecommunication-telephone hotels, trans oceanic cable landings, and theme parks.

38.

For a more detailed discussion of the debate associated with the National Asset Database, see CRS Report RL33648, Critical Infrastructure: The National Asset Database, by [author name scrubbed].

39.

For information on these, see CRS Report RS21283, Homeland Security: Intelligence Support, by [author name scrubbed].

40.

See CRS Report RL33616, Homeland Security Intelligence: Perceptions, Statutory Definitions, and Approaches, by [author name scrubbed].

41.

United States Government Accountability Office. Risk Management: Further Refinements Needed to Assess Risks and Prioritize Protective Measures at Ports and Other Critical Infrastructure. GAO-06-91. December 2005.

42.

However, the Matrix may only suggest "yes" or "no" when deciding which attack scenarios to consider. The GAO report may be referring to the ability to assess relative likelihood within the set of relevant scenarios for a given target.

43.

For example, the Coast Guard considered six categories of consequences, including death/injury, economic, environmental, and symbolic impacts, all equally weighted, and assigned each a value of 1 to 5 based on severity. An overall level of risk was determined by summing these values.
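In this equal-weight scheme, the overall score is simply

\[
R \;=\; \sum_{i=1}^{6} s_i , \qquad s_i \in \{1,\dots,5\},
\]

so the composite risk level ranges from 6 to 30.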

44.

The Senate Appropriations Committee, in its report on the FY2005 appropriations bill, recommended continued funding for risk analysis activities that include evaluating second- and third-order cascade effects associated with market interdependencies.

45.

See Office of Homeland Security, The National Strategy for the Physical Protection of Critical Infrastructures and Key Assets, February 2003, p. 1.

46.

Notwithstanding the National Infrastructure Protection Plan's inclusion of measures that reduce the consequences of an attack as options to consider in reducing risk, some believe that the Directorate's critical infrastructure protection activities could be more "resilience" oriented. See Homeland Security Advisory Council, Report of the Critical Infrastructure Task Force, January 2006, http://www.dhs.gov/xlibrary/assets/HSAC_CITF_Report_v2.pdf. Last viewed July 17, 2007.

47.

The 2005 reorganization of DHS, which established the position of Under Secretary for Policy, might be a place to address this issue.

48.

The National Commission on Terrorist Attacks Upon the United States, The 9/11 Commission Report, W.W. Norton and Company, 2004, p. 392.

49.

The Department released the first version of the Sector Specific Plans (SSPs) on May 21, 2007. Of the 17 SSPs, 10 are considered For Official Use Only. The remainder can be viewed at http://www.dhs.gov/xprevprot/programs/gc_1179866197607.shtm#2. This site was last viewed on July 17, 2007.