
2 Grand Architecture

Protecting commercial aviation from terrorist threats is a complex systems problem. As illustrated in Figure 2-1, there are many paths by which a threat material (e.g., a bomb) can endanger the security of an aircraft. These threat vectors include checked baggage, cargo, mail, passengers and their carry-on bags, flight crews, catering and service personnel, and missiles. Aviation security must protect aircraft against attack—from explosives, chemical agents, biological agents, and other threat items—by all of these threat vectors, and possibly others. There is no perfect defense against all threats to commercial aviation, and optimizing aviation security with respect to performance, cost, and efficiency of air travel will ultimately require compromises in the selection of security equipment, procedures, and personnel.

The focus of this study is on the security measures deployed by the FAA for detecting and containing explosives introduced by two threat vectors, checked baggage and carry-on baggage. Although this does not encompass all potential threat vectors, designing and implementing effective security measures to address even these two vectors together is a complex systems problem. Because no single "silver bullet" technology can protect against all threat vectors, the overall security system must be a multilayered network of subsystems.

Figure 2-1 Threat vectors. The paths by which people, baggage, and equipment board a plane are also routes by which threats may board a plane.


The performance of the overall system depends on the technical capabilities of the detection and containment equipment and the performance of the equipment operators, as well as a host of environmental, management, and policy factors. System performance is also affected by operator training, maintenance practices, management priorities and loyalties, and the nature of the threat itself. Like all of the other factors that influence system performance, the nature of the threat is variable and time dependent, which contributes to the complexity of the overall security system.

Complex systems problems can best be understood in a system-of-systems (SOS) framework. The primary performance measure for an SOS for aviation security is, of course, protection against explosives. Performance can best be described by a security enhancement factor (SEF), which measures the improvement in security relative to a specified baseline (e.g., the security afforded by the system configuration in a particular year). The SEF is a simple measure that condenses an exceedingly complex SOS into a single parameter and can accommodate the variabilities and time dependencies of protective measures, as well as of the threat itself. The SEF compares an improved security system to the previous system in terms of the probability that a bomb will be taken aboard a plane (and cause catastrophic damage to it) by comparing the number of test bombs that defeat the older security system to the number that defeat the newer system. The panel concluded that the SEF is one way to reduce the complexity of analyzing the aviation security system.

In the context of a well-defined SOS framework, security systems can be designed or modified to optimize aviation security. As responsibility for security measures becomes increasingly diffuse and more and more liability claims are disputed, the need for an SOS framework for judging the viability of security systems becomes more urgent. Although predicting the performance of an aviation security system against a terrorist event is difficult, an SOS approach makes it possible to estimate the performance range of a security system based on thorough and realistic operational testing.

For the reasons cited above, the panel adopted an SOS approach to assess the FAA's deployment of explosives-detection equipment, hardened unit-loading devices (HULDs), and security procedures. Security equipment must work in concert with other units in the overall airport security system and, therefore, should be measured and assessed in this larger context. The panel developed the total architecture for aviation security (TAAS) as a framework for assessing the performance of these security measures.1 The panel presents a rationale for using a high-level systems approach and a methodology for continuously monitoring and upgrading the TAAS in response to changes in the threat environment, security technologies, and procedures. In addition, qualitative requirements for the overall system architecture are related to the performance and operational characteristics of the subsystem components, including policies. Once these concepts are endorsed by the FAA, the airlines, and the airports, quantitative measures can be developed to assess and optimize improvements to the TAAS.

Total Architecture for Aviation Security Concepts

Figure 2-2 illustrates a top-level TAAS for explosives detection and containment. The first step in the analysis is a description of the real-world threat. The threat is variable: it can be described as a range of probabilities that a specified amount, type, and configuration of explosive will enter the system, or the amount of explosive can be treated as a random variable whose distribution depends on the type of explosive. Articles accompanying the explosive and other features of the operating environment (e.g., passenger characteristics, air traffic) also appear as random variables. Because there are many ways to define the explosives threat, the modeling of the threat environment and its dependence on the venue and other factors will have to be carefully considered.
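The following is a minimal sketch of how such a threat description might be encoded for analysis. It is not a model from the report; every distribution, type name, and parameter in it is a hypothetical placeholder.

```python
import random

# Hypothetical threat model: the explosive type is a categorical random variable,
# and the amount of explosive is a random variable whose distribution depends on
# the type. All names and numbers below are invented for illustration.
EXPLOSIVE_TYPES = {          # P(type), given that a bomb enters the system
    "type_1": 0.70,
    "type_2": 0.25,
    "type_3": 0.05,
}
AMOUNT_PARAMS = {            # lognormal (mu, sigma) for the amount, by type
    "type_1": (0.0, 0.5),
    "type_2": (0.3, 0.4),
    "type_3": (-0.2, 0.6),
}

def sample_threat(rng: random.Random) -> tuple[str, float]:
    """Draw one (explosive type, amount) pair from the hypothetical threat model."""
    types, weights = zip(*EXPLOSIVE_TYPES.items())
    explosive = rng.choices(types, weights=weights, k=1)[0]
    mu, sigma = AMOUNT_PARAMS[explosive]
    amount = rng.lognormvariate(mu, sigma)   # amount distribution varies with type
    return explosive, amount

rng = random.Random(0)
print([sample_threat(rng) for _ in range(3)])
```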

The security layers in Figure 2-2 are shown as generic components labeled A, B, C, etc. These components include physical security,2 metal detection, bulk and trace explosives-detection equipment, operator inspection of x-ray screens for carry-on luggage, computer-assisted passenger screening (CAPS), positive passenger-bag matching (PPBM), and the possible containment of baggage in HULDs. Each component, together with its associated operational protocols (training, calibration, maintenance, and other procedures), constitutes a subsystem of the TAAS. Each subsystem is described by measures of its effectiveness, such as throughput rate, false-alarm rate, operational cost, installation cost, and probability of detection. Links between the subsystems are an important aspect of the TAAS. For instance, passenger-profiling information (from CAPS) could be provided to x-ray screening consoles to enable operators to rapidly resolve a security alert for a particular bag (an example of feed forward). Because these links affect the top-level performance of the TAAS, the flow pattern is also an important part of the analysis. Two routes through the TAAS are shown in Figure 2-2 (A-B-C and D-E-F-G) to indicate that there may be more than one way through the TAAS to the plane.
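As a rough illustration of these ideas (not an FAA model), the sketch below reduces each security layer to a single probability of detection, treats misses as independent, and compares the two notional routes. The feed-forward link is crudely represented as a bump in the downstream layer's detection probability; all numbers are assumptions.

```python
# Illustrative only: each subsystem on a route is reduced to one probability of
# detection, and misses are assumed independent (real layers are not independent).

def route_detection_prob(layers: list[float]) -> float:
    """P(at least one layer detects the bomb), assuming independent layers."""
    p_miss = 1.0
    for p_detect in layers:
        p_miss *= 1.0 - p_detect
    return 1.0 - p_miss

route_abc = [0.50, 0.80, 0.60]         # notional route A-B-C
route_defg = [0.50, 0.70, 0.40, 0.30]  # notional route D-E-F-G

# Crude stand-in for a feed-forward link: profiling information from CAPS raises
# the effectiveness of the downstream x-ray operator on selected bags.
route_abc_feedforward = [0.50, 0.85, 0.60]

for name, layers in [("A-B-C", route_abc),
                     ("A-B-C with feed forward", route_abc_feedforward),
                     ("D-E-F-G", route_defg)]:
    print(f"{name}: P(detect) = {route_detection_prob(layers):.3f}")
```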

The role of various authorities in the management of the TAAS may add to its complexity. Currently, airlines have significant discretion over the deployment of aviation security subsystems. Baggage, cargo, and mail flows vary among domestic and foreign air carriers, change with airline transfer protocols, and change again with the imposition of regulatory policies by various agencies, such as the FAA and local airport authorities.

1 In a recent report, Aviation Security Technology Integration Plan, the FAA uses a similar systems-architecture perspective for planning aviation security strategies (LaMonica et al., 1995).

2 Physical security includes measures to control access to concourses, gates, and airplanes to ensure that passengers and bags go through the security system.


Figure 2-2 A top-level total architecture for aviation security (TAAS).

Because of this large variability in baggage, cargo, and mail flow, the TAAS for each airport will be specific to that facility. Thus, an evaluation of a deployed system will necessarily be specific to a particular site, and a top-level strategy for future deployments must be based on both universal and site-specific characteristics.

Simplifying, comparing, and analyzing various TAAS configurations will require an overall SEF. Because the threat and other inputs are variable and the subsystems behave probabilistically, the TAAS performance measures are themselves random quantities. A primary output of the assessment of a particular TAAS would be the probability that an explosive threat will bring down an aircraft, expressed as a probability distribution over the various threat amounts. One measure of the performance of the TAAS is the SEF, which measures the reduction, relative to the baseline architecture, in the probability that a threat will bring down an aircraft.

Improved security is the ultimate goal but not the only performance measure of the TAAS. Every system architecture has costs and customer (passenger) convenience features that must be traded off against the SEF. For example, airline passengers are not likely to tolerate extra delays for extensive scrutiny of their baggage in peacetime, when no threat is perceived. Similarly, substantial purchase and deployment costs to the FAA would not be tenable for small improvements in security, and additional costs to the airlines for the TAAS would surely meet with customer resistance if they were translated into significantly higher ticket prices. These and other performance measures are also shown as outputs in Figure 2-2. One way to assess security enhancement/cost trade-offs would be to determine the cost per passenger per percent increase in the probability of detecting an explosive threat (Hammar, 1998). An evaluation of security architectures and additional discussion of the TAAS concept are contained in Chapter 10.
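A small worked example may make the trade-off metric concrete. The figures below are invented, and the metric is interpreted here as dollars per passenger per percentage point of added detection probability, which is one plausible reading of the measure attributed to Hammar (1998).

```python
# Hypothetical illustration of the security enhancement/cost trade-off metric.
# None of these figures come from the report.

annual_added_cost = 40_000_000     # added annual cost of a candidate upgrade ($)
passengers_per_year = 50_000_000   # passengers screened per year at the affected airports
detection_gain_points = 8.0        # assumed gain in P(detection), in percentage points

cost_per_passenger = annual_added_cost / passengers_per_year   # $0.80 per passenger
metric = cost_per_passenger / detection_gain_points            # $0.10 per point
print(f"${cost_per_passenger:.2f} per passenger, "
      f"${metric:.2f} per passenger per percentage point of added detection")
```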

Total Architecture for Aviation Security Subsystems

The explosives-detection equipment and containment equipment are the security subsystems that make up the networks of the TAAS in Figure 2-2. Figure 2-3 shows the subsystems that were in place prior to the 1997–1998 deployment, including conventional x-ray radiography for scanning carry-on baggage, metal-detector portals for screening passengers, canine teams for screening checked bags, physical searches of baggage, and limited passenger-bag matching. A network like the one in Figure 2-3 could be used as the baseline architecture for the SEF.

In response to the congressional mandate (PL 104-264), the FAA focused on the development and deployment of the following security measures: CAPS, FAA-certified explosives-detection systems (EDSs), noncertified bulk explosives-detection equipment, trace explosives-detection devices (TEDDs), PPBM, and HULDs. Current airport security systems also include conventional x-ray scanning, physical searches, metal detectors, and canine teams. Figure 2-4 shows a representative TAAS in the early stages of the FAA's mandated deployment. To evaluate the effectiveness of this deployment, one must estimate the improvement in security afforded by the TAAS in Figure 2-4 over that in Figure 2-3 (taking into account the performance of detection and containment equipment, management policies, and human factors).

Once reliable and statistically significant data are available on the performance characteristics of individual subsystems (or security measures) and their dependencies, the optimal configuration of the TAAS can be determined. In general, complementary detection devices yield higher SEFs than redundant identical devices. The TAAS provides a basis for a systematic analysis of trade-offs among subsystems.
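A stylized numerical example, using assumed numbers, shows why complementary devices tend to yield higher SEFs than redundant identical devices: two identical machines tend to miss the same bombs (their misses are highly correlated), whereas devices based on different physical principles are more likely to miss independently.

```python
# Stylized comparison with assumed numbers. Two identical devices are modeled as
# fully correlated (a bomb that fools one fools both); complementary devices are
# modeled as missing bombs independently.

p_detect = 0.80

p_redundant = p_detect                        # correlated misses: no improvement
p_complementary = 1 - (1 - p_detect) ** 2     # independent misses: 0.96

print(f"redundant identical devices: P(detect) = {p_redundant:.2f}")
print(f"complementary devices:       P(detect) = {p_complementary:.2f}")
```

Real devices fall somewhere between these extremes, which is why the dependencies mentioned above must be measured in field operation.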

Security Enhancement

The purpose of a system that includes hardened containers would be to reduce the probability that an onboard explosion would bring down a plane. Because hardened containers have not yet been deployed, security enhancement is currently evaluated by the decrease in the probability that a bomb will be taken onto a plane.


Figure 2-3 Notional airport security configuration for international flights prior to the 1997–1998 deployment. Source: Dickey and Fuqua (1998).

Thus, the critical system-performance measure is the proportion of bombs that defeat the security system. To evaluate the probability that a bomb will defeat a security system, realistic bomb simulants (e.g., the modular test set) must be used in blind operational tests (e.g., so-called red-team testing). The SEF of System B relative to a baseline system (System A) is defined as:

SEF (System B relative to System A) = (proportion of test bombs that defeat System A) / (proportion of test bombs that defeat System B)

The same set of test bombs and the same testing procedures are used for both systems. The probability that a bomb defeats a security system is conditioned on a bomb being present; it does not depend on the rate of bombing attempts, which will vary from the baseline year.
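A worked example with invented red-team numbers illustrates the calculation; the counts below are placeholders, not results reported by the FAA or the panel.

```python
# Invented red-team results, for illustration only. The same set of test bombs
# is run against the baseline system (A) and the improved system (B), so the
# ratio of proportions equals the ratio of counts.
test_bombs = 200
defeated_a = 40     # test bombs that got past System A
defeated_b = 10     # test bombs that got past System B

p_defeat_a = defeated_a / test_bombs    # 0.20
p_defeat_b = defeated_b / test_bombs    # 0.05

sef = p_defeat_a / p_defeat_b           # 4.0
print(f"SEF of System B relative to System A: {sef:.1f}")
```

On this reading, an SEF of 4 means the improved system lets through one-fourth as many test bombs as the baseline.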

System A will probably change over time. Therefore, the baseline system will have to be redefined as new and improved equipment becomes available and as the threat evolves with time.

Figure 2-4 Notional aviation security configuration for international flights during the early stages of the 1997–1998 deployment. Source: Dickey and Fuqua (1998).


For example, if a new explosive compound that is difficult to detect with deployed equipment suddenly becomes prevalent among terrorists, the SEF would actually decrease. In this case, the security system that was in place prior to the appearance of the new explosive compound would no longer be relevant as a baseline. The "new" baseline system (System A) would be the system in place at the time the new explosive compound became known. An improved system (System B) would be the security system after a change has been made to System A to improve performance. Thus, the SEF would be the ratio of the simulated explosives (including a simulant of the new explosive compound) that defeated System A to those that defeated System B. This scenario also has implications for determining the SEF when new equipment is deployed to an existing TAAS.

Analysis Techniques

Monte Carlo techniques can be used to calculate the distributions of the outputs shown in Figure 2-2 from repeated samples of the TAAS performance measurements for selected component and threat distributions. Because the system is stochastic, repeated Monte Carlo runs with the same inputs will produce outputs that vary from run to run. Analyses of this kind, which assess the susceptibility or risk of complex systems to various threats (or failures), are often called probabilistic risk assessments (PRAs) (Aven, 1992; Henley and Kumamoto, 1992). Three aspects of PRAs that are specific to the TAAS analysis are listed below.

1. The results of PRAs are heavily dependent on the underlying assumptions of the distributions of the input variables and the performance characteristics of the layered security subsystems. All subsystems must, therefore, be carefully evaluated in field operation and all dependencies investigated. The results of PRAs will only be as valid as the probabilistic models used to specify the TAAS.

2. A PRA of the TAAS based on performance measures from actual bombing attempts would involve extremely small input probabilities, because real attacks are rare. In that case, obtaining a precise output distribution (e.g., the probability that an explosive will bring down a plane) requires a very large number of Monte Carlo runs. For this reason, realistic simulated bomb attacks3 are the only reliable way to test the capability of the TAAS. An SEF measure that compared actual successful attacks against the baseline and the improved TAAS would yield a ratio of two very small quantities, making it difficult to estimate improvements. Rather than evaluating the real threat of attack, the SEF shows the improvement in the system once a threat is introduced. This approach focuses on the critical false-negative rate rather than the false-alarm rate.

3. It may be possible to simplify the threat distribution, and therefore the PRA, by establishing the likelihood that a particular explosive will be used. For example, if terrorists tend to use one type of explosive (e.g., explosive X) 99 percent of the time and explosive Y only 1 percent of the time, only explosive X would have to be considered in the analysis. Assume that an amount Q of this explosive is sufficient to produce a distribution of possible losses for planes without HULDs, that a larger amount (Q+) is required for planes with HULDs, and that there is a 90 percent chance that an amount Q will bring down a plane without a HULD and a 90 percent chance that an amount Q+ will bring down a plane with a HULD. Then Q or Q+ for a specific explosive can be used as a simple parameter for evaluating the success of the TAAS without requiring the evaluation of every distribution (a toy numerical sketch of this kind of simplification follows this list).
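The toy Monte Carlo sketch below ties these points together: it conditions on a bomb being present (item 2) and uses the single-explosive, threshold-amount simplification of item 3. Every probability in it is an assumption chosen for illustration; it is not a model of any deployed system.

```python
import random

rng = random.Random(42)
N_RUNS = 1_000_000   # large run count, reflecting the rare-event issue in item 2

# Simplified threat (item 3): one dominant explosive, with a fixed chance that
# the quantity carried meets or exceeds the threshold amount Q.
P_AMOUNT_AT_LEAST_Q = 0.8
P_DOWN_GIVEN_Q = 0.9          # the 90 percent figure used in the example above

# Layered screening, as in the earlier route sketch (hypothetical probabilities
# of detection, with misses assumed independent).
LAYERS = [0.50, 0.80, 0.60]

losses = 0
for _ in range(N_RUNS):
    detected = any(rng.random() < p for p in LAYERS)
    if detected:
        continue
    if rng.random() < P_AMOUNT_AT_LEAST_Q and rng.random() < P_DOWN_GIVEN_Q:
        losses += 1

# The estimate is conditional on a bomb entering the system, not on attack frequency.
print(f"P(loss | bomb present) ~= {losses / N_RUNS:.4f}")
```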

The FAA's Deployment Strategy

The FAA's deployment of advanced explosives-detection technologies was initiated with an allocation of funds by Congress specifically for this purpose (PL 104-208). This equipment had to be deployed on an accelerated schedule because of time constraints on the funds (PL 104-264, PL 104-208), which precluded the development of a strategic deployment plan. Even though many deployments were less than optimal and some airport surveys were not completed, the placement of advanced technologies into the field has provided valuable data for future deployments (e.g., effect of location of the explosives-detection equipment and the flight destination on alarm rates). A great deal has also been learned about the deployment process, particularly the fundamental importance of securing the cooperation of airport authorities, the airlines, and FAA personnel.

Because the FAA was aware of the importance of these field data, steps were taken to ensure that the data were analyzed and stored (Dickey, 1998; Fuqua and O'Brien, 1998). Based on these data, the FAA now has the opportunity and means of pursuing a genuine deployment strategy. Indeed, the panel observed that the FAA has taken the following steps toward comprehensive strategic deployment:

• The FAA's Office of Policy and Planning for Civil Aviation Security has developed a series of potential scenarios (dubbed "end states") for future aviation security and is developing requirements for security checkpoints; the training and performance of personnel; and the handling of cargo, mail, and checked baggage.

3 A simulated attack implies that the dynamic range of the testing process is sufficient to evaluate the differences between the baseline and improved TAAS.


In addition, this office envisions airport-airline-FAA partnerships to facilitate the deployment of security equipment and procedures and to provide assistance in risk management, vulnerability assessments, and contingency planning. As part of its strategic planning, the office has made a cost comparison of the widespread deployment of certified EDSs and the deployment of a slower, cheaper (i.e., noncertified) alternative (Fainberg, 1998).

• The Aviation Security Technology Integration Plan uses a systems architecture for planning aviation security strategies. The FAA has developed a comprehensive, although qualitative, plan for continuously monitoring and improving this architecture (LaMonica et al., 1995; Polillo, 1998).

• The SEIPT (Security Equipment Integrated Product Team) Operational Assessment Report describes data requirements for planning and fielding security equipment (Fuqua and O'Brien, 1998). The FAA's system-assessment concept outlines a plan for obtaining operational data. A deployment-analysis database is also being planned for collecting and maintaining operational data that will be readily available for future assessments and planning (Hammar, 1998; Fuqua, 1998).

• The Passenger Bag Flow Model, being developed by the FAA, is a large simulation model of the flow of passengers and baggage through an airport. The model will incorporate operational performance data and airport characteristics, such as layout, flight schedules, and passenger base, to simulate overall system behavior (Hammar, 1998). (A toy illustration of this kind of flow simulation follows this list.)
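For concreteness, a toy discrete-time sketch of this kind of flow simulation follows. It is emphatically not the Passenger Bag Flow Model itself; the arrival rate, screening rates, and alarm rate are all invented, and a real model would represent airport layout, flight schedules, and passenger base in far more detail.

```python
import random

# Toy illustration only: bags arrive at a single screening station; a fixed
# fraction alarm and go to a slower secondary search. All parameters are invented.
rng = random.Random(1)
SIM_MINUTES = 480          # one 8-hour operating day
MEAN_ARRIVALS = 6          # average checked bags arriving per minute
PRIMARY_RATE = 7           # bags the primary screener can clear per minute
ALARM_RATE = 0.20          # fraction of screened bags that alarm
SECONDARY_RATE = 1         # alarmed bags resolved per minute

primary_queue = secondary_queue = 0
max_primary = max_secondary = 0

for minute in range(SIM_MINUTES):
    arrivals = sum(1 for _ in range(2 * MEAN_ARRIVALS) if rng.random() < 0.5)
    primary_queue += arrivals
    screened = min(primary_queue, PRIMARY_RATE)
    primary_queue -= screened
    secondary_queue += sum(1 for _ in range(screened) if rng.random() < ALARM_RATE)
    secondary_queue -= min(secondary_queue, SECONDARY_RATE)
    max_primary = max(max_primary, primary_queue)
    max_secondary = max(max_secondary, secondary_queue)

print(f"peak primary queue: {max_primary} bags; peak secondary queue: {max_secondary} bags")
```

Even this toy version shows the kind of question such a model is meant to answer, for example whether secondary-search capacity keeps pace with alarm rates over a day of operations.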

Conclusions and Recommendations

The FAA is now in a position to adopt a systems approach to the strategic deployment of security equipment and procedures. This approach will require active participation by airlines and airports, a systematic process for collecting operational data, and a well-defined SEF-type measure to reduce the complexity of the analysis. An SOS framework, such as the TAAS described above, would be capable of describing and assessing the deployment of explosives-detection equipment and other security measures.

Recommendation

The FAA should define a total architecture for aviation security (TAAS) for describing and assessing the deployment of explosives-detection equipment, hardened unit-loading devices, and security procedures.

Recommendation

The FAA should formulate a security enhancement factor (SEF) for the integrated total architecture for aviation security based on data collected during blind operational testing.

Recommendation

The FAA should aggressively define operational performance metrics for security subsystems and for the total architecture for aviation security as a whole. The FAA should establish an action team whose principal task is the systematic collection of operational data. These data should be systematically placed into a database and made available to those who have a "need to know."
