1 Introduction

The mission of the Federal Aviation Administration (FAA) is to promote and ensure the safety and security of air travel in the United States. The FAA has fostered an air travel system that is safer than virtually any other means of travel on Earth (White House Commission on Aviation Safety and Security, 1996, 1997) and, in partnership with several administrations, has continually improved aviation safety and security by recognizing new and potential threats and implementing policies to counter them. The years since the mid-1960s have seen the implementation of many new approaches to improving aviation security. In the late 1960s, increased hijackings resulted in the establishment of the Anti-Hijacking Program of the FAA through the Air Transportation Security Act of 1974 (Public Law 93-366). This program spurred the implementation in U.S. airports of the now-familiar metal-detection screening portals for passengers and the x-ray inspection systems for carry-on baggage. The destruction of Pan American Airlines Flight 103 over Lockerbie, Scotland, on December 21, 1988, resulted in the creation of a Commission on Airline Security and Terrorism in 1989 by President Bush (President's Commission on Aviation Security and Terrorism, 1990), which led to his signing of the Aviation Security Improvement Act of 1990 (Public Law 101-604). Most recently, the White House Commission on Aviation Safety and Security published a report specifically addressing aviation security (White House Commission on Aviation Safety and Security, 1997). These events, along with other terrorist incidents in the United States and throughout the world and the perceived vulnerability and visibility of commercial airplanes as the targets of such terrorism, have sparked vigorous debate on how to improve aviation security.
From the White House Commission on Aviation Safety and Security to the local newspapers, calls for more and better-trained security personnel, increased security awareness, and an improvement in the overall aviation security system are resulting in a focus on the detection of weapons and explosives. One aspect of an overall aviation security system is technology for screening passenger baggage to determine if the bags contain anything that could be a threat to an airplane, from weapons that could be used in a hijacking to explosives intended to cause the airplane to explode in midair. The FAA has responded to the American public's desire for more-effective aviation security by fielding an unprecedented variety and number of new technologies for detecting explosives in checked passenger bags. These new technologies can peer through a bag to provide an image that can be used to determine the bag's contents or sample the air around a bag or even the surface of the bag itself to determine if traces of explosive materials are present and to indicate the need for more-extensive scrutiny of that bag. Many of the technologies being fielded today were developed within the past ten years, and more are under development through the efforts of manufacturers and the FAA. Now that the FAA, along with many foreign governments and airports, is fielding this wide variety of technologies to detect explosives, one question has come to the forefront: How can the FAA, the manufacturer, and the end user determine when the equipment is working properly at the manufacturing site before deployment or in the airport after deployment? The metal-detecting portals common in airports around the world can be tested by walking through the portal carrying a weapon or simulated weapon and determining whether the machine gives the correct response. 
Analogously, to determine if a machine to detect explosives is working, an inspector could test the machine with a sample of an appropriate explosive to determine if the correct response is given. However, because of the attendant safety problems, most airports and other public places forbid the handling of explosives. Therefore, the testing of systems and devices to detect explosives will require more-creative solutions to provide assurance of proper operation to all of the stakeholders, including the air carriers, who pay for and operate screening equipment; the airports, who provide space and support to the air carriers for aviation security; the FAA, whose mission is to ensure aviation security; the equipment manufacturers, who establish their companies' reputations by providing accurate and reliable equipment; and the flying public, who depend on the other parties to work together to ensure safe and secure air travel operations. The stakeholders most directly responsible for aviation security are the end users,[1] usually the air carriers, the FAA, and the equipment manufacturers. Because these stakeholders have both the responsibility for and the capability of ensuring aviation security, the recommendations in this report are directed toward them. Intuitively, one might suspect that detecting threats to aviation security is a straightforward process and that a primary standard threat object (e.g., an explosive) could easily be defined. If such were the case, a nonexplosive secondary standard related to the primary standard could be produced and used to test the performance of explosives-detection equipment[2] in an airport. The nature of the threat to aviation security, however, is not singularly defined. There are, in fact, several explosive compounds, which can be formulated into many different explosives and configured into an infinite number of shapes. Adding to this complexity is the fact that there is a medley of benign materials and objects contained in passenger baggage that could be misinterpreted as an explosive threat. These complications make it impracticable to establish a single meter bar for evaluating the performance of explosives-detection equipment. The manufacturers and users of explosives-detection equipment are not alone in their need to ensure that a difficult-to-test, complicated mix of hardware, firmware, and software works correctly. Manufacturers and users of fighter airplanes, medical computed tomography (CT) x-ray systems, and even the international telecommunications system face similar dilemmas every day. There is no one solution that can ensure that these systems will continually operate properly.
In practice, manufacturers and users of any system rely on a mix of testing and control of the quality and makeup of the system's components and subsystems to gain sufficient confidence in proper system performance. For both hardware and software systems, configuration management, which encompasses change control and documentation, provides assurance that the impacts of any changes in manufacturing processes or materials are assessed before the changes are implemented and that any changes made are tracked so that performance can be maintained into the future. The other side of the coin is performance verification—testing to ensure that system performance has not degraded due to unanticipated causes or due to the synergistic effects of a series of small changes, each of which alone would not be expected to have an impact on system performance. Typically, system managers utilize a quality system[3] to balance the ability of configuration management to control changes to and maintain the configuration of a system (with known satisfactory performance) with performance verification, which provides a direct measure of system performance.

FAA Aviation Security Program

The FAA Aviation Security Program, by regulating the deployment of systems and other equipment to detect explosives, is embarking on a new phase in ensuring aviation security. The FAA's role in this mix of configuration management and performance verification is complicated. In its role in ensuring aviation security, it must be able to verify that explosives-detection equipment is operating properly. The FAA, however, also wants to encourage the development of competing systems and devices, with the intent of improving operational performance while decreasing costs. Achieving a balance between the need to ensure consistent performance of available equipment and the need for equipment with improved performance will be challenging.
In a sense, the regulatory system that the FAA mandates now will influence the future of the field of explosives detection, which is currently in its infancy. The concepts of configuration management and performance verification are not new to the FAA as a regulatory agency. The FAA regularly applies these principles in certifying aircraft and associated components through the FAA's Type Certification process (FAA, 1996). However, the changing and unpredictable nature of the threat against aviation security makes the challenge of regulating explosives-detection equipment unique, because today's most sophisticated explosives-detection technologies may in fact be rendered obsolete by tomorrow's explosive threats. In comparison, the product lifetime of other complicated systems, such as cellular telephones or personal computers, is determined predominantly by (comparatively) less threatening market forces. Furthermore, FAA-regulated aviation equipment such as aircraft must operate safely and effectively every day, and therefore it receives the necessary attention and scrutiny to ensure that it does so. The visibility (and often the implementation) of aviation security equipment and procedures, however, is crisis dependent.

[1] Included with end users are the air carriers, airports, third-party equipment operators contracted by the air carrier or airport, and third-party maintenance providers contracted by the air carrier, airport, or equipment manufacturer.

[2] The following terminology is used throughout this report. An explosives-detection system is a self-contained unit composed of one or more integrated devices that has passed the FAA's certification test. Explosives-detection equipment is any equipment, certified or otherwise, that can be used to detect explosives.

[3] A quality system is a model for quality assurance in design, development, production, and maintenance. In addition, it defines and documents a stakeholder's configuration-management and performance-verification procedures. A quality standard specifies the requirements of the quality system (ISO, 1994). Finally, the quality of a product is the degree to which it satisfies the wants or needs of a specific customer, or may be stated in terms of the degree to which it conforms to specification (Blanchard, 1986).
Months, or even years, of low-threat situations can be quickly interrupted by times of high-level threats, as in the failed 1995 plot of Ramzi Ahmed Yousef to place bombs on U.S. commercial jetliners. During the periods of relative tranquility, the FAA should develop and implement a system to enable immediate response and quick deployment of explosives-detection equipment in times of crisis. Furthermore, the FAA must maintain procedures for handling certification, deployment, and maintenance of explosives-detection equipment in times of stability and in times of crisis. An automated explosives-detection system (EDS) consists of hardware, software, and firmware that determines the presence of an explosive device in baggage without operator intervention. To make up a fully operational aviation security system, a human operator is needed as well to perform the functions required for alarm resolution. The panel believes that a means to test and to continually improve the performance of the human operator is imperative. Although a competent human operator is required for an effective security system, this study focuses on the configuration management and performance verification of the hardware, firmware, and software. In developing a regulatory framework for explosives-detection equipment, the FAA requested that the National Research Council provide guidance on the FAA's role in the manufacture and deployment of explosives-detection equipment, especially with regard to the regulation of certified EDSs. The National Research Council appointed a panel—the Panel on Technical Regulation of Explosives-Detection Systems, under the supervision of the Committee on Commercial Aviation Security—to assess configuration-management and performance-verification options for the development and regulation of EDSs and other equipment designed for detection of explosives.
To accomplish these tasks, the panel

- assessed the advantages and disadvantages, relative to the FAA's needs for explosives-detection equipment regulation, of various commercial approaches for configuration management and performance verification
- developed a framework for a performance-verification strategy that the FAA could implement to ensure that FAA-certified EDSs continue to perform at certification standards in the airport environment
- outlined an overarching management plan, inclusive of configuration management and performance verification, that will encourage commercial development and improvement of EDSs while ensuring that such systems are manufactured, deployed, operated, and maintained to meet FAA certification requirements

Certified Explosives-Detection Systems

The primary objective of aviation security is to prevent explosives and other threat objects from being brought aboard commercial aircraft. The Aviation Security Improvement Act of 1990 (Public Law 101-604) directs the FAA to develop technologies to detect explosives in checked baggage and, when technologies are shown to meet the criteria of the Act by passing FAA certification testing, to mandate the deployment of certified systems in U.S. airports. In response to this directive, the FAA developed a set of certification criteria[4] for automated bulk[5] detection systems, that is, systems that detect bulk explosives[6] without intervention by a human operator (FAA, 1993). These certification criteria specify the types and amounts of explosives that must be detected, the detection and false-alarm rates that would be acceptable, and a throughput rate that is compatible with air carrier operation. Certification testing of candidate detection systems takes place at the FAA William J. Hughes Technical Center (FAA Technical Center) in Atlantic City, New Jersey, and equipment that is demonstrated to meet the certification criteria is designated as an EDS.
In 1994, the InVision CTX-5000 (and later the CTX-5000-SP and CTX-5500) was certified by the FAA as an EDS. These machines are the only FAA-certified EDSs at the time of this report. Simulated passenger baggage,[7] with a percentage of the bags containing explosives, is inspected to determine detection rates, false-alarm rates, and throughput of a candidate detection system. The use of explosive material is the truest test for determining whether a system is capable of detecting a particular explosive. It is unrealistic, however, to subject subsequently manufactured copies of an EDS to the extensive testing performed at the FAA Technical Center. Because it is illegal to bring explosives into most U.S. airports, it is unlikely that explosive materials will be used in airports regularly to verify performance of an EDS. However, the FAA, the traveling public, the air carriers (who pay to purchase, deploy, and operate passenger and baggage-screening equipment), and the manufacturers need to be assured that subsequently manufactured units, at the time of deployment and throughout their service life, meet the same performance requirements (e.g., detection rates, false-alarm rates, and throughput rates) as the unit that passed certification testing.

[4] Public Law 101-604 states that "[n]o deployment or purchase of any explosive detection equipment . . . shall be required . . . unless the [FAA] Administrator certifies that, based on the results of tests conducted pursuant to protocols developed in consultation with expert scientists from outside the Federal Aviation Administration, such equipment alone or as part of an integrated system can detect under realistic air carrier operating conditions the amounts, configurations, and types of explosive material that would be likely to be used to cause catastrophic damage to commercial aircraft." These certification criteria are classified and are linked to the FAA's ability to mandate implementation. The air carriers may deploy EDSs, either FAA certified or noncertified, on their own without an FAA mandate. Note that the Federal Aviation Reauthorization Act of 1996 (Public Law 104-264) directs the Administrator to deploy (temporarily) both certified and noncertified systems.

[5] Equipment that remotely senses a physical or chemical property of the object under investigation is termed bulk explosives-detection equipment. Other types of equipment, called trace equipment, require that particles or vapor from the object under investigation be collected and identified. More detail is provided in Appendix A.

[6] In this report bulk explosives include all forms and configurations of an explosive at threat level (e.g., shaped, sheet, etc.).

[7] The explosives-detection equipment is tested with a primary standard bag set, which consists of "typical" passenger bags, some of which contain explosives at threat quantities.

Performance Verification of Aviation Security Equipment

As a result of the recommendations of the White House Commission on Aviation Safety and Security, the FAA was directed to deploy commercially available (certified and noncertified) explosives-detection equipment that will significantly enhance aviation security (Public Law 104-264). Therefore, the FAA needs a means to verify the performance of certified and noncertified explosives-detection equipment in the field. To verify performance, a baseline level of performance must be established from which to reference the results of a field test. For example, for certification testing of explosives-detection equipment, the FAA has established a set of bags (the primary standard bag set) held at the FAA Technical Center, which is intended to be representative of the general population of international passenger bags (FAA, 1993). This test bag set consists of two subsets: (1) a threat subset, which contains explosives and is used to measure equipment detection performance, and (2) a nonthreat subset, which does not contain explosives and is used to measure the false-alarm rate of the equipment. The certification test is, in essence, an operational test that represents the only performance baseline available to the FAA. Similarly, the FAA has developed a baseline test that is used to determine the baseline performance of noncertified bulk explosives-detection equipment to be deployed in airports. The FAA tests such equipment against the same categories and amounts of explosives that are used in certification testing.
The FAA's determination of which equipment to deploy is based on the average probability of detection (for each device) over the given categories of explosives and on the false-alarm rate. Unlike bulk detection devices, there are no defined FAA certification criteria for trace detection devices. Trace detection devices are based on direct chemical identification of either particles of explosive material or vapors given off by explosive material. These devices require three distinct steps to be effective: (1) sample collection, (2) transfer of a sample to a chemical detector, and (3) sample analysis. Sample collection and transfer are accomplished by using a high-volume air flow to gather vapors or dislodge particles from surfaces or by making physical contact with the subject (e.g., identification with a wipe). Sample analysis techniques employ a variety of detection methods, including gas chromatography, chemical luminescence, and mass spectroscopy (NRC, 1993, 1996). Trace detection techniques are capable of detecting the presence of explosive materials but are unable to determine if they are present in threat quantities. Therefore, it is impossible to establish a test protocol for trace detection devices that is based on identifying the presence of a threat amount of explosives. An FAA approval process has, however, been established to determine which trace detection devices the FAA will deploy (FAA, 1997a). A variation of the test protocol used in the FAA approval process may be applicable to maintaining a baseline level of performance in devices for detecting trace amounts of explosives. The FAA's protocol for evaluating trace explosives-detection equipment uses extremely small amounts (of known quantity) of each category of explosives on a variety of substrates, with some substrates intentionally left uncontaminated.
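The bulk baseline test and the trace approval protocol both reduce to the same bookkeeping: run the equipment against a labeled sample set, compute the detection rate from the threat (or contaminated) items, and compute the false-alarm rate from the benign (or blank) items. The sketch below illustrates that arithmetic; the function names and the pass/fail thresholds are illustrative assumptions, not FAA values (the actual certification criteria are classified).

```python
# Illustrative sketch only: the thresholds below are placeholders,
# not FAA certification or approval criteria.

def score_results(results):
    """results: list of (contains_threat, alarmed) boolean pairs, one per
    bag (bulk test) or per substrate (trace test)."""
    threat_alarms = [alarmed for contains, alarmed in results if contains]
    benign_alarms = [alarmed for contains, alarmed in results if not contains]
    detection_rate = sum(threat_alarms) / len(threat_alarms)
    false_alarm_rate = sum(benign_alarms) / len(benign_alarms)
    return detection_rate, false_alarm_rate

def average_pd_over_categories(by_category):
    """by_category: {explosive_category: list of alarmed booleans}.
    Averages the per-category detection probabilities, as in the FAA's
    deployment determination for bulk equipment."""
    per_category = [sum(alarms) / len(alarms) for alarms in by_category.values()]
    return sum(per_category) / len(per_category)

def meets_baseline(results, min_detection=0.9, max_false_alarm=0.2):
    """Compare measured rates against (assumed) baseline requirements."""
    pd, far = score_results(results)
    return pd >= min_detection and far <= max_false_alarm
```

In this scheme a field test is scored exactly like the original baseline test, so a drop in either rate is directly comparable to the performance recorded at certification or approval time.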
Several trace explosives-detection devices were shown by the FAA to be capable of finding explosive materials on the surface of various types of carry-on luggage. These devices are currently being deployed. To be assured that certified and noncertified deployed equipment continue to meet FAA baseline performance specifications (without transporting the entire FAA primary standard bag set to every airport or every EDS off the production line to the FAA Technical Center for testing), a protocol must be developed to verify the performance of such systems, both at the manufacturing site and in the airport environment. The FAA has already established a test protocol to verify performance at manufacturing sites and airports for certain types of security equipment. For example, the FAA regularly tests passenger-screening portals using a standard set of guns, determining if the metal-detection portal alarm sounds for each test gun, as it is designed to do. This FAA test is performed immediately after manufacture, after a metal-detection portal is installed, whenever a metal-detection portal is moved, and periodically to "spot check" the security system. In addition, air carriers regularly conduct their own tests to check the operation of their security equipment (FAA, 1997b). The x-ray radiographic equipment that is currently used for screening carry-on baggage requires an operator to view a transmission x-ray image of each bag and scrutinize the contents of the bag for items that may be a threat to aviation security.[8] To test this equipment, the manufacturer, the FAA, and the air carriers periodically use a standard test object, such as the step wedge (with wires) to measure spatial resolution and contrast sensitivity as described in ASTM F792-88 (1993), to determine baseline performance.
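A periodic test-object check of this kind amounts to comparing today's readings against the baseline recorded when the unit was installed. The sketch below shows one way such a comparison might be logged; the metric names, baseline values, and tolerance are illustrative assumptions, not values from ASTM F792-88 or any FAA procedure.

```python
# Illustrative sketch only: baseline values and tolerance are placeholders,
# not figures from ASTM F792-88 or an FAA test protocol.

BASELINE = {
    "thinnest_wire_resolved_mm": 0.25,  # smallest wire visible at installation
    "contrast_steps_seen": 7,           # step-wedge steps distinguishable
}

def check_against_baseline(reading, tolerance_steps=1):
    """reading: dict with the same keys as BASELINE, from today's test.
    Returns a list of degradation findings (empty means the unit still
    matches its installation baseline)."""
    problems = []
    # A larger "thinnest resolvable wire" means spatial resolution got worse.
    if reading["thinnest_wire_resolved_mm"] > BASELINE["thinnest_wire_resolved_mm"]:
        problems.append("spatial resolution degraded")
    # Fewer distinguishable steps means contrast sensitivity got worse.
    if reading["contrast_steps_seen"] < BASELINE["contrast_steps_seen"] - tolerance_steps:
        problems.append("contrast sensitivity degraded")
    return problems
```

The point of the comparison is that the test object serves as a secondary standard: the check never involves a weapon or explosive, only the imaging metrics that are assumed to correlate with threat-detection efficacy.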
Implicit in this test is the assumption that the results of the test correlate with the ability of the equipment to present clear, high-spatial-resolution and high-contrast-sensitivity images of the contents of the bag so that equipment operators recognize threat objects in hand-carried items. As described in ASTM F792-88 (1993), the test step wedge is not meant to simulate a weapon or explosive device, nor does it consist of explosive material. The step wedge is used, rather, to test the sensitivity and dynamic range of the x-ray imaging equipment. The adequacy of the operation of the equipment, as determined by this test, is then used as an indirect measure of its efficacy in imaging threat objects. This is an example of the use of secondary standards in airport testing of security equipment. Because explosives are not allowed in airports and a site license is required for manufacturers to handle them, secondary standards and associated test protocols for performance verification of explosives-detection equipment are needed.

[8] Current legal interpretation allows the air carriers to search only for items that are a threat to the aircraft or air crew. They may not conduct a search to determine if a person is carrying other illegal but nonthreat items such as drugs (NRC, 1996).

Report Organization

This report discusses an integrated life-cycle management plan for explosives-detection equipment that will aid the FAA in ensuring the production quality and operational consistency of EDSs, as well as the acceptable performance of recently deployed noncertified explosives-detection equipment. In addition, recommendations are made regarding performance verification of deployed noncertified explosives-detection equipment. The needs and requirements of the stakeholders (the FAA, manufacturers, air carriers, and airports) with respect to the FAA's management plan are discussed briefly in Chapter 2. Chapter 3 introduces the anatomy of explosives-detection equipment to put into context the recommendations regarding configuration management and performance verification made in the report. Chapter 4 outlines available configuration-management and performance-verification options and reviews several quality standards for use in developing a quality system.
Finally, in Chapter 5 the panel's recommendations for a management plan are made, and the associated roles and responsibilities of the stakeholders in maintaining the performance of certified and noncertified explosives-detection equipment are discussed.