Suggested Citation:"Executive Summary." National Research Council. 1998. Configuration Management and Performance Verification of Explosives-Detection Systems. Washington, DC: The National Academies Press. doi: 10.17226/6245.

Executive Summary

Terrorist incidents around the world involving commercial airplanes have received tremendous visibility, increasing the perception of vulnerability among those who fly, while recent bombing incidents in the United States have raised public awareness of the nation's vulnerability to terrorism. Such events have sparked a vigorous debate about how to improve aviation security. Calls for improvements in the U.S. aviation security system have come from a wide variety of sources, ranging from local newspaper editors to the White House Commission on Aviation Safety and Security.1 Critics and supporters alike have called for improvements in security personnel, such as increasing their numbers, improving training, and heightening public awareness of security threats. In addition, many have called for improvements in equipment for detecting weapons and explosives, including increasing the number of units deployed and developing new technologies.

The ability to detect weapons and explosives hidden in baggage is a critical element of aviation security. X-ray radiographic systems that screen checked and carry-on baggage for weapons and explosives have been in use for many years. Early systems focused on detecting weapons that might be used to hijack an airplane; more recently the focus has expanded to improving the ability of equipment to detect explosive devices in passenger bags. For more than a decade, the Federal Aviation Administration (FAA) has worked with manufacturers to improve the ability of x-ray radiographic systems to detect explosives and to develop new technologies for explosives detection. Several types of equipment, including machines that provide images of bag contents and instruments that sample the air around bags or the surfaces of bags for traces of explosive materials, are commercially available but have only recently been widely deployed in the United States. Equipment that remotely senses a physical or chemical property of the object under investigation is termed bulk2 explosives-detection equipment (see Box ES-1). Other types of equipment, called trace equipment, require that particles or vapor from the object under investigation be collected and identified. In 1996, the FAA was directed by the White House Commission on Aviation Safety and Security, under Public Law 104-264 (1996), to purchase and deploy this commercially available equipment to increase aviation security. At the same time, the FAA and manufacturers are continuing their efforts to develop explosives-detection equipment with improved capabilities.

To provide a quantitative assessment of the ability of explosives-detection equipment to detect a wide variety of explosives, the FAA has developed certification standards and a certification test protocol to measure performance of bulk explosives-detection equipment (FAA, 1993). These certification standards are based on classified FAA analyses of threats to aviation security and include parameters regarding the types and amounts of explosives to be detected, acceptable rates of detection and false alarms, and required baggage throughput rates.3 The certification test protocol requires that, for each manufacturer's model, a single explosives-detection unit be tested at the FAA William J. Hughes Technical Center (FAA Technical Center) in Atlantic City, New Jersey, using a standard set of test bags. These bags are all similar in appearance and contents, but some contain explosives. Once a technology has passed the certification requirements, it is referred to as an explosives-detection system (EDS), as are all subsequent reproductions of the original unit (see Box ES-2).

With the certification of InVision's CTX-5000 in 1994, it became important to know how the FAA can ensure that reproductions of the originally tested unit are working properly at the manufacturing site before deployment and in the airport after deployment. Because each unit of a certified EDS is required to perform at certification level, the FAA must now develop a framework of procedures for manufacturers and operators that will ensure the required performance level of each unit. This framework must include performance requirements that can be measured at manufacturing sites and in airports and must specify the documentation and record-keeping that manufacturers and users must provide to maintain EDS certification.

1. The White House Commission on Aviation Safety and Security was convened on July 25, 1996, just days after the explosion and crash of Trans World Airlines Flight 800. President Clinton requested an initial report from the commission specifically to address questions of aviation security (White House Commission on Aviation Safety and Security, 1996, 1997).

2. In this report, bulk explosives include all forms and configurations of an explosive at threat level (e.g., shaped, sheet).

3. Specific certification criteria are classified.

BOX ES-1 Terminology for Explosives-Detection Equipment

In the field of explosives detection, an explosives-detection system (EDS) is a self-contained unit that has been certified by the FAA to detect, without operator intervention, the full range of explosives designated in the certification criteria. However, there is a wide variety of equipment that has not been demonstrated to perform at certification levels or for which there are no certification criteria.

In this report, the term EDS is used to describe certified equipment. The term explosives-detection equipment (also referred to as advanced technology) is used generically to describe any piece of equipment, certified or not, that is designed to detect explosives.

The FAA Aviation Security Program is embarking on a new phase in aviation security (i.e., ensuring the certification-level performance of individual EDS units after deployment, after maintenance, and after upgrades). As part of its role in maintaining aviation security, the FAA must be able to verify that EDSs are operating properly, which will involve defining performance requirements and establishing equipment testing protocols. At the same time, the FAA wants to encourage the continuing development of certified systems and devices, with the intent of improving performance and decreasing costs. Balancing the need to ensure the consistent performance of available equipment and the need to encourage the development of new and better equipment will be challenging. In a sense, the regulatory system the FAA puts in place now will influence the future of the explosives-detection field, which is still in its infancy. Because of the unknowable and changing nature of threats to aviation security, the regulatory system will also have to be flexible enough to adapt to changes in certification requirements and to allow rapid responses to emergencies by the FAA, EDS manufacturers, and equipment users.

The logistics of verifying EDS performance at manufacturing sites and in airports over the system's life cycle are complex. One well-known example of in-service detection performance verification is the standard used to test the metal-detecting portals commonly deployed in airports around the world (FAA, 1997a). These portals are tested by an individual carrying a weapon or simulated weapon through them. Analogously, one might reason that, to verify the detection performance of an EDS, an inspector could simply insert an appropriate explosive sample to ascertain if the machine gives the correct response. Although a few metal guns (e.g., ferrous, nonferrous, nonmagnetic nonferrous) may effectively represent all guns in terms of verifying the detection performance of a metal-detection portal, a single explosive (or even several) cannot represent the salient characteristics and properties of the many explosive types and arrangements that must be detected. Furthermore, because of the attendant safety problems, most airports and other public places forbid the handling of explosives. Therefore, developing effective testing procedures to ensure the proper operation of explosives-detection equipment, both when the equipment is manufactured and when it is in operation, will require creative solutions.

BOX ES-2 Certified Versus Noncertified Explosives-Detection Equipment

Public Law 101-604 (Aviation Security Improvement Act of 1990) states that "[n]o deployment or purchase of any explosive detection equipment . . . shall be required . . . unless the [FAA] Administrator certifies that, based on the results of tests conducted pursuant to protocols developed in consultation with expert scientists from outside the Federal Aviation Administration, such equipment alone or as part of an integrated system can detect under realistic air carrier operating conditions the amounts, configurations, and types of explosive material that would be likely to be used to cause catastrophic damage to commercial aircraft."

In response to this directive, the FAA developed certification criteria for explosives detection and test protocols that would determine if a unit of explosives-detection equipment meets those criteria. These certification criteria are classified and are linked to the FAA's ability to mandate implementation.

The Federal Aviation Reauthorization Act of 1996 (Public Law 104-264, 1996) directs the FAA to deploy (temporarily) both certified and noncertified systems. The FAA is purchasing and installing a variety of explosives-detection equipment and is using the opportunity to improve installation and evaluation procedures. Although the FAA has no regulatory role in the design and manufacture of noncertified equipment, as purchaser the FAA may set performance standards and criteria against which to measure airport performance of this equipment. The air carriers may deploy explosives-detection systems, either FAA certified or noncertified, on their own without an FAA mandate.

Study Approach and Scope

In 1995, the FAA requested that the National Research Council (NRC) assist them in the development of a framework for ensuring proper detection performance by suggesting guidelines for the manufacture and deployment of EDSs. In response to this request, the NRC appointed the Panel on Technical Regulation of Explosives-Detection Systems, under the auspices of the Committee on Commercial Aviation Security of the National Materials Advisory Board. The panel was charged with assessing options for configuration management and performance verification for the development and regulation of FAA-certified commercial EDSs and other equipment designed to detect explosives. To accomplish these tasks, the panel took the following actions:

  • assessed the advantages and disadvantages of various commercial approaches to configuration management and performance verification in terms of the FAA's regulation of explosives-detection equipment
  • developed a framework for a performance-verification strategy that the FAA could use to ensure that EDSs perform at certification levels in airports
  • outlined an overarching management plan that includes configuration management and performance verification to encourage commercial development and improvements in EDSs and to ensure that systems are manufactured, deployed, and maintained in such a way as to meet FAA certification requirements

Any framework for ensuring the certification-level performance of explosives-detection equipment must take into account the interests, capabilities, and needs of the various groups involved, including U.S. air carriers, who pay for and operate screening equipment; U.S. airports, which provide space and support to air carriers for aviation security; the FAA, which is responsible for maintaining aviation security; manufacturers of explosives-detection equipment, whose companies' reputations depend on their providing accurate and reliable equipment; the U.S. Congress, which appropriates funds for FAA development and deployment of explosives-detection equipment; and, most important, the flying public, who depend on the other groups to provide safe and secure air travel. The stakeholders most directly responsible for aviation security are the end users,4 usually the air carriers, the FAA, and the equipment manufacturers. Because these stakeholders have both the responsibility for and the capability of ensuring aviation security, the recommendations in this report are directed toward them.

This Executive Summary is primarily focused on the role of the FAA in the certification of EDSs and the verification of performance levels of both new systems and systems in use in airports; less attention is paid to the roles of manufacturers and end users. A more in-depth discussion of manufacturers' and end users' roles can be found in the body of this report. Many of the practices and procedures recommended in this report are already included in FAA documents; the panel includes these recommendations to reinforce their importance and to place these issues in the overall framework of the manufacture, deployment, operation, and regulation of EDSs.

Quality Systems

For both hardware and software systems, a balance must be achieved between the predictive methodology of configuration management and the confirming methodology of performance verification. Configuration management, which encompasses change control and documentation, ensures that changes in manufacturing processes or materials and upgrades of equipment in service are assessed before they are implemented and that they are tracked to ensure that the configuration of units is known at all times. Performance verification comprises testing to ensure that system performance has not degraded unexpectedly as a result of the changes themselves, as a result of the synergistic effects of a number of small changes, or as a result of changes in performance over time. Typically, system managers use a quality system to balance performance verification and configuration management. The purpose of a quality system is to ensure consistent quality in design, development, production, installation, operation, maintenance, and upgrades of equipment.

Recommendation. Every stakeholder must have a quality system in place that includes oversight, validation, verification, and procedures for configuration management and performance verification covering that stakeholder's responsibilities throughout the life cycle of an EDS.

In the opinion of the panel, manufacturers without a quality system that incorporates configuration management will be unable to compete in competitive industries. Quality standards provide a framework for developing such effective quality systems. Perhaps the best-known and most widely used quality standard is the ISO 9000 series of quality system standards, which allows individual companies or units within a company to develop a unique quality system that is effective for their organization (ISO, 1994). Numerous quality systems have been developed, each one tailored to the specific needs of the organization using it.

4. Included with end users are the air carriers, airports, third-party equipment operators contracted by the air carrier or airport, and third-party maintenance providers contracted by the air carrier, airport, or equipment manufacturer.

Recommendation. Because there is already a global movement toward using the ISO 9000 series of quality system standards, the panel recommends that the FAA base both its in-house quality system and its requirements for other quality systems on these standards. The FAA should accept the quality system of any stakeholder that has the following attributes:

  • a definition of the critical parameters and their tolerances, procedures, and processes to be monitored
  • documented evidence of an internal quality system
  • a definition of the methods for controlling and verifying changes to procedures and processes
  • a definition of an internal audit program
  • provision for a third-party audit of conformance with the quality system

The panel concluded that if a stakeholder can demonstrate to the FAA—through a third-party audit—that its existing quality system has the salient attributes outlined above from the ISO 9000 series of quality standards, it should not be required to be formally certified as being compliant with a particular ISO 9000 standard.
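The five attributes above lend themselves to a simple audit checklist. The following sketch is illustrative only: the attribute names and the function are hypothetical, not an FAA or ISO artifact. It flags which required attributes a stakeholder's documented quality system still lacks.

```python
# Illustrative sketch only: checking a stakeholder's documented quality
# system against the five required attributes listed above. The attribute
# names and the function are hypothetical, not an FAA or ISO artifact.

REQUIRED_ATTRIBUTES = [
    "critical_parameters_defined",   # parameters, tolerances, procedures, processes
    "internal_quality_system",       # documented evidence of an internal quality system
    "change_control_methods",        # controlling and verifying procedure/process changes
    "internal_audit_program",        # a defined internal audit program
    "third_party_audit_provision",   # provision for a third-party conformance audit
]

def audit_quality_system(documented):
    """Return the required attributes missing from a documented quality system."""
    return [attr for attr in REQUIRED_ATTRIBUTES if attr not in documented]

# Example: a stakeholder with everything in place except the third-party audit.
missing = audit_quality_system({
    "critical_parameters_defined",
    "internal_quality_system",
    "change_control_methods",
    "internal_audit_program",
})  # -> ["third_party_audit_provision"]
```

A third-party auditor applying such a checklist would then verify conformance with each documented attribute, not merely its existence on paper.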

Quality Systems and Life-Cycle Phases

To determine the role of the FAA in the certification and deployment of EDSs, the panel found it convenient to identify five EDS life-cycle phases (see Figure ES-1):

  • the research phase, which includes basic research and testing up to and including proof of concept and "breadboard" implementation of those concepts
  • the engineering phase, which includes the development of the manufacturer's original design and modifications to that design, leading to pilot production of a system for certification testing by the FAA
  • the manufacturing phase, which includes pilot production, certification (performance and documentation of precertification testing and certification testing at the FAA Technical Center), and manufacturing (which includes the assembly and testing of reproductions of systems that have been certified and the incorporation of approved improvements and upgrades to the certified design)
  • the operational phase, which includes initial deployment and normal use and operation
  • the retirement phase, which includes the removal and disposal of a system

In its regulatory capacity, the FAA participates in two of these five phases: the manufacturing phase, through the change control process for design changes; and the operational phase, through the change control process for equipment maintenance or upgrades and verification testing for maintenance of certification (see Figure ES-1). Although the FAA has supported research on new concepts for explosives detection and the engineering of new explosives-detection technologies, in the regulatory arena, the FAA does not become officially involved until a manufacturer begins to prepare a unit for certification testing (FAA, 1993). The FAA tests the original explosives-detection equipment,5 which is a product of the manufacturing phase; works with the manufacturer who produces the EDS units or incorporates improvements to the original design; and works with the end users who operate and maintain the EDS. The FAA may participate in development testing of engineering prototype systems leading to the first-run production system. At the other end of the life cycle, once an EDS can no longer perform at the levels required to maintain certification, the FAA may withdraw certification from that unit. The user is then responsible for the unit and may choose either to continue using the uncertified unit in an application where certification is not required or to retire it. Once certification is withdrawn, the FAA has no responsibilities for the retirement phase of the unit.

Figure ES-1

Five phases in the life cycle of an EDS.

The panel focused on the three aspects of the EDS life cycle (certification, manufacturing, and operation) in which the FAA participates in a regulatory capacity. These three aspects can be thought of as encompassing the certification of an EDS and the maintenance of certification for duplicates of the certified system as well as for deployed EDSs. Most challenges to maintaining certification arise during the manufacturing and operation phases as a result of manufacturer or user requests to improve performance by incorporating design changes into currently manufactured systems and incorporating configuration changes into deployed systems. Another challenge is the EDS redesign required when a subsystem or specific technology that has been designed into the EDS becomes unavailable due to the rapid pace of technology changes or when a subcontractor of the EDS manufacturer discontinues a subsystem used in the EDS design. The overall impact of these changes or combinations of changes on performance may be poorly understood.

The panel endorses the requirement in the FAA certification criteria (FAA, 1993) specifying that quality systems should be in place prior to certification testing for each subsequent life-cycle phase of an EDS. Because each stakeholder has different responsibilities during each life-cycle phase, the quality systems may vary. But they must all have the attributes specified in the previous recommendation. Although individual stakeholder activities may already be under the control of quality systems, the panel believes that it is critically important to have comprehensive quality systems that cover all life-cycle phases of an EDS. Furthermore, it is important that each stakeholder periodically receive a third-party audit of their quality system, including, where applicable, a configuration audit.

Recommendation. Explosives-detection equipment presented to the FAA for certification testing must be the product of an implemented and documented manufacturing quality system. Subsequent units must be manufactured according to the same quality system to maintain certification.

Recommendation. The FAA should implement and document its own quality system under which precertification activities, certification testing, test standards development and maintenance, and testing for maintaining certification are conducted.

Recommendation. Each stakeholder must have a documented and auditable quality system that governs its specific responsibilities in the manufacture, deployment, operation, maintenance, and upgrading of EDSs.

Recommendation. The FAA should ensure that each stakeholder periodically receives an audit of its quality system, including (where applicable) a configuration audit, from an independent third-party auditor.

Change Control Process

The change control process encompasses a proposal of a change to the design of explosives-detection equipment or to the configuration of an existing unit, agreement by all stakeholders on the change, implementation of the change, and verification of the impacts of the change. The keys to the successful implementation of a change control process, which must be included in an acceptable quality system, are the agreement of stakeholders on the classification of proposed changes; defined stakeholder involvement at each classification level; periodic audits of design and configuration changes and test results; and periodic testing of EDSs to determine that the combined effect of all design or configuration changes does not degrade EDS performance.

The panel found it useful to outline a change control process for EDSs in which changes in EDS designs or configurations are made in response to needs or problems identified by the manufacturer, the FAA, or the end user (see Figure ES-2). Changes to the design or configuration of an EDS may be desirable either during the manufacturing process or after the system has already been deployed. One critical aspect of making changes to an EDS is evaluating the potential impact of changes on the operation of the EDS in the context of each stakeholder's responsibilities. For example, users, who are responsible for installing and operating systems in the field, may be concerned about the effects of changes in throughput, floor plan area, weight, or costs. The FAA is likely to be concerned about the effects of a change on explosives-detection performance levels. Manufacturers may focus on compatibility with previously manufactured units. Every change must be ranked by each stakeholder for its potential impact on its area of concern.

Recommendation. All stakeholders (air carriers, FAA, equipment manufacturers) should agree prior to the manufacture or deployment of an EDS on a change classification system and process to ensure that stakeholders are notified of changes as necessary.

5. The documentation of FAA certification outlines both precertification requirements and certification testing protocols for prospective EDSs.


Figure ES-2

Configuration change process for an EDS during manufacture or operation.

The benefits of having a change classification process in place prior to the manufacture or deployment of an EDS include the incorporation of improvements with a minimum of confusion about the role of each stakeholder, the clarification of retesting required to confirm the effects of changes, and the empowerment of each stakeholder to evaluate EDS performance continually and suggest improvements. A typical change classification system ranks changes as follows:

  • Class 1 changes are likely to affect system performance.
  • Class 2 changes might affect some aspects of system performance.
  • Class 3 changes are not likely to affect system performance.

Once a change classification process is in place, stakeholders who propose changes to an EDS design or to an individual EDS would have a clear approval process and implementation plan to follow. For example—if the three-tiered change classification process described above were adopted—once the classification level of a change has been determined, either the change would be implemented (Class 2 and Class 3) or the other stakeholders would be notified of the proposed change and arrangements would be made to obtain stakeholder agreement on the specific design or configuration change and retesting plan (Class 1). If no agreement can be reached, the stakeholders (i.e., FAA, manufacturers, and end users) can modify the change or retesting plan or can withdraw the proposed change from consideration. The FAA, however, would act as the final arbiter in such a situation. Once a design or configuration change has been made, the EDS would be tested according to the specified retesting plan and the test results documented for future audits (see Figure ES-2). Although, in reality, determining the class of a change will be the responsibility of the manufacturer, review of the classification of changes (and the appropriateness of such classifications) should be included in a third-party configuration audit.
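The three-tiered routing just described can be summarized in a short sketch. The class definitions follow the report's example classification; the function and its return strings are hypothetical, not an FAA-specified interface.

```python
# Illustrative sketch of the three-tiered change routing described above.
# The class definitions follow the report's example; the function and its
# return strings are hypothetical, not an FAA-specified interface.

def route_change(change_class):
    """Route a proposed EDS design or configuration change by class."""
    if change_class == 1:
        # Likely to affect performance: stakeholders must agree on the
        # change and the retesting plan; the FAA acts as final arbiter.
        return "obtain stakeholder agreement on change and retesting plan"
    if change_class in (2, 3):
        # Might affect (2) or is unlikely to affect (3) performance:
        # implement, then document and retest for future audits.
        return "implement; document and retest as specified"
    raise ValueError(f"unknown change class: {change_class!r}")
```

In practice the manufacturer would assign the class, and a third-party configuration audit would review whether that assignment was appropriate.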

Recommendation. Configuration-control boards such as the ones used in many industries should be established to determine which proposed changes will be implemented and the implementation and testing conditions that will be imposed.

All proposed changes would be brought before a board that has the expertise and authority to assess their potential impact and determine the appropriate change classification level. The panel envisions the establishment of several configuration-control boards, each responsible for overseeing one or more aspects of the EDS life cycle. For example, the manufacturers might establish a configuration-control board that includes representatives of their research, engineering, quality assurance, marketing, and service areas. Users, such as air carriers, could establish a board that includes representation from operations, security, and other departments to oversee upgrades and the maintenance of EDSs in airports. The FAA should establish a configuration-control board with representation from all stakeholders to


oversee Class 1 changes. The FAA should have final authority in resolving the debate on whether or not a Class 1 change should be approved or require recertification testing.

FAA Testing for Maintenance of Certification

The panel found it useful to define seven levels of testing that might take place during the life cycle of an EDS: precertification, certification, baseline, qualification, verification, monitoring, and self-diagnosis (see Table ES-1). Although other types of testing may be undertaken by the FAA, manufacturers, or end users, the seven types listed above are the ones related to certification and maintenance of certification. The FAA's current certification documentation specifies the testing protocols and test objects to be used for precertification and certification testing (FAA, 1993). Testing protocols and test objects are being developed for the four test levels that follow manufacturing and deployment—qualification, verification, monitoring, and self-diagnosis.

The panel focused its discussions of performance testing on postcertification testing, which the panel recommends be required for maintaining certification. The panel defined the purpose of each postcertification testing level as follows (see Table ES-1):

  • Qualification testing determines if a new unit meets its performance requirements. Qualification testing would be performed at the manufacturing site—prior to shipping—to assure the purchaser that the individual unit has been properly manufactured.
  • Verification testing determines if a deployed unit meets its performance requirements. Verification testing would normally be performed in the airport at initial deployment and at specified intervals using a secondary standard bag set to demonstrate to the user and the FAA that the unit is functioning as specified.
  • Monitoring critical system parameters determines if unit performance has changed. System parameters for a computed-tomography (CT)-based system such as the CTX-5000 might include spatial resolution and contrast sensitivity. Monitoring would normally be done in the airport at specified intervals using test articles to demonstrate to the user and the FAA that unit performance has not changed.
  • Self-diagnosis determines if components or subsystems of a unit are functional. Ideally, self-diagnostics will evolve to the point at which they are capable of determining if components and subsystems are operating according to their individual specifications. Self-diagnosis includes the continuous measurement of subsystem parameters (e.g., voltages and currents) during routine operation as well as self-diagnostic routines on machine start-up.
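As a concrete illustration of verification testing, the sketch below scores one run against a secondary standard bag set by computing detection and false-alarm rates. Because the actual certification criteria are classified (see footnote 3), the numeric thresholds here are invented placeholders, not FAA numbers.

```python
# Illustrative sketch of verification testing against a secondary standard
# bag set. The real detection and false-alarm criteria are classified, so
# the thresholds below are invented placeholders, not FAA numbers.

def verify_unit(results, detection_floor=0.9, false_alarm_ceiling=0.2):
    """Score one verification run.

    results: list of (contains_explosive, alarmed) pairs, one per test bag.
    Returns (detection_rate, false_alarm_rate, passed).
    """
    threat_alarms = [alarmed for has_expl, alarmed in results if has_expl]
    clean_alarms = [alarmed for has_expl, alarmed in results if not has_expl]
    detection_rate = sum(threat_alarms) / len(threat_alarms)
    false_alarm_rate = sum(clean_alarms) / len(clean_alarms)
    passed = (detection_rate >= detection_floor
              and false_alarm_rate <= false_alarm_ceiling)
    return detection_rate, false_alarm_rate, passed

# Example run: 10 threat bags (9 alarms) and 10 clean bags (2 false alarms).
bag_results = ([(True, True)] * 9 + [(True, False)]
               + [(False, False)] * 8 + [(False, True)] * 2)
rates = verify_unit(bag_results)  # -> (0.9, 0.2, True) with these placeholders
```

A real secondary standard bag set would be far larger and would be controlled and rotated to prevent operators from learning its contents.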

The development of appropriate test objects and testing protocols will be critical to each level of testing. The panel has identified three types of test objects that should be required for each type of EDS: a primary standard bag set, a secondary standard bag set, and individual test articles to test critical system parameters (see Table ES-2).

Recommendation. The FAA should require a wide variety of tests for maintaining EDS certification, including qualification testing and periodic verification testing of detection performance levels (using a secondary standard bag set), frequent monitoring of critical system parameters (using test articles), and continuous self-diagnosis of subsystem parameters (e.g., voltages and currents) to detect incipient problems.
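Monitoring of critical system parameters can be sketched as a tolerance-band check. The parameter names below follow the report's CT examples (spatial resolution, contrast sensitivity), but the units and tolerance values are hypothetical, chosen only for illustration.

```python
# Illustrative sketch of monitoring critical system parameters for a
# CT-based EDS. The parameter names follow the report's examples; the
# units and tolerance bands are hypothetical, chosen only for illustration.

TOLERANCES = {
    # parameter name: (lowest acceptable, highest acceptable)
    "spatial_resolution_mm": (0.0, 1.5),     # larger value = blurrier image
    "contrast_sensitivity_pct": (0.0, 0.5),  # larger value = less sensitive
}

def check_parameters(measured):
    """Return the names of parameters whose measured values are out of band."""
    return [name for name, (low, high) in TOLERANCES.items()
            if not low <= measured[name] <= high]

# In-band measurements produce an empty report; an out-of-band value
# flags the unit for maintenance before detection performance degrades.
ok_report = check_parameters(
    {"spatial_resolution_mm": 1.2, "contrast_sensitivity_pct": 0.3})
```

Periodic measurements like these, taken with standard test articles, would let drift be detected and corrected between full verification tests.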

Note that validation of critical and subsystem parameter ranges may require monitoring the correlation of these ranges with equipment performance over time—even after deployment. In this context, system performance pertains to the ability of the equipment to detect the explosive compositions and configurations (e.g., sheet and nonsheet bulk explosives) defined by the FAA's certification criteria. Therefore, parameter values measured outside of accepted ranges should trigger testing the equipment with the secondary standard bag set in the field.

TABLE ES-1 Seven Proposed Testing Levels during the Life Cycle of an EDS

| Test Level | Purpose | Location | Test Objects | Frequency |
|---|---|---|---|---|
| Precertification | Determine if technology is ready for certification testing. | Manufacturer's site | Test articles | Once^a |
| Certification of an EDS | Determine if technology performance is at certification level. | FAA Technical Center | Primary standard bag set | Once^a |
| Baseline | Establish baseline performance for noncertified equipment. | FAA Technical Center | Primary standard bag set | Once^a |
| Qualification | Verify the performance of an individual manufactured unit to qualify that unit for deployment. | Manufacturer's site | Secondary standard bag set and test articles | Once^a |
| Verification | Verify the performance of an individual deployed unit to confirm that performance is at qualification level. | Airport | Secondary standard bag set | At specified occasional intervals |
| Monitoring | Verify critical system parameters to confirm the consistency of system performance. | Airport | Test articles | At specified frequent intervals |
| Self-diagnosis | Verify that subsystem parameters are operating according to specifications for continuous "system health." | Airport | None | Continuous |

^a May require retesting until the unit passes the specified test.

TABLE ES-2 Types and Purposes of Test Objects

| Test Objects | Definition | Purpose |
|---|---|---|
| Primary standard bag set | Representative passenger bags, some containing explosives at threat quantity | Simulate threat |
| Secondary standard bag set | Representative passenger bags, some containing simulants representing explosives at threat quantity | Simulate primary standard bag set (requires no special safety-related handling permits or precautions) |
| Test articles | Individual articles, including items such as simulants, the InVision IQ simulants test bag, and ASTM (1993) standardized step wedges and CT phantoms | Elicit a predetermined response to test critical system parameters |
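The escalation rule described above (a critical parameter measured outside its accepted range triggers a field test with the secondary standard bag set) can be sketched as a simple decision rule. The parameter names and ranges below are invented for illustration; actual values would be set by the manufacturer and verified by the FAA.

```python
# Hypothetical escalation check for a CT-based EDS. Each critical system
# parameter has an accepted (low, high) range; if any monitoring
# measurement falls outside its range, the unit should be scheduled for
# verification testing with the secondary standard bag set.

CRITICAL_PARAM_RANGES = {
    "spatial_resolution_mm": (0.0, 1.5),     # illustrative CT resolution limit
    "contrast_sensitivity_pct": (0.0, 0.5),  # illustrative contrast limit
}

def needs_verification_test(measurements):
    """True if any measured critical parameter is out of its accepted range."""
    return any(
        not (low <= measurements[name] <= high)
        for name, (low, high) in CRITICAL_PARAM_RANGES.items()
    )

print(needs_verification_test({"spatial_resolution_mm": 1.2,
                               "contrast_sensitivity_pct": 0.4}))  # in range
print(needs_verification_test({"spatial_resolution_mm": 1.2,
                               "contrast_sensitivity_pct": 0.7}))  # out of range
```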

Development of Test Objects

The FAA has already developed a standard bag set that includes bags with explosives for certification testing (FAA, 1993). In this report this bag set is referred to as the primary standard bag set. Because EDSs in airports or at manufacturers' sites cannot easily be tested with real explosives, the FAA must develop a secondary standard bag set consisting of representative passenger bags, some containing standard materials that simulate explosive threat materials and some without any simulated threats. The secondary standard bag set could be used periodically to test the detection performance of EDSs in airports or at manufacturers' sites. However, in order for such tests to be relevant to measuring the certified performance of an EDS, the secondary standard bag set must be validated against the primary standard bag set, perhaps at the time of certification testing. Figure ES-3 illustrates a verification testing process.

The critical issues for tests using the secondary standard bag set are determining the performance levels that will be acceptable to the FAA and determining how those performance levels will be measured. All of the major stakeholders—the FAA, the manufacturers, and the users—should be involved in the development of secondary standards and test protocols for testing in nonsecure areas, such as manufacturers' sites, or public places, such as airports. However, because the FAA regularly gathers intelligence on threats to civil aviation security and is responsible for ensuring aviation security, only the FAA can determine acceptable performance levels. Of course, the FAA must be mindful of the practical limitations of testing and the capabilities of EDSs. For example, requiring that certification performance levels be used as performance requirements in an airport would be counterproductive because testing several hundred bags would disrupt airport operations, and testing with explosives in a public environment could be unsafe.
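How those performance levels "will be measured" can be made concrete. If each bag in a secondary standard bag set is labeled as containing a simulant or not, a verification run reduces to two rates: the fraction of simulant bags that alarmed (detection rate) and the fraction of clean bags that alarmed (false-alarm rate). The sketch below assumes that framing; the threshold values are placeholders, not FAA requirements.

```python
# Minimal scoring sketch for a verification run with a secondary standard
# bag set. The 0.9 / 0.2 thresholds are illustrative placeholders only.

def score_run(results):
    """results: list of (contains_simulant, alarmed) boolean pairs.
    Returns (detection_rate, false_alarm_rate)."""
    threats = [alarmed for contains, alarmed in results if contains]
    clean = [alarmed for contains, alarmed in results if not contains]
    return sum(threats) / len(threats), sum(clean) / len(clean)

def passes(results, min_detection=0.9, max_false_alarm=0.2):
    detection, false_alarm = score_run(results)
    return detection >= min_detection and false_alarm <= max_false_alarm

# 10 simulant bags (9 alarmed) and 10 clean bags (1 alarmed):
run = [(True, True)] * 9 + [(True, False)] + \
      [(False, False)] * 9 + [(False, True)]
print(score_run(run))  # (0.9, 0.1)
print(passes(run))     # True
```

With a small field bag set, these rates carry substantial statistical uncertainty, which is one practical reason that airport performance requirements cannot simply reuse certification-scale testing.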

Recommendation. For qualification and verification testing, the FAA should work with EDS manufacturers and users to develop a secondary standard bag set for each specific technology or technology class.

Recommendation. The secondary standard bag set should be controlled to assure reliable test results, as is done by the FAA for the primary standard bag set. It is important that the FAA periodically (on the order of the lifetime of the simulants) verify the condition, configuration, and performance of the secondary standard bag set.

Recommendation. For monitoring the performance of EDSs, the FAA should work with manufacturers to develop a set of critical system parameters (and their tolerances) that could be monitored frequently and recorded to track changes in performance during normal operations or to verify performance after maintenance or upgrading.

Recommendation. The panel recommends that the FAA verify critical system parameters during certification testing.

Monitoring critical system parameters will require test articles and practical test protocols, as well as specified nominal values and ranges for each critical system parameter. For CT-based systems, such as InVision's CTX-5000,6 critical system parameters might include spatial resolution and contrast sensitivity. The manufacturer, who is most familiar with a specific technology, is best qualified to establish critical system parameters and their tolerances, which the FAA could verify during certification testing. Figure ES-3 illustrates the process of monitoring testing for an EDS in the operational phase.

The stakeholders must agree on the process that will be followed if an EDS unit fails during monitoring, verification, or qualification testing, and on criteria for withdrawing an individual EDS unit from operation. In addition, a process should be in place for the FAA and the manufacturer to identify the failure mode, correct it, and requalify the EDS unit for use. The FAA and the manufacturer should determine if this particular failure mode is common to a particular version of an EDS and whether other units of that version need to go through the same correction and requalification procedure. If the cause of the failure cannot be determined or corrected, the EDS unit or units should be withdrawn from service indefinitely. Furthermore, if several units of a particular version fail and the cause of failure cannot be determined or corrected, certification should be withdrawn for that version.

6 The CTX-5000 was the first FAA-certified EDS. The InVision CTX-5000-SP has also been certified.

Figure ES-3  Monitoring and verification testing for certification maintenance.

Because monitoring testing would use standard test articles, it would be useful to include the monitoring test as part of the qualification testing for newly manufactured EDSs prior to shipping them to airports. In other words, the qualification testing protocol should include a protocol for validating the test articles to be used for monitoring.

The manufacturer's design will include many subsystem parameters that could be measured by frequent periodic diagnostic tests, if not continuous self-diagnostic tests as the system operates. Changes in voltages and currents, for example, can be early warnings of a change in performance characteristics. The manufacturer should establish acceptable ranges as part of the quality system and should document the measurements.
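As one illustration of such an early warning, a logged subsystem parameter can be checked for drift before it leaves its specified range. The rolling-window comparison below is only a sketch under assumed values; window sizes and tolerances would be set by the manufacturer as part of the quality system.

```python
# Hedged drift-detection sketch for a logged subsystem parameter (e.g., a
# tube voltage in kV). Compares the mean of the most recent `window`
# samples against the mean of the preceding `window` samples and warns
# when the shift exceeds an absolute `tolerance` in the parameter's units.
# Window and tolerance values are illustrative.

from statistics import mean

def drift_warning(log, window=5, tolerance=0.5):
    """True if the parameter mean shifted by more than `tolerance`
    between the last two windows of `window` samples each."""
    if len(log) < 2 * window:
        return False  # not enough history yet
    baseline = mean(log[-2 * window:-window])
    recent = mean(log[-window:])
    return abs(recent - baseline) > tolerance

voltages = [140.0, 140.1, 139.9, 140.0, 140.1,   # stable baseline
            140.9, 141.2, 141.0, 141.3, 141.1]   # upward drift
print(drift_warning(voltages))  # True
```

The point of such a check is that the warning fires while every individual reading may still be inside its accepted range, giving the maintainer time to intervene before a monitoring or verification test fails.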

Noncertified Explosives-Detection Equipment

The previous discussion focused on certification and certification maintenance for EDSs. However, the FAA also purchases noncertified equipment for airport demonstrations and operational testing. As purchaser, the FAA should require that the manufacturers of noncertified equipment have in place a quality system that meets the same standards as the quality systems for manufacturers of certified equipment. However, to verify the performance of noncertified equipment, the FAA must establish a "baseline performance" level for each explosives-detection equipment design. The panel has labeled testing to determine the baseline performance of noncertified equipment "baseline testing" (see Table ES-1).

Certification testing could be used to establish the baseline performance for some types of explosives-detection equipment that does not meet all of the certification criteria (e.g., advanced radiography). The baseline performance would be the "score" that the equipment achieved on the certification test.

However, the FAA also must establish a testing protocol to determine the baseline performance for equipment for which certification criteria have not been developed (e.g., trace detection equipment). For trace technologies, which are designed to collect and identify traces of explosives on the outside surfaces of bags or in the surrounding air, the FAA has already developed a baseline test that the panel believes may provide a basis for determining baseline performance (FAA, 1997b).

Recommendation. The FAA should require that the manufacturers of noncertified equipment demonstrate the same level of quality system, covering both manufacturing and upgrade, as is required for manufacturers of EDSs when noncertified equipment is purchased by the FAA for airport demonstrations, operational testing, or airport deployment.

Recommendation. The FAA should ensure that the manufacturers of noncertified explosives-detection equipment purchased by the FAA periodically receive an audit of their quality system—including a configuration audit—from an independent third-party auditor.

Recommendation. The FAA should ensure that airlines, other end users, and organizations responsible for maintenance and upgrades (e.g., manufacturers or third-party service providers) demonstrate a quality system that covers the operation, maintenance, and upgrade of noncertified EDSs that are purchased by the FAA for airport demonstrations, operational testing, or airport deployment. Such a quality system should meet the FAA's requirements of quality systems used by operators and maintainers of certified explosives-detection equipment.

Summary

The FAA plays the leading role in establishing the performance requirements for operational EDSs, just as it has for laboratory performance requirements for certification. Performance requirements include not only the acceptable detection and false-alarm rates, but also the testing protocols and the test objects used to determine system performance. Like other stakeholders in the manufacture and deployment of EDSs, the FAA must also have a quality system in place that defines the critical parameters and their tolerances, procedures, and processes to be monitored; documents the implementation of the quality system; defines procedures for controlling and verifying changes to procedures and processes; and provides for third-party audits of conformance.

The FAA's main role prior to certification testing is to develop certification test objects and test protocols and to demonstrate that they are under the control of an acceptable quality system (see Figure ES-4). The FAA must establish standards for performance of certified systems and develop and validate a secondary standard bag set and other test objects for testing at the manufacturer's site and in airports (see Figures ES-5 and ES-6). These performance standards can be the same as the ones used for qualification testing at the manufacturer's facilities and can use the same secondary standard bag set to demonstrate the consistency of performance at the factory and in the airport. Once an EDS is deployed, the FAA is responsible for periodic verification testing to ensure consistent performance over the life of the EDS. Another postdeployment issue is the maintenance of explosives-detection equipment. In many industries, including the aircraft/airline industries, a third-party service provider is contracted by the user (e.g., the airlines) for maintenance of the equipment. The same arrangement is likely to emerge, at some point, for explosives-detection equipment. Third-party audits of manufacturers, users, third-party service providers, and the FAA will ensure compliance with all configuration-management and performance-verification requirements.

| FAA | Manufacturer | User |
|---|---|---|
| Develop primary standard bag set. | Identify system baseline configuration. | Participate in and facilitate precertification airport testing for explosives-detection technology. |
| Demonstrate that a quality system is used for all certification test objects and test protocols. | Identify critical system parameters, including acceptable values and ranges. |  |
| Ensure that a quality system is in place for the manufacturing and operational life cycle phases of an EDS. | Demonstrate a quality system for manufacturing, including a change control process. |  |
| Develop secondary standard bag set. | Demonstrate completion of precertification activities. |  |

Figure ES-4  Responsibilities of stakeholders for moving from the engineering phase to certification.


| FAA | Manufacturer | User |
|---|---|---|
| Ensure the development of a secondary standard bag set. | Develop proposed manufacturing change classification levels and procedures and facilitate acceptance by other stakeholders. | Participate in the development of manufacturing change classification levels and procedures. |
| Validate secondary standard bag set. |  |  |
| Participate in the development of manufacturing change classification levels and procedures. | Participate in the development of performance standards, test protocols, and test objects for qualification testing. | Participate in the development of performance standards, test protocols, and test objects for qualification testing. |
| Establish performance standards and test protocols for qualification testing. |  |  |
| Validate test objects for qualification testing. |  |  |

Figure ES-5  Responsibilities of stakeholders for moving from certification to the manufacture of an EDS.

| FAA | Manufacturer | User |
|---|---|---|
| Establish performance standards and test protocols for validation and monitoring testing. | Participate in the development of operational configuration change process. | Demonstrate quality systems for EDS testing, maintenance, and upgrades. |
| Validate test objects for validation and monitoring testing. | Participate in the development of performance standards, test protocols, and test objects for verification and monitoring testing. | Develop an operational configuration change process. |
| Participate in the development of operational configuration change process. | Notify FAA and users of maintenance requirements and opportunities for upgrades. | Participate in the development of performance standards, test protocols, and test objects for verification and monitoring testing. |
| Perform periodic verification testing and maintain data. | Maintain self-diagnostics data. | Perform routine maintenance. |
| Perform qualification testing with manufacturer. | Maintain configuration documentation of units to be deployed. | Perform monitoring testing and maintain data. |
|  |  | Maintain configuration documentation of units in airports. |

Figure ES-6  Responsibilities of stakeholders for moving from the manufacturing phase to the operational phase.


References

ASTM (American Society for Testing and Materials). 1993. F792-88. Design and Use of Ionizing Radiation Equipment for the Detection of Items Prohibited in Controlled Access Areas. West Conshohocken, Pa.: American Society for Testing and Materials.


FAA (Federal Aviation Administration). 1993. Management Plan for Explosives Detection System Certification Testing. Washington, D.C.: Federal Aviation Administration.

FAA. 1997a. Metal Detector Calibration/Certification Procedures for Course #00032. Washington, D.C.: Federal Aviation Administration.

FAA. 1997b. Summary of Trace Testing Methodology in Support of the Security Equipment Integrated Product Team. Washington, D.C.: Federal Aviation Administration.


ISO (International Organization for Standardization). 1994. ISO 9001, Quality Systems: Model for Quality Assurance in Design, Development, Production, Installation, and Servicing, 2nd ed. Geneva, Switzerland: International Organization for Standardization.


White House Commission on Aviation Safety and Security. 1996. Initial Report to the President, September 9, Washington, D.C. Available on the internet: http://www.aviationcommission.dot.gov

White House Commission on Aviation Safety and Security. 1997. Final Report to the President, February 12, Washington, D.C. Available on the internet: http://www.aviationcommission.dot.gov
