
5
Life-Cycle Management Plan

Chapter 2 outlined the minimum requirements for a credible, flexible, and effective regulatory strategy. Subsequent chapters introduced explosives-detection technologies, configuration management, performance verification, and quality systems. This final chapter returns to the requirements laid out in Chapter 2, summarizes the findings and conclusions that led the panel to its chosen strategy, and presents the panel's recommended life-cycle management plan, which the panel believes will meet the requirements of all stakeholders.

Explosives-Detection Equipment Life Cycle

The life cycle of explosives-detection equipment can be delineated in several ways. The panel identified the following five broad phases in the life cycle of explosives-detection equipment, as depicted in Figure 5-1:

  • the research phase, which includes basic research and testing up to and including proof of concept
  • the engineering phase, which includes the development of the manufacturer's original design, modifications to that design, and the completion of a production-representative system for certification testing by the FAA
  • the manufacturing phase, which includes pilot production, certification (performance and documentation of precertification testing and certification testing at the FAA Technical Center), and manufacturing (which includes the assembly and testing of reproductions of systems that have been certified and the incorporation of approved improvements and upgrades to the certified design)
  • the operational phase, which includes initial deployment and normal use and operation
  • the retirement phase, which includes removal and disposal of a system

Figure 5-1

Five phases in the life cycle of an EDS.

Each of the above life-cycle phases encompasses quality assurance, performance verification, and configuration management of hardware, software, and firmware.1 However, as shown in Figure 5-2, only the manufacturing and operational phases receive FAA oversight through the management plan, because these are the phases in which the FAA acts to ensure proper explosives-detection performance (through certification or through purchase for airport deployment) and in which changes to design or configuration that affect the detection performance of manufactured units in the airport are likely to occur. The actions taken by the stakeholders during each life-cycle phase regarding quality assurance, performance verification, and configuration management are shaped by the needs of the stakeholders, which are in turn influenced by external developments such as emerging technologies, changing threats, and economic constraints. The evolution of explosives-detection equipment therefore depends on actions taken by or for the stakeholders in response to external developments. The delegation of responsibility for such actions should be based on the responsibilities and expertise of each stakeholder, as suggested in Figures 5-3, 5-4, and 5-5.

1. Firmware is software, inclusive of programs or data, that has been permanently written onto read-only memory (ROM) or programmable read-only memory (PROM); it is thus a combination of software and hardware. Firmware is included as an item that should be under configuration management because, like hardware and software, changes made to firmware can alter the performance of explosives-detection equipment.

Management Plan

A life-cycle management plan, or more generally, a management plan, describes a strategy to integrate all the aspects of manufacturing and deployment. Such a plan is needed because changes are a normal part of the life cycle of any manufactured product. Examples of changes that may affect the performance of explosives-detection equipment include

  • design changes in a subsystem
  • nominal production and process variations
  • manufacturing changes in hardware, software, and firmware and improvements to these elements
  • software, firmware, or hardware upgrades to explosives-detection equipment already in service
  • repair and replacement of an equipment module at the installation site (e.g., at the airport)
  • incorporation of new components from a different vendor
  • relocation of explosives-detection equipment from one installation site to another
  • in-service degradation and aging

These factors can cause the performance of an EDS to deviate from certification requirements, and that of noncertified explosives-detection equipment to deviate from its baseline performance. Therefore, the FAA should employ a management plan that specifies the methodologies for configuration management and performance verification to ensure an acceptable and known level of confidence in the performance of explosives-detection equipment.

The panel has devised a plan—the life-cycle management plan—that it believes would address the needs of the FAA to ensure detection performance while also addressing the needs of the manufacturers and users to pursue their business opportunities. The panel recommends that this plan reside with and be maintained by the FAA. This plan defines and documents the FAA's configuration-management, performance-verification, and quality-assurance requirements for the following stakeholders:

  • the FAA during certification or baseline, qualification, and verification testing of explosives-detection equipment (this would include control of test objects, procedures, and test results and accompanying documentation)
  • explosives-detection equipment manufacturers during the certification, manufacturing, and operational phases of the life cycles
  • the air carriers and other end users, with regard to deployed explosives-detection equipment, during the operational life cycle (this would include control of operating and maintenance procedures)

In essence, the goal of the management plan is to provide a systematic framework for the FAA to define configuration-management, performance-verification, and quality system requirements; a mechanism to meet these requirements; a means to measure whether the requirements are being met; and a method to communicate with other stakeholders—all to ensure performance throughout the life cycle of explosives-detection equipment. It is recognized that over time there may be changes in the threats to aviation security and in the equipment used to detect them. Thus, the management plan adopted by the FAA should be sufficiently robust and flexible to accommodate a range of scenarios as they shift over time.

The FAA already has documented procedures that address much of what is suggested above. For example, the FAA's Management Plan for Explosives Detection System Certification Testing (FAA, 1993) and its Technical Center Baggage Inspection System: Airport Operational Demonstration Project Test and Evaluation Master Plan (FAA, 1995) outline several existing documented procedures that are appropriate for ensuring the performance of explosives-detection equipment. The panel endorses these plans and recommends that the FAA implement the life-cycle management plan described above, which will provide a highly visible, formal structure for following existing guidelines as well as those recommended in this report.


Figure 5-2

Activities over the life cycle of explosives-detection equipment.


Figure 5-3

Responsibilities of stakeholders for moving from the engineering phase to certification.

FAA:

  • Develop primary standard bag set.
  • Demonstrate that a quality system is used for all certification test objects and test protocols.
  • Ensure that a quality system is in place for the manufacturing and operational life-cycle phases of an EDS.
  • Develop secondary standard bag set.

Manufacturer:

  • Identify system baseline configuration.
  • Identify critical system parameters, including acceptable values and ranges.
  • Demonstrate a quality system for manufacturing, including a change control process.
  • Demonstrate completion of precertification activities.

User:

  • Participate in and facilitate precertification airport testing for explosives-detection technology.

Figure 5-4

Responsibilities of stakeholders for moving from certification to the manufacture of an EDS.

FAA:

  • Ensure the development of a secondary standard bag set.
  • Validate secondary standard bag set.
  • Participate in the development of manufacturing change classification levels and procedures.
  • Establish performance standards and test protocols for qualification testing.
  • Validate test objects for qualification testing.

Manufacturer:

  • Develop proposed manufacturing change classification levels and procedures and facilitate acceptance by other stakeholders.
  • Participate in the development of performance standards, test protocols, and test objects for qualification testing.

User:

  • Participate in the development of manufacturing change classification levels and procedures.
  • Participate in the development of performance standards, test protocols, and test objects for qualification testing.


Figure 5-5

Responsibilities of stakeholders for moving from the manufacturing phase to the operational phase.

FAA:

  • Establish performance standards and test protocols for validation and monitoring testing.
  • Validate test objects for validation and monitoring testing.
  • Participate in the development of operational configuration change process.
  • Perform periodic verification testing and maintain data.
  • Perform qualification testing with manufacturer.

Manufacturer:

  • Participate in the development of operational configuration change process.
  • Participate in the development of performance standards, test protocols, and test objects for verification and monitoring testing.
  • Notify FAA and users of maintenance requirements and opportunities for upgrades.
  • Maintain self-diagnostics data.
  • Maintain configuration documentation of units to be deployed.

User:

  • Demonstrate quality systems for EDS testing, maintenance, and upgrades.
  • Develop an operational configuration change process.
  • Participate in the development of performance standards, test protocols, and test objects for verification and monitoring testing.
  • Perform routine maintenance.
  • Perform monitoring testing and maintain data.
  • Maintain configuration documentation of units in airports.


To be effective, the FAA's management plan must address the specific needs of each stakeholder during the certification, manufacturing, and operational phases of the explosives-detection equipment life cycle. The management plan should be structured to encourage industry to continue to develop and improve explosives-detection equipment, while at the same time assuring the regulator (FAA) and the users (air carriers and airports) of the performance of such equipment.

As the regulatory agency responsible for air safety and security, the FAA must retain its authority to establish the performance specifications for explosives-detection equipment and to establish airport testing protocols and schedules. These tests are likely to be performed in airports by non-FAA personnel on equipment designed and manufactured by independent companies, so the test protocols and performance levels must be specified such that air carrier or third-party personnel can practically carry them out. Furthermore, the manufacturers must include the appropriate hardware and software in their equipment designs so that testing personnel can carry out the FAA-mandated tests. Because of the extensive involvement of the manufacturers and users in the testing, these stakeholders should be involved in the development of the life-cycle management plan so that appropriate roles are assigned to the FAA, the manufacturers, and the users.

Preparing performance specifications and testing protocols for deployed explosives-detection equipment will be fundamentally different from, and more difficult than, developing certification standards and test protocols. In developing the performance specifications and testing protocols for FAA certification testing, the FAA evaluated the threat to aviation security and set the amounts and types of explosives that must be detected. Working with air carriers and equipment manufacturers, the FAA set instrument false-alarm and throughput rates that were acceptable to the other stakeholders. Protocols were developed to test equipment against certification specifications. Because the FAA performed the certification tests at the FAA Technical Center, it could also determine the limitations placed by the testing facility on the testing protocols, including testing time and test-object availability. In the case of testing deployed explosives-detection equipment, the FAA has little control over the facilities and the time available at airports.


Recommendation. The FAA should involve the stakeholders in implementing the generalized framework for the management plan, including

  • establishing explosives-detection equipment performance requirements and correlating parameter specifications (tolerances)
  • developing a performance-verification protocol suitable for airport testing of explosives-detection equipment
  • tailoring performance-verification planning guidelines to specific air terminals, and explicitly identifying the roles of each stakeholder

Finally, it is imperative that best practices be employed throughout the life cycle of explosives-detection equipment. That is, the FAA, the equipment manufacturer, the air carriers or other end users, and the airports should (1) have a well-defined quality system in writing and (2) ensure that everyone within a particular stakeholder enterprise who is involved with the process (e.g., testing, manufacturing, or operating explosives-detection equipment) understands and adheres to the written procedures. One resource available to all stakeholders to aid in identifying the best practices used in industry, government, and academia for manufacturing and management is the Best Manufacturing Practices Center of Excellence, as discussed in Box 5-1.

BOX 5-1 Best Manufacturing Practices Program of the Office of Naval Research

The mission of the Best Manufacturing Practices Center of Excellence is to identify and promote exemplary manufacturing practices and disseminate this information to U.S. manufacturers of all sizes. It is an outgrowth of the Department of the Navy's Best Manufacturing Practices program and was created in 1993 as a joint effort of the Office of Naval Research; the National Institute of Standards and Technology, Manufacturing Extension Partnerships; and the Engineering Research Center of the University of Maryland at College Park. The program provides a national resource for identifying and sharing best practices used in government, industry, and academia, and it fosters cooperative efforts aimed at strengthening the U.S. industrial base and its global competitive position.

Independent teams of government, industry, and academic experts survey organizations that are ready to share information about their own best processes. Participation in the survey is voluntary and at no cost to the requesting organization. Once the team completes its work, the surveyed activity or organization reviews a draft of the report. Copies of the final report are distributed to representatives of government, industry, and academia throughout the country and entered in the Best Manufacturing Practices database. The information in the reports is designed to help organizations evaluate their own processes by identifying, analyzing, and emulating the processes of organizations that excel in those areas.

Since the inception of the Best Manufacturing Practices program, more than 95 organizations have participated in surveys. A synopsis of each survey is available from the program's website at http://www.bmpcoe.org/. Further detailed information can be obtained by contacting the person identified in each survey or by contacting the Best Manufacturing Practices Center of Excellence at 4321 Hartwick Rd., Suite 400, College Park, MD 20740; 1-800-789-4267.

Configuration Management Plan

A configuration management plan is a document that defines how configuration management will be implemented for a particular acquisition program or system (DOD, 1995). In this case, the program is the FAA's Aviation Security Program, and the system is the explosives-detection equipment itself. These two cases are dealt with separately below.

Configuration Management of the FAA Aviation Security Program

As discussed above, the FAA is responsible for identifying and documenting threats to aviation security and determining the requirements for explosives-detection equipment. The FAA is also responsible for developing and controlling test protocols for verifying that explosives-detection equipment meets these requirements. Over time, FAA certification requirements, test protocols, and standards will have to change to respond to new threats, as well as to take advantage of new detection technologies. Using the principles of configuration management to track such changes would provide a means of maintaining records of them for certified and noncertified systems alike.

Recommendation. The FAA should implement a configuration-management plan that focuses on controlling the configuration of

  • documented threat definition
  • performance requirements and test protocols for precertification, certification, baseline, qualification, verification, monitoring, and self-diagnostic testing
  • test objects, including simulants and primary and secondary standard bag sets
  • test results
  • test protocols and performance capability categories of noncertified equipment

Recommendation. As a part of the FAA's configuration-management plan, the FAA should, for each of the above items,

  • identify the baseline configuration, including identification of configuration items (CIs) using unique identifiers
  • track all changes
  • identify the criticality of CIs and the degree and criticality of potential changes to them

Configuration Management of Explosives-Detection Equipment

The FAA recognizes the importance of a documented configuration-management plan, as indicated by its stated requirements for manufacturers of explosives-detection equipment who are applying for certification: "The vendor must provide documentation describing the system configuration management and quality assurance plans and practices applied during system development, production, and test and evaluation. This shall include hardware, software, and firmware version tracking and control, test equipment/tool certification tracking, explosive/simulant validation tracking and documentation update/control" (FAA, 1993).

Recommendation. The FAA should continue to enforce its existing guidelines by ensuring periodic third-party reviews of the configuration-management plan and practices of equipment manufacturers. These reviews should include physical-configuration audits (i.e., a technical examination of the EDS to verify that, "as-built," it conforms to the certified baseline) and in-process audits (i.e., an examination to determine if the configuration-management process established by an organization is being followed).

Recommendation. Definition of the baseline configuration of explosives-detection equipment prior to certification provides a mechanism for the stakeholders to determine the degree and criticality of changes to explosives-detection equipment. Therefore, the panel recommends that the FAA require manufacturers of explosives-detection equipment to take the following actions:

  • establish the baseline configuration of the explosives-detection equipment, including identification of CIs with unique identifiers, prior to certification (certification baseline)
  • have a process in place to determine the degree and criticality of changes in the EDS or noncertified explosives-detection equipment
  • notify the FAA of changes in configuration-management practices
  • implement version control, inclusive of baseline management, as a minimum requisite for certification

Recommendation. To maintain the configuration of explosives-detection equipment during certification testing, the panel endorses and recommends continued documentation of equipment configuration and changes to this configuration through use of the FAA's Configuration Log as described in their 1993 Management Plan for Explosives Detection System Certification Testing.

Configuration Control

Ideally, manufacturers of explosives-detection equipment should establish configuration control procedures during the engineering phase. The other stakeholders (the FAA and end users such as the airlines or airports) will, however, only be directly involved in change decisions after certification (i.e., during the manufacturing and operational life-cycle phases). For effective configuration control, all of the stakeholders must agree up front on the types of changes to the certification baseline that must be brought to their attention prior to implementation (see, for example, Figure 5-6).

Recommendation. All stakeholders (air carriers, FAA, equipment manufacturers) should agree prior to the manufacture or deployment of an EDS on a change classification system and process to ensure that stakeholders are notified of changes as necessary.

The benefits of having a change classification process in place prior to the manufacture or deployment of an EDS include the incorporation of improvements with a minimum of confusion about the role of each stakeholder, the clarification of the retesting required to confirm the effects of changes, and the empowerment of each stakeholder to evaluate EDS performance continually and suggest improvements. A typical change classification system ranks changes as follows (a schematic sketch of such a scheme follows the list):

  • Class 1 changes are likely to affect system performance.
  • Class 2 changes might affect some aspects of system performance.
  • Class 3 changes are not likely to affect system performance.
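
To make the classification concrete, the sketch below shows one way such a scheme might be represented in software. It is a minimal illustration only; the class definitions, notification policy, and example change are assumptions of this sketch, not requirements from the report.

    # Hypothetical sketch of the three-level change classification described
    # above. Names, fields, and the notification policy are illustrative
    # assumptions, not part of the report.
    from dataclasses import dataclass
    from enum import Enum

    class ChangeClass(Enum):
        CLASS_1 = 1  # likely to affect system performance
        CLASS_2 = 2  # might affect some aspects of system performance
        CLASS_3 = 3  # not likely to affect system performance

    # Stakeholders notified before implementation (assumed policy):
    NOTIFY = {
        ChangeClass.CLASS_1: ["FAA", "air carriers", "manufacturer"],
        ChangeClass.CLASS_2: ["FAA", "manufacturer"],
        ChangeClass.CLASS_3: ["manufacturer"],
    }

    @dataclass
    class ChangeRequest:
        ci_id: str                 # configuration item affected
        description: str
        change_class: ChangeClass  # assigned by the manufacturer, auditable

        def reviewers(self) -> list[str]:
            """Stakeholders who must be notified before implementation."""
            return NOTIFY[self.change_class]

    # Example: a detector change judged likely to affect performance.
    cr = ChangeRequest("EDS-DET-007", "Replace detector preamplifier",
                       ChangeClass.CLASS_1)
    print(cr.reviewers())  # ['FAA', 'air carriers', 'manufacturer']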

To determine the impact and acceptability of changes to a system, many manufacturers have instituted a change approval process that involves the formation of a configuration-control board, which includes representatives from the research, engineering, quality-assurance, marketing, and service areas. The board is the final arbiter of which changes are accepted or rejected. However, as discussed above, the effects of some changes are such that the customer and the regulator should have input into the required evaluations and the decision of whether to implement a proposed change. A stakeholders' configuration-control board, involving representatives of the end users (air carriers) and the FAA, would provide a mechanism for such a review, with the manufacturer's configuration-control board having the responsibility to ensure that changes of an appropriate degree and criticality are brought before the stakeholders' board. The stakeholders' configuration-control board should determine the method of verifying the effect of a proposed change on system performance. If consensus cannot be reached within the stakeholders' configuration-control board, the FAA should act as the final arbiter.

Figure 5-6

Configuration change process for an EDS during manufacture or operation.

Recommendation. Configuration-control boards such as the ones used in many industries should be established to determine which proposed changes will be implemented and the implementation and testing conditions that will be imposed.

Recommendation. The FAA should

  • require explosives-detection equipment manufacturers to implement change management as a minimum requisite for certification
  • develop a change classification system with stakeholder involvement
  • establish a mechanism (e.g., a stakeholders' configuration-control board) for stakeholder review of proposed changes (of an agreed-upon priority) to explosives-detection equipment

Although, in reality, determining the class of a change will be the responsibility of the manufacturer, review of the classification of changes (and the appropriateness of such classifications) should be included in a third-party configuration audit.

Configuration Management of Software for Explosives-Detection Equipment

Although software, hardware, and firmware form an integrated system, configuration management of software is a unique case due, in part, to the ease with which changes can be made to software code by a vendor (locally or remotely) and the relative difficulty for the user or regulator to recognize a software change. Because multiple systems are being installed at multiple locations, it is critical that software version levels be managed and controlled such that the version being executed at any one location is known. Uncontrolled modification of software in the field should not be permitted.

Similar to a hardware CI, a computer software configuration item (CSCI) is a uniquely identifiable subset of the system configuration. Each software module, for example, may constitute a separate CSCI. Furthermore, each CSCI should be assigned a unique identifier that relates the software to its associated software design documentation, revision, and release date.
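
As an illustration, the following minimal sketch (hypothetical Python; the record fields and naming scheme are assumptions, not the FAA's) shows how a CSCI identifier can tie a module to its design documentation, revision, and release date, and how a modified version can remain traceable to the certified baseline.

    # Minimal sketch of a CSCI identification record. All field names and
    # formats are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CSCI:
        name: str          # e.g., a software module
        version: str       # version number, traceable to certified baseline
        design_doc: str    # associated software design documentation
        revision: str
        release_date: str  # ISO date

        @property
        def identifier(self) -> str:
            """Unique identifier embedded in source code or firmware."""
            return f"{self.name}-{self.version}-{self.revision}"

    baseline = CSCI("bag-classifier", "3.1", "SDD-102", "r0", "1998-01-15")
    modified = CSCI("bag-classifier", "3.1a", "SDD-102", "r1", "1998-06-02")
    # "3.1a" remains traceable to certified version "3.1" by its prefix.
    assert modified.version.startswith(baseline.version)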

Recommendation. The FAA should

  • require manufacturers of explosives-detection equipment to select and document the baseline configuration of each CSCI prior to certification
  • require manufacturers of explosives-detection equipment to uniquely identify each CSCI with a name or serial number and a version number (identifiers should be embedded in source, object, or executable file code and, when applicable, electronically embedded in firmware)
  • require EDS vendors to identify software changes by labeling modified software with a unique version number that is traceable to the original certified software (e.g., modified software version 3.1 may be labeled "version 3.1a")

As with hardware changes, proposed changes in software will require review by a body with appropriate expertise to determine the scope of regression testing2 necessary to ensure that critical operational performance characteristics are maintained. Internally, manufacturers may choose to maintain a separate software configuration-control board to perform this function. However, when the other stakeholders are involved, the panel recommends that software and hardware experts be included in a single board that will review proposed system changes—software, firmware, and hardware alike.

2. Regression testing is the process of validating modified parts of a software program and ensuring that no new errors are introduced into previously tested code. Although software may have been tested during its development, program changes during maintenance require that parts of the software be retested by a regression test.

Changes to the software of a particular model of explosives-detection equipment may be testable through the use of a digital database, analogous to a test article for the system hardware. Such a database might contain the images or other appropriate data collected during certification testing, with a record of the decision made for each bag (for which data were collected) by the baseline configuration system. Comparing the baseline decisions with the decisions made for each bag by a system with updated or modified software, run on the same standard input data from the digital database, would allow determination of improvements in performance or other effects of software changes. Such data could also be used to test individual CSCIs without the necessity of physical tests of all hardware components. Ideally, it would be possible to obtain digital data from the output of the different operational subsystems. These data could then be used as a performance-verification tool for future changes in explosives-detection equipment by a specific manufacturer.
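
A minimal sketch of such a regression comparison, assuming a hypothetical database of per-bag records and a decision function supplied by the modified software (all names and data below are illustrative only):

    # Sketch of regression testing against a certification digital database:
    # replay the recorded input for each bag through the modified software
    # and compare its alarm decisions with those recorded for the certified
    # baseline configuration.
    def regression_report(digital_db, decide):
        """digital_db: iterable of (bag_id, recorded_data, baseline_alarm);
        decide: the modified software's alarm function."""
        changed = []
        for bag_id, data, baseline_alarm in digital_db:
            if decide(data) != baseline_alarm:
                changed.append(bag_id)
        return changed

    # Tiny invented example: two bags, a threshold-style decision function.
    db = [("bag-001", {"ct": [0.12, 0.34]}, True),
          ("bag-002", {"ct": [0.02, 0.05]}, False)]
    print(regression_report(db, lambda d: max(d["ct"]) > 0.1))  # -> []

Any bag whose alarm/no-alarm decision changed would then be reviewed to decide whether the software change improves or degrades detection performance.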

Recommendation. The FAA should develop and maintain control of a digital database that contains information collected during certification testing of explosives-detection equipment. Control of this database by the FAA in a manner similar to how they control the primary standard bag set, without release to equipment manufacturers or any other party, would provide assurance of comparability of tests performed at different times.

A test digital database similar to the digital database proposed above could be developed by testing a different bag set than the one used to develop the digital database (i.e., a bag set other than the primary standard bag set or the secondary standard bag set). This "test digital database" could then be made available to manufacturers so that they could perform software tests (e.g., precertification-type tests) at their own site. This tool would allow for an accelerated design program, both for new designs and for the modification of existing designs.

Configuration Management of Deployed Explosives-Detection Equipment

Discovery of failures, faults, or errors during operational tests or service of deployed explosives-detection equipment will often necessitate postcertification changes. Such changes can lead to "secondary" failures—faults or errors that were either not present or not detected when the system was first certified. Testing detects the presence of errors, but cannot guarantee the absence of errors. With each change, therefore, testing protocols should be re-evaluated to determine if they are capable of detecting errors that may be introduced by a change.

Often service contracts are negotiated by the end user of explosives-detection equipment with a party other than the manufacturer of the equipment. This third-party service provider is an important stakeholder in the configuration-management and performance-verification process.

Recommendation. The FAA should require the party responsible for postdeployment maintenance (e.g., third-party service providers) of explosives-detection equipment to apply configuration control to explosives-detection equipment in airports.

Expertise Required and Software Tools Available for Configuration Management

It is likely that configuration-management experience and procedures will vary from manufacturer to manufacturer and within the FAA. As a whole, the industry for explosives-detection equipment is relatively young, and the level of configuration-management expertise tends to correlate directly with the maturity of the manufacturer. The FAA and the manufacturers should, at a minimum, maintain a level of configuration-management expertise that allows them to interpret and apply the fundamental principles of baseline management and change management. For example, the FAA and the manufacturers should be able to identify appropriate CIs and establish a baseline configuration for test items and explosives-detection equipment, respectively.

Recommendation. The FAA and EDS manufacturers should maintain expertise to fully implement the following configuration-management concepts:

  • configuration item identification
  • configuration control and change management
  • configuration status accounting
  • configuration auditing

There are several commercially available software-based configuration-management tools that are applied in technology-intensive industries. The panel believes that the use of software-based configuration-management tools would facilitate tracking changes to the configuration of a CI, a CSCI, or a system baseline. Existing commercially available configuration-management tools can meet the needs of all explosives-detection equipment stakeholders, including equipment manufacturers, airlines/airports, and the FAA. However, software-based configuration-management tools developed internally by manufacturers may also be appropriate—if they enable version control.
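
The sketch below illustrates, in schematic form, the kind of status accounting such a tool provides: a baseline identifier, the current version of each CI, and an append-only log of approved changes. It is an assumption-laden illustration, not a depiction of any particular commercial tool.

    # Sketch of configuration status accounting: an append-only log of
    # approved changes applied to a certified baseline. Names are assumed.
    class ConfigurationLedger:
        def __init__(self, baseline_id, cis):
            self.baseline_id = baseline_id  # e.g., certification baseline
            self.current = dict(cis)        # CI id -> version
            self.history = []               # append-only change log

        def apply_change(self, ci_id, new_version, change_class, approved_by):
            self.history.append((ci_id, self.current.get(ci_id),
                                 new_version, change_class, approved_by))
            self.current[ci_id] = new_version

        def status(self):
            """Current as-built configuration, for audit comparison."""
            return dict(self.current)

    ledger = ConfigurationLedger("CERT-1998-04",
                                 {"illuminator": "2.0", "detector": "1.3"})
    ledger.apply_change("detector", "1.4", "Class 2", "stakeholders' CCB")
    print(ledger.status())  # {'illuminator': '2.0', 'detector': '1.4'}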

Recommendation. The FAA and the manufacturers of explosives-detection equipment should implement and maintain expertise in software-based configuration-management tools as a part of their management plan.

Performance Verification

The FAA has defined and prioritized the threats to aviation security and has defined an operational test protocol for radiographic x-ray scanners (used for screening carry-on baggage) and passenger-screening metal-detecting portals. However, the FAA has not developed performance requirements that are testable in airports or at manufacturing sites, nor has it adopted an airport performance-verification protocol for automated explosives-detection equipment. Developing such performance requirements and test protocols would allow for clear communication between the FAA and the manufacturers and end users. The test plan outlined in a 1993 National Research Council report (NRC, 1993) could serve as a model for a test protocol appropriate for use in an airport or at a manufacturing site. Such a protocol should include the following elements (a schematic sketch of how they might be recorded follows the list):

  • definition of the baseline configuration and management approach of the test
  • definition of the test conditions and requirements including test objects (e.g., simulated explosives), support equipment, test personnel, and test procedures
  • configuration management of the test protocol, including test objects, collected data, test requirements, and documentation
  • determination of test duration and frequency
  • determination of funding requirements
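
As referenced above, a schematic sketch of how these protocol elements might be recorded in machine-readable form follows; every value shown is an invented example, not an FAA requirement.

    # Hypothetical, machine-readable capture of the protocol elements listed
    # above. All values are illustrative assumptions.
    airport_protocol = {
        "baseline_configuration": "CERT-1998-04",
        "management_approach": "FAA-controlled, third-party audited",
        "test_conditions": {
            "test_objects": ["secondary standard bag set", "step wedge"],
            "support_equipment": ["bag conveyor", "data logger"],
            "personnel": "air carrier or third-party testers",
            "procedures": "documented test procedure reference",
        },
        "configuration_management": ["test objects", "collected data",
                                     "test requirements", "documentation"],
        "duration_hours": 4,
        "frequency": "monthly",
        "funding_source": "FAA",
    }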

Recommendation. The FAA should require a wide variety of tests for maintaining EDS certification, including qualification testing and periodic verification testing of detection performance levels (using a secondary standard bag set), frequent monitoring of critical system parameters (using test articles), and continuous self-diagnosis of subsystem parameters (e.g., voltages and currents) to detect incipient problems.

In the sections that follow, a protocol is discussed to provide guidelines for developing performance-verification procedures for certified and noncertified explosives-detection equipment.

Explosives-Detection Equipment Architecture and Performance Verification

Certification testing determines the integrated performance of all of the operational subsystems of the equipment under examination. During certification, explosives-detection equipment is tested as a monolithic entity, without testing individual components. The panel believes that this mode of testing yields only limited information about how modifications to operational subsystems and components could affect the performance of the explosives-detection equipment. Testing of appropriate parameters, however, could provide such information. Early identification of operational subsystems will increase the efficiency and rigor of performance verification, particularly with respect to factory testing. For x-ray CT-based equipment, examples of operational subsystems include the illuminator—that is, the x-ray generator composed of a high-voltage generator and an x-ray source—and the detector, which measures the modulation in the x-ray profile exiting the scanned bag. For such tests to be effective, the relationships among component parameter values, the performance of operational subsystems, and, ultimately, overall detection performance must be known.

Test Objects for Testing Bulk Explosives-Detection Systems

Test objects are a variety of objects used to test the performance of explosives-detection equipment. These objects range from test articles that measure a critical system parameter (an example is described in Appendix F), to materials that simulate explosives, to the primary standard bag set—which contains explosives—used to determine the detection performance of explosives-detection equipment (Table 5-1). The ability to test with a simulated explosive (i.e., as a part of a secondary standard bag set) is critical to the bulk explosives-detection equipment development and manufacturing process, as well as to field testing. However, the nature of a simulated explosive depends on the explosive material it is intended to simulate, as well as on the technology that will be used to detect it.


TABLE 5-1 Types and Purposes of Test Objects

Primary standard bag set
  Definition: Representative passenger bags, some containing explosives at threat quantity.
  Purpose: Simulate threat.

Secondary standard bag set
  Definition: Representative passenger bags, some containing simulants representing explosives at threat quantity.
  Purpose: Simulate primary standard bag set (requires no special safety-related handling permits or precautions).

Test articles
  Definition: Individual articles, including items such as simulants, the InVision IQ simulants test bag, and ASTM (1993) standardized step wedges and CT phantoms.
  Purpose: Elicit a predetermined response to test critical system parameters.


Currently, only explosives are used during certification testing of explosives-detection equipment, and the availability of FAA-validated secondary standards is limited. The FAA certification process, however, provides an ideal opportunity to correlate, for a particular piece of explosives-detection equipment, technical test data obtained using explosives with data obtained using secondary standards. Furthermore, it is the opinion of the panel that the FAA is responsible for ensuring the availability of appropriate secondary standard materials and should continue to work toward that goal.

Prior to FAA validation of secondary standard materials developed by equipment manufacturers, the FAA should require manufacturer validation of such materials as discussed in paragraph 2.4.5 on page 17 of the Management Plan for Explosives Detection System Certification Testing (FAA, 1993). Furthermore, the FAA should validate or arrange for independent validation of all simulants (regardless of who developed them) to be used for qualification testing and verification testing according to the guidelines presented in Annex II of Detection of Explosives for Commercial Aviation Security (NRC, 1993).

Recommendation. For qualification and verification testing, the FAA should work with EDS manufacturers and users to develop a secondary standard bag set for each specific technology or technology class.

Recommendation. The FAA should

  • continue to support the development and validation of test objects, simulated explosives, and associated test articles
  • continue to work with the International Civil Aviation Organization on the development of simulated explosives and other test objects to encourage development of internationally recognized standards
  • develop a secondary standard test article with each manufacturer that will meet the FAA's and the end user's needs for daily testing of explosives-detection equipment in airports
  • develop a secondary standard bag set that consists of a number of representative international passenger bags that do not contain threat objects and a number of bags containing simulated explosives at an amount that represents a threat quantity of explosives (the simulated explosives should mimic the threats in the primary standard bag set used for certification and should have been validated for the explosives-detection technology)

Recommendation. The secondary standard bag set should be developed, retained, and controlled by the FAA personnel responsible for the conduct of the certification test and evaluation and utilized in the conduct of periodic verification testing.

Recommendation. The secondary standard bag set should be controlled to assure reliable test results, as is done by the FAA for the primary standard bag set. It is important that the FAA periodically (on the order of the lifetime of the simulants) verify the condition, configuration, and performance of the secondary standard bag set.

All data generated by the use of the secondary standard bag set should be collected, analyzed, reported, and maintained by the FAA. A proposed test protocol based on the secondary standard bag set is presented in Appendix F.

The performance of explosives-detection equipment may be inferred by testing critical system parameters using a test article or by continuously monitoring subsystem parameters (Table 4-1). A critical system parameter is a parameter that is fundamental to the performance of explosives-detection equipment. Such a parameter is directly related to the ability of explosives-detection equipment to detect an explosive. For the case of a CT-based imaging system, these parameters include the macrosystem transfer function, spatial resolution, and background noise. Test articles exist to test critical system parameters for many explosives-detection technologies as a result of their use in other applications—for example, CT in the medical field.

In addition to the critical system parameters, there are subsystem parameters, which are measurable parameters that indicate the operational consistency of explosives-detection equipment (e.g., voltage and current measurements). There is a rich history in the medical imaging community of testing subsystem parameters associated with specific components of medical CT systems. Most manufacturers perform tests associated with the x-ray generator (or, referring to our defined explosives-detection equipment anatomy, the illuminator) that determine the accuracy of the x-ray tube potential and the x-ray tube current. Inappropriate calibration of these two subsystem parameters can cause errors in the output of the complete CT system, particularly with respect to any quantitative determination of the x-ray attenuation coefficient, a critical system parameter used to characterize the objects being imaged. Testing, monitoring, and evaluating the accuracy of the x-ray tube potential and the x-ray tube current is as important in CT-based explosives-detection equipment as it is in a medical CT system. Tests on other subsystem parameters, such as voltage and noise levels associated with the x-ray detector—a separate component from the illuminator—are also incorporated in most factory test programs for medical CT systems and, therefore, could also be considered for CT-based explosives-detection equipment.

Recommendation. For monitoring the performance of EDSs, the FAA should work with manufacturers to develop a set of critical system parameters (and their tolerances) that could be monitored frequently and recorded to track changes in performance during normal operations or to verify performance after maintenance or upgrading.

Critical system parameters (e.g., attenuation coefficient) are measurable with appropriately designed test articles that need not include simulated explosives (Table 5-1). Subsystem parameters (e.g., voltages) can be monitored continuously and do not necessarily require the use of a test article.

As noted earlier in this chapter, certification testing determines the integrated performance of all operational subsystems of explosives-detection equipment, which is tested as a monolithic entity without testing individual subsystems or components. The panel concluded that this mode of testing yields limited information about how modifications to operational subsystems and components could affect the performance of the equipment; testing of appropriate parameters, however, can provide such information.

Recommendation. The FAA should verify critical system parameters during certification testing.

Recommendation. The FAA should

  • request that manufacturers of explosives-detection equipment identify critical system parameters that are directly related to the ability of such equipment to detect explosives (e.g., macrosystem transfer function in CT-based equipment) and provide for monitoring and reporting of these parameters at appropriate intervals
  • request that manufacturers of explosives-detection equipment explicitly define appropriate subsystems consistent with the panel's taxonomy for sampling, analyzing, classifying, and interfacing
  • request that manufacturers of explosives-detection equipment identify appropriate test parameters (including tolerances) for the subsystems or components on which subsystems depend (e.g., voltage or current levels)
  • measure and record critical system parameter and subsystem test parameter values during certification testing to determine baseline test parameter values
  • establish critical test parameter and subsystem test parameter value ranges, based on certified baseline test parameter values, that may be used as an indication that the overall system meets certified performance (parameter values measured in the field could be referenced against the established parameter value ranges to infer performance)

Note that validation of critical and subsystem parameter ranges may require monitoring the correlation of these ranges with equipment performance over time—even after deployment. In this context, system performance pertains to the ability of the equipment to detect the explosive compositions and configurations (e.g., sheet and nonsheet bulk explosives) defined by the FAA's certification criteria. Therefore, parameter values measured outside of accepted ranges should trigger testing the equipment with the secondary standard bag set in the field. Figure 5-7 illustrates a verification testing process.
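
The verification logic implied by this paragraph can be sketched as follows; the parameter names and acceptance ranges are invented for illustration and would, in practice, come from certification baseline measurements.

    # Sketch of the field verification logic described above: compare
    # measured critical system parameter values against ranges established
    # at certification; any out-of-range value triggers testing with the
    # secondary standard bag set. Names and ranges are assumptions.
    CERTIFIED_RANGES = {
        "attenuation_coefficient": (0.95, 1.05),  # normalized to baseline
        "spatial_resolution_mm": (0.0, 2.0),
        "background_noise": (0.0, 0.02),
    }

    def needs_bag_set_test(measured: dict) -> bool:
        """Return True if any critical parameter falls outside its range."""
        for name, value in measured.items():
            lo, hi = CERTIFIED_RANGES[name]
            if not (lo <= value <= hi):
                return True
        return False

    field_reading = {"attenuation_coefficient": 1.08,
                     "spatial_resolution_mm": 1.6,
                     "background_noise": 0.01}
    if needs_bag_set_test(field_reading):
        print("Out of range: verify with secondary standard bag set.")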

Performance Verification of Bulk Explosives-Detection Systems

The FAA needs qualification testing, verification testing, and monitoring protocols to verify the performance of deployed EDSs, because it is not practical to duplicate certification testing in airports and at manufacturing sites, and there is a need to determine that an EDS is operating properly on a daily basis. The first step in the development of a qualification, verification, and monitoring testing protocol to verify the performance of deployed bulk explosives-detection equipment is the realization that performance requirements cannot easily be related directly to the FAA certification standard (e.g., explosives cannot be easily handled in airports) but must instead be governed by what is practical to handle at manufacturing sites and in airports. To accomplish this, the FAA must first define a primary standard (e.g., a set of suitcases containing bombs) from which to reference a more practical test item. For example, the bag set used for certification testing of an EDS may serve as this primary standard. Given the assumption that the primary standard bag set accurately reflects the "real threat," there is a level of uncertainty (or risk) involved in modeling the primary standard bag set with a secondary standard bag set (e.g., a simulated explosive in a test suitcase). The next logical step, testing critical parameters (e.g., resolution) with test articles that are intended to correlate to system performance, also brings into question how realistically the test object reflects the real threat. Continuous monitoring of appropriate subsystem parameters (e.g., x-ray tube current) can provide evidence of changes in system performance that may indicate changes in detection and false-alarm rates.

Figure 5-7

Monitoring and verification testing for certification maintenance.

As depicted in Figure 5-8, the further removed a test method is from the real threat, the greater the uncertainty about the ability of the test method to effectively measure the performance of the EDS. The practicality of conducting performance verification, however, increases along with the degree of uncertainty. In the context of performance verification, practicality reflects test duration and difficulty and the likelihood that the test will not disrupt airport and airline operations. That is, the less obtrusive the performance-verification test, the more practical it is. Furthermore, Figure 5-8 indicates the dependence of each successive level of performance verification on the previous level. For example, the quality of the primary standard bag set, and analogously the secondary standard bag set, depends on the understanding of the real threat. Similarly, the efficacy of testing a critical parameter with a test article depends on the availability of the secondary standard bag set to resolve a potential problem detected by such a test.

In this light, it is apparent that the FAA needs to develop a performance-verification protocol for airport testing of an EDS that incorporates more than one level of testing. For example, daily diagnostic testing of critical parameters of an EDS could be augmented by monthly testing with simulated explosives hidden in luggage or an annual check with explosives. Regardless of the performance-verification protocol established, external developments such as emerging explosives-detection technologies and changing threats to aviation security should be reflected with appropriate changes to primary standards, secondary standards, and diagnostic tests.

Recommendation. The FAA should design and validate a two-tiered protocol for performance-verification testing of bulk EDSs:

  1. Using a test article (e.g., a simulated explosive, step wedge, etc.) during certification testing, establish acceptable ranges for the system parameter values deemed critical to the alarm/no-alarm output of a certified EDS. This test can then be repeated in the airport or at the manufacturing site to determine whether the deployed or to-be-deployed certified EDS has parameter values within the acceptable limits. In addition, subsystem parameter values can be continuously monitored to assure the operational consistency of the EDS.
  2. Using a secondary standard bag set, obtain estimates of the probability of detection (PD) and the probability of false alarm (PFA) to be used in addition to the PD and PFA values determined using the primary standard bag set during certification testing. If airport test results (e.g., PD and PFA of the deployed EDS) and results from tests performed at the FAA (e.g., PD and PFA of the EDS that underwent certification testing) using the same secondary standard bag set show statistically significant agreement, the performance of the deployed EDS could be said to be verified. The FAA could use the protocol presented in Appendix A of Detection of Explosives for Commercial Aviation Security as a model for developing a performance-verification protocol suitable for use in airports or at manufacturing sites.

Figure 5-8

Schematic representation of the relationship between various levels of performance-verification test objects and the "real threat" and the relative practicality and degree of uncertainty associated with them.

The first option would be useful for daily calibration and diagnostic testing of an EDS. This approach, however, will only provide inferential information regarding the capacity of the EDS for detecting explosives. The adequacy of this information for predicting detection performance depends on the proper choice and understanding of the critical parameters measured. This type of testing will not provide a quantitative measure of performance relative to certification criteria (i.e., it does not estimate PD and PFA) but could indicate small changes in system behavior, which may give early warning about changes in detection performance. Therefore, the panel recommends that the second, more rigorous, approach be utilized periodically to yield performance probabilities (PD and PFA) to correlate performance-verification testing more directly with certification specifications.3 An example test protocol for each approach is given in Appendix F.

3. Note that the bags that compose the primary and secondary standard bag sets will not necessarily be representative of those being processed at any one airport. It is likely that the daily PFA for actual passenger baggage at a particular airport will vary from that determined using the secondary standard bag set. Therefore, the daily PFA should not be used for comparison against certification requirements for the purpose of disqualifying a deployed system from service.
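
For illustration, the following sketch estimates PD and PFA from secondary-standard-bag-set runs and checks for statistically significant disagreement between airport and FAA Technical Center results. The two-proportion z-test and all counts shown are assumptions of this sketch; the report does not prescribe a specific statistical test.

    # Sketch of the second, more rigorous tier: estimate PD and PFA from a
    # secondary standard bag set run and test whether the airport results
    # agree with those recorded at certification.
    import math

    def proportions_agree(h1, n1, h2, n2, alpha=0.05):
        """Two-sided two-proportion z-test; True if no significant difference."""
        p1, p2 = h1 / n1, h2 / n2
        p = (h1 + h2) / (n1 + n2)            # pooled proportion
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        if se == 0:
            return True
        z = (p1 - p2) / se
        # Normal-approximation p-value via the error function.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return p_value > alpha

    # Airport run vs. FAA run on the same bag set (invented example counts):
    pd_ok = proportions_agree(92, 100, 95, 100)   # detections of simulant bags
    pfa_ok = proportions_agree(18, 100, 15, 100)  # alarms on clean bags
    print("performance verified" if pd_ok and pfa_ok else "investigate")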

In addition to monitoring the detection performance of an EDS, it is important to monitor the false-alarm rate and the baggage throughput rate of the system. These quantities can be monitored continuously as passenger baggage passes through the system. However, if the false-alarm rate is determined in this manner to be unacceptably high, the system should not be taken off-line. Rather, this situation warrants testing the system with the secondary standard bag set to determine if the false-alarm rate is within specifications. If the system is found—with the secondary standard bag set—to have a false-alarm rate that is outside of specifications, the FAA, the user, and the manufacturer should develop a plan to correct this problem. Similarly, if the baggage throughput rate drops to a rate that is unacceptably low, the FAA, the user, and the manufacturer should develop a plan to correct this problem.

Performance Verification of Noncertified Explosives-Detection Equipment

Noncertified explosives-detection equipment comprises bulk explosives-detection equipment that has not met FAA certification requirements and trace detection devices, for which there are no FAA-defined certification criteria. As a result of the Federal Aviation Reauthorization Act of 1996 (Public Law 104-264), noncertified explosives-detection equipment has been, and will continue to be, deployed by the FAA in U.S. airports. The panel believes that the FAA has the responsibility for determining the performance capabilities of all equipment deployed under Public Law 104-264 and for establishing a plan for maintaining the determined level of performance in the field.

Bulk Explosives-Detection Equipment

To date, the FAA has not established formal performance specifications for noncertified explosives-detection equipment. The FAA has, however, developed a baseline test that is used to determine which noncertified bulk explosives-detection equipment will be deployed (FAA, 1995). The FAA tests explosives-detection equipment against the same categories and amounts of explosives that are used in certification testing (as well as amounts greater than and less than those used during certification testing) to determine a performance baseline for that equipment. A field test protocol for the deployed explosives-detection equipment may involve simulants for each of the categories in a secondary standard bag set, where the expectation would be that the average PD and PFA would be similar to those obtained when explosives were used.

Recommendation. For bulk noncertified systems and devices, the panel recommends that

  • the FAA require the manufacturers to submit one copy of each noncertified model that has been, or is going to be, deployed to certification testing so as to obtain a measure of the baseline performance against certification criteria
  • the test protocol described above for bulk EDSs be used, with the exception that performance specifications would be based on the baseline performance determined for each type of noncertified system or device rather than on certification requirements

3. Note that the bags that compose the primary and secondary standard bag sets will not necessarily be representative of those being processed at any one airport. It is likely that the daily PFA for actual passenger baggage at a particular airport will vary from that determined using the secondary standard bag set. Therefore, the daily PFA should not be used for comparison against certification requirements for the purpose of disqualifying a deployed system from service.

Trace Detection Devices

The FAA's protocol for evaluating trace explosives-detection devices uses extremely small amounts (of known quantity) of each category of explosives on a variety of substrates, with some substrates intentionally left uncontaminated. Several trace explosives-detection devices were shown by the FAA to be capable of finding explosive materials on the surface of various types of carry-on luggage. These tests, however, do not distinguish between the capabilities of the machine and the ability of the operator to locate and adequately sample the contaminated surface.

In contrast to bulk detection methods, trace detection systems are more likely to suffer from false negatives (i.e., an explosive-containing bag is not detected) caused by inadequate sample collection than from false positives (i.e., false alarms) triggered by nonexplosive items. Furthermore, sample collection for trace detection devices is so operator dependent that the performance of a trace device is directly tied to the performance of the human operator.

The panel determined that a performance-verification testing protocol for trace explosives-detection devices should test each of the following three tasks (a sketch of how the resulting measurements might be scored follows the list):

  • sample collection: determine if, during normal operation, the operators adequately sample simulated carry-on luggage that (unknown to the operators) has known amounts of explosive placed on baggage handles, zippers, and other areas that would likely be contaminated if the baggage contained an explosive
  • sample transfer: determine the efficiency with which the sample collection techniques transfer the material for detection from a surface known to be contaminated with a known amount of explosive
  • sample analysis: determine if the trace detection device adequately maintains the required detection limit while functioning continuously
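As a loose illustration of how results from these three tasks might be reduced to simple figures of merit, the sketch below scores hypothetical trial data. The function names, units, and pass criteria are assumptions for illustration, not FAA requirements.

```python
def collection_score(trials):
    """Fraction of planted test articles that operators actually sampled.
    Each element of trials is True if the operator sampled the article."""
    return sum(1 for t in trials if t) / len(trials)

def transfer_efficiency(deposited_ng, recovered_ng):
    """Mass fraction moved from a contaminated surface to the detector inlet."""
    return recovered_ng / deposited_ng

def analysis_holds_limit(detected_quantities_ng, required_limit_ng):
    """True if, in successive checks during continuous operation, the device
    still responds at or below the required detection limit."""
    return all(q <= required_limit_ng for q in detected_quantities_ng)

# Hypothetical trial data for each task:
print(collection_score([True, True, False, True]))                   # 0.75
print(transfer_efficiency(deposited_ng=100.0, recovered_ng=12.0))    # 0.12
print(analysis_holds_limit([0.8, 0.9, 1.2], required_limit_ng=1.0))  # False
```

Separating the three scores in this way keeps the machine's contribution (transfer and analysis) distinguishable from the operator's contribution (collection), which the current FAA tests do not do.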

Quality Systems and Standards

In the preceding discussions, specific recommendations were made regarding performance verification and configuration management. However, performance verification and configuration management are elements of a broader quality system. A quality system structure should be put in place for oversight, validation, verification, and management of configuration management and performance-verification activities throughout the life cycle of explosives-detection equipment.

Recommendation. Every stakeholder must have a quality system in place that includes oversight, validation, verification, and procedures for configuration management and performance verification covering that stakeholder's responsibilities throughout the life cycle of an EDS.

Recommendation. Each stakeholder must have a documented and auditable quality system that governs its specific responsibilities in the manufacture, deployment, operation, maintenance, and upgrading of EDSs.

To provide effective aviation security, each of the stakeholders should have a quality system in place. As the regulator of U.S. commercial aviation, the FAA should define quality system requirements that are compatible with stakeholder needs. For example, the manufacturers of explosives-detection equipment need a quality system that deters unintentional performance-degrading changes without stifling innovative product improvements. Air carriers, airports, and other end users need a quality system that ensures that baggage-handling facilities operate smoothly and that proper detection performance can be demonstrated when requested by the FAA. In addition to ensuring confidence in its testing protocols, procedures, data handling, and test objects, the FAA must have its own quality system. All of the quality standards considered by the panel could provide the framework for development of quality systems that meet individual stakeholders' needs.

Recommendation. Because there is already a global movement toward using the ISO 9000 series of quality system standards, the FAA should base both its in-house quality system and its requirements for other quality systems on these standards. The FAA should accept the quality system of any stakeholder that has the following attributes:

  • a definition of the critical parameters, procedures, and processes to be monitored
  • documented evidence of an internal quality system
  • a definition of the methods for controlling and verifying changes to procedures and processes
  • a definition of an internal audit program
  • provision for third-party auditing of conformance with the quality system

The FAA would benefit from applying the principles of ISO 9000 to its Aviation Security Program. For example, the FAA's internal quality system could track the FAA's conformance to its Management Plan for Explosives Detection System Certification Testing (FAA, 1993), its Technical Center Baggage Inspection System: Airport Operational Demonstration Project Test and Evaluation Master Plan (FAA, 1995), or the manufacturing management plan recommended in this report. It is the opinion of the panel that diligent adherence by the FAA to an auditable quality system (e.g., ISO 9000) would significantly improve the integrity of deployed explosives-detection equipment. Finally, an auditable quality system would provide a mechanism to determine the FAA's conformance to its own management and test and evaluation plans.

Recommendation. The FAA should implement and document its own quality system under which precertification activities, certification testing, test standards development and maintenance, and testing for maintaining certification are conducted.

Recommendation. The FAA should

  • define and record its critical test and evaluation procedures, equipment performance requirements, and data-handling procedures
  • monitor its requirements and procedures through its quality system
  • identify in its quality plan all objects (e.g., secondary standard bag set), including hardware, software, and firmware, that are critical to conducting certification, approval, qualification, or verification testing of explosives-detection equipment
  • receive an audit of its quality system—including (where applicable) a configuration audit—from an independent third-party auditor

For a quality system to be effective in a manufacturing environment, the critical manufacturing steps must be defined. An ISO 9001 audit, for example, will verify that a manufacturer is following its quality system, but it does not validate that the quality system results in the production of a certifiable EDS.

Recommendation. The FAA should

  • require that equipment manufacturers utilize a quality system consistent with the requirements for ISO 9001
  • require that equipment manufacturers define critical manufacturing parameters, procedures, and processes that will be monitored by the manufacturers' quality system
  • require equipment manufacturers to identify critical manufacturing steps, components, and software modules as part of their quality planning process
  • require explosives-detection equipment manufacturers to provide documented evidence of their internal quality system prior to certification testing
  • provide guidelines for, and requirements of, a quality system to potential applicants for certification of explosives-detection equipment

Recommendation. The FAA should ensure that each stakeholder periodically receives an audit of its quality system—including (where applicable) a configuration audit—from an independent third-party auditor.

The previous discussion focused on quality systems for the FAA and for manufacturers of certified EDSs. However, the FAA also purchases noncertified equipment for airport demonstrations and operational testing. As purchaser, the FAA should require that the manufacturers of noncertified equipment have in place a quality system that meets the same standards as the quality systems for manufacturers of certified equipment.

Recommendation. The FAA should require that manufacturers of noncertified equipment demonstrate a quality system covering both manufacturing and upgrades at the same level required of manufacturers of EDSs whenever noncertified equipment is purchased by the FAA for airport demonstrations, operational testing, or airport deployment.

Recommendation. The FAA should ensure that the manufacturers of noncertified explosives-detection equipment purchased by the FAA periodically receive an audit of their quality system—including a configuration audit—from an independent third-party auditor.

Recommendation. The FAA should ensure that airlines, other end users, and organizations responsible for maintenance and upgrades (e.g., manufacturers or third-party service providers) demonstrate a quality system that covers the operation, maintenance, and upgrade of noncertified EDSs that are purchased by the FAA for airport demonstrations, operational testing, or airport deployment. Such a quality system should meet the FAA's requirements for quality systems used by operators and maintainers of certified explosives-detection equipment.

Once deployed, explosives-detection equipment becomes an essential part of an air carrier's baggage-handling system. Air carriers utilize quality control and maintenance procedures to ensure that baggage gets to the proper airplane on time and free from damage. Incorporating explosives-detection equipment adds a critical function to baggage handling: delivering baggage to the proper airplane free from explosive threats. Proper operation, testing, and maintenance of explosives-detection equipment are crucial to its effectiveness, and the principles of ISO 9002 should therefore be followed in maintaining operational, testing, and maintenance procedures.

Recommendation. The FAA should

  • require that air carriers (or other end users) utilize a quality system that is consistent with the requirements for ISO 9002
  • define, in partnership with the air carriers and equipment manufacturers, critical operating, testing, and maintenance procedures that should be monitored by the air carriers' quality system
  • require air carriers to provide documented evidence of their internal quality system that supports their aviation security effort
  • provide its guidelines for, and requirements of, a quality system to the air carriers or other entities responsible for operation, testing, and maintenance of explosives-detection equipment

Precertification Requirements

Modifications to explosives-detection equipment during the two- to three-week certification testing period can be time-consuming and costly. Furthermore, such modifications can lead to unforeseen problems later in the life cycle of the equipment. Therefore, the baseline configuration established prior to certification should be maintained throughout the certification process. Diligent use of the Configuration Log, as defined in the Management Plan for Explosives Detection System Certification Testing (FAA, 1993), would facilitate baseline management during certification testing. In addition, a manufacturer of explosives-detection equipment is responsible for providing quantitative evidence that a system submitted for certification is a production-representative system (not a developmental system), manufactured on "released for manufacture" documentation, and prepared for certification testing.
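As a minimal sketch of what a machine-readable Configuration Log entry might capture, consider the record below. The field names and values are assumptions for illustration; they are not taken from the FAA's 1993 plan.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ConfigLogEntry:
    """One entry in a certification-period Configuration Log (illustrative)."""
    item: str               # hardware, software, or firmware item identifier
    baseline_version: str   # version under "released for manufacture" control
    change_description: str
    approved_by: str
    entry_date: date

# A hypothetical log confirming the baseline was held during testing.
log = [
    ConfigLogEntry(
        item="image-reconstruction firmware",
        baseline_version="2.1.0",
        change_description="no change; baseline re-verified before testing",
        approved_by="QA lead",
        entry_date=date(1998, 1, 15),
    ),
]
```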

The panel endorses the requirement in the FAA certification criteria (FAA, 1993) specifying that quality systems should be in place prior to certification testing for each subsequent life-cycle phase of an EDS. Because each stakeholder has different responsibilities during each life-cycle phase, the quality systems may vary, but they must all have the attributes specified in the previous recommendation. Although individual stakeholder activities may already be under the control of quality systems, the panel believes that it is critically important to have comprehensive quality systems that cover all life-cycle phases of an EDS. Furthermore, it is important that each stakeholder periodically receive a third-party audit of its quality system, including, where applicable, a configuration audit.

Recommendation. Explosives-detection equipment presented to the FAA for certification testing must be the product of an implemented and documented manufacturing quality system. Subsequent units must be manufactured according to the same quality system to maintain certification.

The panel concurs with the FAA's Vendor Instructions and Data Qualification Requirements (Section II, FAA, 1993), which must be followed and submitted to begin the process of certification of explosives-detection equipment. The panel, however, is of the opinion that the Test Plan (Paragraph 2.4.2, FAA, 1993) should include additional requirements. For example, a formal precertification test, conducted by the manufacturer on candidate production-representative equipment, would determine whether the equipment is likely to meet FAA certification criteria. This process would include a review by the FAA of the results of a formal precertification test to determine if the system is ready to be submitted for certification testing at the FAA Technical Center (see DOD, 1995).

Recommendation. The FAA should adhere to the requirement that the manufacturers conduct, and provide documentation of, precertification testing of explosives-detection equipment they have deemed to be production representative.

The formal precertification test provides quantitative evidence that the system meets (or fails to meet) the FAA's performance requirements prior to certification testing. During the formal precertification test, errors, faults, and failures may be encountered. Problems detected in the equipment or individual subsystems during the test are sorted by priority and documented as Priority 1 (critical), Priority 2 (crucial), Priority 3 (essential), or Priority 4 (nonessential), as shown in Table 5-2. The panel concluded that manufacturers should be required to enact, and provide documentation of, remediation actions to problems detected during the formal precertification test prior to submitting the explosives-detection equipment for certification testing. When software problems are detected during precertification testing, they should be corrected with completely compiled software code as opposed to software patches.4

TABLE 5-2 Criteria for Classifying Problems with Explosives-Detection Equipment

Priority 1 (critical): Any failure condition or design error that would (1) prevent the accomplishment of an operational or mission-essential capability (e.g., detecting explosives) specified in baseline requirements, (2) prevent the operator's accomplishment of an operational or mission-essential capability, or (3) jeopardize personnel safety.

Priority 2 (crucial): Failure conditions or design errors, for which no alternative work-around solution is known, that would adversely affect (1) the accomplishment of an operational or mission-essential capability specified by baseline requirements so as to degrade performance, or (2) the operator's accomplishment of an operational or mission-essential capability specified by baseline requirements so as to degrade performance.

Priority 3 (essential): Failure conditions or design errors, for which an alternative work-around solution is known, that would adversely affect (1) the accomplishment of an operational or mission-essential capability specified by baseline requirements so as to degrade performance, or (2) the operator's accomplishment of an operational or mission-essential capability specified by baseline requirements so as to degrade performance.

Priority 4 (nonessential): Failures or design errors that do not affect required operational or mission-essential capability but are an inconvenience to the operator.

Sources: Radio Technical Commission for Aeronautics (1992) and DOD (1994).

Recommendation. The FAA should follow the guidelines listed below in developing requirements for problem remediation:

  • Priority 1 or 2 problems: Mandate that all changes to remedy system failures and design errors have been implemented and appropriately retested. Priority 1 or 2 problems should not be corrected by software patches but should be corrected in completely compiled software code. Priority 1 or 2 problems should not be corrected with hardware patches.
  • Priority 3 or 4 problems: For problems classified as Priority 3 or 4, the FAA may allow manufacturers to proceed with certification without correcting the problems and retesting the equipment, provided that uncorrected problems and problems corrected by software patches are specifically identified as part of the certification process. Patches may be made but are not encouraged. A limited number of patches may be necessary due to preproduction hardware configurations that affect system performance without functionally limiting it. The FAA should, however, strongly encourage manufacturers to submit equipment with completely compiled software code.

If, during certification testing, the FAA determines that any of the appropriate precertification requirements have not been met—for example, if Priority 1 or 2 problems are encountered—the certification process should be stopped immediately. Certification testing should not be resumed until the manufacturer corrects any Priority 1 or 2 shortfalls, reestablishes a baseline configuration, and re-attests to the system's readiness for certification. The FAA may also stop dedicated certification testing if the number of Priority 3 problems encountered is sufficient to significantly impact the integrity of the test or the FAA's ability to continue testing.
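The remediation and gating rules above lend themselves to a compact encoding. The following is a minimal sketch assuming a simple problem record; the priority semantics follow Table 5-2, the readiness check follows the Priority 1/2 guidelines (retesting is assumed to accompany each correction), and none of the names are taken from FAA documents.

```python
from dataclasses import dataclass
from enum import IntEnum

class Priority(IntEnum):
    CRITICAL = 1      # prevents a mission-essential capability or jeopardizes safety
    CRUCIAL = 2       # degrades performance; no work-around known
    ESSENTIAL = 3     # degrades performance; work-around known
    NONESSENTIAL = 4  # inconvenience to the operator only

@dataclass
class Problem:
    priority: Priority
    corrected: bool        # correction implemented and retested
    fixed_by_patch: bool   # patched binary rather than recompiled source

def ready_for_certification(problems):
    """Priority 1/2 problems must be corrected in completely compiled code;
    Priority 3/4 problems may remain open or patched if documented."""
    return all(
        p.corrected and not p.fixed_by_patch
        for p in problems
        if p.priority <= Priority.CRUCIAL
    )

# Example: a recompiled Priority 1 fix plus a documented open Priority 3 issue.
issues = [
    Problem(Priority.CRITICAL, corrected=True, fixed_by_patch=False),
    Problem(Priority.ESSENTIAL, corrected=False, fixed_by_patch=False),
]
print(ready_for_certification(issues))  # True
```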

4. A software patch is a software modification in which binary code is inserted into executable files without recompiling from the source program. The use of patches is not considered acceptable software engineering practice, although there are situations in which there are no other options (Buckley, 1993).
