4 Tools for Ensuring Operational Performance

To ensure operational performance throughout the life cycle of a unit of explosives-detection equipment, the FAA should implement a life-cycle management plan that defines and documents configuration-management, performance-verification, and quality-assurance procedures for all stakeholders. This chapter describes "tools" that could be used by the stakeholders to produce explosives-detection equipment that consistently meets the performance requirements of the FAA. Individually, none of the tools will ensure the operational performance of explosives-detection equipment; neither configuration management nor performance verification is, by itself, an effective method of ensuring that performance. These tools must be integrated synergistically with a quality system to effectively maintain equipment performance at a level acceptable to the FAA.

Configuration Management

Configuration management is a process to identify the functional and physical characteristics of a software, firmware, or hardware item during its life cycle; control changes to those characteristics; and record and report change processing and implementation status. Configuration management is applied to ensure operational efficiency and control cost and may be applied to achieve uniformity in procedures and practices within the FAA and between the FAA and industry. Properly applied, configuration management could provide the FAA and manufacturers of explosives-detection equipment with formal mechanisms for determining how changes affect operating characteristics, including detection ability. Applying configuration-management techniques, however, requires judgment: inconsistent, unmoderated configuration management may compromise performance; rigid, inflexible configuration management may stifle innovation.
Thus, effective configuration management relies on the knowledge and judgment of the people responsible for implementing the configuration-management plan. Configuration management is performed by subjecting every change to a configuration item (CI) to review and approval by authorized and knowledgeable personnel. A CI is a collection of hardware, software, and firmware that is a uniquely identifiable subset of the system and that represents the smallest portion of the system to be subject to configuration-control procedures (DOD, 1995; Buckley, 1993).[1] Furthermore, documentation that describes the configuration of a CI is itself a configuration item that needs to be defined and maintained. CIs that are specific to system software are referred to as computer software configuration items[2] (DOD, 1995). CIs must be individually controlled, because any change to a CI may affect the performance of the explosives-detection equipment. It is crucial that all explosives-detection equipment have a readily auditable list of CIs that includes their current status.

Configuration management, as shown in Figure 4-1, consists of four basic functions[3] (DOD, 1995; Buckley, 1993; Blanchard, 1986):

Configuration identification: identification of the functional and physical characteristics of a software, firmware, or hardware CI at any given time during its life cycle. This includes formal selection of CIs and maintenance of the documents that identify and define the baseline of a CI and the overall system.

Configuration control: systematic proposal, justification, evaluation, coordination, approval, or disapproval of proposed changes and the implementation of all approved changes to the configuration of each CI and the documentation that identifies the configuration of the CI.

Configuration status accounting: recording and reporting the implementation of changes to the configuration and its identification documents.

Configuration auditing: checking a CI or system for compliance with the identified configuration.

These functions are performed to manage the configuration of explosives-detection equipment throughout its life cycle.

Figure 4-1 Major divisions of configuration management. Source: Buckley, 1993, IEEE.

[1] Any one of the operational subsystems (e.g., analyzer) can be identified as a CI. Conversely, another operational subsystem (e.g., sampling) might consist of several CIs (e.g., x-ray source, x-ray detector, power supply). The designation of a CI is often a judgment call made by a project manager or project management team. It is immaterial to configuration management how this decision is made (Buckley, 1993).
[2] For the purpose of this report, CI is used as a general term for all configuration items: hardware, firmware, or software. Computer software CI is only used to specify a computer software configuration item.
[3] Configuration management terms used in this report are defined in the Glossary. For a more complete collection of definitions that apply to configuration management, refer to Appendix A of Implementing Configuration Management (Buckley, 1993).

Configuration Identification

Configuration identification involves selection of CIs and maintenance of the documents that identify and define the baseline configuration of such an item or the overall system (e.g., an EDS). This includes the determination of the types of configuration documentation for each CI and the issuance of unique identifiers (e.g., serial numbers) affixed to each CI and to the technical documentation that defines the CI's configuration. A configuration baseline is the documented configuration of CIs and of the equipment as identified at a particular point in the life cycle of explosives-detection equipment. Definition of the baseline configuration requires documenting the physical and functional characteristics of each CI, as well as the configuration of the items within a system. The manufacturer of explosives-detection equipment is likely to identify several baselines throughout the life cycle of the equipment.
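The identification and audit functions described above can be pictured as a small registry: each CI carries a unique identifier and a documented revision, and a baseline is a frozen snapshot of those revisions at a point in the life cycle. The sketch below is illustrative only; the class names, fields, and example CIs are our own, not drawn from FAA documentation.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ConfigurationItem:
    """A uniquely identifiable hardware, software, or firmware item."""
    ci_id: str     # unique identifier affixed to the item (e.g., serial number)
    kind: str      # "hardware", "software", or "firmware"
    revision: str  # revision of the defining technical documentation

@dataclass
class Baseline:
    """Documented configuration of all CIs at a point in the life cycle."""
    name: str
    items: dict = field(default_factory=dict)  # ci_id -> ConfigurationItem

    def add(self, ci: ConfigurationItem) -> None:
        self.items[ci.ci_id] = ci

    def audit(self, as_built: dict) -> list:
        """Configuration audit: report CIs whose as-built revision differs
        from (or is missing from) the baseline."""
        return [ci_id for ci_id, ci in self.items.items()
                if as_built.get(ci_id) != ci.revision]

# A hypothetical certification baseline for an x-ray-based EDS
cert = Baseline("certification")
cert.add(ConfigurationItem("XR-SRC-01", "hardware", "rev B"))
cert.add(ConfigurationItem("DET-ALG", "software", "3.1.0"))

# Audit an as-built unit against the certification baseline
print(cert.audit({"XR-SRC-01": "rev B", "DET-ALG": "3.2.0"}))  # ['DET-ALG']
```

A real configuration-management tool would also record who approved each revision and when, but the audit step is conceptually this comparison of as-built against baselined revisions.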
Definition of the baseline of explosives-detection equipment (including individual CIs) at the time of certification (a certification baseline) would provide a mechanism for the manufacturer and the FAA to track the degree and criticality of changes to the equipment. Here "degree" refers to the extent of the change (e.g., localized versus all-encompassing) and "criticality" refers to the importance of the item being changed to system performance. For example, changing the configuration of the x-ray detector in an x-ray-based EDS should receive a more thorough review than changing the color of the external cabinet.

Configuration Control

Configuration control, sometimes referred to as change management or change control, is the set of management functions necessary to ensure that compatibility is maintained between all items of a system whenever any single item is changed (Blanchard, 1986). This includes configuration control of a CI (e.g., an x-ray detector) after establishment of the configuration baseline. Changes in configuration are not uniform in their degree or criticality, and, therefore, classifying the impact of a software, firmware, or hardware change is crucial to determining the extent of verification (up to and including recertification) that would be required. Figure 4-2 describes the basic steps in addressing changes to manufactured equipment. Classifying the impact of a change on detection performance and the potential need for recertification is a crucial function. Many companies establish a configuration-control board to evaluate each change in terms of its impact on other configuration items prior to a decision on whether or not to incorporate a change. The concept of a configuration-control board is discussed in more detail in Chapter 5.

Configuration Status Accounting and Auditing

Configuration status accounting involves recording and reporting proposed and approved changes to the baseline configuration of a CI.
This includes a record of the approved CI documentation and identification numbers, the status of proposed changes to CI configuration, and the implementation status of approved changes. A configuration audit is the process of reviewing a CI or system for compliance with the identified configuration. As suggested above, the control function of configuration management is to subject all changes to a CI to review and approval by authorized personnel. Configuration status accounting and auditing are tasks that must be diligently performed to maintain configuration control. The description of how configuration management will be implemented is documented in a configuration management plan. Several software-based configuration-management tools are available to aid in the control task of a configuration management plan, some of which facilitate automated control of CIs.

Figure 4-2 Graphical depiction of configuration control. Source: Logistics Engineering and Management, 5th ed., by Blanchard, Benjamin S., 1998. Adapted by permission of Prentice-Hall, Inc., Upper Saddle River, N.J.

Configuration-Management Tools[4]

[4] This subsection is an overview of configuration management to set the context for recommendations regarding configuration management that are made in this report (see Chapter 5). This subsection is not meant to be a complete tutorial on the subject. For more information, please refer to references on the subject such as Buckley (1993) or Burrows et al. (1996).

There are a variety of software tools that are useful for establishing and maintaining a configuration management plan. Not all configuration management tools are the same. They are derived from different concepts, have different architectures, and are designed to address a variety of user requirements. Given that there are several tools in the marketplace, selecting the appropriate configuration management tool to meet the needs of the FAA as well as manufacturers of explosives-detection equipment is not a trivial problem. Based on user needs, business goals, and long-term plans, the proper configuration management tool may vary from one manufacturer to another, yet all of these tools may still meet the requirements of the FAA. Configuration management tools have matured over the past ten years. Their functionality, quality, usability, and platform coverage have been greatly improved. The tools can provide the necessary automation support to implement a configuration management plan. Current configuration management tools can be categorized into three classes according to their functionality (Figure 4-3).

Figure 4-3 Classes of configuration management tools.

These tools can be delineated in the following manner:
Class 1 - Version-control tools: individual versions of objects (e.g., source code, executables, graphics, x-ray sources, detectors) are archived. Simple problem tracking, if any, and limited parallel development may be supported.

Class 2 - Developer-oriented tools: these include version-control capabilities as well as support for the many needs of teams of developers and managers in parallel development: creating, merging, changing, and releasing products for distribution.

Class 3 - Process-oriented tools: these include version-control capabilities, at least some of the developer-oriented capabilities, the ability to automate software life-cycle flows, the ability to customize the out-of-the-box process model, and an integrated approach to change management in which problem tracking is associated with the code.

Class 3 tools typically have more functionality than Class 2 tools and integrate Class 2 functionality as part of the tool's infrastructure. Similarly, both Class 2 and Class 3 tools have more functionality than Class 1 tools. It is likely that the market will eventually demand a uniform (or standard) model of functionality for configuration management tools. As vendors of the tools move toward a uniform model, the classes could disappear, so that a single standard class of tools would support all configuration management needs. Start-up companies, however, may not require all the functionality that such a standard tool would have and might support a market for lower-cost configuration management tools with less functionality. Then, as a company continues to develop and its demands on a configuration management tool grow, it may require a more sophisticated tool. Given the dynamic nature of the configuration management tool industry and the variance in tool user needs, it would not be appropriate to recommend a specific tool at this time. For examples of currently available configuration management tools, please refer to Appendix B.
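The change-evaluation flow that a process-oriented (Class 3) tool automates can be sketched as a classification step followed by a routing decision, in the spirit of the degree-and-criticality discussion under configuration control. The categories and routing rules below are illustrative inventions, not FAA-defined criteria.

```python
def classify_change(degree: str, criticality: str) -> str:
    """Route a proposed change to a CI based on its degree (extent of the
    change) and its criticality (importance to detection performance).

    degree:      "localized" or "system-wide"
    criticality: "cosmetic", "support", or "detection-critical"

    Returns the verification action required before the change is
    incorporated (illustrative categories only).
    """
    if criticality == "detection-critical":
        # e.g., changing the x-ray detector: may require recertification
        return "full performance verification; consider recertification"
    if degree == "system-wide":
        return "configuration-control board review and qualification test"
    if criticality == "support":
        return "configuration-control board review"
    # e.g., changing the color of the external cabinet
    return "record change; no performance retest"

print(classify_change("localized", "cosmetic"))
# -> record change; no performance retest
print(classify_change("localized", "detection-critical"))
# -> full performance verification; consider recertification
```

In practice a configuration-control board, not a lookup rule, makes this call; the point of the sketch is only that every change is classified before it is incorporated.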
As part of their overall quality systems, the panel recommends that each stakeholder use a configuration management tool of the appropriate class.

Performance Verification

It could be argued that, when comprehensive and flawlessly performed, configuration management alone could guarantee uniform performance of subsequent copies of an EDS. In practice, however, it is impossible to define and control every critical parameter in a manufacturing process or to predict the effect of every change on explosives-detection performance. Therefore, a means of verifying the performance of an EDS is needed to complement the configuration management plan in ensuring operational performance. Performance verification is defined as the process of verifying that explosives-detection equipment complies with the requirements allocated to it. For example, during the research phase, performance verification might include verifying that the proper x-ray attenuation coefficient is measured for a known test material. For deployed (or to-be-deployed) EDSs, performance verification might involve testing to determine whether or not the EDS has a probability of detection (PD) and probability of false alarm (PFA) similar to those of the latest configuration certified by the FAA. For noncertified equipment, performance verification might verify that deployed equipment performs at the same level it did during FAA baseline testing (see below for a discussion of types of testing). The panel found it useful to define seven levels of testing that might take place during the life cycle of an EDS: precertification, certification, baseline, qualification, verification, monitoring, and self-diagnosis (see Table 4-1).
Although other types of testing may be undertaken by the FAA, manufacturers, or end users, the seven listed above are the ones related to certification and certification maintenance.

TABLE 4-1 Seven Proposed Testing Levels during the Life Cycle of an EDS

Precertification. Purpose: determine if the technology is ready for certification testing. Location: manufacturer's site. Test objects: test articles. Frequency: once.(a)

Certification. Purpose: determine if technology performance is at certification level. Location: FAA Technical Center. Test objects: primary standard bag set. Frequency: once.(a)

Baseline. Purpose: establish baseline performance for noncertified equipment. Location: FAA Technical Center. Test objects: primary standard bag set. Frequency: once.(a)

Qualification. Purpose: verify the performance of an individual manufactured unit to qualify that unit for deployment. Location: manufacturer's site. Test objects: secondary standard bag set and test articles. Frequency: once.(a)

Verification. Purpose: verify the performance of an individual deployed unit to confirm that performance is at qualification level. Location: airport. Test objects: secondary standard bag set. Frequency: at specified occasional intervals.

Monitoring. Purpose: verify critical system parameters to confirm the consistency of system performance. Location: airport. Test objects: test articles. Frequency: at specified frequent intervals.

Self-diagnosis. Purpose: verify that subsystem parameters are operating according to specifications for continuous "system health." Location: airport. Test objects: none. Frequency: continuous.

(a) May require retesting until the unit passes the specified test.

The FAA's current certification documentation specifies the testing protocols and test objects to be used for the first three test levels, i.e., precertification, certification, and baseline (FAA, 1993). The remaining four test levels (qualification, verification, monitoring, and self-diagnosis) follow manufacturing and airport deployment and have thus far not been included. Verifying the detection performance of bulk explosives-detection equipment at manufacturing sites and in airports will require two integrated steps: (1) definition of performance specifications for these environments, and (2) development of test protocols that are practical at manufacturing sites and in airports, including the development of appropriate test objects. In both cases performance specifications are central to effectively defining a performance-verification protocol. Performance specifications must be validated for performance verification to be effective. That is, the performance specifications for explosives-detection equipment must represent the performance required to detect explosives in checked baggage. Performance verification and validation will likely be challenging due to variations in explosive threats and changes in explosives-detection equipment design, manufacture, and operational usage. As threats and system configurations change, a means to verify the performance of explosives-detection equipment at airports and manufacturing sites is needed to ensure that performance specifications continue to be met after such changes occur.

Precertification testing allows the FAA to guide manufacturers in the development of new equipment to meet the requirements of explosives-detection certification testing. At this stage of development the FAA and the manufacturer should have established a mutual understanding of the requirements and the technology being tested.
The panel recommends that specified parameters critical to performance be monitored and recorded during precertification testing. At the same time, the manufacturer may take advantage of this testing opportunity to record the response to any of its in-house test articles to enhance its ability to develop future systems quickly and cost effectively.

FAA Certification Testing

The FAA certification process involves testing explosives-detection equipment that has an identified baseline configuration and that has passed precertification testing at the manufacturing facility[5] (FAA, 1993). Certification testing is performed at the FAA Technical Center using, as a primary standard, bags of different sizes and shapes with a variety of explosives located at different positions in the bag and with a variety of normal bag contents that may interfere with detection of explosive materials. Special facilities have been constructed at the FAA Technical Center to allow real explosive materials to be handled and tested. The results of these tests include figures of merit for the probability of detection (PD), the probability of false alarm (PFA), and the baggage throughput rate.

Testing in Airports

In 1995 the FAA Technical Center developed a test and evaluation master plan in support of its Baggage Inspection System Airport Operational Demonstration Project (FAA, 1995). The purpose of this demonstration project was to assess the operational feasibility, suitability, and effectiveness of the FAA-certified InVision CTX-5000-SP EDS and, therefore, to support deployment decisions. The demonstration project, however, does not define a performance-verification protocol for use in routine testing of explosives-detection equipment in airports. The knowledge gained during this demonstration should be incorporated into the development of operational requirements and a performance-verification protocol, including development of appropriate test articles.
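The seven test levels of Table 4-1 lend themselves to a data-driven representation, which makes it easy to answer operational questions such as which tests run at the airport. The sketch below simply restates the table; the field names and abbreviated entries are our own.

```python
from collections import namedtuple

TestLevel = namedtuple("TestLevel", "name location test_objects frequency")

# Restatement of Table 4-1 (entries abbreviated for illustration)
LEVELS = [
    TestLevel("precertification", "manufacturer", "test articles", "once"),
    TestLevel("certification", "FAA Technical Center", "primary bag set", "once"),
    TestLevel("baseline", "FAA Technical Center", "primary bag set", "once"),
    TestLevel("qualification", "manufacturer", "secondary bag set + test articles", "once"),
    TestLevel("verification", "airport", "secondary bag set", "occasional"),
    TestLevel("monitoring", "airport", "test articles", "frequent"),
    TestLevel("self-diagnosis", "airport", "none", "continuous"),
]

def levels_at(location: str) -> list:
    """Names of the test levels performed at a given location."""
    return [t.name for t in LEVELS if t.location == location]

print(levels_at("airport"))  # ['verification', 'monitoring', 'self-diagnosis']
```

Keeping the schedule as data rather than prose would also let a quality system generate per-site test checklists directly from the controlled document.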
Testing and Technology

The technology for detecting explosives in baggage must be capable of detecting an explosive and notifying the operator of the potential presence of a threat. As with many technologies, however, the performance of explosives-detection equipment is subject to practical limitations. These limitations arise because the equipment performs a complicated task with incumbent confusion (or noise), which results from the variety of baggage that passes through the system, from the nonthreat contents within each bag, and from the variability of the measurement equipment. In addition, different technologies for explosives detection will have different critical performance parameters. Thus, the specifics (e.g., test articles, critical system parameters) of performance verification are likely to vary from one explosives-detection technology to another. The general performance-verification protocol (e.g., logistics), however, should apply to several different technologies. In developing a performance-verification protocol, the characteristics that define explosives-detection equipment must be considered. Any specific performance-verification protocol must be tailored to the particular type of equipment being tested, but an early identification of the different components and subsystems making up the equipment (as discussed in Chapter 3) will increase the efficiency and rigor of the protocol.

[5] In cases where manufacturers do not have the license or facilities to handle explosive materials, portions of precertification testing may be conducted (by the manufacturer) at the FAA Technical Center using a bag set that is entirely different from the bag set used for certification testing. In addition, precertification testing typically includes operational testing at airports using actual passenger baggage (without explosives or simulants) to collect false-alarm and throughput-rate data.
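Verification testing ultimately reduces to estimating PD and PFA from a finite bag set, so the statistical uncertainty of those estimates matters when comparing a deployed unit against its certification figures. The sketch below uses a normal-approximation confidence interval; the method and the counts are our illustrative choices, not the FAA's protocol.

```python
import math

def proportion_ci(successes: int, trials: int, z: float = 1.96):
    """Point estimate and approximate 95% confidence interval for a
    detection (or false-alarm) probability, via the normal approximation
    to the binomial."""
    p = successes / trials
    half = z * math.sqrt(p * (1 - p) / trials)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical verification run: 92 of 100 threat bags alarmed,
# 6 of 200 clean bags alarmed.
pd, pd_lo, pd_hi = proportion_ci(92, 100)
pfa, pfa_lo, pfa_hi = proportion_ci(6, 200)
print(f"PD  = {pd:.2f} [{pd_lo:.2f}, {pd_hi:.2f}]")
print(f"PFA = {pfa:.3f} [{pfa_lo:.3f}, {pfa_hi:.3f}]")
```

The width of these intervals is one reason airport verification uses a controlled secondary bag set at specified intervals rather than a handful of ad hoc runs: small samples cannot distinguish a unit at the certified level from one slightly below it.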
Approaches to Test and Evaluation

As indicated above, the FAA certification process relies almost entirely on test and evaluation, at the FAA Technical Center, of a complete unit of explosives-detection equipment (FAA, 1993). As such, certification testing incorporates all of the factors that contribute to the spread of the decision variables (physical parameters measured), as depicted in Figure 4-4. (For a complete discussion of the threat-decision paradigm, see Appendix C.) The inclusion of all factors during a test is not the only, nor the most efficient, way to evaluate the performance of explosives-detection equipment. The capability of explosives-detection equipment to detect explosives can be estimated through the measurement of physical parameters associated with operational subsystems of the equipment. For example, equipment using image data for classification of objects may be partially evaluated through the measurement of parameters such as contrast sensitivity, system spatial resolution, and the quantity and character of noise and related parameters (ICRU, 1995). This International Commission on Radiation Units and Measurements report describes a means of evaluating the random fluctuations due to statistical processes by characterizing the manner in which the noise power is distributed over spatial frequency. Although this represents a fairly sophisticated measure of noise, given the digital nature of the image from a CT system, there is no reason why similar measures could not be performed on an x-ray CT-based EDS at the factory, the FAA Technical Center, or the airport facility. The American Association of Physicists in Medicine (AAPM) also provides some candidates for test articles and test protocols to be used to make performance measurements related directly or indirectly to contrast, spatial resolution, and the quantity and character of noise (AAPM, 1993).
In addition to the test articles described in this AAPM report, several commercial sources[6] are available for purchasing test articles for making similar measurements that have been used to test medical CT systems. Tests using such articles could be performed in the factory or at an airport. For an x-ray CT-based EDS in the airport environment, the resolution of alarms typically requires that an operator scrutinize an image and, therefore, will depend on the quality of the image on the system's video monitor. Any test and evaluation plan must include this component of the entire imaging system. The approach in the medical imaging community to testing this component is to use digital test patterns such as the Society of Motion Picture and Television Engineers (SMPTE) digital test pattern (Gray et al., 1985). Again, because this type of test is straightforward, an equivalent test could be used not only in the factory but also at the airport facility to test the display device on the CT-based EDS.

Figure 4-4 Factors contributing to the spread of the measured physical parameter(s).

Manufacturers of explosives-detection equipment based on other technologies could identify parameters indicative of their systems' performance. The manufacturer has unique knowledge about which critical parameters and test articles are appropriate for its system, but it is to all stakeholders' advantage to identify these at the precertification stage and to maintain them throughout the life cycle of the EDS. In addition to relating explosives-detection equipment performance to specific system parameters, there is a need for the FAA to continue development of materials that simulate explosives. These simulants could be used in a secondary standard bag set that yields test results that correlate strongly with the results obtained from testing with the primary standard bag set (see, for example, Annex II in NRC, 1993).
These simulants must be matched to specific explosives and to specific explosives-detection technologies. Just as the FAA controls the explosives used in the primary standard bag set, the simulants used in the secondary standard bag set need to be controlled to ensure that they continue to accurately represent the explosive threat (i.e., explosives at threat level). Factors that need to be considered in controlling the simulants include changes to the primary threat and the shelf life of the simulants (e.g., degradation of their chemical and physical properties over time). Furthermore, the configuration of the secondary bag set would need to be controlled.

Quality Systems and Standards

The total quality system is the agreed company-wide and plant-wide operations work structure, documented in effective, integrated, technical, and managerial procedures, for guiding the coordinated actions of the work force, the machines, and the information of the company and plant in the best and most practical ways to assure customer quality satisfaction and economical costs of quality (Feigenbaum, 1983).

[6] Some sources that the panel is aware of are Computerized Imaging Reference Systems, Inc., Norfolk, Va.; Nuclear Associates, Carle Place, N.Y.; and Gammex RMI, Middleton, Wis. This list is provided as a resource only; the panel does not endorse any specific company or product.
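The requirement that secondary (simulant) bag-set results correlate strongly with primary bag-set results can be checked numerically, for example with a correlation of per-bag detection scores; a drop in correlation would flag simulant degradation. The sketch below is illustrative only: the scores and the acceptance threshold are invented, and a real protocol would use the FAA-controlled bag sets.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical detection scores for the same bag configurations tested
# with real explosives (primary set) and with simulants (secondary set).
primary   = [0.91, 0.84, 0.97, 0.72, 0.88, 0.95]
secondary = [0.89, 0.80, 0.96, 0.75, 0.85, 0.93]

r = pearson(primary, secondary)
print(f"correlation = {r:.3f}")
if r < 0.9:  # illustrative acceptance threshold
    print("simulant set may no longer represent the threat; investigate")
```

Tracking this statistic each time both sets are run would give an early, quantitative warning that a simulant's shelf life has been exceeded or that the threat definition has drifted.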
A quality system is a model for quality assurance in design, development, production, installation, operation, and maintenance. The quality system balances basic principles of quality control, such as configuration management and performance verification, facilitating the synergistic use of these tools to ensure the performance of explosives-detection equipment. An effective, thorough quality system is crucial to maintaining the performance of explosives-detection equipment. For the operational performance of explosives-detection equipment to be ensured during its life cycle, the key attributes of a quality system must be defined and supported by the management of each stakeholder, as well as by the personnel responsible for quality assurance. Adequately addressing the key attributes of a quality system, as defined in Box 4-1, has been found to be crucial to the success of a quality system.

BOX 4-1 Attributes of an Effective Quality System

Quality planning: defining methods that will ensure the compatibility of the design, the production process, installation, servicing, inspection, and test procedures and the applicable documentation.

Design control: establishing and maintaining a process to control and verify the design of the product to ensure that the specified requirements are met. This attribute, along with the attribute of document and data control, is the basis for the activities that are defined in configuration management.

Document and data control: establishing and maintaining a process to provide documentation, which identifies the key quality activities, throughout the organization to properly design, plan, and perform the necessary work.

Process control: identifying and planning the production, installation, and servicing processes that directly affect the quality of the product. Compliance with reference standards and regulatory requirements is accounted for by integrating them with the processes that are to be controlled.

Inspection and testing: establishing and maintaining procedures for performance verification to verify that the specified requirements are being met. This activity takes into account the receipt of materials and processes, the in-process activities during the build of the product, and the final review of the product before delivery. These levels of inspection and testing correlate the designed configuration of the system with the as-built configuration and verify that the product performs as specified by regulatory and customer requirements.

Control of inspection, measuring, and test equipment: establishing and maintaining the appropriate equipment with which to monitor the entire process and measure the results of performance-verification activities.

Control of nonconforming product: establishing and maintaining procedures and processes by which products that do not conform to the specified regulatory or customer requirements are removed from the process and prevented from being accidentally installed and delivered.

Corrective action: establishing and maintaining a process to identify the origin of a problem (e.g., product performance or manufacturing processes) and implement a solution to prevent problem recurrence.

Training: establishing the job proficiency levels required for the manufacture, operation, and maintenance of a product (including configuration management and performance verification) and providing appropriate information, instruction, and resources to raise personnel skill levels to ensure that customer and regulatory quality requirements are met.

Standards for Panel Consideration

There are likely as many quality systems in practice as there are organizations that use quality systems. Each one is tailored to meet the specific needs of a particular organization. The panel considered the following quality standards from government sources and commercial industry for the FAA to use in developing its framework for a quality management program:

U.S. Department of Defense standard MIL-Q-9858A (DOD, 1963)

FAA Type Certification (for certification of airplanes and components; 14 Code of Federal Regulations, 1997)

FDA's Good Manufacturing Practices (21 Code of Federal Regulations, 1995)

ISO 9000 series of quality system standards (ISO, 1994)

North Atlantic Treaty Organization standards AQAP-130 (NATO, 1993a) and AQAP-131 (NATO, 1993b)

In addition, the panel considered the use of a quality standard developed by the FAA in-house, as has been done by various private companies. The U.S. Department of Defense and the North Atlantic Treaty Organization quality systems and standards were eliminated from further consideration by the panel because they have been (or are in the process of being) canceled without replacement, with the caveat that commercially available quality standards are to be used as a model for developing future quality systems. The panel furthermore eliminated from consideration the development (by the FAA) of a unique non-standards-based quality system for use by all manufacturers of explosives-detection equipment: commercial quality standards exist that could serve the needs of the FAA, the manufacturers, and the end users, so the development of a new, unique quality system is unnecessary.

Based on the expertise of the panel members, testimony from outside experts, and review of relevant literature, the panel critically assessed three quality standards for use by the FAA:

FAA Type Certification (14 Code of Federal Regulations, 1997)

the Food and Drug Administration's Good Manufacturing Practices (21 Code of Federal Regulations, 1995)

the ISO 9000 series of quality system standards (ISO, 1994)

Each of these quality standards has the following characteristics:

It requires that a documented quality system be in place.

It requires a documented method of configuration management that incorporates the needs of the stakeholders (i.e., the FAA, manufacturers, air carriers, and airports).

It requires that the manufacturer be accountable for the product up to and including delivery of the product to its operational environment. Included in this accountability is the provision of a maintenance plan and information regarding product upgrades.

Figure 4-5 ISO 9000 standards and guidelines. The emphasis of this report is on the compliance standards ISO 9001, ISO 9002, and ISO 9003. Source: Lockheed Martin Corporation.
Because of its flexibility in application, the availability of companies that provide third-party monitoring and reporting, and the global move toward the ISO 9000 series of quality standards, the panel believes that ISO 9000 holds the most promise for addressing the quality system needs of the FAA, as discussed in the next subsection. Discussion of FAA Type Certification and the FDA's Good Manufacturing Practices can be found in Appendix D.

ISO 9000 Series Quality Standards

The ISO 9000 series international standard specifies quality system requirements for use when a supplier's capability to design and supply products that meet a designated standard needs to be demonstrated (ISO, 1994). The ISO 9001 compliance standard covers design, development, production, installation, and servicing. ISO 9002 covers only production, installation, and servicing, and ISO 9003 covers final inspection and testing (see Figure 4-5). The ISO 9001 and ISO 9002 standards guide the quality systems of numerous domestic and foreign organizations. The ISO 9000 series quality standards have four basic requirements:7

7 The requirements listed for ISO 9001 refer to the "manufacturer" as the actor; the panel believes that these requirements are equally applicable to the FAA and to the users of explosives-detection equipment.
- Executive management of the manufacturer must be responsible for defining and documenting its policy for quality. This policy should be relevant to the manufacturer's organizational goals; the needs of its customers; and, in this case, the requirements of the FAA. A key requirement of the ISO 9000 series standards is that senior management, not department-level management, is responsible for the quality system.
- To provide structure and consistency to the quality system, all basic processes (e.g., design, contract review, procurement, manufacturing, inspection, and testing) must be documented. The quality system should be flexible in application while providing a consistent structure for manufacturing employees to follow.
- Performance verification (confirmation that the equipment fulfills the specified requirements) and validation (confirmation that the specified requirements satisfy customer/regulator needs) are required throughout the life of the product.

Part of the ISO 9000 series quality standards is a set of guidance documents that can be used to help interpret and tailor the requirements of the standards to a specific business (ISO, 1994):

- ISO 9000-1. Quality management and quality-assurance standards (Part 1: Guidelines for selection and use)
- ISO 9000-2. Quality management and quality-assurance standards (Part 2: Generic guidelines for the application of ISO 9001, ISO 9002, and ISO 9003)
- ISO 9000-3. Quality management and quality-assurance standards (Part 3: Guidelines for the application of ISO 9001 to the development, supply, and maintenance of software)

There are several advantages to using the ISO 9000 series quality standards as a basis for the FAA to develop its quality system for maintaining consistency in the regulation of EDSs. A quality system that meets the intent of (is consistent with) ISO 9001 or ISO 9002 will provide the framework necessary for the FAA to regulate the manufacture of EDSs consistently.
Furthermore, the manufacturer of EDSs retains the flexibility to develop a quality system that supports its specific product and processes. At the same time, ISO 9001 and ISO 9002 provide the FAA with a standard mechanism by which to audit, and thereby regulate, several different manufacturers through a third party at no cost to the FAA. Finally, ISO 9001 certification or registration will likely be necessary for explosives-detection equipment manufacturers to compete in the international marketplace.

The flexibility of the ISO 9000 series quality standards can, however, result in a quality system that is not sufficient for EDS manufacture. That is, a manufacturer's quality system may fail to comprehensively track the production of the EDS yet still meet the intent of the ISO 9000 series quality standards. This point is crucial: although a quality audit will verify that the manufacturer is following its quality system, it does not validate that the quality system results in the production of a certifiable EDS. It will be critical for the FAA, the manufacturers, and the users to work together as experience is gained from deploying EDSs to ensure that appropriate measurements and quality systems are in place. The auditing process required to gain ISO 9000 certification does, moreover, incur expenses, which are discussed in Appendix E. Because the auditing process is crucial to determining that the quality system of each stakeholder is functioning correctly, the panel emphasizes that the cost of performing periodic audits is part of regulating and complying with regulation and must be considered an integral part of the cost of certification.