9
Human Factors

Human factors should be an important consideration in the deployment of EDSs, noncertified bulk explosives-detection equipment, and TEDDs. The primary performance measures for all of these systems are throughput and operational errors, which are often considered to be human errors. Even when the equipment itself fails, an analysis usually reveals an underlying human error in management, installation, or maintenance. For this reason, the performance of equipment in the laboratory should not be considered an estimate of system performance but rather an upper bound on it. In congressional testimony by the U.S. Department of Transportation Inspector General's Office, human factors were acknowledged to be crucial to system effectiveness (DOT, 1998). A telling example (detailed in Chapter 10) is the sensitivity of overall system effectiveness to human performance in resolving alarms of the CTX-5000 SP. Human performance can be improved by improving training or by changing the equipment design and operating procedures.

Human performance, as measured by Pd and Pfa, varies with the configuration and use of the EDS. For example, Pd and Pfa of the human/EDS system are sensitive to throughput rate, which is highly dependent on how bags are selected for screening. Thus, human factors should be explicitly considered throughout the design and deployment process rather than treated as a late addition. Inherent deficiencies in a particular system cannot be remedied by post facto operator training.
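
To make the distinction between equipment performance and system performance concrete, the sketch below works through a simple two-stage model in which every machine alarm must be resolved by a human operator. The model, its independence assumption, and all of the numbers are illustrative assumptions, not measured values from this study.

```python
# Minimal sketch (hypothetical numbers): Pd and Pfa of a machine/operator
# system in which the operator must resolve every machine alarm.
# Assumes the machine and operator stages are statistically independent.

def system_performance(pd_machine, pfa_machine, p_confirm_threat, p_clear_innocent):
    """Return (Pd, Pfa) for the combined machine + operator system.

    pd_machine       -- probability the machine alarms on a real threat
    pfa_machine      -- probability the machine alarms on an innocent bag
    p_confirm_threat -- probability the operator correctly keeps a true alarm
    p_clear_innocent -- probability the operator correctly clears a false alarm
    """
    pd_system = pd_machine * p_confirm_threat             # threat must survive both stages
    pfa_system = pfa_machine * (1.0 - p_clear_innocent)   # false alarm survives only if not cleared
    return pd_system, pfa_system

# Illustrative only: a machine Pd of 0.95 degrades to a system Pd of 0.76
# if the operator confirms only 80 percent of true alarms.
print(system_performance(0.95, 0.30, 0.80, 0.90))  # -> approximately (0.76, 0.03)
```

Under this assumed model, the laboratory figure is indeed only an upper bound: the product of the two stages can never exceed the machine's own Pd.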

Models of Bulk and Trace Screening

Although bulk and trace explosives-detection equipment are based on different technologies, have different human interfaces, and are deployed in different branches of the TAAS, both are inspection systems intended to detect threats with minimal false alarms and maximum throughput. Currently, a common inspection model is used to evaluate the deployment of both kinds of systems. This model (Drury, 1989), which was derived from earlier models (Harris and Chaney, 1969; Sinclair and Drury, 1979), has long been used for industrial inspections and aviation structural inspections. In Figure 9-1, the functions are defined in generic terms along the center column (i.e., without reference to specific hardware or applications). Each function can be assigned to a human operator, to a machine, or (occasionally) to parallel human and machine systems. In the deployment of bulk and trace systems, the allocation of function is part of the system development process. Thus, deployment involves managing installation and operation rather than reallocating functions. Operational data may reveal, however, that functions were not allocated optimally or that the implementation of particular functions could be improved. The side columns of Figure 9-1 show whether each generic function has been allocated to a human or a machine. Note the distinct differences between the bulk and trace systems, which demand different capabilities of their human operators.

[Figure 9-1: Role of the human operator in explosives detection.]

Figure 9-1 shows only one component of the overall TAAS. This component is linked to other components by the way bags arrive for screening and by the alarm-resolution procedures. For example, if a TAAS configuration has two EDSs in series, with alarms on the first EDS resolved by the second, the human decision at the first EDS may not be critical. If noncertified bulk explosives-detection equipment is used that has no image display but only a go/no-go indicator, the operator is not involved in determining whether there is an alarm, only in resolving it.
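
The following sketch illustrates the two-EDSs-in-series configuration just described. As with the earlier sketch, the independence assumption and all of the numbers are hypothetical.

```python
# Sketch (hypothetical numbers, assumes the two machines alarm independently):
# two EDSs in series, with every alarm from the first re-screened by the second.

def series_eds(pd1, pfa1, pd2, pfa2):
    """A bag is flagged overall only if both machines alarm on it."""
    pd_series = pd1 * pd2      # a threat must alarm on EDS 1 AND EDS 2
    pfa_series = pfa1 * pfa2   # an innocent bag must falsely alarm twice
    return pd_series, pfa_series

# Illustrative only: chaining two machines cuts false alarms sharply
# (0.30 x 0.30 = 0.09) at some cost in detection (0.95 x 0.95 = 0.9025),
# without any human decision at the first EDS.
print(series_eds(0.95, 0.30, 0.95, 0.30))
```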

Factors that Affect Human and System Performance

In Table 9-1, the factors that affect human and system performance are broken down by task, operator, machine, and environment. In bulk systems, the visual search function is allocated to machine hardware, a reasonable first step because the human visual search function is uniformly unreliable (Hou et al., 1994). In TEDDs, however, the search function involves an operator wiping or passing a wand over the bags, a human function of uncertain reliability.

As Table 9-1 also shows, many management support issues affect the operator and, hence, the performance of security equipment. First, management must actively ensure that procedures are followed (e.g., maintenance, alarm resolution) and that decisions are made (and actions taken) based on security concerns. The clearance of an alarmed bag represents a potentially dangerous situation because explosives are more likely to be present in alarmed bags than in unalarmed bags. Therefore, clearing or not clearing a bag can be a perceptually difficult and emotionally charged decision. The operator requires management backup in these decisions and in the application of resolution procedures. Second, management is responsible for system-wide conditions that affect operator performance (e.g., the number and selection of bags reaching the operator, time pressures on the operator, training, and feedback to the operator). Third, management is responsible for job design, which can affect, for example, operator turnover. A well-designed job affords operators great latitude in making decisions (i.e., control of their own work), low levels of psychological demand (i.e., workload and work difficulty), and high levels of support (i.e., from supervisors and peers) (Karasek and Theorell, 1990). Traditional x-ray screening jobs have few of these desirable characteristics and, in addition, have socially undesirable hours and low pay. Therefore, high levels of turnover are to be expected and have been noted in previous NRC reports (1996a, 1997). Job design is equally important for bulk and trace explosives-detection equipment. Because additional operator training is required (particularly for bulk EDSs), management has both an incentive and an opportunity to rethink job designs to reduce the turnover of trained operators.

Table 9-1 also shows operator/machine interfaces and personnel issues, such as operator abilities (determined by selection, retention, or job rotation) and operator training (quality and timeliness). For bulk equipment, such as the CTX-5000 SP, the human/machine interface is a manipulable screen image, usually in false color, with the suspect area highlighted. The operator may have to access other views of the highlighted area to determine its spatial surroundings or even call for more data, such as additional scans.

TABLE 9-1 Factors That Affect Operator and System Performance

Setup
• Task: (1) Calibration procedures; (2) Maintenance procedures; (3) Time available for tasks
• Operator: (1) Ability to follow procedures accurately; (2) Training (quality and timeliness) for calibration and maintenance
• Machine: (1) Operator interface for calibration; (2) Operator interface for maintenance
• Environment: (1) Management support of equipment calibration and maintenance; (2) Location of equipment; (3) Availability of equipment operating procedures and other job aids

Presentation
• Task: (1) Baggage selection
• Operator: (1) Training
• Machine: (1) Physical layout of equipment
• Environment: (1) Management control over baggage selected; (2) Perceived and actual passenger pressure

Search (a)
• Task: (1) Defined wand search pattern
• Operator: (1) Training (quality and timeliness) for search pattern; (2) Operator dexterity
• Machine: (1) Human interface (e.g., wand, alarm indicator)
• Environment: (1) Management support to ensure procedures are followed; (2) Perceived and actual passenger pressure

Decision
• Task: (1) Pd; (2) Pfa; (3) Time available
• Operator: (1) Knowledge of threats; (2) Knowledge of potential false-alarm items; (3) Experience, overall and recent; (4) Ability to make a decision; (5) Resistance to passenger pressure
• Machine: (1) Operator interface to display design; (2) Operator interface to acquire additional information
• Environment: (1) Management support for following correct decision procedures; (2) Perceived and actual passenger pressure

Response
• Task: (1) Alarm-resolution procedures; (2) Management provision of feedback
• Operator: (1) Knowledge of alarm-resolution procedures; (2) Resistance to passenger pressure
• Machine: (1) Interface to other systems
• Environment: (1) Management support for alarm-resolution procedures; (2) Perceived and actual passenger pressure

(a) Note that search has been largely allocated to machine functions for bulk explosives detection.

Even if operators can confidently classify the alarmed area as a false alarm, they must still perform a visual search (which has potentially limited reliability) of the rest of the image, because a known false alarm in a bag could be a diversionary tactic to draw attention away from a smaller true threat in the same bag.

The operator/machine interfaces of TEDDs, which are capable of identifying specific chemical species, are simpler because chemical false alarms are less frequent. Thus, the interface can be as simple as a two-state indicator (alarm/no alarm). In TEDDs, the interface is not specifically designed to assist the operator in resolving alarms, and alarm resolutions are typically procedural (e.g., eliciting information from the passenger whose bag caused the alarm).

Deployment Issues

As explosives-detection equipment has been deployed (and continues to be deployed), concerns have arisen about physical installation, the training of security personnel, and the integration of the equipment into ongoing security and baggage-handling operations. One particular concern is that the CTX-5000 systems in many locations are operating at a fraction of their rated capacity (DOT, 1998). This underutilization may be attributable partly to alarm-resolution time, partly to location, and partly to the number of bags being sent to the machines. Underutilization poses a potential problem for the maintenance of operator skills, particularly the skills required for resolving alarms, because underpracticed skills often deteriorate. At some locations, the throughput rate has been so low that operators could even lose their skills for operating the equipment. Reliable data on the improvement or deterioration of screening skills as a function of experience are sparse, but skill maintenance has been a problem in other disciplines.

In analogous military systems with rare threats, the military's solution has been to increase training, particularly embedded training (i.e., the introduction of simulated threats into the system during operations) (Walsh and Yee, 1990). Responses to these simulated threats can be analyzed quickly and feedback provided to the operator. A technology that could be used for embedded training in airport security screening, the threat image projection system (TIPS), has been developed but is not being used on all conventional x-ray scanners or bulk explosives-detection systems. With an in-service measurement system, such as TIPS, it should be possible to establish a performance timeline, including an initial learning curve, long-term fall-off in performance, and the benefits of recurrent training.
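
A sketch of how TIPS-style injection records could be turned into such a performance timeline appears below. The record format, the window size, and the simulated learning-then-decay curve are all invented for illustration; only the general approach (rolling hit rates over known challenges) follows from the text.

```python
# Sketch: building an operator performance timeline from embedded-training
# injections. Each record is (injection_index, hit), where hit = 1 means the
# operator flagged the projected threat image. All data here are simulated.
import random

def rolling_pd(records, window=25):
    """Rolling estimate of operator Pd over the most recent `window` injections."""
    hits = [hit for _, hit in records]
    return [sum(hits[i - window:i]) / window for i in range(window, len(hits) + 1)]

# Simulate a hypothetical operator: a learning curve over the first 50
# injections, then a slow fall-off in performance after injection 100.
random.seed(1)
log = []
for i in range(200):
    p_hit = 0.6 + 0.3 * min(i, 50) / 50 - 0.002 * max(0, i - 100)
    log.append((i, int(random.random() < p_hit)))

timeline = rolling_pd(log)
print(f"early: {timeline[0]:.2f}  peak: {max(timeline):.2f}  late: {timeline[-1]:.2f}")
```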

The deployment of TEDDs has led to some integration problems (e.g., calibration and maintenance), but fewer than have been encountered with bulk explosives-detection equipment. The FAA has not provided realistic, precisely specified random challenges for testing TEDDs and giving operators feedback, although such challenges have been developed for other equipment used at security checkpoints and in the checked-baggage stream (e.g., FAA test objects, TIPS). Currently, performance measurement for TEDDs is not well coordinated, and feedback depends on the items presented by passengers and the reliability of alarm resolutions.

One problem common to the development and deployment of both bulk and trace explosives-detection equipment is a lack of human-factors support. Few human-factors engineers are familiar with security issues, and human-factors engineers from other domains have not received enough security-specific training to be immediately helpful. Although the FAA has expanded its efforts in this area, its resources are limited.

A number of human-factors issues discussed in this chapter have become apparent only as explosives-detection equipment has been deployed. Although none of these issues is new, at least to human-factors professionals, the fact that they are being raised as primary concerns now shows that they were not addressed during the research and development phase that preceded this deployment (DOT, 1998).

The most serious issue is that the FAA certifies the equipment only and not the human/machine system. Thus, certification provides only an upper bound on total system performance, and when the system is deployed, the performance level of the equipment/operator system is sometimes well below the performance level of the equipment alone. The FAA should continue to certify equipment but should also certify that the human/equipment system meets performance requirements. Thus, if the operator must resolve each alarm, the operator should have demonstrated that he or she can correctly identify threats and correctly clear nonthreats with defined probabilities. If equipment manufacturers know that systems will be tested as a whole, they will have to consider the human role in the system and design displays and procedures that support it:

• Bulk explosives-detection equipment must provide support for alarm resolution. The CTX-5000 produces a visual display, but whether this is the optimum way to support alarm resolution has not been determined.

• Trace explosives-detection equipment must be easy to maintain and calibrate. The contribution of TEDDs to overall system performance (i.e., to the TAAS) cannot be assessed unless the equipment operates with known detection parameters.

Management support is essential for many operator functions. Responsive management practices (e.g., setting and maintaining policies that emphasize effectiveness rather than throughput), appropriate job aids (e.g., devices for calibrating and maintaining trace equipment), and the time and training to use them all affect operator performance. A job with a very low ratio of job-tenure time to skill-acquisition time raises questions about job design and management commitment. The addition of new technology that is not well integrated into the overall security system only increases the demands on equipment operators.

Timely, controlled feedback is essential both to the development and maintenance of high performance levels and to embedded training. If the only feedback operators receive is the outcome of imperfect alarm-resolution procedures, managers have no control over its quality or timing. Effective feedback requires systematically challenging the system with well characterized simulated threats. Current simulated threats are FAA test objects and modular bomb test sets (MBTSs). TIPS can be used with the CTX-5000, although it may not work on all current x-ray systems, and an equivalent of TIPS will have to be developed for trace equipment. On a larger scale, challenges can be provided by one-time evaluations, such as double-blind testing,1 but because they are conducted only occasionally, these are more useful for audits than for regular feedback.

The FAA test objects are currently used to evaluate the performance of airport security systems on a regular basis, with specified procedures for dealing with detection failures. These tests should be expanded and revised for both current systems and newly deployed technologies. Overall performance could be improved by testing with more realistic threat objects.

Once a performance measurement system is in place, the FAA should consider certifying individual equipment operators. The FAA could issue performance standards and conduct evaluations or examinations, as it does for other occupations in civil aviation, such as pilots or maintenance technicians. However, the responsibility for training operators to FAA standards should rest with private industry, such as airlines, security companies, and trade schools. The introduction of new equipment, which increases the complexity of the job, could then provide for job progression, which—coupled with redesigned jobs and operator licensing—could increase job tenure and reduce turnover. The annual turnover rate is more than 100 percent in many locations, and with longer training times (for CTX operators), the personnel problem is becoming more acute. The measures suggested here could lower turnover rates by addressing underlying reasons for worker turnover.
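
If operator certification were tied to defined probabilities, as suggested above, one plausible way to frame the pass/fail decision is a one-sided lower confidence bound on an operator's recorded hit rate. The sketch below only illustrates that idea; the required Pd, the confidence level, and the counts are hypothetical, not FAA standards.

```python
# Sketch (hypothetical standard): has an operator's challenge record
# demonstrated a detection probability of at least `required_pd`?
# Uses a Wilson one-sided lower bound on the binomial hit rate.
import math

def wilson_lower_bound(hits, trials, z=1.645):  # z = 1.645 ~ one-sided 95 percent
    """Lower confidence bound on the operator's true hit probability."""
    if trials == 0:
        return 0.0
    p = hits / trials
    denom = 1 + z * z / trials
    center = p + z * z / (2 * trials)
    margin = z * math.sqrt(p * (1 - p) / trials + z * z / (4 * trials * trials))
    return (center - margin) / denom

def meets_standard(hits, trials, required_pd=0.85):
    return wilson_lower_bound(hits, trials) >= required_pd

# Illustrative only: 92 detections in 100 challenges demonstrates Pd >= 0.85
# with high confidence; 9 of 10 does not, because 10 trials are too few.
print(meets_standard(92, 100), meets_standard(9, 10))  # -> True False
```

One design implication of such a bound is that certification demands enough recorded challenges per operator, which argues for routine, embedded testing rather than occasional audits.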

Current evaluations of both older screening systems and the newly deployed systems are based on direct performance measures, including missed detections, false alarms, and response times. However, measuring performance does not in itself explain why that performance occurred or how it could be improved. For example, if a bulk system alarm is resolved improperly by an operator and results in a missed detection, the reason the error occurred must be determined. Possible causes could include a poor display for the operator, the assumption that an alarm is triggered by a familiar "recognized" material, failure to obtain the correct additional information, time pressure, and perceived customer/management pressure. In any particular case, the causal factors usually lead back to underlying latent failures, such as inadequate training, poor management, or even improper allocation of function in the design (Reason, 1990).

Detailed error analyses will increase the cost of data collection and will require at least a function-level model of the screening process (Neiderman and Fobes, 1997). However, the data would provide much-needed guidance for raising performance levels. The same analyses could be used to determine the reasons for successful (or correct) decisions by equipment operators.
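
As an illustration of what a function-level error record for such analyses might look like, the sketch below pairs each operational error with the proximate causes listed above and a latent-failure category in the sense of Reason (1990). Every field name and category here is an invented example, not an FAA or NRC data standard.

```python
# Sketch: a minimal record type for function-level error analysis.
# All field names and category lists are hypothetical examples.
from dataclasses import dataclass, field

PROXIMATE_CAUSES = {
    "poor_display", "assumed_recognized_material",
    "missing_additional_info", "time_pressure", "customer_pressure",
}
LATENT_FAILURES = {"training", "management", "function_allocation"}  # cf. Reason (1990)

@dataclass
class ScreeningError:
    function: str                  # function from Figure 9-1, e.g., "decision"
    outcome: str                   # "missed_detection" or "false_alarm"
    proximate_causes: list = field(default_factory=list)
    latent_failures: list = field(default_factory=list)

    def validate(self):
        """Keep analyses consistent by restricting causes to known categories."""
        assert set(self.proximate_causes) <= PROXIMATE_CAUSES
        assert set(self.latent_failures) <= LATENT_FAILURES

# Hypothetical example: an improperly resolved bulk-system alarm.
err = ScreeningError(
    function="decision",
    outcome="missed_detection",
    proximate_causes=["assumed_recognized_material", "time_pressure"],
    latent_failures=["training"],
)
err.validate()
print(err)
```

The same record type could also log correct resolutions, supporting the analysis of successful decisions mentioned above.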

Conclusions and Recommendations

Human operators are integral to the performance of all deployed explosives-detection equipment. Because fully automated explosives detection is not likely to be developed in the foreseeable future, particularly with respect to alarm resolution, human operators will continue to be immensely important to realizing the potential of deployed security hardware. Current certification testing of explosives-detection equipment, however, defines only the operational capability of the equipment, and human factors have resulted in operational performance levels below the certified detection capability of that equipment. Thus, the human operator/equipment combination should also be required to meet performance requirements for FAA certification. Furthermore, standards of operator performance for all systems should be raised and then monitored, and the results should be regularly provided to airlines.

Recommendation

In addition to certifying explosives-detection systems, the FAA should ensure that these systems can be operated (by a human operator) at a specified probability of detection, probability of false alarm, and throughput rate.

Recommendation

The FAA should deploy more human-factors experts throughout the system. Human-factors training should be provided for at least some system managers and operators.

Recommendation

The FAA should initiate a program to improve operator performance that includes the following elements:

• measurements of the performance of the equipment/operator combination

• valid and reliable challenges (tests) for system components

• embedded training to improve and maintain operator performance

• improved test and qualification procedures for operators

1 An example of double-blind testing is an FAA employee posing as a terrorist trying to sneak a simulated bomb onto an airplane without the knowledge of the security equipment operators.
