Flight to the Future: Human Factors in Air Traffic Control

11 Human Factors and System Development

There are three modes by which the human factor enters into the system development process. The first mode is based on existing data. As described in Chapter 10, research reports, textbooks and handbooks are the sources of such data. In many cases, the first mode involves the application of standard human factors principles from such sources when design decisions are being made. Since these principles incorporate the physiological and psychological capabilities of humans, system performance is not likely to be impaired by faulty design when this mode is properly carried out. The second mode is called advanced applications. In this mode, judgments are based on psychophysiological theories, and findings from nonsystem-specific research are used to inform the design decisions. A specific example is the design of the garments worn by astronauts in extravehicular activities. The designs had to be completed before any actual experience could be accumulated. Consequently, the arrangement of environmental status controls was determined on the basis of experiences with other types of protective garments. The values realized from the employment of this mode can be substantial (Karat, 1992). The third mode is more speculative. It comes into play when technological advances or changes in the operational situation are so large that conclusions about design options cannot be based on data or experience. The problem for the human factors expert in this mode is often that of a choice between extrapolating from what is already known or calling for focused research to resolve the uncertainty about the decisions to be made. Although the research option generates costs of time and funds, the answers to design questions are likely to be much
better than those obtained by extrapolation—particularly if the research involves extensive transactions with users (Bikson, Law et al., 1995). This chapter examines the possibilities for ensuring the safety and efficiency of the air traffic control system and related systems by the inclusion of human factors—using whatever mode is appropriate—during the system development process. The basic warrant for such an enterprise is that the current national policy stipulates that, no matter how sophisticated the air traffic control system becomes with respect to its inclusion of extensive computer capabilities, there will remain a human presence with operational decision making and management at its core (Federal Aviation Administration, 1995). Even if such a doctrine were not in effect, insofar as the system is operated to achieve human purposes and to serve human clients and customers, information about human capabilities, limitations, and preferences must be included in the decisions about the design and development of the system (Air Traffic Management, 1995). The challenge of incorporating human factors in system development is particularly important because the air traffic control subsystems that will form the national airspace system are likely to incorporate increasingly advanced technology that will have the potential to do much more of the controller's job. In this chapter we highlight the advantages of early and sustained inclusion of human factors in the sequence of system development. We also attempt to characterize the barriers to such inclusion—and the means by which such barriers might be overcome. We provide a series of route markers that, if followed, should help ensure that the human factor is considered early in the system development sequence and that this aspect is sustained whenever design decisions are being made.
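The three modes described at the opening of this chapter amount to a simple decision rule, restated in the sketch below. The function and parameter names are this sketch's own invention for illustration, not terminology from the chapter or from any FAA procedure.

```python
# Hypothetical sketch restating the chapter's three modes as a decision rule.
# Function and parameter names are illustrative assumptions, not established terms.

def human_factors_mode(has_standard_data: bool, has_analogous_findings: bool) -> str:
    """Pick which of the three modes applies to a given design decision."""
    if has_standard_data:
        # Mode 1: apply established principles from research reports,
        # textbooks, and handbooks.
        return "existing data"
    if has_analogous_findings:
        # Mode 2: advanced applications -- extrapolate from theory and
        # nonsystem-specific research (e.g., experience with other
        # protective garments informing astronauts' EVA suit design).
        return "advanced applications"
    # Mode 3: technology or operations have changed so much that neither
    # data nor experience applies; the expert must choose between
    # extrapolation and commissioning focused research.
    return "speculative"
```

The third branch is where the extrapolation-versus-research trade-off discussed above arises; the first two resolve without new research.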
HISTORY, ORIENTATION, AND RATIONALE

Since the early months of World War II, the inclusion of human factors in system development has been at least partly assigned to people specially trained in psychology or physiology or both (see, e.g., Fitts, 1951a, 1951b). However, professional engineers who typically were concerned with other technical issues were most often the people given the primary responsibilities for system development. As larger and larger programs of system development were launched, it became evident that success was more likely if the program participants from various disciplines worked as a team (Chapanis, 1960). This observation, in turn, led program managers to designate specific roles for team members. One consequence was the emergence of systems engineering as a special field that incorporated particular responsibilities for system integration and program management. In short, systems engineers became the formal leaders of the team. Simultaneously, the human factors role also became more clearly defined as a distinct professional specialty. The teamwork approach has been effective when the system engineer, the
human factors specialist, and other members of the team have shared a set of objectives and have sought to produce a system that performs its functions at the highest levels of effectiveness—at the least aggregate cost (Goode and Machol, 1957). In 1974, Singleton proposed the concept of human-centered design in recognition of the fact that most of the variability in system performance derived from the responses of the human operator (Singleton, 1974). The concept has been elaborated in the intervening years. It is the human who becomes fatigued. It is the human who makes errors if the displayed information is ambiguous or incomplete. Humans can become distracted or frustrated with features of the work setting; machines show no such tendencies. Insofar as the human user is a likely source of significant variance in system performance, it is better to control this source of variance early. Failure to do so will almost always result in a system that is inconsistent in its operation and less than fully dependable. Human-centered or user-centered design is partly a reaction against conditions in the recent past in which the technology drove the constraints on design options. It is easy to forget that early computer utilities demanded that the user be adept at programming—or that users employ the services of an intermediary. It was only when the technology became more versatile that it was possible to make advanced computer-based systems that readily accommodated human characteristics. The present emphasis on usability is a welcome shift—away from the exclusivity of the computer technicians and in the direction of responsiveness to a broader user population (Lingaard, 1994). Likewise, it should be noted that, in systems that rely on computers, the software engineers may not be able to anticipate all event contingencies. The human operator is usually expected to compensate for such gaps.
Consequently, it is usually a good idea to use human factors concepts to direct the design in ways that will facilitate operator flexibility. The implementation of the concept of human- or user-centered design—or its friendly competitor, "total systems design"—requires designers to have as much knowledge about human physiology and psychology as possible (Bailey, 1982). It is this knowledge base that the human factors specialist is responsible for bringing to the design team. Ideally, the human factors specialist would not only be a carrier of such knowledge but also would share it—in an educational sense—with the other team members. In any case, when existing human factors knowledge is applied, more effective designs can result. An example of the successful application of human factors principles outside the field of aviation can be found in the design of agricultural equipment such as tractors. For many years, the designers ignored operator comfort. Then human factors specialists proposed new seating designs and moved the whole industry toward the adoption of enclosed cabs in which noise, vibration effects, dust, and temperature could be controlled for the improvement
of total system performance—not to mention user acceptance (e.g., Hornick, 1962; Woodson and Conover, 1964; Sahal, 1975). In a second example, human factors research has made major contributions to the effectiveness of maintenance work. Specifically, the utility of various formats for maintenance manuals was compared, and the results pointed to fully proceduralized job aids as the better way to support the maintenance worker. The use of this step-by-step approach, combined with clearly intelligible graphics showing such acts as tool positioning, made large differences in performance—even when the workers were given only modest training in working on the test systems (Duffy et al., 1987). In air traffic control systems, human factors studies done in the 1950s revealed that the use of radar greatly improved the capabilities of the human-machine system. When radar was augmented by target identification, and other information such as altitude was presented adjacent to the radar target, performance improved even more (Schipper et al., 1957). It might be argued that these innovations would have come about without the attention given by human factors researchers. However, the history of developments in air traffic control, particularly the resistance of operators to the first use of radar in civilian air traffic situations, suggests that progress in these ways would have been far slower if the human factors research had not been done and the findings had not been widely distributed throughout the technical communities responsible for air traffic control system development.

FORMAL ARRANGEMENTS FOR INCORPORATING HUMAN FACTORS

The classic method of imposing design standards on the system development and procurement processes is the issuance of formal specifications. Such specifications were put in place by the armed services in the early 1950s and have been updated since.
Current human factors standards for design of military equipment are incorporated in U.S. Army TM 21-62, U.S. Navy MIL-H-22174(AER), and U.S. Air Force MIL-STD-1472 and MIL-H-27894. However, these dicta have tended to generate a ritualistic response from systems contractors, who were not interested in spending more than they had to on what some designers still considered to be a frill. To counteract such resistance, the U.S. Air Force attempted to strengthen the inclusion of human factors in the early stages of system design by means of a standard operating procedure labeled "Qualitative and Quantitative Personnel Requirements Information." In general, the idea was to give an institutional role to human factors people on the government's side of the systems acquisition process. If an Air Force human factors specialist was present to impose design requirements on contractors from the outset of the design/development activity, then integration would take place in spite of the contractor's lack of enthusiasm (Demaree and Marks, 1962; Eckstrand et al., 1962). This effort was
pursued with some vigor so long as top Air Force officers endorsed it, but it withered when they retired. In the late 1970s, a joint Army-Navy-Air Force project was initiated at the urging of high civilian officials in the Department of Defense. The lead agency was the U.S. Army Research Institute for the Behavioral and Social Sciences. The conceptual goal was to show key agency people, such as the top technologists at the U.S. Army's Training and Doctrine Command (TRADOC), that their efforts at developing superior weapon systems would be furthered if they embraced human factors and included it at the earliest stages of the development process (i.e., when requirements from the field are interpreted in the form of new mission analyses by TRADOC staffers). The avowed goal was to be achieved by the preparation and publication of a how-to manual for the military systems engineering community (Price et al., 1980). This effort was the precursor of the Office of Management and Budget's Circular No. A-109, which is now the main source document for procurement procedures related to the inclusion of human factors for the Air Force, the Navy, and the Federal Aviation Administration (see Order 1810.1F, Acquisition Policy, DOT/FAA, March 19, 1993). Meanwhile, in the late 1980s, high-level Army officers again became dissatisfied with the degree to which human factors were included in the development of their systems. New reforms were initiated. The focus this time included the Army Materiel Command as well as the Training and Doctrine Command. The ambitions were loftier as well—including the idea that human factors might become a driving force in technological innovation—not merely a means of guiding a development process driven by advances in technology. The program that emerged was called MANPRINT (Booher, 1990). The concept is still in effect in the U.S.
Army procurement activities and has been copied abroad (by the British military) but has not yet been adopted by the Air Force, the Navy, or the FAA (U.S. Department of the Army, 1994). Beyond the issuance of specifications and monitoring schemes imposed on systems contractors, the goal of MANPRINT was that someone qualified by particular educational accomplishments would examine the proposed configuration of a system under development at specified stages in the development process to make sure that no violations of human factors principles were taking place. The presence of these specifications also served to ensure that the human factors profession would mature in particular ways. For example, all system contractors of even moderate size were expected to employ a human factors specialist—either as a full-time employee or as a consultant. Academic programs were established to meet the demand for qualified people and, within such academic programs, research projects were promulgated. The results of this research, along with similar products of government laboratories, enlarged the base of established principles that contractors were required to follow. Thus, the field of human factors became a self-amplifying discipline. It also became institutionalized
under various names (e.g., ergonomics, human engineering) throughout the industrialized world. Despite the increasing stability of the field, however, its impact on system development programs has remained modest. Although the managers of system development programs and human factors specialists share crucial goals, there is also some major divergence of viewpoints. For example, engineers tend to be rewarded for being inventive and often seek to incorporate the most recent technology in their plans, whether it is best for the functions of the system or not (Carrigan and Kaufman, 1966; Kidd, 1990). Human factors specialists can appreciate the potential advantages of advanced technologies but often worry about the degree to which the form of a given technology is compatible with the general capabilities and limitations of prospective human operators (Fanwick, 1967). Those representing the human factors viewpoint will sometimes assert that a particular design decision should not be made until more information on the question of human compatibility is available. If the information must come from research that takes some time to complete, the engineers tend to see in this scenario the prospect of missed deadlines and cost overruns. None of this is to condemn project managers' attitudes toward human factors: they tend to avoid all inputs that might add uncertainty to design decision making or that might add to their investment in the information-gathering phase of such deliberations (Rouse and Cody, 1988). There is a tendency on the part of project managers to minimize human factors in system development projects. Such a tendency is probably aggravated by a lack of appreciation of technical issues on the part of some human factors specialists. Some system design efforts incorporate high levels of human factors participation, and some do not.
This variability in the amount and quality of the human factors contribution is a problem for those who have high-level administrative responsibilities for system acquisition activities. Since most senior officials in the agencies that initiate and fund system development efforts are aware that, when human factors are ignored, serious system failures can be the consequence (Busey, 1991; Del Balzo, 1995), administrative rules and standard procedures have been promulgated over the years since World War II as means to ensure that the human factor would indeed come into the deliberations on system development whenever such inputs were needed. The issue that remains is that these arrangements can serve as a means to give the superficial appearance of incorporating human factors when the real impact is minimal. Even when MANPRINT is nominally adhered to by contractors, the process can appear to be more of a ritual than a rigorous procedure (Government and Systems Technology Group, 1995).

UNDERTAKINGS WITH RESPECT TO AIR TRAFFIC CONTROL

As suggested above, the FAA generally follows the Department of Defense protocols on human factors. These practices tend to distribute responsibility
between the FAA and its contractors. The situation is somewhat complicated for air traffic control because of the ways in which its operations are managed, the relative heterogeneity of its operational settings, and the distribution of research, development, test, and evaluation activities among internal laboratories (i.e., the Civil Aeromedical Institute, the Atlantic City Technical Center), Transportation Department facilities (i.e., the Volpe Center), other government agencies such as NASA and the Air Force, nonprofit organizations (e.g., MITRE and certain academic centers), and commercial contractors. The need for better coordination among these organizations is discussed in Chapter 8. Currently, two forces for conceptual integration are provided by the FAA headquarters units called the Directorate of Air Traffic Plans and Requirements and the Research and Acquisitions organization. These offices oversee the full array of FAA system development programs. The FAA is also in the process of reviewing the question of how best to structurally organize the human factors effort within the agency (see Chapter 8). The goal is to achieve a better level of mutual support between in-house research, contracted research and development, and system installation and operations.

The Acquisition of Automated Systems

Further formal procedures for the inclusion of human factors in system development can be found in FAA Order 1810.1F, which describes the FAA's acquisition policy. The policy document stipulates the following as critical steps:

- Performance of the mission need analysis and preparation of the mission need statement, which describes the required operational capability of the new system and explains the deficiencies that the new system will rectify.
- Analysis of the trade-offs among alternative concepts and preparation of an operational requirements document to define the system-level functional and performance objectives of the new system.
- Preparation of a maintenance requirements document (FAA Notice 6000.162).
- Preparation of the system-level specification that defines in detail the system-level functional and performance requirements of the new system.
- Preparation of the request for proposal that defines for potential bidders the work requested by the FAA toward the design, development, fabrication, and delivery of the new system.
- Evaluation of bidders' proposed designs.
- Evaluation of the selected bidders' designs at preliminary design reviews, critical design reviews, and any other program-specific design reviews.
- Testing of the design at various levels (e.g., developmental tests, operational tests, acceptance tests, and field shakedowns).
These steps represent the FAA's interpretation of the standard sequence for systems development projects. They scarcely convey the true difficulties of doing real system design. For example, it is extremely rare that a new system does not have a predecessor. The characteristics of the predecessor constrain what the new system can be. This is clearly exemplified by the fact that the current air routes were laid over the routes developed when pilots flew at night from one searchlight beacon to another. System developers must find ways to fit the most exotic new technologies into frameworks laid down by traditions—the very roots of which may now be entirely forgotten. Furthermore, the designers must cope with the fact that the system objectives can reflect competing values. In air traffic control, this is clearly the case when attempts are made to reconcile the desire for denser traffic flows with the overriding goal of safety. In addition to these steps, which reflect a systems approach that is fully congruent with human factors participation, there are some even more specific provisos that reflect continuing concerns on the part of top-level administrators. Thus, FAA Order 1810.1F stipulates that:

- Human factors shall be applied to the development and acquisition of national airspace systems software, equipment, and facilities.
- Human factors engineering shall be integrated with the system engineering and development effort throughout the acquisition process, including requirements analysis, system analysis, task analysis, system design, equipment and facilities design, testing, and reporting.
- An initial human factors plan shall be developed prior to the finalization of the system-level specification.

These instructions represent a long-standing set of practices and are the relatively routinized part of the human factors contribution.
However, even such well-established guidelines have not been adequate to ensure that human factors have been fully incorporated in some design processes. A prime example of a missed opportunity is provided by the actual configuration of equipment in the typical Airway Facilities maintenance control center, discussed in Chapters 4 and 9. Such centers are an assembly of disparate workstations that do not exhibit an integrated human-computer interface. Although the FAA-MD-793A specification (Federal Aviation Administration, 1994) represents an attempt to require that all new systems provide data in standard formats to the remote monitoring system that feeds into a center, idiosyncratic designs are still being generated. There appears to be a need for an overall maintenance center automation strategy as a baseline for evaluating proposed designs. These same needs apply to the design of tools that support other Airway Facilities activities, such as off-line diagnosis of equipment, maintenance logging, and maintenance of software (Simms, 1993).
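The point about standard formats can be made concrete with a small sketch: if every subsystem reports status in one uniform record shape, a single center console can display all of them. The field names and state vocabulary below are assumptions chosen for illustration; they are not drawn from the FAA-MD-793A specification itself.

```python
# Illustrative sketch of a uniform status record for remote monitoring.
# Field names and the state vocabulary are hypothetical, not FAA-MD-793A.

from dataclasses import dataclass, asdict

@dataclass
class StatusReport:
    facility_id: str    # identifier of the reporting facility or equipment
    subsystem: str      # e.g., a radar, navaid, or communications subsystem
    state: str          # assumed vocabulary: "normal", "degraded", or "failed"
    timestamp_utc: str  # ISO 8601 time of the observation

    def to_record(self) -> dict:
        """Serialize to one record shape that any center console can render."""
        return asdict(self)
```

Because every subsystem emits the same record shape, one display routine serves them all; idiosyncratic per-vendor formats are exactly what produce the disparate, unintegrated workstations described above.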
Limitations in the Formal Order Procedure

Although FAA Order 1810.1F recommends the use of military standard MIL-H-46855 as a guide for the human factors plan, the plan may be tailored by the systems contractor to the scope and level of specificity judged appropriate to the new system's complexity and its major attributes. The acquisition policy also prescribes involvement by both human factors and operational representatives (users). The policy implicitly relies on the author of the human factors plan to identify methods for addressing the issues especially pertinent to automated systems (the policy itself does not identify these issues and does not address special concerns for automated systems). Other issues not included in the statement of acquisitions policy are the distinctions between system acquisition procedures at the national level and the acquisition support activities in the various regions. Each region has its own culture, and each has some unique features, such as the configuration of airways and the locations of terminals within the region's geographic boundaries. New subsystem acquisitions must be designed so that they can be adapted to all these distinctive settings.

Proposed Reforms

Recent additional modifications of the FAA's acquisitions procedures include the establishment of integrated product teams (IPTs) to smooth the discrepancies between the perceptions of FAA officials, suppliers, and users—and to generally expedite the actual fielding of new subsystems. The IPT concept follows the line-of-business approach to acquisitions, which is intended to establish clearer assignment of decision-making authority and responsibility. The variations in culture among the regions have not been clearly designated as one of the problems to be solved by the IPTs, but this is an area in which these new organizational units could make useful contributions to the total modernization process.
Such contributions would be facilitated by a provision for the full- or part-time presence of a fully qualified human factors specialist on each IPT. This practice might recapture the advantages that characterized the QQPRI (Qualitative and Quantitative Personnel Requirements Information) approach and that are now sought in the MANPRINT procedures. It would give the FAA a means of enforcing the human factors protocols contained in Order 1810.1F and of monitoring contractors' adherence to human factors principles. A second reform being sought is an increase in the utilization of equipment for which no special development effort is required—nondevelopmental items and commercial off-the-shelf equipment. The advantages in time and dollar costs are self-evident. However, there can be hidden costs with respect to the human factors question. Simply depending on the manufacturer to have carried out adequate human factors engineering—in the initial interest of furthering commercial market appeal—is not prudent. The point is that each item to be procured
by a straight commercial transaction should be subjected to a strict human factors review, in much the same mode as would be used on a system under contractual development. A good analogy is that of consumer protection in the conventional, competitive marketplace.

THE IMPLEMENTATION OF INNOVATIONS

Acceptance by System Operators

The historical pattern of technological development of many advanced systems reveals a number of pitfalls on the road to modernization. One such pitfall is the lack of user acceptance. New systems might represent real technical advances, but they will serve no good purpose if their use is forestalled by their unacceptability to operational personnel. For example, there was strong initial resistance to the use of radar by the FAA—despite the favorable operational examples provided by the Air Force use of this technology in directly parallel operations such as the Berlin airlift (Fitts et al., 1958). Likewise, the airborne automatic conflict warning system (TCAS, discussed in Chapter 12) is receiving a negative reaction from some controllers. This recent innovation has great significance because it can directly influence the relationship between the controllers and the pilots. It has been noted that pilots have changed course independently when under positive control. When the automatic warning system was the source of steering instructions, pilots have sometimes delayed reporting to the controller what was happening (U.S. House of Representatives, 1993). A positive instance of user acceptance is provided by the success story of one significant subsystem's development for the national airspace system. The example is a computer-based aid for the proper spacing of aircraft on converging approach paths used in landing at a terminal with convergent active runways.
Significant improvements in traffic flow rates at night or during adverse weather conditions have been achieved in field tests of this subsystem. Moreover, because it was presented to users in advanced prototype form in a laboratory setting and then field tested at many TRACONs in different regions, it seems likely that the subsystem would generate little resistance if it were to be installed at all the appropriate terminals across the country (for details, see Mundra, 1989; Mundra and Levin, 1990). There is a substantial body of knowledge on the problems of innovation acceptance and adaptation (see, e.g., Rogers, 1983). Acceptance is largely dependent on the user's subjective cost-benefit calculations. However, several other factors are also important. Studies of workers' reactions to changes in work procedures date from the research of Alex Bavelas on the responses of the workers in a factory in Marion, Virginia, to different levels of involvement in the decision to implement new cloth-cutting equipment (Lewin, 1947). This work was followed by specific verifications (Coch and French, 1948) and by major
philosophical and practical guidebooks (McGregor, 1960; Bennis et al., 1976). These works and others make it quite clear that adoption of an innovation is strongly influenced by the collective responses of the rank-and-file workers. When relatively cohesive work groups are brought into the decision-making processes related to distinctive modifications of work procedures—well in advance of the implementation of the change—a positive work climate is maintained and productivity is good. If the workers are excluded from the decision processes and simply told, arbitrarily, that a change is to take place—even when the reasons for the change are explained thoroughly—negative reactions of many kinds, including production suppression, can be expected. This does not mean that managers or systems engineers need to abandon their prerogatives or that workers or groups of workers need to be involved in every decision made by management in an organization. Vroom has shown in repeated studies that workers focus only on functions that affect their daily tasks and their status in the workplace (Vroom and Jago, 1987). For example, workers are perfectly willing to forgo consultation on company investment strategies. They apparently concede that financial experts in the management cadre should make such decisions—not lay people at the worker level.

Acceptance at Different Organizational Levels

Operational personnel are not the only ones who can accept or reject innovations. Resistance to change has been experienced at all levels. For example, despite the presence of a union representative in the office that provides key oversight of all systems development decisions, it is evident that some union leaders still feel themselves to be excluded from the modernization deliberations (Thornton, 1993). Supervisors, managers, and administrators must also provide support if the adoption of an innovation is to be successful.
One approach to dealing with this prospective problem is to try to assess the economic, political, and psychological issues in addition to the technological factors that are driving the change. When all these conditions are identified and understood, managers can assess the specific requirements for organizational adaptations and move to correct any gaps in the support functions.

New Approaches

When the principles of innovation promotion and the integration of human factors in system development are put together, the conclusion that emerges is that extensive user participation is a key to success in both sets of activities. In fact, if there is very extensive user participation throughout the design and development process, there is a strong possibility that the resultant system will yield
superior performance and will also be more readily accepted by its prospective adopters.

The main barriers to user participation are logistical and economic. That is, it takes time and money to provide large numbers of users with exposure to prospective design features, and it can be very expensive to create conditions that lead to incremental changes in design. Known as "requirements creep," such changes are a notorious problem for the managers of large-scale system development efforts. There are also difficulties in the area of logical rigor, in the sense that it is difficult to obtain objective, quantitative conclusions from what can be a series of informal preference statements from prospective users.

Some of these difficulties can be overcome by techniques invented by researchers in the fields of marketing and advertising psychology. Examples include the use of opinion surveys and focus groups for product assessments prior to mass production.

During the development of the first versions of the Apple Computer, Steve Wozniak, who was responsible for the technical development of the product, initiated a focus group by recruiting high school students who had formed a "hackers" club in San Jose, California. New system features were reviewed informally by club members as they were instantiated by Wozniak in his garage workshop. Club members vigorously debated the pros and cons of the design and then arrived at a reasonable consensus about the attractiveness of each feature. Apparently, Wozniak never—or very rarely—went against the collective judgments of the club members. The remarkable market penetration of the early versions of the Apple Computer lends credence to the proposition that user participation should guide development and that, when it does, an appealing product is the likely outcome (Byte, 1984, 1985).
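The feature-by-feature consensus just described can be sketched as a simple majority vote over informal preference statements. This is a hypothetical illustration only; the function name and the keep/drop vote format are invented here and are not drawn from any account of the club's actual deliberations:

```python
from collections import Counter

def feature_verdict(votes):
    """Return 'keep' or 'drop' for one candidate design feature.

    votes is a list of informal 'keep'/'drop' statements gathered
    from prospective users; a tie favors keeping the feature.
    (An illustrative stand-in for a focus group's consensus process.)
    """
    tally = Counter(votes)
    return "keep" if tally["keep"] >= tally["drop"] else "drop"

# Informal preference statements on two candidate features
print(feature_verdict(["keep", "keep", "drop"]))  # keep
print(feature_verdict(["drop", "drop", "keep"]))  # drop
```

A real assessment would weigh comments qualitatively rather than simply count votes, but even this crude tally shows how a series of informal statements can be reduced to a defensible per-feature group verdict.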
Similar successes have been attained by air traffic control subsystem developers who have used various methods of simulation to give users an opportunity to experience the operational features of a particular configuration while major modifications could still be made at low cost (see, for example, Lee, 1994; Erzberger et al., 1993). This procedure can give the designer confidence about the design approach being used, and it also appears to serve as a way of avoiding major design errors by giving the programmers useful clues about what features are attractive (Baldwin and Chung, 1995). For an earlier exposition of similar concepts, see Sinaiko and Belden (1961), who show that important information can be obtained to guide the design engineer without resorting to formal experimentation.

The techniques based on focus groups have also been used to solve some problems related to air traffic control that do not involve advanced technologies to any great degree. Specifically, a team of researchers from two universities and NASA Ames convened a focus group made up of airline dispatchers, the coordinators who negotiate with FAA intermediaries at the Air Traffic Control System Command Center, and traffic managers from en route control centers. There are many causes of friction among these entities, and the focus group was empaneled
to determine whether there was some solid common ground on which improved coordination procedures might be built. The process was successful, with the proviso that some continuing follow-through would be needed (McCoy et al., 1995).

Exploitation of the computer as an aid in system design, in conjunction with user participation, has been tried in Sweden (Akselsson et al., cited in Karwowski and Rahimi, 1990). These investigators used a local area network to link a group of 15 stakeholders in a project to redesign the materials flow and workspace layout in a factory. The network facilitated communication, and problem solving was enhanced. (However, there were some real stresses within the group associated with the lack of uniform levels of skill in the use of the computer as a communication tool.)

On an even more positive note, rapid prototyping, discussed in the previous chapter, has been shown to be successful in certain areas of system development. Rapid prototyping can be regarded as a sophisticated version of basic trial-and-error methods of problem solving (Connell and Shafer, 1995). Another way of describing the process within the system development sequence is the test-and-adjust approach.

Human factors specialists were not long in adopting the structured procedures of rapid prototyping for their own purposes. In fact, Gould and Lewis (1983) found it necessary to admonish programmers to follow some form of rapid prototyping in every major software development project. Variations on the theme of rapid prototyping are exemplified by work on the development and installation of a new information system in a manufacturing firm (Mankin et al., 1996).
The procedure, called mutual design and implementation, is intended to facilitate user acceptance of computer-based systems; it incorporates the techniques of user participation into a full-scale organizational adaptation program. The basic message is that a technological innovation must be correctly perceived as being congruent with both the overall objectives of the organization and the individual objectives of the users of the technology. The approach moves away from traditional bureaucratic norms but is not incompatible with the streamlined FAA acquisition process (Donahue, 1996). The approach also recaptures the idea of organizational dynamics found in the early Air Force studies of human-machine systems (Porter, 1964).

If, in developing a crucial subsystem, costs are seen as a strong constraint on extensive user participation, compromise protocols do exist. Among the most attractive is the sequential experimentation protocol laid out by Williges and colleagues (1993). Their approach is a variant of the successive approximation strategy in system design. The scheme works in steps toward the initiation of a large-scale, multivariate, factorial experiment—the results of which should resolve the total configuration issue near the end of the design process. Rapid prototyping is restricted to the second step of an 11-step sequence. Its role in this scheme is simply to screen out extremely unacceptable
design options after brainstorming and other inclusive techniques have generated a large set of candidates. The next steps are relatively small-scale experiments that test the surviving design options separately or in limited factorial combinations. At about step 8, the relative strengths of single variables in driving critical performance measures will be known, gaps in the database will be evident, and clues will be present about significant interactions among the independent variables. The crucial multivariate experiment (or experiments) is conducted at step 9. Williges et al. recommend that step 10 be devoted to the construction of an abstract model of the relationship between the tested design options and performance and that step 11 consist of the actual promulgation of the newly determined optimal design configuration.

This methodology represents a major commitment to a mode 3 approach. It also provides the means for engaging users early in the design process. The engagement of a relatively large cross section of operational personnel would yield the added advantage of providing a core of individuals in each FAA region who could become influential in the acceptance of the system upon its installation. The very fact that colleagues had their say in the final configuration of the system would be a positive factor among the rest of the workforce. Moreover, those operators recruited for evaluation episodes would have acquired skill in the use of the system and consequently would be able to help explain the procedures and train their coworkers in the actual employment of the new tool.
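The factorial stage of such a sequence can be made concrete with a short sketch that enumerates the cells of a full factorial experiment over surviving design options. The variables and levels below are invented for illustration; Williges et al. do not prescribe any particular set:

```python
from itertools import product

def full_factorial(variables):
    """Enumerate every combination of levels (the cells of a full
    factorial experiment) for a dict mapping variable -> list of levels."""
    names = list(variables)
    return [dict(zip(names, combo))
            for combo in product(*(variables[n] for n in names))]

# Hypothetical design options surviving the early screening steps
design_variables = {
    "display_layout": ["stacked", "side_by_side"],
    "alert_modality": ["visual", "auditory", "both"],
    "update_rate_hz": [1, 4],
}

cells = full_factorial(design_variables)
print(len(cells))  # 2 x 3 x 2 = 12 experimental conditions
```

The same enumeration, run over a subset of the variables, yields the limited factorial combinations used in the earlier small-scale steps.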
The final, and perhaps most important, advantage of the gradual approach represented by prototyping as a tool in the development sequence for air traffic control modernization is the prospect that the installed system could vary in some modest ways from region to region and even from site to site. The specifications for such variants would involve the determination of local requirements and the actual inclusion of the variant features in each subsystem when it is installed. Such steps would take some time but are commensurate with a test-and-adjust approach to system development.

CONCLUSIONS

In general, system developers adopt a top-down approach to the design process. The crucial move in this approach is the delineation of functional requirements. The assumption is that, if these functional requirements are fulfilled within the constraints of the technology and of time and money expenditure, the system is thereby successful. Traditional human factors contributions have been shaped to fit into this approach. For example, user task analyses are intended to contribute greatly to the specification of functional requirements. Also, since the top-down approach can be neatly divided into discrete steps, it is easy to stipulate at what points design review should take place in order to ensure that no human factors principles have been violated.

More recent experiences with systems designed exclusively by top-down
procedures indicate that, in the more complex systems in which cognitive behaviors and strong affective elements come into play through the human user/operator, serious deficiencies can become apparent after the system is delivered and put to work. One means of avoiding such deficiencies involves the early fabrication of prototypes and their evaluation by user groups. Prototypes that can be regarded as virtual representations of an operational system can also serve these purposes. When the end user is so engaged in the design process, it becomes bottom-up (e.g., problem-driven or scenario-based design) rather than top-down. Integration of top-down and bottom-up procedures will probably be needed to achieve optimal cost-effectiveness for the national airspace system of the future.

In addition to bridging the barrier that is likely to exist between ordinary users and design engineers, human factors specialists bring to design deliberations knowledge about human capabilities and limitations that has been acquired through rigorous scientific research. The human factors specialist is also expert in identifying and seeing the implications of subjective user attitudes, opinions, and tastes.

The integrated project team for every major subsystem in the advanced automation system should contain at least one full-time human factors specialist who would have the authority and responsibility to ensure that (a) user participation is timely and extensive, (b) human capabilities, limitations, and values are considered as part of every design decision, and (c) gaps in the knowledge base that could compromise the quality of the resultant system are identified and rectified by appropriately rigorous research.

It is also important to consider the specific interests of the government, as system procurer, in the allocation of human factors resources.
That is, the final design of a major subsystem should not be the exclusive prerogative of a contractor. A way should be found to ensure that human factors/human engineering is not slighted by contractors as an arbitrary cost-saving ploy. Authoritative oversight is essential in this matter.

Systems that enjoy intensive and extensive user participation in their development are generally more likely to be usable, effective, and acceptable than systems that are thrust on users after development has been completed. However, user participation can be expensive and time-consuming, and it can lead as easily to ambiguity as to clarity in the choice of design options if care is not exercised. In particular, users' perceptions can change while the development process is still under way, and user demands can expand over time. The resultant "requirements creep" can seriously disrupt the procurement process.

There is a variety of strategies for minimizing the costs, delays, and ambiguities that can come from extensive user participation. Such strategies involve simplifying the procedures of rapid prototyping and limiting their use to stipulated stages in the system development sequence.