
3
Why Do Errors Happen?

The common initial reaction when an error occurs is to find and blame someone. However, even apparently single events or errors are due most often to the convergence of multiple contributing factors. Blaming an individual does not change these factors and the same error is likely to recur. Preventing errors and improving safety for patients require a systems approach in order to modify the conditions that contribute to errors. People working in health care are among the most educated and dedicated workforce in any industry. The problem is not bad people; the problem is that the system needs to be made safer.

This chapter covers two key areas. First, definitions of several key terms are offered. This is important because there is no agreed-upon terminology for talking about this issue.1 Second, the emphasis in this chapter (and in this report generally) is on how to make systems safer; its primary focus is not on "getting rid of bad apples," or individuals with patterns of poor performance. The underlying assumption is that lasting and broad-based safety improvements in an industry can be brought about through a systems approach.

Finally, it should be noted that although the examples may draw more from inpatient or institutional settings, errors occur in all settings. The concepts presented in this chapter are just as applicable to ambulatory care, home care, community pharmacies, or any other setting in which health care is delivered.

This chapter uses a case study to illustrate a series of definitions and concepts in patient safety. After presentation of the case study, the chapter will define what comprises a system, how accidents occur, how human error contributes to accidents, and how these elements fit into a broader concept of safety. The case study will be referenced to illustrate several of the concepts. The next section will examine whether certain types of systems are more prone to accidents than others. Finally, after a short discussion of the study of human factors, the chapter summarizes what health care can learn from other industries about safety.

An Illustrative Case in Patient Safety

Infusion devices are mechanical devices that administer intravenous solutions containing drugs to patients. A patient was undergoing a cardiac procedure. This patient had a tendency toward being hypertensive and this was known to the staff.

As part of the routine set-up for surgery, a nurse assembled three different infusion devices. The nurse was a new member of the team in the operating room; she had just started working at the hospital a few weeks before. The other members of the team had been working together for at least six months. The nurse was being very careful when setting up the devices because one of them was a slightly different model than she had used before.

Each infusion device administered a different medication that would be used during surgery. For each medication, the infusion device had to be programmed according to how much medication would flow into the patient (calculated as "cc's/hour"). The medications had different concentrations and each required calculation of the correct dose for that specific patient. The correct cc's/hour were programmed into the infusion devices.
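
To make concrete what this per-patient calculation involves, the sketch below shows the usual arithmetic for converting an ordered dose rate and a drug concentration into the cc's/hour (mL/hr) a device is programmed with. It is purely illustrative and not drawn from the case or the report; the function name, drug concentration, dose rate, and patient weight are invented for the example.

    def pump_rate_ml_per_hr(dose_mcg_per_kg_min, weight_kg, concentration_mcg_per_ml):
        """Convert an ordered dose rate into the cc's/hour (mL/hr) programmed into a pump."""
        mcg_per_hr = dose_mcg_per_kg_min * weight_kg * 60   # this patient's hourly dose
        return mcg_per_hr / concentration_mcg_per_ml        # volume per hour at this concentration

    # Hypothetical example: 0.5 mcg/kg/min of a drug supplied at 200 mcg/mL, for an 80 kg patient
    print(round(pump_rate_ml_per_hr(0.5, 80, 200), 1))      # -> 12.0 mL/hr

Because each of the three medications in the case had a different concentration, each device required its own calculation of this kind before the rate could be entered.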

The anesthesiologist, who monitors and uses the infusion devices during surgery, usually arrived for surgery while the nurse was completing her set-up of the infusion devices and was able to check them over. This particular morning, the anesthesiologist was running behind from a previous surgery. When he arrived in the operating room, the rest of the team was ready to start. The anesthesiologist quickly glanced at the set-up and accepted the report as given to him by the nurse.

One of the infusion devices was started at the beginning of surgery. About halfway through the surgery, the patient's blood pressure began to rise. The anesthesiologist tried to counteract this by starting one of the other infusion devices that had been set up earlier. He checked the drip chamber in the intravenous (IV) tubing and did not see any drips. He checked the IV tubing and found a closed clamp, which he opened. At this point, the second device signaled an occlusion, or blockage, in the tubing by sounding an alarm and flashing an error message. The anesthesiologist found a closed clamp in this tubing as well, opened it, pressed the re-start button, and the device resumed pumping without further difficulty. He returned to the first device that he had started and found that there had been a free flow of fluid and medication to the patient, resulting in an overdose. The team responded appropriately and the patient recovered without further incident.

The case was reviewed two weeks later at the hospital's "morbidity and mortality" committee meeting, where the hospital staff reviews cases that encountered a problem to identify what happened and how to avoid a recurrence. The IV tubing had been removed from the device and discarded. The bioengineering service had checked the pump and found it to be functioning accurately. It was not possible to determine whether the tubing had been inserted incorrectly into the device, whether the infusion rate had been set incorrectly or changed while the device was in use, or whether the device had malfunctioned unexpectedly. The anesthesiologist was convinced that the tubing had been inserted incorrectly, so that when the clamp was open the fluid was able to flow freely rather than being controlled by the infusion device. The nurse felt the anesthesiologist had failed to check the infusion system adequately before turning on the devices. Neither knew whether it was possible for an infusion device to have a safety mechanism built into it that would prevent free flows from happening.

WHY DO ACCIDENTS HAPPEN?

Major accidents, such as Three Mile Island or the Challenger accident, grab people's attention and make the front page of newspapers. Because they usually affect only one individual at a time, accidents in health care delivery are less visible and dramatic than those in other industries. Except for celebrated cases, such as Betsy Lehman (the Boston Globe reporter who died from an overdose during chemotherapy) or Willie King (who had the wrong leg amputated),2 they are rarely noticed. However, accidents are a form of information about a system.3 They represent places in which the system failed and the breakdown resulted in harm.

The ideas in this section rely heavily upon the work of Charles Perrow and James Reason, among others.

Charles Perrow's analysis of the accident at Three Mile Island identified how systems can cause or prevent accidents.4 James Reason extended the thinking by analyzing multiple accidents to examine the role of systems and the human contribution to accidents.5 "A system is a set of interdependent elements interacting to achieve a common aim. The elements may be both human and non-human (equipment, technologies, etc.)."

Systems can be very large and far-reaching, or they can be more localized. In health care, a system can be an integrated delivery system, a centrally owned multihospital system, or a virtual system comprised of many different partners over a wide geographic area. However, an operating room or an obstetrical unit is also a type of system. Furthermore, any element in a system probably belongs to multiple systems. For example, one operating room is part of a surgical department, which is part of a hospital, which is part of a larger health care delivery system. The variable size, scope, and membership of systems make them difficult to analyze and understand.

In the case study, one of the systems used during surgery is the automated medication administration system, which includes the equipment, the people, their interactions with each other and with the equipment, the procedures in place, and the physical design of the surgical suite in which the equipment and people function.

When large systems fail, it is due to multiple faults that occur together in an unanticipated interaction,6 creating a chain of events in which the faults grow and evolve.7 Their accumulation results in an accident. "An accident is an event that involves damage to a defined system that disrupts the ongoing or future output of that system."8

The Challenger failed because of a combination of brittle O-ring seals, unexpected cold weather, reliance on the seals in the design of the boosters, and change in the roles of the contractor and NASA. Individually, no one factor caused the event, but when they came together, disaster struck. Perrow uses a DEPOSE (Design, Equipment, Procedures, Operators, Supplies and materials, and Environment) framework to identify the potential sources of failures. In evaluating the environment, some researchers explicitly include organizational design and characteristics.9

In the case study, the accident was a breakdown in the delivery of IV medications during surgery.

The complex coincidences that cause systems to fail could rarely have been foreseen by the people involved. As a result, they are reviewed only in hindsight; however, knowing the outcome of an event influences how we assess past events.10 Hindsight bias means that things that were not seen or understood at the time of the accident seem obvious in retrospect. Hindsight bias also misleads a reviewer into simplifying the causes of an accident, highlighting a single element as the cause and overlooking multiple contributing factors. Given that the information about an accident is spread over many participants, none of whom may have complete information,11 hindsight bias makes it easy to arrive at a simple solution or to blame an individual, but difficult to determine what really went wrong.

Although many features of systems and accidents in other industries are also found in health care, there are important differences. In most other industries, when an accident occurs the worker and the company are directly affected. There is a saying that the pilot is always the first at the scene of an airline accident. In health care, the damage happens to a third party; the patient is harmed; the health professional or the organization, only rarely. Furthermore, harm occurs to only one patient at a time, not whole groups of patients, making the accident less visible.*

In any industry, one of the greatest contributors to accidents is human error. Perrow has estimated that, on average, 60–80 percent of accidents involve human error. There is reason to believe that this is equally true in health care. An analysis of anesthesia found that human error was involved in 82 percent of preventable incidents; the remainder involved mainly equipment failure.12 Even when equipment failure occurs, it can be exacerbated by human error.13 However, saying that an accident is due to human error is not the same as assigning blame. Humans commit errors for a variety of expected and unexpected reasons, which are discussed in more detail in the next two sections.

*Public health has made an effort to eliminate the term "accident," replacing it with unintentional injuries, consistent with the nomenclature of the International Classification of Diseases. However, this report is not focused specifically on injury since an accident may or may not result in injury. See Institute of Medicine, Reducing the Burden of Injury, eds. Richard J. Bonnie, Carolyn Fulco and Catharyn Liverman, Washington, D.C.: National Academy Press, 1999.

Understanding Errors

The work of Reason provides a good understanding of errors. He defines an error as the failure of a planned sequence of mental or physical activities to achieve its intended outcome when these failures cannot be attributed to chance.14 It is important to note the inclusion of "intention." According to Reason, error is not meaningful without the consideration of intention. That is, it has no meaning when applied to unintentional behaviors because errors depend on two kinds of failure: either actions do not go as intended or the intended action is not the correct one. In the first case, the desired outcome may or may not be achieved; in the second case, the desired outcome cannot be achieved.

Reason differentiates between slips or lapses and mistakes. A slip or lapse occurs when the action conducted is not what was intended. It is an error of execution. The difference between a slip and a lapse is that a slip is observable and a lapse is not. For example, turning the wrong knob on a piece of equipment would be a slip; not being able to recall something from memory is a lapse.

In a mistake, the action proceeds as planned but fails to achieve its intended outcome because the planned action was wrong. The situation might have been assessed incorrectly, and/or there could have been a lack of knowledge of the situation. In a mistake, the original intention is inadequate; a failure of planning is involved.

In medicine, slips, lapses, and mistakes are all serious and can potentially harm patients. For example, in medicine, a slip might be involved if the physician chooses an appropriate medication but writes 10 mg when the intention was to write 1 mg. The original intention is correct (the correct medication was chosen given the patient's condition), but the action did not proceed as planned. On the other hand, a mistake in medicine might involve selecting the wrong drug because the diagnosis is wrong. In this case, the situation was misassessed and the action planned is wrong. If the terms "slip" and "mistake" are used, it is important not to equate slip with "minor." Patients can die from slips as well as mistakes.
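
One way to make Reason's distinctions concrete is the small classification sketch below. It is illustrative only and not taken from the report; the type names and the three yes/no judgments are an assumed simplification of the taxonomy.

    from enum import Enum

    class ErrorType(Enum):
        SLIP = "slip (execution failure, observable)"
        LAPSE = "lapse (execution failure, not observable, e.g., a memory failure)"
        MISTAKE = "mistake (planning failure: the intended action was wrong)"

    def classify(plan_was_correct: bool, action_went_as_planned: bool, failure_observable: bool) -> ErrorType:
        """Classify an error in Reason's terms from three yes/no judgments."""
        if not plan_was_correct:
            return ErrorType.MISTAKE                     # wrong plan, e.g., wrong drug for a wrong diagnosis
        if not action_went_as_planned:
            # right plan, wrong execution
            return ErrorType.SLIP if failure_observable else ErrorType.LAPSE
        raise ValueError("a correct plan executed as intended is not an error in this scheme")

    # The chapter's example: correct drug chosen, but 10 mg written instead of the intended 1 mg
    print(classify(plan_was_correct=True, action_went_as_planned=False, failure_observable=True))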

For this report, error is defined as the failure of a planned action to be completed as intended (e.g., error of execution) or the use of a wrong plan to achieve an aim (e.g., error of planning). From the patient's perspective, not only should a medical intervention proceed properly and safely, it should be the correct intervention for the particular condition. This report addresses primarily the first concern, errors of execution, since they have their own epidemiology, causes, and remedies that are different from errors in planning. Subsequent reports from the Quality of Health Care in America project will consider the full range of quality-related issues, sometimes classified as overuse, underuse and misuse.15

Latent and Active Errors

In considering how humans contribute to error, it is important to distinguish between active and latent errors.16 Active errors occur at the level of the frontline operator, and their effects are felt almost immediately. This is sometimes called the sharp end.17 Latent errors tend to be removed from the direct control of the operator and include things such as poor design, incorrect installation, faulty maintenance, bad management decisions, and poorly structured organizations. These are called the blunt end. For example, the active error is that the pilot crashed the plane; the latent error is that a previously undiscovered design malfunction caused the plane to roll unexpectedly in a way the pilot could not control, and the plane crashed.

In the case study, the active error was the free flow of the medication from the infusion device.

Latent errors pose the greatest threat to safety in a complex system because they are often unrecognized and have the capacity to result in multiple types of active errors. Analysis of the Challenger accident traced contributing events back nine years. In the Three Mile Island accident, latent errors were traced back two years.18 Latent errors can be difficult for the people working in the system to notice since the errors may be hidden in the design of routine processes, in computer programs, or in the structure or management of the organization. People also become accustomed to design defects and learn to work around them, so they are often not recognized.

In her book about the Challenger explosion, Vaughan describes the "normalization of deviance," in which small changes in behavior became the norm and expanded the boundaries so that additional deviations became acceptable.19

When deviant events become acceptable, the potential for errors is created because signals are overlooked or misinterpreted and accumulate without being noticed.

Current responses to errors tend to focus on the active errors by punishing individuals (e.g., firing or suing them), retraining, or other responses aimed at preventing recurrence of the active error. Although a punitive response may be appropriate in some cases (e.g., deliberate malfeasance), it is not an effective way to prevent recurrence. Because large system failures represent latent failures coming together in unexpected ways, they appear to be unique in retrospect. Since the same mix of factors is unlikely to occur again, efforts to prevent specific active errors are not likely to make the system any safer.20

In our case study, a number of latent failures were present:

• Multiple infusion devices were used in parallel during this cardiac surgery. Three devices were set up, each requiring many steps. Each step in the assembly presents a possibility for failure that could disrupt the entire system.

• Each of the three different medications had to be programmed into the infusion device with the correct dose for that patient.

• Possible scheduling problems in the operating suites may have contributed to the anesthesiologist having insufficient time to check the devices before surgery.

• A new nurse on the team may have interrupted the "normal" flow between the team members, especially communication between the anesthesiologist and the nurse setting up the devices. There was no standardized list of checks between the nurse and anesthesiologist before starting the procedure.

• Training of new team members may be insufficient since the nurse found herself assembling a device that was a slightly different model. As a new employee, she may have been hesitant to ask for help or may not have known who to ask.

Focusing on active errors lets the latent failures remain in the system, and their accumulation actually makes the system more prone to future failure.21 Discovering and fixing latent failures, and decreasing their duration, are likely to have a greater effect on building safer systems than efforts to minimize active errors at the point at which they occur.

In the case study, a typical response would have been to retrain the nurse on how to assemble the equipment properly. However, this would have had no effect on weaknesses in equipment design, team management and communications, scheduling problems, or orienting new staff. Thus, free flow errors would likely recur.

Understanding Safety

Most of this chapter thus far has drawn on Perrow's normal accident theory, which holds that accidents are inevitable in certain systems. Although they may be rare, accidents are "normal" in complex, high-technology industries. In contrast to studying the causes of accidents and errors, other researchers have focused on the characteristics that make certain industries, such as military aircraft carriers or chemical processing, highly reliable.22 High reliability theory holds that accidents can be prevented through good organizational design and management.23 Characteristics of highly reliable industries include an organizational commitment to safety, high levels of redundancy in personnel and safety measures, and a strong organizational culture for continuous learning and willingness to change.24 Correct performance and error can be viewed as "two sides of the same coin."25 Although accidents may occur, systems can be designed to be safer so that accidents are very rare.

The National Patient Safety Foundation has defined patient safety as the avoidance, prevention and amelioration of adverse outcomes or injuries stemming from the processes of health care.26 Safety does not reside in a person, device or department, but emerges from the interactions of components of a system. Others have specifically examined pharmaceutical safety and defined it to include maximizing therapeutic benefit, reducing risk, and eliminating harm.27 That is, benefit relates to risk. Other experts have also defined safety as a relative concept. Brewer and Colditz suggest that the acceptability of an adverse event depends on the seriousness of the underlying illness and the availability of alternative treatments.28

The committee's focus, however, was not on the patient's response to a treatment, but rather on the ability of a system to deliver care safely. From this perspective, the committee believes that there is a level of safety that can and should be ensured. Safety is relative only in that it continues to evolve over time and, when risks do become known, they become part of the safety requirements.

Safety is more than just the absence of errors. Safety has multiple dimensions, including the following:

• an outlook that recognizes that health care is complex and risky and that solutions are found in the broader systems context;

• a set of processes that identify, evaluate, and minimize hazards and are continuously improving; and

• an outcome that is manifested by fewer medical errors and minimized risk or hazard.29

For this report, safety is defined as freedom from accidental injury. This simple definition recognizes that from the patient's perspective, the primary safety goal is to prevent accidental injuries. If an environment is safe, the risk of accidents is lower. Making environments safer means looking at processes of care to reduce defects in the process or departures from the way things should have been done. Ensuring patient safety, therefore, involves the establishment of operational systems and processes that increase the reliability of patient care.

ARE SOME TYPES OF SYSTEMS MORE PRONE TO ACCIDENTS?

Accidents are more likely to happen in certain types of systems. When they do occur, they represent failures in the way systems are designed. The primary objective of systems design ought to be to make it difficult for accidents and errors to occur and to minimize damage if they do occur.30

Perrow characterizes systems according to two important dimensions: complexity and tight or loose coupling.31 Systems that are more complex and tightly coupled are more prone to accidents and have to be made more reliable.32 In Reason's words, complex and tightly coupled systems can "spring nasty surprises."33

In complex systems, one component of the system can interact with multiple other components, sometimes in unexpected or invisible ways. Although all systems have many parts that interact, the problem arises when one part serves multiple functions because if this part fails, all of the dependent functions fail as well. Complex systems are characterized by specialization and interdependency. Complex systems also tend to have multiple feedback loops and to receive information indirectly, and because of specialization, there is little chance of substituting or reassigning personnel or other resources.

In contrast to complex systems, linear systems contain interactions that are expected in the usual and familiar production sequence. One component of the system interacts with the component immediately preceding it in the production process and the component following it. Linear systems tend to have segregated subsystems, few feedback loops, and easy substitutions (less specialization).

An example of complexity is the concern with year 2000 (Y2K) computer problems. A failure in one part of the system can unexpectedly interrupt other parts, and all of the interrelated processes that can be affected are not yet visible. Complexity is also the reason that changes in long-standing production processes must be made cautiously.34 When tasks are distributed across a team, for example, many interactions that are critical to the process may not be noticed until they are changed or removed.

Coupling is a mechanical term meaning that there is no slack or buffer between two items. Large systems that are tightly coupled have more time-dependent processes and sequences that are more fixed (e.g., y depends on x having been done). There is often only one way to reach a goal. Compared to tightly coupled systems, loosely coupled systems can tolerate processing delays, can reorder the sequence of production, and can employ alternative methods or resources.

All systems have linear interactions; however, some systems additionally experience greater complexity. Complex interactions contribute to accidents because they can confuse operators. Tight coupling contributes to accidents because things unravel too quickly and prevent errors from being intercepted or prevent speedy recovery from an event.35 Because of complexity and coupling, small failures can grow into large accidents.

In the case study, the medication administration system was both complex and tightly coupled. The complexity arises from three devices functioning simultaneously, in close proximity, and two having problems at the same time. The tight coupling arises from the steps involved in making the system work properly, from the steps required to assemble three devices, to the calculation of correct medication dosage levels, to the operation of multiple devices during surgery, to the responses when alarms start going off.

Although there are not firm assignments, Perrow considered nuclear power plants, nuclear weapons handling, and aircraft to be complex, tightly coupled systems.36 Multiple processes are happening simultaneously, and failure in one area can interrupt another. Dams and rail transportation are considered tightly coupled because the steps in production are closely linked, but linear because there are few unexpected interactions. Universities are considered complex, but loosely coupled, since the impact of a decision in one area can likely be limited to that area.

Perrow did not classify health care as a system, but others have suggested that health care is complex and tightly coupled.37 The activities in the typical emergency room, surgical suite, or intensive care unit exemplify complex and tightly coupled systems. Therefore, the delivery of health care services may be classified as an industry prone to accidents.38

Complex, tightly coupled systems have to be made more reliable.39 One of the advantages of having systems is that it is possible to build in more defenses against failure. Systems that are more complex and tightly coupled, and therefore more prone to accidents, can reduce the likelihood of accidents by simplifying and standardizing processes, building in redundancy, developing backup systems, and so forth.

Another aspect of making systems more reliable has to do with organizational design and team performance. Since these are part of activities within organizations, they are discussed in Chapter 8.

Conditions That Create Errors

Factors can intervene between the design of a system and the production process that create conditions in which errors are more likely to happen. James Reason refers to these factors as psychological precursors or preconditions.40 Although good managerial decisions are required for safe and efficient production, they are not sufficient. There is also a need to have the right equipment, well maintained and reliable; a skilled and knowledgeable workforce; reasonable work schedules; well-designed jobs; clear guidance on desired and undesired performance; et cetera. Factors such as these are the precursors or preconditions for safe production processes.

Any given precondition can contribute to a large number of unsafe acts. For example, training deficiencies can show up as high workload, undue time pressure, inappropriate perception of hazards, or motivational difficulties.41 Preconditions are latent failures embedded in the system.

Designing safe systems means taking into account people's psychological limits and either seeking ways to eliminate the preconditions or intervening to minimize their consequences. Job design, equipment selection and use, operational procedures, work schedules, and so forth, are all factors in the production process that can be designed for safety.

One specific type of precondition that receives a lot of attention is technology. The occurrence of human error creates the perception that humans are unreliable and inefficient. One response to this has been to find the unreliable person who committed the error and focus on preventing him or her from doing it again. Another response has been to increase the use of technology to automate processes so as to remove opportunities for humans to make errors. The growth of technology over the past several decades has contributed to system complexity, so this particular issue is highlighted here.

Technology changes the tasks that people do by shifting the workload and eliminating human decision making.42 Where a worker previously may have overseen an entire production process, he or she may now intervene only in the last few steps if the previous steps are automated. For example, flying an aircraft has become more automated, which has helped reduce workload during nonpeak periods. During peak times, such as take-off and landing, there may be more processes to monitor and information to interpret. Furthermore, the operator must still do things that cannot be automated. This usually involves having to monitor automated systems for rare, abnormal events,43 because machines cannot deal with infrequent events in a constantly changing environment.44 Fortunately, automated systems rarely fail. Unfortunately, this means that operators do not practice basic skills, so workers lose skills in exactly the activities they need in order to take over when something goes wrong.

Automation makes systems more "opaque" to people who manage, maintain, and operate them.45 Processes that are automated are less visible because machines intervene between the person and the task. For example, automation means that people have less hands-on contact with processes and are elevated to more supervisory and planning tasks. Direct information is filtered through a machine (e.g., a computer), and operators run the risk of having too much information to interpret or of not getting the right information.

In the case study, the infusion device administered the medication and the professional monitored the process, intervening when problems arose. The medication administration process was "opaque" in that the device provided no feedback to the user when the medication flowed freely and minimal feedback when the medication flow was blocked.

One of the advantages of technology is that it can enhance human performance to the extent that the human plus technology is more powerful than either is alone.46 Good machines can question the actions of operators, offer advice, and examine a range of alternative possibilities that humans cannot possibly remember. In medicine, automated order entry systems or decision support systems have this aim. However, technology can also create new demands on operators. For example, a new piece of equipment may provide more precise measurements, but also demand better precision from the operator for the equipment to work properly.47 Devices that have not been standardized, or that work and look differently, increase the likelihood of operator errors. Equipment may not be designed using human factors principles to account for the human–machine interface.48

In the case study, safer systems could have been designed by taking into consideration characteristics of how people use machines and interact with each other in teams. For example:

• Redesign the devices to default to a safe mode (a minimal sketch of what such logic might look like follows this discussion)

• Reduce the difficulties of using multiple devices simultaneously

• Minimize the variety of equipment models purchased

• Implement clear procedures for checking equipment, supplies, etc., prior to beginning surgery

• Orient and train new staff with the team(s) with which they will work

• Provide a supportive environment for identifying and communicating about errors for organizational learning and change to prevent errors.

Technology also has to be recognized as a "member" of the work team. When technology shifts workloads, it also shifts the interactions between team members. Where processes may have been monitored by several people, technology can permit the task to be accomplished by fewer people. This affects the distributed nature of the job in which tasks are shared among several people and may influence the ability to discover and recover from errors.49
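
As a purely illustrative sketch of the first suggestion in the list above (defaulting to a safe mode), the code below models a hypothetical infusion controller that refuses to deliver anything until the tubing is positively confirmed as loaded and a rate has been programmed, treating every unconfirmed state as "stopped." The class, its states, and its checks are invented for illustration; they do not describe any real device or the device in the case.

    from enum import Enum, auto

    class PumpState(Enum):
        STOPPED = auto()     # the default: no flow
        READY = auto()       # tubing confirmed loaded and a rate programmed
        INFUSING = auto()

    class SafeInfusionController:
        """Hypothetical controller that defaults to a safe (no-flow) mode."""

        def __init__(self):
            self.state = PumpState.STOPPED
            self.rate_ml_per_hr = None
            self.tubing_confirmed = False

        def load_tubing(self, confirmed_by_sensor: bool):
            # Free flow is prevented by refusing to leave STOPPED unless the
            # tubing is positively confirmed as correctly seated.
            self.tubing_confirmed = confirmed_by_sensor
            self.state = PumpState.READY if confirmed_by_sensor and self.rate_ml_per_hr else PumpState.STOPPED

        def program_rate(self, ml_per_hr: float):
            if ml_per_hr <= 0:
                raise ValueError("rate must be positive")
            self.rate_ml_per_hr = ml_per_hr
            self.state = PumpState.READY if self.tubing_confirmed else PumpState.STOPPED

        def start(self):
            # Anything less than a fully confirmed setup keeps the device stopped.
            if self.state is not PumpState.READY:
                raise RuntimeError("refusing to start: setup not confirmed (safe default)")
            self.state = PumpState.INFUSING

    pump = SafeInfusionController()
    pump.program_rate(12.0)
    pump.load_tubing(confirmed_by_sensor=True)
    pump.start()
    print(pump.state)  # PumpState.INFUSING only after every check has passed

The point of the sketch is the design choice, not the code itself: the device, rather than the user, carries the burden of proving that conditions are safe before any fluid can move.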

63 WHY DO ERRORS HAPPEN? several people and may influence the ability to discover and recover from errors.49 In this context, technology does not involve just computers and infor- mation technology. It includes “techniques, drugs, equipment and proce- dures used by health care professionals in delivering medical care to indi- viduals and the systems within which such care is delivered.”50 Additionally, the use of the term technology is not restricted to the technology employed by health care professionals. It can also include people at home of different ages, visual abilities, languages, and so forth, who must use different kinds of medical equipment and devices. As more care shifts to ambulatory and home settings, the use of medical technology by non-health professionals can be expected to take on increasing importance. RESEARCH ON HUMAN FACTORS Research in the area of human factors is just beginning to be applied to health care. It borrows from the disciplines of industrial engineering and psychology. Human factors is defined as the study of the interrelationships between humans, the tools they use, and the environment in which they live and work.51 In the context of this report, a human factors approach is used to under- stand where and why systems or processes break down. This approach ex- amines the process of error, looking at the causes, circumstances, condi- tions, associated procedures and devices and other factors connected with the event. Studying human performance can result in the creation of safer systems and the reduction of conditions that lead to errors. However, not all errors are related to human factors. Although equipment and materials should take into account the design of the way people use them, human factors may not resolve instances of equipment breakdown or material failure. Much of the work in human factors is on improving the human–system interface by designing better systems and processes.52 This might include, for example, simplifying and standardizing procedures, building in redun- dancy to provide backup and opportunities for recovery, improving com- munications and coordination within teams, or redesigning equipment to improve the human–machine interface. Two approaches have typically been used in human factors analysis. The first is critical incident analysis. Critical incident analysis examines a signifi- cant or pivotal occurrence to understand where the system broke down,

Analyzing critical incidents, whether or not the event actually leads to a bad outcome, provides an understanding of the conditions that produced an actual error or the risk of error and contributing factors.

In the case study, researchers with expertise in human factors could have helped the team investigate the problem. They could examine how the device performed under different circumstances (e.g., what the alarms and displays did when the medication flow changed), varying the set-up and operation of the infusion device to observe how it performed under normal and abnormal conditions. They could observe how the staff used the particular infusion device during surgery and how they interacted with the use of multiple infusion devices.

A critical incident analysis in anesthesia found that human error was involved in 82 percent of preventable incidents. The study identified the most frequent categories of error and the riskiest steps in the process of administering anesthesia. Recommended corrective actions included such things as labeling and packaging strategies to highlight differences among anesthesiologists in the way they prepared their workspace, training issues for residents, work-rest cycles, how relief and replacement processes could be improved, and equipment improvements (e.g., standardizing equipment in terms of the shape of knobs and the direction in which they turn).

Another analytic approach is referred to as "naturalistic decision making."54 This approach examines the way people make decisions in their natural work settings. It considers all of the factors that are typically controlled for in a laboratory-type evaluation, such as time pressure, noise and other distractions, insufficient information, and competing goals. In this method, the researcher goes out with workers in various fields, such as firefighters or nurses, observes them in practice, and then walks them through to reconstruct various incidents. The analysis uncovers the factors weighed and the processes used in making decisions when faced with ambiguous information under time pressure.

In terms of applying human factors research, David Woods of Ohio State University describes a process of reporting, investigation, innovation, and dissemination (David Woods, personal communication, December 17, 1998). Reporting or other means of identifying errors tells people where errors are occurring and where improvements can be made.

The investigation stage uses human factors and other analyses to determine the contributing factors and circumstances that created the conditions in which errors could occur. The design of safer systems provides opportunities for innovation and working with early adopters to test out new approaches. Finally, dissemination of innovation throughout the industry shifts the baseline for performance. The experience of the early adopters redefines what is possible and provides models for implementation.

Aviation has long analyzed the role of human factors in performance. The Ames Research Center (part of the National Aeronautics and Space Administration) has examined areas related to information technology, automation, and the use of simulators for training in basic and crisis skills, for example. Other recent projects include detecting and correcting errors in flight; interruptions, distractions and lapses of attention in the cockpit; and designing information displays to assist pilots in maintaining awareness of their situation during flight.55

SUMMARY

The following key points can be summarized from this chapter.

1. Some systems are more prone to accidents than others because of the way the components are tied together. Health care services is a complex and technological industry prone to accidents.

2. Much can be done to make systems more reliable and safe. When large systems fail, it is due to multiple faults that occur together.

3. One of the greatest contributors to accidents in any industry, including health care, is human error. However, saying that an accident is due to human error is not the same as assigning blame because most human errors are induced by system failures. Humans commit errors for a variety of known and complicated reasons.

4. Latent errors or system failures pose the greatest threat to safety in a complex system because they lead to operator errors. They are failures built into the system and present long before the active error. Latent errors are difficult for the people working in the system to see since they may be hidden in computers or layers of management, and people become accustomed to working around the problem.

5. Current responses to errors tend to focus on the active errors. Although this may sometimes be appropriate, in many cases it is not an effective way to make systems safer. If latent failures remain unaddressed, their accumulation actually makes the system more prone to future failure. Discovering and fixing latent failures and decreasing their duration are likely to have a greater effect on building safer systems than efforts to minimize active errors at the point at which they occur.

6. The application of human factors in other industries has successfully reduced errors. Health care has to look at medical error not as a special case of medicine, but as a special case of error, and to apply the theory and approaches already used in other fields to reduce errors and improve reliability.56

REFERENCES

1. Senders, John, "Medical Devices, Medical Errors and Medical Accidents," in Human Error in Medicine, ed. Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

2. Cook, Richard; Woods, David; Miller, Charlotte, A Tale of Two Stories: Contrasting Views of Patient Safety, Chicago: National Patient Safety Foundation, 1998.

3. Cook, Richard and Woods, David, "Operating at the Sharp End: The Complexity of Human Error," in Human Error in Medicine, ed. Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

4. Perrow, Charles, Normal Accidents, New York: Basic Books, 1984.

5. Reason, James, Human Error, Cambridge: Cambridge University Press, 1990.

6. Perrow, 1984; Cook and Woods, 1994.

7. Gaba, David M.; Maxwell, Margaret; DeAnda, Abe, Jr. Anesthetic Mishaps: Breaking the Chain of Accident Evolution. Anesthesiology. 66(5):670–676, 1987.

8. Perrow, 1984.

9. Van Cott, Harold, "Human Errors: Their Causes and Reductions," in Human Error in Medicine, ed. Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994. Also, Roberts, Karlene, "Organizational Change and A Culture of Safety," in Proceedings of Enhancing Patient Safety and Reducing Errors in Health Care, Chicago: National Patient Safety Foundation at the AMA, 1999.

10. Reason, 1990. See also Cook, Woods and Miller, 1998.

11. Norman, Donald, Things That Make Us Smart: Defending Human Attributes in the Age of Machines, Menlo Park, CA: Addison-Wesley Publishing Co., 1993.

12. Cooper, Jeffrey B.; Newbower, Ronald; Long, Charlene, et al. Preventable Anesthesia Mishaps: A Study of Human Factors. Anesthesiology. 49(6):399–406, 1978.

13. Cooper, Jeffrey B. and Gaba, David M. A Strategy for Preventing Anesthesia Accidents. International Anesthesia Clinics. 27(3):148–152, 1989.

14. Reason, 1990.

15. Chassin, Mark R.; Galvin, Robert W., and the National Roundtable on Health Care Quality. The Urgent Need to Improve Health Care Quality. JAMA. 280(11):1000–1005, 1998.

16. Reason, 1990.

17. Cook, Woods and Miller, 1998.

18. Reason, 1990.

19. Vaughan, Diane, The Challenger Launch Decision, Chicago: The University of Chicago Press, 1996.

20. Reason, 1990.

21. Reason, 1990.

22. Roberts, Karlene, 1999. See also: Gaba, David, "Risk, Regulation, Litigation and Organizational Issues in Safety in High-Hazard Industries," position paper for Workshop on Organizational Analysis in High Hazard Production Systems: An Academy/Industry Dialogue, MIT Endicott House, April 15–18, 1997, NSF Grant No. 9510883-SBR.

23. Sagan, Scott D., The Limits of Safety, Princeton, NJ: Princeton University Press, 1993.

24. Sagan, Scott D., 1993 and Roberts, Karlene, 1999.

25. Reason, James, "Foreword," in Human Error in Medicine, ed. Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

26. "Agenda for Research and Development in Patient Safety," National Patient Safety Foundation at the AMA, http://www.ama-assn.org/med-sci/npsf/research/research.htm. May 24, 1999.

27. Dye, Kevin M.C.; Post, Diana; Vogt, Eleanor, "Developing a Consensus on the Accountability and Responsibility for the Safe Use of Pharmaceuticals," Preliminary White Paper prepared for the National Patient Safety Foundation, June 1, 1999.

28. Brewer, Timothy; Colditz, Graham A. Postmarketing Surveillance and Adverse Drug Reactions: Current Perspectives and Future Needs. JAMA. 281(9):824–829, 1999.

29. VHA's Patient Safety Improvement Initiative, presentation to the National Health Policy Forum by Kenneth W. Kizer, Under Secretary for Health, Department of Veterans Affairs, May 14, 1999, Washington, D.C.

30. Leape, Lucian L. Error in Medicine. JAMA. 272(23):1851–1857, 1994.

31. Perrow, 1984.

32. Cook and Woods, 1994.

33. Reason, 1990.

34. Norman, 1993.

35. Perrow, 1984.

36. Perrow, 1984.

37. Cook, Woods and Miller, 1998.

38. On the other hand, in some places, the health system may be complex, but loosely coupled. For example, during an emergency, a patient may receive services from a loosely networked set of subsystems, from the ambulance to the emergency room to the outpatient clinic to home care. See Van Cott in Bogner, 1994.

39. Cook and Woods, 1994.

40. Reason, 1990.

41. Reason, 1990.

42. Cook and Woods, 1994.

43. Reason, 1990.

44. Van Cott, 1994.

45. Reason, 1990.

46. Norman, 1993.

47. Cook and Woods, 1994.

48. Van Cott, 1994.

49. Norman, 1993.

50. Institute of Medicine, Assessing Medical Technologies, Washington, D.C.: National Academy Press, 1985.

51. Weinger, Matthew B.; Pantiskas, Carl; Wiklund, Michael; Carstensen, Peter. Incorporating Human Factors Into the Design of Medical Devices. JAMA. 280(17):1484, 1998.

52. Reason, 1990. Leape, 1994.

53. Cooper, Newbower, Long, et al., 1978.

54. Klein, Gary, Sources of Power: How People Make Decisions, Cambridge, MA: The MIT Press, 1998.

55. "Current Projects," Human Factors Research and Technology Division, Ames Research Center, NASA, http://human-factors.arc.nasa.gov/frameset.html

56. Senders, 1994.