
8
Creating Safety Systems in Health Care Organizations

Unsafe acts are like mosquitoes. You can try to swat them one at a time, but there will always be others to take their place. The only effective remedy is to drain the swamps in which they breed. In the case of errors and violations, the "swamps" are equipment designs that promote operator error, bad communications, high workloads, budgetary and commercial pressures, procedures that necessitate their violation in order to get the job done, inadequate organization, missing barriers, and safeguards . . . the list is potentially long but all of these latent factors are, in theory, detectable and correctable before a mishap occurs.1

Safety systems in health care organizations seek to prevent harm to patients, their families and friends, health care professionals, contract-service workers, volunteers, and the many other individuals whose activities bring them into a health care setting. Safety is one aspect of quality, where quality includes not only avoiding preventable harm, but also making appropriate care available—providing effective services to those who could benefit from them and not providing ineffective or harmful services.2

As defined in Chapter 3, patient safety is freedom from accidental injury. This definition and this report intentionally view safety from the perspective of the patient. Accordingly, this chapter focuses specifically on patient safety. The committee believes, however, that a safer environment for patients would also be a safer environment for workers, and vice versa, because both are tied to many of the same underlying cultural and systemic issues. As cases in point, hazards to health care workers resulting from lapses in infection control, fatigue, or faulty equipment may cause injury not only to workers but also to others in the institution.

This chapter introduces what has been learned from other high-risk industries about improving safety. It then discusses key concepts for designing systems and their application in health care. This is followed by a discussion of five principles to guide health care organizations in designing and implementing patient safety programs. Lastly, the chapter discusses a critical area of safety, namely medication safety, and illustrates the principles with strategies that health care organizations can use to improve medication safety.

RECOMMENDATIONS

The committee is convinced that there are numerous actions, based on both good evidence and principles of safe design, that health care organizations can take now or as soon as possible to substantially improve patient safety. Specifically, the committee makes two overarching recommendations: the first concerns leadership and the creation of safety systems in health care settings; the second concerns the implementation of known medication safety practices.

RECOMMENDATION 8.1 Health care organizations and the professionals affiliated with them should make continually improved patient safety a declared and serious aim by establishing patient safety programs with a defined executive responsibility. Patient safety programs should: (1) provide strong, clear, and visible attention to safety; (2) implement nonpunitive systems for reporting and analyzing errors within their organizations; (3) incorporate well-understood safety principles, such as standardizing and simplifying equipment, supplies, and processes; and (4) establish interdisciplinary team training programs, such as simulation, that incorporate proven methods of team management.
Chief executive officers and boards of trustees must make a serious and ongoing commitment to creating safe systems of care. Other high-risk industries have found that improvements in safety do not occur unless there is commitment by top management and an overt, clearly defined, and continuing effort on the part of all personnel and managers. Like any other program, a meaningful safety program should include senior-level leadership, defined program objectives, plans, personnel, and budget, and should be monitored by regular progress reports to the executive committee and board of directors.

According to Cook,3 "Safety is a characteristic of systems and not of their components. Safety is an emergent property of systems." In order for this property to arise, health care organizations must develop a systems orientation to patient safety, rather than an orientation that finds and attaches blame to individuals. It would be hard to overestimate the underlying, critical importance of developing such a culture of safety to any efforts that are made to reduce error.

The most important barrier to improving patient safety is lack of awareness of the extent to which errors occur daily in all health care settings and organizations. This lack of awareness exists because the vast majority of errors are not reported, and they are not reported because personnel fear they will be punished. Health care organizations should establish nonpunitive environments and systems for reporting errors and accidents within their organizations. Just as important, they should develop and maintain an ongoing process for the discovery, clarification, and incorporation of basic principles and innovations for safe design, and should use this knowledge to understand the reasons for hazardous conditions and ways to reduce these vulnerabilities. Accomplishing these tasks requires that health care organizations provide resources to monitor and evaluate errors and to implement methods to reduce them. Organizations should incorporate well-known design principles in their work environment. For example, standardization and simplification are two fundamental human factors principles that are widely used in safe industries and widely ignored in health care.
They should also establish interdisciplinary team training programs, including the use of simulation, for trainees and experienced practitioners in areas such as the emergency department, intensive care unit, and operating room, and should incorporate proven methods of managing work in teams as exemplified in aviation (where such training is known as crew resource management).

RECOMMENDATION 8.2 Health care organizations should implement proven medication safety practices.

A number of practices have been shown to reduce errors in the medication process and to exemplify known methods for improving safety. The committee believes they warrant strong consideration by health care organizations, including hospitals, long-term-care facilities, ambulatory settings, and other health care delivery sites, as well as outpatient and community pharmacies. These methods include: reducing reliance on memory; simplification; standardization; use of constraints and forcing functions; the wise use of protocols and checklists; decreasing reliance on vigilance, handoffs, and multiple data entry; and differentiating among products to eliminate look-alike and sound-alike products.

INTRODUCTION

Errors occur in all industries. Some industrial accidents involve one or a few workers. Others affect entire local populations or ecosystems. In health care, events are well publicized when they appear to be particularly egregious—for example, wrong-site surgery or the death of a patient during what is thought to be a routine, low-risk procedure. Generally, however, accidents are not well publicized; indeed, they may not be known even to the patient or to the family. Because the adverse effects may be separated in time or space from the occurrence, they may not even be recognized by the health care workers involved in the patient's care.

Nevertheless, we know that errors are ubiquitous in all health care settings.4 Harms range from high-visibility cases to those that are minimal but require additional treatment and time for the patient to recuperate, or that result in a patient's failure to receive the benefit of appropriate therapy. In aggregate, they represent a huge burden of harm and cost to the American people, as described in Chapter 2.

To date, however, those involved in health care management and delivery have not had specific, clear, high-level incentives to apply what has been learned in other industries about ways to prevent error and reduce harm. Consequently, the development of safety systems, broadly understood, has not been a serious and widely adopted priority within health care organizations.
This report calls on organizations and on individual practitioners to address patient safety.

Health care is composed of a large set of interacting systems—paramedic, emergency, ambulatory, inpatient, and home health care; testing and imaging laboratories; pharmacies; and so forth—that are connected in loosely coupled but intricate networks of individuals, teams, procedures, regulations, communications, equipment, and devices that function with diffused management in a variable and uncertain environment.5 Physicians in community practice may be so tenuously connected that they do not even view themselves as part of a system of care. They may see the hospitals in which they are attendings as platforms for their work. In these and many other ways, the distinct cultures of medicine (and other health professions) add to the idiosyncrasy of health care among high-risk industries.

Nevertheless, experience in other high-risk industries has provided well-understood illustrations that can be used in improving health care safety. Studies of actual accidents, incident-reporting systems, and research on human factors (i.e., the interface of human beings and machines and their performance in complex working environments) have contributed to our growing understanding of how to prevent, detect, and recover from accidents. This has occurred because, despite their differences from health care, all systems share common characteristics: the use of technologies, the users of these technologies, and an interface between the users and the technologies.6 The users of technology bring certain characteristics to a task, such as the quality of their knowledge and training, their level of fatigue, and their careful or careless habits. They also bring characteristics that are common to everyone, including difficulty recalling material and a tendency to make occasional errors.

Safety Systems in High-Risk Industries

The experience of three high-risk industries—chemical and material manufacturing and defense—provides examples of the information and systems that can contribute to improved safety, and of the safety achievements that are possible. Claims that health care is unique and therefore not susceptible to a transfer of learning from other industries are not supportable. Rather, the experiences of other industries provide invaluable insight into how to begin improving the safety of health care by learning how to prevent, detect, recover from, and learn from accidents.

E.I. du Pont de Nemours and Company

E.I. du Pont de Nemours and Company has one of the lowest rates of occupational injury of any company, substantiation of an 11-point safety philosophy that includes the tenets that all injuries are preventable; that management is responsible and accountable for preventing injury; that safety must be integrated as a core business and personal value; and that deficiencies must be corrected promptly. In 1994, Conoco Refining, a subsidiary, reported only 1.92 work-loss days per 200,000 hours of exposure. In 1998, this rate was further reduced to 0.39. Some of DuPont's plants with more than 2,000 employees have operated for more than 10 years without a lost-time injury, and one plant producing glycolic acid celebrated 50 years without a lost workday.7 DuPont credits its safety record, at least in part, to its implementation of a nonpunitive system that encourages employees to report near-miss incidents without fear of sanctions or disciplinary measures, and to its objective of creating an all-pervasive, ever-present awareness of the need to do things safely.8,9

Alcoa, Inc.

Another industry example is Alcoa, which is involved in mining, refining, smelting, fabricating, and recycling aluminum and other materials. Alcoa uses a worldwide on-line safety data system to track incidents, analyze their causes, and share preventive actions throughout all of its holdings. One of its principles is that all incidents, including illnesses, injuries, spills, and excursions, can be prevented, whether they are immediate, latent, or cumulative. Although Alcoa reduced its international lost-workday rate per 200,000 hours worked from 1.87 in 1987 to 0.42 in 1997, it has recently gone even further and announced a plan to eliminate fatalities and reduce the average injury rate by 50 percent by the end of the year 2000.10

Several aspects of these two examples are striking. In comparison to the health care industry, DuPont, Alcoa, and others systematically collect and analyze data about accidents. They have been tracking their own performance over time and are able to compare themselves to others in their industries. They are willing to publish their results, as information to which stockholders and employees are entitled and as a source of pride, and their efforts have achieved extremely low and continuously decreasing levels of injury.
The importance of a strong culture of safety, as nurtured by both DuPont and Alcoa, is viewed by many in the safety field as the most critical underlying feature of their accomplishments.

U.S. Navy: Aircraft Carriers

People are quick to point out that health care is very different from a manufacturing process, mostly because of the huge variability in patients and circumstances, the need to adapt processes quickly, the rapidly changing knowledge base, and the importance of highly trained professionals who must use expert judgment in dynamic settings. Though not a biological system, the performance of crews and flight personnel on aircraft carriers provides an example whose features are closer to those of health care environments than those of manufacturing.

On an aircraft carrier, fueling aircraft and loading munitions are examples of the risks posed when incompatible activities are performed in close proximity. On the flight deck, 100 to 200 people fuel, load munitions for, and maintain aircraft that take off and are recovered at 48- to 60-second intervals. Keeping these activities separate requires considerable organizational skill and extensive ongoing training to avoid serious injury to flight and nonflight personnel, the aircraft, and the ship. Despite extremely dangerous working conditions and restricted space, the Navy's "crunch rate" aboard aircraft carriers in 1989 was only 1 per 8,000 moves, which makes the carrier a very highly reliable, but complex, social organization.*

Students of accident theory emphasize how the interactive complexity of an organization using hazardous technologies seems to defy the efforts of system designers and operators to prevent accidents and ensure reliability. In part, this is because individuals are fallible, and in part because unlikely and rare (and thus unanticipated) failures in one area are linked in complex systems and may have surprising effects in other systems—the tighter the "coupling," generally, the more likely that failure in one part will affect the reliability of the whole system.
Nevertheless, even in such systems, great consistency is achievable through four strategies in particular: the prioritization of safety as a goal; high levels of redundancy; the development of a safety culture that involves continuous operational training; and high-level organizational learning.11

Weick and Roberts12 have studied peacetime flight operations on aircraft carriers as an example of organizational performance requiring nearly continuous operational reliability despite complex patterns of interrelated activities among many people. These activities cannot be fully mapped out beforehand because of changes in weather (e.g., wind direction and strength), sea conditions, time of day and visibility, returning aircraft arrivals, and so forth. Yet, surprisingly, generally mapped-out sequences can be carried out with very high reliability in novel situations through improvisation and adaptation, by personnel who are highly trained but not highly educated.

*A crunch occurs when two aircraft touch while being moved, either on the flight deck or the hangar deck, even if damage is averted.

Naval commanders stress the high priority of safety. They understand the importance of a safety culture and use redundancy (both technical and personnel) and continuous training to prepare for the unexpected. The Navy also understands the need for direct communication and adaptability. Because errors can arise from a lack of direct communication, the ship's control tower communicates directly with each division over multiple channels. As in health care, it is not possible in such dynamic settings to anticipate and write a rule for every circumstance. Once-rigid orders that prescribed how to perform each operation have been replaced by more flexible, less hierarchical methods. For example, although the captain's commands usually take precedence, junior officers can, and do, change these priorities when they believe that following an order will risk the crew's safety. Such an example demonstrates that even in technologically sophisticated, hazardous, and unpredictable environments it is possible to foster real-time problem solving and to institute safety systems that incorporate a knowledge of human factors.

In summary, efforts such as those described in the three examples have resulted neither in stifled innovation nor in loss of competitive benefit; nor have they resulted in unmanageable legal consequences. Rather, they are a source of corporate and employee pride. Characteristics that distinguish successful efforts in other industries include the ability to collect data on errors and incidents within the organization in order to identify opportunities for improvement and to track progress. The companies make these data available to outsiders.
Other notable features of these efforts include the importance of leadership and the development of a safety culture, the use of sophisticated methods for the analysis of complex processes, and a striving for balance between standardization, where appropriate, and the freedom of individuals to solve problems creatively.

KEY SAFETY DESIGN CONCEPTS

Designing safe systems requires an understanding of the sources of errors and of how to use safety design concepts to minimize these errors or to allow their detection before harm occurs. This field is described in greater detail in Chapter 3, which includes an error taxonomy first proposed by Rasmussen13 and elaborated by Reason14 to distinguish among errors arising from (1) skill-based slips and lapses; (2) rule-based errors; and (3) knowledge-based mistakes.

Leape has simplified this taxonomy to describe what he calls "the pathophysiology of error." He differentiates between the cognitive mechanisms used when people are engaging in well-known, oft-repeated processes and their cognitive processes when problem solving. The former are handled rapidly, effortlessly, in parallel with other tasks, and with little direct attention. Errors may occur because of interruptions, fatigue, time pressure, anger, anxiety, fear, or boredom. Errors of this sort are to be expected, but conditions of work can make them less likely. For example, work activities should not rely on weak aspects of human cognition such as short-term memory. Safe design, therefore, avoids reliance on memory.

Problem-solving processes, by contrast, are slower, are done sequentially (rather than in parallel with other tasks), are perceived as more difficult, and require conscious attention. Errors are due to misinterpretation of the problem that must be solved, lack of knowledge to bring to bear, and habits of thought that cause us to see what we expect to see. Attention to safe design therefore includes simplification of processes so that users who are unfamiliar with them can understand quickly how to proceed, training that simulates problems, and practice in recovering from these problems.

As described in Chapter 3, instances of patient harm are usually attributed to individuals "at the sharp end" who make the visible error. Preventing such harm, however, requires systems that are designed for safety—that is, systems in which the sources of human error have been systematically recognized and minimized.15,16

In recent years, students of system design have looked for ways to avoid error using what Donald Norman17 has called "user-centered design." This chapter draws on six strategies that Norman outlines. They are directed at the design of individual devices so that they can be used reliably and safely for their intended purposes.
Although these strategies are aimed at the human–machine interface, they can also be usefully applied to processes of care.

The first strategy is to make things visible—including the conceptual model of the system—so that the user can determine what actions are possible at any moment: for example, how to turn off a piece of equipment, how to change settings, and what is likely to happen if a step in a process is skipped.

The second strategy is to simplify the structure of tasks so as to minimize the load on working memory, planning, or problem solving.

A third strategy is what Norman calls the use of affordances and natural mappings. An affordance is a characteristic of equipment or workspace that communicates how it is to be used, such as a push bar on an outward-opening door that indicates where to push. Another example is a telephone handset that is uncomfortable to hold in any position but the correct one. Natural mapping refers to the relationship between a control and its movement; for example, in steering a car to the right, one turns the wheel right. Natural mapping takes advantage of physical analogies and cultural knowledge to help users understand how to control devices. Other examples of natural mapping are arranging light switches in the same pattern as the lights in a lecture room; arranging knobs to match the arrangement of burners on a stove; or using louder sound, an increasingly brighter indicator light, or a wedge shape to indicate a greater amount.

A fourth important strategy is the use of constraints or "forcing functions" to guide the user to the next appropriate action or decision. A constraint makes it hard to do the wrong thing; a forcing function makes it impossible. A classic example of a forcing function is that one cannot start a car that is in gear.

Norman's fifth strategy is to assume that errors will occur and to design and plan for recovery by making it easy to reverse operations and hard to carry out nonreversible ones. An example is the Windows computer operating system, which asks if the user really intends to delete a file and, if so, puts it in a "recycle" folder so that it can still be retrieved.

Finally, Norman advises that if applying the earlier strategies does not achieve the desired results, designers should standardize actions, outcomes, layouts, and displays. An example of standardization is the use of protocols for chemotherapy. An example of simplification is reducing the number of dose strengths of morphine kept in stock.

Safety systems can be both local and organization-wide. Local systems are implemented at the level of a small work group—a department, a unit, or a team of health care practitioners. Such local safety systems should be supported by, and consistent with, organization-wide safety systems.
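The fourth and fifth strategies lend themselves to a concrete software illustration. The following Python sketch is not from the report; the classes, names, and rules are invented purely to show the difference between a forcing function (the unsafe action is made impossible) and design-for-recovery (a destructive action is made reversible):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of two of Norman's strategies; not from the report.

@dataclass
class Car:
    """Forcing function: the wrong action is rejected outright."""
    in_gear: bool = True

    def start_engine(self) -> str:
        if self.in_gear:
            # The classic example: a car in gear cannot be started at all.
            raise RuntimeError("shift to park or neutral before starting")
        return "engine started"

@dataclass
class FileStore:
    """Design for recovery: 'delete' is reversible by construction."""
    files: dict = field(default_factory=dict)
    trash: dict = field(default_factory=dict)

    def delete(self, name: str) -> None:
        # Deleting only moves the file to a recycle area...
        self.trash[name] = self.files.pop(name)

    def restore(self, name: str) -> None:
        # ...so the operation can be undone cheaply.
        self.files[name] = self.trash.pop(name)
```

The division of labor is the point of the sketch: the forcing function refuses to let the unsafe state proceed, while the recoverable delete assumes an error will eventually happen and keeps the undo path cheap.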
Anesthesiology is an example of a local, but complex, high-risk, dynamic patient care system in which error has been notably reduced. Responding to rising malpractice premiums in the mid-1980s, anesthesiologists confronted the safety issues presented by the need for continuing vigilance during long operations, punctuated by the need for rapid problem evaluation and action. They were faced with heterogeneity of design in anesthesia devices; fatigue and sleep deprivation; and competing institutional, professional, and patient care priorities. By a combination of technological advances (most notably the pulse oximeter), standardization of equipment, and changes in training, they were able to bring about major, sustained, widespread reductions in morbidity and mortality attributable to the administration of anesthesia.18

Organization-wide systems, on the other hand, are implemented and monitored at the level of a health care organization. These include programs and processes that cross departmental lines and units. In hospitals, infection control and medication administration are examples of organization-wide systems that encompass externally imposed regulations, institutional policies and procedures, and the actions of individuals who must provide potentially toxic materials at the right time to the right patient.

PRINCIPLES FOR THE DESIGN OF SAFETY SYSTEMS IN HEALTH CARE ORGANIZATIONS

Hospitals and other institutions have long-standing efforts to ensure patient safety in a variety of areas. Appendix E provides an overview of some of these efforts in hospitals. Some have been very effective in certain units or certain hospitals. These activities have not, however, succeeded in eliminating error or injury, and they have not been part of national or even institution-wide, high-priority efforts. In out-of-hospital care—whether in institutions, homes, medical offices, or other settings—both knowledge of the kind and magnitude of errors and the development of safety systems are rudimentary compared with hospital care. Safety tends to be addressed narrowly through reliance on education and training, policies, and procedures.
There are undoubtedly many reasons for this lack of attention to safety, including: small staff size; lack of technical knowledge of effective ways to improve quality, or of an infrastructure to support deploying this knowledge; lack of recognition of error (because the harm is removed in time or space from the error and because individuals are unharmed); lack of data systems to track and learn from error (most studies of adverse drug events use emergency visits or hospital admissions to establish a denominator); the speed of change and the introduction of new technologies; and, clearly, the same cultural barriers that exist in hospitals—namely, the high premium placed on medical autonomy and perfection and a historical lack of interprofessional cooperation and effective communication.

With the rise in outpatient and office-based surgery, attention is turning to anesthesia safety in settings such as private physician, dental, and podiatry offices. For example, guidelines for patient assessment, sedation, monitoring, personnel, emergency care, discharge evaluation, maintenance of equipment, infection control, and the like have been developed by an ad hoc committee for New York State practitioners.19

…most powerful and useful medications in the therapeutic armamentarium. Examples are heparin, warfarin, insulin, lidocaine, magnesium, muscle relaxants, chemotherapeutic agents, potassium chloride (see below), dextrose injections, narcotics, adrenergic agents, theophylline, and immunoglobulin.65,66 Both to alert personnel to be especially careful and to ensure that dosing is appropriate, special protocols and processes should be used for these "high-alert" drugs. Such protocols might include written and computerized guidelines, checklists, preprinted orders, double checks, special packaging, and labeling.

Do Not Store Concentrated Potassium Chloride Solutions on Patient Care Units

Concentrated potassium chloride (KCl) is the most potentially lethal chemical used in medicine. It is widely used as an additive to intravenous solutions to replace potassium loss in critically ill patients. Each year, fatal accidents occur when concentrated KCl is injected because it is confused with another medication. Because KCl is never intentionally used undiluted, there is no need to stock the concentrated form on the patient care unit. Appropriately diluted solutions of KCl can be prepared by the pharmacy and stored on the unit for use. After enacting its sentinel event reporting system, JCAHO found that eight of ten incidents of patient death resulting from administration of KCl involved infusion of KCl that was available as a floor stock item.67 This has also been reported as a frequent cause of adverse events by the U.S. Pharmacopoeia (USP) Medication Errors Reporting Program.68

Ensure the Availability of Pharmaceutical Decision Support

Because of the immense variety and complexity of medications now available, it is impossible for nurses or doctors to keep up with all of the information required for safe medication use. The pharmacist has become an essential resource in modern hospital practice.
Thus, access to his or her expertise must be possible at all times.69,70 Health care organizations would greatly benefit from pharmaceutical decision support. When possible, medications should be dispensed by pharmacists or with the assistance of pharmacists. In addition, a substantial number of errors are made when nurses or other nonpharmacist personnel enter pharmacies during off hours to obtain drugs. Although small hospitals cannot afford and do not need to have a

pharmacist physically present at all times, all hospitals must have access to pharmaceutical decision support, and systems for dispensing medications should be designed and approved by pharmacists.

Include a Pharmacist During Rounds of Patient Care Units

As the major resource for drug information, pharmacists are much more valuable to the patient care team if they are physically present at the time decisions are being made and orders are being written. For example, in teaching hospitals, medical staff may conduct “rounds” with residents and other staff. Pharmacists should actively participate in this process and be present on the patient care unit when appropriate. Such participation is usually well received by nurses and doctors, and it has been shown to significantly reduce serious medication errors. Leape et al.71 measured the effect of pharmacist participation on medical rounds in the intensive care unit. They found that in one large, urban teaching hospital the rate of preventable adverse drug events related to prescribing decreased significantly—66 percent—from 10.4 per 1,000 patient-days before the intervention to 3.5 after the intervention; the rate in the control group was unchanged.

Make Relevant Patient Information Available at the Point of Patient Care

Many organizations have implemented ways to make information about patients available at the point of patient care, as well as ways to ensure that patients are correctly identified and treated. With medication administration, some inexpensive but useful strategies include the use of colored wristbands (or their equivalent) as a way to alert medical staff to medication allergies. Colored wristbands or their functional equivalent can alert personnel who encounter a patient anywhere in a hospital to check for an allergy before administering a medication.
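The allergy check described above can be sketched in software as a simple guard at the point of administration. This is a minimal, hypothetical illustration, not the design of any real system; the `Patient` type and `check_allergy` function are invented for the example, and a real record would carry structured allergy data rather than bare drug names.

```python
# Hypothetical sketch of an allergy guard applied before any medication
# is administered. All names here are illustrative.
from dataclasses import dataclass, field

@dataclass
class Patient:
    name: str
    # Recorded allergies, e.g., transcribed from the wristband or chart.
    allergies: set[str] = field(default_factory=set)

def check_allergy(patient: Patient, drug: str) -> bool:
    """Return True if it is safe to proceed (no recorded allergy to the drug)."""
    # Compare case-insensitively so "Penicillin" matches "penicillin".
    return drug.lower() not in {a.lower() for a in patient.allergies}

pt = Patient("Doe, J.", allergies={"penicillin"})
assert check_allergy(pt, "heparin") is True      # no recorded allergy: proceed
assert check_allergy(pt, "Penicillin") is False  # recorded allergy: stop and alert
```

The point of the sketch is that the check is cheap and unconditional: it runs for every administration, anywhere in the hospital, just as the wristband does.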
Using computer-generated medication administration records (MARs) can minimize transcription errors and legibility problems, as well as provide flow charts for patient care. Improper doses, mix-ups of drugs or patients, and inaccurate records are common causes of medication errors in daily hospital practice. Bar coding (or an electronic equivalent) is an effective remedy.72 It is a simple way to ensure that the identity and dose of the drug are as prescribed, that it is being given to the right patient, and that all of the steps in the dispensing and administration processes are checked for timeliness and accuracy. Bar

coding can be used not only by drug manufacturers, but also by hospitals to ensure that patients and their records match. The Colmery-O’Neil VA Medical Center in Topeka, Kansas, reports, for example, a 70 percent reduction in medication error rates between September 1995 and April 1998 by using a system that included bar coding of each dose, use of a hand-held laser bar code scanner, and a radio computer link.73

Improve Patients’ Knowledge About Their Treatment

A major unused resource in most hospitals, clinics, and practices is the patient. Not only do patients have a right to know the medications they are receiving, the reasons for them, their expected effects, and possible complications, they also should know what the pills or injections look like and how often they are to receive them. Patients should be involved in reviewing and confirming allergy information in their records.

Practitioners and staff in health care organizations should take steps to ensure that, whenever possible, patients know which medications they are receiving, the appearance of these medications, and their possible side effects.74 They should be encouraged to notify their doctors or staff of discrepancies in medication administration or the occurrence of side effects. If they are encouraged to take this responsibility, they can be a final “fail-safe” step. At the time of hospital discharge, patients should also be given both verbal and written information about the safe and effective use of their medications, in terms and in a language they can understand. Patient partnering is not a substitute for nursing responsibility to give the proper medication properly, or for physicians to inform their patients, but because no one is perfect, it provides an opportunity to intercept the rare but predictable error.
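The bar-code verification described earlier can be sketched as a comparison of two scans against the pharmacy order. This is a hedged, hypothetical sketch, not any vendor's actual system; the `Order` fields and the identifier formats (`MRN-...`, `DRUG-...`) are invented for illustration, and real systems also check timing and route of administration.

```python
# Hypothetical sketch of a bar-code medication administration check: scan
# the patient's wristband and the dose's bar code, then verify both (and
# the prepared dose) against the pharmacy order before administering.
from dataclasses import dataclass

@dataclass(frozen=True)
class Order:
    patient_id: str   # e.g., from the wristband bar code (format illustrative)
    drug_code: str    # e.g., from the unit-dose package bar code
    dose_mg: float

def verify_administration(order: Order, scanned_patient: str,
                          scanned_drug: str, prepared_dose_mg: float) -> list[str]:
    """Return a list of discrepancies; an empty list means proceed."""
    problems = []
    if scanned_patient != order.patient_id:
        problems.append("wrong patient")
    if scanned_drug != order.drug_code:
        problems.append("wrong drug")
    if abs(prepared_dose_mg - order.dose_mg) > 1e-9:
        problems.append("wrong dose")
    return problems

order = Order(patient_id="MRN-1001", drug_code="DRUG-123", dose_mg=5.0)
assert verify_administration(order, "MRN-1001", "DRUG-123", 5.0) == []
assert verify_administration(order, "MRN-1002", "DRUG-123", 5.0) == ["wrong patient"]
```

Because the scanner, not a human transcription step, supplies both identifiers, the mix-ups of drugs or patients described above are caught mechanically at the bedside.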
In addition to patients’ informing their health care practitioner about their current medications, allergies, and previous adverse drug experiences, the National Patient Safety Partnership has recommended that patients ask the following questions before accepting a newly prescribed medication:75

• Is this the drug my doctor (or other health care provider) ordered? What are the trade and generic names of the medication?
• What is the drug for? What is it supposed to do?
• How and when am I supposed to take it and for how long?
• What are the likely side effects? What do I do if they occur?

• Is this new medication safe to take with other over-the-counter or prescription medication or with dietary supplements that I am already taking? What food, drink, activities, dietary supplements, or other medication should be avoided while taking this medication?

SUMMARY

This chapter has proposed numerous actions, based on both good evidence and principles of safe design, that health care organizations could take now or as soon as possible to substantially improve patient safety. These principles include (1) providing leadership; (2) respecting human limits in process design; (3) promoting effective team functioning; (4) anticipating the unexpected; and (5) creating a learning environment.

The committee’s recommendations call for health care organizations and health care professionals to make continually improved patient safety a specific, declared, and serious aim by establishing patient safety programs with defined executive responsibility. The committee also calls for the immediate creation of safety systems that incorporate principles such as (1) standardizing and simplifying equipment, supplies, and processes; (2) establishing team training programs; and (3) implementing nonpunitive systems for reporting and analyzing errors and accidents within organizations. Finally, drawing on these principles and on strong evidence, the committee calls on health care organizations to implement proven medication safety practices.

REFERENCES

1. Reason, James T. Foreword to Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, p. xiv. Though ecologically unsound, the analogy is apt.
2. Chassin, Mark R.; Galvin, Robert W., and the National Roundtable on Health Care Quality. The Urgent Need to Improve Health Care Quality. Institute of Medicine National Roundtable on Health Care Quality. JAMA. 280:1000–1005, 1998.
3. Cook, Richard I.
Two Years Before the Mast: Learning How to Learn About Patient Safety. Invited presentation. “Enhancing Patient Safety and Reducing Errors in Health Care,” Rancho Mirage, CA, November 8–10, 1998.
4. Senders, John. “Medical Devices, Medical Errors and Medical Accidents,” in Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
5. Van Cott, Harold. “Human Errors: Their Causes and Reduction,” in Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
6. Van Cott, 1994.

7. MacCormack, George. Zeroing in on Safety Excellence—It’s Good Business. http://www.dupont.com/safety/esn97-1/zeroin.html 5/27/99.
8. DuPont Safety Resources. Safety Pays Big Dividends for Swiss Federal Railways. http://www.dupont.com/safety/ss/swissrail22.html 5/3/99.
9. From “Executive Safety News,” DuPont Safety Resources. http://www.dupont.com/safety/esn98-3.html 5/3/99.
10. Alcoa. Alcoa Environment, Health and Safety Annual Report. 1997.
11. Sagan, Scott D. The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton, NJ: Princeton University Press, 1993.
12. Weick, Karl E. and Roberts, Karlene H. Collective Mind in Organizations: Heedful Interrelating on Flight Decks. Administrative Science Quarterly. 38:357–381, 1993.
13. Rasmussen, Jens. Skills, Rules, Knowledge: Signals, Signs, and Symbols and Other Distinctions in Human Performance Models. IEEE Transactions: Systems, Man & Cybernetics (SMC-13): 257–267, 1983.
14. Reason, James. Human Error. New York: Cambridge University Press, 1990.
15. Moray, Neville. “Error Reduction as a Systems Problem,” in Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
16. Van Cott, 1994.
17. Norman, Donald A. The Design of Everyday Things. NY: Doubleday/Currency, 1988.
18. Gaba, David; Howard, Steven K., and Fish, Kevin J. Crisis Management in Anesthesiology. NY: Churchill-Livingstone, 1994.
19. Committee on Quality Assurance in Office-Based Surgery. A Report to New York State Public Health Council and New York State Department of Health, June, 1999.
20. Berwick, Donald M. “Taking Action to Improve Safety: How to Increase the Odds of Success,” Keynote Address, Second Annenberg Conference, Rancho Mirage, CA, November 8, 1998.
21. Haberstroh, Charles H. “Organization, Design and Systems Analysis,” in Handbook of Organizations, J.J. March, ed. Chicago: Rand McNally, 1965.
22.
Leape, Lucian L.; Kabcenell, Andrea; Berwick, Donald M., et al. Reducing Adverse Drug Events. Boston: Institute for Healthcare Improvement, 1998.
23. Institute of Medicine. Guidelines for Clinical Practice: From Development to Use. Marilyn J. Field and Kathleen N. Lohr, eds. Washington, D.C.: National Academy Press, 1992.
24. Leape, et al., 1998.
25. Leape, et al., 1998.
26. Leape, et al., 1998.
27. Leape, et al., 1998.
28. Leape, et al., 1998.
29. Hwang, Mi Y. JAMA Patient Page. Take Your Medications as Prescribed. JAMA. 282:298, 1999.
30. Blumenthal, David. The Future of Quality Measurement and Management in a Transforming Health Care System. JAMA. 278:1622–1625, 1997.
31. Sheridan, Thomas B. and Thompson, James M. “People Versus Computers in Medicine,” in Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

32. Hyman, William A. “Errors in the Use of Medical Equipment,” in Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
33. Senders, John W. “Medical Devices, Medical Errors, and Medical Accidents,” in Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
34. Cook, Richard I. Two Years Before the Mast: Learning How to Learn About Patient Safety. “Enhancing Patient Safety and Reducing Errors in Health Care,” Rancho Mirage, CA, November 8–10, 1998.
35. Leape, et al., 1998.
36. Helmreich, Robert L.; Chidester, Thomas R.; Foushee, H. Clayton, et al. How Effective Is Cockpit Resource Management Training? Flight Safety Digest. May:1–17, 1990.
37. Chopra, V.; Gesink, Birthe J.; deJong, Jan, et al. Does Training on an Anaesthesia Simulator Lead to Improvement in Performance? Br J Anaesthesia. 293–297, 1994.
38. Denson, James S. and Abrahamson, Stephen. A Computer-Controlled Patient Simulator. JAMA. 208:504–508, 1969.
39. Howard, Steven K.; Gaba, David D.; Fish, Kevin J., et al. Anesthesia Crisis Resource Management Training: Teaching Anesthesiologists to Handle Critical Incidents. Aviation, Space, and Environmental Medicine. 63:763–769, 1992.
40. Spence, Alastair A. The Expanding Role of Simulators in Risk Management. Br J Anaesthesia. 78:633–634, 1997.
41. Inaccurate Reporting of Simulated Critical Anaesthetic Incidents. Br J Anaesthesia. 78:637–641, 1997.
42. Helmreich, Robert L. and Davies, Jan M. Anaesthetic Simulation and Lessons to Be Learned from Aviation [Editorial]. Canadian Journal of Anaesthesia. 44:907–912, 1997.
43. Institute of Medicine. The Computer-Based Patient Record: An Essential Technology for Health Care. Revised Edition. Washington, DC: National Academy Press, 1997.
44. Tuggy, Michael L. Virtual Reality Flexible Sigmoidoscopy Simulator Training: Impact on Resident Performance.
J Am Board Fam Pract. 11:426–433, 1998.
45. Leape, Lucian L.; Woods, David D.; Hatlie, Martin J., et al. Promoting Patient Safety and Preventing Medical Error. JAMA. 280:1444–1447, 1998.
46. Leape, Lucian L. Error in Medicine. JAMA. 272:1851–1857, 1994.
47. Kizer, Kenneth W. VHA’s Patient Safety Improvement Initiative, presentation to the National Health Policy Forum, Washington, D.C., May 14, 1999.
48. Nolan, Thomas. Presentation, IHI Conference, Orlando, FL, December, 1998.
49. Zimmerman, Jack E.; Shortell, Stephen M., et al. Improving Intensive Care: Observations Based on Organizational Case Studies in Nine Intensive Care Units: A Prospective, Multicenter Study. Crit Care Med. 21:1443–1451, 1993.
50. Shortell, Stephen M.; Zimmerman, Jack E.; Gillies, Robin R., et al. Continuously Improving Patient Care: Practical Lessons and an Assessment Tool From the National ICU Study. QRB Qual Rev Bull. 18:150–155, 1992.
51. Manasse, Henri R. Jr. Toward Defining and Applying a Higher Standard of Quality for Medication Use in the United States. Am J Health System Pharm. 55:374–379, 1995.
52. Lesar, Timothy S.; Briceland, Laurie; and Stein, Daniel S. Factors Related to Errors in Medication Prescribing. JAMA. 277:312–317, 1997.

53. Avorn, Jerry. Putting Adverse Drug Events into Perspective. JAMA. 277:341–342, 1997.
54. Healthcare Leaders Urge Adoption of Methods to Reduce Adverse Drug Events. News Release. National Patient Safety Partnership, May 12, 1999.
55. Massachusetts Hospital Association (Massachusetts Coalition for the Prevention of Medical Errors). “MHA Best Practice Recommendations to Reduce Medication Errors,” Kirle, Leslie E.; Conway, James; Peto, Randolph, et al. http://www.mhalink.org/mcpme/recommend.htm.
56. Leape, et al., 1998.
57. Consensus Statement. American Society of Health-System Pharmacists. Top-Priority Actions for Preventing Adverse Drug Events in Hospitals. Recommendations of an Expert Panel. Am J Health System Pharm. 53:747–751, 1996.
58. Bates, David W.; Leape, Lucian L.; Cullen, David J., et al. Effect of Computerized Physician Order Entry and a Team Intervention on Prevention of Serious Medical Error. JAMA. 280:1311–1316, 1998.
59. Bates, 1998.
60. Evans, R. Scott; Pestotnik, Stanley L.; Classen, David C., et al. A Computer-Assisted Management Program for Antibiotics and Other Anti-Infective Agents. N Engl J Med. 338(4):232–238, 1997. See also: Schiff, Gordon D. and Rucker, T. Donald. Computerized Prescribing: Building the Electronic Infrastructure for Better Medication Usage. JAMA. 279:1024–1029, 1998.
61. Bates, David W.; Spell, Nathan; Cullen, David J., et al. The Costs of Adverse Drug Events in Hospitalized Patients. JAMA. 277:307–311, 1997.
62. Bates, David W.; O’Neil, Anne C.; Boyle, Deborah, et al. Potential Identifiability and Preventability of Adverse Events Using Information Systems. J American Informatics Assoc. 5:404–411, 1994.
63. Institute for Safe Medication Practices. Over-reliance on Pharmacy Computer Systems May Place Patients at Great Risk. http://www.ismp.org/ISMP/MSAarticles/Computer2.html 6/01/99.
64. Bates, David W.; Cullen, David J.; Laird, Nan, et al.
Incidence of Adverse Drug Events and Potential Adverse Drug Events. JAMA. 274:29–34, 1995.
65. Leape, Lucian L.; Kabcenell, Andrea; Berwick, Donald M., et al. Reducing Adverse Drug Events. Boston: Institute for Healthcare Improvement, 1998.
66. Cohen, Michael; Anderson, Richard W.; Attilio, Robert M., et al. Preventing Medication Errors in Cancer Chemotherapy. Am J Health System Pharm. 53:737–746, 1996.
67. Sentinel Event Alert. The Joint Commission on Accreditation of Healthcare Organizations, Oakbrook Terrace, IL: JCAHO, 1998.
68. Cohen, Michael. Important Error Prevention Advisory. Hosp Pharmacists. 32:489–491, 1997.
69. ASHP Guidelines on Preventing Medication Errors in Hospitals. Am J Hospital Pharmacists. 50:305–314, 1993.
70. Crawford, Stephanie Y. Systems Factors in the Reporting of Serious Medication Errors. Presentation at Annenberg Conference, Rancho Mirage, CA, November 8, 1998.
71. Leape, Lucian L.; Cullen, David J.; Clapp, Margaret D., et al. Pharmacist Participation on Physician Rounds and Adverse Drug Events in the Intensive Care Unit. JAMA. 282(3):267–270, 1999.

72. Top-Priority Actions for Preventing Adverse Drug Events in Hospitals. Recommendations of an Expert Panel. Am J Health System Pharm. 53:747–751, 1996.
73. Gebhart, Fred. VA Facility Slashes Drug Errors Via Bar-Coding. Drug Topics. 1:44, 1999.
74. Joint Commission on Accreditation of Healthcare Organizations. 1998 Hospital Accreditation Standards. Oakbrook Terrace, IL: Joint Commission, 1998.
75. Healthcare Leaders Urge Adoption of Methods to Reduce Adverse Drug Events. News Release. National Patient Safety Partnership, May 12, 1999.


Appendixes
