4

Opportunities to Build a Safer System for Health IT

Health IT supports a safety-critical system: its design, implementation, and use can either substantially improve the quality and safety of patient care or pose serious risks to patients. In any sociotechnical system, consideration of the interactions among people, processes, and technology forms the baseline for ensuring successful system performance. Evidence suggests that existing health IT products in actual use may not yet be consistently producing the anticipated benefits, and that health IT products, in some cases, can contribute to unintended risks of harm.

To improve safety, health IT needs to optimize the interactions among people, technology, and the rest of the sociotechnical system. Sociotechnical theory, as described in Chapter 3, advocates for direct involvement of end users in system design. It shifts the paradigm for software development from technical work done in isolation by software and systems engineers to an inclusive, iterative process that engages end users in the design, deployment, and integration of the software product into workflow, enhancing satisfaction and effectiveness.

Adhering to well-developed practices for design, training, and use can minimize safety risks. Building safer health IT involves exploring both real and potential hazards so that they can be minimized or eliminated. Health IT can be viewed as having two related but distinct life cycles: one relating to the design and development of health IT, the other to its implementation and use. Vendors and implementing organizations have specific roles in all phases of both life cycles and ought to coordinate their efforts to ensure safety. The size, complexity, and resources available to large and small clinician practices and health care organizations may affect their abilities to fully realize the benefits of health IT products intended to facilitate safer care. This chapter reflects, as much as possible, the literature, the experiences of key stakeholders, and the committee's expert opinion.



FEATURES OF SAFE HEALTH IT

Technology does not exist in isolation from its operator. As such, the design and use of health IT are interdependent. The design and development of products affect their safe performance and the extent to which clinician users will accept or reject the technology. To the end user, a safely functioning health IT product is one that includes

• Easy retrieval of accurate, timely, and reliable native and imported data;
• A system the user wants to interact with;
• Simple and intuitive data displays;
• Easy navigation;
• Evidence at the point of care to aid decision making;
• Enhancements to workflow, automating mundane tasks, and streamlining work, never increasing physical or cognitive workload;
• Easy transfer of information to and from other organizations and providers; and
• No unanticipated downtime.

Investment in health IT products aims to make care safer and improve health professionals' workflow without introducing harm or risk. Key features such as enhanced workflow, usability, balanced customization, and interoperability affect whether clinician users enjoy successful interactions with the product and achieve these aims. Effective design and development drive the safe functioning of products and determine some aspects of safe use by health professionals. Collaboration among users and vendors across the continuum of technology design, including embedding products into clinical workflow and ongoing product optimization, is a dynamic process characterized by frequent feedback and joint accountability to promote safer health IT. The combination of these activities can result in building safer systems for health IT, as summarized in Figure 4-1.

FIGURE 4-1 Interdependent activities for building a safer system for health IT. The figure shows health professionals, health care organizations, and vendors jointly contributing to three interdependent activity sets: features of health IT (workflow, usability, balanced customization, interoperability); design and development (software requirements and development, user interface design, testing, deployment, maintenance and upgrade); and implementation (planning and goal setting, deployment, stabilization, optimization, transformation), which together yield safer systems for health IT.

Safer Systems for Health IT Seamlessly Support Cognitive and Clinical Workflows

The cognitive work of clinicians is substantial. Clinicians must rapidly integrate large amounts of data to make decisions in unstable and complex settings.

The use of health IT is intended to aid in performing technical work that is also cognitive work, such as coordinating resources for a procedure, assembling patient data for action, or supporting a decision that requires knowledge of resource availability. However, creating a graphical representation of the information needed to support the complex processes clinicians use to collect and analyze data elements, consider alternative choices, and then make a definitive decision is challenging.

The introduction of health IT sometimes changes clinical workflows in unanticipated ways; these changes may be detrimental to patient safety. Although some templates may be very useful to providers, a rigid template for recording the "history of present illness," for example, may alter the conversation between physician and patient in such a way that important historical clues are not conveyed or received. An inflexible order sequence may require the provider to hold important orders in mind while navigating through mandatory screens, increasing the cognitive workload of communicating patient care orders and adding to the possibility that intended orders are forgotten.

A time-consuming process for locating laboratory or radiographic data presents a barrier to retrieval. In addition, time spent on cumbersome data retrieval and data remodeling is time taken away from other clinical demands, requiring shortcuts in other aspects of care. Evaluating the impact of introducing health IT on the cognitive workload of clinicians is important for determining unintended consequences and the potential for distraction, delays in care, and increased workload in general.

The period of greatest threat to safety is during initial implementation, when workflow is new, a steep learning curve threatens previous practice, and nonperformance of any aspect of a technology causes the user to seek immediate alternate pathways to achieve a particular functionality, a practice called a workaround. Conversely, users of mature health IT products are at risk of habituation and overreliance on the technology and must remain vigilant to alerts and other notifications so that safety features are not ignored. When use of health IT impedes workflow, there must be a way to identify not only the faulty process that results but also any potential increase in workload for clinicians.

Workarounds, common in health IT environments, are often a symptom of suboptimal design. When workarounds circumvent built-in safety features of a product, patient safety may be compromised. Integrating health IT within real-world clinical workflows requires attention to in situ use to ensure appropriate use of safety features (Koppel et al., 2008).

For example, coping mechanisms such as "paste forward" (or "copy forward"), the practice of copying portions of previously entered documentation and reusing the text in a new note, may be understood as compensatory survival strategies where the electronic environment does not support an efficient clinician workflow. However, this function may encourage staff to repeat an earlier evaluation rather than consider whether it is still accurate. In addition, the problem list in some electronic health records (EHRs) is limited to structured International Classification of Diseases, ninth revision (ICD-9) entries, which may not capture the relevant clinical information required for optimal care. Paste forward is then employed as a means of bringing forward important longitudinal data, such as richly detailed descriptions of the prior evaluation and medical thinking for each of a patient's multiple medical problems, that otherwise are not accommodated in the EHR. Yet, if done without exquisite attention to detail, these workarounds themselves can create risk. The optimal design and implementation of EHRs should include a deep understanding of, and response to, clinician-initiated workarounds.
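Paste-forward reuse is also detectable in practice. The sketch below is purely illustrative rather than a method from the committee: the similarity threshold, sample notes, and flagging behavior are all assumptions. It compares successive notes and flags heavy text reuse for human review.

```python
# Illustrative screen for "paste forward": flag note pairs that share most
# of their text. The 0.8 threshold and sample notes are assumptions.

from difflib import SequenceMatcher

def reuse_ratio(previous_note: str, current_note: str) -> float:
    """Fraction of matching text between two notes (0.0 to 1.0)."""
    return SequenceMatcher(None, previous_note, current_note).ratio()

day1 = "Pt stable. Continue lisinopril 10 mg daily. Recheck BMP in 1 week."
day2 = "Pt stable. Continue lisinopril 10 mg daily. Recheck BMP in 1 week. Afebrile."

if reuse_ratio(day1, day2) > 0.8:
    # High overlap is a prompt for review, not proof of unsafe documentation.
    print("Possible paste-forward: verify the note reflects a fresh evaluation.")
```

Such a flag is only a screening aid; flagged notes still require clinical judgment about whether the copied text remains accurate.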

Usability Is a Key Driver of Safety

Health professionals work in complex, high-risk, and frequently chaotic environments fraught with interruptions, time pressures, and incomplete, disorganized, and overwhelming amounts of information. They require technologies that make this work easier and safer rather than more difficult. Health IT products are needed that promote efficiency and ease of use while minimizing the likelihood of error.

Many health information systems used today provide poor support for the cognitive tasks and workflow of clinicians (NRC, 2009). This can lead to clinicians spending time unnecessarily identifying the most relevant data for clinical decision making, potentially selecting the wrong data, and missing important information, any of which may increase patient safety risks. If the design of the software disrupts an efficient workflow or presents a cumbersome user interface, the potential for harm rises (see Box 4-1). Software design and its effect on workflow, as well as an effective user interface, are key determinants of usability.

The committee expressed concern that poor usability, such as the example in Box 4-1, is one of the greatest threats to patient safety; once improved, however, usability can be an effective promoter of patient safety. The common expectation is that health IT should make "the right thing to do the easy thing to do," as facilitated by effective design. Evaluating the impact of health IT on usability and on cognitive workload is important for determining unintended consequences and the potential for distraction, delays in care, and increased workload in general.

Usability guidelines and principles focused on improving safety need to be put into practice. Research over the past several decades supports a number of usability guidelines and principles. For example, there are a finite number of styles with which a user may interact with a computer system: direct manipulation (e.g., moving objects on a screen), menu selection, form fill-in, command language, and natural language. Each of these styles has known advantages and disadvantages, and one (or perhaps a blend of two or more) may be more appropriate for a specific application from a usability standpoint.

The National Institute of Standards and Technology (NIST) has been developing guidelines and standards for usability design and evaluation. One report, NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, introduces the basic concepts of usability, common principles of good usability design, methods for usability evaluation and improvement, processes of usability engineering, and the importance of organizational commitment to usability (NIST, 2010b). A second report, Customized Common Industry Format Template for Electronic Health Record Usability Testing, is not only a template for reporting usability evaluations but also a guideline for what usability evaluation should cover and how it should be conducted (NIST, 2010a).

BOX 4-1
Opportunities for Unintended Consequences

Health IT that is not designed to facilitate common tasks can result in unintended consequences.

• The most common ordering sequence is frequently not the most prominent sequence presented to the clinician, increasing the chance of an inadvertent error. For example, in the hospital setting, where anticoagulation is being initiated or where patient characteristics are in flux, the required dose of Coumadin varies from day to day. When the selections for "__ mg of Coumadin daily" appear at the top of the list of choices, there is an increased chance clinicians will inadvertently select "5 mg of Coumadin daily" rather than scrolling down to the bottom of the page and finding "5 mg of Coumadin today." This design–workflow mismatch may result in patients receiving unintended Coumadin doses, and it may similarly affect other medications requiring daily dose adjustment.

• When a patient's medications are listed alphabetically or randomly rather than grouped by type, users are forced through several pages of medications and must mentally knit together the therapeutic program for each individual condition. In this situation, the cognitive work of understanding the patient's diabetic regimen, for example, becomes unnecessarily complex, and a clinician may easily miss one of the patient's five diabetic medications, scattered among the 24 medications displayed across three different pages. Likewise, a patient's congestive heart failure medications may be dispersed across the same several pages, interspersed with medications for other conditions, again increasing the mental workload. In one example, "furosemide 80 mg q am" appeared toward the top of the list, and only on the next page, separated by many intervening medications, did the clinician find an entry for "furosemide 40 mg q pm." Such data disorganization contributes to the possibility of clinical error.

Even if clinicians are aware of these issues and become more diligent, health IT products that are not designed for users' needs create additional cognitive workload, which, over time, may make the clinician more susceptible to mistakes.

SOURCE: Personal communication, Christine A. Sinsky, August 11, 2011.

NIST also released draft guidance on design evaluation and human user performance testing for usability issues related to patient safety, Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records, and will publish final guidance based on the technical feedback received during the public comment period for the draft report (NIST, 2011).

The National Center for Cognitive Informatics and Decision Making in Healthcare (NCCD) has developed the Rapid Usability Assessment process to assess the usability of EHRs on specific meaningful use objectives and to provide detailed, actionable feedback to vendors to help improve their systems. The Rapid Usability Assessment is based on two established methodologies. The first is the use of well-established usability principles to identify usability problems that are targets for potential improvement (Nielsen, 1994; Zhang et al., 2003). This evaluation is performed by usability experts; the problems identified are documented, rated for severity by the experts, and communicated to the vendors. The second phase involves a technique known as the "keystroke-level model" (Card et al., 1983; Kieras, unpublished), which estimates the time and steps required to complete specific tasks. The method assumes an expert user and therefore yields the optimal, or fastest, time to complete a task. The Rapid Usability Assessment uses a software tool, CogTool, to enhance the accuracy and reliability of the keystroke-level model (John et al., 2004); the tool calculates the time an expert user would take to complete a task and the steps involved. A confidential report is provided to participating vendors that includes objective measures of the usability of the system, actionable results, and opportunities for further consultation with the usability evaluation team. It is important to note that although usability is integral to safe systems, safe practice sometimes requires taking more time to perform a task.
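For illustration, the keystroke-level calculation described above can be sketched in a few lines. The operator times below are the commonly cited estimates from Card, Moran, and Newell (1983), and the order-entry task sequence is hypothetical; neither is taken from CogTool itself.

```python
# Keystroke-level model (KLM) sketch. Operator durations are the commonly
# cited estimates from Card, Moran, and Newell (1983); the task is hypothetical.

KLM_OPERATORS = {
    "K": 0.28,  # keystroke or button press (average nonsecretarial typist)
    "P": 1.10,  # point with a mouse to a target on the screen
    "H": 0.40,  # "home" the hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def klm_estimate(sequence):
    """Predicted expert completion time, in seconds, for a task
    expressed as a sequence of KLM operators."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical order entry: think, point to the dose field, home to the
# keyboard, type a 4-character dose, think, home to the mouse, point to
# "Sign," and click.
task = ["M", "P", "H", "K", "K", "K", "K", "M", "H", "P", "K"]
print(f"Predicted expert time: {klm_estimate(task):.2f} s")
```

Comparing such estimates for two candidate screen layouts indicates which one demands fewer steps and less time from a proficient user, which is the kind of objective measure a usability evaluation team can report to vendors.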

In addition to the Rapid Usability Assessment, the NCCD has developed a unified framework for EHR usability, called TURF, which stands for the four major factors of usability: task, user, representation, and function (Zhang and Walji, 2011). TURF is a theory that describes, explains, and predicts usability differences across EHR systems, as well as a framework that defines and measures EHR usability systematically and objectively. The NCCD is currently developing and testing software tools to automate a subset of the features of TURF, but these tests are still laboratory based.

A dynamic tension exists between the need for design standards development and vendor competitive differentiation, which is discussed further in the next section. As a result, dissemination of best practices for EHR design has been restrained (McDonnell et al., 2010). Without a comprehensive set of standards for EHR-specific functionalities, general software usability design practices are used, with the understanding that modification will likely be needed to meet the needs of health professionals.

User-centered design and usability testing take into account the knowledge, preferences, workflow, and human factors associated with the complex information needs of varied providers in diverse settings. Many new product enhancements address functionalities desired by users that facilitate or improve workflow and improve on current health IT products. One such example is the electronic capture of gestures observed in an operating room, recorded as activities without interrupting the clinician's work within the sterile field. To support usability within EHRs, Shneiderman has identified eight heuristically and experientially derived "golden rules" for interface design (Shneiderman et al., 2009) (see Table 4-1).

Achieving the Right Balance Between Customization and Standardization

Current health IT products do not arrive as finished products ready for out-of-the-box or turnkey deployment; they often require substantial completion on site. Many smaller organizations do not have the resources for such onsite "customization" and must get by without the products being user ready. For example, when a large institution recognized the need for a diabetic flow sheet that was not supplied by the vendor, it created its own flow sheet locally. A smaller organization using the same EHR product, with the same need for a diabetic flow sheet, did not have this capability, and its clinicians reverted to a paper workaround using a handwritten flow sheet.

Widespread institution-specific customization presents challenges to maintenance, upgrades, sharing of best practices, and interoperability across multiple-user organizations. Some standardization is necessary, but too much standardization can unnecessarily restrict an organization. In some instances, the implementing organization needs to customize and adapt the innovation—the product being integrated—in order to better adopt it (Berwick, 2003). The committee believes there is value in standardization and expressed the need for judicious use of customization when appropriate. Vendors are encouraged to provide more complete, responsive, and resilient health IT products as a preferred way to decrease the need for extensive customization.

TABLE 4-1 Eight Golden Rules for Interface Design

Strive for consistency
– Similar tasks ought to have similar sequences of action to perform, for example:
  • Identical terminology in prompts and menus
  • Consistent screen appearance
– Any exceptions should be understandable and few

Cater to universal usability
– Users span a wide range of expertise and have different desires, for example:
  • Expert users may want shortcuts
  • Novices may want explanations

Offer informative feedback
– Systems should provide feedback for every user action to:
  • Reassure the user that the appropriate action has been or is being done
  • Instruct the user about the nature of an error if one has been made
– Infrequent or major actions call for substantial responses, while frequent or minor actions require less feedback

Design dialogs to yield closure
– Have a beginning, middle, and end to action sequences
– Provide informative feedback when a group of actions has been completed
– Signal that it is okay to drop contingency plans
– Indicate the need for preparing the next group of actions

Prevent errors
– Systems should be designed so that users cannot make serious errors, for example:
  • Do not display menu items that are not appropriate in a given context
  • Do not allow alphabetic characters in numeric entry fields
– User errors should be detected and instructions for recovery offered
– Errors should not change the system state

Permit easy reversal of actions
– When possible, actions (and sequences of actions) should be reversible

Support internal locus of control
– Surprises or changes should be avoided in familiar behaviors and complex data-entry sequences

Reduce short-term memory load
– Interfaces should be avoided if they require users to remember information from one screen for use in connection with another screen
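Two of these rules, preventing errors and offering informative feedback, translate directly into input handling. The following sketch is hypothetical (the dose field and messages are invented for illustration): it rejects malformed numeric entries outright and tells the user how to recover.

```python
# Illustrative numeric-entry validation applying two golden rules:
# prevent errors and offer informative feedback. Field and messages invented.

def validate_dose_mg(raw: str):
    """Validate a dose entry; return (ok, feedback for the user)."""
    text = raw.strip()
    if not text:
        return False, "Dose is required. Enter a number of milligrams, e.g., 5."
    try:
        value = float(text)  # alphabetic characters are caught here, not downstream
    except ValueError:
        return False, f"'{raw}' is not a number. Enter digits only, e.g., 2.5."
    if value <= 0:
        return False, "Dose must be greater than zero."
    return True, f"Dose accepted: {value} mg."

for entry in ["5", "5mg", "", "-2"]:
    ok, message = validate_dose_mg(entry)
    print(f"{entry!r:7} -> {'OK' if ok else 'ERROR'}: {message}")
```

Because an invalid entry is refused before it reaches the order, the error does not change the system state, in keeping with the error-prevention rule.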

Interoperability

Increased interoperability (i.e., the ability to exchange health information between health IT products and across organizational boundaries) can improve patient safety (Kaelber et al., 2008). Multiple levels of interoperability exist and are needed for different levels of communication (see Table 4-2). Laboratory data are currently relatively easy to exchange because good standards, such as Logical Observation Identifiers Names and Codes (LOINC), exist and are widely accepted. However, important information such as problem lists and medication lists (which exist in some health IT products) is not easily transmitted and understood by the receiving health IT product because existing standards have not been uniformly adopted. Standards need to be further developed to support interoperability throughout all health IT products.

The committee believes interoperability must extend throughout the continuum of care, including pharmacies, laboratories, and ambulatory, acute, post-acute, home, and long-term care settings. For all these organizations to safely coordinate care, health IT products must use common nomenclatures, encoding formats, and presentation formats. Interoperability with personal health record systems and other patient engagement tools is also desirable, both in delivering data to patients and in collecting information from any patient-operated systems.

Failure to achieve interoperability poses considerable risks to patient safety. When different health IT products cannot exchange data, information must be transferred by hand or by electronic means outside the primary method (e.g., facsimile). Every time information is copied or transmitted by hand, there is a risk of error or loss of data. Incomplete and erroneous records may cause delays in care and result in harm.

Imported data must be timely, accurate, accessible, and displayed in a user-friendly fashion. Patient safety can be at risk even among products that have achieved some level of interoperability. For example, when a data value expected as a number arrives as a string, it can be misinterpreted, resulting in a wrong display. Similarly, electrocardiogram tracings can arrive displayed on a screen split into parts and rotated 90 degrees. The extra time required to mentally process such data into a familiar form delays care and increases the chance of error.

The nationwide exchange of data is intended to support portability and immediate access to one's health information. However, today's competitive marketplace provides few incentives for vendors themselves to support portability. The committee believes conformance tests ought to be available to clinicians so they can ensure their data are exchangeable. Independent entities need to be supported to develop "stress tests" that can be applied to validate whether medical record interoperability can be achieved.
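The number-versus-string hazard above can be made concrete. The sketch below shows a hypothetical inbound lab-result handler; the field names, the example LOINC code, and the manual-review behavior are illustrative assumptions, and a production interface would rely on HL7- and LOINC-aware tooling.

```python
# Illustrative defensive handling of an inbound lab value that may arrive
# as a string. Field names and the sample message are assumptions.

from dataclasses import dataclass

@dataclass
class LabResult:
    loinc_code: str  # e.g., "2345-7" (serum glucose) from the LOINC standard
    value: float
    unit: str

def parse_lab_value(raw: dict) -> LabResult:
    """Coerce an inbound value to a number, refusing ambiguous input
    rather than displaying it incorrectly."""
    raw_value = raw["value"]
    if isinstance(raw_value, str):
        try:
            raw_value = float(raw_value.strip())  # "98" is recoverable
        except ValueError:
            # "positive" or "12 H" is not: route to manual review instead.
            raise ValueError(
                f"Non-numeric value {raw['value']!r} for code {raw['loinc_code']}"
            )
    return LabResult(raw["loinc_code"], float(raw_value), raw["unit"])

print(parse_lab_value({"loinc_code": "2345-7", "value": "98", "unit": "mg/dL"}))
```

The design choice is to fail loudly on ambiguity: a value that cannot be interpreted safely is routed to human review rather than coerced and rendered as if it were a number.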

TABLE 4-2 Aspects of Interoperability and Their Impact on Patient Safety

Aspect: Ability to exchange the physical data stream of bits that represent relevant information (Hoekstra et al., 2009)
– Definition: Allows for electronic communication between software components
– Impact on patient safety: Software components that cannot communicate with each other force users to reenter data manually, which detracts from time better used attending to patient safety and increases opportunities to enter misinformation

Aspect: Ability to exchange data without loss of semantic content; accept "plug-ins" seamlessly
– Definition: The ability of a software system to work properly when modules from different vendors are "plugged in"; semantic content refers to information that allows software to understand the electronic bits
– Impact on patient safety: The loss of meaning in received data compromises patient safety. The inability to use multiple modules within one organization decreases the likelihood that users can provide their patient information to another health IT product, and lack of "plug-in" interoperability means the user cannot select modules from multiple vendors that may perform a specific function more safely

Aspect: Display similar information in the same way
– Definition: Different health IT products display similar information in similar ways, and systems are consistent in matters such as screen position of fields, color, and units
– Impact on patient safety: When information is displayed inconsistently across organizations, the user must mentally reconcile the different representations, which requires cognitive effort that could be better used toward safe care and increases the chance that a user may make a mistake

[…]

As shown in Figure 4-2, implementation includes the stages of planning and goal setting, deployment, stabilization, optimization, and transformation.

Planning and Goal Setting

In planning and goal setting, organizations target improvements in quality, safety, and efficiency when automating processes. For example, automating orders or medication administration targets improvements of systems composed of nonlinear, complex workflows. The first step is similar to the acquisition phase—determining the organization's needs and identifying the resources required to meet them. Technological and organizational change must be aligned, and both must be managed effectively (Majchrzak and Meshkati, 2001). The organization needs to analyze existing workflow, envision the optimal workflow, and select the automated system that achieves that optimal workflow; this may involve customization of the purchased system. If workflow analysis and redesign are not completed before implementation, automation may create unanticipated safety risks. For example, the introduction of a new lab system may create new work, and if the right person to take on that work has not been selected, it may fall to the physician or others at the cost of other clinical activities.

Users need to be actively involved in the planning and goal-setting stage. Mechanisms to identify, escalate, and remediate patient safety issues need to be in place as the organization proceeds to the deployment stage. Metrics to consider at this stage include ensuring that organization leaders have identified objectives, teams, and resources committed to the implementation.

Deployment

When organizations deploy a selected health IT product, they assume that vendors have made safety a primary goal in specification and design, that their products support high-reliability processes, and that health IT standards (e.g., content, vocabulary, transport) have evolved to address safety, safe use, and value (McDonnell et al., 2010). At the same time, strategies are needed to address potential patient safety events that can arise from decisions such as whether the organization should take a big-bang or a sequential approach, and how to manage partial paper and electronic systems that can create opportunities for missing data and communication lapses.

Collaboration between user organizations and vendors to improve patient safety is critical and requires a specific and immediate information loop between the parties, allowing detailed information to be exchanged to quickly address actual or potential safety issues.

FIGURE 4-2 Implementation life cycle. The figure lays out five stages, each with guiding questions: I. Planning/Goal Setting (What are we hoping to achieve? What are our criteria for success? What training is valuable and needed? What are the roles for champions and governance?); II. Deployment (Is it installed? How many users are up? Is it running?); III. Stabilization (Is it reliable? Are users proficient enough?); IV. Optimization (Is the "build" optimal? Have we improved workflows? Are we improving quality, service, and cost of care?); and V. Transformation (Are we creating a new value proposition for patients? Are we supporting an ongoing learning health care organization?). Beneath each stage, the figure lists factors that enhance or detract from patient safety, ranging from user involvement in acquisition and design, through downtime management and user proficiency, to data stewardship and organizational learning from errors and hazards. SOURCE: Adapted from Kaiser Permanente experience.

Attention to collateral impact in a multivendor environment is also necessary to identify any other corrective actions. For example, in the event a safety incident occurs, rapid notification of health IT stakeholders should be accompanied by rapid correction of system-safety flaws, including vendor notification and collaboration to change software, process, or policy as indicated. Most reporting mechanisms impose a large burden on users to specify what was happening when a problem occurred (e.g., a system error, an instance of inconvenient use, an instance of avoidable provider error). Organizations should consider developing easy-to-use mechanisms for providing feedback from users to vendors, such as a "report problem here" button on every screen.

Local testing is needed to verify safety, interoperability, security, and effectiveness, particularly at interfaces with other software systems during the go-live event. Policies defining data stewardship need to address accountability for following up on data when there is more than one recipient, processes for correcting incorrect data, and ways to identify and avoid potentially harmful shortcuts or behaviors that result in unintended use of the system. Ongoing monitoring to assure secure exchange of data, as well as adherence to predetermined security performance expectations, is essential; users must have confidence that data are secure at all times. General metrics for evaluating deployment include failure rates, quality assurance rates for each interface, percentage of users trained, and checklists of essential functions.

Clinicians who use multiple EHRs also face the challenge of retaining knowledge of how each system works. This affects professionals in training as well as staff who care for patients in multiple settings or organizations.

Stabilization

Following deployment, health IT enters a stage of stabilization, during which potentially hazardous human–computer interactions such as alert fatigue, communication hazards, and workarounds such as "paste forward" must be managed. During this stage, the organization ought to be monitoring the dependability, reliability, and security of the installed system and taking steps to resolve any potential hazards. More specific measures of these system characteristics will guide actions for clinician retraining, software modification, and additional guidance and policies. As with the maintenance activity in the design and development of technology, stabilization also provides time to evaluate how downtime is managed. Organizations can measure the stability of a system by assessing user proficiency (e.g., reduction in helpdesk calls, a higher percentage of e-prescriptions), the percentage of time the system is available, and trends in identified errors over a given timeframe.
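Two of the stability measures just listed reduce to simple arithmetic, as the illustrative sketch below shows; the thirty-day window and placeholder figures are assumptions, not data from any implementation.

```python
# Illustrative stabilization metrics: system availability over a period and
# the week-over-week trend in identified errors. Input data are placeholders.

from datetime import timedelta

def availability(total_period: timedelta, downtime: timedelta) -> float:
    """Percentage of the period during which the system was available."""
    return 100.0 * (total_period - downtime) / total_period

def weekly_error_trend(errors_per_week):
    """Week-over-week change in the number of identified errors."""
    return [b - a for a, b in zip(errors_per_week, errors_per_week[1:])]

period = timedelta(days=30)
downtime = timedelta(hours=6)  # scheduled plus unscheduled outages
print(f"Availability: {availability(period, downtime):.2f}%")
print("Error trend:", weekly_error_trend([42, 35, 28, 27]))  # negatives = improving
```

A flat or falling error trend alongside stable availability suggests the system is stabilizing; a rising trend is a signal for the retraining, software modification, or policy actions described above.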

Optimization

In the optimization stage, the organization analyzes how well it is using functions such as clinical decision support, and examines the safety effects of computerized provider order entry (CPOE) and other functions. Revisiting revised workflows to measure achievement of intended changes can reveal improvements or degradation in quality and service. Assessing task distribution to the care team can help evaluate the impact of the health IT. Further evaluation is important to assess changes in quality measures over time, as well as whether the health IT is being used in a meaningful manner. One way to measure optimization is to track quality over time along with the level of an organization's reliance on paper. Self-assessment tools are an important adjunct for assessing aspects of EHR use such as clinical decision support performance (Metzger et al., 2010). A sample set of measure concepts for tracking a successful implementation across the life cycle of an EHR appears in Table 4-6.

Transformation: The Learning Health System

Transformation is the future state of an organization that has extracted learning from the system itself and from the application of knowledge. It can be evaluated by identifying changes in practice, derived from aggregate data analysis and the application of new knowledge, that result in improved outcomes. Proactive monitoring for new failure modes created by the implementation and use of health IT will be necessary; such monitoring can also help define how failures occur and the contributing forces. This is particularly challenging in small practice and hospital settings, where there is limited or no experience with these methods.

A learning health care organization creates a new value proposition by improving the quality and value of care. Ultimately, the transformation achieved through optimal use of health IT will improve outcomes over time and achieve a safer system. Continuous evaluation and improvement occur over the dynamic and iterative life cycle of health IT products (Walker et al., 2008).

Maintenance Activities

Maintenance begins after implementation, when activities are carried out to keep a system operational and to support ongoing use. It is a period of shared responsibility between the vendor and the organization.

TABLE 4-6 Measure Concepts for Successful Implementation

I. Planning and Goal Setting
– IT, medical, and operations leaders identified and in agreement on objectives
– Teams identified
– Money and resources made available

II. Deployment
– System up
– Percentage of users trained
– Percentage of users logged in
– Number and nature of errors identified in the field per week
– Quality assurance stats for each interface
– Quality assurance user acceptance testing stats for system
– "Shakedown cruise" stats—review of functionality in initial period post go-live

III. Stabilization
– Percentage of time system is available
– Ongoing health IT patient safety, with analysis of reporting and remediation of safety issues
– User proficiency measures
– Trend of errors identified in the field per week
– Alert overrides
– Event reports
– System passes ongoing safety tests after all upgrades, crashes, or new application implementations

IV. Optimization
– Show improved outcomes over time
– Tracking of quality measures over time
– Quality improvement (QI) projects and results

V. Transformation
– Health care is safer based on identified measures
– Care processes redesigned
– Continuous improvement of new steady state

The cost and effort needed to maintain an IT system are influenced by the underlying complexity of the system and by design choices made by both vendors and users. The more complex a system, the more error-prone it tends to be immediately after installation, and the greater the overall lifetime effort and cost of maintenance.

Contingency planning for downtime and data loss is necessary to address both short- and long-term occurrences. Scheduled downtime for maintenance and upgrades typically occurs at the organization's discretion to minimize work disruption. Downtime procedures for planned outages, as well as emergency procedures, are necessary to protect security and should include measures to prevent data loss; these measures will differ based on the type of EHR architecture. Planning for obsolescence and eventual system replacement also requires a contingency plan that includes safeguarding of data (American Academy of Family Physicians, 2011).

Any disruption, no matter how small, can present safety risks resulting from unfamiliarity with manual backup systems, delays in care, and data loss. Power interruptions and other unexpected events can result in unavoidable downtime. Procedures should address immediate communication and deployment of contingency plans, as well as reentry to normal functioning and subsequent recovery actions. Advance planning, education, training, and practice for downtime can aid successful performance during planned or unplanned outages.

MINIMIZING RISKS OF HEALTH IT TO PROMOTE SAFER CARE

Although not everything is known about the risks of health IT, there is evidence to suggest there will be failures, design flaws, and user behaviors that thwart safe performance and application of these systems. To better understand these failures, more research, training, and education will be needed. Specifically, measures of safe practices need to be developed to assess health IT safety. Vendors, health care providers, and organizations could benefit from following a proven set of general safe practices representing the best evidence about design, implementation strategies, usability features, and human–computer interactions for optimizing safe use. Vendors take primary responsibility for the design and development of technologies, with iterative feedback from users; users assume responsibility for safe implementation and work with vendors throughout the health IT life cycle. The mutual exchange of ideas and feedback regarding any actual or potential failures or unintended consequences can also inform safer design and use of health IT products.

Because of the variations in health IT products and their implementations, a set of development requirements that stipulates consistent criteria known to produce safer design and user interactions would be beneficial.

Consistent testing procedures could then be applied to ensure the safety of health IT products. Appropriate requirements, combined with iterative testing, can contribute to effective practices for safe design and implementation.

With growing experience in health IT and EHR deployment, best practices for implementation can be gleaned from case reports and reviews. Testing for safe design, functioning, and usability to reduce deployment errors can enhance safety as users adopt the technology. Continual testing and retesting will be needed for any change, such as when upgrades are installed. As high-prevalence, high-impact EHR-related patient safety risks are identified, they should be incorporated into pre- and postdeployment testing. Feedback from testing, along with learning from event reports and detecting workarounds, is also important to the iterative process of continually improving health IT.

The partnership to design, develop, implement, and optimize systems extends beyond a single vendor and a single organization. Many public- and private-sector groups have a stake in the safety of health IT, to ensure that the very systems intended to help improve quality of care are performing without creating risk of harm. Indeed, such a public–private partnership already exists in this area through the National Quality Forum's safe practices, one of which is focused on CPOE and includes in its standard the routine use of a postdeployment test of the safety of operational CPOE systems in hospitals (Classen et al., 2010). Ensuring this outcome will entail additional requirements for public and private agencies, vendors, and users across the health IT life cycle.

Current government programs, such as EHR product certification, can also be a path toward more effective usability and safer use of health IT products. Together, product developers, certification groups, and the Office of the National Coordinator for Health IT (ONC) can expand product requirements that address safer deployment, with strategies to mitigate anticipated risks and to address those that develop unexpectedly. Accrediting agencies can reinforce relevant standards and criteria for safer health IT by including review criteria for areas such as training, standardized testing procedures, maintenance, and safety issue reporting and remediation, both internally and with vendors. Vigilance before, during, and after the selection and implementation of health IT is a shared responsibility. Data from EHRs can also be used to evaluate the impact of health IT on patient safety; methods for collecting and evaluating these data are needed.

Recommendation 1: The Secretary of Health and Human Services (HHS) should publish an action and surveillance plan within 12 months that includes a schedule for working with the private sector to assess the impact of health IT on patient safety and to minimize the risks of its implementation and use. The plan should specify:

a. The Agency for Healthcare Research and Quality (AHRQ) and the National Library of Medicine (NLM) should expand their funding of research, training, and education of safe practices as appropriate, including measures specifically related to the design, implementation, usability, and safe use of health IT by all users, including patients.

b. The ONC should expand its funding of processes that promote safety that should be followed in the development of health IT products, including standardized testing procedures to be used by manufacturers and health care organizations to assess the safety of health IT products.

c. The ONC and AHRQ should work with health IT vendors and health care organizations to promote postdeployment safety testing of EHRs for high-prevalence, high-impact EHR-related patient safety risks.

d. Health care accrediting organizations should adopt criteria relating to EHR safety.

e. AHRQ should fund the development of new methods for measuring the impact of health IT on safety using data from EHRs.

CONCLUSION

Building health IT for safer use by health professionals is a shared responsibility. Vendors, care providers, provider organizations and their health IT departments, and public and private agencies focused on quality of care are all partners in building a safer system in which health IT is used. The recommendations outlined in this chapter seek to align this shared responsibility and to provide structured guidance to further support safer care enabled by health IT. The committee acknowledges that health IT is an evolving domain; guidance, structure, and processes will need to evolve as well.

REFERENCES

Amalberti, R., Y. Auroy, D. Berwick, and P. Barach. 2005. Five system barriers to achieving ultrasafe health care. Annals of Internal Medicine 142(9):756-764.
American Academy of Family Physicians. 2011. Develop a contingency plan for down time and data loss. http://www.centerforhit.org/online/chit/home/cme-learn/tutorials/networking/network201/contingency.printerview.html (accessed June 28, 2011).
Armijo, D., C. McDonnell, and K. Werner. 2009. Electronic health record usability: Interface design considerations. Rockville, MD: Agency for Healthcare Research and Quality.

Berwick, D. 2003. Disseminating innovations in health care. Journal of the American Medical Association 289(15):1969-1975.
Bodenheimer, T., and H. H. Pham. 2010. Primary care: Current problems and proposed solutions. Health Affairs 29(5):799-805.
Canfora, G., L. Cerulo, M. Di Penta, and F. Pacilio. 2010. An exploratory study of factors influencing change entropy. Paper presented at the 2010 IEEE 18th International Conference on Program Comprehension (ICPC), June 30 to July 2, 2010.
Card, S., T. Moran, and A. Newell. 1983. The psychology of human–computer interaction. Hillsdale, NJ: Lawrence Erlbaum.
Classen, D. C., D. W. Bates, and C. R. Denham. 2010. Meaningful use of computerized prescriber order entry. Journal of Patient Safety 6(1):15-23.
Franzke, M. 1995. Turning research into practice: Characteristics of display-based interaction. Paper presented at the SIGCHI Conference on Human Factors in Computing Systems, Denver, CO, May 7, 1995.
Frisse, M. E., and J. Metzer. 2005. Information technology in the rural setting: Challenges and more challenges. Journal of the American Medical Informatics Association 12(1):99-100.
Hoekstra, M., M. Vogelzang, E. Verbitskiy, and M. W. N. Nijsten. 2009. Health technology assessment review: Computerized glucose regulation in the intensive care unit—how to create artificial control. Critical Care 13(5):223.
John, B., K. Prevas, D. Salvucci, and K. Koedinger. 2004. Predictive human performance modeling made easy. Paper presented at the Conference on Human Factors in Computing Systems, Vienna, Austria, April 24 to 29, 2004.
Kaelber, D. C., A. K. Jha, D. Johnston, B. Middleton, and D. W. Bates. 2008. A research agenda for personal health records (PHRs). Journal of the American Medical Informatics Association 15(6):729-736.
Keselman, A., X. Tang, V. L. Patel, T. R. Johnson, and J. Zhang. 2004. Institutional decision-making for medical device purchasing: Evaluating patient safety. Medinfo 11:1357-1361.
Kieras, D. (unpublished). Using the keystroke-level model to estimate execution times.
Koppel, R., T. Wetterneck, J. L. Telles, and B. T. Karsh. 2008. Workarounds to barcode medication administration systems: Their occurrences, causes, and threats to patient safety. Journal of the American Medical Informatics Association 15(4):408-423.
Majchrzak, A., and N. Meshkati. 2001. Aligning technological and organizational change when implementing new technology. In The handbook of industrial engineering, 3rd ed., edited by G. Salvendy. New York: Wiley. Pp. 948-974.
McDonnell, C., K. Werner, and L. Wendel. 2010. Electronic health record usability: Vendor practices and perspectives. Rockville, MD: Agency for Healthcare Research and Quality.
Metzger, J., E. Welebob, D. W. Bates, S. Lipsitz, and D. C. Classen. 2010. Mixed results in the safety performance of computerized physician order entry. Health Affairs 29(4):655-663.
Nielsen, J. 1994. Usability engineering. San Diego, CA: Academic Press.
NIST (National Institute of Standards and Technology). 2010a. Customized common industry format template for electronic health record usability testing. Gaithersburg, MD: NIST.
NIST. 2010b. NIST guide to the processes approach for improving the usability of electronic health records. Gaithersburg, MD: NIST.
NIST. 2011. Technical evaluation, testing and validation of the usability of electronic health records. Gaithersburg, MD: NIST.
NRC (National Research Council). 2009. Computational technology for effective health care: Immediate steps and strategic directions. Washington, DC: The National Academies Press.
Parnas, D. L. 1994. Software aging. Paper presented at the 16th International Conference on Software Engineering, Sorrento, Italy, May 16 to 21, 1994.

Shneiderman, B., C. Plaisant, M. Cohen, and S. Jacobs. 2009. Designing the user interface: Strategies for effective human-computer interaction. Boston: Addison-Wesley.
Thimbleby, H. 2008. Ignorance of interaction programming is killing people. Interactions 15(5):52-57.
Thimbleby, H., and P. Cairns. 2010. Reducing number entry errors: Solving a widespread, serious problem. Journal of the Royal Society Interface 7(51):1429-1439.
Walker, J. M., P. Carayon, N. Leveson, R. A. Paulus, J. Tooker, H. Chin, A. Bothe, Jr., and W. F. Stewart. 2008. EHR safety: The way forward to safe and effective systems. Journal of the American Medical Informatics Association 15(3):272-277.
Zhang, J., and M. Walji. 2011. TURF: Toward a unified framework of EHR usability. Journal of Biomedical Informatics 44(6):1056-1067.
Zhang, J., T. R. Johnson, V. L. Patel, D. L. Paige, and T. Kubose. 2003. Using usability heuristics to evaluate patient safety of medical devices. Journal of Biomedical Informatics 36(1-2):23-30.
