6
Promoting Adoption of Clinical Practice Guidelines

Abstract: Promoting uptake and use of clinical practice guidelines (CPGs) at the point of care delivery represents a final translation hurdle to move scientific findings into practice. Characteristics of the intended users and context of practice are as important as guideline attributes for promoting adoption of CPG recommendations. The committee’s recommendations for individual and organizational interventions for CPG implementation are as follows: Effective multifaceted implementation strategies targeting both individuals and healthcare systems should be employed by implementers to promote adherence to trustworthy CPGs. Increased adoption of electronic health records and clinical decision support (CDS) will open new opportunities to rapidly move CPGs to the patient encounter. The committee recommends that guideline developers and implementers take the following actions to advance this aim: Guideline developers should structure the format, vocabulary, and content of CPGs (e.g., specific statements of evidence, the target population) to facilitate ready implementation of electronic CDS by end-users; and CPG developers, CPG implementers, and CDS designers should collaborate to align their needs with one another. In considering legal issues affecting CPG implementation, the committee suggests clinicians will be more likely to adopt guidelines if they believe the guidelines offer protection from malpractice litigation. The committee also suggests courts will be more likely to adopt guidelines that are trustworthy and urges them, given their reliance on CPGs, to use those deemed trustworthy when available.




INTRODUCTION

Clinical practice guidelines (CPGs) draw on synthesized research findings to set forth recommendations for state-of-the-art care. Trustworthy CPGs are critical to improving quality of care, but many CPGs are not developed for ready use by clinicians. They are typically lengthy documents of written prose with graphical displays (e.g., decision trees or flow charts), making them difficult to use at the point of care delivery. Furthermore, recommendations from CPGs must be applied to patient-specific data to be useful, and often the data required for a given guideline either are not available or take too much time to ascertain in a useful form during a typical patient encounter (Mansouri and Lockyer, 2007). Passive dissemination (e.g., distribution) of CPGs has little effect on practitioner behaviors, and thus active implementation efforts (e.g., opinion leaders) are required. Even with the exponential growth in publicly available CPGs (NGC, 2010), easy access to high-quality, timely CPGs is out of reach for many clinicians.

Large gaps remain between recommended care and the care delivered to patients. A 2003 study by McGlynn et al. of adults living in 12 metropolitan areas of the United States found that participants received recommended care 54.9 percent of the time. The proportion receiving recommended care varied only slightly among adults in need of preventive care (54.9 percent), acute care (53.5 percent), and care for chronic conditions (56.1 percent). Yet when McGlynn et al. (2003) examined particular medical conditions, they observed substantial differences in receipt of recommended care, ranging from 10.5 percent for alcohol dependence to 78.7 percent for senile cataract. In an observational study of 10 Dutch guidelines, Grol et al. concluded that general practitioners followed guideline recommendations in only 61 percent of relevant situations (Grol et al., 1998). Furthermore, in an analysis of 41 studies of the implementation of mental health CPGs (for depression, schizophrenia, and addiction), Bauer found that physicians adhered to guidelines only 27 percent of the time in cross-sectional and pre-post studies and 67 percent of the time in controlled trials (Bauer, 2002; Francke et al., 2008). Of course, not all quality measures are valid and reliable, nor should all CPGs necessarily be adhered to; however, those CPGs that meet the standards proposed herein should be associated with high levels of adherence.

This chapter focuses on a variety of strategies to promote adoption of CPGs. The first section describes how adoption is affected by a number and variety of factors, and presents several individual and organizational implementation strategies for developers and implementers. The second section discusses use of the electronic health record (EHR) and computer-aided decision support to promote use of CPGs in practice. The third section discusses legal issues related to CPGs that could affect their implementation.

STRATEGIES FOR IMPLEMENTATION OF CPG RECOMMENDATIONS

Promoting uptake and use of CPGs at the point of care delivery represents a final translation hurdle to move scientific findings into practice. The field of translation research is a relatively young science, and addressing this final step of bringing research findings into the mainstream of typical practice is an important challenge (Avorn, 2010). A body of knowledge in implementation science is growing and provides an empirical base for promoting adoption of CPGs (Bradley et al., 2004b; Brooks et al., 2009; Carter et al., 2006; Chin et al., 2004; Demakis et al., 2000; Eccles and Mittman, 2006; Feldman et al., 2005; Grimshaw et al., 2004c, 2006a; Horbar et al., 2004; Hysong et al., 2006; Irwin and Ozer, 2004; Jamtvedt et al., 2006b; Jones et al., 2004; Katz et al., 2004a; Levine et al., 2004; Loeb et al., 2004; McDonald et al., 2005; Murtaugh et al., 2005; Shiffman et al., 2005; Shojania and Grimshaw, 2005; Shojania et al., 2006; Solberg et al., 2000; Solomon et al., 2001; Stafford et al., 2010; Titler et al., 2009).

An emerging principle for promoting adoption of CPGs is that attributes of the CPG (e.g., ease of use, strength of the evidence) as perceived by users and stakeholders are neither stable features nor isolated determinants of adoption. Rather, it is the interaction among characteristics of the CPG (e.g., specificity, clarity), the intended users (e.g., physicians, nurses, pharmacists), and a particular context of practice (e.g., inpatient, ambulatory, long-term care setting) that determines the rate and extent of adoption (Greenhalgh et al., 2005b). A number of conceptual models have been tested and are used to guide implementation of CPG recommendations (Damschroder et al., 2009; Davies et al., 2010; Dobbins et al., 2009; Rycroft-Malone and Bucknall, 2010). The Implementation Model, illustrated in Figure 6-1, is used here as an organizing framework in which the rate and extent of adoption of CPGs are influenced by the nature of the CPG (e.g., complexity, type, and strength of the evidence) and how it is communicated (e.g., academic detailing, audit and feedback) to users of the evidence-based practice (e.g., physicians, nurses, pharmacists) within a social system/context of practice (e.g., clinic, inpatient unit, health system) (Kozel et al., 2003; Titler and Everett, 2001; Titler et al., 2009).

FIGURE 6-1 Implementation model. NOTE: EBP = evidence-based practice. SOURCE: Titler and Everett (2001).
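The four factors in this model are easiest to act on when they are made explicit at the outset of an implementation effort. The following sketch is purely illustrative and is not drawn from the committee's report; the class, field names, and example values are hypothetical, showing how the four interacting determinants might be recorded when scoping an implementation plan.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ImplementationPlan:
    """Hypothetical planning record organized around the four factors in Figure 6-1."""
    cpg_characteristics: List[str]       # e.g., complexity, strength of evidence, specificity
    communication_strategies: List[str]  # e.g., academic detailing, audit and feedback
    intended_users: List[str]            # e.g., physicians, nurses, pharmacists
    practice_context: str                # e.g., clinic, inpatient unit, health system
    notes: List[str] = field(default_factory=list)

    def flag_gaps(self) -> List[str]:
        """Flag any factor left unspecified; the factors interact, so none can be ignored."""
        gaps = []
        if not self.cpg_characteristics:
            gaps.append("CPG characteristics not assessed")
        if not self.communication_strategies:
            gaps.append("no communication strategy selected")
        if not self.intended_users:
            gaps.append("intended users not delineated")
        if not self.practice_context:
            gaps.append("context of practice not described")
        return gaps


# Example: planning adoption of a heart failure CPG on an inpatient unit.
plan = ImplementationPlan(
    cpg_characteristics=["moderate complexity", "strong evidence"],
    communication_strategies=["academic detailing", "audit and feedback"],
    intended_users=["physicians", "nurses", "pharmacists"],
    practice_context="inpatient cardiology unit",
)
print(plan.flag_gaps())  # [] when all four factors have been considered
```

The point of such a structure is simply that none of the four factors can be left unexamined, because they interact to determine the rate and extent of adoption.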

Although discussion of implementation strategies is organized by these four areas (nature of the CPG, communication, members, context), these categories are not independent of one another.

CPG Characteristics

Characteristics of a CPG that influence the extent to which it can be implemented include clarity, specificity, strength of the evidence, perceived importance, relevance to practice, and the simplicity versus complexity of the medical condition it addresses. For example, CPGs on relatively simple healthcare practices (e.g., influenza vaccines for older adults) are adopted more easily and in less time than those addressing more complex practices (e.g., acute pain management for hospitalized older adults). To foster use of trustworthy CPGs, developers must consider organization of content, layout of key messages within the CPG, specificity of practice recommendations, and length of the CPG prose. Additionally, CPGs typically focus on one medical condition (e.g., heart failure), making it challenging to use CPGs for patients with multiple comorbidities. (This topic is discussed further in Chapter 5.)

Implementation strategies that address the process of integrating essential content from CPGs into the local practice context and workflow include clinical reminders, quick reference guides, and decision aids (Balas et al., 2004; BootsMiller et al., 2004; Bradley et al., 2004b; Fung et al., 2004; Loeb et al., 2004; Wensing et al., 2006).

One-page quick reference guides, depicted pictorially as flow diagrams or algorithms, are attractive from the busy provider’s perspective (Baars et al., 2010; Boivin et al., 2009; Chong et al., 2009). A number of one-page quick reference guides related to prevention and treatment of cardiovascular diseases have been published (Coronel and Krantz, 2007; Krantz et al., 2005; Smith et al., 2008), although data on their widespread acceptability and effectiveness warrant further study.

Reminders have a small to moderate effect on adoption of CPGs when used alone or in association with other interventions, primarily for preventive health care such as screening tests, immunizations, test ordering, and medication prescribing (Dexheimer et al., 2008; Grimshaw et al., 2004a; Shojania et al., 2009). Reminders are likely more effective for simple actions (e.g., ordering a lipid test [Mehler et al., 2005]) than for complex ones.

Ultimately, incorporation of reminders and clinical care algorithms into electronic decision support systems holds great promise for promoting use of CPGs and is discussed in further detail in the section on Electronic Interventions for CPG Implementation. Electronic decision support systems can also address adoption of recommendations from multiple CPGs in the care of individuals with multiple comorbidities.
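To make the preceding point concrete, the following sketch (purely illustrative, not drawn from any cited CPG or CDS product) expresses a point-of-care reminder as a simple rule evaluated against patient data; the patient fields, one-year interval, and message wording are hypothetical.

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical patient record fields; a real CDS rule would read these from the EHR.
patient = {
    "age": 67,
    "has_diabetes": True,
    "last_lipid_panel": date(2009, 3, 15),  # None if never tested
}


def lipid_test_reminder(pt: dict, today: date, interval_days: int = 365) -> Optional[str]:
    """Return a reminder message if a guideline-derived lipid-testing criterion is unmet.

    The rule is deliberately simple: adults with diabetes and no lipid panel within the
    specified interval trigger a reminder. The threshold and wording are illustrative,
    not taken from an actual guideline.
    """
    if not pt.get("has_diabetes"):
        return None
    last = pt.get("last_lipid_panel")
    if last is None or (today - last) > timedelta(days=interval_days):
        return "Reminder: lipid panel recommended; none on record in the past year."
    return None


print(lipid_test_reminder(patient, today=date(2010, 6, 1)))
```

Even a rule this simple shows why guideline developers are urged to state the target population and required data elements unambiguously: every condition in the rule must map to data actually available during the patient encounter.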

Communication Strategies

Methods and channels of communication influence adoption of CPGs (Greenhalgh et al., 2005b). Implementation strategies discussed in this section are education and mass media, opinion leaders, and academic detailing.

Education and Mass Media

Printed educational materials are one of the most common forms of communicating guidelines, through dissemination of complete guideline documents and abridged summaries or concise reference cards. Based on an evidence review of 23 studies, the impact of printed educational materials on changing processes of care is small (median absolute increase of 4.3 percent for categorical processes and 13.6 percent for continuous processes) when compared to no intervention (Farmer et al., 2008). Given the low cost and high feasibility of printed materials, it may be reasonable to consider them as one part of a multifaceted implementation intervention, given both gaps in adoption and the diversity of implementation barriers (e.g., for a brand new practice or a change in established practice).

Forsetlund summarized 81 trials of continuing medical education didactic lectures and workshops and found consistent but small effects, with a mean 6 percent absolute increase in desired clinical practices from educational meetings used alone or as a component of multifaceted interventions (Forsetlund et al., 2009). Meta-regression results suggested educational interventions were more effective when attendance was higher, when interactive sessions were mixed with didactic ones, and when the clinical outcomes of the intended actions were more serious. Education alone did not appear effective for more complex practice changes.

A review by Grilli et al. (2002) of 20 studies using interrupted time-series designs demonstrated that mass media (e.g., television, radio, newspapers, leaflets, posters, and pamphlets), targeted at the population level (providers, patients, and the general public), has some effect on use of health services for the targeted behavior (e.g., colorectal cancer screening), including providers’ use. These channels of communication have an important role in influencing use of healthcare interventions; those engaged in promoting uptake of research evidence in clinical practice should consider mass media as one of the tools that may encourage use of effective services and discourage those of unproven effectiveness. However, little empirical evidence is available to guide the design of mass communication messages to achieve the intended change (Grilli et al., 2002).

Opinion Leaders

An opinion leader is from the local peer group, is viewed as a respected source of influence, is considered by colleagues to be technically competent, and is trusted to judge the fit between the evidence base of a practice and the local situation (Berner et al., 2003; Grimshaw et al., 2006b; Harvey et al., 2002; Soumerai et al., 1998). Opinion leadership is multifaceted and complex, with role functions varying by circumstances (e.g., nature of the CPG, clinical setting, clinician), but few successful projects to implement recommended practices in healthcare organizations have managed without the use of opinion leaders (Greenhalgh et al., 2005b; Kozel et al., 2003; Watson, 2004). Several studies have demonstrated that opinion leaders are effective in changing behaviors of healthcare practitioners (Berner et al., 2003; Cullen, 2005; Dopson et al., 2001; Greenhalgh et al., 2005b; Irwin and Ozer, 2004; Locock et al., 2001; Redfern and Christian, 2003), especially when used in combination with academic detailing or performance feedback (discussed hereafter).

A Cochrane review summarized 12 studies engaging opinion leaders with or without other interventions (Doumit et al., 2007). Most studies focused on inpatient settings, with an absolute increase of 10 percent in desired behaviors. Challenges to applying this strategy include identifying opinion leaders and the high resource levels required for deployment.

Academic Detailing

Academic detailing, or educational outreach, as applied to CPGs, involves interactive face-to-face education of individual practitioners in their practice setting by an educator (usually a clinician) with expertise in a particular topic (e.g., cancer pain management), and is one means of changing practice to better align with CPG recommendations. Academic detailers are able to explain the research foundations of CPG recommendations and respond convincingly to specific questions, concerns, or challenges that a practitioner might raise. An academic detailer also might deliver feedback on provider or team performance with respect to a selected CPG (e.g., frequency of pain assessment) or CPG-based quality measure (Avorn, 2010; O’Brien et al., 2007).

Multiple studies have demonstrated that academic detailing promotes positive changes in the practice behaviors of clinicians (Avorn et al., 1992; Feldman et al., 2005; Greenhalgh et al., 2005a; Hendryx et al., 1998; Horbar et al., 2004; Jones et al., 2004; Loeb et al., 2004; McDonald et al., 2005; Murtaugh et al., 2005; O’Brien et al., 2007; Solomon et al., 2001; Titler et al., 2009). In a review of 69 studies, academic detailing was found to produce a median absolute increase in desired clinical practice of 6 percent. Improvements were highly consistent for prescribing (median absolute increase of 5 percent) and varied for other types of professional performance (median absolute increase of 4 to 16 percent). A few head-to-head studies also suggest academic detailing has a slightly larger impact than audit and feedback (O’Brien et al., 2007). Academic detailing is more costly than other interventions; one analysis found that it is cost-effective (Soumerai and Avorn, 1986), while a more recent analysis concluded that it was not (Shankaran et al., 2009).

Members of the Social System (CPG Users)

Intended users of a CPG must be clearly delineated to promote use of CPG recommendations at the point of care delivery. CPGs are likely to affect the practice of multiple players and types of clinicians involved in the delivery of care.

Those promoting adoption of a CPG must understand the work and challenges of these multiple stakeholders. Members of a social system (e.g., nurses, physicians, clerical staff) influence how quickly and widely CPGs are adopted (Greenhalgh et al., 2005b). In addition to the communication strategies discussed in the previous section, implementation strategies targeted to users of a CPG include audit and feedback (A/F), performance gap assessment (PGA), and financial incentives. PGA and A/F have consistently shown positive effects on changing provider practice behavior (Bradley et al., 2004b; Horbar et al., 2004; Hysong et al., 2006; Jamtvedt et al., 2006a).

Performance Gap Assessment

PGA applies performance measures to provide information on, and prompt discussion of, current practices relative to recommended CPG practices at the beginning of a clinical practice change (Horbar et al., 2004; Titler et al., 2009). This implementation strategy is used to engage clinicians in discussion of practice issues and in formulating steps or system-level strategies to promote alignment of their practices with CPG recommendations. Specific practice indicators selected for PGA are derived from CPG recommendations. Studies have shown improvements in performance when PGA is part of a multifaceted implementation intervention (Horbar et al., 2004; Titler et al., 2009), but use of this approach by itself is unlikely to improve adoption of CPG recommendations (Buetow and Roland, 1999). Yano (2008) discusses the essential role of performance gap assessment in CPG implementation in the Veterans Affairs Quality Enhancement Research Initiative (VA QUERI) program.

Audit and Feedback

Audit and feedback is a continuous process of measuring performance (both process and outcome), aggregating data into reports, and discussing the findings with practitioners (Greenhalgh et al., 2005b; Horbar et al., 2004; Jamtvedt et al., 2006a; Katz et al., 2004a,b; Titler et al., 2009). This strategy helps clinicians see how their efforts to improve care processes (e.g., pain assessment every 4 hours) and patient outcomes (e.g., lower pain intensity) are progressing. There is no clear empirical evidence on how best to provide audit and feedback, although findings from several studies and systematic reviews suggest that effects may be larger when clinicians are active participants in implementing change and in discussing data audits rather than passive recipients of feedback reports (Hysong et al., 2006; Jamtvedt et al., 2006a; Kiefe et al., 2001).

A Cochrane review of 118 studies compared audit and feedback alone or with other interventions (Jamtvedt et al., 2006a). Results of audit and feedback varied substantially, with a small median effect of a 5 percent absolute increase in performance. Audit and feedback seemed most effective when baseline performance was low and the feedback was intensive. A meta-analysis of 19 studies demonstrated that specific suggestions for improving care, written feedback, and more frequent feedback strengthened the effect (Hysong, 2009). Qualitative studies provide some insight into the use of audit and feedback (Bradley et al., 2004a; Hysong et al., 2006). One study on the use of data feedback for improving treatment of acute myocardial infarction found that (1) feedback data must be perceived by physicians as important and valid; (2) the data source and the timeliness of data feedback are critical to perceived validity; (3) it takes time to establish the credibility of data within a hospital; (4) benchmarking improves the validity of data feedback; and (5) physician leaders can enhance the effectiveness of data feedback. The literature also supports that data feedback profiling an individual physician’s practices can be effective but may be perceived as punitive; that data feedback must persist to sustain improved performance; and that the effectiveness of data feedback is intertwined with the organizational context, including physician leadership and organizational culture (Bradley et al., 2004a). Hysong and colleagues (2006) found that high-performing institutions provided timely, individualized, nonpunitive feedback to providers, whereas low performers were more variable in their timeliness and nonpunitiveness and relied more on standardized, facility-level reports. The concept of actionable feedback emerged as the core concept shared across timeliness, individualization, nonpunitiveness, and customizability.
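As a purely illustrative sketch (not drawn from the studies cited above, and only loosely in the spirit of the achievable-benchmark feedback examined by Kiefe et al., 2001), the audit step can be viewed as aggregating encounter-level process data into per-clinician adherence rates and comparing them with a benchmark; the records, field names, and benchmark choice below are hypothetical.

```python
from collections import defaultdict

# Hypothetical audit records: one row per eligible encounter, flagging whether the
# CPG-recommended process (e.g., pain assessment every 4 hours) was documented.
audit_rows = [
    {"clinician": "A", "adherent": True},
    {"clinician": "A", "adherent": False},
    {"clinician": "B", "adherent": True},
    {"clinician": "B", "adherent": True},
    {"clinician": "C", "adherent": False},
    {"clinician": "C", "adherent": True},
    {"clinician": "C", "adherent": False},
]


def adherence_rates(rows):
    """Aggregate encounter-level audit data into per-clinician adherence rates."""
    counts = defaultdict(lambda: [0, 0])  # clinician -> [adherent encounters, total encounters]
    for r in rows:
        counts[r["clinician"]][1] += 1
        if r["adherent"]:
            counts[r["clinician"]][0] += 1
    return {c: adherent / total for c, (adherent, total) in counts.items()}


def feedback_report(rates, benchmark):
    """Produce individualized feedback lines relative to a benchmark."""
    lines = []
    for clinician, rate in sorted(rates.items()):
        gap = benchmark - rate
        status = "meets benchmark" if gap <= 0 else f"{gap:.0%} below benchmark"
        lines.append(f"Clinician {clinician}: adherence {rate:.0%} ({status})")
    return lines


rates = adherence_rates(audit_rows)
benchmark = max(rates.values())  # e.g., a benchmark drawn from top performers
for line in feedback_report(rates, benchmark):
    print(line)
```

In practice, the qualitative findings above suggest the report itself matters as much as the numbers: feedback should be timely, individualized, nonpunitive, and specific enough to be actionable.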

Financial Incentives

Financial incentives have been evaluated for their impact on provider performance and quality-of-care measures, including appropriate prescribing for specific conditions such as heart failure and appropriate delivery of preventive services (Werner and Dudley, 2009). Medicare, other insurers, and integrated health plans have begun tying reimbursement rates to targets for performance or improvement. Many “pay for performance” interventions have targeted hospitals or physician groups, in part because of the need for sufficient numbers to measure performance reliably. Integrated health plans have employed incentives targeting individual clinicians. The limited literature on individual-level incentives, targeting measures of preventive care, diabetes, asthma, and heart failure, suggests generally positive effects (Christianson et al., 2008; Giuffrida et al., 1999; Greene and Nash, 2009; Petersen et al., 2006). Petersen’s review reported that five of six studies of physician-level incentives and seven of nine studies of group-level incentives found partial or positive effects on quality-of-care process measures (e.g., cervical cancer screening, mammography, and hemoglobin A1c testing) (Petersen et al., 2006). Obstacles associated with incentives also have been documented: physicians may try to “game” measures by excluding certain patients; improvements may reflect better documentation rather than practice changes; and performance targets and payment strategies must be tailored to the goals of the incentive program and to variations in participating practices’ performance (Christianson et al., 2008; Werner and Dudley, 2009).

Social System/Context of Practice

Clearly, the social system or context of care delivery matters when implementing CPGs (Anderson et al., 2005; Batalden et al., 2003; Cummings et al., 2007; Estabrooks et al., 2008; Fleuren et al., 2004; Fraser, 2004; Greenhalgh et al., 2005a; Kirsh et al., 2008; Kochevar and Yano, 2006; Kothari et al., 2009; Litaker et al., 2008; Redfern and Christian, 2003; Rubenstein and Pugh, 2006; Scott-Findlay and Golden-Biddle, 2005; Scott et al., 2008; Stetler, 2003; Stetler et al., 2009; Titler et al., 2009; Yano, 2008). The implementation strategies described above are instituted within a system of care delivery. Strategies that focus on organizational factors alter the clinical practice environment by systematizing work processes and involving physicians and others (e.g., nurses, physical therapists) in guideline implementation. The underlying principle of organizational implementation strategies is to create systems of practice that make it easier to consistently adopt guideline recommendations. Factors within and across healthcare systems that foster use of CPGs include overall size and complexity of the healthcare system, infrastructure support (e.g., absorptive capacity for new knowledge; assessing and structuring workflow), multihealth system collaboratives, and professional associations. Each is described briefly in the following sections.

Healthcare Systems

The type (e.g., public, private) and complexity of healthcare organizations influence adoption of CPG recommendations. For example, Vaughn et al. (2002) demonstrated that organizational resources, physician full-time equivalents per 1,000 patient visits, organizational size, and urbanicity affected use of evidence in the VA healthcare system.

Aarons et al. (2009) demonstrated in a large multisite study that providers working in private organizations had more positive attitudes toward evidence-based practices and that their organizations provided more support for implementing CPG recommendations (Aarons et al., 2009; Yano, 2008).

Large, mature, functionally differentiated organizations (e.g., those divided into semiautonomous departments and units) that are specialized, with a focus of professional knowledge, available resources to channel into new projects, decentralized decision making, and low levels of formalization, will more readily adopt innovations such as new CPG-based practices. Larger organizations are generally more innovative because size increases the likelihood that other predictors of CPG adoption will be present, such as financial and human resources and role differentiation (Greenhalgh et al., 2005a; Yano, 2008). Establishing semiautonomous teams is associated with successful implementation of CPGs and thus should be considered in managing organizational units (Adler et al., 2003; Grumbach and Bodenheimer, 2004; Shojania et al., 2006; Shortell, 2004).

Infrastructure Support

Infrastructure support to promote use of CPG recommendations is defined in a variety of ways, but it usually includes absorptive capacity, leadership, and technology infrastructure (discussed in the section on Electronic Interventions for CPG Implementation) to support application of CPG recommendations at the point of care delivery. Absorptive capacity is the knowledge and skill to enact CPG recommendations, remembering that strength of evidence alone will not promote adoption. An organization that is able to systematically identify, capture, interpret, share, reframe, and recodify new knowledge, and then use it appropriately, will be better able to assimilate CPG recommendations (BootsMiller et al., 2004; Ferlie et al., 2001; Stetler et al., 2009; Wensing et al., 2006). Variation in capacity for change affects sustained implementation of evidence-based preventive service delivery in community-based primary care practices (Litaker et al., 2008). A learning culture and proactive leadership that promotes knowledge sharing are important components of building absorptive capacity for new knowledge (Estabrooks, 2003; Horbar et al., 2004; Lozano et al., 2004; Nelson et al., 2002). Components of a receptive context include strong leadership, clear strategic vision,

176 CLINICAL PRACTICE GUIDELINES WE CAN TRUST Bertoni, A. G., D. E. Bonds, H. Chen, P. Hogan, L. Crago, E. Rosenberger, A. H. Barham, C. R. Clinch, and D. C. Goff, Jr. 2009. Impact of a multifaceted interven - tion on cholesterol management in primary care practices: Guideline adherence for heart health randomized trial. Archives of Internal Medicine 169(7):678–686. Bishop, T. F., A. D. Federman, and S. Keyhani. 2010. Physicians’ views on defensive medicine: A national survey. Archives of Internal Medicine 170(12):1081–1083. Blumenthal, D. 2009. Stimulating the adoption of health information technology. New England Journal of Medicine 360(15):1477–1479. Boivin, A., J. Green, J. van der Meulen, F. Legare, and E. Nolte. 2009. Why consider patients’ preferences?: A discourse analysis of clinical practice guideline devel - opers. Medical Care 47(8):908–915. BootsMiller, B. J., J. W. Yankey, S. D. Flach, M. M. Ward, T. E. Vaughn, K. F. Welke, and B. N. Doebbeling. 2004. Classifying the effectiveness of Veterans Affairs guideline implementation approaches. American Journal of Medical Quality 19(6):248–254. Bradley, E. H., E. S. Holmboe, J. A. Mattera, S. A. Roumanis, M. J. Radford, and H. M. Krumholz. 2004a. Data feedback efforts in quality improvement: lessons learned from U.S. hospitals. Quality and Safety in Health Care 13(1):26–31. Bradley, E. H., M. Schlesinger, T. R. Webster, D. Baker, and S. K. Inouye. 2004b. Translating research into clinical practice: Making change happen. Journal of the American Geriatrics Society 52(11):1875–1882. Brooks, J. M., M. G. Titler, G. Ardery, and K. Herr. 2009. Effect of evidence-based acute pain management practices on inpatient costs. Health Services Research 44(1):245–263. Buetow, S. A., and M. Roland. 1999. Clinical governance: Bridging the gap between managerial and clinical approaches to quality of care. Quality Health Care 8(3):184–190. Butzlaff, M., H. Vollmar, B. Floer, N. Koneczny, J. Isfort, and S. Lange. 2004. Learning with computerized guidelines in general practice?: A randomized controlled trial. Family Practice 21(2):183–188. Carrier, E. R., J. D. Reschovsky, M. M. Mello, R. C. Mayrell, and D. Katz. 2010. Phy - sicians’ fears of malpractice lawsuits are not assuaged by tort reforms. Health Affairs 29(9):1585–1592. Carter, B. L., A. Hartz, G. Bergus, J. D. Dawson, W. R. Doucette, J. J. Stewart, and Y. Xu. 2006. Relationship between physician knowledge of hypertension and blood pressure control. Journal of Clinical Hypertension (Greenwich) 8(7):481–486. Chin, M. H., S. Cook, M. L. Drum, L. Jin, M. Guillen, C. A. Humikowski, J. Koppert, J. F. Harrison, S. Lippold, and C. T. Schaefer. 2004. Improving diabetes care in midwest community health centers with the health disparities collaborative. Diabetes Care 27(1):2–8. Chong, C. A., I. J. Chen, G. Naglie, and M. D. Krahn. 2009. How well do guidelines incorporate evidence on patient preferences? Journal of General Internal Medicine 24(8):977–982. Christianson, J. B., S. Leatherman, and K. Sutherland. 2008. Lessons from evaluations of purchaser Pay-for-Performance programs: A review of the evidence. Medical Care Research Review 65(6 Suppl):5S–35S. Coronel, S., and M. J. Krantz. 2007. Medical therapy for symptomatic heart failure: A contemporary treatment algorithm. Critical Pathways in Cardiology: A Journal of Evidence-Based Medicine 6(1):15–17. Cullen, L. 2005. Evidence-based practice: Strategies for nursing leaders. In Leader- ship and nursing care management, 3rd ed., edited by D. Huber. 
Philadelphia, PA: Elsevier. Pp. 461–478.

177 PROMOTING ADOPTION OF CLINICAL PRACTICE GUIDELINES Cummings, G. G., C. A. Estabrooks, W. K. Midodzi, L. Wallin, and L. Hayduk. 2007. Influence of organizational characteristics and context on research utilization. Nursing Research 56(4 Suppl):S24–S39. Damschroder, L., D. Aron, R. Keith, S. Kirsh, J. Alexander, and J. Lowery. 2009. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science 4(1):50. Davies, P., A. Walker, and J. Grimshaw. 2010. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implementation Science 5(1):14. Dayton, C. S., J. Scott Ferguson, D. B. Hornick, and M. W. Peterson. 2000. Evaluation of an Internet-based decision-support system for applying the ATS/CDC guide- lines for tuberculosis preventive therapy. Medical Decision Making 20(1):1–6. Demakis, J. G., L. McQueen, K. W. Kizer, and J. R. Feussner. 2000. Quality Enhance - ment Research Initiative (QUERI): A collaboration between research and clinical practice. Medical Care 38(6 Suppl 1):I17–I25. DesRoches, C. M., E. G. Campbell, S. R. Rao, K. Donelan, T. G. Ferris, A. Jha, R. Kaushal, D. E. Levy, S. Rosenbaum, A. E. Shields, and D. Blumenthal. 2008. Electronic health records in ambulatory care—A national survey of physicians. New England Journal of Medicine 359(1):50–60. Dexheimer, J. W., T. R. Talbot, D. L. Sanders, S. T. Rosenbloom, and D. Aronsky. 2008. Prompting clinicians about preventive care measures: A systematic review of randomized controlled trials. Journal of the American Medical Informatics Associa- tion 15(3):311–320. Dexter, P. R., S. Perkins, J. M. Overhage, K. Maharry, R. B. Kohler, and C. J. McDonald. 2001. A computerized reminder system to increase the use of preventive care for hospitalized patients. New England Journal of Medicine 345(13):965–970. Dobbins, M., S. E. Hanna, D. Ciliska, S. Manske, R. Cameron, S. L. Mercer, L. O’Mara, K. DeCorby, and P. Robeson. 2009. A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies. Implementation Science 4:61. Dopson, S., L. Locock, D. Chambers, and J. Gabbay. 2001. Implementation of evidence- based medicine: Evaluation of the Promoting Action on Clinical Effectiveness programme. Journal of Health Services Research and Policy 6(1):23–31. Doumit, G., M. Gattellari, J. Grimshaw, and M. A. O’Brien. 2007. Local opinion lead - ers: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews (1):CD000125. Durieux, P., L. Trinquart, I. Colombet, J. Nies, R. Walton, A. Rajeswaran, M. Rege Walther, E. Harvey, and B. Burnand. 2008. Computerized advice on drug dos- age to improve prescribing practice. Cochrane Database of Systematic Reviews (3):CD002894. Eccles, M. P., and B. S. Mittman. 2006. Welcome to implementation science. Implemen- tation Science 1:7:1–6. Eccles, M., E. McColl, N. Steen, N. Rousseau, J. Grimshaw, and D. Parkin. 2002. Effect of computerised evidence-based guidelines on management of asthma and angina in adults in primary care: Cluster randomised controlled trial. BMJ 325:941–948. Estabrooks, C. A. 2003. Translating research into practice: Implications for organiza - tions and administrators. Canadian Journal of Nursing Research 35(3):53–68.

178 CLINICAL PRACTICE GUIDELINES WE CAN TRUST Estabrooks, C. A., L. Derksen, C. Winther, J. N. Lavis, S. D. Scott, L. Wallin, and J. Profetto-McGrath. 2008. The intellectual structure and substance of the knowl - edge utilization field: A longitudinal author co-citation analysis, 1945 to 2004. Implementation Science 3:49. Farley, D. O., M. C. Haims, D. J. Keyser, S. S. Olmsted, S. V. Curry, and M. Sorbero. 2003. Regional health quality improvement coalitions: Lessons across the life cycle. Santa Monica, CA: RAND Health. Farmer, A. P., F. Legare, L. Turcot, J. Grimshaw, E. Harvey, J. L. McGowan, and F. Wolf. 2008. Printed educational materials: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews (3):CD004398. Feldman, P. H., C. M. Murtaugh, L. E. Pezzin, M. V. McDonald, and T. R. Peng. 2005. Just-in-time evidence-based e-mail “reminders” in home health care: Impact on patient outcomes. Health Services Research 40(3):865–885. Feldstein, A., P. J. Elmer, D. H. Smith, M. Herson, E. Orwoll, C. Chen, M. Aickin, and M. C. Swain. 2006. Electronic medical record reminder improves osteoporosis management after a fracture: A randomized, controlled trial. Journal of the Ameri- can Geriatrics Society 54(3):450–457. Ferlie, E., J. Gabbay, L. Fitzgerald, L. Locock, and S. Dopson. 2001. Evidence-based medicine and organisational change: An overview of some recent qualitative research. In Organisational behavior and organisational studies in health care: Reflec - tions on the future, edited by L. Ashburner. Basingstoke: Palgrave. Fleuren, M., K. Wiefferink, and T. Paulussen. 2004. Determinants of innovation within health care organizations: Literature review and Delphi study. International Jour- nal of Quality Health Care 16(2):107–123. Florida Agency for Health Care Administration 1998. Practice guidelines as affirmative defense: The Cesarean Demonstration Project Report. Fonarow, G. C., M. J. Reeves, X. Zhao, D. M. Olson, E. E. Smith, J. L. Saver, and L. H. Schwamm. 2010. Age-related differences in characteristics, performance measures, treatment trends, and outcomes in patients with ischemic stroke. Circulation 121(7):879–891. Forsetlund, L., A. Bjorndal, A. Rashidian, G. Jamtvedt, M. A. O’Brien, F. Wolf, D. Davis, J. Odgaard-Jensen, and A. D. Oxman. 2009. Continuing education meet - ings and workshops: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews (2):CD003030. Francke, A. L., M. C. Smit, A. J. de Veer, and P. Mistiaen. 2008. Factors influencing the implementation of clinical guidelines for health care professionals: A systematic meta-review. BMC Medical Informatics and Decision Making 8:38. Fraser, I. 2004. Organizational research with impact: Working backwards. Worldviews Evidence Based Nursing 1(Suppl 1):S52–S59. Fung, C. H., J. N. Woods, S. M. Asch, P. Glassman, and B. N. Doebbeling. 2004. Variation in implementation and use of computerized clinical reminders in an integrated healthcare system. American Journal of Managed Care 10(11 Pt 2):878–885. Garg, A. X., N. K. J. Adhikari, H. McDonald, M. P. Rosas-Arellano, P. J. Devereaux, J. Beyene, J. Sam, and R. B. Haynes. 2005. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: A system - atic review. JAMA 293(10):1223–1238. Giuffrida, A., H. Gravelle, and M. Roland. 1999. Measuring quality of care with routine data: Avoiding confusion between performance indicators and health outcomes. 
BMJ 319(7202):94–98.

179 PROMOTING ADOPTION OF CLINICAL PRACTICE GUIDELINES Graham, I. D., J. Tetroe, and M. Gagnon. 2009. Lost in translation: Just lost or begin - ning to find our way? Annals of Emergency Medicine 54(2):313–314; discussion 314. Greene, S. E., and D. B. Nash. 2009. Pay for Performance: An overview of the litera - ture. American Journal of Medical Quality 24(2):140–163. Greenhalgh, T., A. Collard, and N. Begum. 2005a. Sharing stories: Complex interven - tion for diabetes education in minority ethnic groups who do not speak English. BMJ 330(7492):628. Greenhalgh, T., G. Robert, P. Bate, F. Macfarlane, and O. Kyriakidou. 2005b. Diffusion of innovations in health service organisations: A systematic literature review . Malden, MA: Blackwell Publishing Ltd. Grilli, R., C. Ramsay, and S. Minozzi. 2002. Mass media interventions: Effects on health services utilisation. Cochrane Database of Systematic Reviews (1):CD000389. Grimshaw, J. M., L. Shirran, R. Thomas, G. Mowatt, C. Fraser, L. Bero, R. Grilli, E. Harvey, A. Oxman, and M. A. O’Brien. 2001. Changing provider behavior: An overview of systematic reviews of interventions. Medical Care 39(8 Suppl 2): II2–II45. Grimshaw, J., L. M. McAuley, L. A. Bero, R. Grilli, A. D. Oxman, C. Ramsay, L. Vale, and M. Zwarenstein. 2003. Systematic reviews of the effectiveness of qual- ity improvement strategies and programmes. Quality and Safety in Health Care 12(4):298–303. Grimshaw, J., M. Eccles, and J. Tetroe. 2004a. Implementing clinical guidelines: Cur- rent evidence and future implications. The Journal of Continuing Education in the Health Professions 24(Suppl 1):S31–S37. Grimshaw, J., R. Thomas, G. MacLennan, C. Fraser, C. Ramsay, L. Vale, P. Whitty, M. Eccles, L. Matowe, L. Shirran, M. Wensing, R. Dijkstra, and C. Donaldson. 2004b. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technology Assessment 8(6):1–72. Grimshaw, J. M., R. E. Thomas, G. MacLennan, C. Fraser, C. R. Ramsay, L. Vale, P. Whitty, M. P. Eccles, L. Matowe, L. Shirran, M. Wensing, R. Dijkstra, and C. Donaldson. 2004c. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technology Assessment 8(6):iii–iv, 1–72. Grimshaw, J., M. Eccles, R. Thomas, G. MacLennan, C. Ramsay, C. Fraser, and L. Vale. 2006a. Toward evidence-based quality improvement. Journal of General Internal Medicine 1(Suppl 2):S14–S20. Grimshaw, J. M., M. P. Eccles, J. Greener, G. Maclennan, T. Ibbotson, J. P. Kahan, and F. Sullivan. 2006b. Is the involvement of opinion leaders in the implementation of research findings a feasible strategy? Implementation Science 1:3. Grol, R., J. Dalhuijsen, S. Thomas, C. Veld, G. Rutten, and H. Mokkink. 1998. Attri - butes of clinical guidelines that influence use of guidelines in general practice: Observational study. BMJ 317(7162):858–861. Gross, P. A. 2000. Implementing Evidence-Based Recommendations for Health Care: A Roundtable Comparing European and American Experiences. Joint Commis- sion Journal on Quality and Patient Safety 26:547–553. Gross, P. A., S. Greenfield, S. Cretin, J. Ferguson, J. Grimshaw, R. Grol, N. Klazinga, W. Lorenz, G. S. Meyer, C. Riccobono, S. C. Schoenbaum, P. Schyve, and C. Shaw. 2001. Optimal methods for guideline implementation: Conclusions from Leeds Castle meeting. Medical Care 39(8 Suppl 2):II85–II92. Grumbach, K., and T. Bodenheimer. 2004. Can health care teams improve primary care practice? JAMA 291(10):1246–1251.

180 CLINICAL PRACTICE GUIDELINES WE CAN TRUST Hagedorn, H., M. Hogan, J. Smith, C. Bowman, G. Curran, D. Espadas, B. Kimmel, L. Kochevar, M. Legro, and A. Sales. 2006. Lessons learned about implement - ing research evidence into clinical practice. Journal of General Internal Medicine 21(0):S21–S24. Harvey, G., A. Loftus-Hills, J. Rycroft-Malone, A. Titchen, A. Kitson, B. McCormack, and K. Seers. 2002. Getting evidence into practice: The role and function of fa - cilitation. Journal of Advanced Nursing 37(6):577–588. Hendryx, M. S., J. F. Fieselmann, M. J. Bock, D. S. Wakefield, C. M. Helms, and S. E. Bentler. 1998. Outreach education to improve quality of rural ICU care. Results of a randomized trial. American Journal of Respiratory and Critical Care Medicine 158(2):418–423. Horbar, J. D., R. F. Soll, G. Suresh, J. Buzas, M. B. Bracken, P. E. Plsek. 2004. Evidence- based surfactant therapy for preterm infants. In Final progress report to AHRQ. Burlington: University of Vermont. Hyams, A. L., J. A. Brandenburg, S. R. Lipsitz, D. W. Shapiro, and T. A. Brennan. 1995. Practice guidelines and malpractice litigation: A two-way street. Annals of Internal Medicine 122(6):450–455. Hyatt, J. D., R. P. Benton, and S. F. Derose. 2002. A multifaceted model for implement - ing clinical practice guidelines across the continuum of care. Journal of Clinical Outcomes Management 9(4):199–206. Hysong, S. J. 2009. Meta-analysis: Audit and feedback features impact effectiveness on care quality. Medical Care 47(3):356–363. Hysong, S., R. Best, and J. Pugh. 2006. Audit and feedback and clinical practice guide- line adherence: Making feedback actionable. Implementation Science 1(1):9. ICSI (Institute for Clinical Systems Improvement). 2010. ICSI history. http://www. icsi.org/about/icsi_history/ (accessed July 8, 2010). Irwin, C., and E. M. Ozer. 2004. Implementing adolescent preventive guidelines. In Final progress report to AHRQ. San Francisco: University of California–San Fran- cisco Division of Adolescent Medicine. Jamtvedt, G., J. M. Young, D. T. Kristoffersen, M. A. O’Brien, and A. D. Oxman. 2006a. Audit and feedback: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews (2):CD000259. Jamtvedt, G., J. M. Young, D. T. Kristoffersen, M. A. O’Brien, and A. D. Oxman. 2006b. Does telling people what they have been doing change what they do? A systematic review of the effects of audit and feedback. Quality and Safety in Health Care 15(6):433–436. Jones, K. R., R. Fink, C. Vojir, G. Pepper, E. Hutt, L. Clark, J. Scott, R. Martinez, D. Vincent, and B. K. Mellis. 2004. Translation research in long-term care: Im- proving pain management in nursing homes. Worldviews Evidence Based Nursing 1(Suppl 1):S13–S20. Jones, J., C. Snyder, and A. Wu. 2007. Issues in the design of Internet-based systems for collecting patient-reported outcomes. Quality of Life Research 16(8):1407–1417. Jones, J. B., W. F. Stewart, J. Darer, and D. F. Sittig. 2010. Beyond the threshold: Real time use of evidence in practice. In Committee on Standards for Trustworthy Clinical Practice Guidelines commissioned paper. Kanter, M., O. Martinez, G. Lindsay, K. Andrews, and C. Denver. 2010. Proactive of - fice encounter: A systematic approach to preventive and chronic care at every patient encounter. The Permanente Journal 14(3):38–43. Katz, D. A., R. B. Brown, D. R. Muehlenbruch, M. C. Fiore, and T. B. Baker. 2004a. 
Implementing guidelines for smoking cessation: Comparing the efforts of nurses and medical assistants. American Journal of Preventive Medicine 27(5):411–416.

181 PROMOTING ADOPTION OF CLINICAL PRACTICE GUIDELINES Katz, D. A., D. R. Muehlenbruch, R. L. Brown, M. C. Fiore, and T. B. Baker. 2004b. Effectiveness of implementing the Agency for Healthcare Research and Quality smoking cessation clinical practice guideline: A randomized, controlled trial. Journal of National Cancer Institute 96(8):594–603. Kiefe, C. I., J. J. Allison, O. D. Williams, S. D. Person, M. T. Weaver, and N. W. Weiss - man. 2001. Improving quality improvement using achievable benchmarks for physician feedback: A randomized controlled trial. JAMA 285(22):2871–2879. Kirsh, S. R., R. H. Lawrence, and D. C. Aron. 2008. Tailoring an intervention to the context and system redesign related to the intervention: A case study of imple - menting shared medical appointments for diabetes. Implementation Science 3:34. Kochevar, L. K., and E. M. Yano. 2006. Understanding health care organization needs and context. Beyond performance gaps. Journal of General Internal Medicine 21(Suppl 2):S25–S29. Kothari, A., N. Edwards, N. Hamel, and M. Judd. 2009. Is research working for you? Validating a tool to examine the capacity of health organizations to use research. Implementation Science 4:46. Kozel, C. T., W. M. Kane, E. M. Rogers, J. E. Brandon, M. T. Hatcher, M. J. Hammes, and R. E. Operhall. 2003. Exploring health promotion agenda-setting in New Mexico: Reshaping health promotion leadership. Promoting Education 10(4):171– 177, 198, 209. Krantz, M. J., S. Cornel, and W. R. Hiatt. 2005. Use of ankle brachial index screening for selecting patients for antiplatelet drug therapy. Pharmacotherapy 25(12):1826– 1828. Kuilboer, M. M., M. A. van Wijk, M. Mosseveld, E. van der Does, J. C. de Jongste, S. E. Overbeek, B. Ponsioen, and J. van der Lei. 2006. Computed critiquing integrated into daily clinical practice affects physicians’ behavior: A randomized clinical trial with AsthmaCritic. Methods of Information in Medicine 45(5):431–437. LeCraw, L. L. 2007. Use of clinical practice guidelines in medical malpractice litiga - tion. Oncology Practice (3):254. Levine, R. S., B. A. Husaini, N. Briggs, V. Cain, T. Cantrell, C. Craun.. 2004. Translat - ing prevention research into practice. In Final progress report to AHRQ. Nashville: Meharry Medical College/Tennessee State University. Litaker, D., M. Ruhe, S. Weyer, and K. Stange. 2008. Association of intervention outcomes with practice capacity for change: Subgroup analysis from a group randomized trial. Implementation Science 3(1):25. Locock, L., S. Dopson, D. Chambers, and J. Gabbay. 2001. Understanding the role of opinion leaders in improving clinical effectiveness. Social Science and Medicine 53(6):745–757. Loeb, M., K. Brazil, A. McGeer, K. Stevenson, S. D. Walter, L. Lohfeld. 2004. Optimiz - ing antibiotic use in long term care. In Final progress report to AHRQ. Hamilton, Ontario, Canada: McMaster University. Lozano, P., J. A. Finkelstein, V. J. Carey, E. H. Wagner, T. S. Inui, A. L. Fuhlbrigge, S. B. Soumerai, S. D. Sullivan, S. T. Weiss, and K. B. Weiss. 2004. A multi- site randomized trial of the effects of physician education and organizational change in chronic-asthma care: Health outcomes of the Pediatric Asthma Care Patient Outcomes Research Team II Study. Archives of Pediatric Adolescent Medi- cine 158(9):875–883. Mansouri, M., and J. Lockyer. 2007. A meta-analysis of continuing medical educa tion effectiveness. Journal of Continuing Education in the Health Professions 27(1):6–15.

182 CLINICAL PRACTICE GUIDELINES WE CAN TRUST McDonald, M. V., L. E. Pezzin, P. H. Feldman, C. M. Murtaugh, and T. R. Peng. 2005. Can just-in-time, evidence-based “reminders” improve pain management among home health care nurses and their patients? Journal of Pain Symptom Management 29(5):474–488. McGlynn, E. A., S. M. Asch, J. Adams, J. Keesey, J. Hicks, A. DeCristofaro, and E. A. Kerr. 2003. The quality of health care delivered to adults in the United States. New England Journal of Medicine 348(26):2635–2645. Mehler, P. S., M. J. Krantz, R. A. Lundgren, R. O. Estacio, T. D. MacKenzie, L. Petralia, and W. R. Hiatt. 2005. Bridging the quality gap in diabetic hyperlipidemia: A practice-based intervention. American Journal of Medicine 118(12):1414. Mehta, R. H., C. K. Montoye, M. Gallogly, P. Baker, A. Blount, J. Faul, C. Roychoud - hury, S. Borzak, S. Fox, M. Franklin, M. Freundl, E. Kline-Rogers, T. LaLonde, M. Orza, R. Parrish, M. Satwicz, M. J. Smith, P. Sobotka, S. Winston, A. A. Riba, and K. A. Eagle. 2002. Improving quality of care for acute myocardial infarction: The Guidelines Applied in Practice (GAP) initiative. JAMA 287(10):1269–1276. Mello, M. M. 2001. Of swords and shields: The role of clinical practice guidelines in medical practice litigation. University of Pennsylvania Law Review 149 U. Pa. L. Rev. 645. Miller, P. L., S. J. Frawley, and F. G. Sayward. 2001. Maintaining and incrementally revalidating a computer-based clinical guideline: A case study. Journal of Biomedi- cal Informatics 34(2):99–111. Murtaugh, C. M., L. E. Pezzin, M. V. McDonald, P. H. Feldman, and T. R. Peng. 2005. Just-in-time evidence-based e-mail “reminders” in home health care: Impact on nurse practices. Health Services Research 40(3):849–864. Nelson, E. C., P. B. Batalden, T. P. Huber, J. J. Mohr, M. M. Godfrey, L. A. Head- rick, and J. H. Wasson. 2002. Microsystems in health care: Learning from high- performing front-line clinical units. Joint Commission Journal of Quality Improve- ment 28(9):472–493. NGC (National Guideline Clearinghouse). 2010. National Guideline Clearinghouse. http://www.guideline.gov/ (accessed April 7, 2010). Nieva, V., R. Murphy, N. Ridley, N. Donaldson, J. Combes, P. Mitchell. 2005. From science to service: A framework for the transfer of patient safety research into practice , advanced in patient safety: From research to implementation. Rockville, MD: Agency for Healthcare Research and Quality. O’Brien, M. A., S. Rogers, G. Jamtvedt, A. D. Oxman, J. Odgaard-Jensen, D. T. Kristof- fersen, L. Forsetlund, D. Bainbridge, N. Freemantle, D. A. Davis, R. B. Haynes, and E. L. Harvey. 2007. Educational outreach visits: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviewvs (4):CD000409. O’Connor, P. J., L. I. Solberg, J. Christianson, G. Amundson, and G. Mosser. 1996. Mechanism of action and impact of a cystitis clinical practice guideline on out - comes and costs of care in an HMO. Joint Commission Journal of Quality Improve- ment 22(10):673–682. Open Clinical. 2010. Guideline modeling methods summaries. http://www.openclinical. org/gmmsummaries.html (accessed July 8 2010). Petersen, L. A., L. D. Woodard, T. Urech, C. Daw, and S. Sookanan. 2006. Does pay- for-performance improve the quality of health care? Annals of Internal Medicine 145(4):265–272. Prior, M., M. Guerin, and K. Grimmer-Somers. 2008. The effectiveness of clinical guideline implementation strategies—A synthesis of systematic review findings. 
Journal of Evaluation in Clinical Practice 14(5):888–897.

Redfern, S., and S. Christian. 2003. Achieving change in health care practice. Journal of Evaluation in Clinical Practice 9(2):225–238.
Rosoff, A. J. 2001. Evidence-based medicine and the law: The courts confront clinical practice guidelines. Journal of Health Politics, Policy and Law 26(2):327–368.
Rubenstein, L. V., and J. Pugh. 2006. Strategies for promoting organizational and practice change by advancing implementation research. Journal of General Internal Medicine 21(Suppl 2):S58–S64.
Rycroft-Malone, J., and T. Bucknall. 2010. Models and frameworks for implementing evidence-based practice: Linking evidence to action. Evidence-Based Nursing Series. Chichester, West Sussex, UK, and Ames, IA: Wiley-Blackwell.
Sales, A., D. Atkins, M. J. Krantz, and L. Solberg. 2010. Issues in implementation of trusted clinical practice guidelines. Paper commissioned by the Committee on Standards for Developing Trustworthy Clinical Practice Guidelines.
Scott, S. D., R. C. Plotnikoff, N. Karunamuni, R. Bize, and W. Rodgers. 2008. Factors influencing the adoption of an innovation: An examination of the uptake of the Canadian Heart Health Kit (HHK). Implementation Science 3:41.
Scott-Findlay, S., and K. Golden-Biddle. 2005. Understanding how organizational culture shapes research use. Journal of Nursing Administration 35(7–8):359–365.
Sequist, T. D., T. K. Gandhi, A. S. Karson, J. M. Fiskio, D. Bugbee, M. Sperling, E. F. Cook, E. J. Orav, D. G. Fairchild, and D. W. Bates. 2005. A randomized trial of electronic clinical reminders to improve quality of care for diabetes and coronary artery disease. Journal of the American Medical Informatics Association 12(4):431–437.
Shankaran, V., T. H. Luu, N. Nonzee, E. Richey, J. M. McKoy, J. Graff Zivin, A. Ashford, R. Lantigua, H. Frucht, M. Scoppettone, C. L. Bennett, and S. Sheinfeld Gorin. 2009. Costs and cost effectiveness of a health care provider-directed intervention to promote colorectal cancer screening. Journal of Clinical Oncology 27(32):5370–5375.
Shiffman, R., J. Dixon, C. Brandt, A. Essaihi, A. Hsiao, G. Michel, and R. O’Connell. 2005. The GuideLine Implementability Appraisal (GLIA): Development of an instrument to identify obstacles to guideline implementation. BMC Medical Informatics and Decision Making 5(1):23.
Shojania, K. G., and J. M. Grimshaw. 2005. Evidence-based quality improvement: The state of the science. Health Affairs 24(1):138–150.
Shojania, K. G., S. R. Ranji, K. M. McDonald, J. M. Grimshaw, V. Sundaram, R. J. Rushakoff, and D. K. Owens. 2006. Effects of quality improvement strategies for Type 2 diabetes on glycemic control: A meta-regression analysis. JAMA 296(4):427–440.
Shojania, K. G., A. Jennings, A. Mayhew, C. R. Ramsay, M. P. Eccles, and J. Grimshaw. 2009. The effects of on-screen, point of care computer reminders on processes and outcomes of care. Cochrane Database of Systematic Reviews (3):CD001096.
Shojania, K. G., A. Jennings, A. Mayhew, C. Ramsay, M. Eccles, and J. Grimshaw. 2010. Effect of point-of-care computer reminders on physician behaviour: A systematic review. Canadian Medical Association Journal 182(5):E216–E225.
Shortell, S. M. 2004. Increasing value: A research agenda for addressing the managerial and organizational challenges facing health care delivery in the United States. Medical Care Research and Review 61(3 Suppl):12S–30S.
Sittig, D. F., A. Wright, J. S. Ash, and B. Middleton. 2009a. A set of preliminary standards recommended for achieving a national repository of clinical decision support interventions. AMIA Annual Symposium Proceedings 2009:614–618.

Sittig, D. F., A. Wright, J. S. Ash, and B. Middleton. 2009b. A set of preliminary standards recommended for achieving a national repository of clinical decision support interventions. Paper presented at the AMIA 2009 Symposium, San Francisco, CA.
Smith, C. S., M. G. Harbrecht, S. M. Coronel, and M. J. Krantz. 2008. State consensus guideline for the prevention of cardiovascular disease in primary care settings. Critical Pathways in Cardiology: A Journal of Evidence-Based Medicine 7:122–125.
Solberg, L. 2009. Lessons for non-VA care delivery systems from the U.S. Department of Veterans Affairs Quality Enhancement Research Initiative: QUERI Series. Implementation Science 4(1):9.
Solberg, L. I., M. L. Brekke, C. J. Fazio, J. Fowles, D. N. Jacobsen, T. E. Kottke, G. Mosser, P. J. O’Connor, K. A. Ohnsorg, and S. J. Rolnick. 2000. Lessons from experienced guideline implementers: Attend to many factors and use multiple strategies. Joint Commission Journal on Quality Improvement 26(4):171–188.
Solberg, L. I., M. C. Hroscikoski, J. M. Sperl-Hillen, P. G. Harper, and B. F. Crabtree. 2006. Transforming medical care: Case study of an exemplary, small medical group. Annals of Family Medicine 4(2):109–116.
Solomon, D. H., L. Van Houten, R. J. Glynn, L. Baden, K. Curtis, H. Schrager, and J. Avorn. 2001. Academic detailing to improve use of broad-spectrum antibiotics at an academic medical center. Archives of Internal Medicine 161(15):1897–1902.
Soumerai, S. B., and J. Avorn. 1986. Economic and policy analysis of university-based drug “detailing.” Medical Care 24(4):313–331.
Soumerai, S. B., T. J. McLaughlin, J. H. Gurwitz, E. Guadagnoli, P. J. Hauptman, C. Borbas, N. Morris, B. McLaughlin, X. Gao, D. J. Willison, R. Asinger, and F. Gobel. 1998. Effect of local medical opinion leaders on quality of care for acute myocardial infarction: A randomized controlled trial. JAMA 279(17):1358–1363.
Stafford, R. S., L. K. Bartholomew, W. C. Cushman, J. A. Cutler, B. R. Davis, G. Dawson, P. T. Einhorn, C. D. Furberg, L. B. Piller, S. L. Pressel, and P. K. Whelton. 2010. Impact of the ALLHAT/JNC7 Dissemination Project on thiazide-type diuretic use. Archives of Internal Medicine 170(10):851–858.
Stetler, C. B. 2003. Role of the organization in translating research into evidence-based practice. Outcomes Management 7(3):97–103; quiz 104–105.
Stetler, C. B., M. W. Legro, J. Rycroft-Malone, C. Bowman, G. Curran, M. Guihan, H. Hagedorn, S. Pineros, and C. M. Wallace. 2006a. Role of “external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science 1:23.
Stetler, C. B., M. W. Legro, C. M. Wallace, C. Bowman, M. Guihan, H. Hagedorn, B. Kimmel, N. D. Sharp, and J. L. Smith. 2006b. The role of formative evaluation in implementation research and the QUERI experience. Journal of General Internal Medicine 21(Suppl 2):S1–S8.
Stetler, C. B., J. A. Ritchie, J. Rycroft-Malone, A. A. Schultz, and M. P. Charns. 2009. Institutionalizing evidence-based practice: An organizational case study using a model of strategic change. Implementation Science 4:78.
Thomas, J. W., E. C. Ziller, and D. A. Thayer. 2010. Low costs of defensive medicine, small savings from tort reform. Health Affairs 29(9):1578–1584.
Titler, M. G., and L. Q. Everett. 2001. Translating research into practice: Considerations for critical care investigators. Critical Care Nursing Clinics of North America 13(4):587–604.
Titler, M. G., L. Cullen, and G. Ardery. 2002. Evidence-based practice: An administrative perspective. Reflections on Nursing Leadership 28(2):26–27, 45, 46.

Titler, M. G., K. Herr, J. M. Brooks, X. J. Xie, G. Ardery, M. L. Schilling, J. L. Marsh, L. Q. Everett, and W. R. Clarke. 2009. Translating research into practice intervention improves management of acute pain in older hip fracture patients. Health Services Research 44(1):264–287.
Vaughn, T. E., K. D. McCoy, B. J. BootsMiller, R. F. Woolson, B. Sorofman, T. Tripp-Reimer, J. Perlin, and B. N. Doebbeling. 2002. Organizational predictors of adherence to ambulatory care screening guidelines. Medical Care 40(12):1172–1185.
Wallin, L. 2009. Knowledge translation and implementation research in nursing. International Journal of Nursing Studies 46(4):576–587.
Ward, M. M., T. C. Evans, A. J. Spies, L. L. Roberts, and D. S. Wakefield. 2006. National Quality Forum 30 safe practices: Priority and progress in Iowa hospitals. American Journal of Medical Quality 21(2):101–108.
Watson, N. M. 2004. Advancing quality of urinary incontinence evaluation and treatment in nursing homes through translational research. Worldviews on Evidence-Based Nursing 1(Suppl 1):S21–S25.
Weinstein, M. C., B. O’Brien, J. Hornberger, J. Jackson, M. Johannesson, C. McCabe, and B. R. Luce. 2003. Principles of good practice for decision analytic modeling in health-care evaluation: Report of the ISPOR Task Force on Good Research Practices—Modeling Studies. Value in Health 6(1):9–17.
Wensing, M., H. Wollersheim, and R. Grol. 2006. Organizational interventions to implement improvements in patient care: A structured review of reviews. Implementation Science 1:2.
Werner, R. M., and R. A. Dudley. 2009. Making the “pay” matter in pay-for-performance: Implications for payment strategies. Health Affairs 28(5):1498–1508.
Wright, A., H. Goldberg, T. Hongsermeier, and B. Middleton. 2007. A description and functional taxonomy of rule-based decision support content at a large integrated delivery network. Journal of the American Medical Informatics Association 14(4):489–496.
Wright, A., D. F. Sittig, J. S. Ash, S. Sharma, J. E. Pang, and B. Middleton. 2009. Clinical decision support capabilities of commercially-available clinical information systems. Journal of the American Medical Informatics Association 16(5):637–644.
Yano, E. M. 2008. The role of organizational research in implementing evidence-based practice: QUERI Series. Implementation Science 3:29.
