

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Summary

INTRODUCTION AND OVERVIEW

The demand for better evidence to guide healthcare decision making is increasing rapidly for a variety of reasons, including the adverse consequences of care administered without adequate evidence, emerging insights into the proportion of healthcare interventions that are unnecessary, recognition of the frequency of medical errors, heightened public awareness and concern about the very high costs of medical care, the burden on employers and employees, and the growing proportion of health costs coming from out of pocket (Fisher and Wennberg, 2003; Fisher et al., 2003a, 2003b; IOM, 2000, 2001, 2008a; McGlynn et al., 2003; Wennberg et al., 2002). Although nearly $2.5 trillion was spent in 2009 on health and medical care in the United States, only a very small portion of that amount—perhaps less than one tenth of 1 percent—was devoted to learning what works best in health care, for whom, and under what circumstances. To improve the effectiveness and value of the care delivered, the nation needs to build its capacity for ongoing study and monitoring of the relative effectiveness of clinical interventions and care processes through expanded trials and studies, systematic reviews, innovative research strategies, and clinical registries, as well as improving its ability to apply what is learned from such study through the translation and provision of information and decision support. Several recent initiatives have proposed the development of an entity to support expanded study of the comparative effectiveness of interventions. To inform policy discussions on how to meet the demand for more comparative effectiveness research (CER) as a means of improving

the effectiveness and value of health care, the Institute of Medicine (IOM) Roundtable on Value & Science-Driven Health Care convened a workshop on July 30–31, 2008, titled Learning What Works: Infrastructure Required for Comparative Effectiveness Research. Box S-1 describes the issues that motivated the meeting's discussions: the substantial and growing interest in activities and approaches related to CER; the lack of coordination of key activities, such as the selection and design of studies, synthesis of existing evidence, methods innovation, and translation and dissemination of CER information; shortfalls and widening gaps in the workforce needed in all areas of CER; the opportunities presented by the recent calls for expanded resources for work on the comparative effectiveness of clinical interventions; the growing appreciation of the infrastructure needed to support this work; and the need for a trusted, common venue to identify and characterize the need categories, begin to estimate the shortfalls, consider approaches to addressing the shortfalls, and identify priority next steps.

BOX S-1
Issues Motivating the Discussion

1. Substantial demand for greater insights into the comparative clinical effectiveness of clinical interventions and care processes to improve the effectiveness and value of health care.
2. Expanded interest and activity in the work needed—e.g., comparative effectiveness research, systematic reviews, innovative research strategies, clinical registries, coverage with evidence development.
3. Currently fragmented and largely uncoordinated selection of studies, study design and conduct, evidence synthesis, methods validation and improvement, and development and dissemination of guidelines.
4. Expanding gap in workforce with skills to develop data sources and systems, design and conduct innovative studies, translate results, and guide application.
5. Opportunities presented by the attention of recent initiatives and the increasing possibility of developing an entity and resources for expanded work on the comparative effectiveness of clinical interventions.
6. Growing appreciation of the importance of assessing the infrastructure needed for this work—e.g., workforce needs, data linkage and improvement, new methodologies, research networks, technical assistance.
7. Desirability of a trusted, common venue to identify and characterize the need categories, begin to estimate the shortfalls, consider approaches to addressing the shortfalls, and identify priority next steps.
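The spending contrast cited at the opening of this summary, nearly $2.5 trillion on care versus perhaps one tenth of 1 percent on learning what works, can be made concrete with a quick back-of-envelope calculation. The figures below are illustrative assumptions: the 0.1 percent share is a literal reading of the text, not a precise estimate.

```python
# Back-of-envelope scale check of the figures cited above.
# Assumed, illustrative values; "one tenth of 1 percent" taken literally.
total_health_spending = 2.5e12  # ~$2.5 trillion: U.S. health and medical care spending, 2009
effectiveness_share = 0.001     # one tenth of 1 percent

effectiveness_spending = total_health_spending * effectiveness_share
print(f"~${effectiveness_spending / 1e9:.1f} billion devoted to learning what works")
# prints: ~$2.5 billion devoted to learning what works
```

On these assumptions, roughly $2.5 billion of a $2.5 trillion system went toward evidence on comparative effectiveness, which is the gap the workshop set out to address.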

The goal of the workshop was to clarify the elements and nature of the needed capacity, solicit quantitative and qualitative assessments of the needs, and characterize them in a fashion that will facilitate engagement of the issues by policy makers. Two assumptions guided the discussions but were not explored as part of the workshop: resources will be available to expand work on the comparative effectiveness of medical interventions, and, given recent public discourse on the need for a stronger focus on the work, a designated entity would be developed with a formal charge to coordinate the expanded work. The workshop gathered leading practitioners in health policy, technology assessment, health services research, health economics, information technology (IT), and health professions education and training to explore, through invited presentations, the current and future capacity needed to generate new knowledge and evidence about what works best, including skills and workforce, data linkage and improvement, study coordination and result dissemination, and research methods innovation. Participants explored, in both qualitative and quantitative terms, the nature of the work required, the IT and integrative vehicles required, the skills and training programs required, the priorities to be considered, the role of public–private partnerships, and the strategies for immediate attention while considering the long-term needs and opportunities. Through the course of the workshop, a number of common themes and implications emerged. These are indicated below, along with a number of possible follow-up actions identified for Roundtable consideration.
Since the meeting, three events have occurred with significant implications for the infrastructure necessary for comparative effectiveness research: (1) the American Recovery and Reinvestment Act of 2009 (ARRA) included $1.1 billion for the conduct of CER; (2) formal assessments by the IOM and the federal government have recommended priorities for such research; and (3) the Patient Protection and Affordable Care Act of 2010 (ACA) established an independent Patient-Centered Outcomes Research Institute (PCORI). See Appendixes C, D, and E for additional background. Accordingly, some of the information has been updated as appropriate to bring the text current with 2011 circumstances.

Comparative Effectiveness Research and the Roundtable on Value & Science-Driven Health Care

The IOM's Roundtable on Value & Science-Driven Health Care provides a trusted venue for key stakeholders to work cooperatively on innovative approaches to the generation and application of evidence that will drive improvements in the effectiveness and efficiency of medical care in the United States. Participants seek the development of a learning health system

that enhances the availability and use of the best evidence for the collaborative healthcare choices of each consumer and healthcare professional, that drives the process of discovery as a natural outgrowth of patient care, and that ensures innovation, quality, safety, and value in health care. As leaders in their fields, Roundtable members work with their colleagues to identify issues not being adequately addressed, determine the nature of the barriers and possible solutions, and set priorities for action. They marshal the energy and resources of the sectors represented on the Roundtable to work for sustained public–private cooperation for change. This work is focused on the three major dimensions of the challenge:

1. accelerating progress toward the long-term vision of a learning health system, in which evidence is both applied and developed as a natural product of the care process;
2. expanding the capacity to meet the acute, near-term need for evidence of comparative effectiveness to support medical care that is maximally effective and produces the greatest value; and
3. improving public understanding of the nature of evidence, the dynamic character of evidence development, and the importance of insisting on medical care that reflects the best evidence.

Roundtable members have set a goal that by the year 2020, 90 percent of clinical decisions will be supported by accurate, timely, and up-to-date clinical information and will reflect the best available evidence. To achieve this goal, Roundtable members and their colleagues work to identify priorities for action on those key issues in health care where progress requires cooperative stakeholder engagement.
Central to these efforts is the Learning Health System series of workshops and publications that collectively characterize the key elements of a healthcare system that is designed to generate and apply the best evidence about the healthcare choices of patients and providers as well as identify barriers to the development of such a system and opportunities for progress. Each meeting is summarized in a publication available through the National Academies Press. Workshops in this series include the following:

• The Learning Healthcare System (July 20–21, 2006)
• Judging the Evidence: Standards for Determining Clinical Effectiveness (February 5, 2007)
• Leadership Commitments to Improve Value in Healthcare: Toward Common Ground (July 23–24, 2007)
• Redesigning the Clinical Effectiveness Research Paradigm: Innovation and Practice-Based Approaches (December 12–13, 2007)

• Clinical Data as the Basic Staple of Health Learning: Creating and Protecting a Public Good (February 28–29, 2008)
• Engineering a Learning Healthcare System: A Look to the Future (April 28–29, 2008)
• Learning What Works: Infrastructure Required for Learning Which Care is Best (July 30–31, 2008)
• Value in Health Care: Accounting for Cost, Quality, Safety, Outcomes, and Innovation (November 17–18, 2008)
• The Healthcare Imperative: Lowering Costs and Improving Outcomes (May, July, September, December, 2009)
• Digital Infrastructure for the Learning Health System: The Foundation for Continuous Improvement in Health and Health Care (July, September, October, 2010)

This publication summarizes the proceedings of the seventh workshop in the Learning Health System series, which focused on the infrastructure needs—e.g., methods, coordination capacities, data resources and linkages, workforce—for developing an expanded and efficient national capacity for CER. A synopsis of the key points from each of the sessions is included in this chapter, with more detailed information on session presentations and discussions found in the chapters that follow. Sections of the workshop summary not specifically attributed to an individual are based on the presentations, background papers, and discussions associated with the workshop, and reflect the views of this publication's rapporteurs, not those of the IOM Roundtable on Value & Science-Driven Health Care.

Day 1 featured two keynote speakers who provided a vision for developing an infrastructure that can contribute to an evidence base of what works best for whom, as well as a sense of some of the potential returns from health care driven by evidence (Chapter 1), and presentations by speakers asked to characterize the nature of the work (Chapter 2), the information networks (Chapter 3), and the talent (Chapter 4) needed to carry out that vision.
Day 2 featured discussions focused on identifying priority items for implementation to meet current shortfalls and opportunities to build upon existing public–private partnership efforts (Chapter 5). Chapter 6 provides a summary of the final session's discussion to outline key elements of a roadmap for progress, suggest some "quick hits" for immediate implementation, and identify opportunities to build needed support; this chapter also highlights common themes from the meeting's discussions and suggestions on opportunities for follow-up actions by the Roundtable. An overview of the topics discussed in specific manuscripts is provided in Table S-1. A white paper, authored by staff in 2007 and titled Learning What Works Best: The Nation's Need for Evidence on Comparative Effectiveness

TABLE S-1 Overview of the Specific Aspects of Comparative Effectiveness Research (CER) Infrastructure Addressed in This Publication's Manuscripts

(The table's columns track the infrastructure aspects each manuscript addresses: CER Research Settings; Methods Development and Use; Clinical Data; Health Information Technology; Evidence Review and Synthesis; Coordination and Dissemination; Workforce Education and Training; and International CER Efforts. The manuscripts and authors, by chapter, are as follows.)

Chapter 1
• The Nation's Need for Evidence on Comparative Effectiveness in Health Care: Learning What Works Best (J. Michael McGinnis et al.)
• A Vision for the Capacity to Learn What Care Works Best (Mark B. McClellan)
• The Potential Returns from Evidence-Driven Health Care (Gail R. Wilensky)

Chapter 2
• The Cost and Volume of Comparative Effectiveness Research (Erin Holve and Patricia Pitman)
• Intervention Studies That Need to Be Conducted (Douglas B. Kamerow)
• Clinical Data Sets That Need to Be Mined (Jesse A. Berlin and Paul E. Stang)
• Knowledge Synthesis and Translation That Need to Be Applied (Richard A. Justman)
• Methods That Need to Be Developed (Eugene H. Blackstone et al.)
• Coordination and Technical Assistance That Need to Be Supported (Jean R. Slutsky)

Chapter 3
• Electronic Health Records: Needs, Status, and Costs for U.S. Healthcare Delivery Organizations (Robert H. Miller)
• Data and Information Hub Requirements (Carol C. Diamond)
• Integrative Vehicles Required for Evidence Review and Dissemination (Lorne A. Becker)

Chapter 4
• Comparative Effectiveness Workforce—Framework and Assessment (William R. Hersh et al.)
• Toward an Integrated Enterprise—The Ontario, Canada, Case (Sean R. Tunis et al.)

Chapter 5
• Information Technology Platform Requirements (Mark E. Frisse)
• Data Resource Development and Analysis Improvement (T. Bruce Ferguson, Jr., and Ansar Hassan)
• Practical Challenges and Infrastructure Priorities for Comparative Effectiveness Research (Daniel E. Ford)
• Transforming Health Professions Education (Benjamin K. Chu)
• Building the Training Capacity for a Health Research Workforce of the Future (Steven A. Wartman and Claire Pomeroy)
• Public–Private Partnerships (Carmella A. Bocchino et al.)

Chapter 6
• The Roadmap—Policies, Priorities, Strategies, and Sequencing (Stuart Guterman et al.)

in Health Care, provided important context for the workshop discussions. The executive summary of that white paper and the full manuscript are included in Chapter 1 and Appendix A, respectively. Appendix B includes evidence summaries of research questions identified and other materials relevant to discussion in a paper in Chapter 2. Appendixes C and D present the recommendations of two groups for priority studies in CER: Initial National Priorities for Comparative Effectiveness Research, an Institute of Medicine report; and the Federal Coordinating Council for Comparative Effectiveness Research Report to the President and Congress. Appendix E contains the portions of the ACA relevant to the structure, funding, and charge of PCORI. The workshop agenda, biographical sketches of the workshop participants, and a list of workshop attendees can be found in Appendixes F, G, and H, respectively.

COMMON THEMES

Common themes that emerged from the 2 days of discussion are summarized in Box S-2 and elaborated in the text that follows:

• Care that is effective and efficient stems from the integrity of the infrastructure for learning. The number of medical diagnostics and treatments available to patients and caregivers is increasing, but the knowledge about their effectiveness—in particular, their comparative effectiveness—is not keeping pace. This is in part a function of the rate of change, but it is also a product of capacity that is both underdeveloped and, as several participants noted, substantially fragmented, which leads to gaps, inefficiencies, and inconsistencies in the work. The accelerating rate of change in the interventions requiring effectiveness assessment compels a substantial shoring up in the level of effort, the nature of the effort, and the coordination of the effort in order to produce the necessary insights into the right care for different people under different circumstances.
• Coordinating work and ensuring standards are key components of the evidence infrastructure. Several presentations highlighted the point that substantial activity is currently under way in effectiveness research, including work on comparative effectiveness, but the activities are fragmented and often redundant in both structure and function. The fact that the application of evidence lags behind its production is in part a function of the disparate and "siloed" approaches between and within organizations seeking and developing information. The notions of infrastructure for evidence development therefore also include the capacity for greater coordination in the setting of study priorities; the development of systematic

BOX S-2
Infrastructure Required for Comparative Effectiveness Research: Common Themes

• Care that is effective and efficient stems from the integrity of the infrastructure for learning.
• Coordinating work and ensuring standards are key components of the evidence infrastructure.
• Learning about effectiveness must continue beyond the transition from testing to practice.
• Timely and dynamic evidence of clinical effectiveness requires bridging research and practice.
• Current infrastructure planning must build to future needs and opportunities.
• Keeping pace with technological innovation compels more than a head-to-head and time-to-time focus.
• Real-time learning depends on health information technology investment.
• Developing and applying tools that foster real-time data analysis is an important element.
• A trained workforce is a vital link in the chain of evidence stewardship.
• Approaches are needed that draw effectively on both public and private capacities.
• Efficiency and effectiveness compel globalizing evidence and localizing decisions.

decisions for the conduct of CER, systematic reviews, and guideline development; and the need to ensure the consistent translation of developed information. The identification of priority conditions, evaluation, and evidence gaps is needed in order to target limited resources, especially for high-cost or high-volume procedures and interventions.

• Learning about effectiveness must continue beyond the transition from testing to practice. "The learning process cannot stop when the label is approved," one meeting participant pointed out.
Premarket testing for the safety and effectiveness of various interventions cannot assess the results for all populations or the circumstances of use and differences in practice patterns, so gathering information as interventions are applied in practice settings should represent a key focus in designing the infrastructure to learn which care is best. Local coverage decisions and private insurer use

of coverage with evidence development approaches were cited as opportunities to learn as a part of the care process.

• Timely and dynamic evidence of clinical effectiveness requires bridging research and practice. Although historical insulation of clinical research from the regular delivery of healthcare services evolved to facilitate data capture and control for confounding factors, it may not adequately inform the real-world setting of clinical practice. With the prospect of enhanced electronic data capture at the point of care on real-world patient populations, and statistical approaches to improve analysis, as well as increasing demand to keep pace with technologic innovation, this divide increasingly limits the utility of research results. Efforts under way to better engage health delivery organizations, practitioners, patients, and the community in research prioritization, conduct, and results dissemination should be supported and expanded.

• Current infrastructure planning must build to future needs and opportunities. Research is often driven more by the methods than the questions. In fact, both are important, and infrastructure planning must account for both the key emerging healthcare questions and the key emerging CER opportunities. Emerging questions include those related to the management of multiple co-occurring chronic diseases of increasing prevalence in an aging population, the improved insights into individual variation relevant to both treatments and diagnostics, and the impact of innovation in shortening the lifecycle of any particular intervention. Emerging tools include innovations in trial design, the development of new statistical approaches to data analysis, and the development of electronic medical and personal health records.

• Keeping pace with technological innovation compels more than a head-to-head and time-to-time focus.
Much of the current discussion about CER has emphasized the need for more clinical trials and more head-to-head studies. Although there are numerous examples of diagnostic and treatment interventions for which such studies are needed, the notion of a research process that essentially offers periodic and static determinations is inherently limited. Especially with the rapid pace of change in the nature of interventions and the difficulty, expense, and time required to develop studies—and the challenges of ensuring the generalizability of results in the face of limitations of the traditional approach to randomized controlled trials (RCTs)—a first-order priority for effectiveness research is the establishment of infrastructure for a more dynamic, real-time approach to learning. Leveraging new tools, such as health information technology, should allow for

emphasizes bundled payments for episodes of care and evidence-informed case rates, or capitation. To correct gaps in care and to ensure safe and effective interventions, health professionals will increasingly have to work together in teams and share accountability for their patients' clinical outcomes. Acute episodes of illness will require coordination of handoffs, patient safety protocols, and checklists and other interventions designed to minimize harm and to maximize benefit to patients. Chronic disease management and adherence to preventive measures that are known to be effective will become system-wide accountability requirements. The complexity of care and the huge burden placed on shorter physician–patient interactions with a multitude of different clinicians will require that other health professionals as well as ancillary staff be used to bridge the gaps. Every touch point, enhanced with Web-based and other communication-based tools, will be an opportunity to maximize care. A new professionalism will build on the principles of lifelong learning, duty to patients, and devotion to finding best outcomes as well as emphasize teamwork and evidence-based care. Computerized simulation training will become a staple for health profession education. Team skills—the ability to lead, develop, and encourage the active contribution of other professionals in the clinical setting—will become an essential core of professionalism. Demonstrated competency both in clinical arenas and in the ability to work effectively with others will be required.

Building the Training Capacity for a Healthcare Workforce of the Future

Research holds the promise of finding and testing the answers to the challenges that face U.S. health care, but traditional approaches are inadequate. Steven A.
Wartman, president of the Association of Academic Health Centers, called for the development of a new kind of research infrastructure focused on health and health care that can guide and inform decision making. Such an approach would support research to discover, disseminate, and optimize the adoption of practices that advance the health of individuals and the public as a whole. In his discussion, Wartman suggested that the key to the changes needed is expanding the continuum of medical research to ensure that discoveries ultimately serve the public. This expansion would include all aspects of health, including biomedical, public health, and multidisciplinary research on the social and environmental determinants of health. Table S-7 outlines an approach to achieving this new research vision, and among the most pressing needs is the development of a new cadre of researchers, clinicians, and health leaders—a workforce that includes, among others, health professionals, engineers, sociologists, urban planners, policy experts, and economists. The cross-cutting nature of academic health centers (AHCs) suggests an unprecedented opportunity to

TABLE S-7 An Approach to Achieving a New Vision for Health Research

New People and Skills
• Multidisciplinary teams
• Strategic faculty recruitment
• Expansion and training of research support staff
• New partners (e.g., industry, nongovernmental organizations, faith-based organizations, payers, government, diverse communities, patients, general public)
• New venues (e.g., community-based research)
• Training to provide new skills, including interprofessional training
• Incentives within academia to support all types of health researchers (e.g., academic home, revised promotion and tenure criteria)

New Infrastructure
• Information technology investments (e.g., electronic health records, personal health records, regional health information organizations)
• Biostatistics and data management support
• Biorepositories
• Streamlined clinical research approval processes
• Efficient intellectual property policies
• Links between academia, industry, and venture capitalists

New Investments and Incentives
• Expanded funding for clinical, translational, and social health research by the National Institutes of Health, National Science Foundation, foundations, others
• Identification of new funding sources, especially for T2 and T3, behavioral, public health, and social health research
• Increased organizational investment in translational research cores (e.g., informatics, clinical research nurses)
• National coordination of research resources (e.g., informatics linkages, data sharing)

SOURCE: Wartman and Pomeroy, 2009.

build AHCs to foster interprofessional collaborative activity and to develop needed health research teams. These teams may reside in new departments, institutes, and centers as typical academic silos give way to more horizontal integration. Organizational and management trends taking place in the nation's AHCs are remolding the ivory tower into a complex business enterprise.
This transition is characterized by reorganization along nondisciplinary lines and a management structure that, conceptually and operationally, better aligns the entire institution. To build the needed training capacity, AHCs will need to ensure commitment of their own leaders to expand “health research,” invest in new infrastructure (e.g., IT, data repositories,

biorepositories), and support curricular and training innovations to develop multidisciplinary, multisector research teams. In addition, AHC leadership can drive this new vision of health research by calling for adequate and innovative funding mechanisms, providing needed culture and infrastructure, and facilitating the partnerships with government, industry, and community groups that are needed for health research. It will be necessary to provide clear-cut career paths for health researchers along with adequate and appropriate institutional resources. Key opportunities include the provision of academic homes for translational researchers, the development of appropriate recruitment packages, and criteria for promotion and tenure. Many healthcare sectors—industry, community, and other nonacademic organizations—have important roles in facilitating fundamental change in medical research. The involvement of community constituencies affected by research will increasingly be an essential component of health research—through contributing input into research priorities, helping build trust in community participation in research, or disseminating findings. Particularly critical are national policy makers, who will drive this transformation by endorsing the importance of health research in leveraging biomedical discoveries for health improvements; by providing adequate funding for the full range of health research needed, including workforce development; and by helping to address current barriers to research (e.g., Health Insurance Portability and Accountability Act procedures).

Public–Private Partnerships

A fundamental challenge in advancing CER is developing an infrastructure that is sufficiently robust to support and nurture productive relationships among stakeholders with different perspectives and organizational missions.
Without a mechanism for bringing these parties to the same table, fundamental differences in institutional cultures can impede or even preclude stakeholder-to-stakeholder communication. Public–private partnerships can bridge these gaps and remove barriers to cooperation. This mechanism not only creates space for collaboration—in which barriers to cooperation can be discussed and addressed—but also offers a structure and operational guidelines, typically tailored to a specific partnership by the participants, that help facilitate cooperative work. A value for participating entities is that they can learn more and distribute new knowledge more quickly in a collaborative environment. Public–private partnerships can help link some of health care's disparate component elements and draw productively on the respective assets of participating stakeholders. Public–private partnerships are viewed by some as fundamental building blocks in the development of the CER infrastructure. A panel discussion featuring perspectives of health plans, the federal government, and industry

representatives considered current and planned public–private partnership efforts as well as how these efforts can be used in a more expansive fashion to develop infrastructure for CER.

Carmella A. Bocchino, vice president for clinical affairs and strategic planning at America's Health Insurance Plans, discussed several successful public–private partnerships in which health plans and federal agencies have partnered to create databases that are useful in identifying potential safety issues and opportunities to improve care and care delivery. An extension of these activities could contribute to the development of a national data system to serve as a central part of the nation's health research infrastructure. The United States Renal Data System, a large national data registry for end-stage renal disease patients, offers a potential model for a more comprehensive national data registry. Research and surveillance networks, such as the HMO Research Network, the HMO Cancer Research Network, and the Vaccine Safety Datalink, demonstrate the potential of distributed data networks to help address national research and public health questions. Similar models, such as the National Data Aggregation Initiative (NDAI), are being explored for quality measurement and reporting. NDAI seeks to combine Medicare and private-sector data to generate physician performance measures. While these initiatives demonstrate the inherent value of developing the infrastructure and tools to aggregate and analyze these data across populations, challenges remain. Agreement is needed on a shared methodology that can facilitate comparative analyses across the broad spectrum of current clinical research. Data systems design should facilitate data mining as well as the identification and tracking of safety and effectiveness issues in real time.
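The distributed data network model described above can be illustrated with a minimal sketch. The schema, names, and query are hypothetical and do not represent the actual interfaces of any of the networks mentioned; the point is the division of labor: each site runs the same query against its own records and returns only aggregate counts, so patient-level data never leave the institution.

```python
# Illustrative sketch of a distributed data network query (assumed schema and
# names, for explanation only; not the API of any actual network).
from dataclasses import dataclass

@dataclass
class SiteResult:
    site: str
    exposed_events: int  # adverse events among exposed patients at the site
    exposed_total: int   # all exposed patients at the site

def run_local_query(site_name, records, drug):
    """Each partner computes aggregates over its own records; raw rows stay local."""
    exposed = [r for r in records if r["drug"] == drug]
    events = sum(1 for r in exposed if r["adverse_event"])
    return SiteResult(site_name, events, len(exposed))

def pool(results):
    """The coordinating center sees only the per-site aggregates."""
    events = sum(r.exposed_events for r in results)
    total = sum(r.exposed_total for r in results)
    return events / total if total else 0.0

site_a = run_local_query("Plan A", [
    {"drug": "X", "adverse_event": True},
    {"drug": "X", "adverse_event": False},
    {"drug": "Y", "adverse_event": True},
], drug="X")
site_b = run_local_query("Plan B", [
    {"drug": "X", "adverse_event": False},
    {"drug": "X", "adverse_event": False},
], drug="X")

print(pool([site_a, site_b]))  # pooled adverse-event rate among exposed: 0.25
```

The standardization challenges noted next in the text are visible even in this toy version: pooling only works because every site agrees on the same record schema and the same definition of exposure and event.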
Progress will require the standardization and compilation of data from disparate sources, as well as thoughtful and appropriate design of emerging data sources, such as EHRs, so that they produce data that can help answer questions important to understanding clinical effectiveness. Establishing governance structures will also be a key challenge, as will developing approaches for sustainable funding of these types of research and contending with issues related to data ownership.

Rachel Behrman, associate commissioner for clinical programs and director of the office of critical path programs at the FDA, summarized two public–private partnerships housed in the FDA: the Critical Path Initiative and the Sentinel Initiative. The Critical Path Initiative seeks to modernize the way in which FDA-regulated products, including drugs, biological products, and medical devices, are developed, evaluated, and manufactured. The Sentinel Initiative is intended to establish a national integrated electronic structure and approach for monitoring medical product safety. These initiatives have focused on several key issues that require collaborative engagement, including research methods and data analysis tools that ensure the production of timely, reliable, and secure information, as well as
governance structures and policies that meet stakeholder needs while also putting appropriate safeguards into place. In particular, questions related to data access, use, and stewardship need to be resolved. With respect to a CER infrastructure, attention should initially focus on developing mechanisms for priority setting, sustainable financing, and collaboration governance, as well as on data transparency, so that the conduct and reporting of analyses result in high-quality information. Contending with issues related to proprietary data and patentable tools and processes will be essential to progress.

William Z. Potter discussed two public–private partnerships, the Biomarkers Consortium and the Alzheimer's Disease Neuroimaging Initiative (ADNI), that have productively linked pharmaceutical companies, government agencies, and other stakeholders. The Biomarkers Consortium, which aims to speed up the development of biological markers in support of drug development, preventive medicine, and medical diagnostics, demonstrates the need for careful delineation of specific areas of research focus that protect the individual interests of consortium members. Areas of collaboration were carefully selected, and research was conducted in precompetitive spaces to ensure that the work would achieve the common goals of advancing human health and improving patient care; speeding the development of medicines and therapies for the detection, prevention, diagnosis, and treatment of disease; and making project results broadly available to the entire research community. The Biomarkers Consortium had to address issues related to data quality, study design variation, and data sharing; a project on placebo response was described to illustrate how such work can inform discussions on needed improvements. The ADNI demonstrates that infrastructure can be developed to foster cross-sector communication and work.
Underlying this project's initial, promising results is ADNI's ability to adequately address data transparency issues. Key barriers identified as relevant to the CER infrastructure included the need for internal industry champions to drive collaborative work; the need to meet the costs of full-time equivalents and data management; skepticism among industry, NIH, and academic leadership about the value of such partnerships; and variable legal opinions on intellectual property and medicolegal risks.

Moving Forward

Although expanding CER capacity offers many potential gains for health care, the scale of the needed transformation is large and spans all healthcare sectors. A long-term strategy must appropriately incorporate existing infrastructure, prioritize and sequence needs, engage all stakeholders, and build sustained, cross-sector support. Discussed in the final workshop session were key considerations for such a strategy: roadmap
elements, quick hits, and opportunities to build support. The final chapter includes a synthesis of this session's discussion, a review of common themes heard at the workshop, and a number of possible follow-up actions to be considered for ongoing multistakeholder involvement through the IOM Roundtable on Value & Science-Driven Health Care.

The Roadmap—Policies, Priorities, Strategies, and Sequencing

Stuart Guterman, senior program director for the Commonwealth Fund's Program on Medicare's Future, outlined six broad areas discussed during the workshop that should be considered in the development of policies and strategies: data, methods, workforce, organization, translation, and financing. Clear end goals for each area, priority needs within and between categories, and key actors or existing infrastructure that could help initiate the needed activities were discussed. Suggested goals for these areas included development of the capacity to produce relevant data, ensuring the maximal value of data through integration and system linkages, and making data and information available to appropriate users when and where needed; development of research approaches that meet the needs of CER end users; education of a cadre of professionals, from across healthcare sectors, trained to use tools and techniques for developing and applying comparative effectiveness information; prioritization and coordination across the many organizations engaged in various aspects of evidence development (primary research, synthesis, translation) to enable more efficient information production; movement from evidence to evidence-based decision making; and sufficient and sustained funding to establish and support CER and its application as an integral part of the U.S. healthcare system.

Quick Hits—Things That Can Be Done Now

Actions that can be undertaken immediately will be essential to accelerating progress by demonstrating in the near term the benefit of expanded CER. W.
David Helms, president and CEO of AcademyHealth, noted several opportunities for collaborative efforts by stakeholders to lay the groundwork for a national capacity for CER: advocating for congressional action to establish a platform for CER, increasing federal funding for CER, articulating the case for CER, examining models for an expanded national capacity, and educating state policy representatives and Medicaid officials about the potential of and needs for CER. He noted that work can also begin immediately to build up the needed workforce. The many other recommendations for immediate action offered by session respondents and throughout the workshop were also summarized. Subsequent to this meeting, Congress increased the national capacity for CER with the establishment, in the ACA of 2010, of the Patient-Centered Outcomes Research Institute, previously described.

Building Support

While building upon many existing activities and infrastructure, an enhanced focus on CER represents a shift in the nation's approach to clinical research and practice. Although viewed as an important element of health reform by most healthcare stakeholders, additional work is needed to build support among the public and policy makers for the needed investments and potential returns from CER. An open discussion session on this topic was led by Mary Woolley, from Research!America, who noted four fundamental requirements for building support: (1) having clarity on the ultimate goal, (2) understanding the target audience, (3) ensuring all stakeholders are involved, and (4) understanding the context. This framework suggests several key opportunities to build support for the expanded development and use of CER, including finding ways to frame the many infrastructure needs in simple terms that make sense to all stakeholders, including the public and policy makers; tailoring communications to the interests and concerns of different stakeholders; and engaging in clear communication and crisp, well-tested messaging. Finally, she noted that communication should not be unidirectional but structured to fully engage all stakeholders involved in infrastructure building. Suggestions offered by workshop participants for possible goals, for opportunities to better engage consumers and patients, and for research that might better inform communications were summarized.
Issues for Possible Roundtable Follow-Up

Throughout the course of discussions, a number of items were identified as candidates for follow-up attention by the Roundtable on Value & Science-Driven Health Care:

• Better characterization of the elements of the infrastructure: Build on the work sponsored by the Roundtable on workforce needs and IT infrastructure, continue to improve the initial estimates and pursue similar assessments related to requirements for new analytic tools and methods, establish processes for efficient and effective operation of the fields of work, and shape the strategy for attention and phasing. Include examples of effective work at the institutional level.
• Clarification of the nature of the "prework" needed for a more systematic approach to the necessary RCTs: Even though a more
practical portfolio of research approaches is essential, the RCT offers the key standard for the rigor required in certain circumstances. The most effective deployment of RCTs requires attention to the criteria indicating the need for an RCT, the issues and priorities to be assessed, the best structure of the research questions, and improved approaches to trial design, conduct, and data collection.
• More focus on the infrastructure needed for guideline development, implementation, and evaluation: Several issues could be productively engaged, including transparency and collaboration across professional groups on improving consistency in the methods, standards, rules, and participants in guideline development and approaches to implementation.
• Share meeting discussions with organizational stakeholders in elements of the infrastructure: Examples given included the National Quality Forum; the Association of American Medical Colleges; the Association of Academic Health Centers; the Quality Improvement Program and CMS/Department of Health and Human Services in the context of development of the 10th quality improvement organization statement of work; the American Hospital Association Quality Forum; the International Society for Pharmacoeconomics and Outcomes Research; and provider groups.
• Devote additional attention to data stewardship issues: Because the basic resource for effectiveness research is the clinical data system, the Roundtable needs to catalyze more discussion on the integrity of this resource, including issues of maintenance, privacy, and data ownership.
• Identify possible incentives: Look at how subsidies and reimbursement regulations can stimulate increased use of HIT in medical care, increased use of HIT for the application of evidence, and increased use of HIT for the development of evidence.
• Expand engagement of the business case and demand function for infrastructure investment: Give additional attention to the economic or business case for employers to appreciate the investment and its necessity to improving value from health care, the case for more attention by states, the case for deployment of the personal health record to drive more patient–provider interaction, and work on the consequences of not investing.
• More focus on the issues of strategies and infrastructure for implementing findings on effectiveness: Since evidence is virtually useless if not applied, the Roundtable could give more attention to understanding the infrastructure needs for effective guideline implementation.
• Sponsor discussions on training and health professions education reorientation: With greater appreciation of the team-based, networked information stewardship roles of caregivers, the health professions groups should be recruited for collaborative consideration of the training implications.
• Provide information on the Roundtable's Web site: The resources of the workshop presentations and discussions (slides, links, and speaker contact information) should be posted on the Web site.

REFERENCES

AcademyHealth. 2005. Placement, Coordination, and Funding of Health Services Research within the Federal Government. In AcademyHealth Report, September 2005. http://www.academyhealth.org/files/publications/placementreport.pdf (accessed September 3, 2010).
Buto, K., and P. Juhn. 2006. Can a center for comparative effectiveness information succeed? Perspectives from a health care company. Health Affairs 25(6):w586-w588.
Clancy, C. M. 2006. Getting to "smart" health care. Health Affairs 25(6):w589-w592.
Fisher, E. S., and J. E. Wennberg. 2003. Health care quality, geographic variations, and the challenge of supply-sensitive care. Perspectives in Biology and Medicine 46(1):69-79.
Fisher, E. S., J. E. Wennberg, T. A. Stukel, D. J. Gottlieb, F. L. Lucas, and E. L. Pinder. 2003a. The implications of regional variations in Medicare spending. Part 1: The content, quality, and accessibility of care. Annals of Internal Medicine 138(4):273.
Fisher, E. S., J. E. Wennberg, T. A. Stukel, D. J. Gottlieb, F. L. Lucas, and E. L. Pinder. 2003b. The implications of regional variations in Medicare spending. Part 2: Health outcomes and satisfaction with care. Annals of Internal Medicine 138(4):288.
Health Industry Forum. 2006. Comparative Effectiveness Forum: Executive summary. http://healthforum.brandeis.edu/meetings/materials/2006-30-Nov./ExecBrief.pdf (accessed July 20, 2010).
Hopayian, K. 2001. The need for caution in interpreting high quality systematic reviews.
British Medical Journal 323(7314):681-684.
IOM (Institute of Medicine). 2000. To err is human: Building a safer health system. Washington, DC: National Academy Press.
———. 2001. Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academy Press.
———. 2007. Learning what works best: The nation's need for evidence on comparative effectiveness in health care. Washington, DC: The National Academies Press.
———. 2008a. Knowing what works in health care: A roadmap for the nation. Washington, DC: The National Academies Press.
———. 2008b. Learning what works: Infrastructure required for comparative effectiveness research. Washington, DC: The National Academies Press.
Kamerow, D. 2009. Comparative effectiveness studies inventory project. Washington, DC: A commissioned activity for the IOM Roundtable on Value & Science-Driven Health Care.
Kupersmith, J., S. Sung, M. Genel, H. Slavkin, R. Califf, R. Bonow, L. Sherwood, N. Reame, V. Catanese, C. Baase, J. Feussner, A. Dobs, H. Tilson, and E. A. Reece. 2005. Creating a new structure for research on health care effectiveness. Journal of Investigative Medicine 53(2):67-72.
 SUMMARY McGlynn, E. A., S. M. Asch, J. Adams, J. Keesey, J. Hicks, A. DeCristofaro, and E. A. Kerr. 2003. The quality of health care delivered to adults in the United States. New England Journal of Medicine 348(26):2635-2645. Moher, D., J. Tetzlaff, A. C. Tricco, M. Sampson, and D. G. Altman. 2007. Epidemiology and reporting characteristics of systematic reviews. PLoS Medicine 4(3):e78. Rowe, J. W., D. A. Cortese, and J. M. McGinnis. 2006. The emerging context for advances in comparative effectiveness assessment. Health Affairs 25(6):w593-w595. Wartman, S., and C. Pomeroy. 2009. Building the training capacity: Implementation priorities In Learning what works: Infrastructure required for comparative effectiveness research. Washington, DC: The National Academies Press. Wennberg, J. E., E. S. Fishers, and J. S. Skinner. 2002. Geography and the debate over Medi- Medi- care reform. Health Affairs Web Exclusive:w96-w114. Wilensky, G. 2005. Developing a center for comparative effectiveness information. Health Affairs Web Exclusive:w572-w585.