Evaluating Research Efficiency in the U.S. Environmental Protection Agency

Appendix G

OMB's Research and Development Program Investment Criteria [1][2]

[1] OMB 2007.
[2] To assist agencies with significant research programs, additional instructions were added to the PART Guidance and titled the "Research and Development Program Investment Criteria." The R&D Investment Criteria are found in Appendix C of the PART instructions. Unlike the main body of the PART instructions, which applies to all federal agencies and programs, the R&D Investment Criteria attempt to clarify OMB's expectations specifically for R&D programs.

As another initiative of the President's Management Agenda, the development of explicit R&D investment criteria builds on the best of the planning and assessment practices that R&D program managers use to plan and assess their programs. The Administration has worked with experts and stakeholders to build upon lessons learned from previous approaches. Agencies should use the criteria as broad guidelines that apply at all levels of Federally funded R&D efforts, and they should use the PART as the instrument to periodically evaluate compliance with the criteria at the program level. To make this possible, the R&D PART aligns with the R&D criteria. The R&D criteria are reprinted here as a guiding framework for addressing the R&D PART.

The R&D criteria address not only planning, management, and prospective assessment but also retrospective assessment. Retrospective review of whether investments were well-directed, efficient, and productive is essential for validating program design and instilling confidence that future investments will be wise. Retrospective reviews should address continuing program relevance, quality, and successful performance to date.

While the criteria are intended to apply to all types of R&D, the Administration is aware that predicting and assessing the outcomes of basic research in particular is never easy. Serendipitous results are often the most interesting and ultimately may have the most value. Taking risks and working toward difficult-to-attain goals are important aspects of good research management, and innovation and breakthroughs are among the results. However, there is no inherent conflict between these facts and a call for clearer information about program goals and performance toward achieving those goals. The Administration expects agencies to focus on improving the management of their research programs and adopting effective practices, not on predicting the unpredictable.

The R&D investment criteria have several potential benefits:

- Use of the criteria allows policy makers to make decisions about programs based on information beyond anecdotes, prior-year funding levels, and lobbying by special interests.
- A dedicated effort to improve the process for budgeting, selecting, and managing R&D programs is helping to increase the return on taxpayer investment and the productivity of the Federal R&D portfolio.
- The R&D investment criteria will help communicate the Administration's expectations for proper program management.
- The criteria and subsequent implementation guidance will also set standards for the information to be provided in program plans and budget justifications.
- The processes and collected information promoted under the criteria will improve public understanding of the possible benefits and effectiveness of the Federal investment in R&D.

DETAILS ON THE CRITERIA

The Relevance, Quality, and Performance criteria apply to all R&D programs. Industry- or market-relevant applied R&D must meet additional criteria. Together, these criteria can be used to assess the need, relevance, appropriateness, quality, and performance of Federal R&D programs.

Relevance

R&D investments must have clear plans, must be relevant to national priorities, agency missions, relevant fields, and "customer" needs, and must justify their claim on taxpayer resources.
Programs that directly support Presidential priorities may receive special consideration with adequate documentation of their relevance. Review committees should assess program objectives and goals on their relevance to national needs, “customer” needs, agency missions, and the field(s) of study the program strives to address. For example, the Joint DOE/NSF Nuclear Sciences Advisory Committee’s Long Range Plan and the Astronomy Decadal Surveys are the products of good planning processes because they articulate goals and priorities for research opportunities within and across their respective fields.
OCR for page 115
OMB will work with some programs to identify quantitative metrics to estimate and compare potential benefits across programs with similar goals. Such comparisons may be within an agency or among agencies.

Programs Must Have Complete Plans, With Clear Goals and Priorities

Programs must provide complete plans, which include explicit statements of:

- specific issues motivating the program;
- broad goals and more specific tasks meant to address the issues;
- priorities among goals and activities within the program;
- human and capital resources anticipated; and
- intended program outcomes, against which success may later be assessed.

Programs Must Articulate the Potential Public Benefits of the Program

Programs must identify potential benefits, including added benefits beyond those of any similar efforts that have been or are being funded by the government or others. R&D benefits may include technologies and methods that could provide new options in the future if the landscape of today's needs and capabilities changes dramatically. Some programs and sub-program units may be required to quantitatively estimate expected benefits, including metrics that permit meaningful comparisons among programs that promise similar benefits. While all programs should try to articulate potential benefits, OMB and OSTP recognize the difficulty of predicting the outcomes of basic research. Consequently, agencies may be allowed to relax this requirement for basic research programs.

Programs Must Document Their Relevance to Specific Presidential Priorities to Receive Special Consideration

Many areas of research warrant some level of Federal funding. Nonetheless, the President has identified a few specific areas of research that are particularly important. To the extent that a proposed project can document how it directly addresses one of these areas, it may be given preferential treatment.
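The required plan elements above amount to a simple checklist that a program office could track as structured data. The sketch below is illustrative only: the record type and field names are hypothetical choices mirroring the criteria, not anything OMB defines.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the plan elements the criteria require.
# Field names are illustrative; OMB prescribes the content, not a schema.
@dataclass
class ProgramPlan:
    motivating_issues: list[str]    # specific issues motivating the program
    goals_and_tasks: list[str]      # broad goals and specific tasks addressing the issues
    priorities: list[str]           # priorities among goals and activities
    anticipated_resources: str      # human and capital resources anticipated
    intended_outcomes: list[str]    # outcomes against which success may later be assessed

    def is_complete(self) -> bool:
        """A plan is 'complete' only when every required element is populated."""
        return all([self.motivating_issues, self.goals_and_tasks,
                    self.priorities, self.anticipated_resources,
                    self.intended_outcomes])
```

A reviewer could then flag any submitted plan for which `is_complete()` is false before assessing its substance.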
Program Relevance to the Needs of the Nation, of Fields of Science and Technology (S&T), and of Program "Customers" Must Be Assessed Through Prospective External Review

Programs must be assessed on their relevance to agency missions, fields of science or technology, or other "customer" needs. A customer may be another program at the same or another agency, an interagency initiative or partnership, or a firm or other organization from another sector or country. As appropriate, programs must define a plan for regular reviews by primary customers of the program's relevance to their needs. These programs must provide a plan for addressing the conclusions of external reviews.

Program Relevance to the Needs of the Nation, of Fields of S&T, and of Program "Customers" Must Be Assessed Periodically Through Retrospective External Review

Programs must periodically assess the need for the program and its relevance to customers against the original justifications. Programs must provide a plan for addressing the conclusions of external reviews.

Quality

Programs should maximize the quality of the R&D they fund through the use of a clearly stated, defensible method for awarding a significant majority of their funding. A customary method for promoting R&D quality is the use of a competitive, merit-based process. NSF's process for the peer-reviewed, competitive award of its R&D grants is a good example. Justifications for processes other than competitive merit review may include "outside-the-box" thinking, a need for timeliness (e.g., R&D grants for rapid-response studies of Pfiesteria), unique skills or facilities, or a proven record of outstanding performance (e.g., performance-based renewals).

Programs must assess and report on the quality of current and past R&D. NSF's use of Committees of Visitors, which review NSF directorates, is an example of a good quality-assessment tool. OMB and OSTP encourage agencies to provide the means by which their programs may be benchmarked internationally or across agencies, which provides one indicator of program quality.
Programs Allocating Funds Through Means Other Than a Competitive, Merit-Based Process Must Justify Funding Methods and Document How Quality Is Maintained

Programs must clearly describe how much of the requested funding will be broadly competitive based on merit, providing compelling justifications for R&D funding allocated through other means. (See OMB Circular A-11 for definitions of competitive merit review and other means of allocating Federal research funding.) All programs allocating funds through means other than unlimited competition must document the processes they will use to distribute funds to each type of R&D performer (e.g., Federal laboratories, Federally funded R&D centers, universities, etc.). Programs are encouraged to use external assessment of the methods they use to allocate R&D funding and maintain program quality.

Program Quality Must Be Assessed Periodically Through Retrospective Expert Review

Programs must institute a plan for regular, external reviews of the quality of the program's research and research performers, including a plan to use the results from these reviews to guide future program decisions. Rolling reviews performed every 3-5 years by advisory committees can satisfy this requirement. Benchmarking of scientific leadership and other factors provides an effective means of assessing program quality relative to other programs, other agencies, and other countries.

Performance

R&D programs should maintain a set of high-priority, multi-year R&D objectives with annual performance outputs and milestones that show how one or more outcomes will be reached. Metrics should be defined not only to encourage individual program performance but also to promote, as appropriate, broader goals, such as innovation, cooperation, education, and dissemination of knowledge, applications, or tools.

OMB encourages agencies to make the processes they use to satisfy the Government Performance and Results Act (GPRA) consistent with the goals and metrics they use to satisfy these R&D criteria. Satisfying the R&D performance criteria for a given program should serve to set and evaluate R&D performance goals for the purposes of GPRA. OMB expects goals and performance measures that satisfy the R&D criteria to be reflected in agency performance plans.

Programs must demonstrate an ability to manage in a manner that produces identifiable results. At the same time, taking risks and working toward difficult-to-attain goals are important aspects of good research management, especially for basic research.
The intent of the investment criteria is not to drive basic research programs to pursue less risky research that has a greater chance of success. Instead, the Administration will focus on improving the management of basic research programs. OMB will work with some programs to identify quantitative metrics to compare performance across programs with similar goals. Such comparisons may be within an agency or among agencies.

Construction projects and facility operations will require additional performance metrics. Cost and schedule earned-value metrics for the construction of R&D facilities must be tracked and reported. Within DOE, the Office of Science's formalized independent reviews of technical cost, scope, and schedule baselines and of the project management of construction projects ("Lehman Reviews") are widely recognized as an effective practice for discovering and correcting problems involved with complex, one-of-a-kind construction projects.

REFERENCES

OMB (Office of Management and Budget). 2007. Research and development program investment criteria. Pp. 72-77 in Guide to the Program Assessment Rating Tool (PART). Program Assessment Rating Tool Guidance No. 2007-02. Office of Management and Budget, Washington, DC. January 29, 2007 [online]. Available: http://stinet.dtic.mil/cgi-bin/GetTRDoc?AD=ADA471562&Location=U2&doc=GetTRDoc.pdf [accessed Nov. 14, 2007].