Appendixes



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Thinking Strategically: The Appropriate Use of Metrics for the Climate Change Science Program
Appendix A
Measuring Government Performance

A number of federal laws and policies require government agencies to measure and report the performance of their programs. These include the Government Performance and Results Act (GPRA), the research and development (R&D) investment criteria, and the Program Assessment Rating Tool (PART). GPRA establishes a broad statutory framework for management and accountability, whereas the R&D investment criteria and PART focus on simpler measures of performance for budget decisions.

GOVERNMENT PERFORMANCE AND RESULTS ACT

The Government Performance and Results Act of 1993[1] was intended to increase the effectiveness, efficiency, and accountability of the federal government. It requires federal agencies to set strategic goals and to measure program performance against those goals. Reporting takes three forms:

a strategic plan, which states the agency mission, goals, and objectives and describes how the goals and objectives will be achieved over the next five or more years;

an annual performance plan, which establishes performance goals as well as performance indicators for measuring or assessing the outputs, service levels, and outcomes of each program activity; and

an annual performance report, which compares actual accomplishments with the performance goals.

GPRA does not apply to interagency programs such as the Climate Change Science Program (CCSP). However, agency contributions to such programs are subject to GPRA, although they may be formulated in agency terms rather than interagency terms. Agencies that cannot express performance goals in an objective, quantifiable, and measurable form can seek Office of Management and Budget (OMB) approval for alternative forms. Science agencies have generally adopted both quantitative (e.g., publication counts) and qualitative (e.g., progress in understanding) indicators.[2]

RESEARCH AND DEVELOPMENT INVESTMENT CRITERIA AND THE PROGRAM ASSESSMENT RATING TOOL

In 2002, two White House management initiatives, the R&D investment criteria and PART, were introduced in part to inform budget decisions. The R&D investment criteria were intended to improve the process for budgeting, selecting, and managing research and development programs.[3] Managers must demonstrate the extent to which their programs meet the tests of relevance, quality, and performance (see Box A.1). The criteria also address retrospective review of whether investments were well directed, efficient, and productive.

PART focuses on the subset of long-term and annual performance measures that capture the most important aspects of a program's mission and priorities. Based on a set of yes or no questions (see Box A.2), each program is assigned a score, which is translated into a qualitative rating: effective, moderately effective, adequate, ineffective, or results not demonstrated. The rating is intended to be used to tie program performance to

[1] Public Law 103-62.

[2] General Accounting Office, 1997, Measuring Performance: Strengths and Limitations of Research Indicators, GAO/RCED-97-91, Washington, D.C., 34 pp.

[3] Memorandum on FY 2004 interagency research and development priorities, from John H. Marburger III, director of the Office of Science and Technology Policy, and Mitchell Daniels, director of the Office of Management and Budget, May 30, 2002, <http://www.ostp.gov/html/ombguidmemo.pdf>. The guidelines drew heavily from National Research Council, 2001, Implementing the Government Performance and Results Act for Research: A Status Report, National Academy Press, Washington, D.C., 190 pp. OMB also developed guidelines for applied research, using the Department of Energy's (DOE's) applied energy technology programs as a pilot. See <http://www7.nationalacademies.org/gpra/Applied_Research.html>.

Box A.1
R&D Investment Criteria

The following criteria apply to all federal research and development programs.

Relevance

Programs must have complete plans, with clear goals and priorities.

Programs must articulate the potential public benefits of the program.

Programs must document their relevance to specific presidential priorities to receive special consideration.

Program relevance to the needs of the nation, of fields of science and technology, and of program "customers" must be assessed through prospective external review.

Program relevance to the needs of the nation, of fields of science and technology, and of program "customers" must be assessed periodically through retrospective external review.

Quality

Programs allocating funds through means other than a competitive, merit-based process must justify funding methods and document how quality is maintained.

Program quality must be assessed periodically through retrospective expert review.

Performance

Programs may be required to track and report relevant program inputs annually.

Programs must define appropriate output and outcome measures, schedules, and decision points.

Program performance must be retrospectively documented annually.

SOURCE: Office of Management and Budget, 2003, Budget Procedures Memorandum No. 861, Completing the Program Assessment Rating Tool (PART) for the FY 2005 Review Process, 60 pp., <http://www.whitehouse.gov/omb/part/bpm861.pdf>.

Box A.2
PART Questions and Relation to the R&D Investment Criteria

Program Purpose and Design (20 percent weighting)

Questions address the R&D investment criterion of program relevance:

1.1 Is the program purpose clear?
1.2 Does the program address a specific and existing problem, interest, or need?
1.3 Is the program designed so that it is not redundant or duplicative of any other federal, state, local, or private effort?
1.4 Is the program design free of major flaws that would limit the program's effectiveness or efficiency?
1.5 Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Strategic Planning (10 percent weighting)

Questions address prospective aspects of the R&D investment criteria:

2.1 Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?
2.2 Does the program have ambitious targets and time frames for its long-term measures?
2.3 Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?
2.4 Does the program have baselines and ambitious targets for its annual measures?
2.5 Do all partners (including grantees, subgrantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?
2.6 Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?
2.7 Are budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?
2.8 Has the program taken meaningful steps to correct its strategic planning deficiencies?

Additional questions for R&D programs:

2.RD1 If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?
2.RD2 Does the program use a prioritization process to guide budget requests and funding decisions?

Program Management (20 percent weighting)

Questions address prospective aspects of program quality and performance in the R&D investment criteria, as well as general program management issues:

3.1 Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?
3.2 Are federal managers and program partners (including grantees, subgrantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule, and performance results?
3.3 Are funds (federal and partners') obligated in a timely manner and spent for the intended purpose?
3.4 Does the program have procedures (e.g., competitive sourcing or cost comparisons, information technology improvements, appropriate incentives) to measure and achieve efficiencies and cost-effectiveness in program execution?
3.5 Does the program collaborate and coordinate effectively with related programs?
3.6 Does the program use strong financial management practices?
3.7 Has the program taken meaningful steps to address its management deficiencies?

Additional question for R&D programs:

3.RD1 For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Program Results and Accountability (50 percent weighting)

Questions address retrospective aspects of the R&D investment criteria, with emphasis on performance:

4.1 Has the program demonstrated adequate progress in achieving its long-term performance goals?
4.2 Does the program (including program partners) achieve its annual performance goals?
4.3 Does the program demonstrate improved efficiencies or cost-effectiveness in achieving program goals each year?
4.4 Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?
4.5 Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

SOURCE: Office of Management and Budget, 2005, Guidance for Completing the Program Assessment Rating Tool (PART), 64 pp., <http://www.whitehouse.gov/omb/part/fy2005/2005_guidance.doc>.

budget appropriations.[4] Twenty percent of federal programs are rated each year, beginning with the fiscal year (FY) 2004 budget request.[5]

A 2004 General Accounting Office (GAO) report found that PART had helped structure OMB's use of performance information for program analysis and internal review.[6] However, budget allocations were not always tied to program ratings: programs rated "effective" or "moderately effective" did not always receive increased funding, and programs rated "ineffective" did not always lose funding. The report also noted that by using PART to influence GPRA measures, OMB is influencing agency program goals, to the detriment of a wide range of stakeholders. It concluded that although PART is useful for program-level budget analysis, it cannot substitute for GPRA's longer-term, strategic focus on thematic goals. Nevertheless, goals and performance measures relevant to the R&D investment criteria and PART are being incorporated into future GPRA agency performance plans.[7]

[4] Office of Management and Budget, 2003, Performance Measurement Challenges and Strategies, 13 pp., <http://www.whitehouse.gov/omb/part/challenges_strategies.pdf>.

[5] Agency programs relevant to climate change that were evaluated in the FY 2004 budget include Department of Defense (DOD) Basic Research; DOE's Basic Energy Sciences, Biological and Environmental Research, Environmental Management, and Office of Science; U.S. Agency for International Development (USAID) Climate Change; and National Science Foundation (NSF) Geosciences. In FY 2005, relevant agency programs include the U.S. Department of Agriculture's (USDA's) National Resources Inventory and Soil Survey; Department of the Interior's (DOI's) Science and Technology; Environmental Protection Agency's (EPA's) Ecological Research; and National Aeronautics and Space Administration's (NASA's) Biological Sciences Research and Earth Science Applications. See <http://www.whitehouse.gov/omb/part/program_assessments_planned_2005.html>.

[6] General Accounting Office, 2004, Performance Budgeting: Observations on the Use of OMB's Program Assessment Rating Tool for the Fiscal Year 2004 Budget, GAO-04-174, Washington, D.C., 67 pp.

[7] Office of Management and Budget, 2003, Budget Procedures Memorandum No. 861, Completing the Program Assessment Rating Tool (PART) for the FY 2005 Review Process, 60 pp., <http://www.whitehouse.gov/omb/part/bpm861.pdf>.
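The section weightings in Box A.2 suggest how yes/no answers roll up into a single program score. The sketch below is a simplified illustration only: the 20/10/20/50 percent weights come from Box A.2, but treating each section score as the fraction of questions answered yes, and the numeric cutoffs for the qualitative ratings, are assumptions for illustration, not OMB's actual scoring rules (which allow partial credit and assign "results not demonstrated" separately).

```python
# Illustrative PART-style weighted scoring. Section weights are from Box A.2;
# the yes/no-to-fraction scoring and rating cutoffs below are assumptions.

WEIGHTS = {
    "purpose_design": 0.20,          # Program Purpose and Design
    "strategic_planning": 0.10,      # Strategic Planning
    "program_management": 0.20,      # Program Management
    "results_accountability": 0.50,  # Program Results and Accountability
}

def part_score(section_answers: dict[str, list[bool]]) -> float:
    """Weighted percent score: each section contributes its weight times
    the fraction of its yes/no questions answered yes."""
    total = 0.0
    for section, weight in WEIGHTS.items():
        answers = section_answers[section]
        total += weight * (sum(answers) / len(answers))
    return 100 * total

def rating(score: float) -> str:
    # Hypothetical cutoffs for the qualitative ratings named in the text;
    # "results not demonstrated" is assigned separately and omitted here.
    if score >= 85:
        return "effective"
    if score >= 70:
        return "moderately effective"
    if score >= 50:
        return "adequate"
    return "ineffective"

# Hypothetical program: all purpose/design questions yes, some gaps elsewhere.
answers = {
    "purpose_design": [True] * 5,                    # 1.1-1.5
    "strategic_planning": [True] * 8 + [False] * 2,  # 2.1-2.8 yes; 2.RD1-2 no
    "program_management": [True] * 7 + [False],      # 3.1-3.7 yes; 3.RD1 no
    "results_accountability": [True, True, False, False, True],  # 4.1-4.5
}
score = part_score(answers)
print(round(score, 1), rating(score))  # 75.5 moderately effective
```

The heavy 50 percent weight on Program Results and Accountability means retrospective performance dominates the score, consistent with the emphasis on retrospective review in the R&D investment criteria.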