Scientific Review of the Proposed Risk Assessment Bulletin from the Office of Management and Budget

6 Impact on the Practice of Risk Assessment in the Federal Government

The committee was asked to comment in general terms on how the guidance in the proposed Office of Management and Budget (OMB) bulletin would affect the practice of risk assessment in the federal government. The committee interpreted that task as including the following questions: First, how would implementation of the OMB bulletin improve the practice of risk assessment from a scientific perspective? That is, what benefits would accrue from implementation of the bulletin? Second, what are the costs in staff resources that would be necessary to implement the bulletin? Third, how would implementation of the bulletin affect the timeliness of completing risk assessments in the federal government? Fourth, can the bulletin be integrated smoothly into the agencies’ current practices? Fifth, overall, what are the expectations as to whether implementation of the bulletin would improve the practice of risk assessment in federal agencies and achieve the stated objective “to enhance the technical quality and objectivity of risk assessments prepared by federal agencies” (OMB 2006, p. 3)?

On the basis of the committee’s general experience, information generated during its review of the bulletin, and the comments received from federal agencies (see Appendix E), the committee concludes that, although variable and uncertain to some extent, the potential for adverse impacts of the bulletin on the practice of risk assessment in the federal government is high. The bases of that conclusion are discussed below and include the likely drain on agency resources, the extended time necessary to complete risk assessments that are undertaken, and the highly likely disruptive effect on many agencies of implementing the bulletin. Moreover, if some of the provisions discussed in earlier chapters and below were ultimately interpreted in a rigid one-size-fits-all way, the overall adverse impact would be substantially greater. As a starting point, the committee addresses OMB’s failure to undertake—or at least provide to the public—an evaluation of the likely benefits and costs of implementing the bulletin for agency risk assessment practices and the consequences of that omission for the committee’s work.

THE ABSENCE OF INFORMATION TO EVALUATE THE IMPACT OF THE BULLETIN ON AGENCY RISK ASSESSMENT PRACTICES

OMB, the champion of benefit-cost analysis for decision-making, requires agencies that propose major regulations to provide quantitative, or at least qualitative, information regarding the anticipated consequences of their proposals. It was therefore surprising that OMB did not include such information in its proposed bulletin. For example, to gauge the benefits to be achieved from implementing the bulletin, it is essential to specify the baseline—in this case, the agencies’ current practices with respect to risk assessment. Although OMB has implied that the agencies do not now meet the standards it seeks to establish, it has not constructed a baseline specifying risk assessment proficiency for each agency (or even each of the major regulatory agencies), including the extent to which a few, some, or many agencies produce generally satisfactory and high-quality risk assessments or the reasons why those or other agencies fall short of the specified standards. Specifically, OMB has not established which agencies do not know what good practices are and which agencies do not have the ability, resources, or incentives to meet those standards.
Similarly, OMB has not identified the costs that could be incurred by implementing the bulletin. The extent of the changes in the agencies will generally depend on the extent to which they are not currently meeting the standards set forth in the bulletin—again, a baseline issue. Beyond that, however, OMB has not identified specific costs, such as the staff resources necessary to meet the bulletin’s standards, the additional time that would be required to meet the standards, and the disruption that would result from changing established practices. Nor has OMB indicated what weight it gave to those factors in its decision to propose the bulletin. Given the importance of such information for an evaluation of the impact of the bulletin, the committee has attempted on its own to analyze the various likely effects of the bulletin on the practice of risk assessment in the federal government.

BENEFITS

OMB anticipates that implementation of the bulletin would raise all agency risk assessment practices to consistently higher levels and that that would translate into better information for decision-makers and hence better decisions. The committee accepts (indeed, applauds) that goal but finds that the proposed bulletin cannot achieve that result. In evaluating potential benefits, it is essential to understand that not all agencies are the same and that they deal with different types of hazards or risks. Indeed, there are substantial disparities among agencies (and even among components of the same agency) in sophistication with respect to risk assessment, expertise and experience with risk assessments, and resources available to devote to risk assessments. Some agencies have spent considerable time and resources in developing internal risk assessment guidelines (for example, EPA 1996, 1998, 2005; NASA 2002), others have taken the first few steps toward staffing up and are making some progress, and still others appear to rely almost completely on outsourcing for their risk assessments. The agencies also have different missions, which require different types of risk assessments. There are risk assessments involving engineered systems, risk assessments involving ecologic science, and risk assessments dealing with public-health issues (perhaps those on which the bulletin is most focused) and involving biologic sciences.
Although those assessments have features in common, they differ substantially in many ways.1

1 See, for example, Appendix E, p. DOT-2 (“the operating administrations employ varied risk assessment practices that range from informed judgment to probabilistic risk assessments”); p. DOD-1 (“risk assessment methods and characterization of uncertainty are dependent upon and tailored to the specific purpose or function being assessed”); pp. HHS-1 and -2 (“FDA and CDC use very similar conceptual approaches to risk assessment although the different contexts (e.g., food, environmental, and occupation) necessitate differences in these agencies approaches. … FDA’s efforts include probabilistic risk assessments, safety assessments and qualitative risk assessments”); and p. EPA-2.

The bulletin would thus affect different agencies in vastly different ways, and the potential for benefits would be highly varied across agencies. In general, the introduction and implementation of standards and guidelines where none exist or where existing standards are inadequate could lead to improvements. The people responsible for the expanded range of risk assessments covered by the bulletin would undoubtedly benefit from having an explicit statement of what is expected of them2—and this is true even for those in the agencies that contract out most of their risk assessment work. Implementation of the bulletin might enable agency managers (particularly political appointees who may not have extensive experience with the scientific or technical work of the agency) to ask more pertinent questions of risk assessors and to demand compliance with the new standards.3 And availability of the bulletin might diminish delays caused by having to start anew or make major revisions near the end of the process. Therefore, the committee expects that if the proposed bulletin were implemented, at best, some agency risk assessments might be slightly improved from a scientific or technical perspective.

2 See, for example, Appendix E, p. DOD-10.
3 See, for example, Appendix E, p. NASA-9 (“being able to cite an external requirement reinforces the existing risk assessment requirements established within NASA”).

One important committee concern is that imposing all the bulletin’s provisions on all agency risk assessments would not improve their scientific quality. That is so because broad scientific consensus does not exist for some provisions, such as uncertainty and variability (discussed in Chapter 4). Another potential problem is that the bulletin specifies that the quality standard for the dissemination of public information in the 1996 amendments to the Safe Drinking Water Act “should be met, where feasible, in all risk assessments which address adverse health effects” (OMB 2006, p. 13). The committee was unable to identify any information regarding the implementation of that statute, and although it includes a number of scientifically valid suggestions, it represents a proposal at the edge of risk assessment science rather than one of general scientific acceptance.4 In these circumstances, to impose across the board a legislative provision enacted for one specific statute as though it represents mainstream scientific thought seems premature and would produce uncertain results.

4 See Appendix E, pp. EPA-16 and -17 (“EPA has adapted these requirements in its implementation of the Information Quality Guidelines…‘in light of our numerous statutes, regulations, guidance and policies’…and [to] accommodate the range of real world situations that EPA confronts in its implementation of our diverse programs.”)

Moreover, if the goal is consistently high-quality risk assessments across the federal government, the agencies farthest behind would have to be brought up to the level of the agencies doing a generally respectable job, and there is no indication that the agencies behind the curve will be able to improve if only they are told what they need to do.5 In fact, many deficiencies in the technical quality of current risk assessments and risk assessment programs can be traced, not to inadequate guidance, but to inadequate resources, including inadequate budgets and inadequate staffing (qualitatively in relevant expertise or quantitatively in number of qualified experts). Many agencies will require more and better data to satisfy the specific risk assessment requirements described in the bulletin.6,7 That in turn will depend in part on future federal budgets for research and data-gathering,8 whether that work is undertaken by agency personnel or through outsourcing. For some individual assessments, funding requirements can be substantial.9 Even small, local assessments can generate substantial costs for research and data-gathering.

5 Appendix E, pp. DOD-6 and -7, is not to the contrary; DOD asks for more guidance, but it is looking not for general statements but for specific implementation or policy clarifications to flesh out the general guidance.
6 The committee notes that talented risk assessors can produce high-quality risk assessments (that is, risk assessments that capture the state of the knowledge concerning the hazard, describe uncertainties, and explain how the uncertainties affect the interpretation of the risk assessment results) with minimal or poor data. However, risk assessments conducted with minimal or poor data will inevitably yield risk estimates with greater uncertainty, which will undermine acceptance.
7 See, for example, Appendix E, p. DOT-3 (“the challenges…involve a lack of data relating to the nature of the risks at issue”); p. DOD-5 (“lack of scientifically defensible and/or agreed upon input information”); p. DOE-3 (“one of the technical difficulties is the paucity of data”); p. EPA-5 (“the principal scientific challenge relates to limited data”); and p. HUD-1 (“data are not amenable to aggressive statistical data manipulation”).
8 See, for example, Appendix E, p. HHS-4 (“most challenges that FDA faces in conducting risk assessments are related to funding or resource scarcity rather than substantial scientific or technical issues”); and p. HUD-1 (“Congressional authority and appropriations may limit the scope of research to support the risk assessment”).
9 For example, EPA alone “has funded a total of $368 million on PM [particulate matter] research and related technical work for fiscal years 1998-2003, including $66.7 million for fiscal year 2003” (NRC 2004).

In addition, informed use of data depends on staff expertise and experience in each of the areas identified as risk assessment in the bulletin. Without adequate staffing, scientific data cannot be responsibly interpreted and applied for risk assessment purposes. For example, some agencies—such as the Centers for Disease Control and Prevention and some offices in the Food and Drug Administration and the Environmental Protection Agency (EPA)—have epidemiologists on staff, but other risk assessing offices that will need fully qualified epidemiologists to meet the standards set forth in the bulletin do not have such experts on board.10 Given the current state of affairs with respect to funding and staffing, the committee finds that implementation of the bulletin, without concentrated attention on data and staffing needs in relation to the baseline, is unlikely to achieve the objective of enhancing the technical quality of risk assessments throughout the federal government.

10 See Appendix E, p. HUD-1 (“cannot support full time equivalent staff for the [required] analyses”).

COSTS

Staff Resources to Implement Guidance

Although the benefits associated with the changes in practices called for by the bulletin would be varied and uncertain, the costs can be expected to affect every agency and, in general, to be substantial. That should be considered in the current context of limited funding for risk assessment activities and the challenges already facing agencies.11 Thus,

11 See, for example, Appendix E, p. HHS-4 (“the logistics of supporting risk assessment activities remain difficult and involve issues such as availability of staff expertise and availability of funding”).
adding mandates (for example, expanding the scope and complexity of risk assessments) would necessitate reallocation of resources and would probably negatively affect the number of risk assessments produced by federal agencies, the availability of advisory materials from federal agencies, and the ability of the agencies to complete non-risk-assessment work.

First, as discussed in Chapter 3, the definition of risk assessment in the bulletin goes well beyond what the agencies have consistently construed as risk assessments. In addition, with so many separate documents (defined as including not just traditional risk assessments but also analyses, such as margin-of-exposure estimates, hazard determinations, and toxicologic profiles) individually subject to the standards and related certification, additional resources would be required for agencies to ensure that their work products satisfy the requirements of the bulletin.12

12 See, for example, Appendix E, p. NASA-9 (applying the bulletin to “any internal risk assessment performed within NASA that is releasable under the Freedom of Information Act…could [result in] a substantial burden to meet all of the requirements contained within the Bulletin”); and pp. DOL-4 and -5 (the Occupational Safety and Health Administration’s exposure assessments and nonregulatory informational products (for example, perchloroethylene exposures) have not been treated as risk assessments and would therefore be subject to new requirements).

Second, in many instances, additional research would have to be undertaken or additional data gathered to meet the provisions of the bulletin. Consider, for example, Section IV(3), requiring a range of plausible risk estimates whenever a quantitative characterization of risk is provided, and Section IV(7c), requiring “information on the timing of exposure and the onset of the adverse effect(s).” Section IV(7b)’s requirement to “assess, to the extent feasible, countervailing risks caused by alternative mitigation measures” could lead, for example, to having to evaluate occupational risks posed by environmental interventions or even the secondary effect of income on health; this could result in an extremely broad-based analysis much larger in scope than currently undertaken. Consider also the additional analysis that would have to be undertaken to satisfy Section V(4c)’s requirement that risk assessors “provide a quantitative distribution of the uncertainty” and Section V(6)’s requirement “to characterize…variability through a quantitative distribution, reflecting different affected population(s), time scales, geography, or other parameters relevant to the needs and objectives of the assessment.” In general,
adherence to the provisions of the bulletin would be more labor- and resource-intensive than current practices.13 As noted above, staff would have to be added to provide for necessary expertise; even for agencies that have an excellent corps of experts in other fields, the bulletin’s emphasis on uncertainty analysis will require major qualitative and quantitative changes in their staffing profiles to ensure the availability of adequate numbers of persons qualified to produce and interpret these complex analyses.14 Virtually all the existing staff and the new staff would also have to undergo training, a costly and time-consuming process.

13 See Appendix E, p. DOD-11. See also, for example, Appendix E, p. EPA-14 (“if categorically adopted [the Bulletin’s provisions] would mandate a high level of analysis and development of characterization that goes beyond most current EPA practice in risk assessment”).
14 See Appendix E, p. DOD-11 (“increased the level of expertise needed to perform quantitative uncertainty analyses”).

Third, the lack of flexibility and the lack of clarity of the bulletin would probably result in some unnecessary use of resources with little gain in quality because, as the bulletin reads now, a standard is to be applied whether or not it has scientific relevance in any particular case. For example, applying a quantitative analysis to a qualitative discussion of toxicity would have little value. Reanalyzing analyses previously rejected or evaluating the rigor of proffered studies developed by outside parties on similar issues would not ordinarily improve the quality of the risk assessment itself.15 And subjecting peer-reviewed scientific journal articles or PowerPoint presentations to the provisions of the bulletin would add little scientific rigor to the assessment of risks by the agencies.16

15 See Appendix E, p. HHS-15. See also p. OMB-3 (“If third-party submissions are to be used and made publicly available by Federal agencies, it is the responsibility of the Federal Government to make sure that such information meets relevant standards”).
16 See Appendix E, pp. HHS-7 and -8.

There is also the possibility of squandering resources because the bulletin is not clear as to what constitutes compliance, in the sense of what is sufficient to satisfy the requirements. For example, Section IV(3) states that “when a quantitative characterization of risk is provided, a range of plausible risk estimates shall be provided.” How large must the range be? Section IV(4b) requires that risk assessors give “weight to both positive and negative studies in light of each study’s
technical quality.” Apart from the possibility that affected entities could use this provision to further drain staff resources by requiring an agency to respond to unconventional or largely irrelevant studies,17 how serious must a study be to qualify? How much discussion of the weight given is necessary? Section IV(5) requires that for “critical assumptions in the assessment…a quantitative evaluation of reasonable alternative assumptions and their implications for the key findings of the assessment” be included. How many alternative assumptions must be considered, and how detailed should the discussion of the implications be?18 Given the likelihood of challenges to controversial risk assessments, agencies may feel compelled to reallocate even more resources to particularly important risk assessments, lest there be any question about their compliance with the bulletin.19 These are clearly wasted costs that could be substantial as the stakes are raised.

17 See Appendix E, p. HHS-20 (“There may be instances where parties (particularly competitors) may disagree over the ‘science’ to be applied…or even whether conventional scientific concepts are applicable or recognizable. In the latter case, individuals or firms advocating the use of ‘unconventional’ or ‘alternative’ therapies may…argue that individuals trained in ‘conventional’ science or medicine are either biased or not qualified to evaluate the merits of their products”).
18 See Appendix E, p. DOE-3 (“there is always one more scenario, or one more approach that someone feels deserves assessment”).
19 See, for example, Appendix E, p. EPA-14 (“while…the Bulletin does not create legal rights…, challenges that claim that the risk assessment or supporting analyses have not fully carried out the practices established by the Bulletin come in many other fora. Such claims could pose an additional burden”).

Moreover, as discussed in Chapter 4, it is not always clear at the outset of a risk assessment whether it would ultimately fall under the general standards, the regulatory standards, or the special standards for influential risk assessments.20 If the risk were considered influential and the most exacting standards were applied only to find little impact, substantial resources would have been used needlessly. Although these new obligations would be imposed, there is no indication that any additional funds are being requested or appropriated. If current budgetary conditions continue for the indefinite future, it appears unlikely that additional resources will be made available. As a result, funding will have to come from within the agencies and presumably

20 See also, for example, Appendix E, p. HHS-13 (“It is not always clear…at the outset of a risk assessment that it will be influential”).
from funds otherwise targeted for risk assessments. The committee is concerned about two possibilities. Some agencies may try to meet the new requirements on some risk assessments, leaving them with inadequate resources to undertake other (already started or planned) risk assessments, so fewer risk assessments would be done, fewer risks would be identified (and the extent of the risks understood), and fewer solutions would be proposed for problems that need consideration. The alternative is for agencies to continue to do all the risk assessments they are now doing (indeed, in some cases, statutorily required to do) and the additional ones that are now covered by the bulletin but cut corners wherever they can, so the overall level of quality will decline.

Timeliness in Completing Risk Assessments

Many risk assessments take considerable time, some several years.21 The bulletin obviously would add to the timeline of existing risk assessments, sometimes—for example, the requirements for gathering additional data or doing additional research or analysis discussed above—a great deal of time.22 There would be additional demands from its application to documents not customarily considered risk assessments.23 At a minimum, effective implementation of the bulletin would require agency management to add to already full workloads far more time for risk assessment planning at the front end of the process and for interpretation and use of the risk assessment at the end of the process.24 An agency must also be mindful that, in some circumstances, the full scope of data needs may be identified only during the course of an assessment when data gaps appear; at that point, the assessment may be put on hold for additional data development to augment the assessment or require additional time and staffing to complete. Finally, delays in completing risk assessments, again as defined by the bulletin, may result in untimely responses to “unsafe conditions,”25 urgent public-health needs,26 untimely release of public alerts about serious risks,27 or disease investigations, such as anthrax deaths, mumps outbreaks, or SARS.28

21 See Appendix E, p. DOT-5 (“the time…varies widely from days to years, depending on the complexity of the issue”); p. DOD-8 (“Health hazard assessments…take 30 to 90 days…Human health risk assessments for the Defense Environmental Restoration Program sites can vary from months…to 5 years or greater for complex sites”); p. DOE-4 (“the range may be from a few months to several years for extremely complex or controversial projects”); p. DOI-9 (time “varies, contracted risk assessments may take up to 2 years from problem identification to delivery”); p. EPA-9 (“assessments vary widely in their complexity and in the time needed for their production and completion…ninety days [for TOSCA]…few weeks or few months [for Superfund sites]…one to five years [for IRIS]…and some of the most complex assessments…in which there is significant controversy and significant new data, the time needed may extend well beyond five years”); pp. HHS-6 (“the Report on Carcinogens takes approximately 2.5 years for each agent under review”) and -22 (surgeon general reports on smoking “varied from less than a year to over 5 years”); p. HUD-2 (“the period to complete an original risk assessment is usually two years”); p. NASA-7 (“the completion of nuclear mission safety analyses require about 3-5 years”); and p. DOL-4 (“most recently completed risk assessment…required about 2.5 years”).
22 See Appendix E, p. NASA-7 (applying the standards for influential risk assessments to determine whether an internal policy or directive was required could “dramatically impact the time to develop, implement, and modify the internal controls”); p. DOL-6 (“deriving quantitative distributions of model uncertainty and variability…could add significant time…where such analyses are not critical to fully inform regulatory decisionmakers”); p. DOD-12 (giving “weight to both positive and negative studies in site-specific risk assessments…may significantly increase the time and resources needed to conduct the assessment”); and p. EPA-14 (describing the many problems with the standards calling for multiple analyses).
23 See, for example, Appendix E, pp. HHS-21 and -22 and p. EPA-14.
24 See, for example, Appendix E, p. HUD-3 (“the time course would have to be extended to make sure the procedures are properly followed”); p. DOD-11 (“some organizations…believe that adherence to [certain] provisions may impact the ability to meet critical and/or regulatory prescribed deadlines”); and p. EPA-15 (“if EPA followed all of the procedures described in the twenty standards, assessments could take considerably longer”).
25 See Appendix E, p. DOT-8.
26 See Appendix E, p. HHS-21.
27 See Appendix E, p. HHS-15.
28 See Appendix E, p. HHS-10.

One element of the current timeline for completing risk assessments is the OMB requirement establishing minimum standards for peer review of scientific information disseminated by the federal government (70 Fed. Reg. 2664-2677).29 The requirement for peer review is repeated in this bulletin.30 As a practical matter, if the risk assessment bulletin were issued as is, agencies might add to the task of the peer reviewers that they determine whether the risk assessment they are reviewing meets the bulletin’s guidance. If so, that will add another ingredient to the peer review process and possibly extend the time needed.31 There is also concern that the bulletin’s call for peer review, combined with its call for public participation, goes beyond previously stated requirements.32 More important, additional time would be added if, in addition to the peer review process, OMB were to review anew the work product even where there is an existing well-done peer review.33

In this connection, the committee notes that it supports the call for peer review of risk assessments because that is the standard course for ensuring good scientific standards on such work. The committee is therefore troubled by OMB’s repeated references to the Information Quality Act (IQA) and its invocation as the legal authority for OMB to issue this bulletin (OMB 2006, p. 7), which suggest that challenges to a particular risk assessment—and almost every risk assessment is open to challenge on one ground or another—will be handled through the process designed for the IQA, a process that is more a legal or policy process than a scientific one.34 Specifically, the committee is concerned that to the extent that the implementation of the technical aspects of risk assessment will be

29 OMB there stated that “peer review is one of the important procedures used to ensure that the quality of published information meets the standards of the scientific and technical community” (p. 2665).
30 Section III(5): “The agency shall follow appropriate procedures for peer review and public participation in the process of preparing the risk assessment.”
31 See, for example, Appendix E, p. DOE-5.
32 See, for example, Appendix E, p. EPA-15 (“this section goes beyond [existing] guidelines by calling for a response to comment package for all influential risk assessments, and also in its call not only to explain the basis for the agency position, but also to explain why other approaches were not taken, and why”); and p. HHS-22, discussing surgeon general reports and noting that “adding the requirement for public participation and comment to this process likely would add a large volume of comments, which would affect the timeliness of the reports without adding improvements in the scientific quality to the report.” See also Appendix E, p. DOT-8.
33 See Appendix E, p. OMB-3 (“under existing authorities and procedures, OMB might review a risk assessment [that has been peer reviewed in accordance with established peer review procedures]”).
34 The IQA gives private groups the right to file administrative challenges to data disseminated by federal agencies, with an appeal to OMB (67 Fed. Reg. 8452).
overseen by OMB and not by the peer-review process or by agency technical managers, scientific issues may be superseded by policy considerations.35

The bulletin also includes a number of requirements concerning presentation, such as Sections IV (1), (2), and (4c). Some, like providing "a clear statement of the…objectives of the risk assessment," a clear summary of "the scope of the assessment," "presenting the information about risk in an accurate, clear, complete and unbiased manner," and "describing the data, methods and assumptions used in the assessment with a high degree of transparency," are fairly straightforward and should be part of any well-written risk assessment, although there may be matters of dispute as to how complete is "complete" and how transparent is "a high degree of transparency."36 The bulletin also requires an executive summary (Section IV) that includes, among other things, "information that places the risk in context/perspective with other risks familiar to the target audience" (see Chapter 4). The apparent purpose of this recommendation is to remedy a presumed inability of the readers of risk assessments to understand the numbers as written. That makes it an aspect of risk communication, a process that the bulletin specifically disclaims addressing (OMB 2006, p. 3). As discussed in Chapter 5, the bulletin's exclusion of risk communication is at variance with accepted practice, which holds that two-way risk communication is essential to sound assessment.
If additional one-way communication is undertaken at the end of an assessment process to make the results available to a wider (or less knowledgeable) audience, additional resources and time will be necessary to ensure that the materials are prepared in a scientifically sound way.37 Here, as elsewhere in this discussion, committee reservations and concerns are related to broad-brush application of all standards to all agencies; concerns regarding specific standards themselves are discussed elsewhere.

A slightly different problem arises from the bulletin's requirement that risk assessments used for regulatory analysis include a variety of evaluations of alternative mitigation measures (see Sections IV [7a], [7b], and [7c]). Although risk assessors contribute information for use in risk management, this standard goes well beyond the job description of the scientist or technical person assessing the risk and onto the path of risk management, another subject that the bulletin said it would not address (OMB 2006, p. 3). Not only would these requirements be resource-intensive and time-consuming, but the committee is also concerned that, if they were incorporated at the primary stage of the risk assessment process (for example, identifying a hazard and determining the extent of the risk), risk assessors might be greatly delayed in completing their work.

Another example of the burden that would be imposed by the bulletin is the provision that for every risk assessment document (again, defined to include not just complete risk assessments but also individual components) an agency will have to "include a certification explaining that the agency has complied" with the bulletin and the information-quality guidelines (OMB 2006, p. 25). That is not merely a matter of obtaining an official's signature or checking boxes on a form; it apparently would require a serious explanation of each step of the process and of how each step constitutes compliance with the bulletin. That is a substantial time demand, even assuming that everything is in order, not only for the scientific or technical staff that performed the risk assessment but also for agency managers and presumably the general counsel's office.

Time, like funds in the federal government, is a limited commodity, even apart from the statutory or court-imposed deadlines that are so problematic for some agencies. Time spent by staff on one project, whether doing additional work to comply with the terms of the bulletin or documenting that compliance, is time not spent on another, potentially more important project. Again, the committee is concerned about two possibilities: fewer risk assessments (with the attendant consequences) or the same number of risk assessments but of lower quality.

35 This concern is greatly increased by public comments requesting that judicial review be considered a component of this process, further converting the scientific process into a legal one. OMB does not address the issue of enforcement in the bulletin, but in light of the many public comments on this issue, some clarification of OMB's position would be desirable.
36 But see Appendix E, p. HHS-17 ("excessive characterization of every possible uncertainty or extensive evaluation of each assumption could make the risk assessment more confusing and less transparent").
37 See, for example, Appendix E, p. DOD-11 ("additional labor will be required to…communicate the results to people unfamiliar with the risk assessment process"). See also Appendix E, p. DOT-8 ("requiring the risk assessment to contain a range of risk estimates so that the public is aware of whether the nature of the risk is conservative…is time consuming, not always necessary, and could deter the DOT operating administrations from employing such assessments").
Integrating the Bulletin into Current Practices

In addition to the costs identified above, there would probably be substantial costs attributable to integrating the bulletin into current risk assessment practices in the federal government. As noted above, many agencies have devoted substantial resources over the last several decades to developing risk assessment guidelines appropriate for their missions. Some, like those at EPA, have been developed with substantial input from stakeholders, consultants, congressional staff, the National Research Council, and other experts and interested parties. If the bulletin were viewed not merely as technical guidance for the less proficient in the field but rather (as it appears to be) as the standards to be applied to all risk assessments, each of the "mismatches" would have to be identified, the causes of the differences in approach documented, and substantial negotiations conducted with OMB to arrive at a decision as to what is most appropriate for a particular agency.38 It is beyond dispute that what works for EPA is not the same as what works for the Department of Transportation or the National Aeronautics and Space Administration. One size does not fit all, particularly where the agencies come to the issue with such disparate expertise and experience and, more important, dramatically different risk assessment responsibilities and resources. As discussed above, risk assessments in the federal government include those involving statistical analyses, those evaluating the strength of bridges or levees with engineering and the physical sciences, those involving ecologic science, and those involving public-health matters. These call for very different types of analyses, and imposing one set of standards on the lot is likely to be wasteful, if not counterproductive to good science.
The committee notes that this is not just an up-front, one-time cost. Another of the bulletin's requirements is to have procedures in place to ensure that agencies are "aware of new, relevant information that might alter a previously conducted influential risk assessment" (OMB 2006, p. 21). That provision is desirable and scientifically valid if resources are available to monitor the scientific literature for any research associated with an agent for which an influential risk assessment has been conducted and if there is a prospect that changes in the science can be reflected contemporaneously in the decision-making process. But such advances in the science might produce other ways of considering a scientific issue, so the agencies would have to renegotiate with OMB as to how they should do their work.

38 See Appendix E, p. NASA-6 (where "a risk assessment evolves and is updated over the life of the project or program, it can be considered as a 'living' risk model with no fixed dates for their final delivery"). See also Appendix E, p. DOL-2, noting that the Occupational Safety and Health Administration (OSHA) generally bases its regulatory decisions on a range of central estimates of risk derived from the best-supported models and that it is unclear how quantitative uncertainty distributions would be taken into account in OSHA's regulatory framework.

ARE THE GOALS OF ENHANCING TECHNICAL QUALITY AND OBJECTIVITY OF RISK ASSESSMENT MET BY THE PROPOSED BULLETIN?

The committee was asked to determine whether the bulletin achieves its stated purpose to "enhance the technical quality and objectivity of risk assessments." The committee finds that it fails to achieve that purpose. The committee has identified a number of ways in which implementation of the overarching risk assessment principles can improve risk assessment practices but finds that the potential for benefits will vary widely among agencies and that, although salutary in some respects, the proposed bulletin will probably not achieve the objective of raising all agency risk assessment practices to consistently higher levels. In addition, the committee has identified some of the costs associated with the changes that would be brought about, in staff resources, timeliness of risk assessments, and other factors, and finds them to be substantial. Moreover, in earlier chapters the committee identified various issues of interpretation; if these are not resolved in a way that provides flexibility to the agencies, the costs will be significantly increased.
Overall, the committee concludes that, if the proposed bulletin were implemented, the potential for adverse impacts on the practice of risk assessment in the federal government, although varied and uncertain to some extent, would be very high. For that reason, the committee does not accept OMB's view that implementing the bulletin would enhance the technical quality and objectivity of risk assessments in the federal government.
REFERENCES

EPA (U.S. Environmental Protection Agency). 1996. Guidelines for Reproductive Toxicity Risk Assessment. EPA/630/R-96/009. Risk Assessment Forum, U.S. Environmental Protection Agency, Washington, DC [online]. Available: http://www.epa.gov/ncea/raf/pdfs/repro51.pdf [accessed July 27, 2006].

EPA (U.S. Environmental Protection Agency). 1998. Guidelines for Ecological Risk Assessment. EPA/630/R-95/002F. Risk Assessment Forum, U.S. Environmental Protection Agency, Washington, DC [online]. Available: http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=12460 [accessed July 27, 2006].

EPA (U.S. Environmental Protection Agency). 2005. Guidelines for Carcinogen Risk Assessment. EPA/630/P-03/001B. Risk Assessment Forum, U.S. Environmental Protection Agency, Washington, DC [online]. Available: http://www.epa.gov/iris/cancer032505.pdf [accessed July 27, 2006].

NASA (National Aeronautics and Space Administration). 2002. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners, Version 1.1. Prepared for Office of Safety and Mission Assurance, NASA Headquarters, Washington, DC [online]. Available: http://www.hq.nasa.gov/office/codeq/doctree/praguide.pdf [accessed Oct. 18, 2006].

NRC (National Research Council). 2004. Research Priorities for Airborne Particulate Matter: IV. Continuing Research Progress. Washington, DC: The National Academies Press.

OMB (U.S. Office of Management and Budget). 2006. Proposed Risk Assessment Bulletin. Released January 9, 2006. Washington, DC: Office of Management and Budget, Executive Office of the President [online]. Available: http://www.whitehouse.gov/omb/inforeg/proposed_risk_assessment_bulletin_010906.pdf [accessed Oct. 11, 2006].