ernment review. Some assessments, such as TEAP, have not included a formal review process because all of the major stakeholders were already involved in preparing the report and because the reports contained proprietary information. Stratospheric ozone assessments are peer reviewed but do not undergo public review because the scope of the issue is limited and, over time, all major stakeholders have come to regard the process as legitimate.

Effective review processes increase credibility by allowing many individuals to evaluate the veracity of the report and increase legitimacy by involving a larger range of stakeholders (Edwards and Schneider 2001). A transparent process for review is especially important (Edwards and Schneider 2001; Watson 2006). The following questions are helpful for establishing the guidelines for review of an assessment product:

  • Will there be an expert review only, or also a stakeholder, government, and public review?

  • How will reviewers be selected?

  • Who coordinates the review process?

  • How will responses to review comments be handled?

  • Will reviewer comments and responses be made public?

To address the risk that experts involved in the assessment might promote an agenda or their own research, the review process can be designed to include a balanced group of reviewers whose viewpoints and expertise extend beyond the field of science being assessed. The legitimacy of the process can often be enhanced by establishing an independent body of respected individuals to act as a neutral broker between the reviewers and the experts conducting the assessment.

CONSENSUS BUILDING

Dissenting voices among assessment participants can undermine perceptions of the legitimacy of the assessment process and can even detract from its credibility if the dissent is not addressed rigorously and transparently (Edwards and Schneider 2001). Ideally, assessment leaders manage the process so that either a consensus is reached or dissenting conclusions are incorporated into the assessment. For example, differing views can be explained by inherent uncertainties in the state of knowledge or by alternative interpretations of the available information. Assessments are more likely to be effective when they operate under clear guidelines agreed upon by participants at the outset and treat dissenting views explicitly.

Given that process assessments rely on the latest scientific knowledge available, dissenting conclusions in these types of assessments are more


