Toxicity-Pathway-Based Risk Assessment: Preparing for Paradigm Change - A Symposium Summary
Summary of the Symposium

In 2007, a committee of the National Research Council (NRC) proposed a vision that embraced recent scientific advances and set a new course for toxicity testing (NRC 2007a). The committee envisioned a new paradigm in which biologically important perturbations in key toxicity pathways would be evaluated with new methods in molecular biology, bioinformatics, computational toxicology, and a comprehensive array of in vitro tests based primarily on human biology. Although some view the vision as too optimistic about the promise of the new science and debate the time required to implement it, no one can deny that a revolution in toxicity testing is under way. New approaches are being developed, and data are being generated. As a result, the U.S. Environmental Protection Agency (EPA) expects a large influx of data that will need to be evaluated. EPA also faces tens of thousands of chemicals on which toxicity information is incomplete, as well as emerging chemicals and substances that will need risk assessment and possibly regulation. Therefore, the agency asked the NRC Standing Committee on Risk Analysis Issues and Reviews to convene a symposium to stimulate discussion on the application of the new approaches and data in risk assessment. The standing committee was established in 2006 at the request of EPA to plan and conduct a series of public workshops that could serve as a venue for discussion of issues critical for the development and review of objective, realistic, and scientifically based human health risk assessments. An ad hoc planning committee was formally appointed under the oversight of the standing committee to organize and conduct the symposium. The biographies of the standing-committee and planning-committee members are provided in Appendixes A and B, respectively.
The symposium was held on May 11-13, 2009, in Washington, DC, and included presentations and discussion sessions on pathway-based approaches for hazard identification, applications of new approaches to mode-of-action analyses, the challenges to and opportunities for risk assessment in the changing paradigm, and future directions. The symposium agenda, speaker and panelist biographies, and presentations are provided in Appendixes C, D, and E, respectively. The symposium also included a poster session to showcase examples of
how new technologies might be applied to quantitative and qualitative aspects of risk assessment. The poster abstracts are provided in Appendix F. This summary provides the highlights of the presentations and discussions at the symposium. Any views expressed here are those of the individual committee members, presenters, or other symposium participants and do not reflect any findings or conclusions of the National Academies.

A PARADIGM CHANGE ON THE HORIZON

Warren Muir, of the National Academies, welcomed the audience to the symposium and stated that the environmental-management paradigm of the 1970s is starting to break down in the face of recent scientific advances and the exponential growth of information, and that the symposium should be seen as the first of many discussions on the impact of advances in toxicology on risk assessment. He introduced Bernard Goldstein, of the University of Pittsburgh, chair of the Standing Committee on Risk Analysis Issues and Reviews, who stated that although the standing committee does not make recommendations, symposium participants should feel free to suggest how to move the field forward and to make research recommendations. Peter Preuss, of EPA, concluded the opening remarks and emphasized that substantial changes are on the horizon for risk assessment. The agency will soon be confronted with enormous quantities of data from high-throughput testing and from the regulatory requirements of the REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) program in Europe, which mandates testing of thousands of chemicals. He urged the audience to consider the question: What is the future of risk assessment?

Making Risk Assessment More Useful in an Era of Paradigm Change

E.
Donald Elliott, of Yale Law School and Willkie Farr & Gallagher LLP, addressed issues associated with acceptance and implementation of the new pathway approaches that will usher in the paradigm change. He emphasized that simply building a better mousetrap does not ensure its use, and he provided several examples in which innovations, such as movable type and the wheel, were not widely adopted until centuries after their invention. He argued that innovations must ultimately win the support of a user community to be successful, so the new tools and approaches should be applied to problems that the current paradigm has difficulty addressing. Elliott stated that the advocates of pathway-based toxicity testing should illustrate how it can address the needs of a user community, such as by satisfying data requirements for REACH; providing valuable information on sensitive populations; evaluating materials, such as nanomaterials, that are not easily evaluated in typical animal models; and demonstrating that fewer animal tests are needed when the approaches are applied. He warned, however, that the new approaches will not be as influential if they are defined as merely less expensive screening techniques. Elliott continued by saying that the next steps needed to effect the paradigm change will be model evaluation and judicial acceptance. NRC (2007b) and Beck (2002) set forth a number of questions to consider in evaluating a model, such as whether the results are accurate and representative of the system being modeled. The standards for judicial acceptance in agency reviews and in private damage cases differ: the standards for agency reviews are much more lenient than those in private cases, in which a judge must determine whether an expert's testimony is scientifically valid and applicable. Accordingly, the best approach for judicial acceptance would be to have a record established on the basis of judicial review of agency decisions, in which a court generally defers to the agency when decisions involve determinations at the frontiers of science. Elliott stated that the key issue is to create a record showing that the new approach works as well as or better than existing methods in a particular regulatory application. He concluded, however, that the best way to establish acceptance might be for EPA to use its broad rule-making authority under Section 4 of the Toxic Substances Control Act to establish what constitutes a valid testing method in particular applications.

Emerging Science and Public Health

Lynn Goldman, of Johns Hopkins Bloomberg School of Public Health, a member of the standing committee and the planning committee, discussed the public-health aspects of the emerging science and potential challenges. She agreed with Elliott that a crisis is looming, given the number of chemicals that need to be evaluated and the perception that the process for ensuring that commercial chemicals are safe is broken and needs to be re-evaluated.
The emerging public-health issues are compounding the sense of urgency in that society will not be able to take 20 years to make decisions. Given the uncertainties surrounding species extrapolation, dose extrapolation, and evaluation of sensitive populations today, the vision provided in the NRC report Toxicity Testing in the 21st Century: A Vision and a Strategy offers tremendous promise. However, Goldman used the example of EPA’s Endocrine Disruptor Screening Program as a cautionary tale. In 1996, Congress passed two laws, the Food Quality Protection Act and the Safe Drinking Water Act, that directed EPA to develop a process for screening and testing chemicals for endocrine-disruptor potential. Over 13 years, while three advisory committees have been formed, six policy statements have been issued, and screening tests have been modified four times, no tier 2 protocols have been approved, and only one list of 67 pesticides to be screened has been generated. One of the most troubling aspects is that most of the science is now more than 15 years old. EPA lacked adequate funding, appropriate expertise, enforceable expectations by Congress, and the political will to push the
program forward. The fear that a chemical would be blacklisted on the basis of a screening test and the "fatigue factor," in which supporters eventually tire and move on to other issues, compounded the problems. Goldman suggested that the following lessons should be learned from the foregoing example: support is needed from stakeholders, the administration, and Congress for long-term investments in people, time, and resources to develop and implement new toxicity-testing approaches and technologies; strong partnerships within the agency and with other agencies, such as the National Institutes of Health (NIH), are valuable; new paradigms will not be supported unless there are convincing proof-of-concept and verification studies; and new processes are needed to move science into regulatory practice more rapidly. Goldman concluded that the new approaches and technologies have many potential benefits, including improvement in the ability to identify chemicals that have the greatest potential for risk, the generation of more scientifically relevant data on which to base decisions, and improved strategies of hazard and risk management. However, she warned that resources are required to implement the changes: not only funding but highly trained scientists will be needed, and the pipeline of scientists who will be qualified and capable of doing the work needs to be addressed.

Toxicity Testing in the 21st Century

Kim Boekelheide, of Brown University, who was a member of the committee responsible for the report Toxicity Testing in the 21st Century: A Vision and a Strategy, reviewed the report and posed several questions to consider throughout the discussion in the present symposium. The committee was formed when frustration with toxicity-testing approaches was increasing.
Boekelheide cited various problems with toxicity-testing approaches, including low throughput, high cost, questionable relevance to actual human risks, use of conservative defaults, and reliance on animals. Thus, the committee was motivated by the following design criteria for its vision: to provide the broadest possible coverage of chemicals, end points, and life stages; to reduce the cost and time of testing; to minimize animal use and suffering; and to develop detailed mechanistic and dose-response information for human health risk assessment. The committee considered several options, which are summarized in Table 1. Option I was essentially the status quo, option II was a tiered approach, and options III and IV were fundamental shifts in the current approaches. Although the committee acknowledged option IV as the ultimate goal for toxicity testing, it chose option III to represent the vision for the next 10-20 years. That approach is a fundamental shift—one that is based primarily on human biology, covers a broad range of doses, is mostly high-throughput, is less expensive and time-consuming, uses substantially fewer animals, and focuses on perturbations of critical cellular responses.
TABLE 1 Options for Future Toxicity-Testing Strategies Considered by the NRC Committee on Toxicity Testing and Assessment of Environmental Agents

Option I (In Vivo): animal biology; high doses; low throughput; expensive; time-consuming; relatively large number of animals; apical end points.

Option II (Tiered In Vivo): animal biology; high doses; improved throughput; less expensive; less time-consuming; fewer animals; apical end points; some in silico and in vitro screens.

Option III (In Vitro and In Vivo): primarily human biology; broad range of doses; high and medium throughput; less expensive; less time-consuming; substantially fewer animals; perturbations of toxicity pathways; in silico screens possible.

Option IV (In Vitro): primarily human biology; broad range of doses; high throughput; less expensive; less time-consuming; virtually no animals; perturbations of toxicity pathways; in silico screens.

Source: Modified from NRC 2007a. K. Boekelheide, Brown University, presented at the symposium.

Boekelheide described the components of the vision, which are illustrated in Figure 1. The core component is toxicity testing, in which toxicity-pathway assays play a dominant role. The committee defined a toxicity pathway as a cellular-response pathway that, when sufficiently perturbed, is expected to result in an adverse health effect (see Figure 2), and it envisioned a toxicity-testing system that evaluates biologically important perturbations in key toxicity pathways by using new methods in computational biology and a comprehensive array of in vitro tests based on human biology. Boekelheide noted that since release of the report, rapid progress in human stem-cell biology, better accessibility to human cells, and development of bioengineered tissues have made the committee's vision more attainable.
He also noted that the toxicity-pathway approach moves away from extrapolation from high dose to low dose and from animals to humans but involves extrapolation from in vitro to in vivo and between levels of biologic organization. Thus, there will be a need to build computational systems-biology models of toxicity-pathway circuitry and pharmacokinetic models that can predict human blood and tissue concentrations under specific exposure conditions.
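The pharmacokinetic side of that extrapolation can be made concrete with a minimal sketch. The one-compartment model below (first-order oral absorption and elimination) is a standard textbook simplification, and every parameter value is a hypothetical placeholder rather than anything from the report; the models Boekelheide described would be physiologically based and fitted to measured data.

```python
"""Minimal one-compartment pharmacokinetic sketch.

Illustrates, under stated assumptions, how a PK model maps an external
oral dose to a predicted blood concentration over time: the kind of
in-vitro-to-in-vivo bridge described at the symposium. All parameter
values are hypothetical placeholders.
"""
import math

def plasma_concentration(t_h, dose_mg, f_abs, v_d_l, ka_per_h, ke_per_h):
    """Concentration (mg/L) at t_h hours after a single oral dose,
    assuming first-order absorption (ka) and elimination (ke)."""
    coef = (f_abs * dose_mg * ka_per_h) / (v_d_l * (ka_per_h - ke_per_h))
    return coef * (math.exp(-ke_per_h * t_h) - math.exp(-ka_per_h * t_h))

# Hypothetical parameters: 100-mg dose, 50% bioavailability, 42-L
# distribution volume, absorption half-life ~0.7 h, elimination ~6.9 h.
params = dict(dose_mg=100.0, f_abs=0.5, v_d_l=42.0, ka_per_h=1.0, ke_per_h=0.1)

for t in (1, 4, 12, 24):
    c = plasma_concentration(t, **params)
    print(f"t = {t:2d} h  C = {c:.3f} mg/L")
```

The concentration rises while absorption dominates and then decays at the elimination rate; a systems-biology model of pathway circuitry would take such predicted tissue concentrations as its exposure input.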
FIGURE 1 Components of the vision described in the report Toxicity Testing in the 21st Century: A Vision and a Strategy. Source: NRC 2007a. K. Boekelheide, Brown University, presented at the symposium.

FIGURE 2 Perturbation of a cellular-response pathway, leading to adverse effects. Source: Modified from NRC 2007a. K. Boekelheide, Brown University, modified from symposium presentation.
Boekelheide stated that the vision offers a toxicity-testing system more focused on human biology, with more dose-relevant testing and the possibility of addressing many of the frustrating problems in the current system. He listed some challenges posed by the proposed vision, including development of assays for the toxicity pathways, identification and testing of metabolites, use of the results to establish safe levels of exposure, and training of scientists and regulators to use the new science. Boekelheide concluded by asking several questions for consideration throughout the symposium program: How long will it take to implement the new toxicity-testing paradigm? How will adaptive responses be distinguished from adverse responses? Is the proposed approach a screening tool or a stand-alone system? How will the new paradigm be validated? How will new science be incorporated? How will regulators handle the transition in testing?

Symposium Issues and Questions

Lorenz Rhomberg, of Gradient Corporation, a member of the standing committee and chair of the planning committee, closed the first session by providing an overview of issues and questions to consider throughout the symposium. Rhomberg stated that the new tools will both enable and require new approaches. Massive quantities of multivariate data are being generated, and this poses challenges for data handling and interpretation. The focus is on "normal" biologic control and processes and the effects of perturbations on those processes, and a substantial investment will be required to improve understanding of fundamental biology.
More important, our frame of reference has shifted dramatically: traditional toxicology starts with the whole organism, observes apical effects, and then tries to explain the effects by looking at changes at lower levels of biologic organization, whereas the new paradigm looks at molecular and cellular processes and tries to explain what the effects on the whole organism will be if the processes are perturbed. People have different views on the purposes and applications of the new tools. For example, some want to use them to screen out problematic chemicals in drug, pesticide, or product development; to identify chemicals for testing and the in vivo testing that needs to be conducted; to establish testing priorities for data-poor chemicals; to identify biomarkers or early indicators of exposure or toxicity in the traditional paradigm; or to conduct pathway-based evaluations of causal processes of toxicity. Using the new tools will pose challenges, such as distinguishing between causes and effects, dissecting complicated networks of pathways to determine how they interact, and determining which changes are adverse effects rather than adaptive responses. However, the new tools hold great promise, particularly for examining how variations in the population affect how people react to various exposures.
Rhomberg concluded with some overarching questions to be considered throughout the symposium: What are the possibilities of the new tools, and how do we realize them? What are the pitfalls, and how can we avoid them? How is the short-term use of the new tools different from the ultimate vision? When should the focus be on particular pathways rather than on interactions, variability, and complexity? How is regulatory and public acceptance of the new paradigm to be accomplished?

THE NEW SCIENCE

An Overview

John Groopman, of Johns Hopkins Bloomberg School of Public Health, began the discussion of the new science by providing several examples of how it has been used. He first discussed the Keap1-Nrf2 signaling pathway, which is sensitive to a variety of environmental stressors. The pathway has been investigated by using knockout animal models, and the investigations have provided insight into how it modulates disease outcomes. Research has shown that different stressors in Nrf2 knockout mice affect different organs; that is, one stressor might lead to a liver effect, and another to a lung effect. Use of knockout animals has allowed scientists to tease apart some of the pathway integration and has shown that the signaling pathways can have large dose-response ranges (on the order of 20,000-fold) in response to activation. Groopman noted, however, that some of the research has provided cautionary tales. For example, when scientists evaluated the ability of an aflatoxin-albumin biomarker to predict which rats were at risk for hepatocellular carcinoma, they found that the biomarker concentration tracked with the disease at the population level but not in the individual animals. Thus, one may need to be wary of the predictive value of a single biomarker for a complex disease.
In another case, scientists thought that overexpression of a particular enzyme in a signaling pathway would lead to risk reduction, but they found that transgene overexpression had no effect on tumor burden. Overall, the research suggests that a reductionist approach might not work for complex diseases. Groopman acknowledged substantial increases in the sensitivity of mass spectrometry over the last 10 years but noted that throughput in many cases has not increased, an often underappreciated and underdiscussed aspect of the new paradigm. Groopman then discussed recent data on cancer genomes. Sequence analysis of cancer genomes has shown that different types of cancer, such as breast cancer and colon cancer, are not the same disease, and although there are common mutations within the same cancer type, the disease differs among individuals. Through sequence analysis, the number of confirmed genetic contributors to common human diseases has increased dramatically since 2000. Genome-wide association studies have shown that many alleles have modest
effects on disease outcomes, that many genes are involved in each disease, that most genes shown to be involved in human disease were not predicted on the basis of current biologic understanding, and that many risk factors are in noncoding regions of the genome. Sequencing methods and technology have improved dramatically, and researchers who once dreamed of sequencing the human genome in a matter of years can now complete the task in a matter of days (see Figure 3). Groopman concluded by stating that the sequencing technology needs to be extended to experimental models so that questions about the concordance between effects observed in people and those observed in experimental models can be answered.

FIGURE 3 DNA sequencing output. Current output is 1-2 billion bases per machine per day. The human genome contains 3 billion bases. Source: Stratton et al. 2009. Reprinted with permission; copyright 2009, Nature. J. Groopman, Johns Hopkins Bloomberg School of Public Health, presented at the symposium.

Gene-Environment Interactions

George Leikauf, of the University of Pittsburgh, discussed the current understanding of gene-environment interactions. In risk assessment, human variability and susceptibility are considered, and an uncertainty factor of 10 has traditionally been used to account for them. However, new tools available today are helping scientists to elucidate gene-environment interactions, and this research may provide a more scientific basis for evaluating human variability and susceptibility in the context of risk assessment. Leikauf noted that genetic disorders, such as sickle-cell anemia and cystic fibrosis, and environmental disorders, such as asbestosis and pneumoconiosis, cause relatively few deaths compared with complex diseases that are influenced by many genetic and environmental factors. Accordingly, it is the interaction between genome and environment that needs to be elucidated in the case of complex diseases.
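The uncertainty-factor convention Leikauf referred to can be shown in a short sketch. Dividing a point of departure by default factors of 10 for animal-to-human extrapolation and for human variability is standard regulatory practice; the NOAEL value and factor choices below are hypothetical illustrations, not figures from the symposium.

```python
"""Sketch of the traditional reference-dose (RfD) calculation that
gene-environment research aims to put on a more scientific footing.
A point of departure (for example, a NOAEL from an animal study) is
divided by the product of uncertainty factors. All numbers here are
hypothetical illustrations of the convention."""

def reference_dose(pod_mg_per_kg_day, uncertainty_factors):
    """Divide a point of departure by the product of uncertainty factors."""
    total_uf = 1
    for uf in uncertainty_factors:
        total_uf *= uf
    return pod_mg_per_kg_day / total_uf

# Hypothetical NOAEL of 5 mg/kg-day; 10x interspecies, 10x intraspecies.
rfd = reference_dose(5.0, [10, 10])
print(f"RfD = {rfd} mg/kg-day")  # prints "RfD = 0.05 mg/kg-day"
```

Data on gene-environment interactions could, in principle, replace the default intraspecies factor of 10 with a chemical-specific estimate of human variability.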
Panel Discussion

The afternoon session closed with a panel discussion that focused on the definition of "toxicity pathway," an issue raised during several presentations. Faustman noted that defining a toxicity pathway is context-dependent. For example, apoptosis is generally seen as an adverse effect, but if apoptosis did not occur during development, humans would have fins. So time, place, and response make a difference. Goodsaid added that he would be hesitant to label a pathway as a toxicity pathway in isolation. For FDA, pathway information helps the agency to make regulatory decisions; mechanistic data allow the agency to interpret other test data. Waters stated that ideally what should be identified are pathways that are consistently and measurably changed within 2 weeks (or possibly even 4 weeks) of exposure and that are indicative and predictive of some downstream outcome that is recognized as a toxicity end point. She noted, however, that her definition raises the controversy about distinguishing an adverse response from an adaptive response; until gene expression, protein abundance, or some other measure has been evaluated over the dynamic cycle, one cannot distinguish whether a change is a time-dependent expression of some adaptive response or truly a dose-dependent change indicative of toxicity. Thomas stated that his work has focused not on defining a toxicity pathway itself but on grouping pathways according to common biologic function to make predictions. Faustman noted that there are differences among the various analytic tools and approaches being used to define pathways and that the differences could affect how one defines a pathway. The tools and approaches are only as good as the data that go into them, and the scientific community has not yet developed an unbiased approach to integrating pathway information.
Rhomberg stated that toxicity pathway might not be the best descriptive term. Normal biologic processes need to be understood, as does how the processes are changed by various agents and result in toxicity. Furthermore, the processes are not linear sequences of events but networks of interactions. Thus, Rhomberg concluded that the focus should be on discovering the key pathway components and how they are integrated to make the functional modules discussed by Waters. One participant questioned, however, whether one needed to understand what a toxicity pathway was; perhaps one could use the models described during the symposium to derive a benchmark dose for a response, calculate a tissue dose, and then extrapolate to a human exposure that would drive the given response. Thomas and Kedderis agreed, but Kedderis noted that it is important to evaluate mechanisms when dealing with unknowns, agents on which few, if any, data are available. Thomas countered that one can provide guidance on reference doses or risk-specific doses by using genomic approaches and thus bridge the gap between making decisions on the basis of reasonable scientific data and not making decisions because of lack of data and simply ignoring the possible risks posed by exposure to the unknowns. He added that if genomic
profiles are obtained on six specific tissues, one can predict about 90% of all positive responses in rat and mouse bioassays and thus obtain valuable information for risk assessment.

WHAT THE FUTURE HOLDS

Where Are We Going? Or, Are We There Yet, and Where Is There?

The final session of the symposium was devoted to presentations and discussions on visions for the future and the path forward. Preuss began the session by providing a perspective on the changing landscape of risk assessment and the need for modernization. He stated that advances in the understanding of gene-environment interactions and the pending test data from the European REACH program are driving the need for change. He described two paradigms for understanding the effects of toxic chemicals. One is the human-disease model, in which genetic profiles of people with and without a disease are compared to yield fingerprints of disease and susceptibility; the other is the current animal-testing model, in which chemically induced events are matched to rodent test results and rodent modes of action. Preuss noted that the new science and technologies should allow movement between the paradigms, although the two approaches will probably progress in tandem for many years. However, the question now is how to move from assessing a few chemicals each year to assessing thousands of chemicals each year, as the REACH program anticipates. Dossiers on 40,000 chemicals are expected by 2012, and the U.S. government is ill prepared to use the volume and complexity of information resulting from that or a similar program. Anticipating the need for change, EPA sponsored several NRC reports over the last few years that focused on toxicity testing and risk assessment (for example, NRC 2007a, 2008, 2009).
Overall, those reports concluded that risk-assessment data needs cannot be met with the current testing methods, that scientists need to determine how to use the new data being generated for risk assessment, and that the transformation of risk assessment has to occur with stakeholder input. Preuss described EPA’s dual approach to developing the next generation of risk assessments. First, EPA is considering creating a high-priority list of chemicals and streamlining the process for assessment by narrowing the scope, using off-the-shelf risk approaches, and focusing and coordinating stakeholder reviews. Second, EPA is considering broadening the scope of some assessments to synthesize more information into each assessment, such as assessments on cumulative effects of agents that cause the same effect or on families of chemicals that are physically similar. EPA intends to explore the new science, methods, and policies that could be incorporated into emerging and future risk assessments with the primary goal of mapping a path forward. Preuss listed many questions that will need to be addressed, including, How can the new information best be incorporated into risk assessment and used to inform risk managers?
What new policies and procedures will be needed? How can EPA ensure that decision-makers, the courts, and Congress see the new approach as an acceptable way to proceed? EPA's strategy for the next generation of risk assessment is to implement the framework presented in the NRC report Science and Decisions (NRC 2009); to develop an operational knowledge of bioinformatics, data mining, and gene-environment databases to support risk-assessment work; and to develop prototype examples of increasingly complex assessments that are responsive to risk context and refined through discussions with scientists, risk managers, and stakeholders. Preuss concluded that EPA estimates that it may take a decade before risk assessment can rely primarily on the new advances in science, but it is necessary to begin now to address the needed changes. Bucher continued the discussion by providing a perspective from the National Institute of Environmental Health Sciences (NIEHS) and noted that current National Toxicology Program (NTP) research includes many projects that are amenable to high-throughput screening and high-content data analysis. Furthermore, NTP has made a commitment to the development and use of high-throughput screening that could be used to set priorities among chemicals for further in-depth toxicologic evaluation, to identify mechanisms of toxicity, and to develop predictive models of in vivo biologic responses in humans. NTP's commitment is consistent with its vision, articulated 5 years ago, of supporting the evolution of toxicology from an observational science to a predictive one based on a broad array of target-specific, mechanism-based biologic observations. Bucher noted that several people had commented during the symposium that tools need to be developed to handle the enormous volume of data being generated.
He described a program at NIEHS led by Christopher Portier that is attempting to create such a tool. The approach is to use knowledge about genes associated with diseases to determine pathways linked to the genes and thus link the pathways to the diseases (that is, to elucidate "disease" pathways). The next step is to use toxicogenomic and proteomic databases on well-studied chemicals to link chemicals to diseases through pathways and then to analyze the toxicity pathways to find the best points for screening, such as critical nodes or connection points. Omics and other molecular tools can then be used to validate the choices. What is ultimately created is an interaction network (see Figure 14). Bucher concluded that NTP's expectations for the 21st century are to continue to refine traditional methods and to develop new methods to generate information on mechanisms, exposure-response relationships, and life-stage and genetic susceptibility that will allow better prediction of toxicity to humans and ultimately better protection of public health; to reconcile results from new data-rich techniques, such as genomics, proteomics, and high-throughput screens, with existing testing information for conceptual validation; and to develop approaches to accomplish formal validation of new methods for human hazard and risk estimation.

FIGURE 14 Interaction network that can be used to associate environmental factors with toxicity pathways and associated human diseases. Source: Gohlke et al. 2009. Reprinted with permission; copyright 2009, BMC Systems Biology. J. Bucher, National Institute of Environmental Health Sciences, presented at the symposium.

Tina Bahadori, of the American Chemistry Council Long-Range Research Initiative, continued the discussion of the future vision and path forward by providing an industry perspective. She stated that an unprecedented opportunity exists to improve the scientific understanding of the potential effects of chemicals on human health and the environment. New technologies to test the effects of chemicals have the potential to revolutionize risk assessment, and if the work is done in a scientifically robust way, risks could be understood better, faster, and less expensively. The concern, however, is that the technology is advancing faster than the means to interpret the data accurately. Although investments are made to generate volumes of data, comparable investments to interpret the data are lacking; without investment in the "science of interpretation," the tendency will be to rely on high-throughput hazard data as a surrogate for risk assessment. Bahadori stated that scientists need to determine how information is going to be translated into a real-world context to protect public health and the environment. Information on host susceptibility and background exposures will be needed for interpretation and extrapolation of in vitro test results. Furthermore, information on actual human exposure will be needed for selection of doses for toxicity testing, both so that hazard information can be developed on environmentally relevant effects and for determination of whether concentrations that perturb
OCR for page 44
Toxicity-Pathway-Based Risk Assessment: Preparing for Paradigm Change - A Symposium Summary toxicity pathways are biologically relevant. Scientists also will need to understand the progression from exposure to effect. Exposure science will need to predict and link exposure among all levels of biologic organization and to use the new technologies to characterize exposure. Bahadori emphasized the need to invest in the new tools and technologies and to stay committed to the long-range vision, recognizing that it may be years before the new science can be used to advance risk assessment. She concluded by saying that her organization has developed a research strategy to support the vision and is investing in research on interpreting the new data being generated, on developing innovative approaches to characterize biologically relevant exposures and their relation to health risk, and on determining the genetic influences and gene-environment interactions of susceptible populations. Roger Ulrich, of Calistoga Pharmaceuticals, provided a perspective from the pharmaceutical industry on what the future holds. The key difficulties that the pharmaceutical industry faces are late-stage failures and product withdrawals, which are extremely expensive and reduce the ability to reinvest in research; the erosion of consumer confidence in and the increased consumer expectations for product safety; the paucity of new products; and the shift of risk and product development from large pharmaceutical companies to small ones. To overcome the difficulties, Ulrich stated, the industry must focus on the pipeline and do a better job of assessing products, and this requires more thorough preclinical assessment of toxicity and more research on mechanisms and affected biologic processes or pathways. The goal is to identify prognostic biomarkers—markers that tell what might happen rather than markers that tell what has happened (diagnostic biomarkers). 
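Bahadori's point about biologic relevance reduces to a simple comparison: is the concentration that perturbs a toxicity pathway in vitro anywhere near the concentration people actually experience? The sketch below illustrates that check in minimal form; the chemical names, AC50 and plasma values, the assumption that plasma concentration is a usable exposure surrogate, and the factor-of-100 flag are all hypothetical placeholders for illustration, not data or criteria from the symposium.

```python
# Hedged sketch: compare an in vitro pathway-perturbing concentration
# with an estimated human plasma concentration at real-world exposure.
# All names, values, and the 100-fold flag below are invented.

def exposure_activity_ratio(perturbing_conc_uM, estimated_plasma_uM):
    """Ratio of the concentration that perturbs a toxicity pathway in
    vitro to the plasma concentration expected from actual exposure;
    a large ratio suggests the perturbation occurs only far above
    biologically relevant concentrations."""
    if estimated_plasma_uM <= 0:
        raise ValueError("plasma estimate must be positive")
    return perturbing_conc_uM / estimated_plasma_uM

# Hypothetical screening results for three made-up chemicals.
chemicals = {
    "chem_A": {"ac50_uM": 0.5, "plasma_uM": 0.2},
    "chem_B": {"ac50_uM": 80.0, "plasma_uM": 0.01},
    "chem_C": {"ac50_uM": 5.0, "plasma_uM": 1.0},
}

for name, d in sorted(chemicals.items()):
    margin = exposure_activity_ratio(d["ac50_uM"], d["plasma_uM"])
    flag = "potentially relevant" if margin < 100 else "large margin"
    print(f"{name}: margin = {margin:g} ({flag})")
```

In practice the exposure estimate would come from reverse dosimetry or biomonitoring rather than a single fixed number, but the comparison itself is this simple.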
He also stated that the industry must focus on patients and identify at-risk people or populations in parallel with clinical development. Ulrich stated that the new technologies can help to improve the processes of drug discovery and development. They can help to identify molecular pathways involved in pharmacologic and toxicologic mechanisms; this will help in making decisions as to whether to pursue specific compounds earlier in the process. The new technologies can also help to identify potential biomarkers and to detect adverse drug effects in animals that do not necessarily result in the expression of an adverse outcome in an animal model. Regarding the patient, the new technologies can be used to understand the idiosyncrasies that may increase the risk or decrease the benefit for individual patients. For example, they can be used to identify genetic defects or susceptibilities that can lead to adverse events in response to specific drugs. Thus, the vision for the drug industry is to use contemporary tools to understand the full spectrum of drug effects on biologic pathways in the preclinical phase before formal development, to overlay drug-response networks on patient phenotype and genetic networks to understand individual patient risk, and to identify benefits, as well as risks, and apply the knowledge in prescription practices. In the next 5 years, Ulrich stated, scientists will continue to develop and explore the potential of the new tools and technologies, but the excitement will be in the development of in silico models and
application of new discoveries to the clinical setting. He concluded by listing several resources needed, including an open-source genetic and functional genomic data platform, most likely funded by government, and training for the next generation of scientists.

Helmut Zarbl, of the University of Medicine and Dentistry of New Jersey, continued the discussion with an academic perspective on the future vision and discussed the NRC report Applications of Toxicogenomic Technologies to Predictive Toxicology and Risk Assessment (NRC 2007c), which was released shortly after Toxicity Testing in the 21st Century. The committee that produced the report was asked to evaluate the status of toxicogenomics and to discuss potential applications, including applications to risk assessment. It concluded that toxicogenomic technologies have strong potential to affect decision-making but are not ready to replace existing testing regimens in risk assessment and regulatory toxicology; that toxicogenomics can provide information to be added to the weight of evidence for refining risk judgments; and that ultimately the technologies are envisioned as more sensitive and informative than existing technologies and have the potential to replace some current approaches or tests.
That committee recommended a human toxicogenomics initiative to accomplish the following tasks: create and manage a large public database for storing and integrating the results of toxicogenomic analysis with conventional toxicity-testing data; assemble toxicogenomic and conventional toxicologic data on hundreds of compounds into a single database; create a centralized national biorepository for human clinical and epidemiologic samples; further develop bioinformatic tools, such as software, analysis, and statistical tools; consider ethical, legal, and social implications of collecting and using toxicogenomic data and samples; and coordinate subinitiatives to evaluate the application of toxicogenomic technologies to the assessment of risks associated with chemical exposures.

Zarbl concluded by discussing the path forward and stated that improvements in technology and science often build on previous knowledge and that scientists should not abandon the tools and knowledge of classical toxicology and risk assessment. He continued, saying that the paradigm shift will require a reduction in reliance on apical end points, and the challenge will be to validate toxicogenomic data to ensure that they are predictive of outcomes that occur much further downstream. Thus, the path for the next several years will be to develop in vitro assays, tools, and strategies; to continue to populate public databases of curated data; to invest in systems toxicology and computational tools for pathway-based risk assessment; to incorporate toxicogenomic data into the weight of evidence for risk assessment; and to continue to explore and validate the utility of the pathway-based approach for risk assessment. Further in the future, pathway-based approaches may be used for routine hazard screening of both new and legacy compounds.
Zarbl again highlighted, however, the need to validate the new approaches before they become stand-alone processes, and he cautioned that when the assays yield negative results, care is needed to ensure that nothing has been missed.
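Zarbl's caution about negative results suggests one concrete form such validation could take: tabulating pathway-based screen calls against legacy in vivo findings and inspecting the false negatives explicitly, rather than relying on an overall agreement rate. The sketch below is a hedged illustration of that bookkeeping; the chemical identifiers and activity calls are invented, not results from any actual comparison.

```python
# Hedged sketch: concordance between a pathway-based screen and legacy
# in vivo calls, with false negatives (chemicals the screen clears but
# animal studies flagged) listed for explicit follow-up. All chemical
# identifiers and calls are invented for illustration.

def concordance(screen_calls, in_vivo_calls):
    """Compare boolean 'active' calls; return sensitivity, specificity,
    and the false negatives, which deserve the most scrutiny."""
    tp = fp = tn = 0
    false_negatives = []
    for chem, in_vivo_active in in_vivo_calls.items():
        screen_active = screen_calls[chem]
        if in_vivo_active and screen_active:
            tp += 1
        elif in_vivo_active and not screen_active:
            false_negatives.append(chem)   # the worrisome case
        elif screen_active:
            fp += 1
        else:
            tn += 1
    fn = len(false_negatives)
    sensitivity = tp / (tp + fn) if (tp + fn) else None
    specificity = tn / (tn + fp) if (tn + fp) else None
    return sensitivity, specificity, false_negatives

screen = {"c1": True, "c2": False, "c3": True, "c4": False}
in_vivo = {"c1": True, "c2": True, "c3": False, "c4": False}
sens, spec, misses = concordance(screen, in_vivo)
print(f"sensitivity={sens}, specificity={spec}, false negatives={misses}")
```

A real validation exercise would also weigh the quality of the legacy calls themselves, since the in vivo "truth" is imperfect; the point here is only that negatives are audited, not assumed safe.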
Gina Solomon, of the Natural Resources Defense Council, closed with a perspective from an environmental-advocacy group and stated that she is encouraged by the research presented at the symposium. However, she noted that many people have concerns about the new direction, and some effort will be needed to persuade the broader community that the new approach has merit. The NRC report Toxicity Testing in the 21st Century offered the hope of rapid, cost-effective, animal-sparing testing of thousands of chemicals and a chance to intervene to prevent pathway perturbations before the occurrence of disease. Many data have been generated since publication of that report, but few of them have been incorporated or used in risk assessment, and the backlog of 80,000 chemicals remains. Solomon noted that the situation is reminiscent of EPA’s Endocrine Disruptor Screening Program (EDSP), which was originally designed to screen thousands of chemicals but was delayed for more than a decade by an overly cumbersome validation process. She stated that the EDSP has not lived up to its promise, and the scientific community needs to work hard not to repeat the history of that program. She acknowledged that some effort will be required to bring the new science into the process of regulatory decision-making in a timely fashion. However, the revision and reauthorization of the Toxic Substances Control Act now under way may help to facilitate the adoption of new scientific tools. Solomon concluded that at the end of the day, if the pathway-based assays are concerned only with generating more in-depth, detailed mode-of-action data on the same subset of old chemicals, the new paradigm will fail. However, if the new approach can deal with the backlog of chemicals and foster faster, health-protective hazard identification and risk assessment, it will be heralded as an important and valuable advance.
Panel Discussion

The symposium closed with a panel discussion focused on questions about and hopes for advancing the new paradigm. Preuss expressed concern that the field is quickly becoming, if it is not already, extremely complex and that distinguishing important signals from unimportant ones is going to be challenging. Furthermore, validating the sophisticated, complex models and approaches will present another challenge. The new paradigm will not be successful if scientists create models that are intelligible only to a select few. The fear is that the research community is far outpacing the ability of the regulatory community to use the information that is generated and to make intelligent decisions about it. If that happens, much will fall by the wayside. The problem is that EPA’s research budgets have decreased each year to the point where EPA now has 50% of the purchasing power that it had 10 years ago. The available resources are not commensurate with the kind of effort needed. Zarbl noted that pathway predictions at this point may be premature given that a majority of gene expression is still not understood and that huge gaps must be addressed before the data should be used in risk assessment. Ulrich agreed
that huge gaps exist but stated that the gaps are being systematically filled. His concern was that realistic expectations be set. For example, it is unrealistic to expect that in 10 years scientists will be able to screen 200,000 chemicals in cell cultures and know the human risks on the basis of the resulting data. Bahadori added that the paradigm will fail if perfection of the science is required. Information that can inform and improve risk assessment will be available soon, if it is not already available. The scientific community needs to determine the level of knowledge needed for different purposes, that is, what is good enough for a given task. Bahadori expressed concern, however, about funding and emphasized the need to articulate the value of the current effort to ensure that resources are available. The scientific community needs to create a compelling case for requesting resources. Bucher remarked on the need expressed throughout the symposium for creating a large, publicly accessible data platform and the need for a concerted effort to overcome the inertia that might exist in creating such a platform. He stated that expectations for the new science and approaches will not be fulfilled if the necessary computational and analytic tools are not made available. Zarbl clarified that what is needed is a platform where data are curated; standards would have to be met to enter data so that they are standardized or uniform. What is being proposed is not simply a data repository. The question arose about what must be accomplished in the near term to illustrate the value of the new science. Bucher commented that the results of high-throughput screening should be used to design and guide current toxicity testing. The resulting data can help to move standard toxicology models forward.
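Zarbl's distinction between a curated platform and a mere repository implies a machine-checkable entry standard: a submission is accepted only if it satisfies the declared schema, so that everything in the store is uniform. The sketch below illustrates such a gate under an invented schema; the field names, allowed units, and the p53-reporter example are assumptions for illustration, not an actual database design discussed at the symposium.

```python
# Hedged sketch of a curation gate: a record enters the shared store
# only if it meets the entry standards. The schema below is invented.

REQUIRED_FIELDS = {"chemical_id", "assay", "endpoint", "value", "units"}
ALLOWED_UNITS = {"uM", "mg/kg-day"}

def validate_submission(record):
    """Return a list of problems; an empty list means the record meets
    the entry standards."""
    problems = [f"missing field: {f}"
                for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "units" in record and record["units"] not in ALLOWED_UNITS:
        problems.append(f"unrecognized units: {record['units']}")
    return problems

curated_store = []

def submit(record):
    problems = validate_submission(record)
    if not problems:
        curated_store.append(record)  # accepted: data are uniform
    return problems

ok = submit({"chemical_id": "CAS-00-00-0", "assay": "p53 reporter",
             "endpoint": "AC50", "value": 3.2, "units": "uM"})
bad = submit({"chemical_id": "CAS-11-11-1", "assay": "p53 reporter",
              "value": 7.0, "units": "ppm"})
print("accepted:", len(curated_store), "| rejected with:", bad)
```

A production platform would add controlled vocabularies for assays and endpoints and provenance metadata, but the principle is the same: the standard is enforced at submission time, which is what separates a curated platform from a plain repository.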
Zarbl stated that studies that demonstrate that pathway-based risk assessment can produce results at least as good as standard approaches (such as the studies described by Thomas earlier in the symposium) are the milestones needed. Ulrich noted that high priority should be given to a concerted effort to pull together the existing knowledge to create a cohesive and comprehensive output. Kenny Crump, of Environ, concurred: the key is to start applying and using the data and comparing the results with those of the current path. That would illuminate the gaps and help to differentiate between data that EPA might need and data that some other application would need. Hal Zenick, of EPA, commented that application of the new approaches may depend on what information is needed. One size does not fit all risk assessments. For example, the information needed to make a clean-up decision will be quite different from that needed to set exposure guidelines for a nationally pervasive bioaccumulative chemical. The nature of the question that needs to be answered may determine how soon the new approaches can be applied to risk assessment and risk management. Whenever the new science is finally used to make risk-assessment and risk-management decisions, several participants noted, a major challenge will be risk communication, that is, explaining the new approaches to policy-makers and the public.
REFERENCES

Amundson, S.A., K.T. Do, L.C. Vinikoor, R.A. Lee, C.A. Koch-Paiz, J. Ahn, M. Reimers, Y. Chen, D.A. Scudiero, J.N. Weinstein, J.M. Trent, M.L. Bittner, P.S. Meltzer, and A.J. Fornace, Jr. 2008. Integrating global gene expression and radiation survival parameters across the 60 cell lines of the National Cancer Institute's anticancer drug screen. Cancer Res. 68(2):415-424.
Ashburner, M., C.A. Ball, J.A. Blake, D. Botstein, H. Butler, J.M. Cherry, A.P. Davis, K. Dolinski, S.S. Dwight, J.T. Eppig, M.A. Harris, D.P. Hill, L. Issel-Tarver, A. Kasarskis, S. Lewis, J.C. Matese, J.E. Richardson, M. Ringwald, G.M. Rubin, and G. Sherlock. 2000. Gene ontology: Tool for the unification of biology. Nat. Genet. 25(1):25-29.
Beck, M.B. 2002. Model evaluation and performance. Pp. 1275-1279 in Encyclopedia of Environmetrics, A.H. El-Shaarawi and W.W. Piegorsch, eds. Chichester, UK: Wiley.
Begley, T.J., A.S. Rosenbach, T. Ideker, and L.D. Samson. 2002. Damage recovery pathways in Saccharomyces cerevisiae revealed by genomic phenotyping and interactome mapping. Mol. Cancer Res. 1(2):103-112.
Bucher, J.R., and C. Portier. 2004. Human carcinogenic risk evaluation, Part V: The National Toxicology Program vision for assessing the human carcinogenic hazard of chemicals. Toxicol. Sci. 82(2):363-366.
Collins, F.S., G.M. Gray, and J.R. Bucher. 2008. Toxicology. Transforming environmental health protection. Science 319(5865):906-907.
EC (European Commission). 2007. REACH in Brief. European Commission [online]. Available: http://ec.europa.eu/environment/chemicals/reach/pdf/2007_02_reach_in_brief.pdf [accessed Apr. 30, 2010].
EPA (U.S. Environmental Protection Agency). 2009. The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals. EPA/100/K-09/001. Office of the Science Advisor, Science Policy Council, U.S. Environmental Protection Agency, Washington, DC. March 2009 [online]. Available: http://www.epa.gov/spc/toxicitytesting/docs/toxtest_strategy_032309.pdf [accessed Apr. 30, 2010].
Fry, R.C., T.J. Begley, and L.D. Samson. 2005. Genome-wide responses to DNA-damaging agents. Annu. Rev. Microbiol. 59:357-377.
Gohlke, J.M., R. Thomas, Y. Zhang, M. Rosenstein, A.P. Davis, C. Murphy, K.G. Becker, C.J. Mattingly, and C.J. Portier. 2009. Genetic and environmental pathways to complex diseases. BMC Syst. Biol. 3:46.
Hartung, T., and M. Leist. 2008. Food for thought…on the evolution of toxicology and the phasing out of animal testing. ALTEX 25(2):91-96.
Johnson, C.D., Y. Balagurunathan, K.P. Lu, M. Tadesse, M.H. Falahatpisheh, R.J. Carroll, E.R. Dougherty, C.A. Afshari, and K.S. Ramos. 2003. Genomic profiles and predictive biological networks in oxidant-induced atherogenesis. Physiol. Genomics 13(3):263-275.
Johnson, C.D., Y. Balagurunathan, M.G. Tadesse, M.H. Falahatpisheh, M. Brun, M.K. Walker, E.R. Dougherty, and K.S. Ramos. 2004. Unraveling gene-gene interactions regulated by ligands of the aryl hydrocarbon receptor. Environ. Health Perspect. 112(4):403-412.
Judson, R., A. Richard, D.J. Dix, K. Houck, M. Martin, R. Kavlock, V. Dellarco, T. Henry, T. Holderman, P. Sayre, S. Tan, T. Carpenter, and E. Smith. 2009. The toxicity data landscape for environmental chemicals. Environ. Health Perspect. 117(5):685-695.
Lu, K.P., L.M. Hallberg, J. Tomlinson, and K.S. Ramos. 2000. Benzo(a)pyrene activates L1Md retrotransposon and inhibits DNA repair in vascular smooth muscle cells. Mutat. Res. 454(1-2):35-44.
NRC (National Research Council). 2000. Scientific Frontiers in Developmental Toxicology and Risk Assessment. Washington, DC: National Academy Press.
NRC (National Research Council). 2007a. Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: National Academies Press.
NRC (National Research Council). 2007b. Models in Environmental Regulatory Decision Making. Washington, DC: National Academies Press.
NRC (National Research Council). 2007c. Applications of Toxicogenomic Technologies to Predictive Toxicology and Risk Assessment. Washington, DC: National Academies Press.
NRC (National Research Council). 2008. Phthalates and Cumulative Risk Assessment: The Tasks Ahead. Washington, DC: National Academies Press.
NRC (National Research Council). 2009. Science and Decisions: Advancing Risk Assessment. Washington, DC: National Academies Press.
Oberdörster, G., E. Oberdörster, and J. Oberdörster. 2005. Nanotoxicology: An emerging discipline evolving from studies of ultrafine particles. Environ. Health Perspect. 113(7):823-839.
Robinson, J.F., X. Yu, S. Hong, C. Zhou, N. Kim, D. DeMasi, and E.M. Faustman. 2010. Embryonic toxicokinetic and dynamic differences underlying strain sensitivity to cadmium during neurulation. Reprod. Toxicol. 29(3):279-285.
Rudén, C. 2001. The use and evaluation of primary data in 29 trichloroethylene carcinogen risk assessments. Regul. Toxicol. Pharmacol. 34(1):3-16.
Stratton, M.R., P.J. Campbell, and P.A. Futreal. 2009. The cancer genome. Nature 458(7239):719-724.
Teeguarden, J.G., P.M. Hinderliter, G. Orr, B.D. Thrall, and J.G. Pounds. 2007. Particokinetics in vitro: Dosimetry considerations for in vitro nanoparticle toxicity assessments. Toxicol. Sci. 95(2):300-312.
Waters, K.M., L.M. Masiello, R.C. Zangar, B.J. Tarasevich, N.J. Karin, R.D. Quesenberry, S. Bandyopadhyay, J.G. Teeguarden, J.G. Pounds, and B.D. Thrall. 2009. Macrophage responses to silica nanoparticles are highly conserved across particle sizes. Toxicol. Sci. 107(2):553-569.
Yu, X., W.C. Griffith, K. Hanspers, J.F. Dillman III, H. Ong, M.A. Vredevoogd, and E.M. Faustman. 2006. A system-based approach to interpret dose- and time-dependent microarray data: Quantitative integration of gene ontology analysis for risk assessment. Toxicol. Sci. 92(2):560-577.
Yu, X., S. Hong, E.G. Moreira, and E.M. Faustman. 2009. Improving in vitro Sertoli cell/gonocyte co-culture model for assessing male reproductive toxicity: Lessons learned from comparisons of cytotoxicity versus genomic responses to phthalates. Toxicol. Appl. Pharmacol. 239(3):325-336.
Yu, X., J.F. Robinson, J.S. Sidhu, S. Hong, and E.M. Faustman. 2010. A system-based comparison of gene expression reveals alterations in oxidative stress, disruption of ubiquitin-proteasome system and altered cell cycle regulation after exposure to cadmium and methylmercury in mouse embryonic fibroblast (MEF). Toxicol. Sci. 114(2):356-377.