Suggested Citation:"Summary of the Symposium." National Research Council. 2010. Toxicity-Pathway-Based Risk Assessment: Preparing for Paradigm Change: A Symposium Summary. Washington, DC: The National Academies Press. doi: 10.17226/12913.

TOXICITY-PATHWAY-BASED RISK ASSESSMENT


PREPARING FOR PARADIGM CHANGE


A Symposium Summary


Summary of the Symposium

In 2007, a committee of the National Research Council (NRC) proposed a vision that embraced recent scientific advances and set a new course for toxicity testing (NRC 2007a). The committee envisioned a new paradigm in which biologically important perturbations in key toxicity pathways would be evaluated with new methods in molecular biology, bioinformatics, computational toxicology, and a comprehensive array of in vitro tests based primarily on human biology. Although some view the vision as too optimistic with respect to the promise of the new science and debate the time required to implement the vision, no one can deny that a revolution in toxicity testing is under way. New approaches are being developed, and data are being generated. As a result, the U.S. Environmental Protection Agency (EPA) expects a large influx of data that will need to be evaluated. EPA is also faced with tens of thousands of chemicals for which toxicity information is incomplete, as well as with emerging chemicals and substances that will need risk assessment and possible regulation. Therefore, the agency asked the NRC Standing Committee on Risk Analysis Issues and Reviews to convene a symposium to stimulate discussion on the application of the new approaches and data in risk assessment.

The standing committee was established in 2006 at the request of EPA to plan and conduct a series of public workshops that could serve as a venue for discussion of issues critical for the development and review of objective, realistic, and scientifically based human health risk assessment. An ad hoc planning committee was formally appointed under the oversight of the standing committee to organize and conduct the symposium. The biographies of the standing committee and planning committee members are provided in Appendixes A and B, respectively.

The symposium was held on May 11-13, 2009, in Washington, DC, and included presentations and discussion sessions on pathway-based approaches for hazard identification, applications of new approaches to mode-of-action analyses, the challenges to and opportunities for risk assessment in the changing paradigm, and future directions. The symposium agenda, speaker and panelist biographies, and presentations are provided in Appendixes C, D, and E, respectively. The symposium also included a poster session to showcase examples of
how new technologies might be applied to quantitative and qualitative aspects of risk assessment. The poster abstracts are provided in Appendix F. This summary provides the highlights of the presentations and discussions at the symposium. Any views expressed here are those of the individual committee members, presenters, or other symposium participants and do not reflect any findings or conclusions of the National Academies.

A PARADIGM CHANGE ON THE HORIZON

Warren Muir, of the National Academies, welcomed the audience to the symposium and stated that the environmental-management paradigm of the 1970s is starting to break down with recent scientific advances and the exponential growth of information, and that the symposium should be seen as the first of many discussions on the impact of advances in toxicology on risk assessment. He introduced Bernard Goldstein, of the University of Pittsburgh, chair of the Standing Committee on Risk Analysis Issues and Reviews, who stated that although the standing committee does not make recommendations, symposium participants should feel free to suggest how to move the field forward and to make research recommendations. Peter Preuss, of EPA, concluded the opening remarks and emphasized that substantial changes are on the horizon for risk assessment. The agency will soon be confronted with enormous quantities of data from high-throughput testing and from Europe's REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) regulation, which requires testing of thousands of chemicals. He urged the audience to consider the question, What is the future of risk assessment?

Making Risk Assessment More Useful in an Era of Paradigm Change

E. Donald Elliott, of Yale Law School and Willkie Farr & Gallagher LLP, addressed issues associated with acceptance and implementation of the new pathway approaches that will usher in the paradigm change. He emphasized that simply building a better mousetrap does not ensure its use, and he provided several examples in which innovations, such as movable type and the wheel, were not adopted until centuries later. He felt that ultimately innovations must win the support of a user community to be successful, so the new tools and approaches should be applied to problems that the current paradigm has difficulty in addressing. Elliott stated that the advocates of pathway-based toxicity testing should illustrate how it can address the needs of a user community, such as satisfying data requirements for REACH; providing valuable information on sensitive populations; evaluating materials, such as nanomaterials, that are not easily evaluated in typical animal models; and demonstrating that fewer animal tests are needed if the approaches are applied. He warned, however, that the new approaches will not be as influential if they are defined as merely less expensive screening techniques.

Elliott continued by saying that the next steps needed to effect the paradigm change will be model evaluation and judicial acceptance. NRC (2007b) and Beck (2002) set forth a number of questions to consider in evaluating a model, such as whether the results are accurate and whether the model faithfully represents the system being modeled. The standards for judicial acceptance differ between agency reviews and private damage cases: the standards for agency reviews are much more lenient than those in private cases, in which a judge must determine whether an expert’s testimony is scientifically valid and applicable. Accordingly, the best approach for judicial acceptance would be to have a record established on the basis of judicial review of agency decisions, in which a court generally defers to the agency when decisions involve determinations at the frontiers of science. Elliott stated that the key issue is to create a record showing that the new approach works as well as or better than existing methods in a particular regulatory application. He concluded, however, that the best way to establish acceptance might be for EPA to use its broad rule-making authority under Section 4 of the Toxic Substances Control Act to establish what constitutes a valid testing method in particular applications.

Emerging Science and Public Health

Lynn Goldman, of Johns Hopkins Bloomberg School of Public Health, a member of the standing committee and the planning committee, discussed the public-health aspects of the emerging science and potential challenges. She agreed with Elliott that a crisis is looming, given the number of chemicals that need to be evaluated and the perception that the process for ensuring that commercial chemicals are safe is broken and needs to be re-evaluated. The emerging public-health issues are compounding the sense of urgency in that society will not be able to take 20 years to make decisions. Given the uncertainties surrounding species extrapolation, dose extrapolation, and evaluation of sensitive populations today, the vision provided in the NRC report Toxicity Testing in the 21st Century: A Vision and a Strategy offers tremendous promise. However, Goldman used the example of EPA’s Endocrine Disruptor Screening Program as a cautionary tale. In 1996, Congress passed two laws, the Food Quality Protection Act and the Safe Drinking Water Act, that directed EPA to develop a process for screening and testing chemicals for endocrine-disruptor potential. Over 13 years, while three advisory committees have been formed, six policy statements have been issued, and screening tests have been modified four times, no tier 2 protocols have been approved, and only one list of 67 pesticides to be screened has been generated. One of the most troubling aspects is that most of the science is now more than 15 years old. EPA lacked adequate funding, appropriate expertise, enforceable expectations by Congress, and the political will to push the
program forward. The fear that a chemical would be blacklisted on the basis of a screening test and the “fatigue factor,” in which supporters eventually tire and move on to other issues, compounded the problems. Goldman suggested that the following lessons should be learned from the foregoing example: support is needed from stakeholders, administration, and Congress for long-term investments in people, time, and resources to develop and implement new toxicity-testing approaches and technologies; strong partnerships within the agency and with other agencies, such as the National Institutes of Health (NIH), are valuable; new paradigms will not be supported unless there are convincing proof-of-concept and verification studies; and new processes are needed to move science into regulatory science more rapidly. Goldman concluded that the new approaches and technologies have many potential benefits, including improvement in the ability to identify chemicals that have the greatest potential for risk, the generation of more scientifically relevant data on which to base decisions, and improved strategies of hazard and risk management. However, she warned that resources are required to implement the changes: not only funding but highly trained scientists will be needed, and the pipeline of scientists who will be qualified and capable of doing the work needs to be addressed.

Toxicity Testing in the 21st Century

Kim Boekelheide, of Brown University, who was a member of the committee responsible for the report Toxicity Testing in the 21st Century: A Vision and a Strategy, reviewed the report and posed several questions to consider throughout the discussion in the present symposium. The committee was formed when frustration with toxicity-testing approaches was increasing. Boekelheide cited various problems with the current approaches, including low throughput, high cost, questionable relevance to actual human risks, use of conservative defaults, and reliance on animals. Thus, the committee was motivated by the following design criteria for its vision: to provide the broadest possible coverage of chemicals, end points, and life stages; to reduce the cost and time of testing; to minimize animal use and suffering; and to develop detailed mechanistic and dose-response information for human health risk assessment. The committee considered several options, which are summarized in Table 1. Option I was essentially the status quo, option II was a tiered approach, and options III and IV were fundamental shifts in the current approaches. Although the committee acknowledged option IV as the ultimate goal for toxicity testing, it chose option III to represent the vision for the next 10-20 years. That approach is a fundamental shift—one that is based primarily on human biology, covers a broad range of doses, is mostly high-throughput, is less expensive and time-consuming, uses substantially fewer animals, and focuses on perturbations of critical cellular responses.


TABLE 1 Options for Future Toxicity-Testing Strategies Considered by the NRC Committee on Toxicity Testing and Assessment of Environmental Agents

| Option I: In Vivo | Option II: Tiered In Vivo | Option III: In Vitro and In Vivo | Option IV: In Vitro |
|---|---|---|---|
| Animal biology | Animal biology | Primarily human biology | Primarily human biology |
| High doses | High doses | Broad range of doses | Broad range of doses |
| Low throughput | Improved throughput | High and medium throughput | High throughput |
| Expensive | Less expensive | Less expensive | Less expensive |
| Time-consuming | Less time-consuming | Less time-consuming | Less time-consuming |
| Relatively large number of animals | Fewer animals | Substantially fewer animals | Virtually no animals |
| Apical end points | Apical end points | Perturbations of toxicity pathways | Perturbations of toxicity pathways |
|  | Some in silico and in vitro screens | In silico screens possible | In silico screens |

Source: Modified from NRC 2007a. K. Boekelheide, Brown University, presented at the symposium.

Boekelheide described the components of the vision, which are illustrated in Figure 1. The core component is toxicity testing, in which toxicity-pathway assays play a dominant role. The committee defined a toxicity pathway as a cellular-response pathway that, when sufficiently perturbed, is expected to result in an adverse health effect (see Figure 2), and it envisioned a toxicity-testing system that evaluates biologically important perturbations in key toxicity pathways by using new methods in computational biology and a comprehensive array of in vitro tests based on human biology. Boekelheide noted that since release of the report, rapid progress in human stem-cell biology, better accessibility to human cells, and development of bioengineered tissues have made the committee’s vision more attainable. He also noted that the toxicity-pathway approach moves away from extrapolation from high dose to low dose and from animals to humans but involves extrapolation from in vitro to in vivo and between levels of biologic organization. Thus, there will be a need to build computational systems-biology models of toxicity-pathway circuitry and pharmacokinetic models that can predict human blood and tissue concentrations under specific exposure conditions.
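The in-vitro-to-in-vivo extrapolation step described above can be illustrated with a minimal one-compartment pharmacokinetic sketch that converts a constant intake into a predicted steady-state blood concentration for comparison with an in vitro point of departure. All parameter values below are hypothetical and chosen only for illustration; real applications use far richer physiologically based models.

```python
# Minimal one-compartment sketch: predict the steady-state blood
# concentration produced by a constant daily intake, then compare it
# with an in vitro assay concentration. All values are hypothetical.

def steady_state_conc_uM(dose_mg_per_kg_day, clearance_L_per_kg_day, mw_g_per_mol):
    """C_ss (umol/L) = intake rate / clearance, converted from mg/L."""
    c_mg_per_L = dose_mg_per_kg_day / clearance_L_per_kg_day
    return c_mg_per_L / mw_g_per_mol * 1000.0  # mg/L -> umol/L

# Hypothetical chemical: 0.01 mg/kg-day intake, clearance 5 L/kg-day, MW 200 g/mol
css = steady_state_conc_uM(0.01, 5.0, 200.0)
in_vitro_ac50_uM = 3.0  # hypothetical half-maximal assay concentration

print(f"predicted steady-state blood concentration: {css:.4f} uM")
print(f"margin between assay AC50 and predicted exposure: {in_vitro_ac50_uM / css:.0f}x")
```

The ratio in the last line is one simple way a pathway-assay result could be put in the context of realistic human exposures.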

FIGURE 1 Components of the vision described in the report, Toxicity Testing in the 21st Century: A Vision and a Strategy. Source: NRC 2007a. K. Boekelheide, Brown University, presented at the symposium.

FIGURE 2 Perturbation of cellular response pathway, leading to adverse effects. Source: Modified from NRC 2007a. K. Boekelheide, Brown University, modified from symposium presentation.


Boekelheide stated that the vision offers a toxicity-testing system more focused on human biology with more dose-relevant testing and the possibility of addressing many of the frustrating problems in the current system. He listed some challenges with the proposed vision, including development of assays for the toxicity pathways, identification and testing of metabolites, use of the results to establish safe levels of exposure, and training of scientists and regulators to use the new science. Boekelheide concluded by asking several questions for consideration throughout the symposium program: How long will it take to implement the new toxicity-testing paradigm? How will adaptive responses be distinguished from adverse responses? Is the proposed approach a screening tool or a stand-alone system? How will the new paradigm be validated? How will new science be incorporated? How will regulators handle the transition in testing?

Symposium Issues and Questions

Lorenz Rhomberg, of Gradient Corporation, a member of the standing committee and chair of the planning committee, closed the first session by providing an overview of issues and questions to consider throughout the symposium. Rhomberg stated that the new tools will enable and require new approaches. Massive quantities of multivariate data are being generated, and this poses challenges for data handling and interpretation. The focus is on “normal” biologic control and processes and the effects of perturbations on those processes, and a substantial investment will be required to improve understanding in fundamental biology. More important, our frame of reference has shifted dramatically: traditional toxicology starts with the whole organism, observes apical effects, and then tries to explain the effects by looking at changes at lower levels of biologic organization, whereas the new paradigm looks at molecular and cellular processes and tries to explain what the effects on the whole organism will be if the processes are perturbed.

People have different views on the purposes and applications of the new tools. For example, some want to use them to screen out problematic chemicals in drug, pesticide, or product development; to identify chemicals for testing and the in vivo testing that needs to be conducted; to establish testing priorities for data-poor chemicals; to identify biomarkers or early indicators of exposure or toxicity in the traditional paradigm; or to conduct pathway-based evaluations of causal processes of toxicity. Using the new tools will pose challenges, such as distinguishing between causes and effects, dissecting complicated networks of pathways to determine how they interact, and determining which changes are adverse effects rather than adaptive responses. However, the new tools hold great promise, particularly for examining how variations in the population affect how people react to various exposures.


Rhomberg concluded with some overarching questions to be considered throughout the symposium: What are the possibilities of the new tools, and how do we realize them? What are the pitfalls, and how can we avoid them? How is the short-term use of the new tools different from the ultimate vision? When should the focus be on particular pathways rather than on interactions, variability, and complexity? How is regulatory and public acceptance of the new paradigm to be accomplished?

THE NEW SCIENCE

An Overview

John Groopman, of Johns Hopkins Bloomberg School of Public Health, began the discussion of the new science by providing several examples of how it has been used. He first discussed the Keap1-Nrf2 signaling pathway, which is sensitive to a variety of environmental stressors. Keap1-Nrf2 signaling pathways have been investigated by using knockout animal models, and the investigations have provided insight into how the pathways modulate disease outcomes. Research has shown that different stressors in Nrf2 knockout mice affect different organs; that is, one stressor might lead to a liver effect, and another to a lung effect. Use of knockout animals has allowed scientists to tease apart some of the pathway integration and has shown that the signaling pathways can exhibit very large dynamic ranges in response to activation, in the 20,000-fold range.

Groopman stated, however, that some of the research has provided cautionary tales. For example, when scientists evaluated the value of an aflatoxin-albumin biomarker to predict which rats were at risk for hepatocellular carcinoma, they found that the biomarker concentration tracked with the disease at the population level but not in the individual animals. Thus, one may need to be wary of the predictive value of a single biomarker for a complex disease. In another case, scientists thought that overexpression of a particular enzyme in a signaling pathway would lead to risk reduction, but they found that transgene overexpression had no effect on tumor burden. Overall, the research suggests that a reductionist approach might not work for complex diseases. Groopman acknowledged substantial increases in the sensitivity of mass spectrometry over the last 10 years but noted that the throughput in many cases has not increased, and this is often an underappreciated and underdiscussed aspect of the new paradigm.

Groopman concluded by discussing the recent data on cancer genomes. Sequence analysis of cancer genomes has shown that different types of cancer, such as breast cancer and colon cancer, are not the same disease, and although there are common mutations within the same cancer type, the disease differs among individuals. Through sequence analysis, the number of confirmed genetic contributors to common human diseases has increased dramatically since 2000. Genome-wide association studies have shown that many alleles have modest
effects in disease outcomes, that many genes are involved in each disease, that most genes that have been shown to be involved in human disease were not predicted on the basis of current biologic understanding, and that many risk factors are in noncoding regions of the genome. Sequencing methods and technology have improved dramatically, and researchers who once dreamed of sequencing the human genome in a matter of years can now complete the task in a matter of days (see Figure 3). Groopman concluded by stating that the sequencing technology needs to be extended to experimental models so that questions about the concordance between effects observed in people and those observed in experimental models can be answered.

Gene-Environment Interactions

George Leikauf, of the University of Pittsburgh, discussed the current understanding of gene-environment interactions. In risk assessment, human variability and susceptibility are considered, and an uncertainty factor of 10 has traditionally been used to account for these factors. However, new tools available today are helping scientists to elucidate gene-environment interactions, and this research may provide a more scientific basis for evaluating human variability and susceptibility in the context of risk assessment. Leikauf noted that genetic disorders, such as sickle-cell anemia and cystic fibrosis, and environmental disorders, such as asbestosis and pneumoconiosis, cause relatively few deaths compared with complex diseases that are influenced by many genetic and environmental factors. Accordingly, it is the interaction between genome and environment that needs to be elucidated in the case of complex diseases.
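The traditional default-factor calculation Leikauf refers to can be made concrete in a couple of lines. The sketch below applies the standard tenfold factors for animal-to-human extrapolation and for human variability to a no-observed-adverse-effect level (NOAEL); the NOAEL value itself is hypothetical.

```python
# Traditional reference-dose calculation with default uncertainty factors,
# as referenced in the text. The NOAEL value here is hypothetical.

def reference_dose(noael_mg_per_kg_day, uf_interspecies=10.0, uf_intraspecies=10.0):
    """RfD = NOAEL / (animal-to-human UF x human-variability UF)."""
    return noael_mg_per_kg_day / (uf_interspecies * uf_intraspecies)

# A NOAEL of 5 mg/kg-day with both default 10x factors gives 0.05 mg/kg-day.
rfd = reference_dose(5.0)
print(rfd)  # 0.05
```

Research on gene-environment interactions aims to replace the default intraspecies factor with a data-derived estimate of human variability.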

FIGURE 3 DNA sequencing output. Current output is 1-2 billion bases per machine per day. The human genome contains 3 billion bases. Source: Stratton et al. 2009. Reprinted with permission; copyright 2009, Nature. J. Groopman, Johns Hopkins Bloomberg School of Public Health, presented at the symposium.


Leikauf continued, saying that evaluating genetic factors that affect pharmacokinetics or pharmacodynamics can provide valuable information for risk assessment. For example, genetic variations that lead to differences in carrier or transporter proteins can affect chemical absorption rates and determine the magnitude of a chemical’s effect on the body. Genetic variations that lead to differences in metabolism may also alter a chemical’s effect on the body. For example, if someone’s metabolism is such that a chemical is quickly converted to a reactive intermediate and then slowly eliminated, the person may be at greater risk because of the longer residence time of the reactive intermediate in the body. Thus, the relative rates of absorption and metabolism can be used to evaluate variability and susceptibility and can provide some scientific basis for selection of uncertainty factors. Leikauf noted, however, that determining the physiologic and pharmacologic consequences of the many genetic polymorphisms is difficult. He discussed several challenges to using genetic information to predict outcome. For example, not all genes are expressed or cause a given phenotype even if they are expressed. Thus, knowing one particular polymorphism does not mean knowing the likelihood of an outcome. Leikauf concluded, however, that the next step in genetics is to use the powerful new tools to understand the complexity and how it leads to diversity.
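Leikauf's point about fast formation and slow elimination of a reactive intermediate can be sketched with a simple two-step kinetic model (parent to intermediate to eliminated). The simulation below, with hypothetical rate constants, shows that slower elimination increases the intermediate's cumulative exposure (area under the curve), which is the mechanism he describes for greater risk.

```python
# Sketch of the residence-time argument: parent -> intermediate -> eliminated,
# integrated with forward Euler. All rate constants are hypothetical.

def intermediate_auc(k_form, k_elim, p0=1.0, dt=0.001, t_end=50.0):
    """Area under the intermediate's concentration-time curve."""
    p, i, auc, t = p0, 0.0, 0.0, 0.0
    while t < t_end:
        dp = -k_form * p              # parent converted to intermediate
        di = k_form * p - k_elim * i  # intermediate formed and eliminated
        p += dp * dt
        i += di * dt
        auc += i * dt
        t += dt
    return auc

fast_elimination = intermediate_auc(k_form=2.0, k_elim=1.0)
slow_elimination = intermediate_auc(k_form=2.0, k_elim=0.1)
print(f"AUC with fast elimination: {fast_elimination:.2f}")
print(f"AUC with slow elimination: {slow_elimination:.2f}")  # roughly tenfold larger
```

For this linear chain the limiting AUC is the initial parent amount divided by the elimination rate constant, so a tenfold slower elimination yields roughly a tenfold larger internal exposure to the reactive species.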

Tools and Technologies for Pathway-Based Research

Ivan Rusyn, of the University of North Carolina at Chapel Hill, discussed various tools and technologies that are now available for pathway-based research. He noted that the genomes of more than 180 organisms have been sequenced since 1995 and that although determining genetic sequence is important, understanding how we are different from one another may be more important. High-throughput sequencing—some of which can provide information on gene regulation and control by incorporating transcriptome analysis—has enabled the genome-wide association studies already discussed at the symposium and has provided valuable information on experimental models, both whole-animal and in vitro systems.

Rusyn described various tools and technologies available at the different levels of biologic organization and noted that the throughput potential for data acquisition diminishes as data relevance increases (see Figure 4). Single-molecule-based screening can involve cell-free systems or cell-based systems. In the case of cell-free systems, many of the concepts have been known for decades, but technologic advances have enabled researchers to evaluate classes of proteins, transporters, nuclear receptors, and other molecules and to screen hundreds of chemicals in a relatively short time. Miniaturization of cell-based systems has allowed researchers to create high-throughput formats that allow evaluation of P450 inhibition, metabolic stability, cellular toxicity, and enzyme induction. Screening with cell cultures has advanced rapidly as a result of robotic technologies and high-content plate design, and concentration-response
profiles on multiple phenotypes can now be generated quickly. Much effort is being invested in developing engineered tissue assays, some of which are being used by the pharmaceutical industry as screening tools. Finally, screening that uses invertebrates, such as Caenorhabditis elegans, and lower vertebrates, such as zebrafish, has been used for years, but scientists now have the ability to generate transgenic animals and to screen environmental chemicals in high-throughput or medium-throughput formats to evaluate phenotypes.
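The concentration-response profiles generated by these screens are commonly summarized with a Hill model, from which a half-maximal concentration (EC50) is read off. The sketch below evaluates a Hill curve and recovers its EC50 by bisection on a log scale; all parameter values are hypothetical.

```python
import math

# Hill-model summary of a concentration-response profile, with the
# half-maximal concentration recovered numerically. Values are hypothetical.

def hill(conc, ec50, n, top=100.0, bottom=0.0):
    """Percent of maximal response at a given concentration (uM)."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** n)

def find_ec50(response_fn, lo=1e-4, hi=1e4, target=50.0):
    """Bisect (on a log scale) for the concentration giving the target response."""
    for _ in range(200):
        mid = math.sqrt(lo * hi)  # geometric mean = midpoint in log space
        if response_fn(mid) < target:
            lo = mid
        else:
            hi = mid
    return mid

curve = lambda c: hill(c, ec50=2.5, n=1.8)
print(f"recovered EC50: {find_ec50(curve):.3f} uM")  # ~2.5
```

In practice the model is fitted to noisy well-plate data rather than evaluated exactly, but the EC50 and Hill slope are the quantities carried forward into profiling databases.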

Rusyn described seminal work with knockout strains of yeast that advanced pathway-based analysis (see, for example, Begley et al. 2002; Fry et al. 2005). High-throughput screens were used to identify pathways involved in response to chemicals that damaged DNA. Since then, multiple transcription-factor analyses have further advanced our knowledge of important pathways and have allowed scientists to rediscover “old” biology with new tools and technology. Rusyn noted, however, that it is difficult to go from gene expression to a whole pathway. A substantial volume of data is being generated, and the major challenge is to integrate all the data—chemical, traditional toxicologic, omics, and high-throughput screening data—to advance our biologic understanding. Rusyn concluded that the complexity of science today creates an urgent need to train new scientists and develop new interdisciplinary graduate programs.

FIGURE 4 Throughput potential for data acquisition as related to levels of biologic organization. As the human relevance increases the throughput potential decreases. Source: NIEHS, unpublished data. I. Rusyn, University of North Carolina at Chapel Hill, presented at the symposium.


PATHWAY-BASED APPROACHES FOR HAZARD IDENTIFICATION

ToxCast: Redefining Hazard Identification

Robert Kavlock, of EPA, opened the afternoon session by discussing ToxCast, an EPA research program. He stated that a substantial problem is the lack of data on chemicals. In a recent survey (Judson et al. 2009), EPA identified about 10,000 high-priority chemicals in EPA’s program offices; found huge gaps in data on cancer, reproductive toxicity, and developmental toxicity; and found no evidence in the public domain of safety or hazard data on more than 70% of the identified chemicals. Kavlock noted that this problem is not restricted to the United States; a better job must also be done internationally to eliminate the chemical information gap. He emphasized that at this stage, priorities must be set for testing of the chemicals. The options include conducting more animal studies, using exposure as a priority-setting metric, using structure-activity models, and using bioactivity profiling, which would screen chemicals by using high-throughput technologies (see Figure 5). The ToxCast program was designed to implement the fourth option, and the name attempts to capture the key goals of the program: to cast a broad net to capture the bioactivity of the chemicals and to try to forecast the toxicity of the chemicals. The ToxCast program is part of EPA’s contribution to the Tox21 Consortium (Collins et al. 2008), a partnership of the NIH Chemical Genomics Center (NCGC), EPA’s Office of Research and Development, and the National Toxicology Program (NTP) to advance the vision proposed in the NRC report (NRC 2007a). Kavlock also noted that EPA responded to that report by issuing a strategic plan for evaluating the toxicity of chemicals that included three goals: identifying toxicity pathways and using them in screening, using toxicity pathways in risk assessment, and making an institutional transition to incorporate the new science.
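The priority-setting logic behind bioactivity profiling, in which data-poor chemicals that are active in more assays (and at lower concentrations) rise to the top of the testing queue, can be sketched as follows. The chemical names, assay names, scoring rule, and AC50 values are all hypothetical illustrations, not ToxCast's actual algorithm or data.

```python
# Sketch of priority setting by bioactivity profiling. Chemicals with hits
# in more high-throughput assays, and more potent hits, rank higher for
# follow-up testing. All names and values are hypothetical.

# assay results: chemical -> {assay: AC50 in uM, or None if inactive}
profiles = {
    "chem_A": {"nuclear_receptor": 0.5, "stress_response": 2.0, "cytotoxicity": None},
    "chem_B": {"nuclear_receptor": None, "stress_response": None, "cytotoxicity": 80.0},
    "chem_C": {"nuclear_receptor": 10.0, "stress_response": 5.0, "cytotoxicity": 30.0},
}

def priority_score(hits):
    """More active assays and lower (more potent) AC50s give a higher score."""
    active = [ac50 for ac50 in hits.values() if ac50 is not None]
    if not active:
        return 0.0
    potency = sum(1.0 / ac50 for ac50 in active)
    return len(active) + potency

ranked = sorted(profiles, key=lambda c: priority_score(profiles[c]), reverse=True)
print(ranked)  # the broadly or potently active chemicals outrank the mostly inactive one
```

A production system would weight assays by pathway relevance and adjust for cytotoxicity-driven false positives; the point here is only the ranking principle.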

FIGURE 5 Illustration of bioactivity profiling using high-throughput technologies to screen chemicals. Source: EPA 2009. R. Kavlock, U.S. Environmental Protection Agency, presented at the symposium.



Kavlock provided further details on the ToxCast program. It is a research program that was started by the National Center for Computational Toxicology and was developed to address the chemical screening and priority-setting needs for inert pesticide components, antimicrobial agents, drinking-water contaminants, and high- and medium-production-volume chemicals. The ToxCast program is currently the most comprehensive use of high-throughput technologies, at least in the public domain, to elucidate predictive chemical signatures. It is committed to stakeholder involvement and public release of the data generated. The program components are identifying toxicity pathways, developing high-throughput assays for them, screening chemical libraries, and linking the results to in vivo effects. Each component involves challenges, such as incorporating metabolic capabilities into the assays, determining whether to link assay results to effects found in rodent toxicity studies or to human toxicity, and predicting effective in vivo concentrations from effective in vitro concentrations. Kavlock described the three phases of the program (see Table 2) and noted that it is completing the first phase, proof-of-concept, and preparing for the second phase, which involves validation. He mentioned that it has developed a relational database, ToxRefDB, that contains animal toxicology data that will serve as the in vivo “anchor” for the ToxCast predictions.

Kavlock stated that 467 biochemical and cellular assays (see Table 3) are being used to evaluate chemicals, but the expectation is that a larger number of assays will eventually be used. Multiple assays and technologies are used to evaluate each end point, and initial results have been positive in that they agree with what is known about the chemicals being tested. Kavlock concluded that the future of screening is here, and the challenge is to interpret all the data being generated. He predicted that the first application will be use of the data to set priorities among chemicals for targeted testing and that application to risk assessment will follow as more knowledge is gained from its initial use.

Practical Applications: Pharmaceuticals

William Pennie, of Pfizer, discussed screening approaches in the pharmaceutical industry and provided several examples of their use. Pennie noted that implementing new screening paradigms and using in silico and in vitro approaches may be easier in the pharmaceutical industry because of the ultimate purpose—screening out unpromising drug candidates early in the research phase as opposed to screening in environmental chemicals whose toxicity needs to be evaluated. Huge challenges are still associated with using these approaches in the pharmaceutical industry, however, and Pennie emphasized that academe, regulatory agencies, and industry need to collaborate to build the necessary infrastructure. Otherwise, only incremental change will be made in developing and implementing pathway-based approaches.


TABLE 2 Phased Development of ToxCast Program

Phase^a | Number of Chemicals | Chemical Criteria | Purpose | Number of Assays | Cost per Chemical | Target Date
Ia | 320 | Data-rich (pesticides) | Signature development | >500 | $20,000 | FY 2007-2008
Ib | 15 | Nanomaterials | Pilot | 166 | $10,000 | FY 2009
IIa | >300 | Data-rich chemicals | Validation | >400 | ~$20,000-25,000 | FY 2009
IIb | >100 | Known human toxicants | Extrapolation | >400 | ~$20,000-25,000 | FY 2009
IIc | >300 | Expanded structure and use diversity | Extension | >400 | ~$20,000-25,000 | FY 2010
IId | >12 | Nanomaterials | PMN | >200 | ~$15,000-20,000 | FY 2009-2010
III | Thousands | Data-poor | Prediction and priority-setting | >300 | ~$15,000-20,000 | FY 2011-2012

^a Since the symposium, phases IIa, IIb, and IIc have been merged into a single endeavor. Source: R. Kavlock, U.S. Environmental Protection Agency, presented at the symposium.


TABLE 3 Types of ToxCast Assays

Biochemical Assays
  • Protein families: GPCR, NR, kinase, phosphatase, protease, other enzyme, ion channel, transporter
  • Assay formats: radioligand binding, enzyme activity, coactivator recruitment

Cellular Assays
  • Cell lines: HepG2 human hepatoblastoma, A549 human lung carcinoma, HEK 293 human embryonic kidney
  • Primary cells: human endothelial cells, monocytes, keratinocytes, fibroblasts, renal proximal tubule cells, small-airway epithelial cells
  • Biotransformation-competent cells: primary rat hepatocytes, primary human hepatocytes
  • Assay formats: cytotoxicity, reporter gene, gene expression, biomarker production, high-content imaging for cellular phenotype

Source: R. Kavlock, U.S. Environmental Protection Agency, presented at the symposium.

Pennie stated that some of the pathway-based approaches have been applied more successfully in the later stages of drug development than in the early, drug-discovery phase. One problem in developing the new approaches is that scientists often focus on activation of one pathway rather than considering the complexity of the system. Pennie stated that pathway knowledge should be added to a broader understanding of the biology; thus, the focus should be on a combination of properties rather than on one specific feature. Although the pharmaceutical industry is currently using in vitro assays that are typically functional end-point assays, Pennie noted that there is no reason why those assays could not be supplemented or replaced with pathway-based assays, given a substantial investment in validation. He said that the industry is focusing on using batteries of in vitro assays to predict in vivo outcomes, similar to the ToxCast program, and described an effort at Pfizer to develop a single-assay platform that would evaluate multiple end points simultaneously and provide a single predictive score for hepatic injury. Seven assays were evaluated by using 500 compounds that spanned the classes of hepatic toxicity. Researchers were able to develop a multiparameter optimization model that determined the combination of assays that would yield the most predictive power. On the basis of that analysis, they identified a combination of assays—a general cell-viability assay followed by an optimized imaging-based hepatic platform that measured several end points—that resulted in over 60% sensitivity and about 90% specificity. Pennie emphasized the value of integrating the pathway information into the testing cascade. If an issue is identified with a chemical, that knowledge can guide the in vivo testing and, instead of a fishing expedition, scientists can test a hypothesis. Pfizer has also developed a multiparameter optimization model that uses six physicochemical properties to characterize permeability, clearance, and safety and that helps to predict the success of a drug candidate. Pennie concluded by saying that the future challenge is to develop prediction models that combine data from multiple sources (that is, structural-alert data, physicochemical data, in vitro test data, and in vivo study data) to provide a holistic view of compound safety.
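The sensitivity and specificity figures Pennie cites are standard confusion-matrix quantities. The sketch below shows how such figures are computed for binary assay-battery calls against known in vivo outcomes; the compound calls and labels are invented for illustration and are not Pfizer data:

```python
def sensitivity_specificity(predictions, labels):
    """Sensitivity (true-positive rate) and specificity (true-negative rate)
    for binary assay-battery calls against known in vivo outcomes."""
    tp = sum(p and l for p, l in zip(predictions, labels))
    tn = sum(not p and not l for p, l in zip(predictions, labels))
    fn = sum(not p and l for p, l in zip(predictions, labels))
    fp = sum(p and not l for p, l in zip(predictions, labels))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical calls for 10 compounds (True = flagged as hepatotoxic)
calls = [True, True, False, True, False, False, False, True, False, False]
truth = [True, True, True, False, False, False, False, True, False, False]
sens, spec = sensitivity_specificity(calls, truth)  # 0.75 and about 0.83
```

In a real evaluation the labels would come from curated hepatotoxicity data on hundreds of compounds, as in the 500-compound set Pennie describes, and the optimization would search over which assays to include.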

Practical Applications: Consumer Products

George Daston, of Procter and Gamble, discussed harnessing the available computational power to support new approaches to toxicology to solve some problems in the consumer-products industry. He noted that the new paradigm is a shift from outcome-driven toxicology, in which models are selected to evaluate a particular disease state without knowledge about the events from exposure to outcome, to mechanism-driven toxicology, in which scientists seek answers to several questions: How does the chemical interact with the system? What is the mechanism of action? How can we predict what the outcome would be on the basis of the mechanism? The transition to mechanism-driven toxicology will be enabled by the 50 years of data from traditional toxicology, the ability to do high-throughput and high-content biology, and the huge computational power currently available.

Daston provided two examples of taking advantage of today’s computational power. First, his company needed a system to make initial predictions about safety without testing every new chemical entity. A chemical database was developed to search chemical substructures and identify analogues that might help to predict the toxicity of untested chemicals. A process was then developed in which a chemist first reviews a new compound and designs a reasonable search strategy, the computer then searches enormous volumes of data for specific patterns, and the output is finally evaluated according to expert rules based on physical chemistry, metabolism, reactivity, and toxicity to support testing decisions. Daston mentioned several public databases (DSSTox, ACToR, and ToxRefDB) that are available for conducting similar searches and emphasized the importance of public data-sharing.
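The analogue-search step Daston describes can be sketched as a similarity ranking over substructure features. The chemical names and feature sets below are hypothetical stand-ins for the fingerprints and curated data a real read-across system would use:

```python
def jaccard(a, b):
    """Similarity between two substructure-feature sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b)

def find_analogues(query, database, threshold=0.5):
    """Rank known chemicals by feature overlap with the query; chemicals above
    the threshold are candidates for toxicity read-across."""
    hits = [(name, jaccard(query, feats)) for name, feats in database.items()]
    return sorted([h for h in hits if h[1] >= threshold], key=lambda h: -h[1])

# Hypothetical substructure features (a real system would use chemical fingerprints)
db = {
    "chem_A": {"phenol", "chloro", "ester"},
    "chem_B": {"phenol", "nitro"},
    "chem_C": {"amine", "sulfonate"},
}
new_compound = {"phenol", "chloro", "amide"}
analogues = find_analogues(new_compound, db)  # only chem_A passes the threshold
```

The expert-rules step would then weigh whether each analogue’s toxicity data plausibly transfer, given metabolism and reactivity.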


His second example involved analysis of high-content datasets from microarrays in which all potential mechanisms of action of a particular chemical are evaluated as changes in gene expression. That approach complements the one discussed by Kavlock for the ToxCast program in that it is a detailed analysis of one assay rather than a scan of multiple types of assays. Daston and others focused on steroid-hormone mechanisms to evaluate whether gene-expression analysis (that is, genomics) could predict those mechanisms. Steroid-hormone mechanisms were chosen because research has shown that effects regulated by estrogen, androgen, and other steroid hormones depend on gene expression. That is, a chemical binds to a receptor; the receptor complex migrates to the nucleus, binds to specific sites on the DNA, and causes upregulation or downregulation of specific genes; and this change in gene expression causes the observed cellular and tissue response. They found not only that chemicals that act by the same mechanism of action affect the same genes in the same direction but also that the magnitude of the changes is the same as long as the chemicals are matched for pharmacologic activity. Thus, they found that genomics could be used quantitatively to improve dose-response assessments. Daston stated that genomics can be used to accelerate mechanistic understanding, and the information gained can be used to determine whether similar kinds of effects can be modeled in an in vitro system. One surprising discovery was how well the results extrapolated, not only from in vivo to in vitro but from species to species. Daston concluded by saying that once the critical steps in a toxicologic process are known, quantitative models can be built to predict behavior at various levels of organization.
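The comparison Daston describes—whether two chemicals affect the same genes in the same direction and with the same magnitude—can be sketched as a correlation of gene-wise fold-change profiles. The gene panel and fold-change values below are invented for illustration:

```python
import math

def pearson(x, y):
    """Correlation of two gene-wise log2 fold-change profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical log2 fold changes for five hormone-responsive genes
profile_estrogenic_1 = [2.1, -1.3, 0.8, 1.7, -0.4]
profile_estrogenic_2 = [1.9, -1.1, 0.9, 1.5, -0.5]  # same mechanism, matched potency
profile_unrelated = [-0.2, 1.4, -0.9, 0.1, 1.1]

r_same = pearson(profile_estrogenic_1, profile_estrogenic_2)  # close to 1
r_diff = pearson(profile_estrogenic_1, profile_unrelated)     # low or negative
```

A high correlation between potency-matched profiles is the signature that supports mechanistic binning; real analyses would span thousands of genes with statistical controls for multiple testing.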

Practical Applications: Mixtures

John Groten, of Schering-Plough, discussed current approaches and possible applications of the new science to mixtures risk assessment. He noted that especially in toxicology research in the pharmaceutical industry (but also in the food and chemical industries) there is an increasing need for parallel and efficient processes to assess compound classes, more alternatives to animal testing, tiered approaches that link toxicokinetics and toxicodynamics, enhanced use of systems biology in toxicology, and an emphasis on understanding interactions, combined action, and mixtures risk assessment. Today, risk assessments in the food, drug, and chemical industries attempt to evaluate and incorporate mixture effects, but the processes for doing so are case-driven and relatively simplistic. For example, adding hazard quotients to ensure that a sum does not exceed a threshold might be a beginning, but the likelihood of joint exposure and the possibility that compounds affect the same target system need to be assessed qualitatively and, preferably, quantitatively. Although research has been conducted on toxicokinetics and toxicodynamics of mixtures, most publications have dealt with toxicokinetic interactions. Groten stated that toxicokinetics should be used to correct for differences in exposure to mixture components but, because of a lack of mechanistic understanding in the toxicodynamic phase, not to predict toxic interactions. He noted that empirical approaches are adequate as a starting point but that in many cases these models depend on mathematical laws rather than biologic laws, and he recommended that mechanistic understanding be used to fine-tune experiments and to test or support empirical findings.
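The hazard-quotient summation Groten calls a beginning is usually expressed as a hazard index. A minimal sketch with hypothetical exposure and reference-dose values:

```python
def hazard_index(exposures, reference_doses):
    """Sum of hazard quotients (exposure / reference dose) across mixture components.

    An index above 1 flags the mixture for closer, preferably mechanistic, review;
    the screen assumes dose additivity and says nothing about interactions.
    """
    return sum(exposures[c] / reference_doses[c] for c in exposures)

# Hypothetical three-component mixture (mg/kg-day)
exposures = {"A": 0.02, "B": 0.01, "C": 0.005}
rfds = {"A": 0.1, "B": 0.05, "C": 0.05}
hi = hazard_index(exposures, rfds)  # 0.2 + 0.2 + 0.1, about 0.5
```

As Groten notes, such arithmetic ignores whether the components actually reach the same target system, which is where toxicokinetic and mechanistic data must enter.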

Groten listed several challenges for mixtures research, including the difficulty of using empirical models and conventional toxicology to show the underlying sequence of events in joint action, the adequacy (or inadequacy) of conventional toxicity end points to provide a wide array of testable responses at the no-observed-adverse-effect level, and the inability of current models to distinguish kinetic and dynamic interactions. He concluded by noting that the health effects of chemical mixtures are mostly related to specific interactions at the molecular level and that the application of functional genomics (sequencing, genotyping, transcriptomics, proteomics, and metabolomics) will provide new insights and advance the risk assessment of mixtures. He echoed the need that previous speakers raised for the use of multidisciplinary teams with statisticians, bioinformaticians, molecular biologists, and others to conduct future research in this field.

Pathway-Based Approaches: A European Perspective

Thomas Hartung, of the Center for Alternatives to Animal Testing, provided a European perspective on pathway-based approaches and reviewed the status of the European REACH program. He noted that regulatory toxicology is a business; toxicity testing with animals in the European Union is an $800 million/year business that employs about 15,000 people. The data generated, however, are not always helpful for reaching conclusions about toxicity. For example, one study examined 29 risk assessments of trichloroethylene and found that four concluded that it was a carcinogen, 19 were equivocal, and six concluded that it was not a carcinogen (Rudén 2001). Hartung stated that one problem is that the system today is a patchwork to which every health scare over decades has added a patch. For example, the thalidomide disaster resulted in a requirement for reproductive-toxicity testing. Many patches are 50-80 years old, and there is no way to remove a patch because international guidelines have been created and are extremely difficult to change once they have been agreed on. Another problem is that animal models are limited—humans are not 70-kg rats. However, cell cultures are also limited; metabolism and defense mechanisms are lacking, the fate of test compounds is unknown, and dedifferentiation is favored by the growth conditions. Thus, the current system is far from perfect.

Hartung discussed the REACH initiative and noted that it constitutes the biggest investment in consumer safety ever undertaken. The original projection was that REACH would involve 27,000 companies in Europe (one-third of the world market) but affect the entire global market in that it also covers imported chemicals and that it would result in the assessment of at least 30,000 chemicals (see Figure 6 for an overview of the chemical registration process). Given that about 5,000 chemicals have been assessed in 25 years, the program goal is quite ambitious. REACH, however, is much bigger than originally expected: by December 2008, 65,000 companies had submitted over 2.7 million preregistrations on 144,000 substances. The feasibility of REACH is now being reassessed.

Alternative methods clearly will be needed to provide the necessary data. REACH, however, requires companies to review all existing information on a chemical and to make optimal use of in vitro systems and in silico modeling. Animal testing is considered a last resort and can be conducted only with authorization by the European Chemicals Agency. Hartung stated that one problem is determining how to validate new testing approaches. It is not known how predictive the existing animal tests are for human health effects, so it does not make sense to validate truly novel approaches against animal tests. Hartung said that the key problem for REACH will be the need for reproductive-toxicity testing, which will represent 70% of the costs of REACH and involve more than 80% of the animals that will be used. That problem will mushroom because few facilities are capable of conducting the testing. The bigger challenges, however, are the number of false positives that will result from the testing and the need to determine which chemicals truly represent a threat to humans. Hartung concluded that a revolution, the construction of something new, is needed rather than an evolution, the replacement of parts or pieces one by one. The worst mistake would be to integrate small advances into the existing system. The new technologies offer tremendous opportunities for a new system (see Figure 7) that can overcome the problems that we face today.
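Hartung’s concern about false positives follows from simple Bayesian arithmetic: when true toxicants are rare among the chemicals screened, even an accurate test yields mostly false positives. A sketch with illustrative, not REACH-specific, numbers:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a positive test result reflects a true toxicant,
    given test performance and the prevalence of true toxicants."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A test with 90% sensitivity and 90% specificity, applied to a universe
# in which only 5% of chemicals are truly toxic
ppv = positive_predictive_value(0.9, 0.9, 0.05)  # about 0.32: most positives are false
```

The same arithmetic explains why follow-up testing to separate true threats from false alarms dominates the workload of any large screening program.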

FIGURE 6 Overview of chemical registration for REACH. *Registration includes chemicals that are suspected of being carcinogens, mutagens, or reproductive toxicants and have production volumes of at least 1 ton and chemicals that are considered persistent and have production volumes of at least 100 tons. Source: Modified from EC 2007. Reprinted with permission; copyright 2007, European Union. T. Hartung, Johns Hopkins University, modified from symposium presentation.


FIGURE 7 Integration of new approaches for toxicology. Source: Hartung and Leist 2008. Reprinted with permission; copyright 2008, ALTEX. T. Hartung, Johns Hopkins University, presented at the symposium.


Panel Discussion

The afternoon session closed with a panel discussion that focused on data gaps, pitfalls, and research needs. Kavlock commented that scientists need data to validate the systems, such as data from the pharmaceutical industry, which has extensive human and animal toxicology data on pharmaceutical agents. David Jacobson-Kram, of the Food and Drug Administration (FDA), stated that FDA will soon be joining the efforts of other federal agencies on high-throughput screening and pathway profiling and may be able to provide extensive data. He continued by saying that developing a battery of short-term tests that uses changes in gene-expression patterns to predict human carcinogenic potential will revolutionize toxicology. He cautioned, however, that the tests need to be validated; negative results may simply represent the lack of metabolic activation in a system or the inability of water-insoluble compounds to reach their target. Charles Auer, retired from EPA, emphasized the substantial resources needed for such an effort.

Leikauf stated that a critical problem will be interpretation of the data. Kavlock noted that the goal of ToxCast is to determine the probability that a chemical will cause a particular adverse health effect. The data must be generated and provided to scientists so that they can evaluate them and determine whether the system is working. Hartung agreed with Kavlock that scientists will deal with probabilities rather than bright lines (that is, point estimates). Frederic Bois, a member of the standing committee, reminded the audience that determining the probability that a chemical causes an effect is hazard assessment, not risk assessment; risk assessment requires a dose-response component, which is where the issue of metabolism becomes critically important. Goldstein noted that although probabilities might be generated, regulators will draw bright lines.

Kavlock stated that a key difference between the current system and the new approaches is the scale of information. Substantially more information will be generated with the new approaches, and the hope is that this information will drive intelligent targeted testing that allows interpretation of data for risk assessment. Several symposium participants emphasized that the discussion of data interpretation and probability highlighted the need to educate the public on the new science and its implications. Linking upstream biomarkers or effects with downstream effects will be critical.

APPLICATION TO MODE-OF-ACTION ANALYSIS

What Is Required for Acceptance?

John Bucher, of the NTP, opened the morning session of the second day by exploring the relationship between toxicity pathways and modes of action and questions surrounding validation. He noted that the concept of mode of action arose from frustration over the inability to describe the biologic pathway of an outcome at the molecular level. Instead, mode of action describes a series of “key events” that lead to an outcome; key events are measurable effects in experimental studies and can be compared among studies. Bucher stated that toxicity pathways are the contents of the “black boxes” described by the modes of action and that key toxicity pathways will be identified with the help of toxicogenomic data and genetic-association studies that examine relationships between genetic alterations and human diseases. He contrasted toxicity pathways and mode of action: mode of action accommodates a less-than-complete mechanistic understanding, allows and requires considerable human judgment, and provides for conceptual cross-species extrapolation; toxicity pathways accommodate unbiased discovery, can provide integrated dose-response information, may allow more precise mechanistic “binning,” and can reveal a spectrum of responses. Bucher stated that acceptance of modes of action and toxicity pathways is complicated by various “inconvenient truths.” For mode of action, it is not a trivial task to lay out the key events for an outcome, and inconsistencies sometimes plague associations, for example, in the case of hepatic tumors in PPAR-alpha knockout mice that have been exposed to peroxisome-proliferating agents. For toxicity pathways, scientists are evaluating the worst-case scenario; chemicals are applied to cells that have lost their protective mechanisms, so the chances of positive results are substantially increased. Furthermore, cells begin to deteriorate quickly; if an effect requires time to be observed, a cellular assay may not be conducive to detecting it.

Bucher stated that Tox21—a collaboration of EPA, NTP, and NCGC—has been remarkably successful, with each group bringing its own strengths to the effort; this collaboration should make important contributions to the advancement of the new science. However, many goals will need to be met for acceptance of the toxicity-pathway approach as the new paradigm, and until some of the goals have been reached, the scientific community cannot adequately know what will be needed for acceptance. Bucher noted that the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) and the Interagency Coordinating Committee on Validation of Alternative Methods (ICCVAM) were established in 2000 to facilitate development, scientific review, and validation of alternative toxicologic test methods and were charged to ensure that new and revised test methods are validated to meet the needs of federal agencies. The law that created NICEATM and ICCVAM set a high hurdle for validation of new or revised methods, but NICEATM and ICCVAM have put forth a 5-year plan to evaluate high-throughput approaches and facilitate development of alternative test methods. Bucher concluded, saying that “at some point toxicologists will have to decide when our collective understanding of adverse biological responses in…in vitro assays…has advanced to the point that data from these assays would support decisions that are as protective of the public health as are current approaches relying on the results of the two-year rodent bioassay” (Bucher and Portier 2004).

Environmental Disease: Evaluation at the Molecular Level

Kenneth Ramos, of the University of Louisville, described a functional-genomics approach to unraveling the molecular mechanisms of environmental disease and used his research on polycyclic aromatic hydrocarbons (PAHs), specifically benzo[a]pyrene (BaP), as a case study. Ramos noted that a challenge in elucidating chemical toxicity is that chemicals can interact in multiple ways to cause toxicity, so the task is not defining key events but understanding the interrelationships and interactions of all the key events. BaP is no exception in potentially causing toxicity through multiple mechanisms; it is a prototypical PAH that binds to the aryl hydrocarbon receptor (AHR), deregulates gene expression, and is metabolized by CYP450 to intermediates that cause DNA damage and oxidative stress. Ramos stated that his laboratory has focused on using computational approaches to understand genomic data and construct biologic networks, which will provide clues to BaP toxicity. He interjected that the notion of pathway-based toxicity may be problematic because intersecting pathways all contribute to the ultimate biologic outcome, so the focus should be on understanding networks.

He said that taking advantage of genomics allowed his laboratory to identify three major molecular events: reactivation of the L1 retroelement (Lu et al. 2000), activation of inflammatory signaling (Johnson et al. 2003), and inhibition of genes involved in the immune response (Johnson et al. 2004). The researchers then began to investigate the observation that BaP activated repetitive genetic sequences known as retrotransposons. Retrotransposons are mobile elements in the genome, propagate through a copy-and-paste mechanism, and use reverse transcriptase and RNA intermediates. L1s are the most abundant and best-characterized retrotransposons, make up about 17% of mammalian DNA by mass, and mediate genome-wide changes via insertional and noninsertional mechanisms. They may cause a host of adverse effects in humans and animals because their ability to copy themselves allows them to insert randomly throughout the genome.

Ramos described the work on elucidating L1 regulatory networks by using genomics and stated that the key was identifying nodes where multiple pathways appeared to overlap. He and his co-workers used RNA-silencing approaches to knock down specific proteins, such as the AHR, so that they could investigate the effect on the biologic network, and they concluded that the repetitive sequences are an important molecular target for PAHs. His laboratory has now turned to trying to understand the epigenetic basis of regulation of repetitive sequences and how PAHs might affect those regulatory control mechanisms in cells. The idea that biologic outcomes are affected by disruption of epigenetic events adds another layer of complexity to the story of environmental disease. It means that in addition to the direct actions of the chemical, the state of regulation of endogenous systems is important for understanding the biologic response. Ramos concluded that L1 is linked to many human diseases—such as chronic myeloid leukemia, Duchenne muscular dystrophy, colon cancer, and atherosclerosis—and that research has shown that environmental agents, such as BaP, regulate the cellular expression of L1 by transcriptional mechanisms, DNA methylation, and histone covalent modifications. Thus, the molecular machinery involved in silencing and reactivating retroelements not only is important in environmental responses but might play a prominent role in defining disease outcomes.

Dioxin: Evaluation of Pathways at the Molecular Level

Alvaro Puga, of the University of Cincinnati College of Medicine, used dioxin as an example to discuss molecular pathways in disease outcomes. Dioxin (2,3,7,8-tetrachlorodibenzo-p-dioxin [TCDD]) is a contaminant of Agent Orange—a herbicide used during the Vietnam War—that has been linked with numerous health effects. Some effects are characterized as antiproliferative, such as the antiestrogenic, antiandrogenic, and immunosuppressive effects; others are proliferative, such as cancer; and the remainder are characterized as effects on differentiation and development, such as birth defects. The effects of dioxin are primarily receptor-dependent. Dioxin binds to the AHR, a ligand-activated transcription factor. The resulting complex translocates to the nucleus and binds with the AHR nuclear translocator to form a complex that then binds to DNA-responsive elements on the genome; that binding induces gene expression. Dioxin is not the only ligand to bind to the AHR, and that raises the question of whether results from one ligand can be extrapolated to all ligands. That has essentially been done for several classes of halogenated aromatic hydrocarbons—polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans, and polychlorinated biphenyls—with dioxin as the reference compound. AHR ligands have substantially different potencies to activate AHR-dependent gene expression, and their ability to produce toxicity appears to depend on their metabolic stability.

Puga stated that most toxic effects of dioxin are mediated by the AHR. For example, research has shown that AHR-deficient mice are resistant to dioxin-induced cytotoxicity and teratogenicity and that AHR-deficient zebrafish are resistant to dioxin-induced cardiac edema. The AHR has been implicated in many signaling pathways, and thus dioxin has the potential for disrupting other signaling pathways, such as MAP-kinase pathways and pathways associated with the cell cycle. Research indicates that cellular conditions determine whether the effects will be proliferative or antiproliferative. Puga stated that his laboratory is working to map the AHR regulatory network by varying the genotype of the cells (that is, using cells that have a wild-type receptor and cells that have a point mutation in the receptor that prevents binding to DNA) and asking the question, What is the target of the AHR at the whole-genome level? He said that recent research indicates that dioxin causes massive deregulation of homeobox and differentiation genes, so scientists should be critically investigating the developmental outcomes associated with dioxin exposure.

Systems-Level Approaches for Understanding Nanomaterial Biocompatibility

Brian Thrall, of the Pacific Northwest National Laboratory, discussed the challenges in evaluating mode of action and conducting hazard assessment of nanomaterials, and he provided examples of approaches from his laboratory to address the challenges. Nanotechnology will soon affect all aspects of society; current estimates are that sales from products that incorporate nanotechnology will reach $3 trillion by 2015. Although there are no documented cases of human toxicity of or disease caused by nanomaterials, concern has arisen because other types of particles and fibers have been linked to human disease, and comparisons have recently been made between asbestos fibers and carbon nanotubes. Thrall noted that if hazard assessments of the nanoproducts currently on the market were conducted using chronic bioassays, it could cost over $1 billion and take 30-50 years. Clearly, rapid screening approaches that lead to a small number of chronic bioassays or other in vivo testing would dramatically reduce the cost and time required to test the products.

Thrall stated that nanomaterials are difficult to evaluate because they are engineered materials that are made on the scale of biologic molecules and could therefore interact with biologic pathways in many complex ways. Critical challenges in nanotoxicology revolve around addressing fundamental questions of exposure, dose, and mode of action. Focusing on dose, Thrall stated that a number of reports have indicated that the toxicity of particles depends on size: smaller particles tend to be more toxic on an equal-mass basis. However, Oberdorster et al. (2005) showed that using mass as the basis of comparison may not reveal much about chemical potency and biologic reactivity. The question then becomes what dose metric—mass, particle number, or surface area—is the most informative. Thrall stated that research with amorphous silica in his laboratory showed that surface area was the most appropriate dose metric in experiments evaluating cytotoxicity (Waters et al. 2009). Using genomics, he and co-workers also showed that the predominant dose-response pattern for gene expression depends on surface area and is independent of particle size. Furthermore, reverse transcription polymerase chain reaction (RT-PCR) showed that for more than 75% of the genes identified, the magnitude of expression correlated better with nominal surface area than with mass or particle number.
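The difference among the dose metrics Thrall compared can be illustrated with a short calculation. The sketch below is illustrative only (not the analysis used in his laboratory); it assumes monodisperse spherical particles and a density of 2.2 g/cm³, a value typical of amorphous silica but an assumption here:

```python
import math

def dose_metrics(mass_ug, diameter_nm, density_g_cm3=2.2):
    """Convert a nominal mass dose into particle-number and surface-area
    metrics, assuming monodisperse spheres (illustrative values only)."""
    r_cm = (diameter_nm * 1e-7) / 2                # radius in cm
    particle_mass_ug = density_g_cm3 * (4.0 / 3.0) * math.pi * r_cm**3 * 1e6
    n_particles = mass_ug / particle_mass_ug
    surface_area_cm2 = n_particles * 4.0 * math.pi * r_cm**2
    return n_particles, surface_area_cm2

# The same 10-ug mass dose of 10-nm and 500-nm particles differs
# enormously in particle number and total surface area, which is why
# equal-mass comparisons can obscure potency differences.
n_small, sa_small = dose_metrics(10.0, 10)
n_large, sa_large = dose_metrics(10.0, 500)
```

For a fixed mass, total surface area scales inversely with diameter, so the 10-nm dose here presents 50 times the surface area of the 500-nm dose.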

Thrall then asked whether any biologic processes can be attributed to size dependence; that is, do the chemical and physical properties that make nanomaterials commercially attractive cause unique biologic responses? His laboratory investigated that question by conducting gene-set enrichment analyses—a statistical approach in which the ontologic attributes of a gene set are compared. He and co-workers found that the major cellular processes affected by 10-nm and 500-nm silica were identical; none of over 1,000 biologic processes identified was statistically different as a function of particle size. So for amorphous silica, there was no compelling evidence that new biologic processes arise as a function of size at the nanoscale. His research on amorphous silica also showed how high-content data can be used to address some fundamental questions concerning nanomaterials.
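Gene-set enrichment of the kind Thrall described is commonly based on a hypergeometric test. The sketch below is a generic implementation of that idea, not the specific statistics used in his study; the gene counts in the example are invented:

```python
from math import comb

def enrichment_p(hits, set_size, n_sig, genome_size):
    """One-sided hypergeometric p-value: the probability of observing at
    least `hits` significant genes within a gene set of `set_size` genes,
    when `n_sig` of `genome_size` measured genes are significant overall."""
    total = comb(genome_size, n_sig)
    upper = min(set_size, n_sig)
    return sum(
        comb(set_size, k) * comb(genome_size - set_size, n_sig - k)
        for k in range(hits, upper + 1)
    ) / total

# A 40-gene biologic process capturing 12 of 300 significant genes
# (out of 20,000 measured) would be strongly enriched.
p = enrichment_p(12, 40, 300, 20000)
```

Comparing such p-values across processes for particles of different sizes is one way to ask, as Thrall did, whether any biologic process is statistically different as a function of particle size.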

Thrall concluded his presentation by noting a few other challenges that arise with nanomaterials. First, engineered particles are not as simple as soluble chemicals because such physical forces as gravity, diffusion, and convection act on them, particularly in cell-culture systems, and can influence dose and changes in dose (see Figure 8). For example, the dose measured for a particle that settles slowly or hardly at all could differ significantly from the dose measured for a dense particle that settles quickly in the culture. Dosimetry models for the in vitro systems need to be validated so that mode-of-action studies can be anchored by biologic dose. Second, nanomaterials adsorb proteins in biologic systems (see Figure 9), and this could have an important effect on disposition of nanoparticles and biologic response to the nanoparticles. Structure-based modeling might provide insight on principles that guide protein interaction with nanomaterials and might serve in the future as a screening tool for evaluating relationships between alterations in surface chemistry and toxicity. Thrall noted the importance of developing hybrid quantitative structure-activity relationship models that integrate structural, chemical, and biologic data and stated that recent work has shown that specific aspects of surface area, such as the presence of polar functional groups, rather than total surface area alone are important for toxicity. He said that although this symposium is not focused on exposure, there is a need for more information on potential exposure to set priorities among and validate nanotoxicity studies.

FIGURE 8 Dosimetry considerations in cell systems. Source: Teeguarden et al. 2007. Reprinted with permission; copyright 2007, Society of Toxicology. B. Thrall, Pacific Northwest National Laboratory, presented at the symposium.


FIGURE 9 What do cells see? Protein adsorption by nanomaterials is a universal phenomenon in biologic systems. Source: B. Thrall, unpublished data, Pacific Northwest National Laboratory, presented at the symposium. Reprinted with permission; copyright 2007, Pacific Northwest National Laboratory.



Panel Discussion

The morning session closed with a panel discussion that focused on the use of the new science in regulatory decision-making. Thrall commented that nanomaterials may not be a good case study because there are still relatively few data on them, and he emphasized the need to collect exposure data, particularly to determine the magnitude of exposure and the materials to which people are being exposed. That information is needed to allow priority-setting for research. Bucher stated that dioxin and dioxin-like compounds may be good examples for incorporating the new science and noted that recent research has supported the toxic equivalency factors established by the World Health Organization for predicting carcinogenic outcome. He added that research has shown that it will not be sufficient simply to evaluate pathways and that other information on timing and persistence will need to be integrated with pathway data.

Jonathan Wiener, of Duke University, addressed policy and legal aspects of incorporating the new science in agency decision-making. He considered the scenario in which EPA based a decision, such as a decision to regulate a chemical, on the new toxicity-pathway-based science or on a combination of traditional and new testing methods. He first noted that the agency would face internal review in the executive branch by the Office of Management and Budget's Office of Information and Regulatory Affairs (OIRA) and possibly by the Office of Science and Technology Policy (OSTP). Wiener suggested that the field may be open for some new and interesting approaches to guide agency science and decision-making, in light of recent actions of President Obama’s administration, such as his call for a new executive order on regulatory oversight by OIRA and his memorandum on improving agency science and strengthening OSTP’s role.

After internal review within the executive branch, Wiener observed, the next hurdle would be judicial review. He argued that, although courts can be skeptical of new scientific methods in civil tort liability lawsuits, judicial review of agency science may be more deferential, especially when agencies are acting “at the frontiers of science.” Several regulatory statutes now call for agencies to use the “best available science” or the “latest scientific knowledge,” and a court could be convinced that toxicity-pathway approaches constitute the best and latest science. Furthermore, the Supreme Court has recently held that if the agency provides a persuasive reason for changing its policy or its basis of decision-making, the courts will be receptive to the change even if the reason is not the one that a court would have given. Thus, an agency seeking to rely in whole or in part on toxicity-pathway-based approaches for making regulatory decisions ought to give a good explanation of why these new methods are valuable.

Finally, Wiener pointed out that the question of what constitutes an “adverse effect” may be pivotal. Yet, as others have discussed during this symposium, responses observed in toxicity-pathway studies may not always indicate an adverse effect as opposed to, for example, an adaptive effect. Wiener’s research indicates that the term adverse effect has been used in hundreds of federal statutes and thousands of judicial opinions since 1970, but it is almost never defined. Wiener suggested that EPA try to provide a thorough and tractable interpretation of what an adverse effect is and how the new toxicity-pathway testing methods can demonstrate such an effect.

Participants then discussed whether a framework that would facilitate the use of the new data could be developed or whether it was too soon to use the data. Thrall noted that the development of a framework and advancing the new science would depend on fields outside toxicology, such as improving computational abilities to handle various approaches and assumptions. Ramos, however, stated that there is now technology that allows a pathway-based approach to classification, to increase understanding of modes of action, to gain insight into biologic outcome, and ultimately to predict safety. He cautioned that one has to temper that optimism with reality and recognize that a system that has checks and balances to minimize error must be built, because we do not yet know whether we can make predictions on the basis of the new science with a given level of certainty. Elliott agreed, emphasizing that the issue should not be framed as an all-or-nothing decision, and suggested that the agency should take a relatively simple, well-understood system, establish the pathway-based approach for it, and then build on that precedent. Ramos stated that there are some chemicals, such as arsenic and PAHs, with which that approach could be taken. Wiener underscored Elliott’s point and stated that in the near term toxicity-pathway-based approaches should be combined with whole-animal tests and human epidemiology so that reviewing bodies, such as OIRA and the courts, become comfortable with the information as providing a fuller picture rather than as a replacement at this stage. Other symposium participants echoed the idea of pushing forward and using and applying the data that have been collected, and one noted that no single assay is going to give a yes-or-no answer for a risk assessment or substantive decision; all the information will need to be integrated, with the best interpretive skills and scientific judgment applied, to answer the important questions.

CHALLENGES AND OPPORTUNITIES FOR RISK ASSESSMENT IN THE CHANGING PARADIGM

Dose and Temporal Response

Elaine Faustman, of the University of Washington, opened the afternoon session by discussing datasets and tools available to examine dose and temporal response and what is needed to move forward. Faustman discussed the creation of gene ontologies and noted the paper by Ashburner et al. (2000) as critical in advancing the field. Three categories—biologic process (goal or objective), molecular function (elemental activity or task), and cellular component (location or complex)—have been defined, and each category has a structured, controlled vocabulary. Figure 10 provides an example of a gene ontology and shows the equivalent genes for three species for a specific biologic process. For risk assessment, gene ontologies provide an outstanding opportunity to use genomic information for cross-species comparisons.

FIGURE 10 Example of gene ontology for DNA metabolism, a biologic process. Similar ontologies can be built for molecular function and cellular component. Source: Ashburner et al. 2000. Reprinted with permission; copyright 2000, Nature Genetics. E. Faustman, University of Washington, presented at the symposium.

Several years ago, Faustman and co-workers recognized that decision rules were needed to evaluate dose- and time-dependent genomic data. Consequently, a system-based framework to interpret those data was developed (Yu et al. 2006). That framework (see Figure 11) has been used to identify potential signaling pathways versus single genes significantly changed after exposure. Once biologic processes are linked to pathway changes, one can begin to evaluate deviations from the normal patterns of gene expression that result from chemical exposure (Yu et al. 2006). That approach has been applied to metals, phthalates, and sulfur mustard (Robinson et al. 2010; Yu et al. 2006, 2009, 2010).

Faustman noted the report Scientific Frontiers in Developmental Toxicology and Risk Assessment (NRC 2000) and summarized the three main points of the report: signaling is used in almost every developmental event, about 17 pathways of cell-cell signaling are responsible for all of development, and the 17 pathways are highly conserved among metazoa. The report was valuable because it identified pathways involved in early, middle, and late development and laid the foundation for future work. Faustman commented that when evaluating pathways, one needs to consider not only whether the pathway is present but what the signal means (that is, a signaling pathway in one organism may have roles different from those in another organism). Furthermore, not only whether a pathway is expressed but when it is expressed and how it is expressed makes a difference. Like other speakers, Faustman emphasized the inter-relatedness of some of the pathways and the relationship of the network to general function.

Returning to the topic of risk assessment, Faustman noted observations from studies in her laboratory on the effects of metal exposure on mouse development. The studies found that metals affect genes involved in the Wnt signaling pathway, that multiple transcription-factor families are affected by metals, and that more than 50% of the genes affected were uncharacterized at the pathway level; the latter finding indicates that much work still needs to be done to determine the link between gene changes and pathways and the importance of the gene changes. Faustman closed by listing several needs, including new tools for evaluating quantitative genomic response at multiple levels of biologic organization, kinetic and dynamic models that can allow for integration at various organization levels, better characterization of variability in genomic data, discussion of and consensus on how responses and changes in responses should be considered for effect-level assessment, and discussion of approaches to evaluate responses to early low-dose exposures versus responses at increasing complexity and decreased specificity.

FIGURE 11 Framework for interpretation of dose- and time-dependent genomic data. Source: Yu et al. 2006. Reprinted with permission; copyright 2006, Toxicological Sciences. E. Faustman, University of Washington, presented at the symposium.



Application of Genomic Dose-Response Data to Define Mode of Action and Low-Dose Behavior of Chemical Toxicants

Russell Thomas, of the Hamner Institutes for Health Sciences, provided a series of practical applications of genomic data to risk assessment. He noted several aspects of genomics that make it applicable to risk assessment. First, gene-expression microarray technology is more than a decade old, and multiple studies have demonstrated the sensitivity and reproducibility of the current generation of microarrays. Second, genomic technology is capable of broadly evaluating transcriptional changes (genome-wide) and of focusing on changes in individual genes and pathways. Third, gene-expression microarray analysis can provide insight into the dose required to affect cellular processes and the underlying biology of dose-dependent transitions. Thomas commented, however, that there is still no consensus on how to use genomic information in risk assessment.

In his first example, Thomas described an experiment from his laboratory in which the dose-response changes in gene expression of several chemical carcinogens, previously tested by NTP, were evaluated with transcriptomic data, and the results were compared with tumor-incidence data. For the experiment, groups of female B6C3F1 mice were exposed to one of five doses of a given carcinogen for 90 days, the transcriptional changes in target tissues were evaluated with whole-genome microarrays from Affymetrix, and the dose-response changes in gene expression were examined with a pathway-based approach. Specifically, the researchers calculated benchmark doses (BMDs) for individual genes on the basis of their inherent variability; grouped genes on the basis of biologic function, such as their role in proliferation, apoptosis, and metabolism; and finally calculated a summary value for the particular pathway. They found good correlation not only between the BMDs for the pathway-based transcriptional responses, such as cell proliferation and DNA damage response, and tumor incidence but also between overall changes in gene expression and tumor incidence. They also found that the BMD for the most sensitive biologic process was always less than the BMD for the tumor response. Thomas concluded that transcriptomic dose-response alterations correlate with tumor incidence and that BMD values for the most sensitive pathways are protective.
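The pathway-summarization step Thomas described can be sketched in a few lines. The gene-level BMDs below are invented for illustration, and the median is one plausible summary statistic, not necessarily the one his group used; in practice each gene-level BMD comes from fitting a dose-response model to that gene's expression data:

```python
from statistics import median

# Hypothetical gene-level BMDs (mg/kg-day), grouped by the biologic
# process each gene maps to. All values are illustrative only.
gene_bmds = {
    "cell proliferation":    [6.2, 7.9, 5.5, 8.1],
    "DNA damage response":   [3.1, 4.7, 2.8, 5.0],
    "xenobiotic metabolism": [1.2, 0.9, 1.6, 1.1],
}

# Summarize each pathway (here by the median of its gene-level BMDs),
# then take the most sensitive pathway as a candidate point of departure.
pathway_bmd = {p: median(v) for p, v in gene_bmds.items()}
most_sensitive = min(pathway_bmd, key=pathway_bmd.get)
point_of_departure = pathway_bmd[most_sensitive]
```

Thomas's observation that the most sensitive pathway BMD always fell below the tumor-response BMD is what makes taking this minimum a protective choice.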

In the second example, Thomas described how to use cross-species differences in transcriptional dose-response data to evaluate mode of action. Thomas and co-workers exposed groups of rats and mice to chloroprene for 5 or 15 days. Chloroprene is metabolized to epoxide metabolites, and the rate of metabolism and of generation of the epoxide metabolites is about 10 times higher in mice than in rats. Accordingly, they used a physiologically based pharmacokinetic (PBPK) model to try to normalize the doses so that mice and rats received about the same internal dose. The BMD analysis of the genomic data indicated that glutathione metabolism was perturbed at 5 days and DNA-repair genes were affected at 15 days, and that the mouse was substantially more sensitive than the rat, although some differences disappeared when the comparison was based on internal dose. Overall, the results indicated the importance of the generation of the reactive epoxide metabolites in the proposed mode of action of chloroprene. Thomas concluded that pathway-based transcriptomic dose-response data can provide insights into the mode of action.

In the third example, Thomas provided suggestions for using genomics data to conduct risk assessments. If the mode of action is known, transcriptional benchmark dose lower confidence limit (BMDL) values could be derived on the basis of responses in key pathways and the values could be used in the risk assessment. If the mode of action is not known, the challenge is to discriminate between “adverse” and “adaptive” changes. However, one could proceed by evaluating all responses, deriving transcriptional BMDL values for the most sensitive pathways, and using them in risk assessments. Thomas compared reference doses and risk-specific doses derived from transcriptional BMDL values with comparable risk-assessment values from EPA’s Integrated Risk Information System. On the basis of those comparisons, Thomas concluded that pathway-based transcriptomic dose-response data can provide reasonable reference doses or points of departure for performing cancer and noncancer risk assessments. He added that the hope is that genomics data can be used in the future to inform the shape of the dose-response curve in the low-dose region and to allow some assessment of whether a linear or nonlinear approach should be taken.

Using Physiologically Based Pharmacokinetic Models to Interpret –Omics Dose-Response Data

Gregory Kedderis, of Chapel Hill, NC, reviewed the use of PBPK models in risk assessment and discussed how PBPK models could be used to inform in vitro experiments. He stated that PBPK models are mathematical descriptions of anatomy, physiology, and biochemistry in which model compartments represent organs or organ groups that are linked by blood flow. There are two types of data needed for PBPK models: physiologic measures, such as body weight, alveolar ventilation rates, blood-flow distribution to tissues, and organ volumes; and chemical-specific parameters, such as partition coefficients, biotransformation kinetic values, and values related to protein binding and chemical reactivity. Generally, the information is available in the literature, but Kedderis cautioned that some data require technical expertise to evaluate (for example, distinguishing relative measurements from objective measurements). PBPK models are used to relate external exposures to an internal dose—ultimately, the target-organ dose. Kedderis stated that the major contribution of PBPK models to risk assessment is that they allow one to reconcile various exposure routes and species differences. PBPK models can also be used to extrapolate human in vitro data to the in vivo system if data are available on the composition of the tissue.


Kedderis stated that, in the new paradigm, PBPK models could be used prospectively to inform in vitro experiments. For example, one could run a PBPK model for a given chemical by using a realistic exposure scenario and obtain a target-organ dose that could then be used to set concentration ranges for an in vitro experiment. Using that approach would ensure that cells are exposed to concentrations that are realistic for in vivo conditions. The problem is that cell cultures are closed systems, and biotransformation is an extremely important consideration. Many toxic chemicals require biotransformation to produce toxicity, and metabolism of a direct-acting chemical is often a detoxification event. Furthermore, many metabolites are transitory or chemically reactive. Kedderis emphasized that metabolically competent cells are needed but stated that for chemicals metabolized exclusively or primarily by the liver, there are commercially available tissues or cellular preparations that have integrated metabolism similar to that of the whole liver. He noted several other concerns regarding cell cultures, including issues about chemical solubility and volatility, the use of carrier solvents that can act as potent inhibitors, and changes in gene expression of immortalized cell lines. Kedderis was optimistic, however, about the promises of the new science and technologies and concluded that PBPK models will be able to augment the interpretation of –omic dose-response data and ultimately to provide information on physiologic variability, bioactivation variability, and response variability in the population.
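Kedderis's suggestion to let a kinetic model anchor in vitro concentration ranges can be sketched with a deliberately simple one-compartment model. A real PBPK model would have multiple perfused-tissue compartments linked by blood flow; the structure and every parameter value below are invented for illustration:

```python
def peak_conc(dose_mg_per_kg, vd_l_per_kg, ka_per_h, ke_per_h,
              hours=24.0, dt=0.01):
    """Euler simulation of first-order absorption and elimination in a
    single compartment; returns the peak concentration in mg/L, a crude
    surrogate for a PBPK-derived target-organ dose."""
    gut = dose_mg_per_kg      # amount at the absorption site (mg/kg)
    central = 0.0             # amount in the central compartment (mg/kg)
    peak, t = 0.0, 0.0
    while t < hours:
        absorbed = ka_per_h * gut * dt
        gut -= absorbed
        central += absorbed - ke_per_h * central * dt
        peak = max(peak, central / vd_l_per_kg)
        t += dt
    return peak

# Bracket the predicted peak with a 0.1x-10x window so that in vitro
# concentrations stay tied to a realistic in vivo exposure scenario.
cmax = peak_conc(dose_mg_per_kg=10.0, vd_l_per_kg=1.0,
                 ka_per_h=1.0, ke_per_h=0.1)
in_vitro_range = (0.1 * cmax, 10.0 * cmax)
```

The point of the exercise is the workflow, not the model: exposure scenario in, target-tissue concentration out, in vitro dosing anchored to that output.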

Modular Network Modeling of Toxicity Pathways for Extrapolation

Katrina Waters, of Pacific Northwest National Laboratory, described her laboratory research on –omics approaches to modeling of human disease states. She began by noting several challenges in using a pathway-based approach for risk assessment, including elucidating accurate mechanistic response models from global response data, given that most global response data capture only one regulatory mechanism at a single time; distinguishing reversible, adaptive processes from true toxicity pathways; validating dose-response models that capture single biologic pathways and assume isolation from other systems; and extrapolating from in vitro to in vivo systems. Biologic systems are inherently complex and have many redundant interdependent signaling networks, and cell-response modeling will require incorporation of complex feedback and compensatory mechanisms to be predictive at a tissue level.

Waters stated that global technologies—such as microarrays to measure global transcriptional responses, tandem mass-spectrometry proteomics to measure global protein changes, and parallel Western blot technology to measure protein abundance and protein phosphorylation states—are biased in their measurement of cellular processes. However, integrating the multiple data types provides many advantages, including more comprehensive coverage of network and pathway processes. Waters illustrated her point by showing a network reconstruction from her laboratory based on microarray, proteomic, and parallel Western blot data (see Figure 12). The researchers found that individual technologies were not measuring the same thing multiple times, but rather measuring different parts of a network. Ultimately, no single technology could have led to the construction of a signaling network as comprehensive as the one developed when data of all three types were used.

Waters continued with another example from her laboratory in which macrophages and type 2 epithelial cells were exposed to silica nanoparticles, carbon nanotubes, or non-particle lipopolysaccharides. The cellular responses were evaluated by using whole-genome microarrays and global proteomic analysis. Using the microarray data, they could distinguish between lipopolysaccharide-induced inflammation and silica-induced inflammation in the macrophage cells, but no distinction could be made between silica, crystalline silica, and carbon nanotubes on the basis of those data. The proteomic data, however, allowed them to distinguish the different particles from each other by using protein profiles. From those studies, Waters concluded that integrated heterogeneous data provide more comprehensive cell-response networks than any single data type alone.

FIGURE 12 Integrated data provide more comprehensive and accurate network reconstruction. Black nodes represent genes or proteins that were measured with more than one technology. Red, blue, and yellow nodes represent genes or proteins measured with individual technology indicated in the figure. Source: Waters and Thrall, unpublished data, presented at the symposium. Reprinted with permission; copyright 2010, Pacific Northwest National Laboratory.


Suggested Citation:"Summary of the Symposium." National Research Council. 2010. Toxicity-Pathway-Based Risk Assessment: Preparing for Paradigm Change: A Symposium Summary. Washington, DC: The National Academies Press. doi: 10.17226/12913.
×

Waters next described the advantages of developing modular network models to describe disease states. She stated that biologic systems are too complex for mechanistic models and that simpler abstractions are needed. So one can use the –omics data to define a set of inputs and a set of outputs and then represent the network clusters as functional modules (see Figure 13). The system is defined in terms of functional modules; the focus is on information flow (cause-effect relationships) rather than on molecular mechanisms. Thus, the modular network models can capture biologic complexity while reducing computational load to a finite number of variables that can be validated experimentally. Inferred network relationships can be verified by using molecular-intervention techniques, such as inhibiting a key enzyme in a biologic process and observing the cellular response. Waters stated that eventually it would be desirable to use the model to ask which functional module is responsible for a transition from an adaptive response to an adverse response. Once that module has been identified, mechanistic detail can be added to allow the model to predict an outcome better given the input dose. The next step is to scale the process to multicellular systems. The hope is to identify the mediators of cell-cell communication and the regulatory network underlying synthesis and secretion of the mediators.
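
The module abstraction that Waters described can be sketched in a few lines. In this toy example, the gene names, edges, and module assignments are all invented: a gene-level interaction graph is collapsed so that only between-module links remain, giving the cause-effect view of information flow rather than molecular mechanism.

```python
# Sketch (hypothetical network): collapsing a gene-level interaction graph
# into a module-level cause-effect graph.
gene_edges = [                         # directed gene-level interactions
    ("g1", "g2"), ("g2", "g3"),        # within module A
    ("g3", "g4"),                      # module A -> module B
    ("g4", "g5"), ("g5", "g6"),        # within module B
    ("g6", "g7"),                      # module B -> module C
]
module_of = {"g1": "A", "g2": "A", "g3": "A",
             "g4": "B", "g5": "B", "g6": "B", "g7": "C"}

def collapse(edges, module_of):
    """Keep only between-module edges; each becomes a cause-effect link."""
    links = set()
    for src, dst in edges:
        a, b = module_of[src], module_of[dst]
        if a != b:
            links.add((a, b))
    return sorted(links)

print(collapse(gene_edges, module_of))  # [('A', 'B'), ('B', 'C')]
```

The collapsed graph has far fewer variables than the gene-level network, which is what makes experimental validation of each module-to-module link tractable.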

Waters emphasized that biology is a dynamic process, that multiple levels of regulatory processes are necessary to capture biologic complexity, and that dose-response and temporal data will be required to capture the transition from adaptive processes to adverse outcomes. Furthermore, cells do not respond in isolation in tissues; thus, Waters stated, models must account for paracrine, neurologic, and other physiologic interactions to extrapolate from in vitro to in vivo systems accurately. She concluded by noting several needs to advance the toxicity-pathway approach for risk assessment, including biologically based models that incorporate epigenetic, proteomic, metabolomic, and post-translational modification data better; improved understanding of the relationships between toxicity pathways and toxicity outcomes; criteria for defining appropriate in vitro systems that represent in vivo toxicity sufficiently; and improved understanding of dosimetry and temporal differences between in vitro and in vivo systems.

FIGURE 13 Illustration of the development of modular network models. A system consisting of functional modules can be developed from –omics data, and mechanistic detail can be added once the module that is primarily responsible for the toxic response is identified. Source: K. Waters, unpublished data, Pacific Northwest National Laboratory, modified from symposium presentation. Reprinted with permission; copyright 2010, Pacific Northwest National Laboratory.

Integrating Global Gene Expression and Survival of 60 Cell Lines: Applications for Risk Assessment

Albert J. Fornace, Jr., of Georgetown University, discussed the use of –omics and other approaches to investigate individual sensitivity to ionizing radiation and possible applications to risk assessment. His research focus is on understanding variations in individual injury by using genomics rather than dose alone. Research implications include the elucidation of markers for screening radiologic workers or first responders after a radiation emergency, identification of persons with higher risks of adverse outcomes from exposure and possibly risks of late effects and second cancers, and development of a model for testing other toxicants on the basis of research on radiation effects. For his research, his laboratory uses the NCI60 cell lines as a model for population datasets. NCI60 cell lines consist of 60 human tumor-cell lines that have been used to test more than 100,000 compounds for chemosensitivity and on which numerous assays have been performed to identify molecular markers. Fornace noted several advantages to using ionizing radiation as the agent for studying individual sensitivity, including its well-characterized stress response, the absence of drug uptake or metabolism issues, and the linear response of stress genes (that is, a roughly proportional increase in the production of mRNA to dose).

Fornace stated that the approach has been to activate pathways as robustly as possible (that is, to cause an acute response) and evaluate increases in gene expression by using hybridized microarrays. A recent study (Amundson et al. 2008) found great heterogeneity of gene expression. On further analysis, 25 genes that were p53-responsive were identified; several had not been known previously as p53-regulated genes. The researchers were also able to identify genes related to radiation sensitivity and found that basal gene expression provided better predictions of radiation sensitivity than radiation-induced gene expression. Focusing next on pathways, they found that a set of genes—many of which regulate cell-cycle progression—was robustly and clearly repressed. They built a network by using protein-interaction databases and overlaid gene-expression data from the cell-cycle cluster. Ultimately, they identified a variety of genes known to be repressed by E2F4 and found that this transcription factor moves into the nucleus on exposure of the cell to ionizing radiation, where it presumably represses the genes. Fornace concluded that the approach developed to identify stress-response signatures of radiation could be adapted to non-genotoxic agents. The key is to select doses that will yield appreciable stress-pathway activation with reliable changes in gene expression. Initial studies in his laboratory have shown potential for using that approach to identify signatures of activation of various pathways for direct-acting agents.
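
The observation that basal expression predicted radiation sensitivity better than induced expression suggests a simple ranking exercise. In the sketch below, the expression values and sensitivities are fabricated (this is not the Amundson et al. analysis): genes are ranked by the absolute Pearson correlation of basal expression with sensitivity across cell lines.

```python
# Sketch (invented numbers): ranking genes by how well basal expression
# tracks radiation sensitivity across a panel of cell lines.
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Basal expression of three hypothetical genes across five cell lines.
basal = {
    "geneA": [1.8, 2.8, 4.2, 5.2, 6.4],   # tracks sensitivity closely
    "geneB": [5.0, 4.9, 5.1, 5.0, 4.8],   # essentially flat
    "geneC": [6.0, 5.1, 4.0, 3.0, 2.2],   # inversely related
}
sensitivity = [0.9, 1.4, 2.1, 2.6, 3.2]  # e.g., fraction of cells killed

ranked = sorted(basal, key=lambda g: abs(pearson(basal[g], sensitivity)),
                reverse=True)
print(ranked)  # genes ordered from most to least predictive
```

With real NCI60 data the same idea is applied to thousands of genes, and cross-validation guards against chance correlations.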

The Food and Drug Administration Experience in Analyzing Genomic Data

Federico Goodsaid, of the FDA Office of Clinical Pharmacology, described genomic data submissions to FDA and its approach to their review. He stated that FDA receives genomic data through voluntary exploratory data submissions (VXDS), which involve meetings with sponsors in an open nonregulatory environment. Since 2004, FDA has had 35 data submissions from a wide variety of clinical divisions. The majority of platforms from 2004-2006 were for microarray differential gene expression, and the majority from 2006-2008 were for candidate gene identification. Most of the data have been from clinical studies, and the number of submissions concerning efficacy has been about twice the number concerning safety. Goodsaid noted that the data can be used in several ways: as part of a submission for drug approval in which the data are interpreted in the context of any specific claims made by the sponsor, in the co-development of drugs and tests in which a drug and a genetic test are validated as part of a phase III trial, for labeling updates in which genomic information is added to pre-existing labels, or in the biomarker-qualification process in which the data can enter a formal evaluation and review process with the purpose of qualifying a biomarker for a specific context of use (for example, as a preclinical marker of nephrotoxicity, hepatotoxicity, or vascular injury).

Next, Goodsaid noted the work of the Microarray Quality Control Consortium, which was initiated and is coordinated by the FDA National Center for Toxicological Research and was tasked with addressing issues of reliability, performance, quality, and analysis of microarray data. The consortium found that microarray data are repeatable within a laboratory, reproducible among laboratories, concordant among platforms, and comparable with alternative technologies. The consortium then began to address technical issues related to the development and validation of predictive signatures and classifiers on the basis of gene-expression data from microarrays and to assess the capabilities and limitations of microarray technology. Goodsaid noted that publications on that work would be released soon.

Goodsaid returned to the VXDS program at FDA and stated that one aspect involves the biologic interpretation of the lists of differentially expressed genes. He noted several questions that FDA considers, such as, What functions or pathways are associated with substantially over-represented genes in the list? How many pathways are affected? What types of pathways are affected? What is the inferred mechanism of action and toxicity of the gene-expression changes? What is the tissue specificity of the pathways and gene function? Goodsaid stated that FDA reviews a sponsor’s interpretation, tries to reconstruct it, and attempts to provide alternative biologic interpretations. He mentioned that FDA has entered into a cooperative research and development agreement with one company to provide software that provides several methods for evaluating the lists of differentially expressed genes. Goodsaid concluded by saying that FDA has been able to draft guidance for pharmacogenomic-data submissions on the basis of its experience with the voluntary data submissions and recommendations from sponsors, the Microarray Quality Control Consortium, other interested parties, and public forums.
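
One of the review questions noted above, whether a pathway's genes are substantially over-represented in a list of differentially expressed genes, is conventionally scored with a hypergeometric tail probability (a one-sided Fisher exact test). The counts below are invented for illustration and are not from any FDA submission.

```python
# Sketch: scoring pathway over-representation in a gene list with a
# hypergeometric tail probability. All counts are invented.
from math import comb

def hypergeom_tail(k, K, n, N):
    """P(X >= k) when drawing n genes from N, of which K are in the pathway."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

N = 20000   # genes on the array
K = 100     # genes annotated to the pathway
n = 200     # differentially expressed genes
k = 12      # of those, annotated to the pathway (expected by chance: ~1)

p = hypergeom_tail(k, K, n, N)
print(f"enrichment p-value: {p:.2e}")  # a small p suggests over-representation
```

In practice the test is repeated over hundreds of pathways, so the resulting p-values must be corrected for multiple comparisons before any pathway is called affected.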

Panel Discussion

The afternoon session closed with a panel discussion that focused on the definition of toxicity pathway, an issue raised during several presentations. Faustman noted that defining a toxicity pathway is context-dependent. For example, apoptosis is generally seen as an adverse effect, but if apoptosis did not occur during development, humans would have fins. So time, place, and response make a difference. Goodsaid added that he would be hesitant to label a pathway as a toxicity pathway in isolation. For FDA, pathway information helps the agency to make regulatory decisions; mechanistic data allow the agency to interpret other test data.

Waters stated that ideally what should be identified are pathways that are consistently and measurably changed within 2 weeks—or possibly even 4 weeks—of exposure and that are indicative and predictive of some outcome downstream that is recognized as a toxicity end point. She noted, however, that her definition raises the controversy about distinguishing an adverse response from an adaptive response; until gene expression, protein abundance, or some other measure has been evaluated over the dynamic cycle, one cannot distinguish whether it is a time-dependent expression of some adaptive response or truly a dose-dependent change indicative of toxicity.

Thomas stated that his work has not focused on defining a toxicity pathway itself, but on grouping pathways according to common biologic function to make predictions. Faustman noted that there are differences between the various analytic tools and approaches that are being used to define pathways and that the differences could affect how one defines a pathway. The tools and approaches are only as good as the data that go into them, and the scientific community has not yet developed an unbiased approach for integrating pathway information.

Rhomberg stated that toxicity pathway might not be the best descriptive term. Normal biologic processes need to be understood, as does how the processes are changed by various agents and result in toxicity. Furthermore, the processes are not linear sequences of events but networks of interactions. Thus, Rhomberg concluded that the focus should be on discovering the key pathway components and how they are integrated to make the functional modules discussed by Waters. One participant questioned, however, whether one needed to understand what a toxicity pathway was; perhaps one could use the models described during the symposium to derive a benchmark dose for a response, calculate a tissue dose, and then extrapolate to a human exposure that would drive the given response. Thomas and Kedderis agreed, but Kedderis noted that it is important to evaluate mechanisms when dealing with unknowns, agents on which few, if any, data are available. Thomas countered that one can provide guidance on reference doses or risk-specific doses by using genomic approaches and thus bridge the gap between making decisions on the basis of reasonable scientific data and not making decisions because of lack of data and simply ignoring the possible risks posed by exposure to the unknowns. He added that if genomic profiles are obtained on six specific tissues, one can predict about 90% of all positive responses in rat and mouse bioassays and thus obtain valuable information for risk assessment.
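
The participant's suggestion (derive a benchmark dose from a fitted model, then extrapolate) can be made concrete. The sketch below assumes an illustrative Hill dose-response with invented parameters and inverts it numerically to find the dose producing a 10% benchmark response; it is not any particular agency's benchmark-dose procedure.

```python
# Sketch (illustrative parameters): deriving a benchmark dose by inverting
# a fitted dose-response model for a chosen benchmark response (BMR).
def hill(dose, emax=1.0, ed50=50.0, n=2.0):
    """Hill dose-response: fraction of maximal effect at a given dose."""
    return emax * dose**n / (ed50**n + dose**n)

def benchmark_dose(bmr=0.10, lo=0.0, hi=1000.0, tol=1e-6):
    """Invert the (monotone) model by bisection to the dose where response = bmr."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if hill(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

bmd10 = benchmark_dose(0.10)
print(round(bmd10, 2))  # prints 16.67 for these parameters
```

In a real assessment, the model would be fitted to observed in vitro data with uncertainty bounds, and the lower confidence limit on the benchmark dose, rather than the point estimate, would typically carry forward to extrapolation.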

WHAT THE FUTURE HOLDS

Where Are We Going? or, Are We There Yet, and Where Is There

The final session of the symposium was devoted to presentations and discussions on visions for the future and the path forward. Preuss began the session by providing a perspective on the changing landscape of risk assessment and the need for modernization. He stated that advances in understanding of gene-environment interactions and the pending test data from the European REACH program are driving the need for change. He described two paradigms for understanding the effects of toxic chemicals. One is the human-disease model in which genetic profiles of people with and without a disease are compared to yield fingerprints of disease and susceptibility, and the other is the current animal-testing model in which chemically induced events are matched to rodent test results and rodent modes of action. Preuss noted that the new science and technologies should allow movement between paradigms, although the two approaches will probably progress in tandem for many years. However, the question now is how to move from assessing a few chemicals each year to assessing thousands of chemicals each year as the REACH program anticipates. Dossiers on 40,000 chemicals are expected by 2012, and the U.S. government is ill prepared to use the volume and complexity of information resulting from that or a similar program. Anticipating the need for change, EPA sponsored several NRC reports over the last few years that focused on toxicity testing and risk assessment (for example, NRC 2007a, 2008, 2009). Overall, those reports concluded that risk-assessment data needs cannot be met with the current testing methods, that scientists need to determine how to use the new data being generated for risk assessment, and that the transformation of risk assessment has to occur with stakeholder input.

Preuss described EPA’s dual approach to developing the next generation of risk assessments. First, EPA is considering creating a high-priority list of chemicals and streamlining the process for assessment by narrowing the scope, using off-the-shelf risk approaches, and focusing and coordinating stakeholder reviews. Second, EPA is considering broadening the scope of some assessments to synthesize more information into each assessment, such as assessments on cumulative effects of agents that cause the same effect or on families of chemicals that are physically similar. EPA intends to explore the new science, methods, and policies that could be incorporated into emerging and future risk assessments with the primary goal of mapping a path forward. Preuss listed many questions that will need to be addressed, including, How can the new information best be incorporated into risk assessment and used to inform risk managers? What new policies and procedures will be needed? How can EPA ensure that decision-makers, the courts, and Congress see the new approach as an acceptable way to proceed? EPA’s strategy for the next generation of risk assessment is to implement the framework presented in the NRC report Science and Decisions (NRC 2009); to develop an operational knowledge of bioinformatics, data mining, and gene-environment databases to support risk-assessment work; and to develop prototype examples of increasingly complex assessments that are responsive to risk context and refined through discussions with scientists, risk managers, and stakeholders. Preuss concluded that EPA estimates that it may take a decade before risk assessment can rely primarily on the new advances in science, but it is necessary to begin now to address the needed changes.

Bucher continued the discussion by providing a perspective from the National Institute of Environmental Health Sciences (NIEHS) and noted that current NTP research includes many projects that are amenable to high-throughput screening and high-content data analysis. Furthermore, NTP has made a commitment to the development and use of high-throughput screening that could be used to set priorities among chemicals for further in-depth toxicologic evaluation, to identify mechanisms of toxicity, and to develop predictive models of in vivo biologic responses in humans. NTP’s commitment is consistent with its vision, articulated 5 years ago, of supporting the evolution of toxicology from an observational science to a predictive one that is based on a broad array of target-specific, mechanism-based biologic observations.

Bucher noted that several people had commented during the symposium that tools need to be developed to handle the enormous volume of data being generated. He described a program at NIEHS led by Christopher Portier that is attempting to create such a tool. The approach is to use knowledge about genes associated with diseases to determine pathways linked to the genes and thus link the pathways to the diseases (that is, to elucidate “disease” pathways). The next step is to use toxicogenomic and proteomic databases on well-studied chemicals to link chemicals to diseases through pathways and then to analyze the toxicity pathways to find the best points for screening, such as critical nodes or connection points. –Omics and other molecular tools can then be used to validate the choices. What is ultimately created is an interaction network (see Figure 14). Bucher concluded that NTP expectations for the 21st century are to continue to refine traditional methods and to develop new methods to generate information on mechanisms, exposure-response relationships, and life-stage and genetic susceptibility that will allow better prediction of toxicity to humans and ultimately better protection of public health; to reconcile results from new data-rich techniques—such as genomics, proteomics, and high-throughput screens—with existing testing information for conceptual validation; and to develop approaches to accomplish formal validation of new methods for human hazard and risk estimations.
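
The NIEHS linking strategy Bucher described (chemicals to pathways, pathways to diseases, with highly connected pathways as candidate screening points) can be sketched with toy annotations; all chemical, pathway, and disease names below are invented.

```python
# Sketch (invented annotations): linking chemicals to diseases through
# shared pathways and ranking pathways by connectivity to find
# candidate screening points ("critical nodes").
from collections import Counter

chemical_pathways = {
    "chemX": {"oxidative_stress", "p53_signaling"},
    "chemY": {"oxidative_stress", "inflammation"},
    "chemZ": {"inflammation"},
}
disease_pathways = {
    "fibrosis": {"inflammation", "oxidative_stress"},
    "cancer":   {"p53_signaling", "oxidative_stress"},
}

# Link a chemical to every disease with which it shares a pathway.
links = {(c, d)
         for c, cp in chemical_pathways.items()
         for d, dp in disease_pathways.items()
         if cp & dp}

# Rank pathways by how many chemicals and diseases touch them.
degree = Counter()
for paths in list(chemical_pathways.values()) + list(disease_pathways.values()):
    degree.update(paths)

print(degree.most_common(1)[0][0])  # the best-connected pathway
```

In the actual program, the annotations come from curated toxicogenomic, proteomic, and gene-disease databases, and candidate nodes are then validated experimentally with -omics and other molecular tools.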

FIGURE 14 Interaction network that can be used to associate environmental factors with toxicity pathways and associated human diseases. Source: Gohlke et al. 2009. Reprinted with permission; copyright 2009, BMC Systems Biology. J. Bucher, National Institute of Environmental Health Sciences, presented at the symposium.

Tina Bahadori, of the American Chemistry Council Long-Range Research Initiative, continued the discussion of the future vision and path forward by providing an industry perspective. She stated that an unprecedented opportunity exists to improve the scientific understanding of the potential effects of chemicals on human health and the environment. New technologies to test the effects of chemicals have the potential to revolutionize risk assessment, and if it is done in a scientifically robust way, risks could be understood better, faster, and less expensively. The concern, however, is that the technology is advancing faster than the means to interpret the data accurately. Although investments are made to generate volumes of data, comparable investments to interpret the data are lacking; without investment in the “science of interpretation,” the tendency will be to rely on high-throughput hazard data as a surrogate for risk assessment.

Bahadori stated that scientists need to determine how information is going to be translated into a real-world context to protect public health and the environment. Information on host susceptibility and background exposures will be needed for interpretation and extrapolation of in vitro test results. Furthermore, information on actual human exposure will be needed for selection of doses for toxicity testing so that hazard information can be developed on environmentally relevant effects and for determination of whether concentrations that perturb toxicity pathways are biologically relevant. Scientists also will need to understand the progression from exposure to effect. Exposure science will need to predict and link exposure among all levels of biologic organization and to use the new technologies to characterize exposure. Bahadori emphasized the need to invest in the new tools and technologies and to stay committed to the long-range vision, recognizing that it may be years before the new science can be used to advance risk assessment. She concluded by saying that her organization has developed a research strategy to support the vision and is investing in research on interpreting the new data being generated, on developing innovative approaches to characterize biologically relevant exposures and their relation to health risk, and on determining the genetic influences and gene-environment interactions of susceptible populations.

Roger Ulrich, of Calistoga Pharmaceuticals, provided a perspective from the pharmaceutical industry on what the future holds. The key difficulties that the pharmaceutical industry faces are late-stage failures and product withdrawals, which are extremely expensive and reduce the ability to reinvest in research; the erosion of consumer confidence in product safety and the rise in consumer expectations for it; the paucity of new products; and the shift of risk and product development from large pharmaceutical companies to small ones. To overcome the difficulties, Ulrich stated, the industry must focus on the pipeline and do a better job of assessing products, and this requires more thorough preclinical assessment of toxicity and more research on mechanisms and affected biologic processes or pathways. The goal is to identify prognostic biomarkers—markers that tell what might happen rather than markers that tell what has happened (diagnostic biomarkers). He also stated that the industry must focus on patients and identify at-risk people or populations in parallel with clinical development.

Ulrich stated that the new technologies can help to improve the processes of drug discovery and development. They can help to identify molecular pathways involved in pharmacologic and toxicologic mechanisms; this will help in making decisions as to whether to pursue specific compounds earlier in the process. The new technologies can also help to identify potential biomarkers and to detect adverse drug effects in animals that do not necessarily result in the expression of an adverse outcome in an animal model. Regarding the patient, the new technologies can be used to understand the idiosyncrasies that may increase the risk or decrease the benefit for individual patients. For example, they can be used to identify genetic defects or susceptibilities that can lead to adverse events in response to specific drugs. Thus, the vision for the drug industry is to use contemporary tools to understand the full spectrum of drug effects on biologic pathways in the preclinical phase before formal development, to overlay drug-response networks on patient phenotype and genetic networks to understand individual patient risk, and to identify benefits, as well as risks, and apply the knowledge in prescription practices. In the next 5 years, Ulrich stated, scientists will continue to develop and explore the potential of the new tools and technologies, but the excitement will be in the development of in silico models and application of new discoveries to the clinical setting. He concluded by listing several resources needed, including an open-source genetic and functional genomic data platform, most likely funded by government, and training for the next generation of scientists.

Helmut Zarbl, of the University of Medicine and Dentistry of New Jersey, continued the discussion with an academic perspective on the future vision and discussed the NRC report Applications of Toxicogenomic Technologies to Predictive Toxicology and Risk Assessment (NRC 2007c), which was released shortly after Toxicity Testing in the 21st Century. The committee that produced the report was asked to evaluate the status of toxicogenomics and to discuss potential applications, including applications to risk assessment. It concluded that toxicogenomic technologies have strong potential to affect decision-making but are not ready to replace existing testing regimens in risk assessment and regulatory toxicology; that toxicogenomics can provide information to be added to the weight of evidence for refining risk judgments; and that ultimately the technologies are envisioned as more sensitive and informative than existing technologies and have the potential to replace some current approaches or tests. That committee recommended a human toxicogenomics initiative to accomplish the following tasks: create and manage a large public database for storing and integrating the results of toxicogenomic analysis with conventional toxicity-testing data; assemble toxicogenomic and conventional toxicologic data on hundreds of compounds into a single database; create a centralized national biorepository for human clinical and epidemiologic samples; further develop bioinformatic tools, such as software, analysis, and statistical tools; consider ethical, legal, and social implications of collecting and using toxicogenomic data and samples; and coordinate subinitiatives to evaluate the application of toxicogenomic technologies to the assessment of risks associated with chemical exposures.

Zarbl concluded by discussing the path forward and stated that improvements in technology and science often build on previous knowledge and that scientists should not abandon the tools and knowledge of classical toxicology and risk assessment. He continued, saying that the paradigm shift will require a reduction in reliance on apical end points, and the challenge will be to validate toxicogenomic data to ensure that they are predictive of outcomes that occur much further downstream. Thus, the path for the next several years will be to develop in vitro assays, tools, and strategies; to continue to populate public databases of curated data; to invest in systems toxicology and computational tools for pathway-based risk assessment; to incorporate toxicogenomic data into the weight of evidence for risk assessment; and to continue to explore and validate the utility of the pathway-based approach for risk assessment. Further in the future, pathway-based approaches may be used for routine hazard screening of both new and legacy compounds. Zarbl again highlighted, however, the need for validation of the new approaches before they become stand-alone processes, and he cautioned that if the assays yield negative results, we need to proceed with care to ensure that nothing was missed.

Gina Solomon, of the Natural Resources Defense Council, closed with a perspective from an environmental-advocacy group and stated that she is encouraged by the research presented at the symposium. However, she noted that many people have concerns about the new direction, and some effort will be needed to persuade the broader community that the new approach has merit. The NRC report Toxicity Testing in the 21st Century offered the hope of rapid, cost-effective, animal-sparing testing of thousands of chemicals and a chance to intervene to prevent pathway perturbations before the occurrence of disease. Many data have been generated since publication of that report, but few of them have been incorporated or used in risk assessment, and the backlog of 80,000 chemicals remains. Solomon noted that the situation is reminiscent of EPA’s Endocrine Disruptor Screening Program (EDSP), which was originally designed to screen thousands of chemicals but was delayed for more than a decade by an overly cumbersome validation process. She stated that the EDSP has not lived up to its promise, and the scientific community needs to work hard not to repeat the history of that program. She acknowledged that some effort will be required to bring the new science into the process of regulatory decision-making in a timely fashion. However, the revision and reauthorization of the Toxic Substances Control Act now under way may help to facilitate the adoption of new scientific tools. Solomon concluded that at the end of the day, if the pathway-based assays are concerned only with generating more in-depth, detailed mode-of-action data on the same subset of old chemicals, the new paradigm will fail. However, if it can deal with the backlog of chemicals and foster faster, health-protective hazard identification and risk assessment, it will be heralded as an important and valuable advance.

Panel Discussion

The symposium closed with a panel discussion focused on questions about and hopes for advancing the new paradigm. Preuss expressed concern that the field is quickly becoming, if it is not already, extremely complex and that distinguishing important signals from unimportant ones will be challenging. Validating the sophisticated, complex models and approaches will present another challenge. The new paradigm will not be successful if scientists create models that are intelligible only to a select few. The fear is that the research community is far outpacing the ability of the regulatory community to use the information that is generated and to make intelligent decisions about it. If that happens, much will fall by the wayside. The problem is that EPA’s research budgets have decreased each year to the point where the agency now has 50% of the purchasing power that it had 10 years ago; the available resources are not commensurate with the kind of effort needed.

Zarbl noted that pathway predictions at this point may be premature given that a majority of gene expression is still not understood and that huge gaps must be addressed before the data should be used in risk assessment. Ulrich agreed that huge gaps exist but stated that the gaps are being systematically filled. His concern was that realistic expectations be set. For example, it is unrealistic to expect that in 10 years scientists will be able to screen 200,000 chemicals in cell cultures and know the human risks on the basis of the resulting data. Bahadori added that the paradigm will fail if perfection of the science is required. Information that can inform and improve risk assessment will be available soon, if it is not already. The scientific community needs to determine the level of knowledge needed for different purposes, that is, what is good enough for a given task. Bahadori expressed concern, however, about funding and emphasized the need to articulate the value of the current effort to ensure that resources are available. The scientific community needs to make a compelling case for requesting resources.

Bucher remarked on the need expressed throughout the symposium for a large, publicly accessible data platform and for a concerted effort to overcome the inertia that might exist in creating such a platform. He stated that expectations for the new science and approaches will not be fulfilled if the necessary computational and analytic tools are not made available. Zarbl clarified that what is being proposed is not simply a data repository but a platform where data are curated; data would have to meet entry standards so that they are uniform.

The question arose about what must be accomplished in the near term to illustrate the value of the new science. Bucher commented that the results of high-throughput screening should be used to design and guide current toxicity testing. The resulting data can help to move standard toxicology models forward. Zarbl stated that studies that demonstrate that pathway-based risk assessment can produce results at least as good as standard approaches (such as the studies described by Thomas earlier in the symposium) are the milestones needed. Ulrich noted that high priority should be given to a concerted effort to pull together the existing knowledge to create a cohesive and comprehensive output. Kenny Crump, of Environ, concurred: the key is to start applying and using the data and comparing the results with those of the current path. That would illuminate the gaps and help to differentiate between data that EPA might need and data that some other application would need.

Hal Zenick, of EPA, commented that application of the new approaches may depend on what information is needed. One size does not fit all risk assessments. For example, the information needed to make a clean-up decision will be quite different from that needed to set exposure guidelines on a nationally pervasive bioaccumulative chemical. The nature of the question that needs to be answered may determine how soon the new approaches can be applied to risk assessment and risk management. Whenever the new science is finally used to make risk-assessment and risk-management decisions, several noted, a major challenge will be risk communication, that is, explaining the new approaches to policy-makers and the public.

REFERENCES

Amundson, S.A., K.T. Do, L.C. Vinikoor, R.A. Lee, C.A. Koch-Paiz, J. Ahn, M. Reimers, Y. Chen, D.A. Scudiero, J.N. Weinstein, J.M. Trent, M.L. Bittner, P.S. Meltzer, and A.J. Fornace, Jr. 2008. Integrating global gene expression and radiation survival parameters across the 60 cell lines of the National Cancer Institute’s anticancer drug screen. Cancer Res. 68(2):415-424.

Ashburner, M., C.A. Ball, J.A. Blake, D. Botstein, H. Butler, J.M. Cherry, A.P. Davis, K. Dolinski, S.S. Dwight, J.T. Eppig, M.A. Harris, D.P. Hill, L. Issel-Tarver, A. Kasarskis, S. Lewis, J.C. Matese, J.E. Richardson, M. Ringwald, G.M. Rubin, and G. Sherlock. 2000. Gene ontology: Tool for the unification of biology. Nat. Genet. 25(1):25-29.

Beck, M.B. 2002. Model evaluation and performance. Pp. 1275-1279 in Encyclopedia of Environmetrics, A.H. El-Shaarawi and W.W. Piegorsch, eds. Chichester, UK: Wiley.

Begley, T.J., A.S. Rosenback, T. Ideker, and L.D. Samson. 2002. Damage recovery pathways in Saccharomyces cerevisiae revealed by genomic phenotyping and interactome mapping. Mol. Cancer Res. 1(2):103-112.

Bucher, J.R., and C. Portier. 2004. Human carcinogenic risk evaluation, Part V: The National Toxicology Program vision for assessing the human carcinogenic hazard of chemicals. Toxicol. Sci. 82(2):363-366.

Collins, F.S., G.M. Gray, and J.R. Bucher. 2008. Toxicology. Transforming environmental health protection. Science 319(5865):906-907.

EC (European Commission). 2007. REACH in Brief. European Commission [online]. Available: http://ec.europa.eu/environment/chemicals/reach/pdf/2007_02_reach_in_brief.pdf [accessed Apr. 30, 2010].

EPA (U.S. Environmental Protection Agency). 2009. The U.S. Environmental Protection Agency’s Strategic Plan for Evaluating the Toxicity of Chemicals. EPA/100/K-09/001. Office of the Science Advisor, Science Policy Council, U.S. Environmental Protection Agency, Washington, DC. March 2009 [online]. Available: http://www.epa.gov/spc/toxicitytesting/docs/toxtest_strategy_032309.pdf [accessed Apr. 30, 2010].

Fry, R.C., T.J. Begley, and L.D. Samson. 2005. Genome-wide responses to DNA-damaging agents. Annu. Rev. Microbiol. 59:357-377.

Gohlke, J.M., R. Thomas, Y. Zhang, M. Rosenstein, A.P. Davis, C. Murphy, K.G. Becker, C.J. Mattingly, and C.J. Portier. 2009. Genetic and environmental pathways to complex diseases. BMC Syst. Biol. 3:46.

Hartung, T., and M. Leist. 2008. Food for thought…on the evolution of toxicology and the phasing out of animal testing. ALTEX 25(2):91-96.

Johnson, C.D., Y. Balagurunathan, K.P. Lu, M. Tadesse, M.H. Falahatpisheh, R.J. Carroll, E.R. Dougherty, C.A. Afshari, and K.S. Ramos. 2003. Genomic profiles and predictive biological networks in oxidant-induced atherogenesis. Physiol. Genomics 13(3):263-275.

Johnson, C.D., Y. Balagurunathan, M.G. Tadesse, M.H. Falahatpisheh, M. Brun, M.K. Walker, E.R. Dougherty, and K.S. Ramos. 2004. Unraveling gene-gene interactions regulated by ligands of the aryl hydrocarbon receptor. Environ. Health Perspect. 112(4):403-412.

Judson, R., A. Richard, D.J. Dix, K. Houck, M. Martin, R. Kavlock, V. Dellarco, T. Henry, T. Holderman, P. Sayre, S. Tan, T. Carpenter, and E. Smith. 2009. The toxicity data landscape for environmental chemicals. Environ. Health Perspect. 117(5):685-695.

Lu, K.P., L.M. Hallberg, J. Tomlinson, and K.S. Ramos. 2000. Benzo(a)pyrene activates L1Md retrotransposon and inhibits DNA repair in vascular smooth muscle cells. Mutat. Res. 454(1-2):35-44.

NRC (National Research Council). 2000. Scientific Frontiers in Developmental Toxicology and Risk Assessment. Washington, DC: National Academy Press.

NRC (National Research Council). 2007a. Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: National Academies Press.

NRC (National Research Council). 2007b. Models in Environmental Regulatory Decision Making. Washington, DC: National Academies Press.

NRC (National Research Council). 2007c. Applications of Toxicogenomic Technologies to Predictive Toxicology and Risk Assessment. Washington, DC: National Academies Press.

NRC (National Research Council). 2008. Phthalates and Cumulative Risk Assessment: The Tasks Ahead. Washington, DC: National Academies Press.

NRC (National Research Council). 2009. Science and Decisions: Advancing Risk Assessment. Washington, DC: National Academies Press.

Oberdörster, G., E. Oberdörster, and J. Oberdörster. 2005. Nanotoxicology: An emerging discipline evolving from studies of ultrafine particles. Environ. Health Perspect. 113(7):823-839.

Robinson, J.F., X. Yu, S. Hong, C. Zhou, N. Kim, D. DeMasi, and E.M. Faustman. 2010. Embryonic toxicokinetic and dynamic differences underlying strain sensitivity to cadmium during neurulation. Reprod. Toxicol. 29(3):279-285.

Rudén, C. 2001. The use and evaluation of primary data in 29 trichloroethylene carcinogen risk assessments. Regul. Toxicol. Pharmacol. 34(1):3-16.

Stratton, M.R., P.J. Campbell, and P.A. Futreal. 2009. The cancer genome. Nature 458(7239):719-724.

Teeguarden, J.G., P.M. Hinderliter, G. Orr, B.D. Thrall, and J.G. Pounds. 2007. Particokinetics in vitro: Dosimetry considerations for in vitro nanoparticle toxicity assessments. Toxicol. Sci. 95(2):300-312.

Waters, K.M., L.M. Masiello, R.C. Zangar, B.J. Tarasevich, N.J. Karin, R.D. Quesenberry, S. Bandyopadhyay, J.G. Teeguarden, J.G. Pounds, and B.D. Thrall. 2009. Macrophage responses to silica nanoparticles are highly conserved across particle sizes. Toxicol. Sci. 107(2):553-569.

Yu, X., W.C. Griffith, K. Hanspers, J.F. Dillman III, H. Ong, M.A. Vredevoogd, and E.M. Faustman. 2006. A system-based approach to interpret dose and time-dependent microarray data: Quantitative integration of gene ontology analysis for risk assessment. Toxicol. Sci. 92(2):560-577.

Yu, X., S. Hong, E.G. Moreira, and E.M. Faustman. 2009. Improving in vitro Sertoli cell/gonocyte co-culture model for assessing male reproductive toxicity: Lessons learned from comparisons of cytotoxicity versus genomic responses to phthalates. Toxicol. Appl. Pharmacol. 239(3):325-336.

Yu, X., J.F. Robinson, J.S. Sidhu, S. Hong, and E.M. Faustman. 2010. A system-based comparison of gene expression reveals alterations in oxidative stress, disruption of ubiquitin-proteasome system and altered cell cycle regulation after exposure to cadmium and methylmercury in Mouse Embryonic Fibroblast (MEF). Toxicol. Sci. 114(2):356-377.


The symposium was held on May 11-13, 2009, in Washington, DC, and included presentations and discussion sessions on pathway-based approaches for hazard identification, applications of new approaches to mode-of-action analyses, the challenges to and opportunities for risk assessment in the changing paradigm, and future directions.
