Summary of the Symposium
Pages 1-50



From page 1...
... Toxicity-Pathway-Based Risk Assessment: Preparing for Paradigm Change: A Symposium Summary
From page 3...
... The committee envisioned a new paradigm in which biologically important perturbations in key toxicity pathways would be evaluated with new methods in molecular biology, bioinformatics, computational toxicology, and a comprehensive array of in vitro tests based primarily on human biology.
From page 4...
... A PARADIGM CHANGE ON THE HORIZON

Warren Muir, of the National Academies, welcomed the audience to the symposium. He stated that the environmental-management paradigm of the 1970s is starting to break down in the face of recent scientific advances and the exponential growth of information, and that the symposium should be seen as the first of many discussions on the impact of advances in toxicology on risk assessment. He introduced Bernard Goldstein, of the University of Pittsburgh, chair of the Standing Committee on Risk Analysis Issues and Reviews, who stated that although the standing committee does not make recommendations, symposium participants should feel free to suggest how to move the field forward and to make research recommendations.
From page 5...
... Given the uncertainties surrounding species extrapolation, dose extrapolation, and evaluation of sensitive populations today, the vision provided in the NRC report Toxicity Testing in the 21st Century: A Vision and a Strategy offers tremendous promise. However, Goldman used the example of EPA's Endocrine Disruptor Screening Program as a cautionary tale.
From page 6...
... Although the committee acknowledged option IV as the ultimate goal for toxicity testing, it chose option III to represent the vision for the next 10-20 years. That approach is a fundamental shift -- one that is based primarily on human biology, covers a broad range of doses, is mostly high-throughput, is less expensive and time-consuming, uses substantially fewer animals, and focuses on perturbations of critical cellular responses.
From page 7...
... The committee defined a toxicity pathway as a cellular-response pathway that, when sufficiently perturbed, is expected to result in an adverse health effect (see Figure 2), and it envisioned a toxicity-testing system that evaluates biologically important perturbations in key toxicity pathways by using new methods in computational biology and a comprehensive array of in vitro tests based on human biology.
From page 8...
... [Figure 2, recovered from garbled figure text: a schematic running from source through fate/transport, exposure, tissue dose, and biologic interaction to perturbation of toxicity pathways, with normal biologic inputs sustaining biologic function and perturbations leading to early cellular changes and adaptive stress responses or cell injury. Toxicity pathways are defined as cellular-response pathways that, when sufficiently perturbed, are expected to result in adverse health effects. The significance of a perturbation will depend not only on its magnitude but on the underlying nutritional, genetic, disease, and life-stage status of the host.]
From page 9...
... Symposium Issues and Questions

Lorenz Rhomberg, of Gradient Corporation, a member of the standing committee and chair of the planning committee, closed the first session by providing an overview of issues and questions to consider throughout the symposium. Rhomberg stated that the new tools will enable and require new approaches.
From page 10...
... He first discussed the Keap1-Nrf2 signaling pathway, which is sensitive to a variety of environmental stressors. Keap1-Nrf2 signaling pathways have been investigated by using knockout animal models, and the investigations have provided insight into how the pathways modulate disease outcomes.
From page 11...
... However, new tools available today are helping scientists to elucidate gene-environment interactions, and this research may provide a more scientific basis for evaluating human variability and susceptibility in the context of risk assessment. Leikauf noted that genetic disorders, such as sickle-cell anemia and cystic fibrosis, and environmental disorders, such as asbestosis and pneumoconiosis, cause relatively few deaths compared with complex diseases that are influenced by many genetic and environmental factors.
From page 12...
... High-throughput sequencing -- some of which can provide information on gene regulation and control by incorporating transcriptome analysis -- has enabled the genome-wide association studies already discussed at the symposium and has provided valuable information on experimental models, both whole-animal and in vitro systems. Rusyn described various tools and technologies available at the different levels of biologic organization and noted that the throughput potential for data acquisition diminishes as data relevance increases (see Figure 4)
From page 13...
... [Figure 4: Throughput potential for data acquisition as related to levels of biologic organization. The original graphic arrays five levels (single-molecule-based screening; cell-culture-based high-content screening; invertebrate- and lower-vertebrate-organism-based screening; engineered-human-tissue-based screening; and mammalian (e.g., rodent) studies) against opposing axes of throughput potential for data acquisition and human relevance of the data.]
From page 14...
... Kavlock also noted that EPA responded to that report by issuing a strategic plan for evaluating the toxicity of chemicals that included three goals: identifying toxicity pathways and using them in screening, using toxicity pathways in risk assessment, and making an institutional transition to incorporate the new science.
From page 15...
... The program components are identifying toxicity pathways, developing high-throughput assays for them, screening chemical libraries, and linking the results to in vivo effects. Each component involves challenges, such as incorporating metabolic capabilities into the assays, determining whether to link assay results to effects found in rodent toxicity studies or to human toxicity, and predicting effective in vivo concentrations from effective in vitro concentrations.
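The last of those challenges, translating an effective in vitro concentration into an effective in vivo dose, is often approached by reverse dosimetry. The following is a minimal sketch of that idea under linear, steady-state kinetics; the function names and the example clearance, molecular-weight, and AC50 values are illustrative assumptions, not figures presented at the symposium.

    # Minimal reverse-dosimetry sketch: estimate the oral dose rate whose
    # steady-state plasma concentration matches an in vitro active
    # concentration (e.g., an AC50). Assumes linear, steady-state kinetics;
    # all parameter values are illustrative.

    def steady_state_css_um(dose_mg_per_kg_day, clearance_l_per_h_kg, mw_g_per_mol):
        """Steady-state plasma concentration (uM) for a constant oral dose rate."""
        dose_mg_per_kg_h = dose_mg_per_kg_day / 24.0      # spread the dose over a day
        css_mg_per_l = dose_mg_per_kg_h / clearance_l_per_h_kg
        return css_mg_per_l / mw_g_per_mol * 1000.0       # mg/L -> umol/L

    def oral_equivalent_dose(ac50_um, clearance_l_per_h_kg, mw_g_per_mol):
        """Oral dose rate (mg/kg/day) at which Css reaches the in vitro AC50."""
        css_per_unit_dose = steady_state_css_um(1.0, clearance_l_per_h_kg, mw_g_per_mol)
        return ac50_um / css_per_unit_dose

    # Hypothetical chemical: AC50 = 3 uM, CL = 0.2 L/h/kg, MW = 250 g/mol
    print(oral_equivalent_dose(3.0, 0.2, 250.0))          # ~3.6 mg/kg/day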
From page 16...
... TABLE 2 Phased Development of ToxCast Program

Phase(a) | Number of Chemicals | Chemical Criteria | Purpose | Number of Assays | Cost per Chemical | Target Date
Ia | 320 | Data-rich (pesticides) | Signature development | >500 | $20,000 | FY 2007-2008
Ib | 15 | Nanomaterials | Pilot | 166 | $10,000 | FY 2009
IIa | >300 | Data-rich chemicals | Validation | >400 | ~$20,000-25,000 | FY 2009
IIb | >100 | Known human toxicants | Extrapolation | >400 | ~$20,000-25,000 | FY 2009
IIc | >300 | Expanded structure and use diversity | Extension | >400 | ~$20,000-25,000 | FY 2010
IId | >12 | Nanomaterials | PMN | >200 | ~$15,000-20,000 | FY 2009-2010
III | Thousands | Data-poor | Prediction and priority-setting | >300 | ~$15,000-20,000 | FY 2011-2012

(a) Since the symposium, phases IIa, IIb, and IIc have been merged into a single endeavor.
From page 17...
... Although the pharmaceutical industry is currently using in vitro assays that are typically functional end-point assays, Pennie noted that there is no reason why those assays could not be supplemented or replaced with pathway-based assays, given a substantial investment in validation. He said that the industry is focusing on using batteries of in vitro assays to predict in vivo outcomes, similar to the ToxCast program, and described an effort at Pfizer to develop a single-assay platform that would evaluate multiple end points simultaneously and provide a single predictive score for hepatic injury.
From page 18...
... If an issue is identified with a chemical, that knowledge can guide the in vivo testing and, instead of a fishing expedition, scientists can test a hypothesis. Pfizer has also developed a multiparameter optimization model that uses six physicochemical properties to characterize permeability, clearance, and safety and that helps to predict the success of a drug candidate.
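Pennie and the summary do not detail the model's internals, but a multiparameter optimization score of this general kind can be sketched as a sum of per-property desirability functions. Everything in the sketch below, including the six property names, their preferred ranges, and the equal weighting, is a hypothetical placeholder rather than Pfizer's actual model.

    # Hypothetical multiparameter optimization (MPO) sketch: map each
    # physicochemical property to a 0-1 desirability and sum the results.
    # Property choices, preferred ranges, and weights are invented.

    def desirability(value, low, high):
        """1.0 inside the preferred [low, high] range, tapering linearly outside."""
        if low <= value <= high:
            return 1.0
        span = high - low
        dist = (low - value) if value < low else (value - high)
        return max(0.0, 1.0 - dist / span)

    # (measured value, preferred low, preferred high) for one candidate compound
    candidate = {
        "logP": (2.8, 1.0, 3.0),
        "logD": (1.5, 0.0, 2.0),
        "mol_weight": (350.0, 200.0, 400.0),
        "tpsa": (75.0, 40.0, 90.0),
        "h_bond_donors": (1.0, 0.0, 2.0),
        "pKa": (8.0, 6.0, 9.0),
    }

    score = sum(desirability(v, lo, hi) for v, lo, hi in candidate.values())
    print(f"MPO score: {score:.2f} of {len(candidate)}")  # higher = more drug-like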
From page 19...
... He noted that especially in toxicology research in the pharmaceutical industry (but also in the food and chemical industry) there is an increasing need for parallel and efficient processes to assess compound classes, more alternatives to animal testing, tiered approaches that link toxicokinetics and toxicodynamics, enhanced use of systems biology in toxicology, and an emphasis on understanding interactions, combined action, and mixtures risk assessment.
From page 20...
... He noted that regulatory toxicology is a business: toxicity testing with animals in the European Union is an $800 million-a-year industry that employs about 15,000 people. The data generated, however, are not always helpful for reaching conclusions about toxicity.
From page 21...
... Hartung said that the key problem for REACH will be the need for reproductive-toxicity testing, which will represent 70% of the costs of REACH and involve more than 80% of the animals that will be used. That problem will mushroom because few facilities are capable of conducting the testing.
From page 22...
... Panel Discussion

The afternoon session closed with a panel discussion that focused on data gaps, pitfalls, and research needs. Kavlock commented that scientists need data to validate the systems, such as data from the pharmaceutical industry, which has extensive human and animal toxicology data on pharmaceutical agents.
From page 23...
... Instead, mode of action describes a series of "key events" that lead to an outcome; key events are measurable effects in experimental studies and can be compared among studies. Bucher stated that toxicity pathways are the contents of the "black boxes" described by the modes of action and that key toxicity pathways will be identified with the help of toxicogenomic data and genetic-association studies that examine relationships between genetic alterations and human diseases.
From page 24...
... Ramos stated that his laboratory has focused on using computational approaches to understand genomic data and construct biologic networks, which will provide clues to BaP toxicity. He interjected that the notion of pathway-based toxicity may be problematic because intersecting pathways all contribute to the ultimate biologic outcome, so the focus should be on understanding networks.
From page 25...
... Ramos concluded that L1 is linked to many human diseases -- such as chronic myeloid leukemia, Duchenne muscular dystrophy, colon cancer, and atherosclerosis -- and that research has shown that environmental agents, such as BaP, regulate the cellular expression of L1 by transcriptional mechanisms, DNA methylation, and histone covalent modifications. Thus, the molecular machinery involved in silencing and reactivating retroelements not only is important in environmental responses but might be playing a prominent role in defining disease outcomes.
From page 26...
... He said that recent research indicates that dioxin causes massive deregulation of homeobox and differentiation genes, so scientists should be critically investigating the developmental outcomes associated with dioxin exposure.

Systems-Level Approaches for Understanding Nanomaterial Biocompatibility

Brian Thrall, of the Pacific Northwest National Laboratory, discussed the challenges in evaluating mode of action and conducting hazard assessment of nanomaterials, and he provided examples of approaches from his laboratory to address the challenges.
From page 27...
... He and co-workers found that the major cellular processes affected by 10-nm and 500-nm silica were identical; none of over 1,000 biologic processes identified was statistically different as a function of particle size. So for amorphous silica, there was no compelling evidence that new biologic processes arise as a function of size at the nanoscale.
From page 28...
... Source: B Thrall, unpublished data, Pacific Northwest National Laboratory, presented at the symposium.
From page 29...
... After internal review within the executive branch, Wiener observed, the next hurdle would be judicial review. He argued that, although courts can be skeptical of new scientific methods in civil tort liability lawsuits, judicial review of agency science may be more deferential, especially when agencies are acting "at the frontiers of science." Several regulatory statutes now call for agencies to use the "best available science" or the "latest scientific knowledge," and a court could be convinced that toxicity-pathway approaches constitute the best and latest science.
From page 30...
... We will need to integrate all the information and use the best interpretive skills and scientific judgment to answer the important questions.

CHALLENGES AND OPPORTUNITIES FOR RISK ASSESSMENT IN THE CHANGING PARADIGM

Dose and Temporal Response

Elaine Faustman, of the University of Washington, opened the afternoon session by discussing datasets and tools available to examine dose and temporal response and what is needed to move forward.
From page 31...
... has been used to identify potential signaling pathways versus single genes significantly changed after exposure. Once biologic processes are linked to pathway changes, one can begin to evaluate deviations from the normal patterns of gene expression that result from chemical exposure (Yu et al.
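The move from significantly changed single genes to significantly affected pathways is usually made with an over-representation test: how surprising is the overlap between the changed genes and a pathway's gene set if genes were picked at random? A hypergeometric sketch of that test follows; the gene counts are invented for illustration.

    # Pathway over-representation sketch: hypergeometric test of the overlap
    # between differentially expressed genes and a pathway gene set.
    from scipy.stats import hypergeom

    def pathway_enrichment_p(genome_size, pathway_size, n_changed, n_overlap):
        """P(overlap >= observed) if the changed genes were drawn at random."""
        return hypergeom.sf(n_overlap - 1, genome_size, pathway_size, n_changed)

    # Invented counts: 20,000 genes measured, a 150-gene pathway, 400 changed
    # genes, 18 of which fall in the pathway (3 expected by chance).
    print(pathway_enrichment_p(20000, 150, 400, 18))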
From page 32...
... The studies found that metals affect genes involved in the Wnt signaling pathway, that multiple transcription-factor families are affected by metals, and that more than 50% of the genes affected were uncharacterized at the pathway level; the latter finding indicates that much work still needs to be done to determine the link between gene changes and pathways and the importance of the gene changes. Faustman closed by listing several needs, including new tools for evaluating quantitative genomic response at multiple levels of biologic organization, kinetic and dynamic models that can allow for integration at various organization levels, better characterization of variability in genomic data, discussion of and consensus on how responses and changes in responses should be considered for effect-level assessment, and discussion of approaches to evaluate responses to early low-dose exposures vs responses at increasing complexity and decreased specificity.
From page 33...
... model to try to normalize the doses so that mice and rats received about the same internal dose. The benchmark-dose (BMD) analysis of the genomic data indicated that at 5 days glutathione metabolism was perturbed and at 15 days DNA repair genes
From page 34...
... Thomas concluded that pathway-based transcriptomic dose-response data can provide insights into the mode of action. In the third example, Thomas provided suggestions for using genomics data to conduct risk assessments.
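The mechanics behind a transcriptomic benchmark-dose estimate can be reduced to two steps: fit a dose-response model to a gene- or pathway-level response, then solve for the dose at which the fitted curve departs from control by a defined benchmark amount. The sketch below illustrates this with a Hill model on invented data; it is not the analysis Thomas presented, and the benchmark response used (control plus a fixed offset) is a simplified stand-in for the usual control-variance-based definition.

    # Benchmark-dose (BMD) sketch: fit a Hill model to dose-response data,
    # then invert it at a chosen benchmark response. Data are invented.
    import numpy as np
    from scipy.optimize import curve_fit, brentq

    def hill(dose, bottom, top, ec50, n):
        """Four-parameter Hill dose-response model."""
        return bottom + (top - bottom) * dose**n / (ec50**n + dose**n)

    doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0, 100.0])
    resp = np.array([1.00, 1.02, 1.11, 1.45, 1.82, 1.96])  # pathway-level score

    (bottom, top, ec50, n), _ = curve_fit(
        hill, doses, resp, p0=[1.0, 2.0, 10.0, 1.5],
        bounds=([0.5, 1.0, 0.1, 0.5], [1.5, 3.0, 1000.0, 4.0]))

    bmr = resp[0] + 0.10                 # benchmark response above control
    bmd = brentq(lambda d: hill(d, bottom, top, ec50, n) - bmr, 1e-6, doses.max())
    print(f"BMD ~ {bmd:.1f} (dose units)")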
From page 35...
... Kedderis was optimistic, however, about the promises of the new science and technologies and concluded that physiologically based pharmacokinetic (PBPK) models will be able to augment the interpretation of –omic dose-response data and ultimately to provide information on physiologic variability, bioactivation variability, and response variability in the population.

Modular Network Modeling of Toxicity Pathways for Extrapolation

Katrina Waters, of Pacific Northwest National Laboratory, described her laboratory research on –omics approaches to modeling of human disease states.
From page 36...
... Reprinted with permission; copyright 2010, Pacific Northwest National Laboratory.
From page 37...
... Furthermore, cells do not respond in isolation in tissues; thus, Waters stated, models must account for paracrine, neurologic, and other physiologic interactions to extrapolate from in vitro to in vivo systems accurately. She concluded by noting several needs to advance the toxicity-pathway approach for risk assessment, including biologically based models that incorporate epigenetic, proteomic, metabolomic, and post-translational modification data better; improved understanding of the relationships between toxicity pathways and toxicity outcomes; criteria for defining appropriate in vitro systems that represent in vivo toxicity sufficiently; and improved understanding of dosimetry and temporal differences between in vitro and in vivo systems.
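A recurring thread in both Kedderis's and Waters's remarks is that kinetic models supply the dosimetry link between nominal in vitro concentrations and time-varying in vivo tissue doses. As a flavor of the machinery, the sketch below integrates a deliberately reduced one-compartment model with first-order absorption and elimination; a real PBPK model adds flow-limited tissue compartments, partition coefficients, and metabolism, and every parameter value here is invented.

    # Reduced kinetic sketch (not a full PBPK model): one compartment with
    # first-order absorption from the gut and first-order elimination.
    from scipy.integrate import solve_ivp

    def one_compartment(t, y, ka, ke, vd):
        """y = [amount in gut (mg/kg), plasma concentration (mg/L)]."""
        gut, plasma = y
        return [-ka * gut, ka * gut / vd - ke * plasma]

    dose_mg_per_kg = 10.0                      # invented bolus oral dose
    sol = solve_ivp(one_compartment, [0.0, 48.0], [dose_mg_per_kg, 0.0],
                    args=(1.2, 0.15, 5.0),     # ka (1/h), ke (1/h), Vd (L/kg)
                    dense_output=True)

    for t in (1.0, 6.0, 24.0, 48.0):
        print(f"t = {t:4.1f} h   C_plasma = {sol.sol(t)[1]:.3f} mg/L")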
From page 38...
... The Food and Drug Administration Experience in Analyzing Genomic Data

Federico Goodsaid, of the FDA Office of Clinical Pharmacology, described
From page 39...
... He mentioned that FDA has entered into a cooperative research and development agreement with one company to provide software offering several methods for evaluating lists of differentially expressed genes. Goodsaid concluded by saying that FDA has been able to draft guidance for pharmacogenomic-data submissions on the basis of its experience with the voluntary data submissions and recommendations from sponsors, the Microarray Quality Control Consortium, other interested parties, and public forums.
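The summary does not specify the software's methods, but the core computation in evaluating a differentially-expressed-gene list is routine: a per-gene test, a multiple-testing correction, and a fold-change filter. The sketch below shows one common variant, a Welch t-test with Benjamini-Hochberg false-discovery-rate control, run on simulated data.

    # Differential-expression sketch: Welch t-test per gene, Benjamini-
    # Hochberg FDR adjustment, and an absolute fold-change filter.
    import numpy as np
    from scipy import stats

    def bh_adjust(p):
        """Benjamini-Hochberg step-up FDR adjustment."""
        p = np.asarray(p)
        order = np.argsort(p)
        adj = p[order] * len(p) / (np.arange(len(p)) + 1)
        adj = np.minimum.accumulate(adj[::-1])[::-1]
        out = np.empty_like(adj)
        out[order] = np.clip(adj, 0.0, 1.0)
        return out

    def differentially_expressed(treated, control, fc_cutoff=1.5, fdr=0.05):
        """Flag genes (rows) by FDR and absolute log2 fold change."""
        _, p = stats.ttest_ind(treated, control, axis=1, equal_var=False)
        log2fc = treated.mean(axis=1) - control.mean(axis=1)
        return (bh_adjust(p) <= fdr) & (np.abs(log2fc) >= np.log2(fc_cutoff))

    rng = np.random.default_rng(0)             # simulated log2 expression
    control = rng.normal(8.0, 0.3, size=(1000, 4))
    treated = control + rng.normal(0.0, 0.3, size=(1000, 4))
    treated[:25] += 1.0                        # spike in 25 true changes
    print(differentially_expressed(treated, control).sum())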
From page 40...
... Goodsaid added that he would be hesitant to label a pathway as a toxicity pathway in isolation. For FDA, pathway information helps the agency to make regulatory decisions; mechanistic data allow the agency to interpret other test data.
From page 41...
... government is ill prepared to use the volume and complexity of information resulting from that or a similar program. Anticipating the need for change, EPA sponsored several NRC reports over the last few years that focused on toxicity testing and risk assessment (for example, NRC 2007a, 2008, 2009)
From page 42...
... The next step is to use toxicogenomic and proteomic databases on well-studied chemicals to link chemicals to diseases through pathways and then to analyze the toxicity pathways to find the best points for screening, such as critical nodes or connection points.
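Picking critical nodes or connection points becomes a graph problem once a pathway is represented as a network: centrality measures flag the nodes through which the most signaling routes pass, which are natural candidates for screening assays. The toy sketch below applies betweenness centrality to an invented graph; the node names are placeholders, not a curated pathway.

    # Toy critical-node sketch: rank pathway nodes by betweenness centrality.
    import networkx as nx

    # Invented signaling topology: a stressor converges through a sensor and
    # two kinases onto one transcription factor driving several target genes.
    g = nx.Graph([
        ("stressor", "sensor"), ("sensor", "kinase_A"), ("sensor", "kinase_B"),
        ("kinase_A", "tf"), ("kinase_B", "tf"),
        ("tf", "target_gene_1"), ("tf", "target_gene_2"), ("tf", "target_gene_3"),
    ])

    centrality = nx.betweenness_centrality(g)
    for node, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
        print(node, round(score, 3))           # "tf" and "sensor" rank highest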
From page 43...
... Information on host susceptibility and background exposures will be needed for interpretation and extrapolation of in vitro test results. Furthermore, information on actual human exposure will be needed for selection of doses for toxicity testing so that hazard information can be developed on environmentally relevant effects and for determination of whether concentrations that perturb
From page 44...
... The key difficulties that the pharmaceutical industry faces are late-stage failures and product withdrawals, which are extremely expensive and reduce the ability to reinvest in research; the erosion of consumer confidence in and the increased consumer expectations for product safety; the paucity of new products; and the shift of risk and product development from large pharmaceutical companies to small ones. To overcome the difficulties, Ulrich stated, the industry must focus on the pipeline and do a better job of assessing products, and this requires more thorough preclinical assessment of toxicity and more research on mechanisms and affected biologic processes or pathways.
From page 45...
... That committee recommended a human toxicogenomics initiative to accomplish the following tasks: create and manage a large public database for storing and integrating the results of toxicogenomic analysis with conventional toxicity-testing data; assemble toxicogenomic and conventional toxicologic data on hundreds of compounds into a single database; create a centralized national biorepository for human clinical and epidemiologic samples; further develop bioinformatic tools, such as software and statistical-analysis tools; consider the ethical, legal, and social implications of collecting and using toxicogenomic data and samples; and coordinate subinitiatives to evaluate the application of toxicogenomic technologies to the assessment of risks associated with chemical exposures. Zarbl concluded by discussing the path forward and stated that improvements in technology and science often build on previous knowledge and that scientists should not abandon the tools and knowledge of classical toxicology and risk assessment.
From page 46...
... The NRC report Toxicity Testing in the 21st Century offered the hope of rapid, cost-effective, animal-sparing testing of thousands of chemicals and a chance to intervene to prevent pathway perturbations before the occurrence of disease. Many data have been generated since publication of that report, but few of them have been incorporated or used in risk assessment, and the backlog of 80,000 chemicals remains.
From page 47...
... The question arose about what must be accomplished in the near term to illustrate the value of the new science. Bucher commented that the results of high-throughput screening should be used to design and guide current toxicity testing.
From page 48...
... 2004. Human carcinogenic risk evaluation, Part V: The National Toxicology Program vision for assessing the human carcinogenic hazard of chemicals.
From page 49...
... 2007a. Toxicity Testing in the 21st Century: A Vision and a Strategy.

