Suggested Citation:"2 Vision." National Research Council. 2007. Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: The National Academies Press. doi: 10.17226/11970.

2 Vision

Make no little plans. They have no magic to stir men’s blood and probably themselves will not be realized. Make big plans; aim high in hope and work, remembering that a noble, logical diagram once recorded will never die, but long after we are gone will be a living thing, asserting itself with ever-growing insistency.

Daniel Hudson Burnham, Architect
Designer of the 1893 Chicago World’s Fair

The goal of toxicity testing is to develop data that can ensure appropriate protection of public health from the adverse effects of exposures to environmental agents. Current approaches to toxicity testing rely primarily on observing adverse biologic responses in homogeneous groups of animals exposed to high doses of a test agent. However, the relevance of such animal studies for the assessment of risks to heterogeneous human populations exposed at much lower concentrations has been questioned. Moreover, the studies are expensive and time-consuming and can use large numbers of animals, so only a small proportion of chemicals have been evaluated with these methods. Adequate coverage of different life stages, of end points of public concern, such as developmental neurotoxicity, and of mixtures of environmental agents is a continuing concern. Current tests also provide little information on modes and mechanisms of action, which are critical for understanding interspecies differences in toxicity, and little or no information for assessing variability in human susceptibility. Thus, the committee looked to recent scientific advances to provide a new approach to toxicity testing.

A revolution is taking place in biology. At its center is the progress being made in the elucidation of cellular-response networks. Those networks are interconnected pathways composed of complex biochemical interactions of genes, proteins, and small molecules that maintain normal cellular function, control communication between cells, and allow cells to adapt to changes in their environment. A familiar cellular-response network is signaling by estrogens, in which initial exposure results in enhanced cell proliferation and growth of specific tissues or in proliferation of estrogen-sensitive cells in culture (Frasor et al. 2003). In that type of network, initial interactions between a signaling molecule and various cellular receptors result in a cascade of early, midterm, and late responses to achieve a coordinated response that orchestrates normal physiologic functions (Landers and Spelsberg 1992; Thummel 2002; Rochette-Egly 2003).

Bioscience is rapidly enhancing our knowledge of cellular-response networks and allowing scientists to begin to uncover the manner in which environmental agents perturb pathways to cause toxicity. Pathways that can lead to adverse health effects when sufficiently perturbed are termed toxicity pathways. Responses of cells to oxidative stress caused by exposure to diesel exhaust particles (DEP) constitute an example of toxicity pathways within a cellular-response network (Xiao et al. 2003). In a dose-related fashion, in vitro exposures to DEP lead to activation of a hierarchic set of pathways. First, cell antioxidant signaling is increased. As the dose increases, inflammatory signaling is enhanced; finally, at higher doses, there is activation of cell-death (apoptosis) pathways (Nel et al. 2006). Thus, in the cellular-response network dealing with oxidative stress, the antioxidant pathways activated by DEP are normal adaptive signaling pathways that assist in maintaining homeostasis; however, they are also toxicity pathways in that they lead to adverse effects when oxidant exposure is sufficiently high. The committee capitalizes on the recent advances in elucidating and understanding toxicity pathways and proposes a new approach to toxicity testing based on them.

New investigative tools are providing knowledge about biologic processes and functions at an astonishing rate. In vitro tests that evaluate activity in toxicity pathways are elucidating the modes and mechanisms of action of toxic substances. Quantitative high-throughput assays can be used to expand the coverage of the universe of new and existing chemicals that need to be evaluated for human health risk assessment (Roberts 2001; Inglese 2002; Inglese et al. 2006; Haney et al. 2006). The new assays can also generate enhanced information on dose-response relationships over a much wider range of concentrations, including those representative of human exposure. Pharmacokinetic and pharmacodynamic models promise to provide more accurate extrapolation of tissue dosimetry linked to cellular and molecular end points. The application of toxicogenomic technologies and systems-biology evaluation of signaling networks will permit genomewide scans for genetic and epigenetic perturbations of toxicity pathways. Thus, changes in toxicity pathways, rather than apical end points from whole-animal tests, are envisioned as the basis of a new toxicity-testing paradigm for managing the risks posed by environmental agents.

This chapter provides an overview of the committee’s vision but first discusses the limitations of current toxicity-testing strategies, the design goals for a new system, and the options that the committee considered. Key terms used throughout this report are listed and defined in Box 2-1.

BOX 2-1 Key Terms Used in the Report

• Apical end point. An observable outcome in a whole organism, such as a clinical sign or pathologic state, that is indicative of a disease state that can result from exposure to a toxicant.
• Cellular-response network. Interconnected pathways composed of the complex biochemical interactions of genes, proteins, and small molecules that maintain normal cellular function, control communication between cells, and allow cells to adapt to changes in their environment.
• High-throughput assays. Efficiently designed experiments that can be automated and rapidly performed to measure the effect of substances on a biologic process of interest. These assays can evaluate hundreds to many thousands of chemicals over a wide concentration range to identify chemical actions on gene, pathway, and cell function.
• Mechanism of action. A detailed description, often at the molecular level, of the means by which an agent causes a disease state or other adverse effect.
• Medium-throughput assays. Assays that can be used to test large numbers of chemicals for their ability to perturb more integrated cellular responses, such as cell proliferation, apoptosis, and mutation. Because of assay complexity, fewer agents can be evaluated in the same period than with high-throughput assays.
• Mode of action. A description of the key events or processes by which an agent causes a disease state or other adverse effect.
• Systems biology. The study of all elements in a biologic system and their interrelationships in response to exogenous perturbation (Stephens and Rung 2006).
• Toxicity pathway. A cellular-response pathway that, when sufficiently perturbed in an intact animal, is expected to result in adverse health effects (see Figure 2-2).

LIMITATIONS OF CURRENT TESTING STRATEGIES

The exposure-response continuum shown in Figure 2-1 effectively represents the current approach to toxicologic risk assessment. It focuses primarily on adverse health outcomes as the end points for assessing the risk posed by environmental agents and establishing human exposure guidelines. Although intermediate biologic changes and mechanisms of action are considered in the paradigm, they are viewed as steps along the pathway to the ultimate induction of an adverse health outcome.

FIGURE 2-1 The exposure-response continuum underlying the current paradigm for toxicity testing. (Diagram elements: exposure, tissue dose, biologically effective dose, early responses, late responses, pathology.)

Traditional toxicity-testing strategies undertaken in the context of the above paradigm have evolved and expanded over the last few decades to reflect increasing concern about a wider variety of toxic responses, such as subtle neurotoxic effects and adverse immunologic changes. The current system, which relies primarily on a complex set of whole-animal-based toxicity-testing strategies for hazard identification and dose-response assessment, has difficulty in addressing the wide variety of challenges that toxicity testing must meet today. Toxicity testing is under increasing pressure to meet several competing demands:

• Test large numbers of existing chemicals, many of which lack basic toxicity data.
• Test the large number of new chemicals and novel materials, such as nanomaterials, introduced into commerce each year.
• Evaluate potential adverse effects with respect to all critical end points and life stages.
• Evaluate potential toxicity in the most vulnerable members of the human population.
• Minimize animal use.
• Reduce the cost and time required for chemical safety evaluation.
• Acquire the detailed mechanistic and tissue-dosimetry data needed to assess human risk quantitatively and to aid in regulatory decision-making.

The current approach relies primarily on in vivo mammalian toxicity testing and is unable to meet those competing demands adequately. In 1979, about 62,000 chemicals were in commerce (GAO 2005). Today, there are about 82,000, and about 700 are introduced each year (GAO 2005). The large volume of new and existing chemicals in commerce is not being fully assessed (see the committee’s interim report, NRC 2006). One reason for the testing gaps is that current testing is so time-consuming and resource-intensive. Furthermore, only limited mechanistic information is routinely developed to understand how most chemicals are expected to produce adverse health effects in humans. Those deficiencies limit the ability to predict toxicity in human populations that are typically exposed to much lower doses than those used in whole-animal studies. They also limit the ability to develop predictions about similar chemicals that have not been tested. The following sections describe several limitations of the current system and how a system based on toxicity pathways would help to address them.

Low-Dose Extrapolation from High-Dose Data

Traditional toxicity testing has relied on administering high doses to animals of nearly identical susceptibility to generate data for identifying critical end points for risk assessment. Historically, exposing animals to high doses was justified by the need for sufficient statistical power to observe high incidences of toxic responses in small test populations with relatively short exposures.

In many cases, daily doses in animal toxicity tests are orders of magnitude greater than those expected in human exposures. Thus, the use of high-dose animal toxicity tests for predicting risks of specific apical human end points has remained challenging and controversial. Inferring effects at lower doses is difficult because of inherent uncertainty in the nature of dose-response relationships. Effects at high doses may result from metabolic processes that contribute negligibly at lower doses or may arise from biologic processes that do not occur with treatment at lower doses. In contrast, high doses may cause overt toxic responses that preclude the detection of biologic interactions between the chemical and various signaling pathways that lead to subtle but important adverse effects. The vision proposed in this report offers the potential to obtain direct information on toxic effects at exposures more relevant to those experienced by human populations.

Animal-to-Human Extrapolation

Other concerns arise about the relationship between the biology of the test species and that of the heterogeneous human population. Animals have served as models of human response for decades because the biology of test animals is, in general, similar to that of humans (NRC 1977). Although that generality holds true, there are several examples of idiosyncratic responses in which a chemical has a specific toxic effect in humans but not in a test species, or vice versa. A classic example is thalidomide: rats are resistant, and human fetuses are sensitive. The committee envisions a future in which tests based on human cell systems can serve as better models of human biologic responses than apical studies in other species. The committee therefore believes that, given a sufficient research and development effort, human cell systems have the potential to largely supplant testing in animals.

Mixtures

Current toxicity-testing approaches have been criticized because of their failure to consider the co-exposures that commonly occur in human populations. Because animal toxicity tests are time-consuming and resource-intensive and result in the sacrifice of animals, it is difficult to use them for substantial testing of chemical mixtures (NRC 1988; Feron et al. 1995; Cassee et al. 1998; Lydy et al. 2004; Bakand et al. 2005; Pauluhn 2005; Teuschler et al. 2005). Furthermore, without information on how chemicals exert their biologic effects, testing of mixtures is a daunting task. For example, testing of mixtures in animal assays could involve huge numbers of combinations of chemicals and the use of substantial resources in an effort of uncertain value. In contrast, testing based on toxicity pathways could allow grouping of chemicals according to their effects on key biologic pathways. Combinations of chemicals that interact with the same toxicity pathway could be tested over broad dose ranges much more rapidly and inexpensively. The resulting data could allow an intelligent and focused approach to the problem of assessing risk in human populations exposed to mixtures.

DESIGN CRITERIA FOR A NEW TOXICITY-TESTING PARADIGM

The committee discussed the design criteria that should be considered in developing a strategy for toxicity testing in the future. As discussed in the committee’s interim report (NRC 2006), which did much to frame those criteria, the goal is to improve toxicity testing by accomplishing the following objectives:

• Provide broader coverage of chemicals and their mixtures, end points, and life-stage vulnerabilities.

• Reduce the cost and time of testing, increase efficiency and flexibility, and make it possible to reach decisions more quickly.
• Use fewer animals and cause minimal suffering to animals that are used.
• Develop a more robust scientific basis for risk assessment by providing detailed mechanistic and dosimetry information and by encouraging the integration of toxicologic and population-based data.

The committee considered those objectives as it weighed various options. The following section discusses some of the options considered by the committee.

OPTIONS FOR A NEW TOXICITY-TESTING PARADIGM

In developing its vision for toxicity testing, the committee explored four options, as presented in Table 2-1. The baseline option (Option I) applies current toxicity-testing principles and practices. Accordingly, it would use primarily in vivo animal toxicity tests to predict human health risks. The difficulties in interpreting animal data obtained at high doses with respect to risks in the heterogeneous human population would not be circumvented. Moreover, because whole-animal testing is expensive and time-consuming, the number of chemicals addressed would continue to be small. The continued use of relatively large numbers of animals for toxicity testing also raises ethical issues and is inconsistent with the emphasis on reduction, replacement, and refinement of animal use (Russell and Burch 1959). Overall, the current approach does not provide an adequate balance among the four objectives of toxicity testing identified in the committee’s interim report: depth of testing, breadth of testing, animal welfare, and conservation of testing resources.

The committee extensively considered the expanded use of tiered testing (Option II) to alleviate some of the concerns with present practice. The tiered approach to toxicity testing entails a stepwise process for screening and evaluating the toxicity of agents that still relies primarily on test results in whole animals. The goal of tiered testing is to generate pertinent data for more efficient assessment of the potential health risks posed by an environmental agent, taking into consideration available knowledge on the chemical and its class, its modes or mechanisms of action, and its intended use and estimated exposures (Carmichael et al. 2006). Those factors are used to refine testing priorities to focus first on areas of greatest concern in early tiers and then to move judiciously to advanced testing in later tiers as needed. In addition, an emphasis on pharmacokinetic studies in tiered approaches has been considered in recent discussions of improving toxicity testing of pesticides (Carmichael et al. 2006; Doe et al. 2006).

TABLE 2-1 Options for Future Toxicity-Testing Strategies

Option I, In Vivo: animal biology; high doses; low throughput; expensive; time-consuming; use of relatively large numbers of animals; based on apical end points; some screening using computational approaches.

Option II, Tiered In Vivo: animal biology; high doses; improved throughput; less expensive; less time-consuming; use of fewer animals; based on apical end points; screening using computational and in vitro approaches.

Option III, In Vitro and In Vivo: primarily human biology; broad range of doses; high and medium throughput; less expensive; less time-consuming; use of substantially fewer animals; based on perturbations of critical cellular responses; screening using computational approaches possible; limited animal studies that focus on mechanism and metabolism.

Option IV, In Vitro: primarily human biology; broad range of doses; high throughput; less expensive; less time-consuming; use of virtually no animals; based on perturbations of critical cellular responses; screening using computational approaches; more flexibility than current methods.

Tiered testing has been recommended in evaluating the toxicity of agricultural products (Doe et al. 2006), in screening for endocrine disruptors (Charles 2004), and in assessing the developmental toxicity (Spielman 2005) and carcinogenicity (Stavanja et al. 2006) of chemicals and products. A tiered-testing approach also has the promise of including comparative genomic studies to help to identify genes, transcription-factor motifs, and other putative control regions that are involved in tissue responses (Ptacek and Sell 2005). The increasing complexity of biologic information, including genomic, proteomic, and cell-signaling information, has encouraged the use of a more systematic multilevel approach in toxicity screening (Yokota et al. 2004).

The systematic development of tiered, decision-tree selection of more limited suites of animal tests could conceivably provide toxicity-testing data nearly equivalent to those currently obtained but without the need to conduct tests for as many apical end points. The use of appropriately chosen computational models and in vitro screens might also permit sound risk-management decisions in some cases without the need for in vivo testing. Both types of tiered-testing strategies offer the potential of reducing animal use and toxicity-testing costs and of allowing flexibility in testing based on risk-management information needs. Although the committee recognized the potential for incremental improvement in toxicity testing through a tiered approach, Option II still represents only a small step in improving coverage, reducing costs and animal use, and increasing mechanistic information in risk assessment. It still relies on whole-animal testing and is geared mainly toward deciding which animal tests are required in risk assessment for any specific agent. Although tiered testing might be pursued more formally in a transition to a more comprehensive toxicity-testing strategy, it does not meet most of the design criteria discussed earlier.

In the committee’s view, a more transformative paradigm shift, represented by Options III and IV in Table 2-1, is needed to achieve the objectives for toxicity testing set out in its interim report. The committee’s vision is built on the identification of biologic perturbations of toxicity pathways that can lead to adverse health outcomes under conditions of human exposure. The use of a comprehensive array of in vitro tests to identify relevant biologic perturbations with cellular and molecular systems based on human biology could eventually eliminate the need for whole-animal testing and provide a stronger, mechanistically based approach for environmental decision-making. Computational models could also play a role in the early identification of environmental agents potentially harmful to humans, although further testing would probably be needed. This new approach would be less expensive and less time-consuming than the current approach and would result in much higher throughput. Although reliance on in vitro results lacks the whole-organism integration provided by current tests, toxicologic assessments would be based on biologic perturbations of toxicity pathways that can reasonably be expected to lead to adverse health effects. Understanding of the role of such perturbations in the induction of toxic responses would be refined through toxicologic research.
With the further development of in vitro test systems for toxicity pathways and of tools for assessing the dose-response characteristics of the perturbations, the committee believes that its vision for toxicity testing will meet the four objectives set out in its interim report.

Full implementation of the high-throughput, fully human-cell-based testing scheme represented by Option IV in Table 2-1 would face a number of scientific challenges. Major concerns relate to ensuring adequate testing of metabolites and to the potential difficulties of evaluating novel chemicals, such as nanomaterials and biotechnology products, with in vitro tests. Those challenges require the maintenance of some whole-animal tests into the foreseeable future, as indicated in Option III, which includes specific in vivo studies to assess the formation of metabolites and some mechanistic studies of target-organ responses to environmental agents and which leaves open the possibility that more extensive in vivo toxicity evaluations of new classes of agents will be needed. Like Option IV, Option III emphasizes the development and application of new in vitro assays for biologic perturbations of toxicity pathways. Thus, although the committee notes that Option IV embodies the ultimate goal for toxicity testing, the committee’s vision for the next 10-20 years is defined by Option III.

The committee is mindful of the methodologic developments that will be required to orchestrate the transition from current practices toward its vision. During the transition period, there will be a need to continue the use of many current test procedures, including whole-animal tests, as the tools needed to implement the committee’s vision fully are developed. The steps that need to be taken to achieve the committee’s vision are discussed further in Chapter 5.

The committee notes that European approaches to improving toxicity testing emphasize the replacement of animal tests with in vitro methods (Gennari et al. 2004). However, a major goal of the European approaches is to develop in vitro batteries that can predict the outcome of high-dose testing in animals. The committee distinguishes those in vitro tests from the ones noted in Options III and IV. The in vitro studies in the committee’s vision promise to provide more mechanistic information and to allow more extensive and more rapid determinations of biologic perturbations that are directly relevant to human biology and exposures.

OVERVIEW OF THE COMMITTEE’S LONG-RANGE VISION FOR TOXICITY TESTING

The framework outlined in Figure 2-2 forms the basis of the committee’s vision for toxicity testing in the 21st century. The figure indicates that the initial perturbations of cell-signaling motifs, genetic circuits, and cellular-response networks are obligatory changes related to chemical exposure that might eventually result in disease. The consequences of a biologic perturbation depend on its magnitude, which is related to the dose, on the timing and duration of the perturbation, and on the susceptibility of the host. Accordingly, at low doses, many biologic systems may function normally within their homeostatic limits. At somewhat higher doses, clear biologic responses occur; they may be handled successfully with adaptation, although some susceptible people may respond. A more intense or persistent perturbation may overwhelm the capacity of the system to adapt and lead to tissue injury and possibly to adverse health effects.

In this framework, the goals of toxicity testing are to identify the critical pathways that, when perturbed, can lead to adverse health outcomes and to evaluate host susceptibility to understand the effects of perturbations on human populations. To implement the new toxicity-testing approach, toxicologists will need to develop a comprehensive array of test procedures that allow the reliable identification of important biologic perturbations in key toxicity pathways, and epidemiologists and toxicologists will need to develop approaches to understand the range of host susceptibility within populations. Viewing toxic responses in that manner shifts the focus away from the apical end points emphasized in the traditional toxicity-testing paradigm, toward biologic perturbations that can be identified more efficiently without the need for whole-animal testing, and toward characterizing host vulnerability to provide the context for assessing the implications of test results.
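The graded regimes in this framework (function within homeostatic limits, adaptive stress response, tissue injury) can be summarized in a minimal sketch. The function name and the threshold values below are illustrative placeholders, not quantities from the report; real regime boundaries would differ by pathway and would have to be established experimentally:

```python
def response_regime(perturbation, adaptive_limit=0.2, injury_limit=0.6):
    """Classify the consequence of a toxicity-pathway perturbation by its
    magnitude (0 = none, 1 = maximal). The limits are hypothetical and
    depend on the pathway and on host susceptibility."""
    if perturbation < adaptive_limit:
        return "within homeostatic limits"
    if perturbation < injury_limit:
        return "adaptive stress response"
    return "tissue injury; possible adverse health effect"

# Host susceptibility (nutritional, genetic, disease, or life-stage status)
# can be represented by lowering the limits, so the same perturbation is
# classified as more severe in a susceptible host.
print(response_regime(0.4))            # moderate perturbation, typical host
print(response_regime(0.4, 0.1, 0.3))  # same perturbation, susceptible host
```

The point of the sketch is only that severity is a joint function of perturbation magnitude and host capacity to adapt, which is exactly the dependency Figure 2-2 depicts.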

Vision 49 Exposure Tissue Dose Biologic Interaction Perturbation Biologic Normal Inputs Biologic Function Early Cellular Changes Adaptive Stress Cell Responses Injury Morbidity and Mortality FIGURE 2-2 Biologic responses viewed as results of an intersection of exposure and biologic function. The intersection leads to perturbation of biologic pathways. When perturbations are sufficiently large or when the host is unable to adapt because of underlying nutritional, genetic, disease, or life-stage status, biologic function is compromised, and this leads to toxicity and disease. Source: Adapted from Andersen et al. 2005. Reprinted with permission; copyright 2005, Trends in Biotechnology. Figure 2-3 illustrates the major components of the commit- tee’s proposed vision: chemical characterization, toxicity testing, and dose-response and extrapolation modeling. Each component is discussed in further detail in Chapter 3, and the tools and tech- nologies that might play some role in the future paradigm are dis- cussed in Chapter 4. Chemical characterization involves consideration of physico- chemical properties, environmental persistence, bioaccumulation potential, production volumes, concentration in environmental media, and exposure data. Computational tools, such as quantita- tive structure-activity relationship models and bioinformatics,

may eventually be used to categorize chemicals, predict likely toxicity and metabolic pathways, screen for relative potency with predictive models, and organize large databases for analysis and hypothesis generation.

FIGURE 2-3 The committee's vision is a process that includes chemical characterization, toxicity testing, and dose-response and extrapolation modeling. At each step, population-based data and human exposure information are considered, as is the question of what data are needed for decision-making.

Toxicity testing in the committee's vision seeks to identify the perturbations in toxicity pathways that are expected to lead to adverse effects. The focus on biologic perturbations rather than apical end points is fundamental to the committee's vision. If adopted, the vision will lead to a major shift in emphasis away from whole-animal testing toward efficient in vitro tests and greater human surveillance. Targeted testing is also used to identify or explore functional end points associated with adverse health outcomes and may include in vivo metabolic or mechanistic studies.
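As one concrete illustration of the computational tools mentioned under chemical characterization, a quantitative structure-activity relationship model at its simplest is a regression from molecular descriptors to a measured activity. The sketch below fits a toy linear QSAR by least squares; every descriptor, potency value, and chemical name is invented for the example, and a real model would use curated descriptors, a far larger training set, and formal validation.

```python
import numpy as np

# Hypothetical training set: each row gives two molecular descriptors
# (say, logP and molecular weight / 100) for a chemical; y is its
# measured log potency. All numbers are invented for illustration.
X = np.array([[1.2, 1.8], [2.5, 2.3], [3.1, 3.0], [0.8, 1.1]])
y = np.array([0.9, 1.8, 2.4, 0.5])

# Fit a linear QSAR model (descriptor weights plus intercept).
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_potency(descriptors):
    """Predicted log potency for a new chemical's descriptor vector."""
    return float(np.dot(descriptors, coef[:2]) + coef[2])

def rank_candidates(candidates):
    """Rank (name, descriptors) pairs by predicted potency, highest first,
    e.g. to prioritize chemicals for further testing."""
    return sorted(candidates, key=lambda c: predict_potency(c[1]),
                  reverse=True)
```

Used this way, the model supports the screening and prioritization role described above: chemicals predicted to be more potent can be routed into toxicity testing first.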

Dose-response modeling is used to describe the relationship between biologic perturbations and dose in quantitative and, optimally, mechanistic terms; extrapolation modeling is used to predict possible effects in human populations at prevailing environmental exposure concentrations. Computational modeling of the toxicity pathways evaluated with specific high-throughput tests will be a key tool for establishing dose-response relationships. Pharmacokinetic models, such as physiologically based pharmacokinetic models, will assist in extrapolating from in vitro to in vivo conditions by relating concentrations active in in vitro toxicity-test systems to human blood concentrations.

At each step, population-based data and human-exposure information should be considered. For example, human biomonitoring and surveillance can provide data on exposure to environmental agents, host susceptibility, and biologic change that will be key for dose-response and extrapolation modeling. Throughout, the information needs for risk-management decision-making must be borne in mind because they will to a great extent guide the nature of the testing required. Thus, the population-based data and exposure information and the risk contexts are shown to encircle the core toxicity-testing strategy in Figure 2-3.

The components of the toxicity-testing paradigm are semiautonomous but interrelated modules, each containing specific sets of underlying technologies and capabilities. Some chemical evaluations may proceed stepwise from chemical characterization to toxicity testing to dose-response and extrapolation modeling, but that sequence might not always be followed. A critical feature of the new vision is consideration of risk context at each step and the ability to exit the strategy at any point, whenever enough data have been generated to inform the decision that needs to be made.
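The in vitro-to-in vivo extrapolation step can be illustrated with the simplest possible reverse-dosimetry calculation: a one-compartment pharmacokinetic model at steady state, where the steady-state blood concentration equals the dose rate divided by total clearance. This is a sketch of the general idea only, not the committee's (or any validated) PBPK model; the unit conventions and the assumption of complete absorption are simplifications made for the example.

```python
def oral_equivalent_dose(ac50_uM, mol_weight_g_mol, clearance_L_per_kg_day):
    """Reverse dosimetry under a one-compartment, steady-state model.

    Converts an in vitro active concentration (AC50, in micromolar)
    into the constant oral dose rate (mg/kg/day) that would produce
    the same steady-state blood concentration, assuming complete
    absorption and linear kinetics:

        C_ss = dose_rate / CL   =>   dose_rate = C_ss * CL
    """
    # Convert micromolar to mg/L: umol/L * g/mol = ug/L, then / 1000.
    css_mg_per_L = ac50_uM * mol_weight_g_mol / 1000.0
    return css_mg_per_L * clearance_L_per_kg_day
```

For example, for a hypothetical chemical with molecular weight 200 g/mol, an in vitro AC50 of 1 micromolar and a clearance of 5 L/kg/day correspond to an oral equivalent dose of 1 mg/kg/day, which could then be compared with exposure estimates for the population of interest.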
Also, the proposed vision emphasizes the generation and use of population-based data and exposure estimates when possible. The committee notes that the development of surveillance systems for

chemicals newly introduced into the market will be important. The new vision encourages the collection of such data on important existing chemicals from biomonitoring, surveillance, and molecular epidemiologic studies.

Finally, flexibility is needed in the testing of environmental agents to encourage the development and application of novel tools and approaches. The evolution of the toxicity-testing process, as envisioned here, must retain flexibility to encourage incorporation of new information and new methods as they are developed and found to be useful for evaluating whether a given exposure poses a risk to humans. That will require formal procedures for phasing in or phasing out standard testing methods. Indeed, that process is attuned to the need for efficient testing of all chemicals in a timely, cost-effective fashion.

The committee envisions a reconfiguration of toxicity testing through the development of in vitro medium- and high-throughput assays. The in vitro tests would be developed not to predict the results of current apical toxicity tests but rather as cell-based assays that are informative about mechanistic responses of human tissues to toxic chemicals.

The committee is aware of the implementation challenges that the new toxicity-testing paradigm would face. For example, toxicity testing must be able to address the potential adverse health effects of chemicals in the environment and of the metabolites formed when the chemicals enter the body. Much research will be needed to ensure that the new system fully evaluates the effects of the chemicals and their metabolites. Moreover, as the focus shifts from apical end points to perturbations in toxicity pathways, an appropriate science base will be needed to support risk-management actions based on those perturbations. Implementation of the vision and the possible challenges are discussed in Chapter 5.
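The "exit the strategy whenever enough data have been generated" feature described earlier amounts to a tiered workflow with an early-stopping rule. The sketch below shows that control flow in its most generic form; the tier names, assay functions, and sufficiency rule are hypothetical placeholders, since the report does not prescribe a specific sequence.

```python
def evaluate_chemical(chem, tiers, is_sufficient):
    """Run assay tiers in order, stopping when results suffice.

    `tiers` is an ordered list of (name, assay_fn) pairs, e.g. chemical
    characterization before in vitro pathway assays before targeted
    in vivo follow-up. `is_sufficient` inspects the accumulated results
    (standing in for the risk context of the decision at hand) and
    returns True when testing can stop early.
    """
    results = {}
    for name, assay in tiers:
        results[name] = assay(chem)
        if is_sufficient(results):
            break  # exit the strategy: enough data for the decision
    return results

# Illustrative use with invented tiers: stop as soon as the in vitro
# tier has been run, so the targeted tier is never executed.
tiers = [
    ("characterization", lambda c: "low persistence, low exposure"),
    ("in_vitro", lambda c: "no pathway perturbation observed"),
    ("targeted", lambda c: "not run"),
]
results = evaluate_chemical("chem-X", tiers, lambda r: "in_vitro" in r)
```

The design choice worth noting is that the stopping rule is passed in rather than hard-coded, mirroring the report's point that the risk context, not a fixed protocol, determines how much testing a given decision requires.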

REFERENCES

Andersen, M.E., J.E. Dennison, R.S. Thomas, and R.B. Conolly. 2005. New directions in incidence-dose modeling. Trends Biotechnol. 23(3):122-127.

Bakand, S., C. Winder, C. Khalil, and A. Hayes. 2005. Toxicity assessment of industrial chemicals and airborne contaminants: Transition from in vivo to in vitro test methods: A review. Inhal. Toxicol. 17(13):775-787.

Carmichael, N.G., H.A. Barton, A.R. Boobis, R.L. Cooper, V.L. Dellarco, N.G. Doerrer, P.A. Fenner-Crisp, J.E. Doe, J.C. Lamb IV, and T.P. Pastor. 2006. Agricultural chemical safety assessment: A multisector approach to the modernization of human safety requirements. Crit. Rev. Toxicol. 36(1):1-7.

Cassee, F.R., J.P. Groten, P.J. van Bladeren, and V.J. Feron. 1998. Toxicological evaluation and risk assessment of chemical mixtures. Crit. Rev. Toxicol. 28(1):73-101.

Charles, G.D. 2004. In vitro models in endocrine disruptor screening. ILAR J. 45(4):494-501.

Doe, J.E., A.R. Boobis, A. Blacker, V. Dellarco, N.G. Doerrer, C. Franklin, J.I. Goodman, J.M. Kronenberg, R. Lewis, E.E. McConnell, T. Mercier, A. Moretto, C. Nolan, S. Padilla, W. Phang, R. Solecki, L. Tilbury, B. van Ravenzwaay, and D.C. Wolf. 2006. A tiered approach to systemic toxicity testing for agricultural chemical safety assessment. Crit. Rev. Toxicol. 36(1):37-68.

Feron, V.J., J.P. Groten, D. Jonker, F.R. Cassee, and P.J. van Bladeren. 1995. Toxicology of chemical mixtures: Challenges for today and the future. Toxicology 105(2-3):415-427.

Frasor, J., J.M. Danes, B. Komm, K.C.N. Chang, C.R. Lyttle, and B.S. Katzenellenbogen. 2003. Profiling of estrogen up- and down-regulated gene expression in human breast cancer cells: Insights into gene networks and pathways underlying estrogenic control of proliferation and cell phenotype. Endocrinology 144(10):4562-4574.

GAO (U.S. Government Accountability Office). 2005.
Chemical Regulation: Options Exist to Improve EPA's Ability to Assess Health Risks and Manage its Chemical Review Program. GAO-05-458. U.S. Government Accountability Office, Washington, DC. June 2005 [online]. Available: http://www.gao.gov/new.items/d05458.pdf [accessed March 8, 2007].

Gennari, A., C. van den Berghe, S. Casati, J. Castell, C. Clemedson, S. Coecke, A. Colombo, R. Curren, G. Dal Negro, A. Goldberg, C. Gosmore, T. Hartung, I. Langezaal, I. Lessigiarska, W. Maas, I. Mangelsdorf, R. Parchment, P. Prieto, J.R. Sintes, M. Ryan, G. Schmuck, K. Stitzel, W. Stokes, J.A. Vericat, and L. Gribaldo. 2004. Strategies to replace in vivo acute systemic toxicity testing: The report and recommendations of ECVAM Workshop 50. Altern. Lab. Anim. 32(4):437-459.

Haney, S.A., P. LaPan, J. Pan, and J. Zhang. 2006. High-content screening moves to the front of the line. Drug Discov. Today 11(19-20):889-894.

Inglese, J. 2002. Expanding the HTS paradigm. Drug Discov. Today 7(Suppl. 18):S105-S106.

Inglese, J., D.S. Auld, A. Jadhav, R.L. Johnson, A. Simeonov, A. Yasgar, W. Zheng, and C.P. Austin. 2006. Quantitative high-throughput screening: A titration-based approach that efficiently identifies biological activities in large chemical libraries. Proc. Natl. Acad. Sci. U.S.A. 103(31):11473-11478.

Landers, J.P., and T.C. Spelsberg. 1992. New concepts in steroid hormone action: Transcription factors, proto-oncogenes, and the cascade model for steroid regulation of gene expression. Crit. Rev. Eukaryot. Gene Expr. 2(1):19-63.

Lydy, M., J. Belden, C. Wheelock, B. Hammock, and D. Denton. 2004. Challenges in regulating pesticide mixtures. Ecology and Society 9(6):1-6.

Nel, A., T. Xia, L. Madler, and N. Li. 2006. Toxic potential of materials at the nanolevel. Science 311(5761):622-627.

NRC (National Research Council). 1977. Drinking Water and Health, Vol. 1. Washington, DC: National Academy Press.

NRC (National Research Council). 1988. Complex Mixtures: Methods for In Vivo Toxicity Testing. Washington, DC: National Academy Press.

NRC (National Research Council). 2006. Toxicity Testing for Assessment of Environmental Agents: Interim Report. Washington, DC: The National Academies Press.

Pauluhn, J. 2005. Overview of inhalation exposure techniques: Strengths and weaknesses. Exp. Toxicol. Pathol. 57(Suppl. 1):111-128.

Ptacek, T., and S.M. Sell. 2005. A tiered approach to comparative genomics. Brief Funct. Genomic Proteomic. 4(2):178-185.

Roberts, S.A. 2001. High-throughput screening approaches for investigating drug metabolism and pharmacokinetics. Xenobiotica 31(8-9):557-589.

Rochette-Egly, C. 2003. Nuclear receptors: Integration of multiple signaling pathways through phosphorylation. Cell. Signal.
15(4):355-366.

Russell, W.M.S., and R.L. Burch. 1959. The Principles of Humane Experimental Technique. London: Methuen.

Spielmann, H. 2005. Predicting the risk of developmental toxicity from in vitro assays. Toxicol. Appl. Pharmacol. 207(Suppl. 2):375-380.

Stavanja, M.S., P.H. Ayres, D.R. Meckley, E.R. Bombick, M.F. Borgerding, M.J. Morton, C.D. Garner, D.H. Pence, and J.E. Swauger. 2006. Safety assessment of high fructose corn syrup (HFCS) as an ingredient added to cigarette tobacco. Exp. Toxicol. Pathol. 57(4):267-281.

Stephens, S.M., and J. Rung. 2006. Advances in systems biology: Measurement, modeling and representation. Curr. Opin. Drug Discov. Devel. 9(2):240-250.

Thummel, C.S. 2002. Ecdysone-regulated puff genes 2000. Insect Biochem. Mol. Biol. 32(2):113-120.

Teuschler, L., J. Klaunig, E. Carney, J. Chambers, R. Conolly, C. Gennings, J. Giesy, R. Hertzberg, C. Klaassen, R. Kode, D. Paustenbach, and R. Yang. 2005. Support of Science-Based Decisions Concerning the Evaluation of the Toxicology of Mixtures: A New Beginning. Charting the Future: Building the Scientific Foundation for Mixtures Joint Toxicity and Risk Assessment, February 16–17, 2005, Atlanta, GA, Society of Toxicology Sponsored Meeting: Contemporary Concepts in Toxicology [online]. Available: http://www.toxicology.org/ai/meet/MixturesWhitePapers.doc [accessed Feb. 9, 2007].

Xiao, G.G., M. Wang, N. Li, J.A. Loo, and A.E. Nel. 2003. Use of proteomics to demonstrate a hierarchical oxidative stress response to diesel exhaust particle chemicals in a macrophage cell line. J. Biol. Chem. 278(50):50781-50790.

Yokota, F., G. Gray, J.K. Hammitt, and K.M. Thompson. 2004. Tiered chemical testing: A value of information approach. Risk Anal. 24(6):1625-1639.


