Toxicity Testing in the 21st Century: A Vision and a Strategy

2

Vision

Make no little plans. They have no magic to stir men’s blood and probably themselves will not be realized. Make big plans; aim high in hope and work, remembering that a noble, logical diagram once recorded will never die, but long after we are gone will be a living thing, asserting itself with ever-growing insistency.

Daniel Hudson Burnham, architect, designer of the 1893 Chicago World’s Fair

The goal of toxicity testing is to develop data that can ensure appropriate protection of public health from the adverse effects of exposures to environmental agents. Current approaches to toxicity testing rely primarily on observing adverse biologic responses in homogeneous groups of animals exposed to high doses of a test agent. However, the relevance of such animal studies for the assessment of risks to heterogeneous human populations exposed at much lower concentrations has been questioned. Moreover, the studies are expensive and time-consuming and can use large numbers of animals, so only a small proportion of chemicals have been evaluated with these methods. Adequate coverage of different life stages, of end points of public concern, such as developmental neurotoxicity, and of mixtures of environmental agents is a
continuing concern. Current tests also provide little information on modes and mechanisms of action, which are critical for understanding interspecies differences in toxicity, and little or no information for assessing variability in human susceptibility. Thus, the committee looked to recent scientific advances to provide a new approach to toxicity testing.

A revolution is taking place in biology. At its center is the progress being made in the elucidation of cellular-response networks. Those networks are interconnected pathways composed of complex biochemical interactions of genes, proteins, and small molecules that maintain normal cellular function, control communication between cells, and allow cells to adapt to changes in their environment. A familiar cellular-response network is signaling by estrogens, in which initial exposure results in enhanced cell proliferation and growth of specific tissues or in proliferation of estrogen-sensitive cells in culture (Frasor et al. 2003). In that type of network, initial interactions between a signaling molecule and various cellular receptors result in a cascade of early, midterm, and late responses that together orchestrate normal physiologic functions (Landers and Spelsberg 1992; Thummel 2002; Rochette-Egly 2003).

Bioscience is rapidly enhancing our knowledge of cellular-response networks and allowing scientists to begin to uncover the manner in which environmental agents perturb pathways to cause toxicity. Pathways that can lead to adverse health effects when sufficiently perturbed are termed toxicity pathways. The responses of cells to oxidative stress caused by exposure to diesel exhaust particles (DEPs) constitute an example of toxicity pathways within a cellular-response network (Xiao et al. 2003). In a dose-related fashion, in vitro exposures to DEPs lead to the activation of a hierarchic set of pathways. First, cell antioxidant signaling is increased. As the dose increases, inflammatory signaling is enhanced; finally, at higher doses, cell-death (apoptosis) pathways are activated (Nel et al. 2006). Thus, in the cellular-response network dealing
with oxidative stress, the antioxidant pathways activated by DEPs are normal adaptive signaling pathways that assist in maintaining homeostasis; however, they are also toxicity pathways in that they lead to adverse effects when oxidant exposure is sufficiently high.

The committee capitalizes on the recent advances in elucidating and understanding toxicity pathways and proposes a new approach to toxicity testing based on them. New investigative tools are providing knowledge about biologic processes and functions at an astonishing rate. In vitro tests that evaluate activity in toxicity pathways are elucidating the modes and mechanisms of action of toxic substances. Quantitative high-throughput assays can be used to expand coverage of the universe of new and existing chemicals that need to be evaluated for human health risk assessment (Roberts 2001; Inglese 2002; Inglese et al. 2006; Haney et al. 2006). The new assays can also generate enhanced information on dose-response relationships over a much wider range of concentrations, including those representative of human exposure. Pharmacokinetic and pharmacodynamic models promise to provide more accurate extrapolation of tissue dosimetry linked to cellular and molecular end points. The application of toxicogenomic technologies and systems-biology evaluation of signaling networks will permit genomewide scans for genetic and epigenetic perturbations of toxicity pathways. Thus, changes in toxicity pathways, rather than apical end points from whole-animal tests, are envisioned as the basis of a new toxicity-testing paradigm for managing the risks posed by environmental agents.

This chapter provides an overview of the committee’s vision but first discusses the limitations of current toxicity-testing strategies, the design goals for a new system, and the options that the committee considered. Key terms used throughout this report are listed and defined in Box 2-1.
BOX 2-1 Key Terms Used in the Report

Apical end point. An observable outcome in a whole organism, such as a clinical sign or pathologic state, that is indicative of a disease state that can result from exposure to a toxicant.

Cellular-response network. Interconnected pathways composed of the complex biochemical interactions of genes, proteins, and small molecules that maintain normal cellular function, control communication between cells, and allow cells to adapt to changes in their environment.

High-throughput assays. Efficiently designed experiments that can be automated and rapidly performed to measure the effect of substances on a biologic process of interest. These assays can evaluate hundreds to many thousands of chemicals over a wide concentration range to identify chemical actions on gene, pathway, and cell function.

Mechanism of action. A detailed description, often at the molecular level, of the means by which an agent causes a disease state or other adverse effect.

Medium-throughput assays. Assays that can be used to test large numbers of chemicals for their ability to perturb more integrated cellular responses, such as cell proliferation, apoptosis, and mutation. Because of assay complexity, fewer agents can be evaluated in the same period than with high-throughput assays.

Mode of action. A description of key events or processes by which an agent causes a disease state or other adverse effect.

Systems biology. The study of all elements in a biologic system and their interrelationships in response to exogenous perturbation (Stephens and Rung 2006).

Toxicity pathway. A cellular-response pathway that, when sufficiently perturbed in an intact animal, is expected to result in adverse health effects (see Figure 2-2).

LIMITATIONS OF CURRENT TESTING STRATEGIES

The exposure-response continuum shown in Figure 2-1 effectively represents the current approach to toxicologic risk assessment. It focuses primarily on adverse health outcomes as the end points for assessing the risk posed by environmental agents and establishing human exposure guidelines. Although intermediate biologic changes and mechanisms of action are considered in the paradigm, they are viewed as steps along the pathway to the ultimate induction of an adverse health outcome.

FIGURE 2-1 The exposure-response continuum underlying the current paradigm for toxicity testing.

Traditional toxicity-testing strategies undertaken in the context of the above paradigm have evolved and expanded over the last few decades to reflect increasing concern about a wider variety of toxic responses, such as subtle neurotoxic effects and adverse immunologic changes. The current system, which relies primarily on a complex set of whole-animal-based toxicity-testing strategies for hazard identification and dose-response assessment, has difficulty in addressing the wide variety of challenges that toxicity testing must meet today. Toxicity testing is under increasing pressure to meet several competing demands:

- Test large numbers of existing chemicals, many of which lack basic toxicity data.
- Test the large number of new chemicals and novel materials, such as nanomaterials, introduced into commerce each year.
- Evaluate potential adverse effects with respect to all critical end points and life stages.
- Evaluate potential toxicity in the most vulnerable members of the human population.
- Minimize animal use.
- Reduce the cost and time required for chemical safety evaluation.
- Acquire detailed mechanistic and tissue-dosimetry data needed to assess human risk quantitatively and to aid in regulatory decision-making.

The current approach relies primarily on in vivo mammalian toxicity testing and is unable to meet those competing demands adequately. In 1979, about 62,000 chemicals were in commerce (GAO 2005). Today, there are about 82,000, and roughly 700 more are introduced each year (GAO 2005). The large volume of new and existing chemicals in commerce is not being fully assessed (see the committee’s interim report, NRC 2006). One reason for the testing gaps is that current testing is so time-consuming and resource-intensive. Furthermore, only limited mechanistic information is routinely developed to understand how most chemicals are expected to produce adverse health effects in humans. Those deficiencies limit the ability to predict toxicity in human populations that are typically exposed to much lower doses than those used in whole-animal studies. They also limit the ability to develop predictions about similar chemicals that have not been tested. The following sections describe several limitations of the current system and explain how a system based on toxicity pathways would help to address them.

Low-Dose Extrapolation from High-Dose Data

Traditional toxicity testing has relied on administering high doses to animals of nearly identical susceptibility to generate data for identifying critical end points for risk assessment. Historically, exposing animals to high doses was justified by the need for sufficient statistical power: only relatively high incidences of toxic responses can be observed in small test populations with relatively short exposures.
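The statistical-power rationale can be made concrete with a small back-of-the-envelope calculation (ours, not the committee’s; the group size and incidence values are illustrative). The probability that a fixed-size test group shows even one affected animal depends sharply on the true incidence, which is why study designs drive incidence upward with high doses.

```python
# Illustrative only: why bioassays use high doses. With a typical group
# of 50 rodents, the chance of observing even one affected animal
# depends strongly on the true incidence of the effect.

def p_at_least_one(incidence: float, n: int = 50) -> float:
    """Probability that at least 1 of n animals shows the effect."""
    return 1.0 - (1.0 - incidence) ** n

# A 1-in-1,000 incidence, which matters greatly in a large exposed
# population, is almost invisible in a 50-animal group:
low = p_at_least_one(0.001)   # ~0.049
# A high dose that pushes incidence to 20% is detected almost surely:
high = p_at_least_one(0.20)   # ~0.99999

print(f"P(detect) at 0.1% incidence: {low:.3f}")
print(f"P(detect) at 20% incidence:  {high:.5f}")
```

The same arithmetic also shows why low-dose risks must be extrapolated rather than measured directly in animals: effects at environmentally relevant incidences simply do not register in groups of practical size.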
In many cases, daily doses in animal toxicity tests are orders of magnitude greater than those expected in human exposures. Thus, the use of high-dose animal toxicity tests for predicting risks of specific apical human end points has remained challenging and controversial. Inferring effects at lower doses is difficult because of inherent uncertainty in the nature of dose-response relationships. Effects observed at high doses may result from metabolic processes that contribute negligibly at lower doses or may arise from biologic processes that do not occur at all at lower doses. Conversely, high doses may cause overt toxic responses that preclude the detection of interactions between the chemical and various signaling pathways that lead to subtle but important adverse effects. The vision proposed in this report offers the potential to obtain direct information on toxic effects at exposures more relevant to those experienced by human populations.

Animal-to-Human Extrapolation

Other concerns arise about the relationship between the biology of the test species and that of the heterogeneous human population. Animals have served as models of human response for decades because the biology of test animals is, in general, similar to that of humans (NRC 1977). Although that generality holds true, there are several examples of idiosyncratic responses in which a chemical has a specific toxic effect in humans but not in a test species, or vice versa. A classic example is thalidomide: rats are resistant, and human fetuses are sensitive. The committee envisions a future in which tests based on human cell systems can serve as better models of human biologic responses than apical studies in other species. The committee therefore believes that, given a sufficient research and development effort, human cell systems have the potential to largely supplant testing in animals.
Mixtures

Current toxicity-testing approaches have also been criticized for their failure to consider the co-exposures that commonly occur in human populations. Because animal toxicity tests are time-consuming and resource-intensive and result in the sacrifice of animals, it is difficult to use them for substantial testing of chemical mixtures (NRC 1988; Feron et al. 1995; Cassee et al. 1998; Lydy et al. 2004; Bakand et al. 2005; Pauluhn 2005; Teuschler et al. 2005). Furthermore, without information on how chemicals exert their biologic effects, testing of mixtures is a daunting task. For example, testing of mixtures in animal assays could involve huge numbers of combinations of chemicals and the use of substantial resources in an effort of uncertain value. In contrast, testing based on toxicity pathways could allow grouping of chemicals according to their effects on key biologic pathways. Combinations of chemicals that interact with the same toxicity pathway could then be tested over broad dose ranges much more rapidly and inexpensively. The resulting data could allow an intelligent and focused approach to the problem of assessing risk in human populations exposed to mixtures.

DESIGN CRITERIA FOR A NEW TOXICITY-TESTING PARADIGM

The committee discussed the design criteria that should be considered in developing a strategy for toxicity testing in the future. As discussed in the committee’s interim report (NRC 2006), which did much to frame those criteria, the goal is to improve toxicity testing by accomplishing the following objectives:

- Provide broader coverage of chemicals and their mixtures, end points, and life-stage vulnerabilities.
- Reduce the cost and time of testing, increase efficiency and flexibility, and make it possible to reach decisions more quickly.
- Use fewer animals and cause minimal suffering to the animals that are used.
- Develop a more robust scientific basis for risk assessment by providing detailed mechanistic and dosimetry information and by encouraging the integration of toxicologic and population-based data.

The committee considered those objectives as it weighed the various options discussed in the following section.

OPTIONS FOR A NEW TOXICITY-TESTING PARADIGM

In developing its vision for toxicity testing, the committee explored the four options presented in Table 2-1. The baseline option (Option I) applies current toxicity-testing principles and practices. Accordingly, it would use primarily in vivo animal toxicity tests to predict human health risks. The difficulties in interpreting animal data obtained at high doses with respect to risks in the heterogeneous human population would not be circumvented. Moreover, because whole-animal testing is expensive and time-consuming, the number of chemicals addressed would continue to be small. The continued use of relatively large numbers of animals for toxicity testing also raises ethical issues and is inconsistent with the emphasis on reduction, replacement, and refinement of animal use (Russell and Burch 1959). Overall, the current approach does not provide an adequate balance among the four objectives of toxicity testing identified in the committee’s interim report: depth of testing, breadth of testing, animal welfare, and conservation of testing resources.
TABLE 2-1 Options for Future Toxicity-Testing Strategies

Option I (In Vivo): animal biology; high doses; low throughput; expensive; time-consuming; use of relatively large numbers of animals; based on apical end points.

Option II (Tiered In Vivo): animal biology; high doses; improved throughput; less expensive; less time-consuming; use of fewer animals; based on apical end points; some screening using computational and in vitro approaches; more flexibility than current methods.

Option III (In Vitro and In Vivo): primarily human biology; broad range of doses; high and medium throughput; less expensive; less time-consuming; use of substantially fewer animals; based on perturbations of critical cellular responses; screening using computational approaches possible; limited animal studies that focus on mechanism and metabolism.

Option IV (In Vitro): primarily human biology; broad range of doses; high throughput; less expensive; less time-consuming; use of virtually no animals; based on perturbations of critical cellular responses; screening using computational approaches.

The committee extensively considered the expanded use of tiered testing (Option II) to alleviate some of the concerns with present practice. The tiered approach to toxicity testing entails a stepwise process for screening and evaluating the toxicity of agents that still relies primarily on test results in whole animals. The goal of tiered testing is to generate pertinent data for more efficient assessment of the potential health risks posed by an environmental agent, taking into consideration available knowledge on the chemical and its class, its modes or mechanisms
of action, and its intended use and estimated exposures (Carmichael et al. 2006). Those factors are used to refine testing priorities to focus first on areas of greatest concern in early tiers and then to move judiciously to advanced testing in later tiers as needed. In addition, an emphasis on pharmacokinetic studies in tiered approaches has been considered in recent discussions of improving toxicity testing of pesticides (Carmichael et al. 2006; Doe et al. 2006). Tiered testing has been recommended in evaluating the toxicity of agricultural products (Doe et al. 2006), in screening for endocrine disruptors (Charles 2004), and in assessing developmental toxicity (Spielman 2005) and carcinogenicity (Stavanja et al. 2006) of chemicals and products. A tiered-testing approach also holds the promise of including comparative genomic studies to help identify genes, transcription-factor motifs, and other putative control regions that are involved in tissue responses (Ptacek and Sell 2005).

The increasing complexity of biologic information—including genomic, proteomic, and cell-signaling information—has encouraged the use of a more systematic multilevel approach in toxicity screening (Yokota et al. 2004). The systematic development of tiered, decision-tree selection of more limited suites of animal tests could conceivably provide toxicity-testing data nearly equivalent to those currently obtained but without the need to conduct tests for as many apical end points. The use of appropriately chosen computational models and in vitro screens might also permit sound risk-management decisions in some cases without any in vivo testing. Both types of tiered-testing strategies offer the potential of reducing animal use and toxicity-testing costs and of allowing flexibility in testing based on risk-management information needs.

Although the committee recognized the potential for incremental improvement in toxicity testing through a tiered approach, Option II still represents only a small step in improving coverage, reducing costs and animal use, and increasing mechanistic information in risk
assessment. It still relies on whole-animal testing and is geared mainly toward deciding which animal tests are required in risk assessment for any specific agent. Although tiered testing might be pursued more formally in a transition to a more comprehensive toxicity-testing strategy, it does not meet most of the design criteria discussed earlier.

In the committee’s view, a more transformative paradigm shift, represented by Options III and IV in Table 2-1, is needed to achieve the objectives for toxicity testing set out in its interim report. The committee’s vision is built on the identification of biologic perturbations of toxicity pathways that can lead to adverse health outcomes under conditions of human exposure. The use of a comprehensive array of in vitro tests to identify relevant biologic perturbations with cellular and molecular systems based on human biology could eventually eliminate the need for whole-animal testing and provide a stronger, mechanistically based approach for environmental decision-making. Computational models could also play a role in the early identification of environmental agents potentially harmful to humans, although further testing would probably be needed. This new approach would be less expensive and less time-consuming than the current one and would result in much higher throughput. Although reliance on in vitro results lacks the whole-organism integration provided by current tests, toxicologic assessments would be based on biologic perturbations of toxicity pathways that can reasonably be expected to lead to adverse health effects. Understanding of the role of such perturbations in the induction of toxic responses would be refined through toxicologic research.

With the further development of in vitro test systems of toxicity pathways and of tools for assessing the dose-response characteristics of the perturbations, the committee believes that its vision for toxicity testing will meet the four objectives set out in its interim report. Full implementation of the high-throughput, fully human-cell-based testing scheme represented by Option IV in Table 2-1
would face a number of scientific challenges. Major concerns relate to ensuring adequate testing of metabolites and to the potential difficulties of evaluating novel chemicals, such as nanomaterials and biotechnology products, with in vitro tests. Those challenges require the maintenance of some whole-animal tests into the foreseeable future, as indicated in Option III, which includes specific in vivo studies to assess the formation of metabolites and some mechanistic studies of target-organ responses to environmental agents and leaves open the possibility that more extensive in vivo toxicity evaluations of new classes of agents will be needed. Like Option IV, Option III emphasizes the development and application of new in vitro assays for biologic perturbations of toxicity pathways. Thus, although the committee notes that Option IV embodies the ultimate goal for toxicity testing, the committee’s vision for the next 10-20 years is defined by Option III.

The committee is mindful of the methodologic developments that will be required to orchestrate the transition from current practices toward its vision. During the transition period, there will be a need to continue the use of many current test procedures, including whole-animal tests, as the tools needed to implement the committee’s vision fully are developed. The steps that need to be taken to achieve the committee’s vision are discussed further in Chapter 5.

The committee notes that European approaches to improving toxicity testing emphasize the replacement of animal tests with in vitro methods (Gennari et al. 2004). However, a major goal of the European approaches is to develop in vitro batteries that can predict the outcome of high-dose testing in animals. The committee distinguishes those in vitro tests from the ones noted in Options III and IV, which promise to provide more mechanistic information and to allow more extensive and more rapid determination of biologic perturbations that are directly relevant to human biology and exposures.
OVERVIEW OF COMMITTEE’S LONG-RANGE VISION FOR TOXICITY TESTING

The framework outlined in Figure 2-2 forms the basis of the committee’s vision for toxicity testing in the 21st century. The figure indicates that the initial perturbations of cell-signaling motifs, genetic circuits, and cellular-response networks are obligatory changes, related to chemical exposure, that might eventually result in disease. The consequences of a biologic perturbation depend on its magnitude, which is related to the dose, on the timing and duration of the perturbation, and on the susceptibility of the host. Accordingly, at low doses, many biologic systems may function normally within their homeostatic limits. At somewhat higher doses, clear biologic responses occur; they may be handled successfully through adaptation, although some susceptible people may respond. A more intense or persistent perturbation may overwhelm the capacity of the system to adapt and lead to tissue injury and possibly to adverse health effects.

In this framework, the goals of toxicity testing are to identify the critical pathways that, when perturbed, can lead to adverse health outcomes and to evaluate host susceptibility so as to understand the effects of perturbations on human populations. To implement the new toxicity-testing approach, toxicologists will need to develop a comprehensive array of test procedures that allow the reliable identification of important biologic perturbations in key toxicity pathways, and epidemiologists and toxicologists will need to develop approaches to understand the range of host susceptibility within populations. Viewing toxic responses in that manner shifts the focus away from the apical end points emphasized in the traditional toxicity-testing paradigm, toward biologic perturbations that can be identified more efficiently without the need for whole-animal testing and toward characterizing host vulnerability to provide the context for assessing the implications of test results.
FIGURE 2-2 Biologic responses viewed as results of an intersection of exposure and biologic function. The intersection leads to perturbation of biologic pathways. When perturbations are sufficiently large or when the host is unable to adapt because of underlying nutritional, genetic, disease, or life-stage status, biologic function is compromised, and this leads to toxicity and disease. Source: Adapted from Andersen et al. 2005. Reprinted with permission; copyright 2005, Trends in Biotechnology.

Figure 2-3 illustrates the major components of the committee’s proposed vision: chemical characterization, toxicity testing, and dose-response and extrapolation modeling. Each component is discussed in further detail in Chapter 3, and the tools and technologies that might play some role in the future paradigm are discussed in Chapter 4. Chemical characterization involves consideration of physicochemical properties, environmental persistence, bioaccumulation potential, production volumes, concentration in environmental media, and exposure data. Computational tools, such as quantitative structure-activity relationship models and bioinformatics,
may eventually be used to categorize chemicals, predict likely toxicity and metabolic pathways, screen for relative potency with predictive models, and organize large databases for analysis and hypothesis generation.

FIGURE 2-3 The committee’s vision is a process that includes chemical characterization, toxicity testing, and dose-response and extrapolation modeling. At each step, population-based data and human exposure information are considered, as is the question of what data are needed for decision-making.

Toxicity testing in the committee’s vision seeks to identify the perturbations in toxicity pathways that are expected to lead to adverse effects. The focus on biologic perturbations rather than apical end points is fundamental to the committee’s vision. If adopted, the vision will lead to a major shift in emphasis away from whole-animal testing toward efficient in vitro tests and greater human surveillance. Targeted testing is also used to identify or explore functional end points associated with adverse health outcomes and may include in vivo metabolic or mechanistic studies.
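The kind of computational screening described for chemical characterization can be sketched in miniature. The example below (entirely illustrative; the alert names and fragments are our assumptions, not tools from the report) flags chemicals whose structure contains a known structural alert. Real systems perform proper substructure matching with cheminformatics toolkits; plain substring checks on SMILES strings are only a stand-in to show the screening idea.

```python
# Deliberately toy illustration of structural-alert screening (not a
# real QSAR tool): flag chemicals whose SMILES string contains a
# fragment associated with a hypothetical toxicity concern.

ALERTS = {  # hypothetical alert name -> SMILES fragment
    "nitroaromatic": "[N+](=O)[O-]",
    "epoxide": "C1OC1",
    "michael_acceptor": "C=CC=O",
}

def flag(smiles: str) -> list[str]:
    """Return the list of alert names whose fragment appears in the SMILES."""
    return [name for name, frag in ALERTS.items() if frag in smiles]

inventory = {
    "nitrobenzene": "c1ccccc1[N+](=O)[O-]",
    "ethylene_oxide": "C1OC1",
    "ethanol": "CCO",
}

for chem, smi in inventory.items():
    print(chem, flag(smi))
# nitrobenzene ['nitroaromatic']
# ethylene_oxide ['epoxide']
# ethanol []
```

In a production setting the same categorize-then-prioritize logic would sit on top of curated alert libraries and validated QSAR models rather than string matching.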
Dose-response modeling is used to describe the relationship between biologic perturbations and dose in quantitative and optimally mechanistic terms; extrapolation modeling is used to predict possible effects in human populations at prevailing environmental exposure concentrations. Computational modeling of the toxicity pathways themselves, evaluated with specific high-throughput tests, will be a key tool for establishing dose-response relationships. Pharmacokinetic models, such as physiologically based pharmacokinetic models, will assist in extrapolating from in vitro to in vivo conditions by relating concentrations active in in vitro toxicity-test systems to human blood concentrations.

At each step, population-based data and human-exposure information should be considered. For example, human biomonitoring and surveillance can provide data on exposure to environmental agents, host susceptibility, and biologic change that will be key for dose-response and extrapolation modeling. Throughout, the information needs for risk-management decision-making must be borne in mind because they will to a great extent guide the nature of the testing required. Thus, the population-based data, the exposure information, and the risk contexts are shown encircling the core toxicity-testing strategy in Figure 2-3.

The components of the toxicity-testing paradigm are semi-autonomous but interrelated modules, each containing specific sets of underlying technologies and capabilities. Some chemical evaluations may proceed stepwise from chemical characterization to toxicity testing to dose-response and extrapolation modeling, but that sequence might not always be followed. A critical feature of the new vision is consideration of the risk context at each step and the ability to exit the strategy at any point, whenever enough data have been generated to inform the decision that needs to be made.
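The dose-response and extrapolation steps can be illustrated with a minimal numeric sketch (all parameter values below are hypothetical and chosen only for arithmetic clarity). A Hill model summarizes the in vitro concentration-response for a pathway perturbation, and a one-compartment steady-state pharmacokinetic assumption converts the in vitro point of departure into an equivalent daily oral dose; real applications would use fitted parameters and full physiologically based models.

```python
# Hedged sketch of in vitro-to-in vivo extrapolation (IVIVE).
# All parameters are hypothetical.

def hill(conc_uM: float, top: float = 100.0,
         ac50_uM: float = 3.0, n: float = 1.5) -> float:
    """Percent-of-maximum pathway response at a given concentration (Hill model)."""
    return top * conc_uM**n / (ac50_uM**n + conc_uM**n)

def oral_equivalent_dose(conc_uM: float, mw_g_per_mol: float,
                         cl_L_per_h_per_kg: float) -> float:
    """Daily oral dose (mg/kg-day) producing this steady-state blood
    concentration, assuming 100% bioavailability and linear clearance."""
    css_mg_per_L = conc_uM * 1e-6 * mw_g_per_mol * 1000  # uM -> mg/L
    return css_mg_per_L * cl_L_per_h_per_kg * 24         # mg/L * L/h/kg * h/day

# Hypothetical chemical: AC50 = 3 uM, MW = 250 g/mol, CL = 0.5 L/h/kg.
ac50 = 3.0
print(hill(ac50))                             # 50.0 (% of max, by definition)
print(oral_equivalent_dose(ac50, 250, 0.5))   # 9.0 mg/kg-day
```

The point of the sketch is the direction of the calculation: an in vitro perturbation threshold is translated into a human dose that can be compared directly with biomonitoring and exposure estimates.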
The proposed vision also emphasizes the generation and use of population-based data and exposure estimates when possible. The committee notes that the development of surveillance systems for
chemicals newly introduced into the market will be important. The new vision encourages the collection of such data on important existing chemicals from biomonitoring, surveillance, and molecular-epidemiologic studies.

Finally, flexibility is needed in the testing of environmental agents to encourage the development and application of novel tools and approaches. The evolution of the toxicity-testing process, as envisioned here, must retain the flexibility to incorporate new information and new methods as they are developed and found useful for evaluating whether a given exposure poses a risk to humans. That will require formal procedures for phasing standard testing methods in or out. Indeed, the process is attuned to the need for efficient testing of all chemicals in a timely, cost-effective fashion.

The committee envisions a reconfiguration of toxicity testing through the development of in vitro medium- and high-throughput assays. The in vitro tests would be developed not to predict the results of current apical toxicity tests but rather to serve as cell-based assays that are informative about the mechanistic responses of human tissues to toxic chemicals.

The committee is aware of the implementation challenges that the new toxicity-testing paradigm would face. For example, toxicity testing must be able to address the potential adverse health effects both of chemicals in the environment and of the metabolites formed when the chemicals enter the body. Much research will be needed to ensure that the new system evaluates the effects of chemicals and their metabolites fully. Moreover, as the focus shifts from apical end points to perturbations in toxicity pathways, an appropriate science base will need to be developed to support risk-management actions based on the perturbations. Implementation of the vision and the possible challenges are discussed in Chapter 5.
REFERENCES

Andersen, M.E., J.E. Dennison, R.S. Thomas, and R.B. Conolly. 2005. New directions in incidence-dose modeling. Trends Biotechnol. 23(3):122-127.

Bakand, S., C. Winder, C. Khalil, and A. Hayes. 2005. Toxicity assessment of industrial chemicals and airborne contaminants: Transition from in vivo to in vitro test methods: A review. Inhal. Toxicol. 17(13):775-787.

Carmichael, N.G., H.A. Barton, A.R. Boobis, R.L. Cooper, V.L. Dellarco, N.G. Doerrer, P.A. Fenner-Crisp, J.E. Doe, J.C. Lamb IV, and T.P. Pastoor. 2006. Agricultural chemical safety assessment: A multisector approach to the modernization of human safety requirements. Crit. Rev. Toxicol. 36(1):1-7.

Cassee, F.R., J.P. Groten, P.J. van Bladeren, and V.J. Feron. 1998. Toxicological evaluation and risk assessment of chemical mixtures. Crit. Rev. Toxicol. 28(1):73-101.

Charles, G.D. 2004. In vitro models in endocrine disruptor screening. ILAR J. 45(4):494-501.

Doe, J.E., A.R. Boobis, A. Blacker, V. Dellarco, N.G. Doerrer, C. Franklin, J.I. Goodman, J.M. Kronenberg, R. Lewis, E.E. McConnell, T. Mercier, A. Moretto, C. Nolan, S. Padilla, W. Phang, R. Solecki, L. Tilbury, B. van Ravenzwaay, and D.C. Wolf. 2006. A tiered approach to systemic toxicity testing for agricultural chemical safety assessment. Crit. Rev. Toxicol. 36(1):37-68.

Feron, V.J., J.P. Groten, D. Jonker, F.R. Cassee, and P.J. van Bladeren. 1995. Toxicology of chemical mixtures: Challenges for today and the future. Toxicology 105(2-3):415-427.

Frasor, J., J.M. Danes, B. Komm, K.C.N. Chang, C.R. Lyttle, and B.S. Katzenellenbogen. 2003. Profiling of estrogen up- and down-regulated gene expression in human breast cancer cells: Insights into gene networks and pathways underlying estrogenic control of proliferation and cell phenotype. Endocrinology 144(10):4562-4574.

GAO (U.S. Government Accountability Office). 2005. Chemical Regulation: Options Exist to Improve EPA's Ability to Assess Health Risks and Manage Its Chemical Review Program. GAO-05-458. U.S. Government Accountability Office, Washington, DC. June 2005 [online]. Available: http://www.gao.gov/new.items/d05458.pdf [accessed March 8, 2007].

Gennari, A., C. van den Berghe, S. Casati, J. Castell, C. Clemedson, S. Coecke, A. Colombo, R. Curren, G. Dal Negro, A. Goldberg, C. Gosmore, T. Hartung, I. Langezaal, I. Lessigiarska, W. Maas, I. Mangelsdorf, R. Parchment, P. Prieto, J.R. Sintes, M. Ryan, G. Schmuck, K. Stitzel, W. Stokes, J.A. Vericat, and L. Gribaldo. 2004. Strategies to replace in vivo acute systemic toxicity testing: The report and recommendations of ECVAM Workshop 50. Altern. Lab. Anim. 32(4):437-459.
Haney, S.A., P. LaPan, J. Pan, and J. Zhang. 2006. High-content screening moves to the front of the line. Drug Discov. Today 11(19-20):889-894.

Inglese, J. 2002. Expanding the HTS paradigm. Drug Discov. Today 7(Suppl. 18):S105-S106.

Inglese, J., D.S. Auld, A. Jadhav, R.L. Johnson, A. Simeonov, A. Yasgar, W. Zheng, and C.P. Austin. 2006. Quantitative high-throughput screening: A titration-based approach that efficiently identifies biological activities in large chemical libraries. Proc. Natl. Acad. Sci. U.S.A. 103(31):11473-11478.

Landers, J.P., and T.C. Spelsberg. 1992. New concepts in steroid hormone action: Transcription factors, proto-oncogenes, and the cascade model for steroid regulation of gene expression. Crit. Rev. Eukaryot. Gene Expr. 2(1):19-63.

Lydy, M., J. Belden, C. Wheelock, B. Hammock, and D. Denton. 2004. Challenges in regulating pesticide mixtures. Ecology and Society 9(6):1-6.

Nel, A., T. Xia, L. Madler, and N. Li. 2006. Toxic potential of materials at the nanolevel. Science 311(5761):622-627.

NRC (National Research Council). 1977. Drinking Water and Health, Vol. 1. Washington, DC: National Academy Press.

NRC (National Research Council). 1988. Complex Mixtures: Methods for In Vivo Toxicity Testing. Washington, DC: National Academy Press.

NRC (National Research Council). 2006. Toxicity Testing for Assessment of Environmental Agents: Interim Report. Washington, DC: The National Academies Press.

Pauluhn, J. 2005. Overview of inhalation exposure techniques: Strengths and weaknesses. Exp. Toxicol. Pathol. 57(Suppl. 1):111-128.

Ptacek, T., and S.M. Sell. 2005. A tiered approach to comparative genomics. Brief Funct. Genomic Proteomic. 4(2):178-185.

Roberts, S.A. 2001. High-throughput screening approaches for investigating drug metabolism and pharmacokinetics. Xenobiotica 31(8-9):557-589.

Rochette-Egly, C. 2003. Nuclear receptors: Integration of multiple signaling pathways through phosphorylation. Cell. Signal. 15(4):355-366.

Russell, W.M.S., and R.L. Burch. 1959. The Principles of Humane Experimental Technique. London: Methuen.

Spielmann, H. 2005. Predicting the risk of developmental toxicity from in vitro assays. Toxicol. Appl. Pharmacol. 207(Suppl. 2):375-380.

Stavanja, M.S., P.H. Ayres, D.R. Meckley, E.R. Bombick, M.F. Borgerding, M.J. Morton, C.D. Garner, D.H. Pence, and J.E. Swauger. 2006. Safety assessment of high fructose corn syrup (HFCS) as an ingredient added to cigarette tobacco. Exp. Toxicol. Pathol. 57(4):267-281.

Stephens, S.M., and J. Rung. 2006. Advances in systems biology: Measurement, modeling and representation. Curr. Opin. Drug Discov. Devel. 9(2):240-250.

Thummel, C.S. 2002. Ecdysone-regulated puff genes 2000. Insect Biochem. Mol. Biol. 32(2):113-120.
Teuschler, L., J. Klaunig, E. Carney, J. Chambers, R. Conolly, C. Gennings, J. Giesy, R. Hertzberg, C. Klaassen, R. Kodell, D. Paustenbach, and R. Yang. 2005. Support of Science-Based Decisions Concerning the Evaluation of the Toxicology of Mixtures: A New Beginning. Charting the Future: Building the Scientific Foundation for Mixtures Joint Toxicity and Risk Assessment, February 16-17, 2005, Atlanta, GA, Society of Toxicology Sponsored Meeting: Contemporary Concepts in Toxicology [online]. Available: http://www.toxicology.org/ai/meet/MixturesWhitePapers.doc [accessed Feb. 9, 2007].

Xiao, G.G., M. Wang, N. Li, J.A. Loo, and A.E. Nel. 2003. Use of proteomics to demonstrate a hierarchical oxidative stress response to diesel exhaust particle chemicals in a macrophage cell line. J. Biol. Chem. 278(50):50781-50790.

Yokota, F., G. Gray, J.K. Hammitt, and K.M. Thompson. 2004. Tiered chemical testing: A value of information approach. Risk Anal. 24(6):1625-1639.