7 Alternative Approaches and Emerging Technologies
Pages 194-229

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 194...
... The second part discusses some new toxicity-testing approaches (-omics technologies and computational toxicology) that may have longer-term potential for achieving greater depth, breadth, animal welfare, and conservation in toxicity testing.
From page 195...
... In the 1990s, government centers devoted to the validation and regulatory acceptance of alternative methods were established in Europe and the United States, alternative tests began to be formally approved and accepted by regulatory agencies, and the triennial World Congresses on Alternatives were inaugurated. There is evidence that, owing in part to the implementation of Three Rs approaches, use of laboratory animals in research and testing in the United States decreased by about 30% (see footnote 1) in the decade after the estab…
Footnote 1: Estimate based on comparison of average number of Animal Welfare Act (AWA) …
From page 196...
... Refinement Alternatives
Refinement alternatives are changes in existing practices that either decrease animal pain and distress or increase animal welfare. Refinements are best practices, namely, ways of carrying out animal-based procedures and practices that ensure the best practical outcomes with respect to both animal welfare and science.
From page 197...
... Animal Welfare Act, with its emphasis on minimizing pain and distress.
Reduction Alternatives
Reduction alternatives are methods that use fewer animals than conventional procedures but yield comparable levels of information.
From page 198...
... . Reducing animal numbers in toxicity tests not only subjects fewer animals to potential suffering but has the potential to lower the cost of testing.
From page 199...
... The Ames mutagenesis test, developed in 1971, was the first in vitro test used in regulatory toxicology. In vitro tests and other nonanimal methods have since been accepted in regulatory toxicology case by case after the development of the field of validation and the establishment of ICCVAM and ECVAM in the 1990s.
From page 200...
... Because of the technical advantages, such approaches are being evaluated for large-scale testing programs, including HPV chemical testing and endocrine-disruptor testing. As the large-scale testing programs are developed and implemented, nonanimal methods are being incorporated as screens into tier-testing approaches with animal testing being reserved primarily for the highest tiers.
From page 201...
... Studies that use large numbers of animals and thereby have increased sensitivity would have profound implications for modeling human health risk assessment if the animal models used were found to be relevant to humans. Another fish model that is gaining increased attention from toxicology researchers is the zebrafish (Sumanas and Lin 2004)
From page 202...
... Substantial anatomic and physiologic differences between mammals and other species will also prevent their application to assessment of some toxic end points.
EMERGING TECHNOLOGIES
Novel -omics technologies and computational toxicology may one day contribute to resolution of much of the current tension around the objectives of toxicity testing.
From page 203...
... Environmental Genome Project (EGP) -- have identified and characterized millions of SNPs.
From page 204...
... For example, the NRC Standing Committee on Emerging Issues and Data on Environmental Contaminants, which was convened at the request of NIEHS, currently is focused on toxicogenomics and its applications in environmental and pharmaceutical safety assessment, risk communica…
From page 205...
... However, studies need to examine mechanisms of action of toxicants and determine general and toxicant-specific cellular responses. Furthermore, studies need to include sufficiently large samples, assess dose-response relationships, characterize the temporal nature of gene expression in relation to the relevant end point, and, to the extent possible, examine an appropriate variety of tis…
Footnote 3: Toxicogenomics uses genomic and other -omic technologies to study the genetic response to environmental pollutants or toxicants and ultimately to identify environmental agents that could cause adverse effects.
From page 206...
... That standard is intended to address at least some of the difficulties that arise in comparison of datasets that have been acquired with different technologies, compiled at different times, or generated from different laboratories and should facilitate the construction of databases with broader utility, such as those being developed by the National Center for Toxicogenomics. Although genomics technologies, such as transcript profiling, have considerable potential in both predictive and mechanistic toxicology, their appropriate application to a risk-benefit analysis of novel chemical entities requires an understanding of the utility of the resulting data and, ideally, regulatory guidance or policy regarding their use.
From page 207...
... Many research needs and activities were recognized -- for example, linking the Office of Research and Development's Computational Toxicology Research Program to genomics activities, developing an analytical framework and acceptance criteria for genomics data, and developing internal expertise and methods to evaluate such data at EPA. Throughout the review, EPA recognized the role of the emerging technologies in informing the risk-assessment process and in potentially increasing the scientific rigor of the regulatory process.
From page 208...
... • Studies that identify protein biomarkers of effect, such as biomarkers of liver toxicity (Gao et al.
From page 209...
... Metabonomics
Metabonomics is defined as the study of metabolic responses to drugs, environmental agents, and diseases and involves the quantitative measurement of changes in metabolites in living systems in response to internal or external stimuli or as a consequence of genetic change (Nicholson et al.
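To make the "quantitative measurement of changes in metabolites" concrete, here is a minimal sketch, assuming hypothetical metabolite names and concentration values, of how treated-versus-control changes might be summarized as log fold changes with a simple test statistic; it is illustrative only and not taken from the report.

```python
# Minimal sketch (not from the report): quantifying metabolite changes
# between control and treated groups, as a metabonomics analysis might.
# Metabolite names and concentrations below are made up for illustration.
import math
from statistics import mean, stdev

control = {"citrate": [4.1, 3.9, 4.3, 4.0], "lactate": [2.0, 2.2, 1.9, 2.1]}
treated = {"citrate": [2.8, 3.0, 2.6, 2.9], "lactate": [3.9, 4.2, 4.0, 4.1]}

def log2_fold_change(t, c):
    """Log2 ratio of mean treated to mean control concentration."""
    return math.log2(mean(t) / mean(c))

def welch_t(t, c):
    """Welch's t statistic for unequal variances (no p-value lookup here)."""
    se = math.sqrt(stdev(t) ** 2 / len(t) + stdev(c) ** 2 / len(c))
    return (mean(t) - mean(c)) / se

for metabolite in control:
    fc = log2_fold_change(treated[metabolite], control[metabolite])
    t = welch_t(treated[metabolite], control[metabolite])
    print(f"{metabolite}: log2 fold change = {fc:+.2f}, Welch t = {t:+.2f}")
```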
From page 210...
... Computational toxicology, like the other nonanimal approaches to toxicology previously discussed, holds the potential to lessen the tension between the four major objectives of regulatory testing schemes -- breadth, depth, animal welfare, and conservation. Although computational-modeling approaches have the clear advantages of being rapid and of potentially reducing animal testing, the success and validation of any given computational approach clearly depend on the end point being modeled -- is the end point amenable to a computational approach?
From page 211...
... Generally, SAR analyses are qualitative, predicting biologic activity on an ordinal or categorical scale, whereas quantitative SAR (QSAR) analyses use statistical methods to correlate structural descriptors with biologic responses and predict biologic activity on an interval or continuous scale.
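As a concrete illustration of the QSAR idea described above (statistical correlation of structural descriptors with a continuous biologic response), here is a minimal least-squares sketch; the descriptors (log P, molecular weight), the training values, and the activity scale are all hypothetical.

```python
# Minimal QSAR-style sketch (illustrative only, not from the report):
# ordinary least squares relating hypothetical structural descriptors
# (log P, molecular weight) to a continuous activity such as log(1/LC50).
import numpy as np

# Hypothetical training set: one row per chemical -> [logP, MW]
X = np.array([[1.2, 120.0], [2.5, 180.0], [3.1, 210.0], [0.8, 95.0], [2.0, 150.0]])
y = np.array([3.4, 4.6, 5.1, 3.0, 4.2])  # hypothetical log(1/LC50) values

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict activity for a new (hypothetical) chemical from its descriptors.
new_chem = np.array([1.0, 1.8, 140.0])  # [intercept term, logP, MW]
print("predicted activity:", float(new_chem @ coef))
```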
From page 212...
... Physiologically Based Pharmacokinetic Models
PBPK models predict distribution of chemicals throughout the body and describe the interactions of chemicals with biologic targets.
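The following is a minimal sketch of the kind of flow-limited compartmental calculation a PBPK model performs, reduced here to a blood and a liver compartment with hepatic clearance; all parameter values and the dose are hypothetical and chosen only to show the structure of such a model.

```python
# Minimal flow-limited PBPK sketch (illustrative, not the report's model):
# two compartments (blood and liver) with hepatic clearance. All parameter
# values and the intravenous dose below are hypothetical.
Q_liver = 90.0      # liver blood flow (L/h), hypothetical
V_blood = 5.0       # blood volume (L)
V_liver = 1.8       # liver volume (L)
P_liver = 4.0       # liver:blood partition coefficient, hypothetical
CL_int = 20.0       # intrinsic hepatic clearance (L/h), hypothetical
dose_mg = 10.0      # intravenous bolus dose (mg)

c_blood, c_liver = dose_mg / V_blood, 0.0
dt, t_end, t = 0.001, 8.0, 0.0
while t < t_end:
    c_liver_out = c_liver / P_liver            # venous concentration leaving liver
    dA_blood = Q_liver * (c_liver_out - c_blood)
    dA_liver = Q_liver * (c_blood - c_liver_out) - CL_int * c_liver_out
    c_blood += dA_blood / V_blood * dt
    c_liver += dA_liver / V_liver * dt
    t += dt
print(f"blood and liver concentrations at {t_end} h: {c_blood:.3f}, {c_liver:.3f} mg/L")
```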
From page 213...
... BBDR models also may be used to improve the experimental design of toxicology studies so that data needs for risk assessment are fulfilled.
Computational Approaches to Predicting Metabolic Fate
Predicting metabolic fate is important in determining the risks associated with environmental exposure to chemicals.
From page 214...
... Many docking software products and scoring algorithms have been developed and are commercially available through organizations such as the Cambridge Crystallographic Data Centre (CCDC 2005; Tripos 2005; Accelrys 2005; Schrödinger 2005)
From page 215...
... There are opportunities to link PBPK models with SAR approaches. PBPK models require a variety of input parameters, including partition coefficients, metabolic parameters, and rates of metabolism of test compounds.
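One simple way to picture the PBPK-SAR linkage mentioned above is to estimate a required input parameter, such as a tissue:blood partition coefficient, from a structure-derived property and feed it into the PBPK parameter set. The relation and coefficients below are invented for illustration and are not taken from any published QSAR.

```python
# Illustrative sketch of linking a structure-based estimate to a PBPK input:
# a hypothetical regression from log P to a liver:blood partition coefficient
# supplies one of the parameters a PBPK model needs. Coefficients are made up.
def estimate_liver_partition(log_p: float) -> float:
    """Hypothetical QSAR-style relation: partition coefficient from log P."""
    a, b = 0.9, 1.1          # made-up regression coefficients
    return max(0.1, a + b * log_p)

pbpk_params = {
    "Q_liver": 90.0,                                 # L/h, hypothetical
    "V_liver": 1.8,                                  # L
    "P_liver": estimate_liver_partition(log_p=2.3),  # structure-derived input
}
print(pbpk_params)
```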
From page 216...
... It may eventually be possible to link SAR, PBPK, and BBDR models to predict dose-response behaviors for the perturbations of cellular signaling networks by exogenous compounds and to provide estimates of risk to exposed humans.
VALIDATION
New or revised toxicity-testing methods for regulatory toxicology are developed for a number of reasons, such as to increase chemical throughput, to provide more detailed information about individual chemicals, to reduce animal use and suffering, and to decrease costs associated with testing.
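As a toy illustration of the dose-response linkage described earlier on this page, the sketch below maps a PBPK-derived internal dose to a predicted fractional response with a Hill-type function; the EC50, Hill coefficient, and dose values are hypothetical.

```python
# Illustrative Hill-type dose-response sketch (all parameters hypothetical):
# a PBPK-derived internal dose metric is mapped to a predicted fractional
# response, one simple form such a SAR/PBPK/BBDR linkage might take.
def hill_response(internal_dose: float, ec50: float = 1.5, n: float = 2.0) -> float:
    """Fractional response between 0 and 1 for a given internal dose (mg/L)."""
    return internal_dose ** n / (ec50 ** n + internal_dose ** n)

for dose in (0.1, 0.5, 1.5, 5.0, 20.0):
    print(f"internal dose {dose:5.1f} mg/L -> predicted response {hill_response(dose):.2f}")
```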
From page 217...
... Validation is one of several phases in the evolution of a test method from conception to application. The phases are test development, prevalidation or test optimization, formal validation, independent assessment, and regulatory acceptance.
From page 218...
... . • Many validation efforts compare a new test method undergoing validation with an existing animal-based test for the same end point.
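Comparisons of this kind are typically summarized with simple 2x2 performance measures. The sketch below, using made-up counts, shows how sensitivity, specificity, and concordance of a new test method against an animal-based reference test might be computed.

```python
# Minimal sketch (hypothetical counts): comparing a new test method against
# an existing animal-based reference test for the same end point, using the
# usual 2x2 performance measures reported in validation studies.
tp, fp, fn, tn = 42, 6, 9, 43   # made-up counts of positive/negative calls

sensitivity = tp / (tp + fn)            # reference positives detected by the new test
specificity = tn / (tn + fp)            # reference negatives correctly called negative
concordance = (tp + tn) / (tp + fp + fn + tn)

print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, "
      f"concordance {concordance:.2f}")
```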
From page 219...
... In addition to their guidance on validation principles, ICCVAM and the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) have issued practical guidance on submitting validation data for assessment and nominating promising test methods for further development or validation (ICCVAM/NICEATM 2004)
From page 220...
... It should be emphasized that the formal validation process applies to methods intended for immediate regulatory testing. It is not intended for methods that, for example, are used only in-house in industry or are purely investigational or newly emerging.
From page 221...
... CCDC (Cambridge Crystallographic Data Centre)
From page 222...
... ECVAM (European Centre for the Validation of Alternative Methods)
From page 223...
... EPA 100/B-04/002. Genomics Task Force Workgroup, Science Policy Council, U…
From page 224...
... ICCVAM–NICEATM (Interagency Coordinating Committee on the Validation of Alternative Methods–National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods)
From page 225...
... MetaSite. Molecular Discovery Ltd [online]
From page 226...
... 1999. The Murine Local Lymph Node Assay: A Test Method for Assessing the Allergic Contact Dermatitis Potential of Chemicals/Compounds.
From page 227...
... Herts, England: Universities Federation for Animal Welfare.
From page 228...
... 2000. Introduction: Reducing unrelieved pain and distress in laboratory animals using humane endpoints.
From page 229...
... 2002. Allelic variation in human gene expression.

