Appendix F
Poster Abstracts

Use of early effect biomarker data to enhance dose-response models of lung tumors in rats exposed to titanium dioxide


Bruce Allen,1 Andrew Maier,2 Alison Willis,2 and Lynne T. Haber2

1Bruce Allen Consulting, Chapel Hill, NC; 2Toxicology Excellence for Risk Assessment, Cincinnati, OH

The use of precursor data (biomarkers of effect) directly in risk assessments is increasingly emphasized as the future of toxicology. Because the biomarkers themselves have undergone only limited validation and vetted approaches for incorporating biomarker data quantitatively into dose-response assessments are lacking, current applications of the growing pool of biomarker data are constrained to hazard characterization. This paper presents a dose-response modeling approach that incorporates biomarker data on the lung-tumor response in rats exposed to titanium dioxide (TiO2). We used a series of linked “cause-effect” functions, fitted with a likelihood approach, to describe the relationships between successive key events and the ultimate tumor response. That approach was used to evaluate a hypothesized pathway for biomarker progression from a biomarker of exposure (lung burden), through several intermediate potential biomarkers of effect, to the clinical effect of interest (lung-tumor production). The model evaluated the contribution of several intermediate effect biomarkers to the dose-response behavior for lung tumors. These effect biomarkers included the polymorphonuclear-leukocyte (PMN) count indicative of inflammation, proteins in bronchoalveolar-lavage fluid (BALF) indicative of alveolar damage, pulmonary-fibrosis incidence, and alveolar-cell proliferation. Interestingly, when the model allowed either fibrosis (and its precursors) or cell proliferation (and its precursors) or both to predict the tumor response, the cell-proliferation data provided no additional predictive power beyond that of the fibrosis response. Overall, the likelihood-maximization approach allowed the calculation of a lung-burden-based benchmark dose for lung tumors that directly incorporated data on biomarkers of exposure and effect. The biomarker-based modeling approach provided a more refined dose-response estimate than that obtained by using the traditional approach of dose-response modeling based only on tumor data.
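
The abstract does not give its equations, so the following is only a minimal sketch of the general idea: two linked logistic "cause-effect" links (lung burden to fibrosis probability, fibrosis response to tumor probability) fitted jointly by maximum likelihood, followed by a benchmark-dose calculation for 10% extra tumor risk. The dose groups, incidences, and link forms are hypothetical, not the authors' data or model.

```python
# Hedged sketch (not the authors' actual model): two linked key-event functions
# fitted by maximum likelihood, then a BMD10 on the lung-burden scale.
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.special import expit

burden = np.array([0.0, 0.5, 2.0, 8.0, 20.0])   # hypothetical lung burdens (mg)
n      = np.array([50, 50, 50, 50, 50])          # animals per group (hypothetical)
fib    = np.array([1, 3, 10, 25, 40])            # fibrosis incidence (hypothetical)
tum    = np.array([0, 1,  4, 12, 22])            # tumor incidence (hypothetical)

def neg_log_lik(theta):
    a0, a1, b0, b1 = theta
    p_fib = expit(a0 + a1 * burden)              # link 1: burden -> fibrosis
    p_tum = expit(b0 + b1 * p_fib)               # link 2: fibrosis response -> tumor
    eps = 1e-9
    ll  = np.sum(fib * np.log(p_fib + eps) + (n - fib) * np.log(1 - p_fib + eps))
    ll += np.sum(tum * np.log(p_tum + eps) + (n - tum) * np.log(1 - p_tum + eps))
    return -ll

fit = minimize(neg_log_lik, x0=[-3.0, 0.3, -4.0, 5.0], method="Nelder-Mead")
a0, a1, b0, b1 = fit.x

def p_tumor(d):
    return expit(b0 + b1 * expit(a0 + a1 * d))

p0  = p_tumor(0.0)
bmd = brentq(lambda d: (p_tumor(d) - p0) / (1 - p0) - 0.10, 1e-6, 100.0)
print(f"burden-based BMD10 ~ {bmd:.2f} mg (illustrative only)")
```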




Physiological modeling of metabolic interactions

Frédéric Y. Bois

INERIS, Verneuil en Halatte, France

Purpose: Modeling metabolic interactions between chemicals can be a formidable task in model development. This presentation demonstrates a new approach and the capabilities of new tools to facilitate that development.

Methods: Individual models of metabolic pathways are automatically merged and coupled to a template physiologically based pharmacokinetic (PBPK) model by using the GNU MCSim software. The global model generated is very efficient and able to simulate the interactions between a theoretically unlimited number of substances. Development time increases only linearly with the number of substances considered, while the number of possible interactions increases exponentially.

Results: An example of application of the approach to the prediction of the kinetics of a mixture of 30 arbitrary chemicals is shown. The qualitative and quantitative behavior of the corresponding pathway network is analyzed by using Monte Carlo simulations. In our example, the number of significant interactions, given the uncertainty and variability in the pharmacokinetics and metabolism of those substances, is much lower than the theoretically possible number of interactions.

Conclusion: The integrative approach to interaction modeling is efficient and can be extended beyond metabolic interactions. It relies on the availability of specific data on the rate constants of individual reactions. Such data could be obtained through unconventional experiments in enzyme kinetics or through ab initio chemical modeling of enzymatic reactions. We are currently exploring both approaches.
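
A toy illustration of the kind of coupling the abstract describes, not GNU MCSim itself: two chemicals cleared by the same hepatic enzyme, so each one's Michaelis-Menten clearance is competitively inhibited by the other, with Monte Carlo sampling used to ask how often the interaction matters. All kinetic constants and the 20% cutoff are assumed for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp, trapezoid

rng = np.random.default_rng(1)

def rates(t, c, vmax, km):
    c = np.maximum(c, 0.0)
    # competitive interaction: each substrate inflates the other's apparent Km
    v1 = vmax[0] * c[0] / (km[0] * (1 + c[1] / km[1]) + c[0])
    v2 = vmax[1] * c[1] / (km[1] * (1 + c[0] / km[0]) + c[1])
    return [-v1, -v2]

def auc_chem1(c0, vmax, km):
    sol = solve_ivp(rates, (0, 24), c0, args=(vmax, km), dense_output=True, max_step=0.1)
    t = np.linspace(0, 24, 500)
    return trapezoid(sol.sol(t)[0], t)

significant = 0
for _ in range(200):                        # Monte Carlo over uncertain kinetics
    vmax = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=2)
    km   = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=2)
    alone   = auc_chem1([1.0, 0.0], vmax, km)
    mixture = auc_chem1([1.0, 1.0], vmax, km)
    if mixture / alone > 1.2:               # >20% AUC change called "significant" here
        significant += 1
print(f"{significant}/200 parameter draws show a significant interaction")
```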

The role of oxysterols in a computational steroidogenesis model of human H295R cells to improve predictability of biochemical responses to endocrine disruptors

M. Breen,1,2 M.S. Breen,3 A.L. Lloyd,1 and R.B. Conolly2

1Biomathematics Program, Department of Statistics, North Carolina State University, Raleigh, NC; 2National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC; 3National Exposure Research Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC

Steroids, which have an important role in a wide range of physiological processes, are synthesized primarily in the gonads and adrenal glands through a series of enzyme-mediated reactions. The activity of steroidogenic enzymes can be altered by a variety of endocrine disruptors (EDs), some of which are environmental contaminants. We are developing a dynamic computational model of the metabolic network of adrenal steroidogenesis in a human H295R cell line to predict the synthesis and secretion of adrenocortical steroids (such as mineralocorticoids, glucocorticoids, androgens, and estrogens) and the biochemical response to EDs. We previously developed a deterministic model that describes the biosynthetic pathways for the conversion of cholesterol to adrenocortical steroids and the kinetics of enzyme inhibition by the ED metyrapone (MET). In this study, we extended the model by adding the pathway of oxysterol biosynthesis. Oxysterols are endogenous products of cholesterol unrelated to steroidogenesis. Experiments were performed to measure concentrations of cholesterol and 14 steroids in human H295R cells by using LC/MS/MS and ELISA methods. Model parameters were estimated with an iterative optimization algorithm. Results show that the model fit improved with the extended model. Model predictions closely correspond to time-course measurements of both cholesterol and steroid concentrations from control and dose-response experiments with MET. Our study demonstrates the feasibility of using the computational model of adrenal steroidogenesis to predict the in vitro adrenocortical steroid concentrations from human H295R cells. This capability could be useful to define mechanisms of action of poorly characterized chemicals and mixtures in support of the H295R steroidogenesis screening system and predictive risk assessments.

Disclaimer: This work was supported in part by the NCSU/EPA Cooperative Training Program in Environmental Sciences Research, Training Agreement CT833235-01-0 with North Carolina State University. Although this work was reviewed by the U.S. Environmental Protection Agency and approved for publication, it may not necessarily reflect official agency policy.
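
The following is an illustrative sketch of the general workflow the abstract describes (an enzyme-kinetic pathway model with inhibitor action, refit by iterative optimization), using a deliberately simplified two-step surrogate pathway; the species, rate laws, inhibitor constant, and "measurements" are all made up and are not the authors' H295R model.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

t_obs = np.array([0, 4, 8, 12, 24, 48], dtype=float)      # hours (hypothetical)

def simulate(params, met, t_eval):
    k1, k2, ki = params
    def odes(t, y):
        chol, prec, ster = y
        v1 = k1 * chol
        v2 = k2 * prec / (1 + met / ki)    # MET treated as a competitive-style inhibitor
        return [-v1, v1 - v2, v2]
    sol = solve_ivp(odes, (0, t_eval[-1]), [1.0, 0.0, 0.0], t_eval=t_eval)
    return sol.y[2]                         # "steroid" concentration

# hypothetical measurements: control and one MET dose (true params 0.3, 0.2, 5.0)
obs_ctrl = simulate([0.3, 0.2, 5.0], met=0.0,  t_eval=t_obs) + 0.01
obs_met  = simulate([0.3, 0.2, 5.0], met=10.0, t_eval=t_obs) + 0.01

def residuals(p):
    return np.concatenate([simulate(p, 0.0, t_obs) - obs_ctrl,
                           simulate(p, 10.0, t_obs) - obs_met])

fit = least_squares(residuals, x0=[0.1, 0.1, 1.0], bounds=(1e-6, 100.0))
print("estimated k1, k2, Ki:", fit.x)
```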

Toxicity pathway modeling of chemically induced oxidative stress: a case study with trichloroethylene

C.J. Brinkerhoff, S.S. Isukapalli, and P.G. Georgopoulos

Environmental and Occupational Health Sciences Institute, a Joint Institute of the University of Medicine and Dentistry of New Jersey-Robert Wood Johnson Medical School and Rutgers University, Piscataway, NJ

Oxidative stress is an important toxicity pathway that can lead to cellular injury and carcinogenesis. Mechanistic understanding and predictive modeling of this pathway is expected to improve risk assessment for environmental exposures significantly. Trichloroethylene (TCE) is a commonly encountered contaminant that is known to cause oxidative stress in liver tissue, where it is metabolized; it is a known carcinogen in rodents and a suspected carcinogen in humans. Human exposures to TCE are common because it is a ubiquitous contaminant in air and water owing to its widespread use as a degreaser and general-purpose solvent.

A new combined physiologically based pharmacokinetic (PBPK) and biologically based dose-response (BBDR) model is presented for studying the impact of TCE exposures. This model describes oxidative stress in a mechanistic manner. Many of the toxic effects of TCE are hypothesized to be due to the metabolites trichloroacetate and dichloroacetate rather than to TCE alone. Therefore, the mathematical formulation of the toxicity pathway for oxidative-stress generation includes TCE and its metabolites and considers direct exposures to the metabolites. This new model improves on existing TCE models by including a toxicity-response pathway and by considering the impact of direct exposures to metabolites. Model parameters were estimated from in vitro mouse liver-slice measurement data from the literature; these data include amounts of free radicals and lipid peroxidation after TCE exposures. Specifically, model parameters were identified in a sequential manner, first using in vivo measurements of oxidative stress in mice after exposures to TCE metabolites and then using data from experiments involving exposures to TCE. The model can thus estimate relative contributions of each chemical to the total oxidative stress caused by a dose of TCE. Furthermore, the addition of the toxicity-response pathway in the new model resulted in successful predictions of oxidative stress produced in the liver over periods ranging from hours to weeks; such consistent predictions on large time scales were not possible with existing models. The modeling approach used here can be generalized to other chemicals and toxicity pathways.

Disclaimer: Support for this work has been provided by U.S. Environmental Protection Agency grant GAD R 832721-010 and National Institute of Environmental Health Sciences grant P30ES005022. This work has not been reviewed by and does not represent the opinions of the funding agencies.

Use of renal biomarkers to characterize toxicity-based pathways of nephrotoxicity

Ronald Brown,1 Peter Goering,1 and Jun Zhang2

1U.S. Food and Drug Administration, Center for Devices and Radiological Health, Silver Spring, MD; 2U.S. Food and Drug Administration, Center for Drug Evaluation and Research, Silver Spring, MD

The classic nephrotoxic agents gentamicin (Gen), mercury (Hg), and chromium (Cr) have been shown to produce acute kidney injury (AKI) at fairly well-defined sites in the nephron. This study was conducted to (1) determine the role of the nitric oxide (NO) toxicity-based pathway in the renal pathogenesis of those agents and (2) determine the correlation between immunohistochemical expression of biomarkers of NO and expression of two AKI biomarkers, kidney injury molecule-1 (KIM-1) and renal papillary antigen-1 (RPA-1), in the proximal tubules and collecting ducts, respectively. Sprague-Dawley rats were given injections of Gen (100 mg/kg, sc, daily for 3 days), HgCl2 (0.25 mg Hg/kg, iv), K2Cr2O7 (5 mg Cr/kg, sc), or vehicle (control). At 24 or 72 h after the last dose, kidneys were collected, and formalin-fixed renal sections were used for histopathologic evaluation and immunostaining for inducible NO synthase (iNOS), endothelial NO synthase (eNOS), nitrotyrosine (NT), KIM-1, and RPA-1. Kidneys from rats treated with Gen, Hg, or Cr exhibited increased expression of iNOS, eNOS, and NT, which correlated with the severity of renal histopathology and the tissue expression of biomarkers of AKI, specifically KIM-1 and RPA-1. Those findings suggest that the acute nephrotoxic effects of Gen, Hg, or Cr are associated with a toxicity-based pathway involving NO and nitrosative stress. The findings also show how renal biomarkers can be used to characterize toxicity-based pathways of kidney injury.

PBPK models, BBDR models, and virtual tissues: How will they contribute to the use of toxicity pathways in risk assessment?

Rory Conolly, Imran Shah, and Thomas Knudsen

U.S. Environmental Protection Agency, National Center for Computational Toxicology, Office of Research and Development, Research Triangle Park, NC

Accuracy in risk assessment, which is desirable to ensure protection of the public health while avoiding overregulation of economically important substances, requires quantitatively accurate descriptions of in vivo dose-response and time-course behaviors. This level of detailed characterization is desirable when substances are economically important or environmentally persistent. The report Toxicity Testing in the 21st Century: A Vision and a Strategy emphasizes in vitro studies, with bioinformatics and systems-modeling approaches used to predict in vivo behavior. Physiologically based pharmacokinetic (PBPK) and biologically based dose-response (BBDR) models and virtual tissues (VTs) will all be important in these extrapolations. PBPK models describe the relationship between external exposure and target-site dose. BBDR models extend PBPK models to include the linkage between target-site dose, key events, and end-point effect. These mathematical models typically have compartments that correspond to whole tissues (such as liver, kidney, and lung) and typically contain limited tissue-specific data. VT models, such as the U.S. Environmental Protection Agency v-Liver™ and v-Embryo™ projects, are computational models that will encode sufficient biological information to support significant predictive capabilities, with higher-level behaviors emerging from the structures encoded at more fundamental levels of organization. PBPK, BBDR, and VT models are thus complementary to one another, each having the ability to facilitate the interpretation of in vitro data with respect to in vivo significance and predicting dosimetry at finer levels of biological detail. This enhanced capability will help to identify (1) the doses or concentrations for in vitro studies that correspond to realistic levels of exposure in vivo and (2) relevant descriptions of the tissue-dose-end-point response continuum. It will also contribute to pathway-based assessment of specific biological processes and toxicities.

Disclaimer: Although this work was reviewed by the U.S. Environmental Protection Agency and approved for publication, it may not necessarily reflect official agency policy.

Improving mode-of-action analysis using transcript profiling in nullizygous mouse models

Chris Corton,1 Susan Hester,1 Lauren Aleksunes,2 Hongzu Ren,1 Beena Vallanat,1 Michael George,1 Mitch Rosen,1 Barbara Abbott,1 Curt Klaassen,2 Stephen Nesnow,1 and Chris Lau1

1U.S. Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC; 2Kansas University Medical Center, Kansas City, KS

A number of nuclear receptors (NRs) mediate transcriptional hepatocyte growth and carcinogenic effects in the rodent liver after chemical exposure. These receptors include the constitutive activated/androstane receptor (CAR), pregnane X receptor (PXR), and peroxisome proliferator-activated receptor alpha (PPAR-α). We hypothesized that transcriptional analysis in the livers of exposed wild-type and NR-null mice can strengthen the weight of evidence for mode-of-action analysis of environmentally relevant chemicals. We tested this hypothesis by examining gene expression by Affymetrix arrays in the livers of wild-type mice and mice nullizygous for CAR, PXR, or PPAR-α exposed to chemicals of a number of classes, including perfluoroalkyl acids, conazole fungicides, and a phthalate-ester plasticizer. A comparison of gene expression in exposed wild-type and nullizygous mice revealed the extent of the involvement of the receptors in mediating the effects of the chemicals. Three major conclusions can be drawn from these studies. (1) Many chemicals activate more than one receptor in the mouse liver. Perfluorooctanoic acid (PFOA) and di-2-ethylhexyl phthalate (DEHP) exhibit features of activation of PPAR-α and CAR in wild-type mice. In the absence of PPAR-α, perfluorooctane sulfonate (PFOS) activates CAR. (2) Chemicals exhibit differences in receptor dependence. Approximately 99%, 92%, and 85% of the genes regulated by WY-14,643 (a PPAR-α agonist), PFOS, and PFOA, respectively, were dependent on PPAR-α for altered expression. These results indicate that PPAR-α plays a dominant role in mediating the effects of these chemicals despite minor differences in the extent of receptor dependence. (3) In the absence of the receptor that mediates the majority of effects, some chemicals exhibit greater induction of alternative pathways. PFOA and PFOS activate a CAR-like signature to a greater extent in PPAR-α-null mice than in wild-type mice. Because PPAR-α and CAR exhibit antagonistic effects, greater CAR-like effects may occur through loss of repression by PPAR-α. These findings demonstrate that PPAR-α plays a necessary role in mediating the effects of PFOA, PFOS, and DEHP. In conclusion, coupling genomewide transcript profiling in different genetic backgrounds can be valuable in determination of the mode of action of liver-tumor induction by environmentally relevant chemicals.

Disclaimer: This abstract does not represent U.S. Environmental Protection Agency policy.

Issues in using human variability distributions to estimate low-dose risk

Kenny S. Crump,1 Weihsueh Chiu,2 and Ravi P. Subramaniam2

1Louisiana Tech University, Ruston, LA; 2U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment, Research Triangle Park, NC

Low-dose extrapolation and accounting for human variability in susceptibility remain among the key issues in implementing the recommendations of the National Research Council report Toxicity Testing in the 21st Century: A Vision and a Strategy (NRC 2007). In Science and Decisions: Advancing Risk Assessment (NRC 2008), a separate committee made numerous recommendations on addressing these issues in the current context of toxicity testing that may also be applicable to the envisioned 21st century toxicity-testing paradigm. Among the recommendations is a proposal to estimate low-dose risks by using models derived from human variability distributions (HVDs). In the existing approaches to HVD modeling, log-normal distributions are estimated from data on various pharmacokinetic and pharmacodynamic parameters that impact individual sensitivities to the toxic response. These distributions are combined into an overall log-normal distribution for the product of the individual parameters by adding their variances. The log-normal distribution is transferred to the dose axis by centering it at a point-of-departure (POD) dose usually estimated from animal data. The resulting log-normal distribution is used to quantify low-dose risk.

This poster examines the implications of various assumptions in this approach:

• Existing approaches to HVD modeling generally assume that the distribution of individual threshold doses determined by dichotomizing a continuous apical response is log-normal. This assumption is incompatible with an assumption that the apical responses themselves are log-normal (except under highly specialized conditions). However, the two assumptions generally lead to very different risk estimates.

• The assumption that risk can be expressed as a function of a product of independent parameters lacks phenomenological support. A demonstration is provided that shows that this assumption is generally invalid.

• Even if these problems were not present, such modeling would be unreliable because of model uncertainty. As demonstrated herein, distributions other than the log-normal distribution can describe the available data on the parameters affecting sensitivity equally well but provide low-dose risk estimates that differ by orders of magnitude from those obtained by using the log-normal distribution.

• These issues remain whether the end point of interest is an apical toxic response or a specified degree of toxicity-pathway perturbation.

In view of these problems, we recommend caution in the use of HVD modeling as a general approach to setting exposure standards for human exposures to toxic chemicals.

Disclaimer: The views expressed in this poster represent those of the authors and do not reflect the views or policies of the U.S. Environmental Protection Agency.
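
A hedged numerical illustration of the model-uncertainty point in the third bullet, using simulated data rather than the authors' analysis: a log-normal and a log-logistic distribution fitted to the same human-variability sample can describe it comparably well yet imply low-dose risks that differ by orders of magnitude, because the risk is read far out in the lower tail.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical sample of individual "threshold dose" factors (relative to a POD)
sample = rng.lognormal(mean=0.0, sigma=np.log(3.0), size=200)

lognorm_fit = stats.lognorm.fit(sample, floc=0)   # log-normal
fisk_fit    = stats.fisk.fit(sample, floc=0)      # log-logistic alternative

# both describe the observed data comparably (similar log-likelihood) ...
ll_ln = np.sum(stats.lognorm.logpdf(sample, *lognorm_fit))
ll_fk = np.sum(stats.fisk.logpdf(sample, *fisk_fit))
print(f"log-likelihood: log-normal {ll_ln:.1f}, log-logistic {ll_fk:.1f}")

# ... but predict very different fractions of the population affected at a dose
# 1,000-fold below the POD (the "low-dose risk" in HVD modeling)
dose = 1e-3                                       # as a fraction of the POD
risk_ln = stats.lognorm.cdf(dose, *lognorm_fit)
risk_fk = stats.fisk.cdf(dose, *fisk_fit)
print(f"risk at POD/1000: log-normal {risk_ln:.2e}, log-logistic {risk_fk:.2e}")
```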

Major challenges to biologically based dose-response modeling for estimating low-dose human risk using molecular toxicology data

Kenny S. Crump,1 Chao Chen,2 Weihsueh A. Chiu,2 Thomas A. Louis,3 Ravi P. Subramaniam,2 and Christopher J. Portier4

1Louisiana Tech University, Ruston, LA; 2U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment, Research Triangle Park, NC; 3Johns Hopkins Bloomberg School of Public Health, Baltimore, MD; 4National Institute of Environmental Health Sciences, Research Triangle Park, NC

The strength of recent advances in molecular toxicology arises from the potential to provide information on proximal markers of dose and on early markers of contributions from multiple pathways to diseased states. Biologically based dose-response (BBDR) modeling can incorporate data on biological processes at the cellular and molecular levels to link external exposure to an adverse effect, and such modeling has therefore been suggested by some as a viable link between data generated on toxicity-pathway perturbations and estimates of risk of adverse responses at doses of concern to humans. This poster presents the point of view that there are likely serious impediments to developing BBDR models for this specific purpose. Such models have been used profitably to evaluate proposed mechanisms of toxicity and to identify data gaps. However, the application of these models to the quantitative prediction of low-dose human risk (limited so far to clonal-growth modeling for predicting cancer) has not improved the reliability of such predictions despite the extensive effort expended. That is because BBDR models do not eliminate the need for empirical modeling of the relationship between dose and effect but only move it from the whole organism to a lower level of biological organization. Moreover, in doing this, BBDR models introduce significant new sources of uncertainty. Quantitative inferences from the data are limited by inter- and intra-individual heterogeneity that cannot be eliminated with available or reasonably anticipated experimental techniques. BBDR modeling also does not avoid uncertainties in the mechanisms of toxicity relevant to low-level human exposures. We are not recommending against research to develop BBDR models, which have many potential uses. However, we recommend that before a BBDR model is used to set human exposure standards, an evaluation of the robustness of the model predictions in light of the limitations and sources of uncertainty described above be performed. Furthermore, efforts to develop BBDR models for risk assessment should consider their resource and time requirements vis-à-vis their potential benefits.

Disclaimer: The views expressed in this poster represent those of the authors and do not reflect the views or policies of the U.S. Environmental Protection Agency.

One vision of the future use of in vitro data in setting human exposure standards: familiar problems and familiar solutions

Kenny Crump1 and Tom Louis2

1Louisiana Tech University, Ruston, LA; 2Johns Hopkins Bloomberg School of Public Health, Baltimore, MD

Detailed mathematical models of biological processes have not proved capable of providing useful quantitative information regarding dose-response curve shapes at very low doses. Because we expect that this state of affairs will continue in the foreseeable future, we support the vision of the National Research Council committee that produced Toxicity Testing in the 21st Century: A Vision and a Strategy of not recommending development of quantitative estimates of human risk from in vitro data. Rather, efforts should focus on estimating human exposure levels that would not be expected to cause perturbations in normal physiology that could result in adverse health consequences in humans. In considering how this approach is likely to play out in practice, we consider the following idealized situation: A critical toxic pathway has been identified for a chemical, and dose-response data exist for an in vitro assay for this pathway. A physiologically based pharmacokinetic (PBPK) model is also available that relates human exposure to delivery of the chemical to the critical cells in humans. How should this information be used to set an exposure standard?

This problem is conceptually very similar to what is currently faced in using data from whole-animal tests, and we expect the way forward to be similar. We envision that it will necessarily involve the rather low-technology approach of defining a point of departure (POD), applying safety factors to the POD to arrive at a cell concentration expected not to cause adverse perturbations, and using a PBPK model to relate the resulting concentration at the cell level to human exposure. Setting these factors will involve a considerable dose of scientific judgment. Application of the approach may also require changes in environmental laws. On the basis of experience with attempts to use biologically based dose-response (BBDR) models to quantify dose-response shapes at low doses, we predict that detailed mechanistic models developed with computational systems biology will impact the process qualitatively (perhaps through deciding among broad categories of dose responses that could influence safety factors) rather than provide direct numerical input.

To achieve the goal of protecting sensitive subgroups, we recommend testing many human cell lines from representative populations and from targeted populations that may be especially sensitive to perturbations of specific toxicity pathways.
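
A minimal numerical sketch of the workflow the abstract envisions, under illustrative assumptions: take an in vitro point of departure, divide by safety factors to reach a cell concentration judged not to cause adverse perturbation, then invert a simple steady-state exposure-to-cell relation to find the corresponding air concentration. Every number and the one-parameter steady-state relation are placeholders, not a recommended procedure.

```python
pod_cell_uM          = 5.0    # in vitro POD for the critical pathway (assumed)
uf_variability       = 10.0   # human variability factor (assumed)
uf_in_vitro_to_vivo  = 3.0    # extrapolation uncertainty factor (assumed)
target_cell_uM = pod_cell_uM / (uf_variability * uf_in_vitro_to_vivo)

# assumed steady-state PBPK summary: C_cell = alpha * C_air, with alpha lumping
# absorption, clearance, and tissue partitioning for the chemical of interest
alpha_uM_per_ppm = 0.02       # hypothetical proportionality constant
allowable_air_ppm = target_cell_uM / alpha_uM_per_ppm
print(f"target cell conc: {target_cell_uM:.3f} uM -> allowable air conc ~ {allowable_air_ppm:.1f} ppm")
```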

Strategic direction and application of computational models, -omics, and HTS approaches for the risk assessment of industrial and pesticide chemicals

V. Dellarco, T. Henry, N. Kramek, J. Mclain, P. Sayre, J. Seed, and S. Bradbury

U.S. Environmental Protection Agency, Office of Prevention, Pesticides, and Toxic Substances, Washington, DC

Significant advances have been made in human health and ecological risk assessment over the last decade. Substantial challenges, however, remain in providing credible scientific information in a timely and efficient manner to support risk-assessment and risk-management decisions under the various statutes (such as the Toxic Substances Control Act, the Federal Insecticide, Fungicide, and Rodenticide Act, and the Food Quality Protection Act) that the Office of Prevention, Pesticides, and Toxic Substances (OPPTS) is responsible for implementing. A major challenge confronting OPPTS is the need for critical information to address risk uncertainties in large chemical inventories (such as high- and medium-production-volume industrial chemicals and pesticide inert ingredients); these uncertainties result from limited knowledge across chemical classes and their adverse outcomes. That information gap needs to be addressed in a reliable way, yet in a time- and cost-effective manner. Solutions to meeting this challenge include not just the generation of more data faster but also the determination of which chemical- and exposure-specific effects data are essential for managing and assessing the most likely risks. From a strategic and tactical view, elucidating the initiating and key events in critical toxicity pathways is necessary to reduce uncertainties associated with the use of in vitro assays, high-throughput screening (HTS), and in silico approaches. This need is articulated in the 2007 National Research Council report Toxicity Testing in the 21st Century: A Vision and a Strategy. Addressing the need and establishing libraries of critical toxicity pathways will provide a substantial basis for using in silico and HTS approaches to predict toxicity credibly, to support priority-setting and screening assessments, and to provide more robust scientific foundations for quantitative risk assessment. This poster describes the knowledge bases and computational tools OPPTS currently uses in its regulatory program, where the program aims to be in the short term and the long term, and our views of the potential risk-assessment and risk-management applications of computational toxicology.

Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect those of the U.S. Environmental Protection Agency.

Toxicity-pathway-based mode-of-action modeling for risk assessment

Stephen Edwards,1 Julian Preston,1 Andrew Geller,1 Annie Jarabek,1,2 Douglas C. Wolf,1 Elaine Cohen Hubal,3 Gerald Ankley,1 Hisham El-Masri,1 Imran Shah,3 John R. Fowle III,1 Jeffrey Ross,1 John Nichols,1 Kevin Crofton,1 Mike Devito,1 Ram Ramabhadran,1 Robert Kavlock,3 Rory Conolly,3 Sid Hunter,1 Tom Knudsen,3 and William Mundy1

1U.S. Environmental Protection Agency, National Health and Environmental Effects Laboratory, Research Triangle Park, NC; 2U.S. Environmental Protection Agency, National Center for Environmental Assessment, Research Triangle Park, NC; 3U.S. Environmental Protection Agency, National Center for Computational Toxicology, Research Triangle Park, NC

In response to the 2007 National Research Council report on toxicity testing in the 21st century, the U.S. Environmental Protection Agency (EPA) has entered into a memorandum of understanding with the National Human Genome Research Institute and the National Toxicology Program to jointly pursue ways to incorporate high-throughput methods into hazard identification and risk characterization. In collaboration with these organizations, EPA researchers are coordinating in vitro, laboratory animal, human, and ecological field studies with computational methods for data analysis and modeling to establish a path forward. For toxicity-pathway-based risk assessment to become a reality, research must be done to estimate exposure at the cellular level as well as to link perturbations of that toxicity pathway quantitatively to an adverse outcome (an apical end point as currently established in regulatory applications). The first task requires linking environmental concentrations to internal doses at which the toxicity pathways are active and defining the relationship between those internal doses and in vitro concentrations used for high-throughput screening. The second task requires the determination of the key events linking perturbations in the toxicity pathway to the resulting adverse outcome and quantitative in vivo parameter estimates for these key events relative to in vitro toxicity-pathway measurements. This poster describes EPA efforts to address the second task through the establishment of interdisciplinary teams to identify novel toxicity pathways, establish a mode of action (MOA) linking perturbation of a toxicity pathway to adverse outcomes, establish bioindicators of key events in the MOA, collect quantitative in vivo dose-response and time-course data for bioindicators to enable quantitative modeling, and create quantitative models for both human and […]

[…] effective methods to set priorities among chemicals for targeted in vivo testing and thus improve the efficiency of the use of animals in those bioassays. The chemicals selected for phase I are composed largely of a diverse set of pesticide active ingredients, whose EPA registration process included sufficient supporting in vivo data. These were supplemented with a number of nonpesticide, high-production-volume chemicals of environmental concern. Application of HTS to environmental toxicants is a novel approach to predictive toxicology and differs from what is required for drug-efficacy screening in several ways. Biochemical interaction of environmental chemicals is generally weaker than that seen with drugs and their intended targets. Additionally, the chemical-diversity space covered by environmental chemicals is much broader than that of pharmaceuticals. The ToxMiner™ database was created to link biological, metabolic, and cellular-pathway data to genes and in vitro assay data for the chemicals screened in the ToxCast phase I HTS assays. Also included in ToxMiner was human disease information, which correlated with ToxCast assays that target specific genetic loci. We have implemented initial pathway-inference and network analyses, which allow linkage of the types of adverse health outcomes with exposure to chemicals screened in phase I. This approach permits exploration of disease at a higher level of cellular and organismal organization, focusing on multiple, related disorders, and may aid in the understanding of common disease outcomes (such as cancer and immune disorders) that are characterized by locus heterogeneity. Through the use of the ToxMiner database and the analysis framework presented here, we hope to gain insight into relationships between potential disease states in humans and environmental chemicals and to contribute to the larger goals of toxicogenomics by clarifying the role of gene-environment interactions in pathobiology.

Disclaimer: Although this work was reviewed by the U.S. Environmental Protection Agency and approved for publication, it may not necessarily reflect official agency policy.

Computational xenobiotic metabolism prediction system for toxicity assessment

Fangping Mu and Helen H. Cui

Los Alamos National Laboratory, Los Alamos, NM

Biotransformation is the process whereby a substance, usually a foreign compound (xenobiotic), is chemically transformed in the body to form a metabolite or a variety of metabolites. Chemical transformations can activate a xenobiotic, rendering it toxic, or can alter a xenobiotic to a nontoxic species.

Expert systems represent state-of-the-art xenobiotic metabolism prediction systems. These systems are rule-based systems designed to identify functional-group transformations that occur in known reactions and then, by generalizing, to formulate reaction rules for global application. The rules can provide reasonable prediction of all possible metabolite formation. However, they commonly predict many more metabolites than are observed experimentally. Ranking of the possibility of metabolite formation is still not consistently available.

To overcome the significant number of false positives in rule-based systems for metabolism prediction, we investigated machine-learning technology for xenobiotic metabolism prediction. We collected human xenobiotic reactions from Elsevier MDL's Metabolite Database and classified reactions according to rules based on functional-group biotransformations. For each reaction rule, the reaction center can be well defined and is represented as a molecular-substructure pattern by using SMARTS, a language for describing molecular patterns. Using the SMARTS patterns, we identified potential reaction centers for each reaction class by using the identified metabolites in Elsevier MDL's Metabolite Database. Each set of potential reaction centers was divided into negative and positive examples. More than 23 atomic properties were used to model the topological, geometric, electronic, and steric environment of the atoms in the molecule. More than 42 molecular properties were used to model the shape, surface, energy, and charge distribution of the molecule. Support vector machines were used to separate the positive and negative examples in each reaction class. A total of 36 biotransformations were modeled. Results show that the overall sensitivity and specificity of the classifiers are around 87%.

To demonstrate the relevance of metabolism to toxicity, we used epoxide formation as an example. Epoxide hydrolase detoxifies molecules that have an epoxide moiety by hydrolyzing the epoxide to a diol. However, some stable epoxides are unsuitable substrates for this enzyme. We collected stable epoxides from Toxnet (http://toxnet.nlm.nih.gov/) and found 489 chemicals that contain epoxide moieties. The metabolism prediction model for epoxide hydrolysis predicted that only 24, or 4.9%, would be hydrolyzed enzymatically. Prediction of metabolism with this method can enhance the accuracy of toxicity assessment.
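
An illustrative sketch of the two ingredients the abstract describes: a SMARTS pattern to flag candidate reaction centers (here, epoxides) and a support-vector classifier over descriptors to separate reactive from non-reactive centers. The SMILES strings, the three toy descriptors, and the labels are placeholders, not the Metabolite Database training set or the authors' descriptor sets.

```python
from rdkit import Chem
from sklearn.svm import SVC
import numpy as np

epoxide = Chem.MolFromSmarts("C1OC1")                 # three-membered cyclic ether
for smi in ["C1CO1", "CC1CO1", "c1ccccc1", "CCO"]:    # ethylene oxide, propylene oxide, benzene, ethanol
    mol = Chem.MolFromSmiles(smi)
    print(smi, "epoxide centers:", len(mol.GetSubstructMatches(epoxide)))

# toy feature table: each row is one candidate reaction center with a few
# stand-in atomic/molecular descriptors; label = enzymatically transformed or not
X = np.array([[0.12, 1.0, 0.8],
              [0.10, 1.4, 0.7],
              [0.02, 2.5, 0.1],
              [0.05, 2.2, 0.2]])
y = np.array([1, 1, 0, 0])
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
print("predicted reactivity of the candidate centers:", clf.predict(X))
```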

A comparison of multiple methods to evaluate biphasic (hormetic) dose responses in high-throughput in vitro toxicology screens

Marc A. Nascarella1,2,* and Edward J. Calabrese2

1Gradient Corporation, Cambridge, MA; 2University of Massachusetts, Division of Environmental Health Sciences, Amherst, MA

*Presenting author

We describe several methods for evaluating the low-dose response in in vitro drug screens. This presentation focuses on previous work in which the response of yeast exposed to over 2,100 putative anticancer agents in a high-throughput drug screen was studied (Calabrese et al. 2006, 2008; Nascarella et al. 2009). We describe a methodology to evaluate the fundamental shape of the dose-response curve to determine whether there is nonrandom biological activity below the toxic threshold (Calabrese et al. 2006). We also show how the toxic threshold was estimated by using a benchmark-dose (BMD) procedure and how the distribution of responses at concentrations below the estimated toxic threshold was evaluated. This approach is followed by the description of a second methodology that uses a complementary but separate evaluation to determine the average magnitude of response and the distribution of mean responses for the putative anticancer agents (Calabrese et al. 2008). We selected concentration-response studies that had concentrations below the estimated BMD to determine the average magnitude of response below a toxic threshold. This analysis is novel in that we use a linear mixed model to predict the average response in the low-concentration zone for each anticancer agent. We describe how we estimated the average response for each of the anticancer agents by using the best linear unbiased prediction (BLUP) or empirical Bayes approach (presented with prediction intervals). This assessment provides a more accurate prediction of the true chemical mean response than a simple mean (because regression toward the mean affects only chemicals whose predictor differs from the mean, and not the mean itself). In a third line of inquiry, we demonstrate how we quantified the individual hormetic concentration responses by measuring the width of the hormetic zone, the interval from the maximum stimulatory concentration to the toxic threshold, and the amplitude of the maximum stimulation (Nascarella et al. 2009). We describe the advantages and disadvantages of using these multiple evaluation schemes in determining biological activity below the toxic threshold.
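
A hedged sketch of the shrinkage idea behind the BLUP / empirical-Bayes step: each chemical's mean below-threshold response is pulled toward the grand mean by an amount that depends on how noisy its own data are. The variance components here are computed crudely from toy data; the authors used a formal linear mixed model, so this is only a stand-in for the concept.

```python
import numpy as np

rng = np.random.default_rng(2)
# toy data: % of control response at sub-BMD concentrations for 5 chemicals
responses = [100 + rng.normal(5, 8, size=n) for n in (4, 6, 3, 8, 5)]

chem_means = np.array([r.mean() for r in responses])
chem_ns    = np.array([len(r) for r in responses])
grand_mean = np.concatenate(responses).mean()

s2_within  = np.mean([r.var(ddof=1) for r in responses])        # residual variance
s2_between = max(chem_means.var(ddof=1) - s2_within / chem_ns.mean(), 1e-6)

shrink = s2_between / (s2_between + s2_within / chem_ns)        # per-chemical weight
blup   = grand_mean + shrink * (chem_means - grand_mean)        # shrunken estimates

for m, b in zip(chem_means, blup):
    print(f"raw mean {m:6.1f}%  ->  shrunken estimate {b:6.1f}%")
```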

Application of toxicogenomics to develop a mode of action for a carcinogenic conazole fungicide

S. Nesnow,1 J. Allen,1 C. Blackman,1 P.-J. Chen,2 Y. Ge,1 S. Hester,1 L. King,1 P. Ortiz,1 J. Ross,1 S.-F. Thai,1 W. Ward,1 W. Winnik,1 and D. Wolf1

1U.S. Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC; 2National Taiwan University, Department of Agricultural Chemistry, Taipei, Taiwan

Conazoles are a common class of fungicides used to control fungal growth in the environment and in humans. Some of these agents have adverse toxicological outcomes in mammals as carcinogens, reproductive toxins, and hepatotoxins. We coupled the results of genomic analyses with traditional laboratory investigations (in toxicology, molecular biology, and biochemistry) to propose a mode of action (MOA) for the carcinogenic activity of propiconazole in mouse liver. A key element of the approach was to use activity-inactivity pairs of conazoles. This allowed the sequestration of the genomic results toward the toxicologic end points and provided a rapid method to identify genes, pathways, and networks that could be responsible for the observed toxic effects. Conazoles are designed to inhibit CYP51; this is a central step in the biosynthesis of ergosterol in fungal systems and of ergosterol, cholesterol, vitamin D, and the sex steroids in mammalian systems. Conazoles are pleiotropic: they can both induce and inhibit mammalian CYPs, and these characteristics help to explain their varied toxic activities. We performed both dose-response and time-course studies in mice to develop and characterize key events in the MOA that can describe the propiconazole-induced carcinogenic process. These studies provided data on the following series of key events in the carcinogenic MOA of propiconazole: nuclear-receptor activation, CYP induction, decreases in hepatic retinoic acid levels, increased oxidative stress, decreases in serum cholesterol levels, increases in mevalonic acid levels, increased cell proliferation, decreased apoptosis, and induction of in vivo mutagenicity. Those key events have been synthesized into an MOA that describes the carcinogenic process induced by propiconazole in mouse liver.

Disclaimer: This abstract does not represent U.S. Environmental Protection Agency policy.

Use of toxicogenomic data at the U.S. Environmental Protection Agency to inform the cancer assessment of the fungicide propiconazole

N. McCarroll,1 E. Reaves,1 M. Manibusan,1 D.C. Wolf,2 S.D. Hester,2 J. Allen,2 S.-F. Thai,2 J. Ross,2 Y. Ge,2 W. Winnik,2 L. King,2 C. Blackman,2 W.O. Ward,2 and S. Nesnow2

1U.S. Environmental Protection Agency, Office of Pesticide Programs, Washington, DC; 2U.S. Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Office of Research and Development, Research Triangle Park, NC

The U.S. Environmental Protection Agency (EPA) Office of Pesticide Programs (OPP) routinely uses mode-of-action (MOA) data, when they are available, for pesticide cancer risk assessment. An MOA analysis incorporates data from required toxicology studies and supplemental mechanistic data. These data are evaluated to identify a set of key events, the quantifiable and critical steps in the pathway to tumor development. EPA has considered genomic data as part of the weight of evidence (WOE) in support of an MOA. However, to expand this effort, standard approaches are being developed to incorporate toxicogenomic data and data from other new technologies into the risk-assessment process. Conazoles are antifungal pesticides used for the protection of fruit, vegetable, and cereal crops and as pharmaceuticals for the treatment of fungal infections. Antifungal activity is exerted through inhibition of a specific cytochrome, CYP51, a critical step in the biosynthesis of ergosterol, a steroid required for formation of fungal cell walls. Many conazoles induce hepatotoxicity and liver tumors. A toxicogenomic dataset has been developed for the mouse liver tumorigen propiconazole. The objective of this study was to determine how toxicogenomic data could inform MOA analysis and the interpretation of human relevance. Toxicogenomic data, supplemental tissue-response information, molecular and biochemical studies, and traditional registration studies were used to determine the value of applying genomic data to the MOA analysis. Postulated key events based on genomic and experimental studies include nuclear-receptor activation, CYP induction, cholesterol inhibition, oxidative stress, altered retinoic acid and mevalonic acid levels, and in vivo mutagenicity. Those key events were organized into a hypothesized MOA that explains the tumorigenic response to propiconazole. The EPA cancer risk-assessment guidance was used to integrate genomic data into the risk assessment. This study shows how toxicogenomic data can inform our understanding of cancer and increase the efficiency and accuracy of a risk assessment.

Disclaimer: The views expressed in this abstract do not necessarily reflect those of the U.S. Environmental Protection Agency.

Prediction of in vivo dose-response relationship from in vitro concentration-response relationship using cellular-level PBPK modeling

Thomas Peyret and Kannan Krishnan

Département de Santé Environnementale et Santé au Travail, Université de Montréal, Montréal, Québec, Canada

There is still a lack of appropriate tools to extrapolate the results of in vitro toxicity tests to in vivo conditions, particularly for risk-assessment purposes. In this study, multicompartmental models describing the in vitro and in vivo systems were developed and evaluated by using toluene as the model substrate. The in vivo and in vitro models consist of four components (cell and interstitial space for the tissue, and plasma and erythrocytes for the vascular components) and two components (cell and culture medium), respectively. Cell:culture medium (EMEM), cell:blood, interstitial:blood, and tissue:blood partition coefficients (PCs) used in the in vitro or in vivo models were derived from medium (EMEM, cell, interstitial space, plasma, and erythrocyte):water PCs. The medium:water PCs in turn were calculated on the basis of the fractional content and extent of toluene uptake into the neutral lipid, phospholipid, water, and protein components of the media. The free concentration of toluene in neutral lipids, neutral phospholipids, and water was calculated on the basis of its solubility (oil:water PC). The toluene binding to hemoglobin was calculated from the free concentration in the microenvironment and the hemoglobin:microenvironment PC (derived from blood:air data). The in vitro toluene concentration in the neuroblastoma culture system was calculated by multiplying the EMEM concentration (McDermott et al. 2007. Toxicol. In Vitro 21:116-124) by the cell:EMEM PC. The in vivo brain-cell concentration of toluene was calculated from the cell:blood PC and the concentration in blood in the human PBPK model adopted from Tardif et al. (1997. Toxicol. Appl. Pharmacol. 144:120-134). The PBPK model was used to extrapolate the toluene concentration-response relationship from an in vitro SH-SY5Y cell-viability study to in vivo conditions. The EMEM:air PC predicted by the in vitro model was 78% of the value measured by McDermott et al. (2007). The human exposure concentrations of 132, 231, 350, 647, and 1,011 ppm provided the same brain-cell concentrations at steady state as the in vitro concentrations of 5.64 µM (no-observed-adverse-effect level), 13.1 µM (lowest-observed-adverse-effect level), 22.5 µM, 46.6 µM, and 76.3 µM.

This work was supported by a research grant from AFSSET (L'Agence française de sécurité sanitaire de l'environnement et du travail).
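
A back-of-the-envelope illustration of the extrapolation chain in the abstract, not the authors' calibrated toluene model: build a medium:water partition coefficient from an assumed phase composition, convert a nominal in vitro concentration to a cell concentration, and map it to an inhaled concentration through an assumed steady-state proportionality. The fractions, partition coefficients, and the linear air-to-cell relation are all placeholders.

```python
def medium_water_pc(f_lipid, f_protein, f_water, p_oil_water, p_protein_water=1.0):
    # composition-based partitioning: fraction-weighted sum of phase affinities vs. water
    return f_lipid * p_oil_water + f_protein * p_protein_water + f_water * 1.0

p_cell_water = medium_water_pc(0.05, 0.15, 0.80, p_oil_water=150.0)    # cell composition (assumed)
p_emem_water = medium_water_pc(0.001, 0.01, 0.989, p_oil_water=150.0)  # culture medium (assumed)
p_cell_emem  = p_cell_water / p_emem_water

c_medium_uM = 10.0                                  # nominal in vitro concentration (assumed)
c_cell_uM   = c_medium_uM * p_cell_emem             # concentration reached in the cells

# assumed steady-state PBPK summary: brain-cell uM per ppm of inhaled vapor
uM_per_ppm = 0.04
equivalent_air_ppm = c_cell_uM / uM_per_ppm
print(f"cell:EMEM PC ~ {p_cell_emem:.1f}")
print(f"{c_medium_uM} uM in vitro ~ {c_cell_uM:.1f} uM in cells ~ {equivalent_air_ppm:.0f} ppm inhaled (illustrative)")
```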

Dissecting enzyme regulation by multiple allosteric effectors using the random sampling-high dimensional model representation algorithm

Joshua Rabinowitz,1,2 Jennifer Hsiao,1 Kimberly Gryncel,3 Evan Kantrowitz,3 Xiaojiang Feng,1 Genyuan Li,1 and Herschel Rabitz1

1Princeton University, Department of Chemistry, Princeton, NJ; 2Princeton University, Lewis-Sigler Institute for Integrative Genomics, Princeton, NJ; 3Boston College, Department of Chemistry, Merkert Chemistry Center, Chestnut Hill, MA

The random sampling-high dimensional model representation (RS-HDMR) algorithm serves to extract complex relationships within multivariable systems. It is a completely data-driven algorithm that can reveal linear, nonlinear, independent, and cooperative relationships from random sampling of the target variables with favorable scalability. RS-HDMR has been applied to a variety of systems in chemistry, physics, biology, engineering, and the environmental sciences. In this study, it is used to dissect the combinatorial allosteric regulation of the enzyme aspartate transcarbamoylase (ATCase, EC 2.1.3.2) of Escherichia coli. ATCase catalyzes the committed step of pyrimidine biosynthesis and is allosterically regulated by all four ribonucleoside triphosphates (NTPs) in a nonlinear manner. In this work, ATCase activity was measured in vitro at 300 random NTP concentration combinations, each involving (consistent with in vivo conditions) all four NTPs being present. These data were then used to derive an RS-HDMR model of ATCase activity over the full four-dimensional NTP space. The model accounted for 90% of the variance in the experimental data. Its main elements were positive ATCase regulation by adenosine triphosphate (ATP) and negative regulation by cytidine triphosphate (CTP), with the negative effects of CTP dominating the positive ones of ATP when both regulators were abundant (a negative cooperative effect of ATP × CTP). Strong sensitivity to both ATP and CTP concentrations occurred in their physiological concentration ranges. Uridine 5'-triphosphate (UTP) had only a slight effect, and guanosine triphosphate (GTP) had almost none. These findings support a predominant role of CTP and ATP in ATCase regulation. The general approach provides a new paradigm for dissecting multifactorial regulation of biological molecules and processes.
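
A small, self-contained sketch of the RS-HDMR idea rather than the authors' implementation: from random samples of a multivariate input, estimate the zeroth- and first-order component functions by conditional averaging (simple binning here) and ask how much output variance the additive approximation captures. The "activity" function is a toy stand-in for measured ATCase activity, and the binning scheme is an assumption.

```python
import numpy as np

rng = np.random.default_rng(3)
names = ["ATP", "CTP", "UTP", "GTP"]
X = rng.uniform(0.0, 1.0, size=(300, 4))          # 300 random NTP combinations (scaled)

def activity(x):                                   # toy response: +ATP, -CTP, ATPxCTP coupling
    atp, ctp, utp, gtp = x.T
    return 1.0 + 0.8 * atp - 0.9 * ctp - 0.5 * atp * ctp + 0.05 * utp

y  = activity(X) + rng.normal(0.0, 0.05, size=len(X))
f0 = y.mean()                                      # zeroth-order HDMR term

n_bins = 10
edges = np.linspace(0.0, 1.0, n_bins + 1)
first_order = []                                   # f_i(x_i) on a grid, one per input
for i in range(4):
    idx = np.clip(np.digitize(X[:, i], edges) - 1, 0, n_bins - 1)
    fi = np.array([y[idx == b].mean() for b in range(n_bins)]) - f0
    first_order.append(fi)
    print(f"{names[i]}: first-order variance contribution {np.nanvar(fi):.4f}")

# variance explained by the first-order (additive) approximation
y_hat = f0 + sum(first_order[i][np.clip(np.digitize(X[:, i], edges) - 1, 0, n_bins - 1)]
                 for i in range(4))
print(f"additive RS-HDMR approximation R^2 ~ {1 - np.var(y - y_hat) / np.var(y):.2f}")
```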

Parallel analysis of activation of the cellular stress-response system for rapid evaluation of environmental toxicants

Steven Simmons,1 David Reif,2 and Ram Ramabhadran1

1U.S. Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC; 2U.S. Environmental Protection Agency, National Center for Computational Toxicology, Research Triangle Park, NC

Determining the toxic potential and mode of action (MOA) of environmental chemicals is a cost- and labor-intensive endeavor that has traditionally involved the use of laboratory animals. These considerations have necessitated the development of efficient in vitro, cell-based approaches that can permit the rapid analysis of chemical toxicities in a high-throughput mode as envisioned in the 2007 National Research Council report Toxicity Testing in the 21st Century: A Vision and a Strategy. Although assay technologies developed in the drug-discovery field provide the appropriate tools for this approach, a strategy for toxicity evaluation based on screening against specific cellular targets is impractical because of the large numbers of such putative targets and the large numbers of chemicals (and other products, such as nanomaterials) that require screening. To address those issues, we are developing a small ensemble (fewer than 10) of rapid and inexpensive reporter-gene assays based on the well-characterized cellular stress-response pathways. These adaptive pathways are activated in a coordinated fashion on exposure to environmental insults in an attempt by the cell to maintain or re-establish homeostasis. Although the pathways are activated at very low doses of toxicants, they also trigger terminal events, such as apoptosis and other adverse effects, when the cell is damaged irreversibly. Those characteristics make them ideal sentinels for rapid and sensitive in vitro screening of toxicants that is amenable to the high-throughput modality.

We are in the process of characterizing the integrated response of this stress-assay ensemble to chemical toxicants to validate its use as a means of grouping chemicals that produce similar biological responses and also to infer their MOA. Luciferase-based reporter-gene assays consisting of promoters derived from the stress-pathway target genes or artificial promoters based on specific stress-response elements have been incorporated into lentiviral vectors that allow their rapid and stable delivery to a wide variety of cell types, both primary and established. Using this approach, we have determined the stress signatures of a set of toxic and nontoxic metals and those of other compounds and developed an approach for graphic representation of the signatures to facilitate visual comparison. We anticipate that this approach will be useful for determining the MOA of environmental toxicants.

Disclaimer: This is an abstract of a proposed presentation and does not necessarily reflect U.S. Environmental Protection Agency policy.

The application of cellular systems biology to create a “safety risk index” for potential drug-induced adverse events

D. Lansing Taylor

Cellumen, Inc., Pittsburgh, PA

The purpose of a cellular systems biology (CSB™) approach to safety profiling is to address the complexity of “systems” responses to perturbations that involve multiple cellular pathways, organelles, and mechanisms of action. Not only does this approach begin to define the biology of potential adverse events, but it enables improved prediction of subsequent effects in vivo. A systems approach also facilitates early target selection and optimization of chemistry around both safety and efficacy end points by using very small amounts of precious drug substance early in candidate selection. The handoff from discovery to development is consequently more effective and, not surprisingly, results in overall reduced attrition. The CellCiphr® approach involves (1) the use of the relevant cells representing the major rodent and human organs, (2) panels of organ-specific functional biomarkers multiplexed by using fluorescence detection with high-content screening readers, (3) a growing database of reference drugs on which there are both safety data and CellCiphr® profile data, and (4) the safety risk index generated with classifier software. Profiles are generally performed at three time points to establish 10-point dose-response curves using over 10 cellular functional biomarkers in a 384-well format. Currently, there are separate panels of human HepG2 cells, primary rat hepatocytes, and rat cardiomyocytes. A human primary hepatocyte panel is being completed. In addition, a variety of both human and rodent stem-cell-derived cell systems are being evaluated for the major organs, including liver, heart, kidney, brain, immune system, and gastrointestinal tract, in 2-D and 3-D architectures.

Early studies based on over 230 compounds in the database have demonstrated, using standard receiver operating characteristic (ROC) curve analyses, that a CellCiphr profile shows predictive power not evident in simpler cell-based assays. The database has now grown to over 500 compounds. The safety risk index was used retrospectively to set priorities among lead series, such as the “glitazones,” which demonstrated the value of CellCiphr for priority-setting. The ability to define mechanisms of action has also been demonstrated in profiling a group of nonsteroidal anti-inflammatory drugs. When the biologically rich data are used, key signaling pathways are flagged that define the mechanisms of action responsible for potential toxic liabilities and adverse events. Finally, it has been demonstrated that CellCiphr can flag as “high risk” those drugs that have been withdrawn or marked with a “black box” warning.

Using the CellCiphr approach will result in improved early safety profiling, improved and powerful human predictivity, mode-of-action and biomarker identification, and, most important, overall reduction of safety-related attrition.

Metabolomics in risk assessment

Suryanarayana V. Vulimiri

U.S. Environmental Protection Agency, National Center for Environmental Assessment, Office of Research and Development, Washington, DC

Exposure to a toxic chemical is often reflected in the perturbation of several cellular events in which biochemicals of a given metabolic pathway are upregulated, downregulated, or unaffected. Cellular responses exhibited by both experimental animals and humans to toxic exposure are complex, but they undergo similar types of metabolic change that lead to adverse outcomes. Metabolomics is an emerging technology that uses high-throughput methods to simultaneously identify, quantify, and characterize low-molecular-weight (<1800 Da) biochemicals from numerous metabolic pathways. Several modes of action (MOAs) of toxic chemicals, such as oxidative stress, inflammation, cell proliferation, and cell damage or cytotoxicity, can be studied with the metabolomic approach. For example, oxidative stress, a common cellular perturbation caused by exposure to environmental insults, can be measured by changes in the ratio of reduced glutathione (GSH) to oxidized glutathione (GSSG) in the GSH biosynthesis pathway. Other MOAs, such as inflammation, cell proliferation, and cytotoxicity, are identified by measuring the levels of arachidonate, ornithine, and o-phosphoethanolamine, respectively. The MOAs can be identified by the biochemical profile of the toxic response in vitro and in vivo. In population-based studies, the metabolomic approach can also be used to analyze putative biomarkers in body fluids, such as blood, plasma, serum, and urine. A comprehensive, mechanism-based interpretation of information obtained through the metabolomic approach can be used to improve understanding and to advance the MOA-based assessment of environmental risks to human health. Information from the metabolomic approach can also be combined with genomic information in molecular characterization of the hazards that cause perturbations in different toxicity pathways. Overall, the metabolomic approach is used as a noninvasive method to identify metabolic profiles and putative biomarkers as early predictors of toxicity and disease.

Disclaimer: The views expressed are those of the author and do not necessarily reflect the U.S. Environmental Protection Agency's opinion or policy.
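As a purely illustrative sketch of the GSH/GSSG readout mentioned above, the calculation might look like the following; the concentrations and the decision threshold are hypothetical, not reference values.

    # Illustrative only: flagging a possible oxidative-stress MOA from the
    # ratio of reduced to oxidized glutathione. Concentrations and the
    # threshold are hypothetical, not reference values.

    def gsh_gssg_ratio(gsh_um: float, gssg_um: float) -> float:
        """Ratio of reduced (GSH) to oxidized (GSSG) glutathione, both in micromolar."""
        return gsh_um / gssg_um

    control = gsh_gssg_ratio(gsh_um=850.0, gssg_um=8.5)    # vehicle control, roughly 100:1
    treated = gsh_gssg_ratio(gsh_um=400.0, gssg_um=16.0)   # treated sample, roughly 25:1

    # A marked drop in the ratio relative to control is consistent with oxidative stress.
    fold_of_control = treated / control
    if fold_of_control < 0.5:
        print(f"GSH/GSSG fell to {fold_of_control:.0%} of control: possible oxidative stress")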

Formaldehyde and leukemia: epidemiology, potential mechanisms, and implications for risk assessment

Luoping Zhang,1 Laura Beane-Freeman,2 Jun Nakamura,3 Stephen S. Hecht,4 John Vandenberg,5 Martyn T. Smith,1 and Babasaheb R. Sonawane6

1University of California, Berkeley, School of Public Health, Berkeley, CA; 2National Institutes of Health, National Cancer Institute, Division of Cancer Epidemiology and Genetics, Bethesda, MD; 3University of North Carolina at Chapel Hill, Department of Environmental Sciences and Engineering, Chapel Hill, NC; 4University of Minnesota, Masonic Cancer Center, Minneapolis, MN; 5U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment, Research Triangle Park, NC; 6U.S. Environmental Protection Agency, Washington, DC

Formaldehyde is widely used in commerce. Although it is regulated in many countries, occupational and environmental exposure to formaldehyde can remain quite high. Formaldehyde is an established human carcinogen (nasopharyngeal cancer) and may be associated with an increased risk of leukemia in exposed individuals. However, risk assessment of formaldehyde and leukemia has been challenging because of inconsistencies in human and animal studies and the lack of a known mechanism for leukemia induction. As part of the 39th Annual Environmental Mutagen Society Meeting in October 2008, a special symposium was held at which an up-to-date review of the epidemiology and potential mechanisms of formaldehyde and leukemia and their implications for risk assessment was presented. Updated results of two of the three largest industrial cohort studies of formaldehyde-exposed workers have shown positive associations with leukemia, particularly myeloid leukemia. A more recent meta-analysis of studies also supports the association. Recent mechanistic studies show the formation of formaldehyde-DNA adducts after formaldehyde exposure and the need for specific DNA-repair (Fanconi anemia-BRCA) pathways to protect against formaldehyde toxicity. Together, the updated findings suggest the need for future studies that more effectively assess the risk of leukemia arising from formaldehyde exposure.

It was recognized that increased communication among scientists who practice epidemiology, toxicology, biology, and risk assessment could enhance the design of such studies, and specific recommendations that could lead to greater understanding of the issue were made. A toxicogenomic approach in experimental models and human exposure studies, with the measurement of biomarkers of internal exposure, such as formaldehyde-DNA and protein adducts, should prove fruitful. Currently, only limited tools are available to measure such adducts specifically in blood, bone marrow, and other target tissues, so their development is urgently needed. Adducts and relevant DNA-repair pathways should be examined in the bone marrow of mice treated with formaldehyde-generating chemicals and in human progenitor cells exposed to formaldehyde in vitro. Use of transgenic mice deficient in DNA-repair genes could further facilitate the exploration of this area. In conclusion, there are many opportunities to incorporate knowledge from multiple disciplines that could be used to reduce uncertainty in the risk assessment of formaldehyde and leukemia.

Disclaimer: The views expressed are those of the authors and do not necessarily reflect the views, policies, or endorsement of their affiliated institutions, the National Cancer Institute, or the U.S. Environmental Protection Agency.
