1 Introduction

Scientists, regulators, and the public all desire efficient and accurate approaches to assess the toxicologic effects of chemical, physical, and biologic agents on living systems. Yet, no single approach exists to analyze toxicologic responses, a difficult task given the complexity of human and animal physiology and individual variations. The genomic knowledge and new technologies that have emerged in the post-genomic era promise to inform the understanding of many risks as well as enlighten current approaches and lead to novel predictive approaches for studying disease risk. As biologic knowledge progresses with the science of toxicology, "toxicogenomics" (see Box 1-1 for definition) has the potential to improve risk assessment and hazard screening.

BACKGROUND

Approaches to Assessing Toxicity

Detection of toxicity requires a means to observe (or measure) specific effects of exposures. Toxicology traditionally has focused on phenotypic changes in an organism that result from exposure to chemical, physical, or biologic agents. Such changes range from reversible effects, such as transient skin reactions, to chronic diseases, such as cancer, to the extreme end point of death. Typical whole-animal toxicology studies may range from single-dose acute to chronic lifetime exposures, and (after assessment of absorption, distribution, metabolism, and excretion properties) they include assessments of end points such as clinical signs of toxicity, body and organ weight changes, clinical chemistry, and histopathologic responses.

BOX 1-1 Toxicogenomics Definition

Toxicogenomics: In this report, toxicogenomics is defined as the application of genomic technologies (for example, genetics, genome sequence analysis, gene expression profiling, proteomics, metabolomics, and related approaches) to study the adverse effects of environmental and pharmaceutical chemicals on human health and the environment. Toxicogenomics combines toxicology with information-dense1 genomic technologies to integrate toxicant-specific alterations in gene, protein, and metabolite expression patterns with phenotypic2 responses of cells, tissues, and organisms. Toxicogenomics can provide insight into gene-environment interactions and the response of biologic pathways and networks to perturbations. Toxicogenomics may lead to information that is more discriminating, predictive, and sensitive than that currently used to evaluate toxic exposure or predict effects on human health.

Toxicology studies generally use multiple doses that span the expected range from where no effects would be observed to where clinical or histopathologic changes would be evident. The highest dose at which no overt toxicity occurs in a 90-day study (the maximum tolerated dose) is generally used to establish animal dosing levels for chronic assays that provide insight into potential latent effects, including cancer, reproductive or developmental toxicity, and immunotoxicity. These studies constitute the mainstays of toxicologic practice.3

In addition to animal studies, efforts to identify and understand the effects of environmental chemicals, drugs, and other agents on human populations have used epidemiologic studies to examine the relationship between a dose and the response to exposures. In contrast to animal studies, in which exposures are experimentally controlled, epidemiologic studies describe exposure with an estimate of error, and they assess the relationship between exposure and disease distribution in human populations. These studies operate under the assumption that many years of chemical exposure or the simple passage of time may be required before disease expression can be detected.

1 Toxicogenomic approaches are often referred to as "high-throughput," a term that can refer to the density of information or the ability to analyze many subjects (or compounds) in a short time. Although the toxicogenomic techniques described here create information that is highly dense, the techniques do not all offer the ability to analyze many subjects or compounds at one time. Therefore, the term high-throughput is not used except in reference to gene sequencing technologies.

2 Relating to "the observable properties of an organism that are produced by the interaction of the genotype and the environment" (Merriam-Webster's Online Dictionary, 10th Edition).

3 For more information, see the National Research Council report Toxicity Testing for Assessment of Environmental Agents: Interim Report (NRC 2006a).

As medical science has progressed, so have the tools used to assess animal toxicity. For example, more sensitive diagnostic and monitoring tools have been used to assess organ function, including tools to detect altered heart rhythms, brain activity, and changes in hormone levels as well as to analyze changes visible by electron microscopy. Most notable, however, are the contributions of chemistry, cell and molecular biology, and genetics in detecting adverse effects and identifying cellular and molecular targets of toxicants. It is now possible to observe potential adverse effects on molecules, subcellular structures, and organelles before they manifest at the organismal level. This ability has enhanced etiologic understanding of toxicity and made it possible to assess the relevance of molecular changes to toxicity.

These molecular and cellular changes have been assessed in studies of animals but have also been applied to study human populations ("molecular epidemiology") with some success. For example, our understanding of gene-environment interactions has benefited greatly from studies of lung, head, and neck cancer among tobacco users that examined differences in genes (polymorphisms) related to carcinogen metabolism and DNA repair. Similarly, studies of UV sunlight exposure and human differences in DNA repair genes have clarified gene-environment interactions in skin cancer risk. Current technology now enables the roles of multiple genes in cell signaling pathways to be examined in human population studies aimed at assessing the interplay between environmental exposures and cancer risk.

Although current practice in toxicology continues to strongly emphasize changes observable at the level of the whole organism as well as at the level of the organ, the use of cellular and molecular end points sets the stage for applying toxicogenomic technologies to a more robust examination of how complex molecular and cellular systems contribute to the expression of toxicity.

Predictive Toxicology

Predictive toxicology describes the study of how toxic effects observed in humans or model systems can be used to predict pathogenesis, assess risk, and prevent human disease. Predictive toxicology includes, but is not limited to, risk assessment, the practical facilitation of decision making with scientific information. Many of the concepts described in this report relate to approaches to risk assessment; key risk assessment concepts are reviewed in Appendix C. Typical information gaps and inconsistencies that limit conventional risk assessment are listed in Box 1-2. These gaps and inconsistencies present opportunities for toxicogenomics to provide useful information.

Although toxicogenomics includes effects on wildlife and other environmental effects, this report is limited to a discussion of toxicogenomics as it applies to the study of human health.

BOX 1-2 Typical Information Gaps and Inconsistencies That Limit Conventional Risk Assessment (Modified from NRC 2005a)

• Lack of sufficient screening data: basic short-term in vitro or animal-bioassay data on the toxicity or carcinogenicity of the compound.
• Lack of information, or inconsistent information, about effects on humans (epidemiologic studies).
• Paucity of accurate information on human exposure levels.
• Relevance of animal data to humans, quantitative or qualitative.
• Paucity of information on the relationship between dose and response, especially at low doses relevant to human environmental exposures.
• Inconsistent animal-bioassay data on different species: differential responses in varied animal test models.
• Paucity of information or inconsistencies in data on different exposures, particularly exposure during development and early-life exposures or exposure by varied routes (inhalation, diet, drinking water).
• Lack of data on the impacts of coexposures to other chemicals: current risk assessment practices are uncertain about "how to add" coexposures to the many and varied chemicals present in real-world environments.
• Paucity of data on the impact of human variability on susceptibility, including age, gender, race, disease, and other confounders.

Overview of Toxicogenomic Technologies

Toxicogenomic technologies comprise several different technology platforms for the analysis of genomes, transcripts, proteins, and metabolites. These technologies are described briefly here and in more detail in Chapter 2. It is important to recognize two additional issues associated with the use of toxicogenomic technologies. First, a single experiment can generate far more information, and more comprehensive information, than traditional experiments do. Second, advances in computing power and techniques enable these large amounts of information to be synthesized from different sources and experiments and to be analyzed in novel ways.

Genomic technologies encompass both genome sequencing technologies, which derive DNA sequences from genes and other regions of DNA, and genotype analysis, which detects sequence variations between individuals in individual genes. Whereas the sequencing of genomes was once an extraordinary undertaking, the rapid evolution of sequencing technology has dramatically increased throughput and decreased cost, now outperforming the benchmark technology standard used for the Human Genome Project. The convergence of genome sequencing and genotyping technologies will eventually enable whole-genome sequences of individuals to be analyzed. Advances in genotyping technologies (see Chapter 2) allow the simultaneous assessment of multiple variants across the whole genome in large populations rather than just single or several gene polymorphisms.

Transcriptomic technologies (or gene expression profiling) measure mRNA expression in a highly parallel assay system, usually using microarrays. As the first widely available method for global analysis of gene expression, DNA microarrays are the emblematic technology of the post-genomic era. Microarray technology for transcriptomics has enabled the analysis of complex, multigene systems and their responses to environmental perturbations.

Proteomics is the study of collections of proteins in living systems. Because the same proteins may exist in multiple modified and variant forms, proteomes are more complex than the genomes and transcriptomes that encode them. Proteomic technologies use mass spectrometry (MS) and microarray technologies to resolve and identify the components of complex protein mixtures, to identify and map protein modifications, to characterize protein functional associations, and to compare proteomic changes quantitatively in different biologic states.

Metabolomics is the study of the small-molecule components of biologic systems, which are the products of metabolic processes. Because metabolites reflect the activities of RNAs, proteins, and the genes that encode them, metabolomics allows for functional assessment of diseases and of drug and chemical toxicity. Metabolomic technologies, employing nuclear magnetic resonance spectroscopy and MS, are directed at simultaneously measuring dozens to thousands of compounds in biofluids (for example, urine) or in cell and tissue extracts. A key strength of metabolomic approaches is that they can noninvasively and repeatedly measure changes in living tissues and living animals and that they measure changes in actual metabolic flow. As with proteomics, the major limitation of metabolomics is the difficulty of comprehensively measuring diverse metabolites in complex biologic systems.

Bioinformatics is a branch of computational biology focused on applying advanced computational techniques to the collection, management, and analysis of numerical biologic data. Elements of bioinformatics are essential to the practice of all genomic technologies. Bioinformatics also encompasses the integration of data across genomic technologies, the integration of genomic data with data from other observations and measurements, and the integration of all these data in databases and related information resources. It is helpful to think of bioinformatics not as a separate discipline but as the universal means of analyzing and integrating information in biology.
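To ground these descriptions, the sketch below shows the kind of computation a transcriptomic experiment feeds into bioinformatic analysis: screening an expression matrix for differentially expressed genes. It is a minimal illustration on synthetic data; the gene names are hypothetical placeholders, and the thresholds (at least 2-fold modulation, p < 0.05 by a two-tailed unpaired t test) simply mirror the conventions cited later in Box 1-4. A real analysis would add normalization and multiple-testing correction.

```python
# Minimal sketch: screen a small expression matrix (genes x arrays)
# for differentially expressed genes. Synthetic data; gene names are
# hypothetical placeholders, not probes from any real platform.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

genes = ["gene_a", "gene_b", "gene_c"]        # hypothetical probe names
control = rng.normal(100, 10, size=(3, 5))    # 3 genes x 5 control arrays
treated = rng.normal(100, 10, size=(3, 5))    # 3 genes x 5 treated arrays
treated[1] *= 2.5                             # simulate induction of gene_b

for name, c, t in zip(genes, control, treated):
    fold = t.mean() / c.mean()                # mean treated / mean control
    _, p = stats.ttest_ind(t, c)              # two-tailed, unpaired t test
    if (fold >= 2.0 or fold <= 0.5) and p < 0.05:
        print(f"{name}: fold change {fold:.2f}, p = {p:.3g}")
```

In practice a platform reports thousands of probe sets rather than three genes, and the filtered list feeds downstream steps such as clustering, pathway mapping, or the classification strategy sketched after Box 1-4.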

Policy Context

Regulatory agencies with the biggest stake in predictive toxicology include the U.S. Environmental Protection Agency (EPA), the Occupational Safety and Health Administration (OSHA), and the Food and Drug Administration (FDA). The EPA and OSHA are concerned with potentially toxic exposures in the community and in the workplace. The mission of the EPA is to protect human health and the environment and to safeguard the nation's air, water, and land. OSHA's mission is to ensure the safety and health of America's workers by setting and enforcing standards; providing training, outreach, and education; establishing partnerships; and encouraging continual improvement in workplace safety and health. The FDA is responsible for protecting the public health by ensuring the safety, efficacy, and security of human and veterinary drugs, biologic products, medical devices, the nation's food supply, cosmetics, and products that emit radiation. The FDA is also responsible for advancing public health by facilitating innovations that make medicines and foods more effective, safer, and more affordable. Finally, the FDA is responsible for helping the public receive the accurate, science-based information it needs to use these regulated products to improve health.

Working in parallel with these agencies and providing additional scientific underpinning to regulatory agency efforts is the Department of Health and Human Services (DHHS) National Institutes of Health (NIH). The NIH (2007) mission is "science in pursuit of fundamental knowledge about the nature and behavior of living systems and the application of that knowledge to extend healthy life and reduce the burdens of illness and disability," including research on "causes, diagnosis, prevention, and cure of human disease." The NIH National Institute of Environmental Health Sciences (NIEHS) strives to use environmental sciences to understand human disease and improve human health, including "how environmental exposures fundamentally alter human biology" and why some people develop disease in response to toxicant exposure and others do not.4

4 See http://www.niehs.nih.gov/od/fromdir.htm (accessed April 2, 2007).

In sum, NIH, regulatory agencies, the chemical and pharmaceutical industries, health professionals, attorneys, the media, and the general public are all interested in knowing how new genomic technologies developed in the aftermath of the Human Genome Project can improve our understanding of toxicity and ultimately protect public health and the environment. Although the FDA and the EPA have developed planning documents on toxicogenomic policies (see Chapter 9), specific policies have not yet emerged, and it is clear that stakeholders are grappling with similar questions:

• Where is the science of toxicogenomics going?
• What are the potential benefits and drawbacks of using toxicogenomic information for regulatory agencies, industry, and the public?
• What are the challenges in implementing toxicogenomic technologies, collecting and using the data, and communicating the results?
• Can genomic technologies predict health effects?
• How will government agencies, industry, academics, and others know when a particular technology is ready to be used for regulatory purposes?
• Will regulatory requirements have to be changed to reap the benefits of the new technologies to protect public health and the environment?

COMMITTEE CHARGE AND RESPONSE

In April 2004, NIEHS asked the National Academies to direct its investigative arm, the National Research Council (NRC), to examine the impact of toxicogenomic technologies on predictive toxicology (see Box 1-3 for the complete statement of task). In response, the NRC formed the Committee on Applications of Toxicogenomic Technologies to Predictive Toxicology, a panel of 16 members that included experts in toxicology, molecular and cellular biology, epidemiology, law and ethics, bioinformatics (including database development and maintenance), statistics, public health, risk communication, and risk assessment (see Appendix A for committee details). The committee held two public meetings in Washington, DC, to collect information, meet with researchers and decision makers, and accept testimony from the public. The committee met five additional times, in executive session, to deliberate on findings and complete its report. The remaining chapters of this report constitute the findings of the NRC committee.

The committee approached its charge by focusing on potential uses of toxicogenomics often discussed in the broad toxicology community, emphasizing human health issues rather than environmental impact. The committee determined that the applications described in Chapters 4-8 capture much of the often-cited potential value of toxicogenomics. After identifying potential toxicogenomic applications, the committee searched the scientific literature for case examples that demonstrate useful implementations of toxicogenomic technologies in these areas.

This report is not intended to be a compendium of all studies that used toxicogenomic technologies and does not attempt to highlight the full range of papers published. Peer-reviewed and published papers in the public domain were selected to illustrate applications the committee identified as worthy of consideration. For example, Box 1-4 contains brief summaries of selected studies in which toxicogenomic technologies have shown promise in predictive toxicology. New studies using toxicogenomic technologies are published almost daily. Likewise, approaches to analyzing and interpreting data are rapidly evolving, resulting in changes in attitudes toward various approaches.5 Even while this report was being prepared, such changes were observed, and the committee has therefore attempted to provide a snapshot of this rapidly evolving field.

5 The value of platforms and software is also evolving rapidly, and any discussion of platforms or software should not be considered an endorsement by the committee.

This report is the product of the efforts of the entire NRC committee. The report underwent extensive, independent external review overseen by the NRC Report Review Committee. It specifically addresses, and is limited to, the statement of task as agreed upon by the NRC and the DHHS.

This report consists of chapters on existing or potential "applications" and chapters that deal with broader issues. The technologies encompassed in toxicogenomics are described in Chapter 2. This is followed by a discussion of experimental design and data analysis in Chapter 3. Chapter 4 discusses the applications of toxicogenomic technologies to assess exposure: "Can toxicogenomic technologies determine whether an individual has been exposed to a substance and, if so, to how much?" Chapter 5 asks "Can toxicogenomic data be used to detect potential toxicity of an unknown compound quickly, reliably, and at a reasonable cost?" Chapter 6 addresses the assessment of individual variability in humans. The question in this context is "Can toxicogenomic technologies detect variability in response to exposures and provide a means to explain variability between individuals?" Chapter 7 addresses the question "What can toxicogenomic technologies teach us about the mechanisms by which toxicants produce adverse effects in biologic systems?" Considerations for risk assessment not covered in these four application chapters are discussed in Chapter 8. Chapter 9 focuses on validation issues that are relevant to most of the applications. In Chapter 10, sample and data collection and analysis are discussed, as are database needs. The ethical, legal, and social issues raised by the use of toxicogenomics are considered in Chapter 11. Finally, Chapter 12 summarizes the recommendations from the other chapters and identifies several overarching recommendations.

BOX 1-3 Statement of Task

A committee of the NRC will examine the impact of "toxicogenomic" technologies on predictive toxicology. These approaches include studying gene and protein activity and other biologic processes to begin to characterize toxic substances and their potential risks. For the promise of these technologies to be realized, significant challenges must be recognized and addressed. This study will provide a broad overview for the public, senior government policy makers, and other interested and involved parties of the benefits potentially arising from these technologies, identify the challenges to achieving them, and suggest approaches and incentives that may be used to address the challenges.

Potential scientific benefits might include identifying susceptible populations and mechanisms of action and making better use of animal toxicity testing. Potential challenges might include scientific issues such as correlating gene expression with adverse effects; conflicting, nonexistent, or inadequate regulatory requirements; legal, social, and ethical issues; coordination between regulators and the regulated communities; organizational infrastructure for handling large volumes of data, new analytic tools, and innovative ways to synthesize and interpret results; communication with appropriate audiences about scientific and nonscientific information; and the need for scientific standards in conducting and interpreting toxicogenomic experiments.

This study will highlight major new or anticipated uses of these technologies and identify the challenges and possible solutions to implementing them to improve the protection of public health and the environment.

BOX 1-4 Selected Examples of the Use of Toxicogenomic Technologies in Predictive Toxicology

Predictive toxicology is predicated on the hypothesis that similar treatments leading to the same end point will share comparable changes in gene expression. Examples in which toxicogenomic technologies have shown such promise in predictive toxicology are presented below. Explanations of the technologies, methodologies, and concepts are presented in greater detail in later chapters of the report.

1. Steiner et al. (2004) evaluated different classes of toxicants by transcript profiling in male rats treated with various model compounds or the appropriate vehicle controls. The results of this study demonstrated the feasibility of compound classification based on gene expression profile data. Most of the compounds evaluated were either well-known hepatotoxicants or showed hepatotoxicity during preclinical testing. These compounds included acetaminophen, amiodarone, aflatoxin B1, carbon tetrachloride, coumarin, hydrazine, 1,2-dichlorobenzene, and 18 others. The aim was to determine whether biologic samples from rats treated with these various compounds could be classified on the basis of gene expression profiles. Hepatic gene expression profiles were analyzed using a supervised learning method (support vector machines [SVMs]) to generate classification rules, combined with recursive feature elimination to identify a compact set of probe sets with potential use as biomarkers (a minimal sketch of this SVM-based strategy follows Box 1-4). In all studies, more than 150 genes were expressed above background and showed at least a 2-fold modulation with a p value of <0.05 (two-tailed, unpaired t test).

The predictive models were able to discriminate between hepatotoxic and non-hepatotoxic compounds. Furthermore, they predicted the correct class of hepatotoxicant in most cases. As described in this report (Chapter 5), the predictive model produced virtually no false-positive outcomes but at the cost of some false-negative results: amiodarone, glibenclamide, and chlorpromazine were not recognized as toxic. This example is also important in that it shows that a predictive model based on transcript profiles from the Wistar rat strain can successfully classify profiles from another rat strain (Sprague-Dawley) for the peroxisome proliferator WY14643. In addition, the model identified nonresponding animals (those treated with a toxicant but not exhibiting conventional toxicologic effects).

2. Ruepp et al. (2002) performed both genomic and proteomic analyses of acetaminophen toxicity in mouse liver. Acetaminophen overdose causes severe centrilobular hepatic necrosis in humans and in experimental animals. In this case, the researchers explored the mechanism of toxicity by administering subtoxic and toxic doses to overnight-fasted mice. Animals were sacrificed at different time points from 15 minutes to 4 hours postinjection. Liver toxicity was assessed by plasma ALT activity (a liver enzyme marker) and by electron microscopy. Gene expression analysis was performed using RT-PCR, and proteomic analysis was performed on liver mitochondrial subfractions using a quantitative fluorescent 2D-DIGE method. The results showed acute induction of Kupffer cell-derived GM-CSF mRNA (GM-CSF is a granulocyte-specific gene) at both doses, and the chaperone proteins Hsp10 and Hsp60 decreased in mitochondria at both doses, most likely by leaking into the cytoplasm. All of these perturbations occurred before morphologic changes.

Other genomic studies of acetaminophen have shown that its hepatotoxicity can be reproduced. Liver diseases that induce nonuniform lesions often give rise to greatly varying histopathology results in needle biopsy samples from the same patient. Heinloth et al. (2007) examined whether gene expression analysis of such biopsies could overcome this limitation, using acetaminophen as a model hepatotoxicant that produces a multifocal pattern of necrosis after toxic doses. Rats were treated with a single toxic or subtoxic dose of acetaminophen and sacrificed 6, 24, or 48 hours after exposure. Left liver lobes were harvested, and both gene expression and histopathologic analyses were performed on the same biopsy-sized samples. While histopathologic evaluation of such small samples revealed significant sample-to-sample differences after toxic doses of acetaminophen, gene expression analysis provided a very homogeneous picture and allowed clear distinction between subtoxic and toxic doses. The results show that the use of genomic analysis of biopsy samples together with histopathologic analyses could provide a more precise representation of the overall condition of a patient's liver than histopathologic evaluation alone.

3. Haugen et al. (2004) examined arsenic-response networks in Saccharomyces cerevisiae by employing global gene expression and sensitivity phenotype data in a metabolic network composed of all known biochemical reactions in yeast, as well as the yeast network of 20,985 protein-protein and protein-DNA interactions. Arsenic is a nonmutagenic carcinogen to which millions of people are exposed. The phenotypic-profiling data were mapped onto the metabolic network. The two significant metabolic networks unveiled were shikimate biosynthesis and serine, threonine, and glutamate biosynthesis. Transcriptional profiling of specific deletion strains confirmed that several transcription factors strongly mediate the cell's adaptation to arsenic-induced stress. By integrating phenotypic and transcriptional profiling and mapping the data onto the metabolic and regulatory networks, the researchers demonstrated that arsenic is likely to channel sulfur into glutathione for detoxification, which leads to indirect oxidative stress by depleting glutathione pools and alters protein turnover via arsenation of sulfhydryl groups on proteins. As described by Haugen et al., "Our data show that many of the most sensitive genes . . . are involved in serine and threonine metabolism, glutamate, aspartate and arginine metabolism, or shikimate metabolism, which are pathways upstream of the differentially expressed sulfur, methionine and homocysteine metabolic pathways, respectively. These downstream pathways are important for the conversion to glutathione, necessary for the cell's defense from arsenic. . . . This overlap of sensitive upstream pathways and differentially expressed downstream pathways provides the link between transcriptional and phenotypic profiling data."
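To make the classification strategy of example 1 concrete, the following is a minimal sketch of a support vector machine with recursive feature elimination (SVM-RFE) of the general kind Steiner et al. describe. It uses scikit-learn on synthetic data; the numbers of profiles, genes, and selected features, and the class labels, are illustrative assumptions, not values from the study.

```python
# Minimal SVM-RFE sketch: classify expression profiles as hepatotoxic
# or not and select a compact gene signature. Synthetic data; all
# sizes and labels below are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)

n_profiles, n_genes = 40, 200
X = rng.normal(size=(n_profiles, n_genes))  # one expression profile per row
y = rng.integers(0, 2, size=n_profiles)     # 1 = hepatotoxicant, 0 = non-hepatotoxicant
X[y == 1, :10] += 1.5                       # make the first 10 genes informative

svm = SVC(kernel="linear")                  # linear kernel exposes per-gene weights
rfe = RFE(svm, n_features_to_select=10)     # recursively drop the weakest genes

scores = cross_val_score(rfe, X, y, cv=5)   # selection refit inside each fold
print("cross-validated accuracy: %.2f" % scores.mean())
print("selected genes:", np.where(rfe.fit(X, y).support_)[0])
```

A linear kernel is chosen because RFE ranks genes by the magnitude of the weight the SVM assigns to each; and because the elimination step is refit within each cross-validation fold, the accuracy estimate is not inflated by the gene selection itself, a safeguard related to the validation issues taken up in Chapter 9.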
