
Appendix D: Workshop Presentations
Pages 71-176



From page 71...
... The notion of quantum information can be abstracted, in much the same way as the notions of classical information. There are actually more things that can be done with information, if it is regarded in this quantum fashion.
From page 72...
... The next step beyond quantum cryptography, the one that made quantum information theory a byword, is the discovery of fast quantum algorithms for solving certain problems. For quantum computing, unlike simple cryptography, it is necessary to consider not only the preparation and measurement of quantum states, but also the interaction of quantum data along a stream.
From page 73...
... are most rapidly developed using quantum intermediate space in quantum computers. When I'm feeling especially healthy, I say that quantum computers will probably be practical within my lifetime. Strange phenomena involving quantum information are continually being discovered.
From page 74...
... Higher throughput is being achieved by shifting from batch processing to continuous feed-stream processing, but this practice necessitates a greater understanding of the reaction kinetics, and hence the mechanism, to optimize feed-stream rates. Simulation models are needed that have sufficient accuracy to predict what upstream changes in process variables are required to maintain the downstream products within specifications, as it may take several hours for upstream changes to affect downstream quality.
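To make the control problem concrete, here is a minimal sketch, with purely hypothetical rate and delay parameters, of why a downstream quality sensor lags upstream changes: steady-state first-order conversion plus a pure transport delay, the several-hour lag the passage describes.

```python
import math
from collections import deque

def downstream_response(c_feed, k=0.5, tau=2.0, dead_time_steps=30):
    """First-order conversion c_out = c_in * exp(-k*tau), seen downstream
    only after a pure transport delay of dead_time_steps samples."""
    pipe = deque([c_feed[0] * math.exp(-k * tau)] * dead_time_steps)
    seen = []
    for c_in in c_feed:
        pipe.append(c_in * math.exp(-k * tau))  # reactor outlet right now
        seen.append(pipe.popleft())             # sensor sees the oldest material
    return seen

# A step change in feed at t = 20 only reaches the sensor ~30 steps later.
feed = [1.0] * 20 + [2.0] * 80
print(downstream_response(feed)[45:55])
```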
From page 75...
... Wisdom: having sufficient understanding of the factors governing performance to reliably predict what will happen and to know what will work. To illustrate how scientific computing and knowledge management convert data and information into knowledge and wisdom, a real example is taken from lubricant chemistry. Polysulfides, R-S_n-R, are added to lubricants to prevent wear of ferrous metal components under high pressure.
From page 76...
... Access to data is critical for academics in the QSPR-QSAR method development community, but is problematic due to intellectual property issues in the commercial sector.4 Hence there is a need to advance the science and the IT systems in the public arena to develop the fundamental foundation and building blocks upon which public and proprietary institutions can develop their own knowledge management and predictive modeling systems. What is the current status of chemical and physical property data?
From page 77...
... Examples of high-priority challenges cited by industry in the WTEC report to be ultimately addressed by the Universal Data and Simulation Engine are discussed below.5 How do we achieve this vision of a Universal Data and Simulation Engine? Toward this end, NIST has been exploring the concepts of dynamic data evaluation and virtual measurements of chemical and physical properties and predictive simulations of physical phenomena and processes.
From page 78...
... Quantum chemistry methods have achieved the greatest level of robustness and coupled with advances in computational speed have enabled widespread success in areas such as predicting gas-phase, small-molecule thermochemistry and providing insight into reaction mechanisms. Current challenges for quantum chemistry are accurate predictions of rate constants and reaction barriers, condensed-phase thermochemistry and kinetics, van der Waals forces, searching a complex reaction space, transition metal and inorganic systems, and performance of alloys and materials dependent upon the chemical composition.
From page 79...
... This is due to the maturity of quantum mechanics in reliably predicting gas-phase thermochemistry for small (20 or fewer nonhydrogen atoms), primarily organic, molecules, plus the availability of standard-reference-quality experimental data.
From page 80...
... The tools and systematic protocols to customize and validate potentials for given properties with specified accuracy and uncertainty do not currently exist and need to be developed. In conclusion, we are still at the early stages of taking advantage of the full potential offered by scientific computing and information technology to benefit both academic science and industry.
From page 81...
... At the atomistic or molecular level, detailed models of the system are employed in molecular simulations to predict the structural, thermodynamic, and dynamic behavior of a system. The range of application of these methods is on the order of angstroms to nanometers.
From page 82...
... For molecular simulations of the structure and thermodynamic properties of complex fluids and materials, particularly those consisting of large, articulated molecules (e.g., surfactants, polymers, proteins), Monte Carlo methods offer attractive features that make them particularly effective.
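As a reminder of what such methods do at their core, the sketch below implements a bare Metropolis displacement move on a toy one-dimensional system; the specialized trial moves for articulated molecules that the passage alludes to build on exactly this accept/reject skeleton. The harmonic energy function is an illustrative stand-in, not a molecular force field.

```python
import math, random

def metropolis_step(x, energy, beta=1.0, max_disp=0.3):
    """One Metropolis trial move: displace a random particle and accept
    with probability min(1, exp(-beta * dE))."""
    i = random.randrange(len(x))
    old, e_old = x[i], energy(x)
    x[i] = old + random.uniform(-max_disp, max_disp)
    d_e = energy(x) - e_old
    if d_e > 0 and random.random() >= math.exp(-beta * d_e):
        x[i] = old  # reject: restore the previous configuration
    return x

# Toy system: ten particles in a harmonic trap.
energy = lambda conf: sum(0.5 * xi * xi for xi in conf)
config = [random.uniform(-1.0, 1.0) for _ in range(10)]
for _ in range(1000):
    config = metropolis_step(config, energy)
```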
From page 83...
... The development of novel, clever Monte Carlo trial moves for specific systems is a fertile area of research; significant advances in our ability to model fluids and materials will result from such efforts.
From page 84...
... Reliable force fields are now being proposed for a wide variety of systems, including hydrocarbons, carbohydrates, alcohols, polymers, etc.6,7,8 Accurate force fields are the cornerstone of fluids and materials modeling; much more work in this area is required to reach a stage at which modeling tools can be used with confidence to interpret the results of experiments and to anticipate the behavior of novel materials. The above discussion has been focused on Monte Carlo methods.
From page 85...
... Multiscale methods for the study of dynamic processes currently rely on separation of time scales for various processes. One of the cornerstones of these methods is the averaging or coarse-graining of fast, local processes into a few well-chosen variables carrying sufficient information content to provide a meaningful description of a system at longer time and length scales.
From page 86...
... The density of memory is increasing at the same rate as computing speed, and disk drive densities are increasing at an even faster pace. These advances have already led to a new era in computational chemistry: computational models of molecules and molecular processes are now so widely accepted, and PCs and workstations so reasonably priced, that molecular calculations are routinely used by many experimental chemists to better understand the results of their experiments.
From page 87...
... In fact, Professor Jack Dongarra of the University of Tennessee, one of the world's foremost experts in scientific computing, has recently stated this fact in stark terms: The rising tide of change [resulting from advances in information technology] shows no respect for the established order.
From page 88...
... [Figure: growth in computing power; log-scale axis from 1 to 100 (teraflops).]
From page 90...
... The performance of the Earth Simulator is equally impressive when real scientific and engineering applications are considered. A general atmospheric global circulation benchmark ran at over 26 teraflops, or 65% of peak performance, on the Earth Simulator.2 On the commercially oriented machines that are currently being used in the United States, it has proven difficult to achieve more than 10% of peak performance for such an application.3 So, not only is the raw speed of the Japanese Earth Simulator very high, it is also very effective for scientific and engineering applications.
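As a point of reference, sustained performance of 26 teraflops at 65% of peak implies a theoretical peak of roughly 26/0.65 ≈ 40 teraflops; by the same arithmetic, a machine achieving only 10% of such a peak would sustain about 4 teraflops on the same application.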
From page 91...
... Opportunities in Computational Chemistry What do the advances occurring in computing technology mean for computational chemistry, especially electronic structure theory, the author's area of expertise? It is still too early to answer this question in detail, but the advances in computing technology will clearly have a major impact on both the size of molecules that can be computationally modeled and the fidelity with which they can be modeled.
From page 92...
... But the accuracy of the calculations has been improving steadily over the past 30 years. By 2000, bond energies could be predicted to better than 1 kcal/mol, as good as or better than the accuracy of most experimental measurements.
From page 95...
... This is important in the interpretation of experimental data (experimentalists always seem to focus on larger molecules than we can model), to characterize molecules for which experimental data are unavailable, and to obtain data for parameterizing semiempirical models of more complex systems.
From page 96...
... A 2000, 104, 9971. Challenges in Computational Chemistry Although the opportunities offered by advances in computing technologies are great, many challenges must be overcome to realize the full benefits of these advances.
From page 98...
... [Figure: log-scale plot spanning roughly 10^-1 to 10^6; the CCSD(T) curve is labeled.]
From page 99...
... Beylkin, "Multiresolution Quantum Chemistry: Basic Theory and Initial Applications," to be published.
From page 100...
... FIGURE 10 Mathematical challenges: multiwavelet calculations on benzene. DFT/LDA energy at precision thresholds: 10^-3, -230.194 hartrees; 10^-5, -230.19838; 10^-7, -230.198345. For comparison, Partridge-3 primitive set + aug-cc-pVTZ polarization set: -230.158382 hartrees. Courtesy of R
From page 102...
... from Cray, Inc. Oak Ridge National Laboratory is leading this effort for DOE's Office of Science.
From page 103...
... Autoignition is also the basis of a very efficient internal combustion engine with extremely low emissions: the revolutionary homogeneous-charge compression-ignition, or HCCI, engine. The catch is that HCCI engines cannot yet be used in cars and trucks because it is not yet possible to properly control the "flameless" combustion of fuels associated with autoignition.
From page 104...
... Nanoscale phenomena, on the other hand, often operate on micro- to millisecond (or longer) time scales and over distances of 10-100 (or more)
From page 106...
... FIGURE 13 Background: exponential growth in GenBank (number of base pairs, 1982-2002).
From page 108...
... computing power from Moore's Law. The upper band represents the rate of growth in computing resources needed for genomic research (i.e., the amount of computing power that will be needed to mine and analyze genomic data)
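A toy calculation, using illustrative growth rates rather than the figure's actual data, shows why two exponentials with different rates diverge so quickly:

```python
# Hypothetical rates: data growing ~60%/year versus compute ~40%/year.
years = 10
data_growth, compute_growth = 0.60, 0.40
gap = (1 + data_growth) ** years / (1 + compute_growth) ** years
print(f"after {years} years, data needs outpace compute by {gap:.1f}x")
```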
From page 109...
... In addition to the hardware required to store and access such massive amounts of data, a seasoned staff will be required to manage such a resource. Important data management services such as backup and restore of research data can be assured more easily in a Grid environment
From page 110...
... FIGURE 17 Elements of the North Carolina BioGrid: BioGrid portal and bioapplications interfaces; grid-aware or grid-enabled bioinformatics applications.
From page 111...
... I challenge the participants in this workshop to think about applications for Grid technologies in chemical science and technology. Clearly, chemists and chemical engineers are not yet confronted with the flood of data that is becoming available in the genomics area, but the amount of data that we have in chemistry is still substantial.
From page 112...
... In chemical science and technology, on the other hand, remote access to instruments is largely a foreign concept.
From page 113...
... In EMSL there are a number of first-of-a-kind and one-of-a-kind instruments that could be made available to the community via the Internet. We decided to focus on the instruments in the High-Field Nuclear Magnetic Resonance Facility as a test case.
From page 114...
... Scientists can do everything from the virtual control panel displayed on their computer screens that they could do sitting at the real control panel. Currently, over half of the scientists using EMSL's high-field NMRs use them remotely, a testimony to the effectiveness of this approach to research in chemical science and technology.
From page 115...
... Virtual laboratories have already proven to be an effective means of dealing with the rising costs of forefront instruments for chemical research by providing capabilities needed by researchers not colocated with the instruments; all we need is a sponsor willing to push this technology forward on behalf of the user community. The twenty-first century will indeed be an exciting time for chemical science and technology.
From page 116...
... We conclude with a summary of the advances and a number of challenges. Sequence to Structure: Structure Prediction in Protein Folding Structure prediction of polypeptides and proteins from their amino acid sequences is regarded as a holy grail in the computational chemistry and molecular and structural biology communities.
From page 117...
... In spite of pioneering contributions and decades of effort, the ab initio prediction of the folded structure of a protein remains a very challenging problem. The existing approaches for protein structure prediction can be classified as: (1)
From page 118...
... [Figure: overall structure prediction framework — helix prediction (detailed modeling; simulations of local interactions); beta-sheet prediction (modeling of beta-sheet formation; prediction of a list of optimal arrangements); derivation of restraints; overall 3D structure prediction (structural data from previous stages; prediction via a novel solution approach combining global optimization and molecular dynamics).]
From page 119...
... or larger oligopeptides.29 The overall methodology for the ab initio prediction of helical segments encompasses the following steps:24 The overlapping oligopeptides are modeled as neutral peptides surrounded by a vacuum environment using the ECEPP/3 force field.30 An ensemble of low-potential-energy pentapeptide conformations, along with the global minimum potential energy conformation, is identified using a modification of the αBB global optimization approach31 and the conformational space annealing approach.32 For the set of unique conformers Z
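The sketch below is a generic stochastic annealing search over a single torsion angle with a made-up energy profile; it conveys the flavor of conformational-space annealing but stands in for neither the ECEPP/3 energies nor the deterministic αBB machinery cited above.

```python
import math, random

def anneal(energy, x0, t_hi=5.0, t_lo=0.01, steps=20000):
    """Simulated-annealing search over one torsion angle (radians)."""
    x, t = x0, t_hi
    cool = (t_lo / t_hi) ** (1.0 / steps)  # geometric cooling schedule
    best, e_best = x, energy(x)
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.3)
        d_e = energy(cand) - energy(x)
        if d_e < 0 or random.random() < math.exp(-d_e / t):
            x = cand
            if energy(x) < e_best:
                best, e_best = x, energy(x)
        t *= cool
    return best, e_best

# Rough torsional profile with several local minima.
e = lambda phi: math.cos(3 * phi) + 0.5 * math.cos(phi + 1.0)
print(anneal(e, x0=0.0))
```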
From page 120...
... models that feature three types of binary variables: (1) representing the existence or nonexistence of contacts between pairs of hydrophobic residues; (2)
From page 121...
... for constrained minimization is enhanced by introducing torsion angle dynamics40 within the context of the αBB global optimization framework.26 Two viewpoints provide competing explanations of the protein-folding question.
From page 122...
... From this astronomical number of sequences, the computational sequence selection process aims at selecting those sequences that will be compatible with a given structure using efficient optimization of energy functions that model the molecular interactions. The first attempts at computational protein design focused only on a subset of core residues and explored steric van der Waals-based energy functions, although over time they evolved to incorporate more detailed models and interaction potentials.
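A minimal illustration of sequence selection as optimization, using a hypothetical two-letter (hydrophobic/polar) alphabet and invented contact energies: exhaustive enumeration works here only because the example is tiny, which is precisely why combinatorial formulations are needed for the full 20-amino-acid problem.

```python
from itertools import product

# Hypothetical alphabet and contact energies (H = hydrophobic, P = polar);
# real formulations use all 20 amino acids and fitted potentials.
CONTACT_E = {("H", "H"): -2.0, ("H", "P"): -0.5,
             ("P", "H"): -0.5, ("P", "P"): -0.1}
CONTACTS = [(0, 3), (1, 2), (2, 4)]  # residue pairs in contact in the target fold

def fold_energy(seq):
    return sum(CONTACT_E[(seq[i], seq[j])] for i, j in CONTACTS)

# Exhaustive search over 2**5 sequences; at 20**n this blows up, which is
# why integer-programming formulations are used instead.
best = min(product("HP", repeat=5), key=fold_energy)
print("".join(best), fold_energy(best))
```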
From page 123...
... optimized decoy structures.58,59 The resulting potential, which involves 2730 parameters, was shown to provide higher Z scores than other potentials and to place native folds lower in energy.58,59 The formulation allows a set of mutations for each position i that in the general case comprises all 20 amino acids.
From page 124...
... nonconvex constrained global optimization problem, a class of problems for which several methods have been developed. In this work, the formulations are solved via the αBB deterministic global optimization approach, a branch-and-bound method applicable to the identification of the global minimum of nonlinear optimization problems with twice-differentiable functions.27,36-39,60,61 In addition to identifying the global minimum energy conformation, the global optimization algorithm provides the means for identifying a consistent ensemble of low-energy conformations.35,61 Such ensembles are useful in deriving quantitative comparisons between the free folding and template-constrained simulations.
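To convey the branch-and-bound idea in miniature, the sketch below prunes subintervals whose lower bound cannot beat the incumbent, using a simple Lipschitz bound in place of the convex underestimators that αBB constructs for twice-differentiable problems. It is a one-dimensional illustration, not the published algorithm.

```python
import heapq, math

def branch_and_bound(f, a, b, lipschitz, tol=1e-4):
    """Deterministic global minimization of f on [a, b] given a Lipschitz
    constant: split intervals, bound below, prune what cannot win."""
    def lower(lo, hi):
        return min(f(lo), f(hi)) - lipschitz * (hi - lo) / 2
    best_x, best_f = a, f(a)
    heap = [(lower(a, b), a, b)]
    while heap:
        lb, lo, hi = heapq.heappop(heap)
        if lb > best_f - tol:
            continue                      # prune: cannot beat the incumbent
        mid = (lo + hi) / 2
        if f(mid) < best_f:
            best_x, best_f = mid, f(mid)  # update the incumbent
        heapq.heappush(heap, (lower(lo, mid), lo, mid))
        heapq.heappush(heap, (lower(mid, hi), mid, hi))
    return best_x, best_f

# f has several local minima on [0, 10]; |f'| <= 1 + 5 = 6.
f = lambda x: math.sin(x) + math.sin(5 * x)
print(branch_and_bound(f, 0.0, 10.0, lipschitz=6.0))
```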
From page 125...
... The discussion is organized as follows. First, I review the key challenges that one faces in carrying out accurate condensed phase modeling, with the analysis centered on the core technologies that form the essential ingredients of a simulation methodology.
From page 126...
... The problems of protein structure prediction, protein-ligand binding, and enzymatic catalysis, which are discussed in the next section, fall into this category. Any condensed-phase simulation protocol that is going to address the above problems requires three fundamental components: 1.
From page 127...
... Energy Models: Quantum Chemical Methods. Quantum chemistry represents the most fundamental approach to computation of the energy of any given atomic configuration.
From page 128...
... In this case, the standard models (valence terms for stretches, bends, and torsions; electrostatic and van der Waals terms for intermolecular interactions) are sufficient, with the proviso that the electrostatics should be described by higher-order multipoles of some sort (as opposed to using only atomic point charges)
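For readers unfamiliar with these terms, a minimal sketch of two such energy terms follows (parameter values are illustrative, not drawn from any production force field); the bare point-charge Coulomb term below is the piece that higher-order multipoles would refine.

```python
COULOMB_K = 332.06  # kcal*angstrom/(mol*e^2)

def bond_energy(r, r0=1.53, k_b=300.0):
    """Harmonic valence ('stretch') term; k_b in kcal/(mol*angstrom^2)."""
    return k_b * (r - r0) ** 2

def pair_energy(r, q1, q2, sigma=3.4, epsilon=0.24):
    """Nonbonded terms: Lennard-Jones van der Waals plus fixed point-charge
    electrostatics (kcal/mol, angstroms, elementary charges)."""
    lj = 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return lj + COULOMB_K * q1 * q2 / r

print(bond_energy(1.60), pair_energy(4.0, 0.10, -0.10))
```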
From page 129...
... Computational chemistry needs to become a full partner with experiment, not try to replace it; the technology to do that simply is not there at this time. Solvation Models. I concentrate here on continuum models for aqueous solution, although the basic ideas are not very different for other solvents.
From page 130...
... Generation of accurate protein structures, whether in a homology modeling context or simply enumerating low-energy structures given a high-resolution crystal structure. The use of a single protein conformation to assess ligand binding is highly problematic if one wants to evaluate a large library of diverse ligands so as to locate novel scaffolds capable of supporting binding.
From page 131...
... All one can do is attempt to improve all three components and carry out tests to see whether better agreement with experiment is obtained. My own intuition is that on a 5- to 10-year time scale, there will be some real successes in this type of calculation, but achieving reliability over a wide range of chemistries and receptors is going to be a great challenge.
From page 132...
... THE CURRENT STATE OF RESEARCH IN INFORMATION AND COMMUNICATIONS James R. Heath, University of California, Los Angeles. This paper focuses on an area of nanoelectronics to which chemists have been making substantial contributions in the past five to ten years.
From page 133...
... MULTISCALE MODELING Dimitrios Maroudas University of Massachusetts In the next decade, multiscale modeling will be a very important area of chemical science and technology in terms of both needs and opportunities. The field has emerged during the past 10 years in response to the need for an integrated computational approach toward predictive modeling of systems that are both complex and complicated in their theoretical description.
From page 134...
... Ultimately, what one would like to do, from an engineering viewpoint at least, is use all these methodologies to explore vast regions of parameter space, identify critical phenomena, promote critical phenomena that improve the behavior of a system, and avoid critical phenomena that lead to failure. [Figure: hierarchy of modeling methods over length and time scales — supply-chain modeling (planning and scheduling; mixed-integer linear programming; global logistics); process simulation (equation-based models; control and optimization; processing units and facilities); continuum mechanics (finite-element and finite-difference methods, boundary-integral methods; macroscopic modeling); mesoscopic-scale coarse-grained quantum and statistical mechanics (mixed/coupled atomistic-continuum methods; mesoscale modeling); statistical mechanics (semiempirical Hamiltonians; molecular statics, lattice dynamics, molecular dynamics, Monte Carlo; modeling for mechanistic understanding); quantum mechanics (ab initio electronic structure; density functional theory; first-principles molecular dynamics; accurate calculation of materials properties).]
From page 135...
... Strict requirements must be imposed if multiscale modeling is to become a powerful predictive tool. In particular, we need to deal with issues of accuracy and transferability through connection to first principles because phenomenological models are not transferable between widely different environments.
From page 136...
... I explain how I see computational science: what it is, where it is going, what is driving the fundamental changes that I believe are taking place, and finally, how all this relates to you and to the chemical sciences. I start with some controversial views.
From page 137...
... engineering, or maybe it's the antidiscipline, depending on how one looks at it.
From page 138...
... plot, which shows speedup (relative to computer power in 1970) derived from supercomputer hardware, is just another restatement of Moore's Law.1 The three last data points are ASCI Red and ASCI White (the big DOE machines)
From page 139...
... FIGURE 2 Computational Science and Engineering (CSE) at the intersection of applied mathematics, computer science, and science and engineering.
From page 140...
... Another approach is the hybrid finite element-molecular dynamics-quantum mechanics method, attributed to Abraham, Broughton, Bernstein, and Kaxiras.3 This method is attractive because it is massively parallel, but it is designed for systems that involve a central defective region surrounded by a region that is only slightly perturbed from equilibrium. Therefore it has limitations in the systems that it can address.
From page 141...
... We must find a way to integrate all these data and use them as input for the computations. A key challenge in systems biology is identification of the control behavior.
From page 142...
... The first is multiple time scales, a problem that is familiar in chemical engineering, where it is called stiffness, and we have good solutions to it. In the stochastic world there doesn't seem to be much knowledge of this phenomenon, but I believe that we recently have found a solution to this problem.
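A classic demonstration of stiffness, on an invented linear test equation: the explicit method is forced to tiny steps by stability rather than accuracy, while the implicit method remains stable at large steps.

```python
import math

LAM = 1000.0  # stiffness parameter: fast time scale ~ 1/LAM

def explicit_euler(dt, t_end=0.1):
    y, t = 0.0, 0.0
    while t < t_end - 1e-12:
        y += dt * (-LAM * (y - math.cos(t)))  # forward Euler step
        t += dt
    return y

def implicit_euler(dt, t_end=0.1):
    y, t = 0.0, 0.0
    while t < t_end - 1e-12:
        t += dt
        # Backward Euler step solved in closed form (the ODE is linear).
        y = (y + dt * LAM * math.cos(t)) / (1.0 + dt * LAM)
    return y

# dt = 0.005 exceeds the explicit stability limit (2/LAM = 0.002):
print(explicit_euler(0.005))  # blows up
print(implicit_euler(0.005))  # stable; tracks the slow solution ~cos(t)
```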
From page 143...
... Uncertainty analysis was mentioned many times at the workshop, and still to come are multiscale systems, fully stochastic systems, and so on.
From page 144...
... difference. There is a very tenuous trade-off between round-off and truncation error, for which there sometimes is no solution, particularly for chemistry problems, which tend to be badly scaled.
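The trade-off is easy to reproduce: in the sketch below, the forward-difference error for the derivative of exp(x) first shrinks as the step h decreases (truncation dominates) and then grows again (round-off dominates), with the minimum near h ≈ sqrt(machine epsilon).

```python
import math

x, exact = 1.0, math.exp(1.0)  # d/dx exp(x) at x = 1 is exp(1)
for k in range(1, 16):
    h = 10.0 ** -k
    approx = (math.exp(x + h) - math.exp(x)) / h  # forward difference
    print(f"h=1e-{k:02d}  error={abs(approx - exact):.2e}")
# Error falls until h ~ 1e-8, then rises as round-off in the subtraction
# overwhelms the shrinking truncation term.
```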
From page 145...
... There have been some good developments in user interface technology for scientific computing, and some exceptions to the sad state of most of our software interfaces. In fact, I think that the first one is obvious: MATLAB.6 It is an excellent example of the value of a beautiful interface, but it has been directed primarily at relatively small problems.
From page 146...
... SIMULATION IN MATERIALS SCIENCE George C. Schatz, Northwestern University. Introduction. The primary goal of this paper is to provide examples of problems in nanomaterials where computational methods are able to play a vital role in the discovery process.
From page 147...
... cede, with the result that codes are now available for describing the optical properties of assemblies of nanoparticles with upwards of tens of thousands of particles.
From page 148...
... Melting of DNA That Links Nanoparticles. The DNA-linked gold nanoparticle aggregates just described have another use for DNA detection that is both very important and distinctly nanoscale in nature. When the aggregates are heated, it is found that they "melt" (i.e., the double helix unravels into single-stranded oligonucleotides)
From page 149...
... [Figure: cooperative melting model — melted fraction from 0.00 to 1.00 versus temperature; T1m = 320, T2m = 350; ΔH1 = ΔH2 = 50 kcal/mol.]
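For intuition, a generic two-state van't Hoff melting curve can be written in a few lines; the parameters echo the figure (Tm near 320 and 350 K, ΔH of 50 kcal/mol), and an ad hoc cooperativity factor n sharpens the transition, but this is a schematic stand-in, not the authors' cooperative melting model.

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def melted_fraction(temp_k, t_m, dh, n=1):
    """Two-state van't Hoff melting curve; n > 1 mimics the cooperativity
    that sharpens melting for DNA-linked nanoparticle aggregates."""
    return 1.0 / (1.0 + math.exp(n * (dh / R) * (1.0 / temp_k - 1.0 / t_m)))

for t in range(300, 361, 10):
    print(t, round(melted_fraction(t, t_m=320.0, dh=50.0), 3),
          round(melted_fraction(t, t_m=320.0, dh=50.0, n=4), 3))
```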
From page 150...
... lap and the DNA is stabilized needs to be proven, and crucial information about the range of the cooperative mechanism needs to be determined. To do this, we must simulate the ion atmosphere around DNA dimers, trimers, and other structures, ideally with gold particles also nearby.
From page 151...
... been collaborating with a team of AFOSR-funded faculty (Steven Sibener, Luping Yu, Tim Minton, Dennis Jacobs, Bill Hase, Barbara Garrison, John Tully) to develop theory to model the erosion of polymers and other materials, and to test this theory in conjunction with laboratory and satellite experiments.
From page 152...
... DISTRIBUTED CYBERINFRASTRUCTURE SUPPORTING THE CHEMICAL SCIENCES AND ENGINEERING Larry L
From page 153...
... Other areas of science have similar levels of complexity, and I would argue that those areas have progressed farther than the chemical sciences in thinking about the distributed information infrastructure that needs to be built to make the science possible. · Medical Imaging with MRI: The NIH has funded the Biomedical Informatics Research Network (BIRN)
From page 154...
... Every year both the number and the speed of these channels increase, thereby creating a bandwidth explosion. As an aside, the capability of optical fiber to support these multiple channels, ultimately enhancing the chemical sciences using the fiber, is itself a victory for chemical engineering.
From page 155...
... In my institute we try to look out to see how this is all happening, by creating "Living Laboratories of the Future." We do this by deploying experimental networked test beds, containing bleeding-edge technological components that will become mass market in 3, 5, or 10 years, and using the system now with faculty and students. That is the way supercomputers developed from 15 years ago, when a few people ran computational chemistry on a gigahertz machine like the Cray-2 that cost $20 million, to today, when everybody works on a gigahertz PC that costs only $1000.
From page 156...
... For science, all of these trends mean that we are going to be able to know the state of the world in much greater detail than any of us ever thought possible, and probably on a much shorter time scale than most of us would expect as we focus on this report that looks at "the next 10 years." 3See: http://www.cs.wisc.
From page 157...
... I describe research that has significant academic character but simultaneously has a path of relevance to automotive applications. The second section focuses on some cross-cutting themes, and the third section describes three specific examples that illustrate those themes and reflect some challenges and opportunities for chemical sciences.
From page 158...
... Catalytic Materials Exhaust gas catalytic aftertreatment for pollution control depends heavily on chemistry and on materials. Hence, in one CAMS project, the goal is to guide the development of improved catalytic materials for lean-burn, exhaust-NOx (NO and NO2)
From page 159...
... FIGURE 2 Ab initio calculations on yttria-stabilized zirconia to improve activity and robustness of oxide conductors for fuel cells and sensors. Ionic Conductors. A second example from the CAMS group focuses on ionic conductors, with the goal of improving the activity and robustness of oxide conductors for fuel cells and sensors.5 The scientific goal of the project is to understand, with the intent of tailoring, ion diffusion in disordered electrolytes.
From page 160...
... on four recurring and broadly applicable themes. The first theme is the false dichotomy that arises when trying to distinguish applied from basic research.
From page 161...
... Another way of looking at the two-way relationship between science and application is to recognize that the difference is in the starting and ending points, but not necessarily in the road traveled. If the primary motivation is science, a quest for understanding, then the world in which the researcher lives is one that is reasonably deterministic, predictive, and well controlled.
From page 162...
... so on. Although the constraints are many, it is necessary to consider all of them seriously and intellectually.
From page 163...
... In the following discussion, the focus for each example is on why the industry has an interest in the application, what the industry is trying to accomplish, and the nature of some of the key ongoing scientific challenges. The emphasis is on how the chemical sciences fit in and how simulation fits in, but the challenge in each example involves significantly more than chemical sciences and simulation.
From page 164...
... The virtual aluminum-casting project has developed a comprehensive and integrated set of simulation tools that capture expertise in aluminum metallurgy, casting, and heat treatment: years of experience captured in models. In addition to enabling calculations, an often-overlooked characteristic of models is that they can organize knowledge and information and thereby assist in knowledge dissemination.
From page 166...
... Precipitates form during heat treatment, and manufacturers have the opportunity to optimize the aluminum heat treatment to achieve an optimized structure. First-principles calculations and multiscale calculations are able to elucidate opportunities, including overturning 100 years of metallurgical conventional wisdom (Figure 4).8 Several key challenges must be overcome in this area.
From page 167...
... In other words, the material must have adequate thermal durability and resist sintering and chemical poisoning from exhaust gas components. Neither thermal degradation nor chemical degradation is particularly well understood beyond some general principles; i.e., modeling is difficult without a lot of empirical input.
From page 168...
... The industry does utilize accelerated aging, but it is done somewhat empirically without the benefit of a solid foundation of basic science. Homogeneous Charge Compression Ignition The third and final example arises from the Ford Motor Company-MIT Alliance, a partnership that funds mutually beneficial collaborative research.
From page 169...
... Obtaining reasonably definitive answers in a timely manner is equally crucial, but too often, there does not seem to be a time scale driving needed urgency. Data and knowledge management offer additional overlooked opportunities.
From page 170...
... For large pharmaceutical companies to survive, they must maintain an income stream capable of supporting their current infrastructure as well as funding R&D for the future. The cost of drug development and the low probability of technical success call for improved efficiency of drug discovery and development and further investment in innovative technologies and processes that improve the chances of bringing a compound to market as a drug.
From page 171...
... To meet critical time lines they are outsourcing components of research and development to contract research organizations, enabling greater efficiencies by providing added expertise or resources and decreasing development time lines. The trend toward mergers and acquisitions, consolidating pipelines, and attempting to achieve economies of scale is an attempt by large pharmaceutical companies to build competitive organizations.
From page 172...
... The traditional approach has been to generate large amounts of replicate data, to use statistics to provide confidence, and to move cautiously, stepwise, toward higher complexity: from in vitro to cell-based to tissue-based to in vivo in model animals and then to man. In a practical sense, the drug discovery and development odds have been improved by a number of simple strategies: start with chemically diverse leads, optimize them in parallel in both discovery and later development, back them up with other compounds when they reach the clinic, and follow on with new, structurally different compounds after release.
From page 173...
... One of the reasons that this is an accepted approach is that there is richness in data and information and there is a wealth of methodology available, both experimental and computational, that enables these approaches. The limitations in this area are concerned primarily with the challenges facing structural biology such as appropriate expression systems, obtaining sufficient amounts of pure protein, and the ability to crystallize the protein.
From page 174...
... Each compound, in a practical sense, is a question that is being asked of a complex biological system. The answer to a single question provides very little information; however, in a manner analogous to the game of "twenty questions," the combined result from a well-chosen collection of compounds (questions)
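The information-theoretic arithmetic behind the analogy: each well-chosen yes/no answer supplies at most one bit, so twenty answers can in principle distinguish about a million hypotheses.

```python
import math

print(2 ** 20)                    # 1,048,576 distinguishable outcomes
print(math.ceil(math.log2(1e6)))  # ~20 yes/no answers pin down 1 in a million
```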
From page 175...
... The solution to a problem in drug discovery and development is far more complex than a game of 20 questions and should not be trivialized. Even so, the power of discrimination through categorization of answers and integration of answers from diverse experiments provides an extremely powerful mechanism for optimizing toward a satisfying outcome: a potential drug.
From page 176...
... This leads to the compelling conclusion that since the patterns describe evolutionary changes (divergence and convergence) and also describe the critical features of substrate binding, the substrate is the driver of evolutionary change.13 A particular application of PD is in the analysis of variations of genetic information (single nucleotide polymorphisms, or SNPs)

