6 Conclusion

The United States has always played a leadership role in bringing computing technology to bear on science and engineering research and development, advancing entirely new frontiers. The design of the earliest computers, for example Britain's Colossus and the U.S.'s ENIAC during World War II,[1] was driven by defense and national security needs. In the United States, ENIAC was followed by a steady stream of government-driven high-performance systems.[2] Supercomputing continues to be important for satisfying those needs.

The particular technical approaches of any program that develops or uses supercomputing represent a complex compromise between conflicting requirements and the risks and opportunities entailed in various approaches. As described earlier in this report, an assessment of the approaches requires a detailed understanding of (1) the applications, (2) the algorithms used to solve those applications, (3) the codes and programming environments, (4) the performance of the codes on various platforms, (5) the likely evolution of various hardware and software technologies under various funding scenarios, and (6) the costs, probabilities, and risks involved in various approaches. In its final report, the committee will seek to characterize broadly the requirements of different application classes and to examine the architecture, software, algorithm, and cost challenges and trade-offs associated with those application classes, keeping in mind the needs of the nuclear stockpile stewardship program, the broad science community, and the national security community. (Note that the identification of the distinct requirements of the stockpile stewardship program and its relation to the ASC acquisition strategy is expected to be the focus of a separate classified report by the JASONs.)
THE FUTURE OF SUPERCOMPUTING: AN INTERIM REPORT

The committee believes it would be unwise to significantly redirect or reorient current supercomputing programs before careful scientific consideration has been given to the issues described above. Such changes might be hard to reverse, might reduce flexibility, and might increase costs in the future.

Exciting opportunities to advance knowledge and to serve society using supercomputing continue to emerge. The life and health sciences are becoming extraordinarily data rich, and researchers in those sciences are struggling to make sense of the data. The benefits of the genome projects for physiology and medicine cannot be realized without significant investments in computational hardware, algorithms, the software infrastructure, and the human infrastructure. The understanding of the physical world enabled by simulation and modeling is reaching ever-higher levels of fidelity and timeliness. As in many other areas of technology R&D, there seem to be sound economic and social arguments for continued government investment in supercomputing.

To sustain our leadership in supercomputing, to meet the security and defense needs of our nation, and to realize the opportunities to use supercomputing to advance knowledge, progress in supercomputing must continue. Continuity and stability in the government funding of supercomputing appear to be essential to the well-being of supercomputing in the United States. An appropriate balance must be struck between evolutionary and innovative advances. Evolution is important because it allows present achievements to be exploited and because a diversity of approaches to supercomputing, including refinements of existing approaches, appears to be necessary to address the diversity of the computational challenges we face. Innovation in supercomputing stems from application-motivated research, which leads to experimentation and prototyping and then, in turn, to advanced development and testbeds and, finally, to deployment and products. All the stages along that path need sustained investment. Coupled innovations in architecture, in software, in algorithms, and in application strategies and solution methods are equally important. Balance is also needed between exploiting cost-effective advances in widely used hardware and software and developing custom solutions that meet the most demanding needs. As we reach the limitations of current approaches and encounter the disruptions that are unavoidable when different technologies grow at different rates, the fruits of that research and its maturation into practice will prepare us for major paradigm shifts in the future.

[1] Colossus was designed to decrypt German codes. See <http://www.codesandciphers.org.uk/lorenz/colossus.htm>. ENIAC (Electronic Numerical Integrator Analyzer and Computer) was built to calculate ballistic firing tables. See <http://ftp.arl.mil/~mike/comphist/eniac-story.html>.
[2] For a discussion of early government-funded projects in the late 1940s and 1950s that essentially created the early U.S. computer industry, see Chapter 3 ("Military Roots") in Kenneth Flamm, 1988, Creating the Computer: Government, Industry, and High Technology, Washington, D.C.: Brookings Institution Press.