
1


The Need for Continued Performance Growth

Information technology (IT) has become an integral part of modern society, affecting nearly every aspect of our lives, including education, medicine, government, business, entertainment, and social interactions. Innovations in IT have been fueled by a continuous and extraordinary increase in computer performance. By some metrics, computer performance has improved by an average factor of 10 every 5 years over the past 2 decades. A sustained downshift in the rate of growth in computing performance would have considerable economic and societal ramifications. The industries involved are responsible for about $1 trillion of annual revenue in the United States. That revenue has depended on sustained demand for IT products and services, which in turn has fueled demand for constantly improving performance. Indeed, U.S. leadership in IT depends in no small part on the nation's driving, and taking advantage of, the leading edge of computing performance. Virtually every sector of society—manufacturing, financial services, education, science, government, military, entertainment, and so on—has become dependent on the continued growth in computing performance to drive new efficiencies and innovations. Moreover, all current and foreseeable future applications rely on a huge software infrastructure, which itself would have been impossible to develop with the more primitive software-development and programming methods of the past. The principal force allowing better programming models, which emphasize programmer productivity over computing efficiency, has been the growth in computing performance. (Chapter 4 explores implications for software and programming in more detail.)

This chapter first considers the general question of why faster computers are important. It then examines four broad fields—science, defense and national security, consumer applications, and enterprise productivity—that have depended on and will continue to depend on sustained growth in computing performance. The fields discussed by no means constitute an exhaustive list,1 but they are meant to illustrate how computing performance and its historic exponential growth have had vast effects on broad sectors of society and what the results of a slowdown in that growth would be.

WHY FASTER COMPUTERS ARE IMPORTANT

Computers can do only four things: they can move data from one place to another, they can create new data from old data via various arithmetic and logical operations, they can store data in and retrieve them from memories, and they can decide what to do next. Students studying computers or programming for the first time are often struck by the realization that, notwithstanding compelling appearances to the contrary, computers are extremely primitive machines, capable of performing only the most mind-numbingly banal tasks. The trick is that computers can perform those simple tasks extremely fast—in periods measured in billionths of a second—and they perform those tasks reliably and repeatably. Like a drop of water in the Grand Canyon, each operation may be simple and may in itself not accomplish much, but a lot of them (billions per second, in the case of computers) can get a lot done.
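Those four primitive capabilities can be made concrete in a few lines of code. The sketch below is our own illustration (the variable names and values are arbitrary), not an excerpt from any particular program:

# A minimal sketch of the four primitive operations described above
# (illustrative only; names and values are arbitrary).

memory = [0] * 8            # a tiny "memory" to store data in and retrieve from

a = 42                      # move data: load a value into a variable
b = a                       # move data from one place to another

c = a + b                   # create new data from old data (arithmetic)
is_even = (c % 2 == 0)      # ...or via a logical operation

memory[3] = c               # store data in memory
d = memory[3]               # retrieve the data from memory

if is_even:                 # decide what to do next (a branch)
    print("sum", d, "is even")
else:
    print("sum", d, "is odd")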

Over the last 60 years of computing history, computer buyers and users have essentially “voted with their wallets” by consistently paying more for faster computers, and computer makers have responded by pricing their systems accordingly: a high-end system may be, on average, 10 percent faster and 30 percent more expensive than the next-best. That behavior has dovetailed perfectly with the underlying technology development—as ever-faster silicon technology has become available, ever-faster computers could be designed. It is the nature of the semiconductor manufacturing process that silicon chips coming off the fabrication line exhibit a range of speeds. Rather than discard the slower chips, the manufacturer simply charges less for them. Ever-rising performance has been the wellspring of the entire computer industry. Meanwhile, the improving economics of ever-larger shipment volumes have driven overall system costs down, reinforcing a virtuous spiral2 by making computer systems available to lower-price, larger-unit-volume markets.

____________________

1Health care is another field in which IT has substantial effects—in, for example, patient care, research and innovation, and administration. A recent National Research Council (NRC) report, although it does not focus specifically on computing performance, provides numerous examples of ways in which computational technology and IT are critical underpinnings of virtually every aspect of health care (NRC, 2009, Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions, Washington, D.C.: The National Academies Press, available online at http://www.nap.edu/catalog.php?record_id=12572). Yet another critically important field that increasingly benefits from computation power is infrastructure. “Smart” infrastructure applications in urban planning, high-performance buildings, energy, traffic, and so on are of increasing importance. That is also the underlying theme of two of the articles in the February 2009 issue of Communications of the ACM (Tom Leighton, 2009, Improving performance on the Internet, Communications of the ACM 52(2): 44-51; and T.V. Raman, 2009, Toward 2W: Beyond Web 2.0, Communications of the ACM 52(2): 52-59).

For their part, computer buyers demand ever-faster computers in part because they believe that using faster machines confers on them an advantage in the marketplace in which they compete.3 Applications that run on a particular generation of computing system may be impractical or may not run at all on a system that is only one-tenth as fast, and this encourages hardware replacement for performance every 3-5 years. That trend has also encouraged buyers to place a premium on fast new computer systems, because buying fast systems forestalls system obsolescence as long as possible. Traditionally, software providers have shown a tendency to use exponentially more storage space and central processing unit (CPU) cycles to attain linearly more performance, a tradeoff commonly referred to as bloat. Reducing bloat is another way in which future system improvements may be possible. The need for periodic replacement exists whether the computing is taking place on the desktop or in the “cloud” in a Web-based service, although the pace of hardware replacement may vary in the cloud.

____________________

2A small number of chips are fast, and many more are slower. That is how a range of products is produced that in total provide profits and, ultimately, funding for the next generation of technology. The semiconductor industry is nearing a point where extreme ultraviolet (EUV) light sources—or other expensive, exotic alternatives—will be needed to continue the lithography-based steps in manufacturing. There are a few more techniques left to implement before EUV is required, but they are increasingly expensive to use in manufacturing, and they are driving costs substantially higher. The future scenario that this implies is not only that very few companies will be able to manufacture chips with the smallest feature sizes but also that only very high-volume products will be able to justify the cost of using the latest generation of technology.

3For scientific researchers, faster computers allow larger or more important questions to be pursued or more accurate answers to be obtained; office workers can model, communicate, store, retrieve, and search their data more productively; engineers can design buildings, bridges, materials, chemicals, and other devices more quickly and safely; and manufacturers can automate various parts of their assembly processes and delivery methods more cost-effectively. In fact, the increasing amounts of data that are generated, stored, indexed, and retrieved require continued performance improvements. See Box 1.1 for more on data as a performance driver.


BOX 1.1
Growth of Stored and Retrievable Data

The quantity of information and data that is stored in a digital format has been growing at an exponential rate that exceeds even the historical rate of growth in computing performance, which is the focus of this report. Data are of value only if they can be analyzed to produce useful information that can be retrieved when needed. Hence, the growth in stored information is another reason for the need to sustain substantial growth in computing performance.

As the types and formats of information stored in digital form continue to multiply, they drive rapid growth in stored data. Only a few decades ago, the primary data types stored in IT systems were text and numerical data. But images of increasing resolution, audio streams, and video have all become important types of data stored digitally and then indexed, searched, and retrieved by computing systems.

The growth of stored information is occurring at the personal, enterprise, national, and global levels. On the personal level, the expanding use of e-mail, text messaging, Web logs, and so on is adding to stored text. Digital cameras have enabled people to store many more images in their personal computers and data centers than they ever would have considered with traditional film cameras. Video cameras and audio recorders add yet more data that are stored and then must be indexed and searched. Embedding those devices into the ubiquitous cell phone means that people can and do take photos and movies of events that would previously not have been recorded.

At the global level, the amount of information on the Internet continues to increase dramatically. As static Web pages give way to interactive pages and social-networking sites support video, the amount of stored and searchable data continues its explosive growth. Storage technology has enabled this growth by reducing the cost of storage at a rate even greater than the rate of growth in processor performance.

The challenge is to match the growth in stored information with the computational capability to index, search, and retrieve relevant information. Today's computing systems are not powerful enough to process effectively all the images and video streams being stored. Satellite cameras and other remote-sensing devices typically collect much more data than can be examined for useful information or important events.

Considerably more progress is needed to achieve the vision described by Vannevar Bush in his 1945 paper about a MEMEX device that would collect and make available to users all the information relevant to their life and work.1

_______________

1Vannevar Bush, 1945, “As we may think,” Atlantic Magazine, July 1945, available online at http://www.theatlantic.com/magazine/archive/1969/12/as-we-may-think/3881/.



All else being equal, faster computers are better computers.4 The unprecedented evolution of computers since 1980 exhibits an essentially exponential speedup that spans 4 orders of magnitude in performance for the same (or lower) price. No other engineered system in human history has ever achieved that rate of improvement; small wonder that our intuitions are ill-tuned to perceive its significance. Whole fields of human endeavor have been transformed as computer system capability has ascended through various threshold performance values.5 The impact of computer technology is so widespread that it is nearly impossible to overstate its importance.
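The compounding behind that figure is straightforward (the arithmetic below is ours, using the rate quoted at the opening of this chapter of roughly a factor of 10 every 5 years, sustained over two decades):

\[
10^{20/5} = 10^{4},
\]

that is, 4 orders of magnitude, a ten-thousandfold improvement at roughly constant price.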

Faster computers create not just the ability to do old things faster but the ability to do new things that were not feasible at all before.6 Fast computers have enabled cell phones, MP3 players, and global positioning devices; Internet search engines and worldwide online auctions; MRI and CT scanners; and handheld PDAs and wireless networks. In many cases, those achievements were not predicted, nor were computers designed specifically to cause the breakthroughs. There is no overarching roadmap for where faster computer technology will take us—each new achievement opens doors to developments that we had not even conceived. We should assume that this pattern will continue as computer systems get faster yet.7

____________________

4See Box 1.2 for a discussion of why this is true even though desktop computers, for example, spend most of their time idle.

5The music business, for example, is almost entirely digital now, from the initial sound capture through mixing, processing, mastering, and distribution. Computer-based tricks that were once almost inconceivable are now commonplace, from subtly adjusting a singer’s note to be more in tune with the instruments, to nudging the timing of one instrument relative to another. All keyboard instruments except acoustic pianos are now digital (computer-based) and not only can render very accurate imitations of existing instruments but also can alter them in real time in a dizzying variety of ways. It has even become possible to isolate a single note from a chord and alter it, a trick that had long been thought impossible. Similarly, modern cars have dozens of microprocessors that run the engine more efficiently, minimize exhaust pollution, control the antilock braking system, control the security system, control the sound system, control the navigation system, control the airbags and seatbelt retractors, operate the cruise control, and handle other features. Over many years, the increasing capability of these embedded computer systems has allowed them to penetrate nearly every aspect of vehicles.

6Anyone who has played state-of-the-art video games will recognize the various ways in which game designers wielded the computational and graphics horsepower of a new computer system for extra realism in a game’s features, screen resolution, frame rate, scope of the “theater of combat,” and so on.


BOX 1.2
Why Do I Need More Performance When My Computer Is Idle Most of the Time?

When computers find themselves with nothing to do, by default they run operating-system code known as the idle loop. The idle loop is like the cell-phone parking lot at an airport, where your spouse sits waiting to pick you up when you arrive and call him or her. It may seem surprising or incongruous that nearly all the computing cycles ever executed by computers have been wasted in the idle loop, but it is true. If we have “wasted” virtually all the computing horsepower available since the beginning of the computer age, why should we worry about a potential threat to increased performance in the future? Is there any point in making machinery execute the idle loop even faster? In fact, there is. The reason has as much to do with humans as it does with the computing machines that they design.

Consider the automobile. The average internal-combustion vehicle has a six-cylinder engine capable of a peak output of around 200 horsepower. Many aspects of the engine and drivetrain reflect that peak horsepower: when you press the pedal to the floor while passing or entering a highway, you expect the vehicle to deliver that peak horsepower to the wheels, and you would be quite unhappy if various parts of the car were to leave the vehicle instead, unable to handle the load. But if you drive efficiently, over several years of driving, what fraction of the time is spent under that peak load condition? For most people, the answer is approximately zero. It only takes about 20 horsepower to keep a passenger car at highway speeds under nominal conditions, so you end up paying for a lot more horsepower than you use.

But if all you had at your driving disposal was a 20-horsepower power plant (essentially, a golf cart), you would soon tire of driving the vehicle because you would recognize that energy efficiency is great but not everything; that annoying all the other drivers as you slowly, painfully accelerate from an on-ramp gets old quickly; and that your own time is valuable to you as well. In effect, we all accept a compromise that results in a system that is overdesigned for the common case because we care about the uncommon case and are willing to pay for the resulting inefficiency.

There is no reason to think that this pattern will not continue as long as computers continue to improve. What has changed—and will be described in detail in later chapters—is how we achieve faster computers. In short, power dissipation can no longer be dealt with independently of performance (see Chapter 3). Moreover, although computing performance has many components (see Chapter 2), a touchstone in this report will be computer speed; as described in Box 1.3, speed can be traded for almost any other sort of functionality that one might want.

____________________

7Some of the breakthroughs were not solely performance-driven—some depend on a particular performance at a particular cost. But cost and performance are closely related, and performance can be traded for lower cost if desired.



In a computing system, although you may know that the system is spending almost all its time doing nothing, that fact pales in comparison with how you feel when you ask the system to do something in real time and must wait for it to accomplish that task. For instance, when you click on an attachment or a file and are waiting for the associated application to open (assuming that it is not already open), every second drags.1 At that moment, all you want is a faster system, regardless of what the machine is doing when you are not there. And for the same reason that a car’s power plant and drivetrain are overdesigned for their normal use, your computing system will end up sporting clock frequencies, bus speeds, cache sizes, and memory capacity that will combine to yield a computing experience to you, the user, that is statistically rather rare but about which you care very much.

The idle-loop effect is much less pronounced in dedicated environments—such as servers and cloud computing, scientific supercomputers, and some embedded applications—than it is on personal desktop computers. Servers and supercomputers can never go fast enough, however—there is no real limit to the demand for higher performance in them. Some embedded applications, such as the engine computer in a car, will idle for a considerable fraction of their existence, but they must remain fast enough to handle the worst-case computational demands of the engine and the driver. Other embedded applications may run at a substantial fraction of peak capacity, depending on the workload and the system organization.

____________

1It is worth noting that the interval between clicking on most e-mail attachments and successful opening of their corresponding applications is not so much a function of the CPU’s performance as it is of disk speed, memory capacity, and input/output interconnect bandwidth.

Finding: The information technology sector itself and most other sectors of society—for example, manufacturing, financial and other services, science, engineering, education, defense and other government services, and entertainment—have become dependent on continued growth in computing performance.

The rest of this chapter describes a sampling of fields in which computing performance has been critical and in which a slowing of the growth of computing performance would have serious adverse repercussions. We focus first on high-performance computing and computing performance in the sciences. Threats to growth in computing performance will be felt there first, before inevitably extending to other types of computing.


BOX 1.3
Computing Performance Is Fungible

Computing speed can be traded for almost any other feature that one might want. In this sense, computing-system performance is fungible, and that is what gives it such importance. Workloads that are at or near the absolute capacity of a computing system tend to get all the publicity—for every new computing-system generation, the marketing holy grail is a “killer app,” a new software application that was previously infeasible, now runs adequately, and is so desirable that buyers will replace their existing systems just to get hardware fast enough to run it. The VisiCalc spreadsheet program on the Apple II was the canonical killer app; it appeared at the dawn of the personal-computing era and was so compelling that many people bought computers just to run it. It has been at least a decade since anything like a killer app appeared, at least outside the vocal but relatively small hard-core gaming community. The reality is that modern computing systems spend nearly all their time idle (see Box 1.2 for an explanation of why faster computers are needed despite that); thus, most systems have a substantial amount of excess computing capacity, which can be put to use in other ways.

Performance can be traded for higher reliability: for example, the digital signal processor in a compact-disk player executes an elaborate error-detection-and-correction algorithm, and the more processing capability that can be brought to bear on that problem, the more bumps and shocks the player can withstand before the errors become audible to the listener. Computational capacity can also be used to index mail and other data on a computer periodically in the background to make the next search faster. Database servers can take elaborate precautions to ensure high system dependability in the face of inevitable hardware-component failures. Spacecraft computers often incorporate three processors where one would suffice for performance; the outputs of all three processors are compared via a voting scheme that detects whether one of the three machines has failed. In effect, three processors’ worth of performance is reduced to one processor’s performance in exchange for improved system dependability. Performance can be used in the service of other goals as well. Files on a hard drive can be compressed, trading computing effort and time for better effective drive capacity. Files that are sent across a network or across the Internet use far less bandwidth and arrive at their destination faster when they are compressed. Likewise, files can be encrypted in much the same way to keep their contents private while in transit.
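The spacecraft example can be sketched in a few lines. The code below is our own minimal illustration of majority voting across three redundant processors (the function and value names are invented); real flight systems implement the comparison in hardware and with far more care:

# Minimal sketch of triple modular redundancy (TMR) with majority voting.
# Three redundant "processors" compute the same result; the voter returns the
# majority value and flags any unit that disagrees (a likely failure).

def vote(results):
    """results: list of three outputs from three redundant processors."""
    a, b, c = results
    if a == b or a == c:
        majority = a
    elif b == c:
        majority = b
    else:
        raise RuntimeError("no two processors agree; the fault cannot be masked")
    failed = [i for i, r in enumerate(results) if r != majority]
    return majority, failed

# Example: processor 1 has suffered a fault and returns a wrong value.
value, failed_units = vote([412, 999, 412])
print(value)         # 412  -> three processors' worth of hardware, one result
print(failed_units)  # [1]  -> unit 1 disagrees and can be reported as failed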


THE IMPORTANCE OF COMPUTING PERFORMANCE FOR THE SCIENCES

Computing has become a critical component of most sciences and complements the traditional roles of theory and experimentation.8 Theoretical models may be tested by implementing them in software, evaluating them through simulation, and comparing their results with known experimental results. Computational techniques are critical when experimentation is too expensive, too dangerous, or simply impossible. Examples include understanding the behavior of the universe after the Big Bang, the life cycle of stars, the structure of proteins, functions of living cells, genetics, and the behavior of subatomic particles. Computation is used for science and engineering problems that affect nearly every aspect of our daily lives, including the design of bridges, buildings, electronic devices, aircraft, medications, soft-drink containers, potato chips, and soap bubbles. Computation makes automobiles safer, more aerodynamic, and more energy-efficient. Extremely large computations are done to understand economics, national security, and climate change, and some of these computations are used in setting public policy. For example, hundreds of millions of processor hours are devoted to understanding and predicting climate change, one purpose of which is to inform the setting of international carbon-emission standards.

In many cases, what scientists and engineers can accomplish is limited by the performance of computing systems. With faster systems, they could simulate critical details—such as clouds in a climate model or mechanics, chemistry, and fluid dynamics in the human body—and they could run larger suites of computations that would improve confidence in the results of simulations and increase the range of scientific exploration.

Two themes common to many computational science and engineering disciplines are driving the need for increased computational capability. The first is an increased desire to support multiphysics or coupled simulations, such as adding chemical models to fluid-dynamics or structural simulations. Multiphysics simulations are necessary for understanding complex real-world systems, such as the climate, the human body, nuclear weapons, and energy production. Imagine, for example, a model of the human body in which one could experiment with the addition of new chemicals (such as medicines that change blood pressure), with changes in structure (artificial organs or prosthetic devices), or with the effects of radiation. Many scientific fields are ripe for multiphysics simulations because the individual components are understood well enough and are represented by a particular model and instantiation within a given code base.

____________________

8See an NRC report for one relatively recent take on computing and the sciences (NRC, 2008, The Potential Impact of High-End Capability Computing on Four Illustrative Fields of Science and Engineering, Washington, D.C.: The National Academies Press, available online at http://www.nap.edu/catalog.php?record_id=12451).


The next step is to take two or more such code bases and couple them in such a way that each communicates with the others. Climate modeling, for example, is well along that path toward deploying coupled models, but the approach is still emerging in some other science domains.
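A toy sketch of what such coupling looks like in code is shown below. It is our own illustration: the two model classes, the exchanged quantities, and all numerical constants are invented for clarity, and real coupled codes exchange far richer state through dedicated coupler frameworks:

# Toy sketch of coupling two single-physics codes by exchanging interface data
# each time step (illustrative only).

class FluidModel:
    def __init__(self):
        self.temperature = 300.0          # interface temperature, kelvin (made up)
    def step(self, dt, heat_flux_in):
        self.temperature += 0.01 * heat_flux_in * dt
        return self.temperature           # exported to the other model

class ChemistryModel:
    def __init__(self):
        self.reaction_rate = 1.0
    def step(self, dt, temperature_in):
        # the reaction rate responds to the temperature computed by the fluid code
        self.reaction_rate *= 1.0 + 1e-4 * (temperature_in - 300.0) * dt
        return 5.0 * self.reaction_rate   # heat released, fed back to the fluid

fluid, chem = FluidModel(), ChemistryModel()
heat_flux = 0.0
for step in range(10):                    # coupled time-stepping loop
    temperature = fluid.step(dt=1.0, heat_flux_in=heat_flux)
    heat_flux = chem.step(dt=1.0, temperature_in=temperature)
print(fluid.temperature, chem.reaction_rate)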

The second crosscutting theme in the demand for increased computing performance is the need to improve confidence in simulations, to make computation truly predictive. At one level, this may involve running multiple simulations and comparing results across different initial conditions, parameterizations, space or time resolutions, numerical precisions, models, levels of detail, or implementations. In some fields, sophisticated “uncertainty quantification” techniques are built into application codes by using statistical models of uncertainty, redundant calculations, or other approaches. In any of those cases, the techniques to reduce uncertainty increase the demand for computing performance substantially.9
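As a hedged illustration of why such techniques multiply computational cost, the toy Python ensemble below (our own construction, with a simple chaotic map standing in for an expensive simulation) perturbs the initial condition slightly, reruns the "simulation" 100 times, and reports the spread of the results; every additional ensemble member costs a full extra run:

# Toy ensemble run: repeat a simulation with perturbed initial conditions and
# report the spread of the result as a crude uncertainty estimate.

import random
import statistics

def simulate(initial_value, steps=1000):
    """Stand-in for an expensive simulation (illustrative only)."""
    x = initial_value
    for _ in range(steps):
        x = 3.7 * x * (1.0 - x)      # a simple chaotic map: tiny input
    return x                         # differences grow into a large output spread

ensemble = [simulate(0.5 + random.gauss(0.0, 1e-6)) for _ in range(100)]
print("mean  :", statistics.mean(ensemble))
print("spread:", statistics.pstdev(ensemble))   # narrows only by running
                                                # still more ensemble members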

High-Energy Physics, Nuclear Physics, and Astrophysics

The basic sciences, including physics, also rely heavily on high-end computing to solve some of the most challenging questions involving phenomena that are too large, too small, or too far away to study directly. The report of the 2008 Department of Energy (DOE) workshop on Scientific Grand Challenges: Challenges for Understanding the Quantum Universe and the Role of Computing at the Extreme Scale summarizes the computational gap: “To date, the computational capacity has barely been able to keep up with the experimental and theoretical research programs. There is considerable evidence that the gap between scientific aspiration and the availability of computing resource is now widening….”10 One of the examples involves understanding properties of dark matter and dark energy by analyzing datasets from digital sky surveys, a technique that has already been used to explain the behavior of the universe shortly after the Big Bang and its continuing expansion.

____________________

9In 2008 and 2009, the Department of Energy (DOE) held a series of workshops on computing and extreme scales in a variety of sciences. The workshop reports summarize some of the scientific challenges that require 1,000 times more computing than is available to the science community today. More information about these workshops and others is available online at DOE’s Office of Advanced Scientific Computing Research website, http://www.er.doe.gov/ascr/WorkshopsConferences/WorkshopsConferences.html.

10DOE, 2009, Scientific Grand Challenges: Challenges for Understanding the Quantum Universe and the Role of Computing at the Extreme Scale, Workshop Report, Menlo Park, Cal., December 9-11, 2008, p. 2, available at http://www.er.doe.gov/ascr/ProgramDocuments/ProgDocs.html.


The new datasets are expected to be on the order of 100 petabytes (10^17 bytes) in size and will be generated with new high-resolution telescopes that are on an exponential growth path in capability and data generation. High-resolution simulations of type Ia and type II supernova explosions will be used to calibrate their luminosity; the behavior of such explosions is of fundamental interest, and such observational data contribute to our understanding of the expansion of the universe. In addition, an improved understanding of supernovae yields a better understanding of turbulent combustion under conditions not achievable on Earth. Finally, one of the most computationally expensive problems in physics is aimed at revealing new physics beyond the standard model, described in the DOE report as “analogous to the development of atomic physics and quantum electrodynamics in the 20th century.”11

In addition to the data analysis needed for scientific experiments and basic compute-intensive problems to refine theory, computation is critical to engineering one-of-a-kind scientific instruments, such as particle accelerators like the International Linear Collider and fusion reactors like ITER (which originally stood for International Thermonuclear Experimental Reactor). Computation is used to optimize the designs, save money in construction, and reduce the risk associated with these devices. Similarly, simulation can aid in the design of complex systems outside the realm of basic science, such as nuclear reactors, or in extending the life of existing reactor plants.

Chemistry, Materials Science, and Fluid Dynamics

A 2003 National Research Council report outlines several of the “grand challenges” in chemistry and chemical engineering, including two that explicitly require high-performance computing.12 The first is to “understand and control how molecules react—over all time scales and the full range of molecular size”; this will require advances in predictive computational modeling of molecular motions, which will complement other experimental and theoretical work. The second is to “learn how to design and produce new substances, materials, and molecular devices with properties that can be predicted, tailored, and tuned before production”; this will also require advances in computing and has implications for commercial use of chemical and materials engineering in medicine, energy and defense applications, and other fields.

____________________

11Ibid. at p. vi.

12NRC, 2003, Beyond the Molecular Frontier: Challenges for Chemistry and Chemical Engineering, Washington, D.C.: The National Academies Press, available online at http://www.nap.edu/catalog.php?record_id=10633.


Advances in computing performance are necessary to increase length scales to allow modeling of multigranular samples, to increase time scales for fast chemical processes, and to improve confidence in simulation results by allowing first-principles calculations that can be used in their own right or to validate codes based on approximate models. Computational materials science also contributes to the discovery of new materials. Such materials are often the foundation of new industries; for example, an understanding of semiconductors led to the electronics industry, and an understanding of magnetic materials contributed to data storage.

Chemistry and materials science are key to solving some of the most pressing problems facing society today. In energy research, for example, they are used to develop cleaner fuels, new materials for solar panels, better batteries, more efficient catalysts, and chemical processes for carbon capture and sequestration. In nuclear energy alone, simulation that combines materials, fluids, and structures may be used for safety assessment, design activities, and cost and risk reduction.13 Fluid-dynamics simulations are used to make buildings, engines, planes, cars, and other devices more energy-efficient and to improve understanding of processes, such as combustion, that are fundamental to the behavior of stars, weapons, and energy production. Those simulations vary widely in computational scale, but whether they are run on personal computers or on petascale systems, the value of additional performance is universal.

Biological Sciences

The use of large-scale computation in biology is perhaps most visible in genomics, in which enormous data-analysis problems were involved in computing and mapping the human genome. Human genomics has changed from a purely science-driven field to one with commercial and personal applications as new sequence-generation systems have become a commodity and been combined with computing and storage systems that are modest by today’s standards. Companies will soon offer personalized genome analysis to the public. Genomics does not stop with the human genome, however, and is critical in analyzing and synthesizing microorganisms for fighting disease, developing better biofuels, and mitigating environmental effects. The goal is no longer to sequence a single species but to scoop organisms from a pond or ocean, from soil, or from deep underground and analyze an entire host of organisms, comparing them with other species to understand better what lives in particular environments and why.

____________________

13Horst Simon, Thomas Zacharia, and Rick Stevens, 2007, Modeling and Simulation at the Exascale for the Energy and Environment, Report on the Advanced Scientific Computing Research Town Hall Meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3), Washington, D.C.: DOE, available online at http://www.er.doe.gov/ascr/ProgramDocuments/Docs/TownHall.pdf.



At the macro level, reverse engineering the human brain and simulating complete biological systems, from individual cells to the structures and fluids of a whole human, are still enormous challenges that exceed our current reach in both understanding and computational capability. But success on smaller versions of those problems shows that progress is possible.

One of the most successful applications of computation in biology has been at the level of proteins, in understanding their structure. For example, a group of biochemical researchers14 is applying standard computer-industry technology (technology that was originally designed with, and funded by, profits from mundane consumer-electronics items) to tackle the protein-folding problem at the heart of modern drug discovery and invention. This problem has eluded even the fastest computers because of its overwhelming scale and complexity. But several decades of Moore’s law have now enabled computational machinery of such capability that the protein-folding problem is coming into range. With even faster hardware in the future, new treatment regimens tailored to individual patients, with far fewer side effects, may become feasible.

Climate-Change Science

In its 2007 report on climate change, the Intergovernmental Panel on Climate Change (IPCC) concluded that Earth’s climate would change dramatically over the next several decades.15 The report was based on millions of hours of computer simulations on some of the most powerful supercomputers in the world.

____________________

14See David E. Shaw, Martin M. Deneroff, Ron O. Dror, Jeffrey S. Kuskin, Richard H. Larson, John K. Salmon, Cliff Young, Brannon Batson, Kevin J. Bowers, Jack C. Chao, Michael P. Eastwood, Joseph Gagliardo, J. P. Grossman, Richard C. Ho, Douglas J. Lerardi, István Kolossváry, John L. Klepeis, Timothy Layman, Christine Mcleavey, Mark A. Moraes, Rolf Mueller, Edward C. Priest, Yibing Shan, Jochen Spengler, Michael Theobald, Brian Towles, and Stanley C. Wang, 2008, Anton, a special-purpose machine for molecular dynamics simulation, Communications of the ACM 51(7): 91-97.

15See IPCC, 2007, Climate Change 2007: Synthesis Report, Contribution of Working Groups I, II and III to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, eds. Core Writing Team, Rajendra K. Pachauri and Andy Reisinger, Geneva, Switzerland: IPCC. The National Research Council has also recently released three reports noting that strong evidence on climate change underscores the need for actions to reduce emissions and begin adapting to impacts (NRC, 2010, Advancing the Science of Climate Change, Limiting the Magnitude of Climate Change, and Adapting to the Impacts of Climate Change, Washington, D.C.: The National Academies Press, available online at http://www.nap.edu/catalog.php?record_id=12782; NRC, 2010, Limiting the Magnitude of Future Climate Change, Washington, D.C.: The National Academies Press, available online at http://www.nap.edu/catalog.php?record_id=12785; NRC, 2010, Adapting to the Impacts of Climate Change, available online at http://www.nap.edu/catalog.php?record_id=12783.)


The need for computer models in the study of climate change is far from over, however, and the most obvious need is to improve the resolution of the models, which previously simulated only one data point every 200 kilometers, whereas physical phenomena like clouds appear on a kilometer scale. To be useful as a predictive tool, climate models need to run at roughly 1,000 times real time, and estimates indicate that a kilometer-scale model would therefore require a 20-petaflop (10^15 floating-point operations per second) machine, which is an order of magnitude faster than the fastest machine available at this writing.16
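One way to see roughly where a number of that magnitude comes from is a back-of-the-envelope estimate; the specific parameters below are our own illustrative assumptions, not figures from the climate-modeling literature. Earth's surface is about 5 x 10^8 km^2, so a 1-kilometer grid with on the order of 100 vertical levels has roughly 5 x 10^10 cells; running 1,000 times real time with a time step of a few seconds of model time implies a few hundred time steps per wall-clock second; and a few thousand floating-point operations per cell per step is a plausible cost for the physics. Then

\[
\underbrace{5\times10^{10}}_{\text{cells}}
\times
\underbrace{2\times10^{2}}_{\text{steps per second}}
\times
\underbrace{2\times10^{3}}_{\text{flops per cell per step}}
\approx 2\times10^{16}\ \text{flops/s} = 20\ \text{petaflops}.
\]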

Resolution is only one of the problems, however, and a report based on a DOE workshop suggests that 1 exaflop (10^18 floating-point operations per second) will be needed within the next decade to meet the needs of the climate research community.17 Scientists and policy-makers need the increased computational capability to add more features, such as fully resolved clouds, and to capture the potential effects of both natural and human-induced forcing functions on the climate. They also need to understand specific effects of climate change, such as rise in sea levels, changes in ocean circulation, extreme weather events at the local and regional level, and the interaction of carbon, methane, and nitrogen cycles. Climate science is not just about prediction and observation but also about understanding how various regions might need to adapt to changes and how they would be affected by a variety of proposed mitigation strategies. Experimenting with mitigation is expensive, impractical because of time scales, and dangerous—all characteristics that call for improved climate models that can predict favorable and adverse changes in the climate and for improved computing performance to enable such simulations.

Computational Capability and Scientific Progress

The availability of large scientific instruments—such as telescopes, lasers, particle accelerators, and genome sequencers—and of low-cost sensors, cameras, and recording devices has opened up new challenges related to computational analysis of data. Such analysis is useful in a variety of domains. For example, it can be used to observe and understand physical phenomena in space, to monitor air and water quality, to develop a map of the genetic makeup of many species, and to examine alternative energy sources, such as fusion.

____________________

16See, for example, Olive Heffernan, 2010, Earth science: The climate machine, Nature 463(7284): 1014-1016, which explores the complexity of new Earth models for climate analysis.

17DOE, 2009, Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale, Workshop Report, Washington D.C., November 6-7, 2008, available online at http://www.er.doe.gov/ascr/ProgramDocuments/Docs/ClimateReport.pdf.


Scientific datasets obtained from those devices and from simulations are stored in petabyte-scale archives in scientific computing centers around the world.

The largest computational problems are often the most visible, but the use of computing devices by individual scientists and engineers is at least as important for the progress of science. Desktop and laptop machines today are an integral part of any scientific laboratory and are used for computations and data-analysis problems that would have required supercomputers only a decade ago. Individual investigators in engineering and science departments around the country use cluster computers based on networks of personal computers. The systems are often shared by a small group of researchers or by larger departments to amortize some of the investment in personnel, infrastructure, and maintenance. And systems that are shared by departments or colleges are typically equivalent in computational capability to the largest supercomputers in the world 5-10 years earlier.

Although no one problem, no matter how large, could have justified the aggregate industry investment that was expended over many years to bring hardware capability up to the required level, we can expect that whole new fields will continue to appear as long as the stream of improvements continues. As another example of the need for more performance in the sciences, the computing centers that run the largest machines have very high use rates, typically over 90 percent, and their sophisticated users study the allocation policies among and within centers to optimize the use of their own allocations. Requests for time on the machines perpetually exceed availability, and anecdotal and statistical evidence suggests that requests are limited by expected availability rather than by characteristics of the science problems to be attacked; when available resources double, so do the requests for computing time.

A slowdown in the growth in computing performance has implications for large swaths of scientific endeavor. The amount of data available and accessible for scientific purposes will only grow, and computational capability needs to keep up if the data are to be used most effectively. Without continued expansion of computing performance commensurate with both the amount of data being generated and the scope and scale of the problems scientists are asked to solve—from climate change to energy independence to disease eradication—it is inevitable that important breakthroughs and opportunities will be missed. Just as in other fields, exponential growth in computing performance has underpinned much scientific innovation. As that growth slows or stops, the opportunities for innovation decrease, and this also has implications for economic competitiveness.


THE IMPORTANCE OF COMPUTING PERFORMANCE FOR DEFENSE AND NATIONAL SECURITY

It is difficult to overstate the importance of IT and computation for defense and national security. The United States has an extremely high-technology military; virtually every aspect of it depends on IT and computational capability. To expand our capabilities and maintain strategic advantage, operational needs and economic motivators push toward an even higher-technology military and national- and homeland-security apparatus, even as many potential adversaries climb the same technology curve that we traversed. If, for whatever reason, we do not continue climbing that curve ourselves, the gap between the United States and many of its adversaries will close. This section describes several examples of how continued growth in computing performance is essential for effectiveness. The examples span homeland security, defense, and intelligence and have many obvious nonmilitary applications as well.

Military and Warfighting Needs

There has been no mystery about the efficacy of better technology in weaponry since the longbow first appeared in the hands of English archers or steel swords first sliced through copper shields. World War II drove home the importance of climb rates, shielding, speed, and armament of aircraft and ended with perhaps the most devastating display of unequal armament ever: the nuclear bomb.

The modern U.S. military is based largely on the availability of technological advantages because it must be capable of maintaining extended campaigns in multiple simultaneous theaters throughout the world, and our armed forces are much smaller than those fielded by several potential adversaries. Technology—such as better communication, satellite links, state-of-the-art weapons platforms, precision rockets launched from air, sea, and land, and air superiority—acts as a force multiplier that gives the U.S. military high confidence in its ability to prevail in any conventional fight.

Precision munitions have been a game-changer, enabling us to fight wars with far fewer collateral losses than in previous wars. No longer does the Air Force have to carpet-bomb a section of a city to ensure that the main target of interest is destroyed; instead, it can drop a precision bomb of the required size from a stealthy platform, or launch a cruise missile from an offshore ship, and take out one building on a crowded city street. Sufficiently fast computers provided such capabilities, and faster ones will improve them.

Because high technology has conferred so strong an advantage on the U.S. military for conventional warfare, few adversaries will ever consider engaging us this way. Instead, experiences in Vietnam, Afghanistan, and Iraq have been unconventional or “asymmetric”: instead of large-scale tank engagements, these wars have been conducted on a much more localized basis—between squads or platoons, not brigades or divisions.


Since Vietnam, where most of the fighting was in jungles, the venues have been largely urban, in towns where the populace is either neutral or actively hostile. In those settings, the improvised explosive device (IED) has become a most effective weapon for our adversaries.

The military is working on improving methods for detecting and avoiding IEDs, but it is also looking into after-the-fact analysis that could be aided by computer-vision autosurveillance. With remotely piloted aerial vehicles, we can fly many more observation platforms, at lower risk, than ever before. And we can put so many sensors in the air that it is not feasible to analyze all the data that they generate in a timely fashion. At least some part of the analysis must be performed at the source. The better and faster that analysis is, the less real-time human scrutiny is required, and the greater the effectiveness of the devices. A small, efficient military may find itself fighting at tempos that far exceed what was experienced in the past, and this will translate into more sorties per day on an aircraft carrier, faster deployment of ground troops, and quicker reactions to real-time information by all concerned. Coordination among the various U.S. military elements will become much more critical. Using computer systems to manage much of the information associated with these activities could offload the tedious communication and background analytic tasks and move humans’ attention to issues for which human judgment is truly required.

Training Simulations

Although often rudimentary, training simulations for the military are ubiquitous and effective. The U.S. government-sponsored first-person-shooter game America’s Army is freely downloadable and puts its players through the online equivalent (in weapons, tactics, and first aid) of the training sequence given to a raw recruit. There have been reports that the “training” in the video game America’s Army was sufficient to have enabled its players to save lives in the real world.18 The U.S. Army conducts squad-level joint video-game simulations as a research exercise. Squad tactics, communication, identification of poorly illuminated targets in houses, and overall movement are stressed and analyzed.

____________________

18Reported in Earnest Cavalli, 2008, Man imitates America’s army, saves lives, Wired.com, January 18, 2008, available online at http://www.wired.com/gamelife/2008/01/americas-army-t/. The article cites a press release from a game company Web site (The official Army game: America’s Army, January 18, 2008, available at http://forum.americasarmy.com/viewtopic.php?t=271086).


In another type of exercise, a single soldier is immersed in a simulation with computer-generated images on all four walls around him. Future training simulations could be made much more realistic, given enough computational capability, by combining accurately portrayed audio (the real sound of real weapons, with echoes, nulls, and reflections generated by computer) with ever-improving graphics. Humans who know the languages and customs of the country in which the military is engaged could be included, perhaps as avatars as in Second Life. Limited handheld language-translation devices are being tested in the field; some soldiers like them, and others report difficulty in knowing how and when to use them. Simulations can be run with the same devices so that soldiers can become familiar with their capabilities and limitations and make their use much more efficient. When training simulations become more realistic, they can do what all more accurate simulations do: reduce the need for expensive real-world realizations or increase the range and hazard-level tolerances of operations that would not be possible to train for in the real world.

Autonomous Robotic Vehicles

The Defense Advanced Research Projects Agency (DARPA) has sponsored multiple “Grand Challenge” events in which robotic vehicles compete to traverse a preset course in minimum time. The 2007 competition was in an urban setting in which the vehicles not only were required to stay on the road (and on their own side of the road) but also had to obey all laws and customs that human drivers would. The winning entry, Boss, from the Carnegie Mellon University (CMU) Robotics Institute, had a daunting array of cameras, lidars, and GPS sensors and a massive (for a car) amount of computing horsepower.19

CMU’s car (and several other competitors) finished the course while correctly identifying many tricky situations on the course, such as the arrival of multiple vehicles at an intersection with stop signs all around. (Correct answer: The driver on the right has the right of way. Unless you got there considerably earlier than that driver, in which case you do. But even if you do have an indisputable right of way, if that driver starts across the intersection, you have the duty to avoid an accident. As it turns out, many humans have trouble with this situation, but the machines largely got it right.)

CMU says that to improve its vehicle, the one thing most desired is additional computing horsepower—the more the better.

____________________

19See Carnegie Mellon Tartan Racing, available at http://www.tartanracing.org, for more information about the vehicle and the underlying technology.


According to DARPA’s vision, we appear to be within shouting distance of a robotic military supply truck, one that would no longer expose U.S. military personnel to the threats of IEDs or ambushes. The same technology is also expected to improve civilian transportation and has the potential to reduce collisions on domestic roads and highways.

Domestic Security and Infrastructure

Airport Security Screening

Terrorist groups target civilian populations and infrastructure. The events of 9/11 have sparked many changes in how security is handled, most of which involve computer-based technology. For example, to detect passenger-carried weapons—such as knives, guns, and box cutters—fast x-ray scanners and metal detectors have become ubiquitous in airports throughout the world.

But the x-ray machines are used primarily to produce a two-dimensional image of the contents of carry-on bags; the actual “detector” is the human being sitting in front of the screen. Humans are susceptible to a wide array of malfunctions in that role: they get distracted, they get tired, they get sick, and their effectiveness varies from one person to another. Although it can be argued that there should always be a human in the loop when it is humans one is trying to outsmart, it seems clear that this is an opportunity for increased computational horsepower to augment a human’s ability to identify threat patterns in the images.

In the future, one could envision such x-ray image-analysis software networking many detectors in an attempt to identify coordinated patterns automatically. Such automation could help to eliminate threats in which a coordinated group of terrorists attempts to carry onto a plane a set of objects that in isolation are nonthreatening (and would be passed by a human monitor) but in combination can be used in some dangerous way. Such a network might also be used to detect smuggling: one object might seem innocuous, but an entire set carried by multiple people and passed by different screeners might form a pattern of interest to the authorities. And a network might correlate images with weapons found by hand inspection and thus “learn” what various weapons look like when imaged, signaling an operator when a similar image appears in the future.
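
To make the networking idea concrete, the sketch below is a hypothetical illustration (the item labels, time window, and pattern table are invented, not drawn from any deployed screening system): it correlates detections from different lanes within a short time window and flags combinations of individually benign items that together match a pattern of interest.

    # Hypothetical sketch: correlating individually benign items seen at different
    # screening lanes within a time window. Item labels and patterns are invented.
    from itertools import combinations

    # (timestamp in minutes, lane, item) tuples produced by per-lane image analysis.
    detections = [
        (0.0, "lane1", "wire"),
        (3.5, "lane4", "battery"),
        (7.0, "lane2", "timer"),
        (45.0, "lane3", "wire"),
    ]

    # Combinations of benign items that together warrant a closer look.
    patterns_of_interest = [{"wire", "battery", "timer"}]

    def correlated_alerts(detections, patterns, window=15.0):
        """Return sets of detections whose items jointly match a pattern of interest
        and whose timestamps all fall within 'window' minutes of one another."""
        alerts = []
        for pattern in patterns:
            for combo in combinations(detections, len(pattern)):
                items = {item for _, _, item in combo}
                times = [t for t, _, _ in combo]
                if items == pattern and max(times) - min(times) <= window:
                    alerts.append(combo)
        return alerts

    print(correlated_alerts(detections, patterns_of_interest))
    # Flags the wire/battery/timer trio seen across three lanes within 7 minutes.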

Surveillance, Smart Cameras, and Video Analytics

A staple of nearly all security schemes is the camera, which typically feeds a real-time low-frame-rate image stream back to a security guard, whose job includes monitoring the outputs of the camera and distinguishing normal situations from those requiring action. As with airport screeners, security guards find it extremely difficult to maintain the necessary vigilance in monitoring cameras: it is extremely boring, in part because of the low prevalence of the events that they are watching for. But “boring” is what computers do best: they never tire, get distracted, show up for work with a hangover, or fight with a spouse. If a computer system could watch the outputs of cameras 3, 5, and 8 and notify a human when anything interesting happened, security could be greatly enhanced in reliability, scope, and economy.

In the current state of the art, the raw video feed from all cameras is fed directly to monitors with magnetic tape storage or digital sampling to hard drives. With the emergence of inexpensive high-definition cameras, the raw bit rates are quickly climbing well beyond the abilities of networks to transport the video to the monitors economically and beyond the capacity of storage systems to retain the information.

What is needed is for some processing to be performed in the cameras themselves. Suppose that a major retailer needs surveillance of its customer parking lot at night as an antitheft measure. Virtually all the time, the various cameras will see exactly the same scene, down to the last pixel, on every frame, hour after hour. Statistically, the only things that will change from the camera’s point of view are leaves blowing across the lot, the occasional wild animal, rain, shadows caused by the moon’s traversal of the sky, and the general light-dark changes when the sun goes down and comes up the next day. If a camera were smart enough to filter out all the normal, noninteresting events, identifying interesting events would be much easier. Although it may be desirable to carry out as much analysis at the camera as possible to reduce the network bandwidth required, the camera may not be built to dissipate much power (for example, it may have no cooling features), and this suggests another way in which power constraints come into play.
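
A minimal sketch of that kind of in-camera filtering, assuming only Python and NumPy (an assumption of this illustration, not a statement about how smart cameras are built): it maintains a slowly adapting background model and reports a frame as interesting only when enough pixels depart from that model, so gradual changes such as dusk and moon shadows are absorbed silently.

    # Minimal sketch of in-camera event filtering via background subtraction.
    # Assumes grayscale frames as NumPy arrays; a real smart camera would add
    # noise handling, shadow suppression, and day/night adaptation.
    import numpy as np

    class EventFilter:
        def __init__(self, shape, alpha=0.05, pixel_thresh=25, area_thresh=0.01):
            self.background = np.zeros(shape, dtype=np.float32)  # running background model
            self.alpha = alpha                # how quickly the background adapts
            self.pixel_thresh = pixel_thresh  # per-pixel difference that counts as change
            self.area_thresh = area_thresh    # fraction of changed pixels that counts as an event

        def is_interesting(self, frame):
            frame = frame.astype(np.float32)
            diff = np.abs(frame - self.background)
            changed = np.mean(diff > self.pixel_thresh)
            # Slowly fold the new frame into the background so gradual changes
            # (moonlight, dawn, dusk) are absorbed rather than reported.
            self.background = (1 - self.alpha) * self.background + self.alpha * frame
            return changed > self.area_thresh

    # Example with synthetic frames: a static scene, then a bright "intruder" patch.
    f = EventFilter(shape=(120, 160))
    static = np.full((120, 160), 80, dtype=np.uint8)
    for _ in range(50):                # let the model settle on the empty lot
        f.is_interesting(static)
    intruder = static.copy()
    intruder[40:80, 60:100] = 200      # something new enters the scene
    print(f.is_interesting(intruder))  # True: enough pixels changed to report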

Computer technology is only now becoming sophisticated enough at the price and power levels available to a mobile platform to perform some degree of autonomous filtering. Future generations of smart cameras will permit the networking bandwidth freed up by the camera’s innate intelligence to be used instead to coordinate observations and decisions made by other cameras and arrive at a global, aggregate situational state of much higher quality than what humans could otherwise have pieced together.

Video search is an important emerging capability in this realm. If you want to find a document on a computer system and cannot remember where it is, you can use the computer system’s search feature to help you find it. You might remember part of the file name or perhaps some key words in the document. You might remember the date of creation or the size. All those can be used by the search facility to narrow down the possibilities to the point where you can scan a list and find the document that you wanted. Faster computer systems will permit much better automated filtering and searching, and even pictures that have not been predesignated with key search words may still be analyzed for the presence of a person or item of interest.
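
For comparison, the desktop-search case just described is computationally cheap; a few lines of Python suffice to filter files by whatever partial clues a user remembers (the name fragment, keyword, and size threshold below are invented for illustration). The point of the paragraph is that video and image search demand vastly more computation than this kind of metadata matching.

    # Simple sketch of metadata- and keyword-based file search; the criteria
    # are invented for illustration.
    import os

    def find_documents(root, name_fragment=None, keyword=None, min_size=0):
        """Yield paths under 'root' matching whatever partial clues the user remembers."""
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if name_fragment and name_fragment.lower() not in name.lower():
                    continue
                if os.path.getsize(path) < min_size:
                    continue
                if keyword:
                    try:
                        with open(path, "r", errors="ignore") as fh:
                            if keyword.lower() not in fh.read().lower():
                                continue
                    except OSError:
                        continue
                yield path

    # Example: "I remember it was a budget file, fairly large, mentioning 'Q3'."
    for hit in find_documents(".", name_fragment="budget", keyword="Q3", min_size=10_000):
        print(hit)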

All the above applies to homeland security as well and could be used for such things as much improved surveillance of ship and aircraft loading areas to prevent the introduction of dangerous items; crowd monitoring at control points; and pattern detection of vehicle movements associated with bombing of facilities.20

A related technology is face recognition. It is a very short step from surveilling crowds to asking whether a particular face appears in a crowd and then to asking whether any of a list of “persons of interest” appears in the crowd. Algorithms that are moderately effective at that task already exist. Faster computer systems could potentially improve the accuracy rate by allowing more computation within a given period and could increase the speed at which a given frame can be analyzed. As with the overall surveillance problem, networked smart cameras might be able to use correlations to overcome natural-sight impediments.
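
The matching step, taken by itself, can be sketched simply if one assumes (purely for illustration) that some upstream face-analysis system reduces each detected face to a numerical feature vector, an "embedding"; comparing a face against a watchlist then reduces to a nearest-neighbor search, and faster hardware directly increases how many faces and frames can be checked per second.

    # Sketch of the watchlist-matching step, assuming face "embeddings" (numeric
    # feature vectors) are produced by some upstream face-analysis system.
    import numpy as np

    rng = np.random.default_rng(0)
    watchlist = rng.normal(size=(1000, 128))              # 1,000 persons of interest
    watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

    def best_match(face_vec, watchlist, threshold=0.8):
        """Return (index, similarity) of the closest watchlist entry, or None if
        nothing is similar enough."""
        face_vec = face_vec / np.linalg.norm(face_vec)
        sims = watchlist @ face_vec                        # cosine similarity to every entry
        i = int(np.argmax(sims))
        return (i, float(sims[i])) if sims[i] >= threshold else None

    # A face detected in the crowd that happens to be a noisy copy of entry 42.
    probe = watchlist[42] + 0.05 * rng.normal(size=128)
    print(best_match(probe, watchlist))                    # (42, ~0.99)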

Infrastructure Defense Against Automated Cyberattack

The Internet now carries a large fraction of all purchases made, so a generalized attack on its infrastructure would cause an immediate loss in sales. Much worse, however, is that many companies and other organizations have placed even their most sensitive documents online, where they are protected by firewalls and virtual private networks but online nonetheless—bits protecting other bits. A coordinated, widespread attack on the U.S. computing and network infrastructure would almost certainly be set up and initiated via the Internet, and the havoc that it could potentially wreak on businesses and government could be catastrophic. It is not out of the question that such an eventuality could lead to physical war.

____________________

20These efforts are much more difficult than it may seem to the uninitiated and, once understood by adversaries, potentially susceptible to countermeasures. For example, England deployed a large set of motorway automated cameras to detect (and deter) speeding; when a camera’s radar detected a vehicle exceeding the posted speed limit, the camera snapped a photograph of the offending vehicle and its driver and issued the driver an automated ticket. In the early days of the system’s deployment, someone noticed that if the speeding vehicle happened to be changing lanes during the critical period when the radar could have caught it, for some reason the offense would go unpunished. The new lore quickly spread throughout the driving community and led to a rash of inspired lane-changing antics near every radar camera—behavior that was much more dangerous than the speeding would have been. This was reported in Ray Massey, 2006, Drivers can avoid speeding tickets … by changing lanes, Daily Mail Online, October 15, 2006, available at http://www.dailymail.co.uk/news/article-410539/Drivers-avoid-speeding-tickets--changing-lanes.html.

The Internet was not designed with security in mind, and this oversight is evident in its architecture and in the difficulty with which security measures can be retrofitted later. We cannot simply dismantle the Internet and start over with something more secure. But as computer-system technology progresses and more performance becomes available, there will be opportunities to look for ways to trade the parallel performance afforded by the technology for improved defensive measures that will discourage hackers, help to identify the people and countries behind cyberattacks, and protect the secrets themselves better.

The global Internet can be a dangerous place. The ubiquitous connectivity that yields the marvelous wonders of search engines, Web sites, browsers, and online purchasing also facilitates identity theft, propagation of worms and viruses, ready platforms for staging denial-of-service attacks, and faceless nearly risk-free opportunities for breaking into the intellectual-property stores and information resources of companies, schools, government institutions, and military organizations. Today, a handful of Web-monitoring groups pool their observations and expertise with a few dozen university computer-science experts and many industrial and government watchdogs to help to spot Internet anomalies, malevolent patterns of behavior, and attacks on the Internet’s backbone and name-resolution facilities. As with video surveillance, the battle is ultimately human on human, so it seems unlikely that humans should ever be fully removed from the defensive side of the struggle. However, faster computers can help tremendously, especially if the good guys have much faster computing machinery than the bad guys.

Stateful packet inspection, for example, is a state-of-the-art method for detecting the presence of known virus signatures in traffic on communications networks; on detection, the offending traffic can be shunted into a quarantine area before damage is done. Port-based attacks can be identified before they are launched. The key to those mitigations is that all Internet traffic, harmful or not, must take the form of bits traversing various links of the Internet; computer systems capable of analyzing the content flowing over any given link are well positioned to eliminate a sizable fraction of threats.
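
As a toy illustration of signature-based inspection (the first byte pattern below is invented, and a production deep-packet-inspection engine would use efficient multipattern automata over reassembled, possibly stateful traffic streams rather than a simple substring scan), the sketch checks a payload against a list of known signatures and decides whether to quarantine it.

    # Toy sketch of signature-based payload inspection; the first signature is
    # invented, and a production engine would use a multipattern automaton
    # (e.g., Aho-Corasick) over reassembled streams at line rate.
    KNOWN_SIGNATURES = [
        b"\x4d\x5a\x90\x00evil",    # invented byte pattern
        b"X5O!P%@AP[4\\PZX54(P^)",  # fragment of the standard EICAR test string
    ]

    def inspect(payload: bytes):
        """Return the first matching signature, or None if the payload looks clean."""
        for sig in KNOWN_SIGNATURES:
            if sig in payload:
                return sig
        return None

    def handle_packet(payload: bytes):
        hit = inspect(payload)
        if hit is not None:
            return "quarantine"     # shunt aside before the content reaches its destination
        return "forward"

    print(handle_packet(b"GET /index.html HTTP/1.1\r\n"))     # forward
    print(handle_packet(b"...X5O!P%@AP[4\\PZX54(P^)..."))     # quarantine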

Data Analysis for Intelligence

Vast amounts of unencrypted data are generated outside the intelligence agencies and are available in the open for strategic data-mining. Continued performance improvements are needed if the agencies are to garner useful intelligence from those raw data. There is a continuing need to analyze satellite images for evidence of military and nuclear buildups, evidence of emerging droughts or other natural disasters, evidence of terrorist training camps, and so on. The National Security Agency and the National Reconnaissance Office are known to operate some of the largest computer complexes in the world, yet the complexity of the data that they store and process and of the questions that they are asked to address is substantial. Increasing amounts of computational horsepower are needed not only to meet their mission objectives but also to maintain an advantage over adversaries.

Nuclear-Stockpile Stewardship

In the past, the reliability of a nuclear weapon (the probability that it detonates when commanded to do so) and its safety (the probability that it does not detonate otherwise) were established largely with physical testing. Reliability tests detonated sample nuclear weapons from the stockpile, and safety tests subjected sample nuclear weapons to extreme conditions (such as fire and impact) to verify that they did not detonate under such stresses. However, for a variety of policy reasons, the safety and reliability of the nation’s nuclear weapons are today established largely with computer simulation, and the data from nonnuclear laboratory experiments are used to validate the computer models.

The simulation of a nuclear weapon is computationally extremely demanding in both computing capability and capacity. The already daunting task is complicated by the need to simulate the effects of aging. A 2003 JASON report21 concluded that at that time there were gaps in both capability and capacity in fulfilling the mission of stockpile stewardship—ensuring nuclear-weapon safety and reliability.

Historically, the increase in single-processor performance played a large role in providing increased computing capability and capacity to meet the increasing demands of stockpile stewardship. In addition, parallelism has been applied to the problem, so the rate of increase in performance of the large machines devoted to the task has been greater than called for by Moore’s law because the number of processors was increased at the same time that single-processor performance was increasing. The largest of the machines today have over 200,000 processors and LINPACK benchmark performance of more than 1,000 Tflops.22

____________________

21Roy Schwitters, 2003, Requirements for ASCI, JSR-03-330, McLean, Va.: The MITRE Corporation.

22For a list of the 500 most powerful known computer systems in the world, see “Top 500,” available online at http://www.absoluteastronomy.com/topics/TOP500.

The end of single-processor performance scaling makes it difficult for those “capability” machines to continue scaling at historical rates and thus to meet the projected increases in the demands of nuclear-weapon simulation. The end of single-processor scaling has also made the energy and power demands of future capability systems problematic, as described in the recent DARPA ExaScale computing study.23 Furthermore, the historical increases in consumer-market demand for computing hardware and software have driven down costs and increased software capabilities for military and science applications. If the consumer market suffers, the demands of science and military applications are not likely to be met.

THE IMPORTANCE OF COMPUTING PERFORMANCE FOR CONSUMER NEEDS AND APPLICATIONS

The previous two sections offered examples of where growth in computing performance has been essential for science, defense, and national security. The growth has also been a driver for individuals using consumer-oriented systems and applications. Two recent industry trends have substantially affected end-user computational needs: the increasing ubiquity of digital data and growth in the population of end users who are not technically savvy. Sustained growth in computing performance serves not only broad public-policy objectives, such as a strong defense and scientific leadership, but also the current and emerging needs of individual users.

The growth in computing performance over the last 4 decades—impressive though it has been—has been dwarfed over the last decade or so by the growth in digital data.24 The amount of digital data is growing more rapidly than ever before. The volumes of data now available outstrip our ability to comprehend them, much less take maximum advantage of them. According to the How Much Information project at the University of California, Berkeley,25 print, film, magnetic, and optical storage media produced about 5 exabytes (EB) of new information in 2003. Furthermore, the information explosion is accelerating. The market-research firm IDC estimated that 161 EB of digital content was created in 2006 and projected that the figure would rise to 988 EB by 2010. To handle so much information, people will need systems that can help them to understand the available data. We need computers to see data the way we do, identify what is useful to us, and assemble it for our review or even process it on our behalf. This growing end-user need is the primary force behind the radical and continuing transformation of the Web as it shifts its focus from presenting data to end-users to processing data automatically on their behalf.26 The data avalanche and the consequent transformation of the Web’s functionality require increasing sophistication in data-processing and hence additional computational capability to reason automatically in real time, so that we can understand and interpret structured and unstructured collections of information via, for example, sets of dynamically learned inference rules.

____________________

23Peter Kogge, Keren Bergman, Shekhar Borkar, Dan Campbell, William Carlson, William Dally, Monty Denneau, Paul Franzon, William Harrod, Kerry Hill, Jon Hiller, Sherman Karp, Stephen Keckler, Dean Klein, Robert Lucas, Mark Richards, Al Scarpelli, Steven Scott, Allan Snavely, Thomas Sterling, R. Stanley Williams, and Katherine Yelick, 2008, ExaScale Computing Study: Technology Challenges in Achieving Exascale Systems, Washington, D.C.: DARPA. Available online at http://www.er.doe.gov/ascr/Research/CS/DARPA%20exascale%20-%20hardware%20(2008).pdf.

24A February 2010 report observed that “quantifying the amount of information that exists in the world is hard. What is clear is that there is an awful lot of it, and it is growing at a terrific rate (a compound annual 60%) that is speeding up all the time. The flood of data from sensors, computers, research labs, cameras, phones and the like surpassed the capacity of storage technologies in 2007” (Data, data, everywhere: A special report on managing information, The Economist, February 25, 2010, available online at http://www.economist.com/displaystory.cfm?story_id=15557443).
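
The IDC figures quoted above imply a compound annual growth rate of roughly 57 percent, consistent with the approximately 60 percent rate cited in footnote 24; the arithmetic is easy to verify:

    # Checking the growth rate implied by the IDC figures quoted above.
    created_2006_eb = 161      # exabytes of digital content created in 2006
    forecast_2010_eb = 988     # IDC forecast for 2010

    years = 2010 - 2006
    cagr = (forecast_2010_eb / created_2006_eb) ** (1 / years) - 1
    print(f"Implied compound annual growth rate: {cagr:.0%}")   # about 57%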

A computer’s ability to perform a huge number of computations per second has enabled many applications that have an important role in our daily lives.27 An important subset of applications continues to push the frontiers of very high computational needs. Examples of such applications are these:

  • Digital content creation—allows people to express creative skills and be entertained through various modern forms of electronic arts, such as animated films, digital photography, and video games.
  • Search and mining—enhances a person’s ability to search and recall objects, events, and patterns well beyond the natural limits of human memory by using modern search engines and the ever-growing archive of globally shared digital content.
  • Real-time decision-making—enables growing use of computational assistance for various complex problem-solving tasks, such as speech transcription and language translation.
  • Collaboration technology—offers a more immersive and interactive 3D environment for real-time collaboration and telepresence.
  • Machine-learning algorithms—filter e-mail spam, supply reliable telephone-answering services, and make book and music recommendations.

____________________

25See Peter Lyman and Hal R. Varian, 2003, How much information?, available online at http://www2.sims.berkeley.edu/research/projects/how-much-info-2003/index.htm, last accessed November 2, 2010.

26See, for example, Tim Berners-Lee’s 2007 testimony to the U.S. Congress on the future of the World Wide Web, “The digital future of the United States. Part I: The future of the World Wide Web,” Hearings before the Subcommittee on Telecommunications and the Internet of the Committee on Energy and Commerce, 110th Congress, available at http://dig.csail.mit.edu/2007/03/01-ushouse-future-of-the-web.html, last accessed November 2, 2010.

27Of course, a computer system’s aggregate performance may be limited by many things: the nature of the workload itself, the CPU’s design, the memory subsystem, input/output device speeds and sizes, the operating system, and myriad other system aspects. Those and other aspects of performance are discussed in Chapter 2.

Computers have become so pervasive that the vast majority of end-users are not computer aficionados or system experts; rather, they are experts in some other field or discipline, such as science, art, education, or entertainment. The shift has challenged the efficiency of human-computer interfaces. There has always been an inherent gap between a user’s conceptual model of a problem and a computer’s model of the problem. However, given the change in the demographics of computer users, the need to bridge the gap is now more acute than ever before. The increased complexity of common end-user tasks (“find a picture like this” rather than “add these two numbers”) and the growing need to offer an effective interface to a non-computer-expert user at a higher level of object semantics (for example, presenting not a Fourier-transform data dump of a flower image but a synthesized, realistic visual of a flower) have together increased the computational capability needed to provide real-time responses to user actions.

Bridging the gap would be well served by computers that can deal with natural user inputs, such as speech and gestures, and output content in a visually rich form close to that of the physical world around us. A typical everyday problem requires multiple iterations of execute and evaluate between the user and the computer system. Each such iteration normally narrows the original modeling gap, and this in turn requires additional computational capability. The larger the original gap, the more computation is needed to bridge it. For example, some technology-savvy users working on an image-editing problem may iterate by editing a low-level machine representation of an image, whereas a more typical end-user may interact only at the level of a photo-real output of the image with virtual brushes and paints.

Thanks to sustained growth in computing performance over the years, more effective computer-use models and visually rich human-computer interfaces are introducing new potential ways to bridge the gap. An alternative to involving the end-user in each iteration is to depend on a computer’s ability to refine model instances by itself and to nest multiple iterations of such an analytics loop for each iteration of a visual computing loop involving an end-user. Such nesting allows a reduction in the number of interactions between a user and his or her computer and therefore an increase in the system’s efficiency or responsiveness. However, it also creates the need to sustain continued growth in computational performance so that a wider variety of more complex tasks can be simulated and solved in real time for the growing majority of end-users. Real-time physical and behavioral simulation of even simple daily-life objects or events (such as water flow, the trajectory of a ball in a game, or the summarizing of a text) is a surprisingly computationally expensive task and requires multiple iterations or the solution of a large number of subproblems derived from decomposition of the original problem.
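
The nesting just described can be pictured schematically. The Python skeleton below is purely illustrative; the refine and score functions are placeholders, not any particular algorithm. It shows why a handful of user-visible iterations can hide a much larger number of machine iterations, and hence a much larger computational load.

    # Illustrative skeleton of a visual-computing loop (user in the loop) wrapping
    # an analytics loop (machine-only refinement). The refine/score functions are
    # placeholders, not any particular algorithm.

    def refine(model, goal):
        """One automatic refinement step: nudge the model toward the goal."""
        return model + 0.5 * (goal - model)

    def score(model, goal):
        """Smaller is better: how far the model is from what the user wants."""
        return abs(goal - model)

    def analytics_loop(model, goal, tolerance=0.01, max_steps=100):
        """Inner loop: many machine iterations per single user interaction."""
        steps = 0
        while score(model, goal) > tolerance and steps < max_steps:
            model = refine(model, goal)
            steps += 1
        return model, steps

    def visual_computing_loop(user_goals, model=0.0):
        """Outer loop: each user review triggers a full inner analytics loop."""
        total_machine_steps = 0
        for goal in user_goals:          # each entry stands for one user review/edit
            model, steps = analytics_loop(model, goal)
            total_machine_steps += steps
        return model, total_machine_steps

    # Three user interactions hide dozens of machine refinement steps.
    model, machine_steps = visual_computing_loop([10.0, 7.5, 8.0])
    print(round(model, 3), machine_steps)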

Computationally intensive consumer applications include virtual-world simulations and immersive social networking, video karaoke (and other sorts of real-time video interaction), remote education and training that require simulation, and telemedicine (including interventional medical imaging).28

THE IMPORTANCE OF COMPUTING PERFORMANCE FOR ENTERPRISE PRODUCTIVITY

Advances in computing technology in the form of more convenient communication and sharing of information have favorably affected the productivity of enterprises. Improved communication and sharing have been hallmarks of computing from the earliest days of time-sharing in corporate or academic environments to today’s increasingly mobile, smartphone-addicted labor force. Younger employees in many companies today can hardly recall business processes that did not make use of e-mail, chat and text messaging, group calendars, internal Web resources, blogs, wiki toolkits, audio and video conferencing, and automated management of workflow. At the same time, huge improvements in magnetic storage technology, particularly for disk drives, have made it affordable to keep every item of an organization’s information accessible on line. Individual worker productivity is not the only aspect of an enterprise that has been affected by continued growth in computing performance. The ability of virtually every sort of enterprise to use computation to understand data related to its core lines of business—sometimes referred to as analytics—has improved dramatically as computer performance has increased over the years. In addition, massive amounts of data and computational capability accessible on the Internet have increased the demand for Web services, or “software as a service,” in a variety of sectors. Analytics and the implications of Web services for computing performance needs are discussed below.

____________________

28For more on emerging applications and their need for computational capability, see Justin Rattner, 2009, The dawn of terascale computing, IEEE Solid-State Circuits Magazine 1(1): 83-89.

Analytics

Increases in computing capability and efficiency have made it feasible to perform deep analysis of numerous kinds of business data—not just off line but increasingly in real time—to obtain better input into business decisions.29 Computerized interactions between organizations have created more efficient end-to-end manufacturing processes through the use of supply-chain management systems that optimize inventories, expedite product delivery, and reduce exposure to varying market conditions.

In the past, real-time business performance needs were dictated mostly by transaction rates. Analytics (which can be thought of as computationally enhanced decision-making) were mostly off line. The computational cost of actionable data-mining was too high to be of any value under real-time use constraints. However, the growth in computing performance has now made real-time analytics affordable for a larger class of enterprise users.

____________________

29IBM’s Smart Analytics System, for example, is developing solutions aimed at retail, insurance, banking, health care, and telecommunication. For more information see the IBM Smart Analytics System website, available online at http://www-01.ibm.com/software/data/infosphere/smart-analytics-system/.

One example is medical-imaging analytics. Over the last 2 decades, unprecedented growth has taken place in the amount and complexity of digital medical-image data collected on patients in standard medical practice. The clinical necessity to diagnose diseases accurately and develop treatment strategies in a minimally invasive manner has mandated the development of new image-acquisition methods, high-resolution acquisition hardware, and novel imaging modalities. Those requirements have placed substantial computational burdens on the ability to use the image information synergistically. With the increase in the quality and utility of medical-image data, clinicians are under increasing pressure to generate more accurate diagnoses or therapy plans. To meet the needs of the clinician, the imaging-research community must provide real-time (or near real-time) high-volume visualization and analyses of the image data to optimize the clinical experience. Today, nearly all use of computation in medical imaging is limited to “diagnostic imaging.” However, with sufficient computational capability, it is likely that real-time medical interventions could become possible. The shift from diagnostic imaging to interventional imaging can usher in a new era in medical imaging. Real-time medical analytics can guide medical professionals, such as surgeons, in their tasks. For example, surface extractions from volumetric data coupled with simulations of various what-if scenarios accomplished in real time offer clear advantages over basic preoperative planning scenarios.
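
As one hedged illustration of the surface-extraction step mentioned above (assuming a Python environment with NumPy and scikit-image, which this report does not prescribe), an isosurface can be pulled out of a volumetric scan with a marching-cubes routine. Doing this at interactive rates on full-resolution clinical volumes, and coupling it with what-if simulation, is where the computational demand arises.

    # Hedged sketch of surface extraction from a volumetric scan, assuming NumPy
    # and scikit-image are available; a clinical pipeline would operate on much
    # larger volumes, in real time, and feed the mesh into simulation.
    import numpy as np
    from skimage import measure

    # Synthetic 64^3 "scan": a bright sphere standing in for an organ of interest.
    grid = np.linspace(-1, 1, 64)
    x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
    volume = (x**2 + y**2 + z**2 < 0.5**2).astype(np.float32)

    # Extract the isosurface at the tissue/background boundary.
    verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
    print(f"{len(verts)} vertices, {len(faces)} triangles")  # mesh ready for rendering/simulation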

Web Services

In the last 15 years, the Internet and the Web have had a transformational effect on people’s lives. That effect has been enabled by two concurrent and interdependent phenomena: the rapid expansion of Internet connectivity, particularly high-speed Internet connections, and the emergence of several extraordinarily useful Internet-based services. Web search and free Web-based e-mail were among the first such services to explode in popularity, and their emergence and continuous improvement have been made possible by dramatic advances in computing performance, storage, and networking technologies. Well beyond text, Web-server data now include videos, photos, and various other kinds of media. Users—individuals and businesses—increasingly need information systems to see data the way they do, identify what is useful, and assemble it for them. The ability to have computers understand the data and help us to use them in various enterprise endeavors could have enormous benefits. As a result, the Web is shifting its focus from presenting data to end-users to processing data automatically on their behalf. Finding preferred travel routes that take real-time traffic feeds into account and the rapid growth of program trading are examples of real-time decision-making.

Consider Web search as an example. A Web search service’s fundamental task is to take a user’s query, traverse data structures that are effectively proportional in size to the total amount of information available on line, and decide how to select from among possibly millions of candidate results the handful that would be most likely to match the user’s expectation. The task needs to be accomplished in a few hundred milliseconds in a system that can sustain a throughput of several thousand requests per second. This and many other Web services are offered free and rely on on-line advertisement revenues, which, depending on the service, may bring only a few dollars for every thousand user page views. The computing system that can meet those performance requirements needs to be not only extremely powerful but also extremely cost-efficient so that the business model behind the Internet service remains viable.
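
The lookup step at the heart of such a service is typically an inverted-index traversal. The toy sketch below uses invented documents and a deliberately crude ranking (a production engine shards its index across thousands of machines and applies far richer scoring), but it shows why the work grows with both the size of the repository and the volume of queries.

    # Toy inverted-index search; documents are invented, and a real engine would
    # shard the index across thousands of machines and use far richer ranking.
    from collections import defaultdict

    documents = {
        1: "parallel computing performance growth",
        2: "growth of digital data and web services",
        3: "computing performance for web search services",
    }

    # Build the inverted index: term -> set of document ids containing it.
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in text.split():
            index[term].add(doc_id)

    def search(query, top_k=2):
        """Score documents by how many query terms they contain; return the best few."""
        scores = defaultdict(int)
        for term in query.split():
            for doc_id in index.get(term, ()):
                scores[doc_id] += 1
        ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
        return ranked[:top_k]

    print(search("computing performance"))   # documents 1 and 3 each match both terms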

The appetite of Internet services for additional computing performance doesn’t appear to have a foreseeable limit. A Web search can be used to illustrate that, although a similar rationale could be applied to other types of services. Search-computing demands fundamentally grow in three dimensions: data-repository increases, search-query increases, and service-quality improvements. The amount of information currently indexed by search engines, although massive, is still generally considered a fraction of all on-line content even while the Web itself keeps expanding. Moreover, there are still several non-Web data sources that have yet to be added to the typical Web-search repositories (such as printed media). Universal search,30 for example, is one way in which search-computing demands can dramatically increase as all search queries are simultaneously sent to diverse data sources. As more users go online or become more continuously connected to the Internet through better wireless links, traffic to useful services would undergo further substantial increases.

In addition to the amount of data and the types of queries, increases in the quality of the search product invariably cause more work to be performed on behalf of each query. For example, better results for a user’s query can often be obtained by also searching for common synonyms or plurals of the original query terms. To achieve the better results, one needs to perform multiple repository lookups for the combinations of variations and pick the best results among them, a process that can easily increase the computing demands of each query by substantial factors.
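
The multiplication of work is easy to see in a sketch. In the illustration below the synonym table is invented and lookup is a placeholder for whatever index traversal a given engine performs (for instance, the toy search sketched earlier); a single two-term query already fans out into nine repository lookups.

    # Sketch of query expansion: each synonym or plural variant is another
    # repository lookup. The synonym table is invented; 'lookup' stands in for
    # whatever index traversal a given engine performs.
    from itertools import product

    SYNONYMS = {                      # invented expansion table
        "car": ["car", "cars", "automobile"],
        "review": ["review", "reviews", "rating"],
    }

    def expand(query):
        """Return every combination of per-term variants for the query."""
        variants_per_term = [SYNONYMS.get(t, [t]) for t in query.split()]
        return [" ".join(combo) for combo in product(*variants_per_term)]

    def lookup(variant):
        """Placeholder for one full index traversal."""
        return []                     # a real engine returns ranked results here

    query = "car review"
    variants = expand(query)
    results = [lookup(v) for v in variants]   # one traversal per variant
    print(len(variants), "lookups instead of 1:", variants)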

In some cases, substantial service-quality improvements will demand improvements in computing performance along multiple dimensions simultaneously. For example, the Web would be much more useful if there were no language barriers; all information should be available in every existing language, and this might be achievable through machine-translation technology at a substantial processing cost. The cost would come both from the translation step itself, because accurate translations require very large models or learning over large corpora, and from the increased amount of information that then becomes available for users of every language. For example, a user search in Italian would traverse not only Italian-language documents but potentially documents in every language available to the translation system. The benefits to society at large from overcoming language barriers would arguably rival those of any other single technological achievement in human history, especially if they extended to real-time speech-to-speech systems.

____________________

30See Google’s announcement: “Google begins move to universal search: Google introduces new search features and unveils new homepage design,” Press Release, Google.com, May 16, 2007, available online at http://www.google.com/intl/en/press/pressrel/universalsearch_20070516.html.

The prospect of mobile computing systems—such as cell phones, vehicle computers, and media players—that are increasingly powerful, ubiquitous, and interconnected adds another set of opportunities for better computing services that go beyond simply accessing the Web on more devices. Such devices could act as sensors and provide a rich set of data about their environment that, once aggregated, could be used for real-time disaster response, traffic-congestion relief, and as-yet-unimagined applications. An early example of the potential use of such systems is illustrated in a recent experiment conducted by the University of California, Berkeley, and Nokia in which cell phones equipped with GPS units were used to provide data for a highway-conditions service.31
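
A minimal sketch of the aggregation idea behind that experiment (the road-segment names and speed readings below are invented): individual phones report position-stamped speeds, and a service reduces them to per-segment averages that can feed a live traffic map.

    # Minimal sketch of aggregating phone-reported speeds into per-road-segment
    # averages; segment names and readings are invented.
    from collections import defaultdict

    # (road_segment, speed_mph) reports from GPS-equipped phones.
    reports = [
        ("I-80 westbound mile 12", 62),
        ("I-80 westbound mile 12", 58),
        ("I-80 westbound mile 13", 17),
        ("I-80 westbound mile 13", 12),
        ("I-80 westbound mile 13", 15),
    ]

    def segment_speeds(reports):
        """Average the reported speeds for each road segment."""
        totals = defaultdict(lambda: [0.0, 0])
        for segment, speed in reports:
            totals[segment][0] += speed
            totals[segment][1] += 1
        return {seg: total / count for seg, (total, count) in totals.items()}

    for segment, avg in segment_speeds(reports).items():
        print(f"{segment}: {avg:.0f} mph")   # mile 13 is clearly congested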

More generally, the unabated growth in digital data, although still a challenge to manage and sift, has in many cases now reached a volume large enough to have radical computing implications.32 Such huge amounts of data will be especially useful for a class of problems that have so far defied analytic formulation and must instead rely on statistical, data-driven approaches. In the past, because of insufficiently large datasets, those problems have had to rely on various, sometimes questionable, heuristics. Now, the digital-data volume for many of the problems has reached a level sufficient to support statistical approaches. Using statistical approaches for this class of problems presents an unprecedented opportunity in the history of computing: the intersection of massive data with massive computational capability.

In addition to the possibility of solving problems that have heretofore been intractable, the massive amounts of data that are increasingly available for analysis by small and large businesses offer the opportunity to develop new products and services based on that analysis. Services can be envisioned that automate the analysis itself so that the businesses do not have to climb this learning curve. The machine-learning community has many ideas for quasi-intelligent automated agents that can roam the Web and assemble a much more thorough status of any topic at a much deeper level than a human has time or patience to acquire. Automated inferences can be drawn that show connections that have heretofore been unearthed only by very talented and experienced humans.

On top of the massive amounts of data being created daily and all that portends for computational needs, the combination of three elements has the potential to deliver a massive increase in real-time computational resources targeted toward end-user devices constrained by cost and power:

  • Clouds of servers.
  • Vastly larger numbers of end-user devices, consoles, and various form-factor computing platforms.
  • The ubiquitous connectivity of computing equipment over a service-oriented infrastructure backbone.

____________________

31See the University of California, Berkeley, press release about this experiment (Sarah Yang, 2008, Joint Nokia research project captures traffic data using GPS-enabled cell phones, Press Release, UC Berkeley News, February 8, 2008, available online at http://berkeley.edu/news/media/releases/2008/02/08_gps.shtml).

32Wired.com ran a piece in 2008 declaring “the end of science”: “The Petabyte Age: Because more isn’t just more—more is different,” Wired.com, June 23, 2008, available online at http://www.wired.com/wired/issue/16-07.

The primary technical challenge in taking advantage of those resources lies in software. Specifically, innovation is needed to enable the discovery of the computing needs of the various functional components of a specific service offering. Such discovery is best done adaptively and under the real-time constraints of available computing bandwidth at the client and server ends, network bandwidth, and latency. On-line games, such as Second Life, and virtual-world simulations, such as Google Earth, are examples of such services. The services involve judicious decomposition of computing needs over public client-server networks to produce an interactive, visually rich end-user experience. The realization of such a vision of connected computing will require not only increased computing performance but also standardization of network software layers. Standardization should make it easy to build and share unstructured data and application programming interfaces (APIs) and enable ad hoc and innovative combinations of various service offerings.
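
One way to picture the adaptive-placement decision described above is as a small cost model. The formula and numbers below are assumptions made for this sketch, not a published method: estimate the time to run a component on the device versus shipping its data to a server and back, and choose the cheaper option as bandwidth and latency change.

    # Illustrative cost model for deciding where to run a component; the formula
    # and numbers are assumptions for the sketch, not a published method.

    def remote_time(data_mb, bandwidth_mbps, latency_s, server_compute_s):
        """Round-trip cost of shipping the work to a server."""
        transfer_s = (data_mb * 8) / bandwidth_mbps
        return 2 * latency_s + transfer_s + server_compute_s

    def choose_placement(local_compute_s, data_mb, bandwidth_mbps, latency_s,
                         server_compute_s):
        remote = remote_time(data_mb, bandwidth_mbps, latency_s, server_compute_s)
        return ("server", remote) if remote < local_compute_s else ("device", local_compute_s)

    # A heavy rendering step: slow on the phone, cheap on the server, modest data.
    print(choose_placement(local_compute_s=2.0, data_mb=1.0,
                           bandwidth_mbps=20.0, latency_s=0.05,
                           server_compute_s=0.1))   # ('server', 0.6)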

In summary, computing in a typical end-user’s life is undergoing a momentous transformation: from useful but nonessential software and products to the foundation for vital services that are relied on around the clock and delivered by tomorrow’s enterprises.
