The Potential Impact of High-End Capability Computing on Four Illustrative Fields of Science and Engineering

3 The Potential Impact of HECC in the Atmospheric Sciences

INTRODUCTION

Weather and climate events, and now long-term environmental change, are increasingly significant in both private and public decision making. We require warnings of severe and damaging weather, predictions of the tracks and intensity of hurricanes and winter storms, and longer-term forecasts about seasonal variations. Extreme event theory has been extended into a statistical characterization of droughts, floods, heat waves, and other environmental forces. In recent years we have sought to foresee the global changes that might be in progress, anthropogenic or otherwise. The skill and reliability of forecasts have increased markedly since the advent of weather radar, Earth-observing satellites, and powerful computers. Today tornadoes are almost always seen on radar in time to issue warnings, and satellite images portray the growth and trajectory of hurricanes and other storms. To look beyond a few hours, we combine powerful computers with the relevant laws of physics, converted into mathematical models, to predict how the observed present state of the global atmosphere will evolve in the hours, days, months, and years ahead. The significance of the decisions and planning that depend on weather and climate forecasts and simulations justifies efforts to make dramatic improvement. We expect that this improvement will be achieved as the numerical models that portray events in the atmosphere and ocean and on the land surface gain significantly in their resolution and sophistication. Detailed calculation of the feedbacks in the climate system, including the carbon cycle and other elemental cycles, is required for accurate climate prediction.
Numerical weather forecasting in particular is approaching the ability to take account of local features (for example, lakes, ridges) and local variations in atmospheric moisture content, and the successful modeling of atmospheric variables on these local scales should lead to a significant improvement in forecast quality. The importance to society of weather and climate information is evident from the significant investments in surface and balloon observations and in radar and satellite observing systems. The observations themselves provide insight into present conditions and expectations for the next few hours, but the larger value of those observations is realized with the computer models that turn them into longer-range
forecasts. This forecast process depends on scientific understanding of complex physical and chemical processes in the atmosphere and detailed simulation of momentum, heat, energy, and molecular exchange between the land, ocean, and atmosphere. The radiant energy from the Sun is converted into thermal energy and then into the kinetic energy of winds and storms, all involving a highly nonlinear exchange of energy among small- and large-scale phenomena and a continuing exchange of water and chemical constituents between the atmosphere, oceans, and land surface. Because of these continuous exchanges, it is difficult to talk about atmospheric science without also considering ocean, land, and ice. Skill in atmospheric prediction builds on scientific understanding converted into the mathematical expressions that become a numerical model of Earth and its atmosphere, oceans, and land surface, including detailed treatments of land and ocean biogeochemistry as well as the radiative impacts of atmospheric aerosols and gas-phase chemistry. The ocean is the planet’s reservoir of thermal energy and water, the land surface alters energy fluxes, and plants in both the ocean and on land maintain the oxygen balance. The atmosphere is the high-speed transport system that links them together and drives toward a thermodynamic equilibrium that is never attained. The central task facing atmospheric scientists is to unite sufficiently powerful science and sufficiently powerful computers to create a numerical counterpart of Earth and its atmosphere with the similitude required to manage weather, climate, and environmental risk with confidence in the years ahead. This chapter identifies the major frontier challenges that the atmospheric sciences are attacking in order to realize that central task.
In order to identify those challenges, the committee relied on several recent reports, including the following:

- European Centre for Medium-Range Weather Forecasts, 2006, ECMWF Strategy 2006-2015. Available at http://www.ecmwf.int/about/programmatic/2006/index.html.
- Intergovernmental Panel on Climate Change (IPCC), 2007, Climate Change 2007: The Physical Science Basis, Working Group I report, Cambridge University Press.
- National Research Council (NRC), 1998, The Atmospheric Sciences Entering the Twenty-first Century, Washington, D.C.: National Academy Press.
- NRC, 2000, From Research to Operations in Weather Satellites and Numerical Weather Prediction: Crossing the Valley of Death, Washington, D.C.: National Academy Press.
- NRC, 2001, Effectiveness of U.S. Climate Modeling, Washington, D.C.: National Academy Press.
- NRC, 2002, Abrupt Climate Change: Inevitable Surprises, Washington, D.C.: National Academy Press.
- National Science Foundation (NSF), 2000, NSF Geosciences Beyond 2000: Understanding and Predicting Earth’s Environment and Habitability. Available at http://www.nsf.gov/geo/adgeo/geo2000.jsp.
- University Corporation for Atmospheric Research (UCAR), 2005, Establishing a Petascale Collaboratory for the Geosciences: Scientific Frontiers and Establishing a Petascale Collaboratory for the Geosciences: Technical and Budgetary Prospectus, Technical Working Group and Ad Hoc Committee for a Petascale Collaboratory for the Geosciences. Available at http://www.geo-prose.com/projects/petascale_tech.html.

Members of the committee also consulted the peer-reviewed literature in the atmospheric sciences, especially the references listed at the end of this chapter. The committee had extensive discussion with invited guests in a daylong workshop, details of which are included in Appendix B. Further discussions were undertaken with the directors and senior executives of the National Centers for Environmental
Prediction (NCEP) of the National Weather Service and the Geophysical Fluid Dynamics Laboratory (GFDL), both of which are part of the National Oceanic and Atmospheric Administration, and with the director of the European Centre for Medium-Range Weather Forecasts (ECMWF). Particular assistance was received from Louis Uccellini, Ben Kyger, and Steve Lord from NCEP, Brian Gross from GFDL, Dominique Marbouty of the ECMWF, James Hack from the National Center for Atmospheric Research (NCAR), and other colleagues. Special thanks go to Jeremy D. Ross, Storm Exchange, Inc., for assistance with the discussion of challenges and techniques of high-resolution mesoscale modeling.

MAJOR CHALLENGES IN THE ATMOSPHERIC SCIENCES

This section identifies the major challenges facing the atmospheric sciences. Each challenge is given one of three rankings, representing the committee’s consensus on the degree to which advances in HECC will impact progress. The highest ranking indicates that progress would immediately accelerate if advances in HECC were available; other resources might be partially limiting, too, such as field or laboratory instruments, personnel, or funds for field programs. The middle ranking indicates that HECC is currently playing or will soon play a key role, but that other factors, such as immaturity of the models or data sets, are more limiting because they prevent advances in HECC from having an immediate impact. The lowest ranking indicates that HECC will probably not be a limiting resource within 5 years because the models, data sets, and theories are not fully mature. Even for the challenges given the lowest ranking, however, some computer-intensive modeling or data analysis activities are already under way. In the brief discussion that accompanies each challenge, references are made to the physical or mathematical aspects of the problem that necessitate attacking it with HECC.
These aspects are described in more detail in the section on computational challenges in the atmospheric sciences.

Major Challenge 1: Extend the Range, Accuracy, and Utility of Weather Prediction

Many sectors of the economy and the public at large have come to depend on accurate forecasting of day-to-day weather. Examples include agriculture, transportation, energy, construction, and recreation. Over the last 50 years, the accuracy and range of weather forecasting have steadily improved, owing in equal measure to underlying improvements in

- Physical models of the atmosphere;
- Algorithms for solving partial differential initial- and boundary-value problems;
- Coverage and quality of data for model initialization;
- Algorithms for model initialization; and
- Computers that are faster and have larger memories, allowing higher spatial resolution and running of ensembles of models.

These advances are the result of massive, sustained research efforts in atmospheric physics and dynamics, instrumentation, the mathematics of chaos, and several areas of data filtering and numerical analysis, all of them capitalizing on concurrent progress in supercomputers themselves. While today’s weather forecasts are good by previous standards, several serious limitations can nonetheless be noted:

- Even a short-term forecast for one or two days out may occasionally miss a significant weather event or misjudge its strength or trajectory.
- Beyond five days, the reliability (and therefore the utility) of weather forecasts decreases rapidly.
- Quantitative precipitation forecasts are not as skillful as forecasts for other aspects of weather—wind, pressure, and temperature.
- Certain types of severe weather—namely, those triggered by instabilities—are poorly predicted.
- Most weather forecasts do not provide truly local information—down to the scale of a few kilometers—which is the level of specificity needed by some human activities.

While some forecast improvement could be achieved fairly readily just by upgrading the computing platforms, addressing the limitations just listed would require progress on multiple fronts, such as theory, modeling of small-scale phenomena (especially moisture processes), observations, data assimilation and exploitation, and understanding of atmospheric chemistry. The role that HECC is expected to play in this progress is described in the next section.

Major Challenge 2: Improve Understanding and Timely Prediction of Severe Weather, Pollution, and Climate Events

The need for accurate weather forecasts becomes especially acute when natural hazards threaten. At least three types of natural hazard arise in the atmosphere:

- Severe storms (tornadoes, floods, hurricanes, snowstorms, and ice storms);
- Severe climate events (drought, heat wave); and
- Atmospheric response to external events (fire, volcano, pollution release).

The committee decided to separate these extraordinary forecasting challenges from the day-to-day variety described in Major Challenge 1 because severe events involve different physical processes or require a special type of forecast model or a special kind of input data. The time criticality of the latter kind of forecast may also differ.
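The decay of forecast reliability beyond a few days, and the ensemble approach used to manage it, both stem from chaos: tiny initial-condition errors grow exponentially until they saturate. A toy sketch using the Lorenz-63 system as a stand-in for the atmosphere (the forward-Euler integration, perturbation size, and all parameter choices are illustrative, not any operational configuration):

```python
import random

def lorenz_step(state, dt=0.005):
    """One forward-Euler step of the Lorenz-63 system (sigma=10, rho=28, beta=8/3)."""
    x, y, z = state
    dx = 10.0 * (y - x)
    dy = x * (28.0 - z) - y
    dz = x * y - (8.0 / 3.0) * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def x_spread(members):
    """Ensemble standard deviation of the x coordinate."""
    mean = sum(p[0] for p in members) / len(members)
    return (sum((p[0] - mean) ** 2 for p in members) / len(members)) ** 0.5

random.seed(0)
base = (1.0, 1.0, 20.0)
# 20 "analyses" of the same state, differing by tiny observation-like errors
ensemble = [tuple(c + random.gauss(0.0, 1e-3) for c in base) for _ in range(20)]

early = None
for n in range(4000):                  # 4000 steps x 0.005 = 20 model time units
    ensemble = [lorenz_step(p) for p in ensemble]
    if n == 200:                       # spread shortly after initialization
        early = x_spread(ensemble)
late = x_spread(ensemble)              # spread at the end: errors have saturated
print(f"early spread: {early:.1e}   late spread: {late:.1e}")
```

The spread grows by orders of magnitude between the two measurements; operational centers use the analogous ensemble spread to attach confidence to a forecast rather than issuing a single deterministic answer.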
In the case of tornadoes and flash floods, gains of a few minutes in warnings may make a substantial difference in the value of the forecast, giving the public more time to seek shelter. With hurricanes and ice storms, timely predictions can alert government agencies and power companies to mobilize their emergency crews, people can be evacuated, and ships at sea can alter their routes or prepare for severe conditions. The accurate prediction of extreme events often requires input from local observing systems that may not be part of the standard national or global data system. It is now conceivable—though not yet achievable—that minute-by-minute Doppler radar data could be used to initialize or refine a high-resolution, three-dimensional fluid dynamics model (grid cell of 10 meters) of a developing severe tornadic thunderstorm. Such simulations would require that data ingestion and the model’s forward integration be done substantially faster than in real time, which cannot be accomplished with current high-end computing capabilities. Droughts and heat waves last longer, often for several weeks, and they too demand special forecasting methods. Inputs of information on soil moisture and the state of vegetation may be critical, and models with more detail about feedback between the atmosphere and land surface are required. One activity under way in the climate modeling community is an evaluation of the length and magnitude of droughts in climate models over the next 100 years. Initial indications are that droughts will be more intense and last longer in the future and that the inclusion of accurate detailed information on carbon and water exchange between the land, oceans, and atmosphere is critical to the accuracy of the climate simulations. When dangerous material is unexpectedly released into the atmosphere, it is essential to quickly predict the spread and concentration of the substance. Local turbulence models are required, coupled with
current or forecast winds and clouds. In the case of forest fires, the development of the fire itself can be modeled with the appropriate equations, including those for the combustion properties of the fuel. What is commonly needed for addressing Major Challenge 2 are computational models with finer resolution that incorporate, in most cases, models of additional physical interactions. Progress against that common need requires advances in HECC capabilities.

Major Challenge 3: Improve Understanding and Prediction of Seasonal-, Decadal-, and Century-Scale Climate Variation on Global, Regional, and Local Scales

While detailed weather forecasting cannot be extended beyond a month, climate may usefully be forecast for several months, even into the next season. This is possible because Earth’s climate system has some slow components that are less chaotic. The slow components nudge the mean properties of the fast chaotic weather systems. Examples include heat storage in the ocean, snow cover, and albedo changes due to seasonal vegetation variations. Most of the large national and international numerical prediction centers run ensemble models linking atmospheric, oceanic, and land components to provide outlooks for seasonal anomalies up to nine months in advance. Special attention is paid to El Niño events, in which the sea surface temperatures of the tropical eastern Pacific rise by a few degrees, or La Niña events, in which they cool, modifying the mean state of the climate in a large region, including much of North America. An El Niño prediction thus carries information about the probability of wetter or drier conditions that can be used by agricultural interests and water resource planners in the United States. The science of El Niño prediction is still developing.
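Seasonal outlooks of this kind are issued as probabilities derived from an ensemble rather than as a single number. A minimal sketch of that reduction step, using made-up Niño-3.4 sea-surface-temperature anomalies in place of real model output (the 0.5-degree thresholds are the conventional El Niño/La Niña criteria; the member values are invented for illustration):

```python
# Hypothetical ensemble of forecast Nino-3.4 SST anomalies (deg C) for one season.
# These numbers are invented for illustration, not output of any real model.
members = [0.9, 0.4, 1.1, 0.7, -0.2, 0.8, 0.6, 1.3, 0.3, 0.5]

mean = sum(members) / len(members)
# Conventional criteria: El Nino if anomaly >= +0.5 C, La Nina if <= -0.5 C.
p_el_nino = sum(1 for a in members if a >= 0.5) / len(members)
p_la_nina = sum(1 for a in members if a <= -0.5) / len(members)

print(f"ensemble mean anomaly: {mean:+.2f} C")
print(f"P(El Nino) = {p_el_nino:.0%}   P(La Nina) = {p_la_nina:.0%}")
```

The scatter across members is exactly the model disagreement described in the text; reporting the fraction of members beyond a threshold turns that disagreement into usable risk information for planners.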
The El Niño forecasts produced by dynamical models and by statistical means at a variety of institutions show considerable scatter. Nevertheless, the main dynamical models are fairly consistent in identifying the onset of both El Niño and La Niña events with lead times of a few months. They are considerably less successful in anticipating the intensity and duration of those events, however. The computational challenge related to seasonal forecasts, including El Niño and La Niña events, is clear. A good model must include velocity and temperature fields in the atmosphere and ocean, as well as the coupling between the atmosphere and the ocean and land by turbulent transport. Cloud and radiation processes in the atmosphere are also involved. Ensemble methods are required to address the chaotic nature of the coupled atmosphere-ocean system. Progress will require the pursuit of various techniques by competing research groups. HECC plays a key role in both research and operational prediction, but progress on basic physics is also required.

Major Challenge 4: Understand the Physics and Dynamics of Clouds, Aerosols, and Precipitation

While many aspects of the atmosphere are complex, many scientists agree that clouds and precipitation, and their interaction with aerosols, pose the greatest difficulty. Clouds have an impact on all scales (Stephens, 2005): They can lead to a local thunderstorm that rains on a picnic, or they can generate deadly tornadoes and hail. The degree of cloud coverage worldwide controls the average Earth albedo, which influences global temperature. The difficulty in understanding and predicting cloud processes is a result of the multiplicity of aerosol particles and condensed water particles that interact inside clouds as well as the tendency for clouds to encase vigorous turbulent motions.
Summer convective clouds often form in response to a sensitive instability in which small thermals of rising air accelerate upward if sufficient latent heat is released from condensing water vapor. In present cloud models, all liquid and ice particles are binned
into a small number of categories (between 2 and 10, typically). The transference of material between these categories is represented by crude semiempirical equations, so-called parameterizations. Detailed atmospheric chemistry simulations are required to portray aerosol and radiation impacts, and these in turn require advances in modeling and in HECC. Even so, major breakthroughs in modeling have occurred. The first numerical simulations of a tornadic thunderstorm were accomplished in the late 1980s. Recently, some success in predicting the amount of cloud cover in stratocumulus decks over the oceans was reported. While these are not per se successful forecasts, they show that the understanding of clouds is advancing. At the present time, however, forecasts of summer precipitation are done on a probabilistic basis rather than a yes-no basis. Statistically, the reliability of summer precipitation forecasts is much poorer than for other weather forecasts. Even in a simpler situation, such as precipitation from orographic clouds in mountainous terrain, existing models may vary in their prediction of precipitation by factors of two or more. Progress in understanding cloud physics will require a mix of theory, laboratory and field measurement, and trials with high-resolution numerical models. And those numerical models must include a better representation of the spectrum of particle sizes and more realistic transfer processes, which will multiply the computational difficulty.

Major Challenge 5: Understand Atmospheric Forcing and Feedbacks Associated with Moisture and Chemical Exchange at Earth’s Surface

While early theories and numerical models of Earth’s atmosphere used a static representation for the lower boundary (“Earth’s surface”), the degree of active interaction between ground and atmosphere is now widely appreciated.
These surface feedbacks take several forms. On a small scale, a patch of rain will moisten the soil and increase the evaporation, reducing the temperature and altering the winds. On a larger scale, a local drought may kill the vegetation and alter the local roughness and albedo. On a global scale, increased temperature may alter the soil chemistry and release stored soil carbon to the atmosphere. Reduction in sea ice will reduce the albedo and allow massive heat fluxes to emerge from the ocean into the atmosphere. Surface feedbacks have been identified as components of thunderstorm initiation, heat waves and droughts, dust storms, Arctic Ocean warming, and other phenomena. For climate modeling, two important human inputs—fossil fuel emissions and land cover change—are thought to contribute significantly (~8-10 petagrams (Pg) of carbon per year) to atmospheric CO2. Meanwhile, the world’s oceans are a carbon sink (~2 Pg/yr), and another 2 Pg/yr of carbon is lost to a hypothesized “missing sink.” Feedbacks also exist between warming and greenhouse gas production. New climate simulations are beginning to quantify the effects of land cover and fossil fuel emissions on the global carbon and water cycles and to identify and quantify the feedbacks among the driving biogeochemical processes and the climate system in the past, present, and future (Cox et al., 2000). Exemplifying today’s capabilities, Figure 3-1 shows the result of a simulation in which the physical climate system is fully interacting with the global carbon cycle (Thompson et al., 2004). By incorporating the coupling and feedbacks, the simulation exposes qualitative behaviors that we seek to understand. To credibly evaluate feedbacks in the climate system, one must couple (1) a physical ocean model that has an embedded biological ecosystem model with (2) an atmospheric physical model embedded with a detailed model of atmospheric particle and gas-phase chemistry.
This coupled model must at the same time be further coupled to a detailed model of continental biological and physical processes. The terrestrial biosphere, oceanic biosphere, and classically constructed physical climate system all interact and contribute to the simulated climate. The biogeochemical and physical climate system is fully coupled
in the sense that changes and trends in the biogeochemistry influence the physical climate at each time step and vice versa.

FIGURE 3-1 Output from a coupled carbon-cycle/physical-climate simulation that allows the exchange of anthropogenic CO2 between atmosphere, ocean, and land. (Derived from the simulations of Thompson et al., 2004.) The vertical axis displays the cumulative gigatons of anthropogenic carbon added to each of the reservoirs shown, with negative numbers representing absorption from the atmosphere. This indicates that the ocean is taking up a fraction of the anthropogenic CO2 and that the land does this as well until about 2070. The curve labeled “emissions” displays the integrated anthropogenic output. Until about 2070 the land system is a slight sink for CO2, but carbon feedbacks inside the model lead it to become a source thereafter. This directly impacts the atmospheric concentration of CO2, but current computing capabilities limit our ability to discern the strength and timing of this predicted inflection in terrestrial uptake. Coupled models such as this are essential for understanding climate, and HECC resources are essential to enable the incorporation of enough geochemical and atmospheric chemistry detail.

Clearly, this is a daunting computational task. While we know enough of the underlying science in atmospheric chemistry, aerosols, terrestrial biogeochemistry, and ocean biogeochemistry to create fully coupled models, and we have adequate mathematical models for those processes, it is not yet possible to exploit all these components because of limitations of HECC. At present, most fully coupled production simulations cannot even use rudimentary models of all the interacting processes.
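The size of the fluxes being modeled can be checked with back-of-envelope arithmetic using the budget figures quoted earlier in this section (roughly 8-10 Pg of carbon emitted per year, an ocean sink of about 2 Pg/yr, and a hypothesized missing sink of about 2 Pg/yr). The 2.13 Pg-C-per-ppm conversion is the standard factor for atmospheric CO2; the numbers are illustrative rather than a current budget:

```python
# Annual carbon budget arithmetic using the chapter's rough figures (Pg C per year).
emissions = 9.0       # midpoint of the ~8-10 Pg/yr from fossil fuels + land cover change
ocean_sink = 2.0      # ocean uptake
missing_sink = 2.0    # hypothesized "missing" (largely terrestrial) sink

airborne = emissions - ocean_sink - missing_sink   # remainder stays in the atmosphere
PG_PER_PPM = 2.13     # standard conversion: ~2.13 Pg C corresponds to 1 ppm of CO2

ppm_per_year = airborne / PG_PER_PPM
airborne_fraction = airborne / emissions

print(f"airborne increment: {airborne:.1f} Pg C/yr  (~{ppm_per_year:.1f} ppm CO2/yr)")
print(f"airborne fraction: {airborne_fraction:.0%}")
```

A coupled model is credible only if its simulated sinks reproduce such observed partitioning; the feedbacks in Figure 3-1 amount to those sink terms changing sign over time.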
And so, while Figure 3-1 allows us a glimpse of critical understanding, advances in HECC are needed to better represent the underlying science and improve our understanding.

Major Challenge 6: Develop a Theoretical Understanding of Nonlinearities and Tipping Points in Weather and Climate Systems

From a theoretical viewpoint, a nonlinear system such as Earth’s climate system can clearly undergo sudden changes in state, even with slow changes in external forcing. Positive feedback processes such as
ice-albedo feedback, water vapor feedback, or warming CO2 feedback could all push toward a bifurcation or tipping point. Even weather systems such as severe thunderstorms or cyclones might behave like this. A theoretical understanding of nonlinear dynamics is therefore essential. The matter is made more urgent by evidence that Earth’s climate may already have experienced abrupt change in the form of glacial retreat (NRC, 2002). Another possible example of change is the postulated shutdown of thermohaline circulation in the Atlantic Ocean. A potential danger in studies of nonlinear behavior is that low-order models with few degrees of freedom seem to overpredict the occurrence and magnitude of bifurcations. When more components of a system are included in a model, more competing pathways of change are present and abrupt shifts are less likely. Thus the reliable prediction of bifurcation requires complex, multicomponent models.

Major Challenge 7: Create the Ability to Accurately Predict Global Climate and Carbon-Cycle Response to Forcing Scenarios over the Next 100 Years

As concerns over global warming grow, it is important to accurately predict global climate as a function of various emission scenarios. Two recent reports (IPCC, 2001 and 2007) show that such predictions are affecting public opinion and industrial and government policy. Features of these predictions include the following:

- Regional climate and economic impacts of climate change, including scenarios for adapting land use and agriculture.
- Impacts on ocean circulation and productivity.
- Impacts on snow, water storage, and river flow.
- Inclusion of the many feedbacks connecting the global carbon cycle, water vapor, and biogeochemical cycles.
- Detailed simulations of the time trends in the oxidative state of the atmosphere via comprehensive atmospheric chemistry.
- Interdependency between economic decisions and greenhouse gas emissions.

This major challenge builds on the results of Major Challenge 5. The potential impact of HECC advances on Major Challenge 7 would stem both from improvements to and testing of computer models and from allowing models to be run over tens of simulated decades, which requires immense amounts of supercomputer time. Because our understanding of physical, chemical, and biological processes is adequate, progress would come as soon as the advances in HECC are introduced. In the longer term, the development of computational models that can resolve all-important regional differences will also require significant work on algorithms. That is because the algorithms for various components of the fully coupled simulations scale in different ways, so that the complete simulation will not scale readily to the tens of thousands of processors anticipated with next-generation supercomputers. Given these needs, the credibility, accuracy, and availability of global climate models must continue to increase. To retain its credibility, the most recent IPCC report (2007) limited its discussion to aspects in which there is broad agreement across models. The models predicted a general warming of between 2 and 6 degrees over the next 50 to 100 years, somewhat accelerated at higher latitudes. Precipitation increases are predicted for the middle and high latitudes, with drying in the subtropics. The predictions about climate change on the regional scale are less consistent. Models differ considerably on whether regions such as Western Europe, the southwestern United States, or Australia will become wetter or drier and on how such changes will impact agriculture. The model-dependence of
these predictions is probably due to how local surface feedbacks (for example, sea surface temperature, soil moisture, and vegetation) are simulated or to subtle resonances related to the location and amplitude of planetary waves in the upper atmosphere. Regional feedbacks between the biogeochemical-carbon cycle and the regional hydrologic cycle are critical to the computed climate, and models for these contain significant uncertainties. These uncertainties call for fundamental scientific research to improve understanding. Additional computational resources will also be required for data analysis given the massive amounts of data that can be produced by climate models and by the newer generations of satellites that will be deployed in the decades ahead. The impact of global warming on agriculture, ocean fisheries, and water resources is expected to be significant. Simulations of the next 100 years will require a detailed understanding of how ocean temperature, salinity, and nutrient distributions will evolve over time. The prediction of these impacts requires special models that are still not well developed. The demands on computing resources for climate studies are huge, particularly because runs are so long. Typically, because weather is so variable, at least 5 years of weather must be simulated to identify moderate shifts in climate. In typical cases, thousands of years of simulated time are required to allow all the different pools of heat and carbon to reach a steady state. In addition, high spatial resolution is required to capture the dominant physical processes. Some climate simulation runs take as much as several months of wall clock time, but longer runs are not practical. Thus the most ambitious climate simulations are unlikely to be undertaken unless HECC resources are available.
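These wall-clock demands can be made concrete with throughput arithmetic. The 5-simulated-years-per-day rate below is an assumed, illustrative figure, not a benchmark of any particular model:

```python
# Wall-clock cost of a long climate integration at an assumed model throughput.
sim_years_needed = 1000.0   # chapter: thousands of simulated years to equilibrate heat/carbon pools
throughput_sypd = 5.0       # assumed simulated years per wall-clock day (illustrative)

wall_days = sim_years_needed / throughput_sypd
print(f"{sim_years_needed:.0f} simulated years at {throughput_sypd:.0f} SYPD: "
      f"{wall_days:.0f} days (~{wall_days / 30:.0f} months) of wall clock")

# Doubling horizontal resolution costs roughly 2^3 = 8x the work
# (two horizontal dimensions plus a proportionally shorter time step),
# so throughput drops unless the machine grows by the same factor.
cost_factor_double_resolution = 2 ** 3
```

This is why spin-up runs of even modest length consume months of supercomputer time, and why each increment in resolution must be paid for with a near-order-of-magnitude increase in HECC capability.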
Major Challenge 8: Model and Understand the Physics of the Ice Ages, Including Embedded Abrupt Climate Change Events Such as the Younger Dryas, Heinrich, and Dansgaard-Oeschger Events

During the last 2 million years (the Pleistocene epoch), while humans were evolving into their present form, the climate of Earth was swinging between long, cold glacial periods and brief, warm interglacial periods. Large ice sheets were growing and decaying. The level of the sea was rising and falling. Patterns of precipitation and vegetation were shifting. The climate records in deep sea cores and in ice cores show that swings on roughly 10,000- to 100,000-year timescales dominated, with some significant fluctuations of shorter period as well. Embedded in the longer cycles were numerous brief events, such as the Heinrich and Dansgaard-Oeschger events and the dramatic Younger Dryas event at the end of the last glacial period, 12,000 years ago. In the Younger Dryas, the temperature may have jumped several degrees Celsius in a span of 20 to 50 years (NRC, 2002). The implications of these recent natural climate changes for the modern world are profound, but the absence of a complete theory is unsettling. The longer cycles show a strong relationship between carbon dioxide concentration and climate, but without clear evidence of cause and effect. The brief embedded events demonstrate the ability of Earth’s climate to shift quickly. Such sudden shifts today would have significant impacts on human life and well-being. Progress on a theory for the ice ages will require extensive field and laboratory work, along with the development and testing of numerical models. Ocean heat storage, cloud distribution, ice dynamics, and—possibly—atmospheric chemistry must be included in these models. Intensive use of global climate simulations will be called for, although they may not be a limiting resource.
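The abrupt shifts discussed here, and the bifurcations of Major Challenge 6, can be illustrated with a zero-dimensional energy-balance model in which albedo depends on temperature (ice-albedo feedback). The Budyko linear fit for outgoing longwave radiation and the albedo ramp are conventional textbook values; this is a pedagogical toy, not a model of the Younger Dryas or any actual event:

```python
S0 = 1361.0          # solar constant, W m^-2
A, B = 203.3, 2.09   # Budyko linear fit for outgoing longwave: OLR = A + B*T, T in deg C
C = 4.0e8            # heat capacity of a ~100 m ocean mixed layer, J m^-2 K^-1

def albedo(T):
    """Ice-albedo feedback: bright when frozen, dark when warm, linear ramp between."""
    if T <= -10.0:
        return 0.62
    if T >= 0.0:
        return 0.30
    return 0.62 - 0.32 * (T + 10.0) / 10.0

def equilibrate(T, years=200):
    """Integrate C*dT/dt = S0*(1 - albedo(T))/4 - (A + B*T) with monthly steps."""
    dt = 86400.0 * 30.0
    for _ in range(years * 12):
        net = S0 * (1.0 - albedo(T)) / 4.0 - (A + B * T)   # W m^-2
        T += dt * net / C
    return T

warm = equilibrate(15.0)    # start near the present climate
cold = equilibrate(-30.0)   # start in a deeply glaciated state
print(f"warm equilibrium: {warm:.1f} C   cold equilibrium: {cold:.1f} C")
```

The same forcing supports two stable states tens of degrees apart; which one the system reaches depends on where it starts, and a slow drift in forcing can carry the climate past the point where its current branch disappears, producing an abrupt jump. As noted under Major Challenge 6, low-order toys like this one tend to overstate how easily such bifurcations occur, which is precisely why multicomponent HECC models are needed.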
Major Challenge 9: Model and Understand Key Climate Events in the Early History of Earth and Other Planets

While the recent Pleistocene climate fluctuations were strong, Earth's earlier climate experienced even larger fluctuations. Examples include these:

- The Paleocene-Eocene thermal maximum (55 million years ago), when ocean temperatures rose by about 6 degrees and the carbon cycle experienced a large anomaly.
- The Cretaceous-Tertiary impact-induced extinction event (65 million years ago); the resulting changes in the environment that killed the dinosaurs remain a mystery.
- The Permian-Triassic extinction (250 to 290 million years ago), which decreased the diversity of life on Earth; the role of climate change is not known.
- Snowball Earth (about 700 million years ago); glacial deposits from that time suggest that the entire planet may have been ice-covered.

Just as an understanding of ancient terrestrial climates provides a context for modern climate studies, so does the understanding of the climate of nearby Earth-like planets. The most relevant planets in this regard are Venus and Mars. Venus is closer to the Sun than Earth is, has a much higher albedo, and has a massive CO2-rich greenhouse atmosphere. The surface temperature on Venus is nearly 700 K, more than twice that of Earth. The central question for Venus is whether it once had a cooler Earth-like climate and, if it did, how and when it developed its powerful greenhouse. Mars, on the other hand, is a darker planet, with a thinner CO2 atmosphere and seasonal CO2 ice caps. Its rotation rate is similar to Earth's, and the basic fluid dynamics of its atmosphere may resemble that of Earth. The main difference is the absence of large amounts of water, with the accompanying influence of latent heat. Together, the climatic history of early Venus, Earth, and Mars provides a useful test of our knowledge of climate change.
COMPUTATIONAL CHALLENGES IN THE ATMOSPHERIC SCIENCES

As noted in the committee's ratings for each of the major challenges, Major Challenges 1-3, 5, and 7 are critically dependent on advances in HECC capabilities, while HECC plays an important though probably not rate-limiting role for Major Challenges 4, 6, and 8. HECC plays a role in Major Challenge 9, but its absence would not be a barrier to progress. HECC in the atmospheric sciences consists primarily of simulations based on coupled, multidimensional partial differential equation models of fluid dynamics and heat and mass transfer. These fundamental atmospheric processes are driven by a variety of forces produced by radiation, moisture processes, chemical reactions, and interactions with land and sea surfaces. The largest-scale flows in the atmosphere are a response to the poleward thermal gradients created by solar radiation falling on a spherical Earth. At these large scales, the dynamics of Earth's atmosphere are shaped by two effects. First, because the vertical scales of structure and motion are small compared to horizontal scales, the flow is essentially in hydrostatic balance. Second, the large-scale horizontal motions are largely constrained by the balance between Coriolis forces arising from Earth's rotation and horizontal pressure gradients. However, these large-scale balances are modulated by smaller-scale effects and other physical processes, which are reflected in the variations we observe as weather and climate. The equations of fluid dynamics and heat and mass transfer are approximated by discrete forms in a fairly standard approach, with modifications to take into account the large aspect ratio of Earth's
atmosphere and the range of length scales and timescales that must be represented. The remainder of this section describes the primary issues that distinguish high-end simulation in atmospheric science.

Chaos, Probability Forecasts, and Ensemble Modeling

The fundamental mathematical description of atmospheric dynamics as a well-posed initial-value approximation to the Navier-Stokes equations might lead one to believe that increasing the accuracy of observations and increasing the resolution of numerical forecasts would forever increase the accuracy of the numerical forecasts. However, this is not the case. It has been known since the 1960s (Lorenz, 1963; Smale, 1967) that solutions of otherwise well-posed nonlinear evolution equations can exhibit chaotic behavior, with exponential growth in small perturbations interacting with nonlinearity to produce incredibly complex behavior over long times. The chaotic behavior in fluid dynamics problems has been observed experimentally and in numerical simulations in a variety of settings, including atmospheric fluid flows. In a landmark study of errors in an operational weather forecast model, Lorenz (1982) reached the conclusion that the limit of predictability of weather events as an initial-value problem was about 2 weeks. Beyond that time period, the best one can expect is to be able to predict statistical averages. As with the actual atmosphere, the fields generated from complex numerical models of the atmosphere are chaotic and apparently random. Simulations of the annual cycle of temperature and precipitation show a realistic variation from year to year, just like the actual atmosphere, but the superimposed randomness poses a challenge for modelers seeking to trace signs of climate change back to altered system parameters.
Whether the change is due to a rise in carbon dioxide, a shift in ice albedo, an altered solar constant, or a change in aerosol concentration, it is necessary to simulate several years of behavior to gather statistically significant evidence of climate sensitivity. Among climate scientists, a minimum of 10 years of model integration is typically thought to be needed for credible work. Such a standard creates substantial demand for supercomputer power, a demand that compounds the needs arising from high spatial resolution and the addition of physical processes. The IPCC climate runs carried out by many research and operational centers used a substantial fraction of the available computer power in the atmospheric community from 2004 through 2006 to satisfy this requirement. Regional climate studies that nested within the global IPCC runs set 5 years as the credibility standard. Today, the strategy for determining the likelihood of important events is to take advantage of increasing computer power to compute many forecasts in addition to single forecasts at higher resolution. These ensembles of forecasts are the critical strategy for coping with the implications of chaos. Usually, the ensemble is constructed by starting with a single forecast whose initial and boundary conditions and internal physics algorithms are the best available in some sense. This single forecast is referred to as deterministic and is often run at high resolution. The ensemble is then constructed by perturbing initial conditions, boundary conditions, and internal physics algorithms to produce a variety of forecasts starting from the same time. If the ensemble of forecasts appropriately reflects the uncertainty in the initial and boundary conditions and in the internal physical approximations, then the computed ensemble forecasts should provide a probability distribution that reveals the uncertainty in the forecast and that allows decisions to be made on a probabilistic basis. 
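The chaotic sensitivity behind this strategy is easy to reproduce numerically. The sketch below (a toy illustration, not a forecast model) integrates the three-variable Lorenz (1963) system from two initial states that differ by one part in 10^8; after a modest integration time the trajectories bear no resemblance to each other:

```python
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) system."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(s0, n_steps, dt=0.005):
    """Advance the state with classical 4th-order Runge-Kutta."""
    s = np.array(s0, dtype=float)
    for _ in range(n_steps):
        k1 = lorenz_rhs(s)
        k2 = lorenz_rhs(s + 0.5 * dt * k1)
        k3 = lorenz_rhs(s + 0.5 * dt * k2)
        k4 = lorenz_rhs(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return s

# Two initial states differing by 1 part in 1e8:
a = integrate([1.0, 1.0, 1.0], 6000)          # 30 model time units
b = integrate([1.0, 1.0, 1.0 + 1e-8], 6000)

# The tiny perturbation has grown to the size of the attractor itself.
print(np.linalg.norm(a - b))
```

The exponential growth rate of such perturbations, not the numerical method, is what imposes the roughly 2-week predictability limit Lorenz identified.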
Improvements to operational forecast capabilities require research into next-generation production systems. Those next-generation systems demand HECC almost by definition because they incorporate models and algorithms of greater complexity than those used in production systems, which themselves are heavy users of high-end capacity. The major forecast centers all produce global ensemble forecasts extending up to 2 weeks into the future. For example, each day the ECMWF produces a global forecast with an ensemble of 51 members.
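Mechanically, generating such an ensemble amounts to perturbing the analysis and rerunning the model many times. The toy sketch below uses the chaotic logistic map as a stand-in for the forecast model; the member count of 51 echoes the ECMWF ensemble, but the model, perturbation size, and event threshold are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_forecast(x0, n=50, r=3.9):
    """Stand-in 'forecast model': the chaotic logistic map (illustrative only)."""
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

best_guess = 0.32                  # 'deterministic' analysis of the initial state
n_members = 51                     # e.g., the size of the ECMWF global ensemble
perturbed = np.clip(best_guess + rng.normal(0.0, 1e-3, n_members), 0.0, 1.0)
outcomes = np.array([toy_forecast(x0) for x0 in perturbed])

# Probability of an 'event' (final state above a threshold), estimated as the
# fraction of ensemble members in which it occurs.
p_event = np.mean(outcomes > 0.5)
print(f"P(event) ~ {p_event:.2f}")
```

The same recipe carries over to the real systems: what is perturbed there is the analyzed atmospheric state, boundary conditions, and physics options, and the event probabilities come from the member fractions in just this way.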
FIGURE 3-4 Simulation of Hurricane Katrina using the Weather Research and Forecasting (WRF) model at 3 km resolution, obtained by a series of nested computations based on the Global Forecast System. The upper panel shows the surface isobars at landfall and the simulated track of the hurricane as blue arrows compared with the observed track in black. The lower panel shows a simulated radar image of the rainfall in the hurricane at landfall, computed from the distributions of water and ice in the simulated storm. Computations performed by Weather Ventures Ltd.
A significant increase in realism occurs at resolutions of 4 km or less, when the convective processes are calculated directly instead of with the statistical schemes that are used at 12 or 36 km resolution. Statistical algorithms to represent some subgrid-scale processes will still be required, so work on them continues. Threat-based operational forecasting is possible for a variety of other threats, but it is not generally implemented today. Thus there is a significant opportunity for such forecasts to predict at high resolution impending snowstorms, severe weather complexes, serious air quality events, and emergency situations such as wildfires, release of toxic plumes, or severe icing events. Implementing threat-focused, high-resolution forecasts poses a number of challenges. The first is identifying the cases that merit such attention. The second is accessing the substantial computer power necessary to make such forecasts in a short period of time, because the supercomputers that are capable of such a targeted calculation are generally running full-time to produce normal operational forecasts. The third is identifying the members of the large-scale ensemble to use as the foundation for the special forecast. With observational and forecast systems providing increased rates of information flow in the years ahead, we can anticipate that threat-focused forecasting will become more valuable and will add to the demand for computer capability in numerical weather prediction.

Spatial Resolution, Adaptive Grids, and Subgrid Processes

The physical processes that determine the dynamics of Earth's atmosphere occur over an enormous range of scales: spatial scales from microns (for water vapor and ice) to thousands of kilometers, and timescales from seconds to centuries.
No computer simulation now or in the foreseeable future will be able to resolve the full range of spatial scales and timescales. Typically, simulations are set up to represent some fraction of the timescales and length scales, from the largest down to some smaller limit of resolution, with processes that take place at still smaller scales represented using so-called subgrid-scale models. Examples of subgrid-scale models incorporated in atmospheric simulations include the following:

- Convective processes. The convective components of numerical models include relatively intense vertical motion driven by buoyancy and often leading to cloud formation, at scales that are generally too small to be resolved by the simulation's spatial grid. Thus schemes must be introduced to represent condensation and latent heat release; large-scale motions within the clouds and the exchange of air with the region near the cloud; and the mass fluxes of air and the three phases of water.

- Cloud microphysics. All of the complex interactions involving water in the cloud must be modeled to account for water vapor, cloud water, cloud ice, rain, snow, graupel, and hail. The rates at which raindrops and ice crystals grow must be estimated and their fall velocities calculated. Occurring on scales from the molecular to a few centimeters, these processes must be represented statistically in all atmospheric models.

- Radiation. Radiation from the Sun warms Earth and its atmosphere, while the emission of long-wave radiation from the top of the atmosphere maintains its thermal balance. The scattering and conversion of radiation into other forms of energy as it travels through the atmosphere also occur on scales so small that statistical models will be required for the foreseeable future. The models of radiation processes must take account of wavelength dependencies, and they are often based on strategies that involve treating spectral variation in bands. Detailed understanding of
how atmospheric molecules accrete to form aerosol particles is also critical for understanding atmospheric radiation, and it must be represented by a subgrid-scale model.

- Boundary layer processes. Atmospheric flow is profoundly affected by conditions at the surface of the Earth, with thermal exchanges and the loss of momentum creating a boundary layer that may extend 3 or 4 km in the vertical. Turbulent motion is responsible for much of the transfer of heat and water in the boundary layer. The vertical resolution of contemporary global models is higher in the boundary layer and is often on the scale of a few hundred meters. The most sophisticated statistical schemes for portraying boundary layer dynamics are now developed in very high-resolution models that produce large-eddy simulations. These statistical schemes can be used for weather prediction and research models on scales of a few kilometers or more.

- Surface layer. The actual exchanges of heat, water, and momentum between the atmosphere and Earth occur in a surface layer, usually only tens of meters deep, at the bottom of the boundary layer. Here the mechanical generation of turbulence owing to wind shear is dominant, and the fluxes of energy are strong. Over the ocean, we must model the interaction of two fluids at a common boundary; over the land, we must take account of topography, land use, running water, and melting snow, as well as conditions in the upper layers of the ground. Thus, contemporary prediction and research models are linked to models of the ocean and the land surface, which themselves are complex, dynamical models that compute the evolution of a wide variety of processes at a wide variety of scales.
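As a concrete illustration of a surface-layer scheme, the classic bulk aerodynamic formula estimates the sensible heat flux from resolved quantities, with a dimensionless exchange coefficient absorbing the unresolved turbulence. The coefficient values below are typical textbook numbers, not those of any particular operational model:

```python
def bulk_sensible_heat_flux(wind_speed, t_surface, t_air,
                            rho=1.2, cp=1004.0, c_h=1.2e-3):
    """Bulk aerodynamic estimate of surface sensible heat flux (W m^-2).

    H = rho * cp * C_H * U * (T_s - T_a), where rho is air density
    (kg m^-3), cp the specific heat of air (J kg^-1 K^-1), C_H a
    dimensionless exchange coefficient, U the near-surface wind speed
    (m s^-1), and T_s - T_a the surface-air temperature difference (K).
    """
    return rho * cp * c_h * wind_speed * (t_surface - t_air)

# A 10 m/s wind over a sea surface 2 K warmer than the overlying air:
h = bulk_sensible_heat_flux(10.0, 288.0, 286.0)
print(h)  # about 29 W m^-2
```

Analogous bulk formulas, with stability-dependent coefficients, handle moisture and momentum; the point is that the grid-scale state determines the subgrid flux through a simple closure.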
But the actual exchange between the atmosphere and the ocean and land surface occurs on scales much smaller than those in these models, so it must be modeled statistically on both sides of the interface. Very-high-resolution research simulations of these processes, using detailed physical and chemical equations, provide rich data sets for developing algorithms whose implications can then be tested by comparison to appropriate observations and by measuring improvements in large-scale simulation models.

Preserving, Improving, and Using Weather and Climate Observations

The store of atmospheric and environmental observations is a continually expanding resource for research and operations as the new data each day add to our understanding of Earth. But this resource is more than a static archive, because continuing progress in scientific understanding and technological capability makes it possible to upgrade the quality and applicability of the stored data when computer resources permit. In recent years, satellite data have been playing an increasingly important role in weather forecasting as the primary forecast centers process data flows that go well beyond 1 billion observations per day. Satellite observations of winds, temperatures, and humidity over the world oceans, previously a region devoid of data, are contributing greatly to this flow. Some of the difficulties in merging different types of data taken over a period of several hours have been solved using four-dimensional data-assimilation schemes. Information from satellites is also being used to detect weather and climate phenomena such as shrinking glaciers and sea ice, advancing deserts, shifting ocean currents, and wide-ranging dust storms. This information comes from nearly 50 environmental satellites, providing a daunting flow of data. The distribution and analysis of these data are largely limited by the availability of large disk arrays and high-speed computers.
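At the heart of any such data-assimilation scheme is a statistically weighted blend of a model background with observations. A minimal scalar version of the optimal-interpolation (Kalman-type) analysis update looks like this; the temperature values and error variances are illustrative:

```python
def analysis_update(x_b, y, b_var, r_var):
    """Scalar optimal-interpolation (best linear unbiased) analysis step.

    Blends a background (model) estimate x_b, with error variance b_var,
    and an observation y, with error variance r_var, weighting each by
    the inverse of its error variance.
    """
    k = b_var / (b_var + r_var)          # gain: how much to trust the observation
    x_a = x_b + k * (y - x_b)            # analysis value
    a_var = (1.0 - k) * b_var            # analysis error variance (reduced)
    return x_a, a_var

# Background temperature 285 K (error variance 4 K^2) meets an observation
# of 287 K (error variance 1 K^2):
x_a, a_var = analysis_update(285.0, 287.0, 4.0, 1.0)
print(x_a, a_var)  # 286.6 K, 0.8 K^2
```

Operational four-dimensional schemes solve the same weighting problem, but for state vectors with hundreds of millions of components observed at staggered times, which is why assimilation rivals or exceeds the model itself in computational cost.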
In the same way we use an atmospheric prediction model to assimilate data for a forecast run, we can improve historical observations through a computational process known as reanalysis. The reanalysis process can add variables not originally observed, and it can produce significant improvements in
the reanalyzed data set obtained as the forecast models improve. Today there are a number of global, long-term reanalysis data sets available for research purposes and for applications. A notable data set just completed is the North American Regional Reanalysis (NARR) (Mesinger, 2006), which contains a wide variety of atmospheric and surface variables at a resolution of 32 km for the period from 1979 to the present. These data sets are useful for developing dynamically consistent climatologies and for serving as verification data sets for calibrating ensemble models. Moreover, they provide a dynamically consistent set of initial conditions and verification data for use in model development and refinement. They offer, in effect, a balanced and consistent virtual atmosphere, but they are expensive because they require running a forecast model over many years of observations. Reprocessing is also important with satellite observations, because continuing research often provides improved algorithms for converting the radiances observed by the satellite instruments into physical variables in the atmosphere or on the surface. Generally, the most basic information from the satellite instruments is securely archived so that this process can be carried out when the considerable requirements for computer capacity can be met. As an example of the computational challenge associated with massive satellite data sets, consider the reprocessing of global data from the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite-borne sensor. Since the launches in 1999 and 2002, the research community has improved the algorithms used to derive physical climate variables from the satellite data (Vermote et al., 2002). With each improvement, the entire historical MODIS data set must be reprocessed.
With current NASA computing capability, about 4 days of data can be processed in 1 day of computation. Thus, to generate a new historical MODIS product suite requires 1 or 2 years of dedicated computation. It is quickly becoming clear that, with the massive amounts of data generated by existing Earth observation systems, HECC is necessary for managing and analyzing data. Atmospheric models, like scientific instruments, must be calibrated, especially if they are to be used for long-term prediction. A model intended for climate research, perhaps on global change issues, must be calibrated to produce the present climate by comparing key averages with observed equivalents, perhaps from a reanalysis data set. The calibration often requires adjusting parameters in the statistical algorithms for subgrid-scale processes or in the schemes controlling fluxes between the atmosphere, ocean, and land surface. Similarly, the characteristics of models used to predict seasonal climate variations must be determined so the predictions can be corrected. A first concern is that such models will exhibit spatially dependent biases—for example, tending to be too warm in one part of the world and too cool in another. Typically, they are used to make retrospective forecasts for 25 years or so in order to determine the biases with sufficient confidence to correct the predictions. A second concern with ensemble models is the spread of the members of the ensemble. A set of retrospective forecasts and verification data over an equal number of years is required to perform the statistical analyses needed to improve or calibrate the predictions. The necessity of computing retrospective forecasts over a period of 25 years before bringing a new seasonal prediction model into operational service is a serious impediment to incorporating improvements into the computer prediction system or using observations. 
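The bias-correction step described above can be sketched in a few lines. Everything here is synthetic: the grid, the spatially varying "model bias," and the verifying observations are stand-ins for a real 25-year hindcast archive:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 25 years of retrospective seasonal forecasts on a small
# lat-lon grid, plus the verifying observations (all synthetic).
years, nlat, nlon = 25, 4, 8
truth = 288.0 + rng.normal(0.0, 1.0, (years, nlat, nlon))
bias_field = np.linspace(-1.5, 1.5, nlat * nlon).reshape(nlat, nlon)
hindcasts = truth + bias_field + rng.normal(0.0, 0.5, (years, nlat, nlon))

# Estimate the spatially dependent bias from the hindcast period...
estimated_bias = (hindcasts - truth).mean(axis=0)

# ...and remove it from a new operational forecast.
new_forecast = 289.0 + bias_field          # raw model output for a new season
corrected = new_forecast - estimated_bias

print(np.abs(estimated_bias - bias_field).max())  # small residual error
```

The need for roughly 25 years of hindcasts falls out of the statistics: averaging over N years shrinks the sampling error in the bias estimate by a factor of the square root of N, so short hindcast periods leave the correction itself too noisy to trust.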
A common strategy is to compute ensembles with fewer members than will be used in practice, which saves machine time but limits statistical confidence. HECC has a key role in enabling us to capitalize on all these terabytes or more of data. In many problems of atmospheric science, the greatest computational challenges (and number of operations) are for data assimilation, not modeling. Experts in the field told the committee that just handling the data volumes projected from satellites of the National Polar-orbiting Operational Environmental Satellite
System (NPOESS) era will necessitate a 10-fold or so increase over today's HECC capabilities. But other observational capabilities are also ramping up, such as those for terrestrial biogeochemistry. The Orbiting Carbon Observatory (OCO), for instance, is expected to be launched in 2008 and to generate more data than can ever be assimilated into today's models on today's computers. The computational attributes of assimilation are much different from those of modeling. For instance, the mix of serial and parallel codes is different. And the data treatment and assimilation part of the problem calls for compromises to make the problem tractable and to get satisfactory data-usage numbers. These compromises have more serious consequences than the compromises in modeling. Some would even argue that the potential to improve weather forecasting by improving assimilation is significantly greater than the potential from improving modeling. The computational environments needed for weather and climate studies are strongly influenced by the observational data. The size of the data sets is important, but maybe more important is the growing complexity of the data system. Further, because there are real-world uses of weather and climate products, those products are subject to forms of analysis, evaluation, and validation that are far different from those used for the products of most other fields of science and engineering.
This evaluation is data driven in a way that is not defined by the normal practice of “science”; for instance, the performance of a global model that incorporates more than 1 billion observations might essentially be evaluated by how well it predicts a snowfall in one evaluator's home town.1 The assimilation of data into predictive models, as well as the new understanding of the atmospheric system that the combination of observations and modeling will produce, is an exciting development in Earth system science. We will need more computing capability to handle the higher-resolution, higher-fidelity measurements. The ability to efficiently explore data sets that will be produced by the anticipated high-resolution models is not yet clear, nor is the path to analysis and visualization of those vast quantities of data. These activities are absolutely essential to the research enterprise. Equivalent increases in storage and archive capacity will also be required, as well as additional related expertise in software engineering and computer science.

Transitioning to New HECC Resources

The atmospheric sciences are ready to exploit additional HECC resources right now. Some coupled climate simulations have been performed utilizing up to 7,700 processors. If more resources were available, current models could be run more routinely on such platforms, enabling the incorporation of more, and more-realistic, physics and chemistry. It is expected that the scientific benefit would be immediate. In operational meteorology, a 10-fold improvement in computing capability could readily be exploited in the near term, noticeably improving the accuracy and quality of weather and ocean predictions. Increases of this order are required to improve prediction of severe weather, as well as for forecasts of importance to the energy, transportation, and agriculture sectors.
The operational forecast centers could become ready to use the increased resolution during the interval between the purchase decision and delivery of the new computer. Further advances, though, would require modifying the current software to improve scalability and changing the model physics to reflect phenomena that become important because of finer resolution. That step would be followed by a significant expenditure of time and computational resources to establish the baseline forecast performance of the models over enough cases to reflect a reasonable range of weather conditions.

1 The committee thanks an anonymous reviewer for many of these thoughts on data assimilation.
More generally, the HECC requirements for atmospheric science must consider the overall end-to-end science enterprise. While the Japanese Earth Simulator has done wonders for energizing the climate science community in Japan and has been extremely useful for collaborators worldwide, there was no way to use the computer except by going to Japan. While grid computing cannot help with high-end processing, it can help with access, scheduling, monitoring, and analysis of simulation results. This analysis step is—next to developing the models themselves—the endeavor where scientific judgments play the largest role. Indeed, the infrastructure for analysis and access to simulation results is a key part of computational modeling in the atmospheric sciences. For weather forecasting, the entire end-to-end system is executed in a few hours, and success might be evaluated in less than 48 hr. This time element is another critical, defining characteristic of computing for operational weather forecasting. The data capture, quality control, assimilation, and forecasting take place in a few hours and are repeated every 6 hr. This throughput requirement drives the capability requirements of the computing system. Along with the consequential and tangible nature of the products, it also calls for a hardening of the computing system that is far more stringent than that called for by most scientific research. This brings attention, again, to the end-to-end system, the stability of the hardware, and the full range of systems software that glues the system together.2 In any HECC upgrade, software needs continual adaptation. There may be significant rewrites ahead as the programming paradigm changes from the MPI and OpenMP language extensions.
The HPC language program of the Defense Advanced Research Projects Agency (DARPA) and the special-purpose (game) processors that are starting to encroach on computer offerings are likely to have a large impact on the effort required to stay with the technology curve. Already, model development calls for equal investments in software engineering and science. While, as noted above, some additional HECC capacity could be readily absorbed by the atmospheric sciences, the longer-term science goals—creating the capability for regional climate forecasts and for localized weather forecasts that explicitly model convective processes in enough detail to predict severe weather—will require significant attention to the overall system. In particular, there will be a need for modifications to operational modeling systems and significant changes in physical parameterizations at all scales.

THE NEED FOR HECC RESOURCES TO ADVANCE THE ATMOSPHERIC SCIENCES

For nearly 60 years, the atmospheric sciences have utilized the most powerful HECC resources that were available, starting with the first numerical weather forecast computed on the ENIAC in 1950.3 Today the most advanced computers are used in predicting daily weather and seasonal climate variations, in examining likely scenarios for climate change over the remainder of this century, and in a variety of research efforts aimed at understanding atmospheric processes and phenomena and their interactions with other components of the Earth system. The full coupling of all the factors that influence climate, atmospheric science, and weather requires vast computational resources. The computational challenges and strategies are diverse, and new ones now arise as observations of the Earth and atmosphere increase in number and diversity. There are three areas suggested by this chapter where advances in HECC would speed up the achievement of important scientific and national goals, each of which is described on the next two pages.
2 The committee thanks an anonymous reviewer for contributing the ideas behind this paragraph.

3 Platzman (1979) gives a detailed account of the early days of numerical forecasting.
Improved Models for the Global and Regional Hydrologic Cycle

A variety of phenomenological macroscopic models are currently used to represent clouds, precipitation, and convection in both weather forecasting and climate modeling. However, they give widely varying results when applied to the same problems. Furthermore, they are not valid for grid resolutions finer than about 10 km. Finally, it is not known how to make a direct connection between this class of models and detailed mesoscale and microscale observational data. It would be possible to use more realistic models of clouds, precipitation, and convective processes by substantially increasing the grid resolution, which would be feasible with enough computing power. At a horizontal mesh spacing of 1 km, it is possible to replace cumulus convection and large-scale precipitation parameterizations with a new class of models that resolve cloud systems. Such fine-grid resolution would also permit the accurate prediction of tropical cyclones and other extreme weather events. The difficulty with such an approach is computational expense: It has been estimated that a global model on a uniform grid with 1 km mesh spacing would require a computer with sustained performance of 10 Pflop/s (Oliker et al., 2006), which would require parallel performance and scalability orders of magnitude beyond current capabilities. However, these issues are being addressed, and techniques of computation and scaling are evolving along with hardware capabilities. An alternative approach that would require far less computational power is adaptive mesh refinement (AMR), in which grid resolution can change as a function of space, time, and the emerging solution (Berger and Oliger, 1984).
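Before turning to such refinement methods, the sheer scale of the uniform-grid option can be checked with back-of-envelope arithmetic of the kind behind such sustained-performance estimates; the vertical level count below is an assumption for illustration:

```python
import math

# Rough cell count for a uniform 1 km global grid (illustrative arithmetic;
# the number of vertical levels is an assumed value, not from the source).
earth_radius_km = 6371.0
surface_area_km2 = 4.0 * math.pi * earth_radius_km ** 2   # ~5.1e8 km^2
columns = surface_area_km2                                # one column per 1 km x 1 km cell
levels = 100                                              # assumed vertical levels
cells = columns * levels

print(f"{cells:.2e} grid cells")  # ~5e10 cells
```

With tens of billions of cells, each updated thousands of times per simulated day and carrying dozens of prognostic variables, sustained multi-petaflop performance follows almost immediately, which is why the uniform-grid route stresses scalability so severely.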
Various kinds of nested grid methods closely related to this approach have been used in atmospheric modeling over the past 20 years, as illustrated in Figure 3-4. Most of these methods are based on one-way coupling, in which a solution computed on a coarser global grid is used to interpolate boundary conditions for a fixed, nested finer grid, with no feedback from the fine-grid solution to the coarse-grid evolution. The use of AMR requires a two-way coupling between coarser and finer meshes, which raises considerable theoretical and practical difficulties (Jablonowski, 2004), from fundamental issues of well-posedness (Oliger and Sundstrom, 1978) to an observed tendency of local refinement methods to produce unphysical precipitation at refinement boundaries. A successful attack on these issues should probably pursue both approaches: the development of (1) scalable implementations of uniform-grid methods aimed at the very highest performance and (2) a new generation of local refinement methods and codes for atmospheric, oceanic, and land modeling. Both efforts should be undertaken with the specific goal of improved models of the global and regional hydrologic cycle, including validation against detailed observational data.

Better Theory for and Quantification of Uncertainty

Atmospheric models are exceedingly complex owing to the multiple physical, chemical, and biological processes being represented, the numerical algorithms used to represent these processes individually, and the coupling of the individual processes at the level of both the mathematical models and the numerical methods. The propagation of uncertainty through a coupled model is particularly problematic, because nonlinear interactions can amplify the forced response of a system.
In addition, it is often the case that we are interested in bounding the uncertainties in predictions of extreme, and hence rare, events, requiring a rather different set of statistical tools than those to study means and variances of large ensembles. New systematic theories about multiscale, multiphysics couplings are needed to quantify relationships better. This will be important as atmospheric modeling results are coupled with economic and impact models. Building a better understanding of coupling and the quantification of uncertainties through coupled systems is necessary groundwork for supporting the decisions that will be made based on modeling results.
OCR for page 59
The Potential Impact of High-End Capability Computing on Four Illustrative Fields of Science and Engineering This understanding requires new approaches to error attribution and a better mathematical theory for complex model systems. The result would be models that are demonstrably more accurate. One aspect of the challenge is that error quantification and propagation have to span a dozen or more orders of magnitude just for the time component—ranging from timescales of 10−6 s for some chemical reactions in the atmosphere up to 1,000 years for the characteristic timescale of ocean circulation. If we understand the sources of uncertainty, we may be able to make models so good in particular areas that we are at the limits of what we can learn from observation. Conversely, where models are uncertain, we may be able to suggest observations or experiments that would significantly add to our knowledge of the climate system. Continued Development of HECC Infrastructure The atmospheric sciences have been relatively successful at developing community infrastructure for HECC. This is largely due to the programmatic requirements of operational weather prediction and, in climate modeling, of the IPCC assessment process. Components of this infrastructure include widely available and well-supported community codes with well-defined interfaces to many of the physical submodels that permit experimentation by the community; access to HECC computing systems that enable scientific investigation using simulation; and standards for archiving and sharing simulation data. As noted in the section “Computational Challenges in the Atmospheric Sciences,” valuable scientific and operational advances would be within reach of the community if additional HECC resources, with computing power up to 10 times greater than that in use, were made available. 
These advances, which would bring more valuable operational forecasts and better understanding of feedbacks and coupling in Earth’s climate, are achievable with the community’s current structures. Longer term, however, computational capabilities will need to increase 1,000- to 10,000-fold. Such an increase would not be possible with the current infrastructure. The increases in code complexity could exceed the capacity of the national centers for software development and support, while the growth in model complexity and the need for better resolution would greatly increase the need for computing cycles. Moreover, the data requirements before and after model simulation or prediction threaten to swamp the system. The volumes of data from simulations and from satellite and other observations are rapidly increasing, and the reanalysis of that data for the purpose of scientific investigations and policy analyses will stress the infrastructure for storing and accessing the data, necessitating a whole new generation of analytical tools. The solution to these problems lies in a combination of hardware, software, and human resources. The national centers and the federal agencies that fund them already have plans for increasing the capability of computing facilities to a petaflop and beyond, so that the main concern is that the atmospheric sciences community be able to access these resources. Some of the computational needs related to data management and analysis are already acute, and the scalability, algorithms, and software hurdles will become paramount when we begin to move to platforms with multiple processors per chip, whether or not the platform has petascale performance. Historically, the community has usually been successful in obtaining HECC resources for the more programmatic activities, but its track record is not perfect, and there must also be sufficient access for the research community. Software and human resources represent a more difficult problem. 
To overcome the challenges, it will be necessary to develop production-quality versions of a whole new generation of simulation codes, analysis codes, and middleware for managing the data, while at the same time maintaining and enhancing the current capabilities to meet evolving atmospheric science requirements. Such an effort will require the formation of focused development teams with close ties to the algorithm and software development communities on one side, and to the science users and observational programs on the other.
OCR for page 60
The Potential Impact of High-End Capability Computing on Four Illustrative Fields of Science and Engineering CONCLUSION: EARTH IN A COMPUTER This chapter has shown how the atmospheric sciences combine global observations from Earth’s surface and from space with scientific algorithms, software systems, and high-end computers to predict weather and seasonal variations and to simulate long-term climate change. The computer models and their predictions are the focus of a widely distributed effort to understand and resolve the complexity of the atmosphere and the Earth system and to foretell their future. From weather events in the next hour to climate variations in the next century, these computer predictions and simulations are vital to a wide range of public and private activities involving issues of economics, policy, and sustainability. A number of trends are rapidly increasing the demand for computational capability in the atmospheric sciences. First, the dramatic increases in the volume and resolution of observations from the surface and especially from space will necessitate computational power approaching that needed by the models themselves. Second, the incontrovertible evidence that forecasts improve with higher spatial resolution calls for geometric increases in computer capability. Third, the necessity of including and resolving the biogeochemical component of the Earth system in climate studies adds another dimension to the demand for computer power. These demands are not being met by existing computational resources. The steady progress over the past 50 years has been impressive—each increase in computer capability has produced meaningful improvement in the value of the forecasts and simulations and, in turn, posed new scientific and algorithmic challenges. This strong record of success suggests that a leap now to petascale computational power would make further significant progress possible. 
A virtual Earth system (VES) might run continuously in linked petascale machines, assimilating data from the tens of satellites in space and from nodes acquiring local data all over the world. VES4 would maintain a continuous, dynamically consistent portrait of the atmosphere, oceans, and land—it would be a digital mirror reflecting events all over the planet. The image in that digital mirror would serve as the foundation for predictive models to run continuously, predicting global-scale weather and global-scale seasonal change at unprecedented resolution and with remarkable verisimilitude. These global portraits, in turn, would be the evolving matrix in which nested and focused high-resolution models sharpen the forecasts of the atmospheric events that really matter to humans and their societies. The new dynamic record of Earth and the predictions of the VES model would bring forth an era of enlightened management of weather and climate risk, contributing to national economic vitality and stimulating a stronger commitment to environmental stewardship. The creation and operation of an accurate and reliable VES would be a stunning and commanding national achievement—a dramatic demonstration of the benefits that can be realized for society by linking Earth and atmospheric science with the most advanced computers. REFERENCES Cox, P.M., R.A. Betts, C.D. Jones, S.A. Spall, and I.J. Totterdell. 2000. Acceleration of global warming due to carbon-cycle feedbacks in a coupled climate model. Nature 208: 184-187. Dutton, John A. 2002. The Ceaseless Wind: An Introduction to the Theory of Atmospheric Motion. New York, N.Y.: Dover. Hack, James J., Julie M. Caron, G. Danabasoglu, Keith W. Oleson, Cecilia Bitz, and John E. Truesdale. 2006. CCSM−CAM3 climate simulation sensitivity to changes in horizontal resolution. Journal of Climate 19(11): 2267-2289. Intergovernmental Panel on Climate Change (IPCC). 2001. Third Assessment Report, Climate Change 2001. 
Cambridge, England: Cambridge University Press. 4 In a pleasant coincidence, these are the initials of the late Verner E. Suomi, the father of satellite meteorology, an incredibly innovative scientist, and a leader in creating effective communication systems to empower meteorologists to share meteorological information and advancing technology.
OCR for page 61
The Potential Impact of High-End Capability Computing on Four Illustrative Fields of Science and Engineering IPCC. 2007. Fourth Assessment Report, Climate Change 2007. Cambridge, England: Cambridge University Press. Jochum, Markus, Raghu Murtugudde, Raffaele Ferrari, and Paola Malanotte-Rizzoli. 2005. The impact of horizontal resolution on the tropical heat budget in an atlantic ocean model. Journal of Climate 18(6): 841-851. Kalnay, Eugenia. 2003. Atmospheric Modeling, Data Assimilation and Predictability. Cambridge, England: Cambridge University Press. Lorenz, E.N. 1982. Atmospheric predictability experiments with a large numerical model. Tellus 34: 505-513. Mesinger, Fedor, Geoff DiMego, Eugenia Kalnay, et al. 2006. North American regional reanalysis. Bulletin of the American Meteorological Society 87: 343-360. National Research Council (NRC). 2002. Abrupt Climate Change: Inevitable Surprises. Washington, D.C.: The National Academies Press. Palmer, T.N. 2006. Predictability of weather and climate: From theory to practice. In Predictability of Weather and Climate. Tim Palmer and Renate Hagaedorn, eds. Cambridge, England: Cambridge University Press. Platzman, George W. 1979. The ENIAC computations of 1950—Gateway to numerical weather prediction. Bulletin of the American Meteorological Society 60: 302-312. Rojas, Maisa. 2006. Multiply nested regional climate simulation for southern South America: Sensitivity to model resolution. Monthly Weather Review 134(8): 2208-2223. Stephens, G.L. 2005. Cloud feedbacks in the climate system: A critical review. Journal of Climate 18: 237-273. Teweles, Sidney, Jr., Herman B. Wobus. 1954. Verification of prognostic charts. Bulletin of the American Meteorological Society 35: 455-463. Thompson, S., B. Govindasamy, A. Mirin, K. Caldeira, C. Delire, J. Milovich, M. Wickett, and D.J. Erickson III. 2004. Quantifying the effects of CO2-fertilized vegetation on future global climate and carbon dynamics. Geophysical Research Letters 31. 
Vermote, E.F., N.Z. El Saleous, and C.O. Justice. 2002. Atmospheric correction of MODIS data in the visible to middle infrared—First results. Remote Sensing Environment 83: 97-111. Wilks, Daniel S. 2005. Statistical Methods in Atmospheric Science. New York, N.Y.: Academic Press.
OCR for page 62
The Potential Impact of High-End Capability Computing on Four Illustrative Fields of Science and Engineering This page intentionally left blank.