quent reprogramming of the major climate models to allow them to run on commodity equipment caused additional delays during which very little science could be undertaken.

The new CRAY T-90 vector supercomputer was generally considered to be overpriced. Many of the U.S. supercomputer users who would have preferred a Japanese vector machine turned instead to commodity microprocessor-based clusters from various vendors. Applications such as those at NCAR, which require high machine capability and broad memory access, were hampered by the small caches and slow interconnects of the commodity products. After a number of years of optimization effort, the efficiency of the NCAR applications taken as a whole is only 4.5 percent on a large system of 32-processor IBM Power 4 nodes and 5.7 percent on a large system of 4-processor IBM Power 3 nodes.7 Only recently, and with substantial U.S. development funding, has Cray Research successfully developed the X-1, a vector supercomputer comparable in power to those produced in Japan.
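The efficiency quoted here is the usual sustained-to-peak ratio: the fraction of the system's theoretical peak floating-point rate that the application actually sustains. A minimal sketch of that arithmetic follows; the node count, clock rate, and flops-per-cycle figures are illustrative assumptions, not the measured NCAR configuration.

```python
# Sketch of a sustained-to-peak efficiency calculation.
# All hardware parameters below are assumed values for illustration only.

def peak_flops(nodes, procs_per_node, flops_per_cycle, clock_hz):
    """Theoretical peak floating-point rate of the whole system."""
    return nodes * procs_per_node * flops_per_cycle * clock_hz

def efficiency(sustained_flops, peak):
    """Fraction of peak that the application actually achieves."""
    return sustained_flops / peak

# Hypothetical cluster of 100 nodes, 32 processors per node,
# 4 floating-point results per cycle, 1.3 GHz clock.
peak = peak_flops(nodes=100, procs_per_node=32, flops_per_cycle=4, clock_hz=1.3e9)
sustained = 0.045 * peak          # an application sustaining ~4.5 percent of peak
print(f"efficiency = {efficiency(sustained, peak):.1%}")
```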

Commodity-based systems are now increasingly used for weather simulations, since the problem has become one of capacity. The many independent simulations carried out in an ensemble study can each be performed on a relatively modest number of nodes, even on a commodity system. While efficiency is low, these systems seem to offer good cost/performance. However, custom systems are still needed for climate simulations, since climate studies require that a few scenarios be simulated over long time periods, and scientists prefer to study scenarios one at a time. Commodity systems cannot complete the computation of one scenario in a reasonable time. The same consideration applies to large fluid problems such as the long global ocean integrations with 10-km or finer horizontal grids that will be needed as part of climate simulations; such problems require the scalability and capability of large systems, which only hybrid or fully custom architectures can provide.
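The capacity-versus-capability distinction can be made concrete with a rough time-to-solution comparison. The sketch below uses invented workload sizes and parallel efficiencies; it only illustrates why independent ensemble members map well onto modest node counts, while a single long integration must be spread across the whole machine at the low efficiency a commodity interconnect allows.

```python
# Sketch: ensemble (capacity) work versus one long scenario (capability) work.
# All costs, node counts, and efficiencies are invented for illustration.

def time_to_solution(work_node_hours, nodes, parallel_efficiency):
    """Wall-clock hours for one run on `nodes` nodes at the given efficiency."""
    return work_node_hours / (nodes * parallel_efficiency)

scenario_cost = 200_000           # assumed node-hours for one long climate scenario
ensemble_members = 50             # assumed independent runs in an ensemble study
member_cost = scenario_cost / ensemble_members

# Ensemble: members are independent, so all 50 can run concurrently on
# 10 nodes each (500 nodes total), where commodity efficiency is tolerable.
ensemble_wallclock = time_to_solution(member_cost, nodes=10, parallel_efficiency=0.5)

# Single scenario: time steps are sequential, so the only way to finish sooner
# is to scale one run across all 500 nodes, where efficiency collapses.
scenario_wallclock = time_to_solution(scenario_cost, nodes=500, parallel_efficiency=0.05)

print(f"ensemble study finishes in ~{ensemble_wallclock:.0f} wall-clock hours")
print(f"single scenario needs ~{scenario_wallclock:.0f} wall-clock hours on the same cluster")
```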

1. Christopher M. Dumler. 1997. "Anti-dumping Laws Trash Supercomputer Competition." Cato Institute Briefing Paper No. 32. October 14.

2. Federal Register, vol. 62, no. 167, August 28, 1997, p. 45636.

3. See <http://www.scd.ucar.edu/info/itc.html>.

4. See <http://www.computingjapan.com/magazine/issues/1997/jun97/0697indnews.html>.

5. Ibid.

6. Bill Buzbee, Director of the Scientific Computing Division at NCAR during that antidumping investigation, argued in 1998 that the decision gave a significant computational advantage to all Earth system modelers outside the United States and that it would still be 1 to 2 years before U.S. commodity-based supercomputers were powerful enough to carry out the NCAR research simulations that could be done on the NEC system in 1996 (National Research Council, 1998, Capacity of U.S. Climate Modeling to Support Climate Change Assessment Activities, Washington, D.C.: National Academy Press).

7. The underlying hardware reasons for these numbers are discussed in an online presentation by Rich Loft of NCAR, available at <http://www.scd.ucar.edu/dir/CAS2K3/CAS2K3%20Presentations/Mon/loft.ppt>.


