Getting Up to Speed: The Future of Supercomputing (2005)

8
A Policy Framework

In this chapter the committee discusses a policy framework for government activities in supercomputing. It does so in general terms, without going into the specifics of current or proposed policies. Concrete government policies in supercomputing in areas such as acquisitions, research funding, and support of industrial R&D are discussed in Chapter 9.

The federal government has been involved in the development and advancement of supercomputing since the advent of computers. Although the mechanisms and levels of support have varied over time, there has been a long-standing federal commitment to encourage technical progress and the diffusion of high-performance computing systems. (Key aspects of this history are summarized in Chapter 3.) Effective policy must be premised on a clear understanding of the rationale for intervention and an analysis of how intervention might be tailored to adapt to a changing economic and technological environment. In the absence of a compelling rationale for intervention, economists are generally reluctant to see government intervene in highly competitive markets, where the costs of disruption to well-functioning and efficient private sector allocation mechanisms are likely to be high. However, there are two broad and widely accepted rationales for government involvement in supercomputing: (1) the government is the primary customer and (2) supercomputing technology is beneficial to the country as a whole.


THE GOVERNMENT AS THE LEADING USER AND PURCHASER OF SUPERCOMPUTER TECHNOLOGY

Much technological innovation is, at least initially, directed to applications dominated by government involvement and purchasing. Most notably, defense and national security needs have often been the specific setting in which new technologies—including supercomputing—were first developed and applied. Even when commercial firms are the locus of research and development for new technology, governments are often the largest single customer for the resulting innovations.

Government demand for advanced information technology—including supercomputers—is not static. Historically, government demand has been quite responsive to current technological capabilities. As technical progress over time relaxes a given set of constraints, key government supercomputer purchasers have not simply taken advantage of a fixed level of performance at a lower cost; instead they spur continuing technical progress by demanding ever higher levels of technical performance.

The use of supercomputing allows mission-oriented government agencies to achieve their objectives more effectively, with the consequence that the federal government has a strong interest in ensuring a healthy rate of technological progress within supercomputing. The U.S. government remains the single largest purchaser of supercomputers in the world, and most federal supercomputer procurement is justified by the requirements of missions like national security and climate modeling.

For example, the justification for the original ASCI program was to promote supercomputing technology not for its own sake but for the sake of ensuring confidence in the nuclear stockpile in the absence of nuclear testing. DOE tried to achieve this objective by two means: the aggressive procurement of supercomputers throughout the 1990s and the funding of the PathForward development program, which attempted to accelerate technical progress in the types of supercomputers used by the ASCI program.

Other defense and national security agencies have also been aggressive users of supercomputing technology. (See Chapter 4 for a description of specific applications.) For example, the timely calculation of areas of enemy territory where enemy radars are not able to spot our airplanes (such calculations were performed during the first Gulf war) can be crucial.1 Design and refurbishment of nuclear weapons depends critically on supercomputing calculations, as does the design of next-generation armament for the Army’s Future Combat System.

1  

William R. Swart. 1991. Keynote address. SC1991, Albuquerque, N.M., November 20.


It is likely that supercomputing will be increasingly important to homeland security. Examples include micrometeorology analysis to combat biological terrorism and computer forensic analysis in the wake of terrorist bombings. The federal government must be able to guarantee that such systems do what they are intended to do. Moreover, the government must ensure that supercomputers with capabilities that satisfy their needs are available to U.S. security agencies without hindrance, while other countries are prevented from achieving key capabilities in supercomputing. To achieve this balancing act, the relevant federal agencies and research laboratories must often be closely involved in critical aspects of supercomputing R&D, even when the research and development are carried out in the private sector.

As the social custodian of well-defined government missions and the largest and most aggressive customer for new technology related to these missions, the government has an incentive to ensure appropriate and effective funding for innovative supercomputing investments so as to guarantee that the technology progresses at a rate and in a direction that serve the missions.

SUPERCOMPUTER TECHNOLOGY INVESTMENTS AS PUBLIC GOODS

The public goods nature of supercomputer investment is a second broad rationale for government intervention. In contrast to purely private goods (such as hot dogs or pencils, which only one person owns and consumes), public goods are nonrival (many consumers can take advantage of the good without diminishing the ability of other consumers to enjoy it) and nonexcludable (suppliers cannot prevent some people from using the good while allowing others to do so). National defense is an important example of a public good. Even when national defense protects one person, it can still protect others (nonrival), and it cannot protect some people without also protecting others (nonexcludable).

When a market involves goods that are both nonrival and nonexcludable, innovators are unable to capture the full value of their inventions, so the incentive for an individual firm to undertake investment is less than the socially optimal level of incentive. In the absence of government intervention or coordinated action, the underinvestment problem tends to be most serious for basic research, fundamental scientific discoveries, technologies that serve as stepping-stones for follow-on research by others, and software.
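
To make the underinvestment logic concrete, a minimal sketch follows, assuming purely illustrative numbers (the project cost, the total social value, and the share of that value an innovator can appropriate are all hypothetical, not figures from this report): when spillovers are large, a project whose social value exceeds its cost can still fail the private investment test.

    # Minimal illustrative sketch of the underinvestment problem (all numbers hypothetical).
    def investment_decision(cost, social_value, appropriable_share):
        """Compare the private and social cases for funding an R&D project."""
        private_value = social_value * appropriable_share  # value the innovator can actually capture
        firm_invests = private_value > cost
        socially_worthwhile = social_value > cost
        return firm_invests, socially_worthwhile

    # A project that costs 100 and creates 150 in total value, but whose results are
    # largely nonexcludable, so the innovator captures only 60 percent of that value.
    firm, society = investment_decision(cost=100, social_value=150, appropriable_share=0.6)
    print(firm, society)  # False True: socially worthwhile research goes unfunded without intervention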

Both policymakers and economists have emphasized the public goods rationale for government intervention in areas like supercomputing technology. In large part, and as discussed in more detail in Chapter 3 (and elsewhere), a number of technologies and innovations first implemented in supercomputers played an important role in shaping the architecture and performance of mainstream computers today (from workstations to personal computers). Moreover, initiatives funded in the context of supercomputers have influenced the ability to commercialize innovations, from workstation architecture to the latest Intel processor. Algorithms and codes initially developed for supercomputers in areas such as computational fluid dynamics, solid modeling, or signal processing are now broadly used by industry. In addition, many of the most important applications of supercomputing technology, such as national security and climate modeling, are themselves public goods. Given these conditions, it is not surprising that both policymakers and economists have long justified investments in supercomputing technology on the basis of their status as public goods.

Several perceived shortcomings of the environment for supercomputing may reflect the public goods problem. For example, supercomputer users suffer from a lack of accessible and well-maintained software. Moreover, the development of better programming interfaces would greatly enhance productivity. While such initiatives would benefit all supercomputer users, no individual programmer or team has sufficient incentives to develop such complementary software and interface technologies. As illustrated by the more comprehensive, multi-institution approach to software development being attempted in recent projects such as the Earth System Modeling Framework, overcoming these deficiencies requires either government intervention to provide direct support for the development of these technologies or a mechanism for coordinated action across the groups involved in supercomputing technology.2

POTENTIAL COSTS OF GOVERNMENT INTERVENTION

Because the federal government is the main purchaser of supercomputing technology, and supercomputer hardware and software development is a public good, the federal government has played a leading and crucial role in the development and procurement of supercomputing technology. As discussed in Chapter 3, the federal government is not simply a passive consumer in these markets but has actively sought to influence

2  

Some people have also attempted to justify government intervention on the grounds of international competitiveness. According to this argument, government intervention can ensure that U.S. products are superior and thus benefit the U.S. economy. Most economists reject this type of argument, and the committee found no reason to endorse it for supercomputing.


the rate and precise direction of technological change, with consequences for both the supercomputer market and the evolution of computing more generally.

It is important to emphasize that federal intervention in a technologically dynamic industry can be costly and disruptive, substantially limiting the efficiency and incentives provided by competitive markets. Many economists question the desirability of government involvement in the development and commercialization of new technology; government intervention can often be a far from benign influence on the market for new technologies.3 First, attempts to promote standardization through procurement can result in inadequate diversity, reducing the degree of technological experimentation. Inadequate experimentation with a variety of new technologies can be particularly costly in areas like supercomputing, where much of the value of a given technology is realized only over time through user experience and learning. Second, individual firms and vendors supporting specific supercomputer architectures may attempt to exert political influence over the procurement process itself. When such rent seeking occurs, government purchasing decisions may be based on the political influence of a firm rather than on its ability to meet the needs of government agencies in terms of performance and cost.

Given that government intervention may come with substantial costs, it is important to consider the types of interventions that the government can undertake and some of the key trade-offs that policymakers might consider as they develop and implement policy towards supercomputing.

ALTERNATIVE MODES FOR GOVERNMENT INTERVENTION

Almost by definition, government intervention in the supercomputer industry influences the allocation of resources toward supercomputing technology. However, the government has wide latitude in choosing the form of its intervention, and each type of intervention has its own costs and benefits. In large part, the government’s optimal choice of intervention and involvement depends on the balance between the specific mission-oriented objectives of individual agencies and the broader goal of encouraging technological progress in supercomputing (and information technology more generally).

The government has two main avenues for increasing innovation in supercomputers. It can either provide incentives to the nongovernment sector or it can conduct the research itself.

3  

Linda Cohen and Roger Noll. 1991. The Technology Pork Barrel. Washington, D.C.: Brookings Institution Press.



Government Incentives

Government policy can provide broad encouragement to private industry to develop and commercialize supercomputing technology and affect the broader information technology marketplace. The government can influence the private sector by providing incentives for innovation and development investments, including grants or other subsidies, tax incentives, and intellectual property protection.

The government may subsidize private R&D activities. For example, a pervasive form of federal support for scientific and engineering research is grant and research contract programs, ranging from the peer-reviewed grant systems maintained by the National Science Foundation and other institutions to research contracts awarded by mission agencies such as DARPA. Such programs are particularly effective when the government would like to encourage basic research in specific areas but has limited information or knowledge about the precise nature of the outputs from research in that area. For example, grants and subsidies to the supercomputer center at the University of Illinois during the early 1990s were the principal form of support underlying the development of the Mosaic browser technology, an enormously beneficial innovation whose precise form, features, or impact could not have been forecast prior to its invention.4

Alternatively, R&D tax credits can provide important incentives for innovative investment. R&D tax credit programs provide direct incentives to private firms at a relatively low administrative burden.5

4  

A somewhat similar approach is for the government, a nonprofit organization, or even a private firm to offer a prize. This approach has been tried throughout history with mixed results. For example, in 1795, the French Directory offered a prize of 12,000 francs “to any Frenchman who could devise a method of ensuring fresh, wholesome food for his armies and navies.” The prize was awarded by Napoleon Bonaparte to Nicolas Appert, who invented a method for preservation by sealing foods in airtight bottles and immersing them in boiling water for varying periods, which led to modern-day canning. Sobel provides an extremely rich description of the deficiencies and politics of government-sponsored prizes in her history of a prize for longitude at sea (Dava Sobel, 1995, Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time, New York, N.Y.: Walker and Company). Recent examples of prizes range from the efforts by U.S. electrical companies to encourage research on a refrigerator that runs on 25 percent less electricity to the Ansari X Prize, which awarded $10 million to the first privately sponsored spacecraft to reach 100 km above Earth’s surface (www.xprize.org).

5  

The costs and delays in grant review are often cited as the reason private firms are unwilling to apply for government subsidy programs.


However, tax credit programs have often been criticized for subsidizing private research that would have taken place even in the absence of a tax credit program.6 Moreover, it is difficult to use tax credits to specifically encourage research in specialized technical areas such as supercomputing. While tax credit programs are an appropriate tool to achieve broad R&D investment objectives, they are often too blunt to influence the precise direction of technical advance.

Finally, the patent system provides an indirect incentive system to encourage the development and commercialization of new supercomputing technology. Underinvestment in research and development will occur if others can copy a new idea or invention. A patent for a new invention gives the inventor monopoly rights to the invention for a fixed period of time, currently 20 years, so that the inventor can capture a relatively large proportion of the gains from innovation.7 Unlike fixed subsidies, patents lead to distortions from monopoly pricing; however, a principal rationale for the patent system is that the short-run loss from high prices is (hopefully) more than compensated for by the enhanced incentives for innovative investment. Perhaps the chief benefit of the patent system is its inherent flexibility: Rather than having the government determine in advance the types of innovations and discoveries to be encouraged, the patent system provides a market-based incentive available across a wide range of technologies and industries. However, the influence of the patent system on innovation incentives is subtle, and there is an ongoing debate about its use, particularly in areas of science and technology that might also benefit from subsidies or other mechanisms.8

Each of these mechanisms provides incentives for innovation but places few restrictions on the ultimate output of the R&D investment or the use made of the technology and discoveries resulting from that investment. As such, these mechanisms will be inadequate when the government would like to maintain close control over the precise development of a technology or keep a given technology secret. When the government has clear technical objectives and an interest in maintaining precise control, it can staff and fund intramural research and even implement prototyping and development programs.

6  

B. Hall and J. van Reenen. 2000. “How Effective Are Fiscal Incentives for R&D? A New Review of the Evidence.” Research Policy 29(4-5):449-469.

7  

An alternative method whereby firms can avoid the problem of underinvestment is for the firms in an industry to engage in research joint ventures, where they agree to share the cost of development as well as the benefits. However, firms may fear that such joint research activity may lead to antitrust prosecutions. The National Cooperative Research Act of 1984 tried to reduce firms’ fears of antitrust penalties by lowering the damages a joint venture must pay if it is convicted of an antitrust violation. International joint ventures are increasingly common. For example, in 1992, Toshiba, IBM, and Siemens announced they would collaborate in developing advanced memory chips, and on the same day, Fujitsu and Advanced Micro Devices said they would jointly manufacture flash memories, which are used for data storage instead of disk drives. From April 1991 to July 1992, at least seven technology alliances to produce memory chips were formed between U.S. and Japanese firms.

8  

See, for example, Nancy Gallini and Suzanne Scotchmer, 2002, “Intellectual Property: When Is It the Best Incentive Mechanism?” Innovation Policy and the Economy, Adam B. Jaffe, Josh Lerner, and Scott Stern, eds., Cambridge, Mass.: MIT Press.



Government Research

Since the beginning of the computer era, national laboratories and other government agencies have conducted supercomputer research and, in some cases, been responsible for building individual machines. A key benefit of internal development is that the government can maintain extensive control over the evolution of the technology and, when needed, maintain a high level of secrecy for the technology. Maintaining such control may be important in those cases where the technology is being developed for very specific government missions within very narrow parameters and where secrecy and continued control over the technology are much more important than cost or the ability to build on a diverse set of already existing technologies. The degree of control and secrecy that are feasible even under internal development should not be overstated. Government employees can move to private industry (or even start their own companies), and as long as individual components or subsystems are being procured from the private sector, it is difficult to maintain complete secrecy over the technology choices and capabilities of large government projects.

Most important, large intramural technology development projects are likely to be extremely costly, relative to what could be achieved through procurement from the private sector. Indeed, while overall government science and technology expenditures are predominantly funded through grants and tax credits, a high share of supercomputer investment is implemented through procurement contracts with private firms. Under ideal conditions, procurement allows the government to acquire specific types of advanced technology while taking advantage of competition between firms on the basis of cost and performance. The government can indeed take advantage of these benefits when it is a relatively small player in an otherwise competitive market. For example, over the past two decades, the government has been able to take advantage of the rapid pace of technical advance and the high level of competition in the market for personal computers as it acquires desktop PCs for nearly all government functions.


However, reaping the benefits of competition through a procurement system is more challenging when the government is the principal (or even sole) demander and the development requires substantial sunk investments. In this case, procurement decisions themselves shape the degree of competition in the marketplace. For example, the government can choose to deal with one, two, or more firms in the market over time. By committing to one firm, the government may be able to encourage that firm to make large sunk investments to take advantage of economies of scale and also to maintain a relatively high level of secrecy. A single vendor captures a larger share of the benefits from innovation and from customizing the software to work well with the hardware than would either of two vendors. The single firm gains from economies of scale in producing more units. However, a single vendor will exercise market power, setting a price above marginal cost and hence reducing demand for its product. By dealing with several firms over time, the government makes the procurement environment more competitive, leading to greater technological diversity, greater technological experimentation, and less risk. The probability of discovering a superior technology may be greater if more independent groups are involved. Because the government buys or funds most supercomputer purchases, its approach to procurement largely determines the degree of competition in this industry.
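
The diversification argument in the preceding paragraph can be illustrated with a simple probability sketch; the per-vendor success probability used below is an arbitrary assumption, not an estimate, and independence between vendors is itself an idealization.

    # Illustrative sketch: funding more independent design efforts raises the chance
    # that at least one produces a superior technology (success probability is assumed).
    def prob_some_success(p_single, n_vendors):
        """Probability that at least one of n independent efforts succeeds."""
        return 1.0 - (1.0 - p_single) ** n_vendors

    for n in (1, 2, 3, 5):
        print(n, round(prob_some_success(0.3, n), 3))
    # 1 0.3, 2 0.51, 3 0.657, 5 0.832 -- better odds of a breakthrough, to be weighed
    # against the economies of scale and secrecy of committing to a single supplier.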

When a single government agency such as a single branch of the Department of Defense has faced this type of procurement environment, it has often used a “committed supplier” approach. When procuring technologies such as those for advanced fighter aircraft, the government chooses to engage (and commit to) a few firms (sometimes as few as two or three) over a relatively long horizon. By so doing, the government gives each firm a relatively strong incentive to make large investments, while maintaining at least a degree of flexibility and competition over time.9 At the broadest level, committing to several firms would probably be effective for supercomputing if there were a single coordinated approach to supercomputing procurement across the government. However, in contrast to the environment facing the Department of Defense, government procurement of supercomputing is dispersed across multiple government agencies and facilities, many of which are engaged in (at least tacit) competition with one another. Since technology changes rapidly, it is not possible to specify deliverables in detail beyond a short horizon. Therefore, contracts are short term.

9  

For a review of the literature related to government sourcing, see W.N. Washington, 1997, “A Review of the Literature: Competition Versus Sole Source Procurements,” Acquisition Review Quarterly 10:173-187.


In the current institutional structure, any given agency cannot commit to a long-term relation with a vendor. Doing so would require an accord and coordinated procurement across agencies and a different procurement model.

Finally, it is important to emphasize that government policy can, by itself, substantially limit the degree of competition available for future procurement. For example, if the U.S. government contracts with only one company, it virtually guarantees that there will be a monopoly (or at least a dominant firm) in the U.S. supercomputer market. In addition, by enacting trade barriers (see Box 8.1), the government may benefit a small number of domestic firms at the expense of government agencies and other consumers of supercomputers in the future, who may have to bear much higher prices or make do with inferior equipment.

COMPETING GOVERNMENT OBJECTIVES

Overall, optimal government policy toward supercomputing must therefore balance competing objectives, including serving the requirements of mission-oriented agencies and encouraging technological progress more broadly. As a practical matter, these objectives are balanced through the procurement process, which is discussed in detail in Chapter 9. In managing the procurement process, the government faces three key trade-offs: coordination versus diversification, commitment versus flexibility, and secrecy versus spillovers.

Coordination Versus Diversification

Government agencies can coordinate or they can act independently, obtaining diverse individual solutions. By coordinating (e.g., buying the same equipment and using common software), the agencies benefit from economies of scale. However, individual agencies would not necessarily obtain the best solution for their individual needs. A central planner (supercomputer czar) would be more likely to obtain the benefits of coordination at the expense of not fully satisfying the diverse needs of individual agencies. On the other hand, if each individual agency makes independent decisions, it probably will forgo the benefits from coordination (local versus global maximization).

Commitment Versus Flexibility

The government may commit or maintain flexibility. For example, the government may commit to a particular vendor (a particular piece of hardware or software) or a particular approach (parallel versus vector machines) for a given period of time. By so doing, it would benefit from economies of scale and leverage investments by others. However, it would lose flexibility. If the government backs the “wrong horse,” the rate of future advances might be slowed.



At least in part, the trade-off between commitment and flexibility reflects mandates to maintain a procurement process with high integrity. The government intentionally layers the procurement process with enormous amounts of auditing (and other legal constraints) in order to eliminate corruption. While such mandates serve the purpose of avoiding favoritism, they inevitably slow down the process of acquiring a new system, adopting frontier technology, and coordinating across different bidding processes.10 They may also make it harder to weigh intangibles such as a good, continued relation between government and a vendor.

Secrecy Versus Spillovers

Because the government has many missions that depend on secrecy, such as code breaking and weapons development, it often sacrifices spillover benefits. A national defense agency may develop superior hardware or software that would benefit other government agencies or other users around the world by allowing them to avoid “reinventing the wheel.” However, to maintain secrecy for reasons of national security, the government does not share these innovations. Obviously there are many cases where secrecy is paramount, but there may be many cases at the margin, where the cost of reducing secrecy (at least to the degree of allowing government agencies to share information) would be justified by the spillover benefits to others.

Secrecy also reduces spillovers in the reverse direction. If much of the research on certain forms of supercomputing is done in a classified environment, then one creates two distinct supercomputing research communities: an academic one that is open to foreigners and a classified one. The two communities have a limited ability to interact, thus reducing the inflow of people and research ideas from universities to classified supercomputing. Such a separation is more harmful in areas where technology changes rapidly.

Overall, managing each of these trade-offs requires a detailed understanding of the specific needs and requirements of different agencies and institutions, as well as the environment and infrastructure in which supercomputing technology will be developed and deployed. It requires a clear understanding of the critical points of control. For example, there is no practical way to prevent foreign countries from assembling powerful clusters out of commodity components, but it is practical to restrict access to critical application codes.

10  

See, for example, Steven Kelman, 1990, Procurement and Public Management: The Fear of Discretion and the Quality of Government Performance, Washington, D.C.: American Enterprise Institute Press; Shane Greenstein, 1993, “Procedural Rules and Procurement Regulations: Complexity Creates Trade-offs,” Journal of Law, Economics, and Organizations, pp. 159-180.



BOX 8.1
Trade Policies

Several U.S. government policies affect international trade, such as antidumping laws, subsidies for sales in third markets, restrictions on imports (quotas or tariffs, where allowed under international agreements), and restrictions on exports. Using these policies, the United States has effectively banned Japanese supercomputers from the U.S. supercomputer market. The events leading up to this ban follow.

As summarized in Chapter 3, Japanese firms started manufacturing high-performance vector machines in the early 1980s. By the late 1980s, using vector designs based on high-performance custom processor chips, these manufacturers posed a substantial competitive threat to U.S. producers. They benefited substantially from procurement by the Japanese government and the educational system and also received direct government subsidies for related research and development. It has also been alleged that large Japanese private customers that received substantial government funding were under pressure to buy Japanese supercomputers. The U.S. government pressured Japan to open its markets. In 1996, NEC developed the SX-4, a fast and relatively inexpensive CMOS-based vector supercomputer.

On May 17, 1996, the federally funded University Corporation for Atmospheric Research (UCAR) decided to lease a supercomputer made by a Japanese company, the first such decision by a public entity.1 It awarded a $35 million, 5-year leasing contract for a supercomputer to the U.S.-based integrator company Federal Computer Corporation (FCC), which had outbid two other finalists for the contract—Fujitsu America, Inc., and Cray Research of Eagan, Minnesota—to supply a supercomputer to the National Center for Atmospheric Research (NCAR) for modeling weather and climate. The heart of FCC’s proposal was four NEC SX-4 machines, to be provided by HNSX Supercomputing, the U.S.-based subsidiary of NEC. Within 2 months, a domestic firm, SGI/Cray Research, which had submitted a bid to UCAR, filed an antidumping complaint.


In 1997, the International Trade Administration (ITA) of the Department of Commerce determined in “Notice of Final Determination of Sales at Less Than Fair Value: Vector Supercomputers from Japan” (A-588-841) that vector supercomputers from Japan were being sold in the United States at less than fair value. In its determination,2 the ITA concluded that dumping had occurred and calculated dumping margins using relatively indirect evidence:

Manufacturer/producer exporter    Margin percentage
Fujitsu Ltd.                      173.08
NEC                               454.00
All others                        313.54

On September 26, 1997, a second U.S. agency, the International Trade Commission, made the dumping charge final with its determination that Cray Research had suffered material injury, even though NCAR argued that the hardware Cray proposed did not meet its minimum specifications.3

The punitive tariffs of between 173 percent and 454 percent on all supercomputers imported from Japan established a barrier so high that it effectively prevented imports and excluded Japanese supercomputers from the U.S. market.4 NEC and Fujitsu were, however, able to sell many supercomputers outside the United States.

NEC filed suit with the Court of International Trade (CIT) seeking suspension of the antidumping investigation. The suit, which was unsuccessful, alleged that the U.S. actions were politically motivated and reported that, prior to its findings, the Department of Commerce had arranged five meetings between it and government agencies, meetings that were attended by high-ranking officials.5

On May 3, 2001, the Commerce Department revoked the duties on vector supercomputers made by NEC and Fujitsu Ltd., retroactive to October 1, 2000. Ironically, Cray requested this action as part of Cray’s distribution and service agreement with NEC, whereby Cray became the exclusive distributor of NEC’s vector supercomputers in North America and a nonexclusive distributor in the rest of the world other than certain accounts in France. However, it has not yet sold any NEC SX-6 machines in the United States.

This U.S. policy has had adverse effects on U.S. scientific computing. For example, as a consequence of the initial CIT action, NCAR was unable to upgrade its supercomputing capability for almost 2 years and suffered a serious delay in research.6 In addition, because the NCAR climate codes were heavily oriented toward a vector architecture-based supercomputer, they could easily have been ported to the powerful NEC system.


Subsequent reprogramming of the major climate models to allow them to run on commodity equipment caused additional delays during which very little science could be undertaken.

The new Cray T-90 vector supercomputer was generally considered to be overpriced. Many of the U.S. supercomputer users that would have preferred a Japanese vector machine turned instead to commodity microprocessor-based clusters from various vendors. Applications such as those at NCAR, which require high machine capability and broad memory access, were hampered by the small caches and slow interconnects of the commodity products. After a number of years of optimization efforts, the efficiency of the NCAR applications taken as a whole is only 4.5 percent on a large system of 32-processor IBM Power 4 nodes and 5.7 percent on a large system of 4-processor IBM Power 3 nodes.7 Only recently, and with substantial U.S. development funding, has Cray Research successfully developed the X-1, a vector supercomputer comparable in power to those produced in Japan.
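
The efficiency figures above are ratios of sustained to peak floating-point performance. The sketch below shows that arithmetic with invented node parameters chosen only so the result lands near 4.5 percent; it does not describe the actual NCAR systems.

    # Sketch of the sustained/peak efficiency calculation (node parameters are invented).
    def efficiency(sustained_gflops, peak_gflops):
        return sustained_gflops / peak_gflops

    peak_gflops = 32 * 5.2      # hypothetical 32-processor node, 5.2 Gflops peak per processor
    sustained_gflops = 7.5      # hypothetical sustained rate of the application suite on that node
    print(f"{100 * efficiency(sustained_gflops, peak_gflops):.1f}%")  # ~4.5% of peak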

Commodity-based systems are now increasingly used for weather simulations, since the problem has become one of capacity. Many independent simulations are carried out in an ensemble study, and each can now be performed on a relatively modest number of nodes, even on a commodity system. While efficiency is low, these systems seem to offer good cost/performance. However, custom systems are still needed for climate simulations, since climate studies require that a few scenarios be simulated over long time periods, and scientists prefer to study scenarios one at a time. Commodity systems cannot complete the computation of one scenario in a reasonable time. The same consideration applies to large fluid problems such as the long global ocean integrations with 10-km or finer horizontal grids that will be needed as part of climate simulations—such problems require the scalability and capability of large systems that can only be provided by hybrid or fully custom architectures.
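
A small timing sketch, using assumed run counts and simulation rates, illustrates why ensemble (capacity) workloads fit commodity clusters while a single long climate scenario is a capability problem: independent ensemble members can be spread across many modest partitions, but one long integration finishes only as fast as a single run can be driven.

    import math

    # Capacity: independent ensemble members run side by side on separate partitions.
    def ensemble_wallclock_hours(n_runs, hours_per_run, partitions):
        return math.ceil(n_runs / partitions) * hours_per_run

    # Capability: one long scenario is limited by how many simulated years a single run covers per day.
    def scenario_wallclock_days(simulated_years, simulated_years_per_day):
        return simulated_years / simulated_years_per_day

    print(ensemble_wallclock_hours(n_runs=50, hours_per_run=24, partitions=50))  # 24 hours in total
    print(scenario_wallclock_days(1000, simulated_years_per_day=2))              # 500 days at a slow run rate
    print(scenario_wallclock_days(1000, simulated_years_per_day=10))             # 100 days on a more capable system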

1  

Christopher M. Dumler. 1997. “Anti-dumping Laws Trash Supercomputer Competition.” Cato Institute Briefing Paper No. 32. October 14.

2  

Federal Register, vol. 62, no. 167, August 28, 1997:45636.

3  

See <http://www.scd.ucar.edu/info/itc.html>.

4  

See <http://www.computingjapan.com/magazine/issues/1997/jun97/0697indnews.html>.

5  

Ibid.

6  

Bill Buzbee, Director of the Scientific Computing Division at NCAR during that antidumping investigation, argued in 1998 that the decision gave a significant computational advantage to all Earth system modelers outside the United States and that it would still be 1 to 2 years before U.S. commodity-based supercomputers were powerful enough to carry out the NCAR research simulations that could be done on the NEC system in 1996 (National Research Council, 1998, Capacity of U.S. Climate Modeling to Support Climate Change Assessment Activities, Washington, D.C.: National Academy Press).

7  

The underlying hardware reasons for these numbers are discussed in an online presentation by Rich Loft of NCAR, available at <http://www.scd.ucar.edu/dir/CAS2K3/CAS2K3%20Presentations/Mon/loft.ppt>.
