OCR for page 85
4 The Organization of Federal Support: A Historical Review

Rather than a single, overarching framework of support, federal funding for research in computing has been managed by a set of agencies and offices that carry the legacies of the historical periods in which they were created. Crises such as World War II, Korea, Sputnik, Vietnam, the oil shocks, and concerns over national competitiveness have all instigated new modes of government support. Los Alamos National Laboratory, for example, a leader in supercomputing, was created by the Manhattan Project and became part of the Department of Energy. The Office of Naval Research and the National Science Foundation emerged in the wake of World War II to continue the successful contributions of wartime science. The Defense Advanced Research Projects Agency (DARPA) and the National Aeronautics and Space Administration (NASA) are products of the Cold War, created in response to the launch of Sputnik to regain the nation's technological leadership. The National Bureau of Standards, an older agency, was transformed into the National Institute of Standards and Technology in response to recent concerns about national competitiveness. Each organization's style, mission, and importance have changed over time; yet each organization profoundly reflects the process of its development, and the overall landscape is the result of numerous layers of history. Understanding these layers is crucial for discussing the role of the federal government in computing research. This chapter briefly sets out a history of the federal government's programmatic involvement in computing research since 1945, distinguishing the various layers in the historical eras in which they were first formed. The objective is to identify the changing role the government has played in these different historical periods, discuss the changing political and technological environment in which federal organizations have acted, and draw attention to the multiplicity, diversity, and flexibility of public-sector programs that have stimulated and underwritten the continuing stream of U.S. research in computing and communications since World War II. In fulfilling this charge, the chapter reviews a number of prominent federal research programs that exerted profound influence on the evolving computing industry. These programs are illustrative of the effects of federal funding on the industry at different times. Other programs, too numerous to describe in this chapter, undoubtedly played key roles in the history of the computing industry but are not considered here.

1945-1960: Era of Government Computers

In late 1945, just a few weeks after atomic bombs ended World War II and thrust the world into the nuclear age, digital electronic computers began to whir. The ENIAC (Electronic Numerical Integrator and Computer), built at the University of Pennsylvania and funded by the Army Ballistic Research Laboratory, was America's first such machine. The following 15 years saw electronic computing grow from a laboratory technology into a routine, useful one. Computing hardware moved from the ungainly and delicate world of vacuum tubes and paper tape to the reliable and efficient world of transistors and magnetic storage. The 1950s saw the development of key technical underpinnings for widespread computing: cheap and reliable transistors available in large quantities, rotating magnetic drum and disk storage, magnetic core memory, and beginning work in semiconductor packaging and miniaturization, particularly for missiles.
In telecommunications, American Telephone and Telegraph (AT&T) introduced nationwide dialing and the first electronic switching systems at the end of the decade. A fledgling commercial computer industry emerged, led by International Business Machines (IBM), which built its electronic computer capability internally, and Remington Rand (later Sperry Rand), which purchased Eckert-Mauchly Computer Corporation in 1950 and Engineering Research Associates in 1952. Other important participants included Bendix, Burroughs, General Electric (GE), Honeywell, Philco, Raytheon, and the Radio Corporation of America (RCA). In computing, however, the technical cutting edge was usually pushed forward in government facilities, at government-funded research centers, or at private contractors doing government work. Government funding accounted for roughly three-quarters of total funding in the computer field.
A survey performed by the Army Ballistics Research Laboratory in 1957, 1959, and 1961 lists every electronic stored-program computer in use in the country (the very possibility of compiling such a list says a great deal about the community of computing at the time). The surveys reveal the large proportion of machines in use for government purposes, either by federal contractors or in government facilities (Weik, 1955, pp. 57-61; Flamm, 1988).

The Government's Early Role

Before 1960, government—as a funder and as a customer—dominated electronic computing. Federal support had no broad, coherent approach, however, arising somewhat ad hoc in individual federal agencies. The period was one of experimentation, both with the technology itself and with diverse mechanisms for federal support. From the panoply of solutions, distinct successes and failures can be discerned, from both scientific and economic points of view. After 1960, computing was more prominently recognized as an issue for federal policy. The National Science Foundation and the National Academy of Sciences issued surveys and reports on the field. If government was the main driver for computing research and development (R&D) during this period, the main driver for government was the defense needs of the Cold War. Events such as the explosion of a Soviet atomic bomb in 1949 and the Korean War in the 1950s heightened international tensions and spurred critical defense applications, especially command-and-control and weapons design. It is worth noting, however, that such forces did not exert a strong influence on telecommunications, an area in which most R&D was performed within AT&T for civilian purposes. Long-distance transmission remained analog, although digital systems were in development at AT&T's Bell Laboratories. Still, the newly emergent field of semiconductors was largely supported by defense in its early years.
During the 1950s, the Department of Defense (DOD) supported about 25 percent of transistor research at Bell Laboratories (Flamm, 1988, p. 16; Misa, 1985). However much the Cold War generated computer funding, during the 1950s dollars and scale remained relatively small compared to other fields, such as aerospace applications, missile programs, and the Navy's Polaris program (although many of these programs had significant computing components, especially for operations research and advanced management techniques). By 1950, government investment in computing amounted to $15 million to $20 million per year. All of the major computer companies during the 1950s had significant components of their R&D supported by government contracts of some
type. At IBM, for example, federal contracts supported more than half of R&D through the 1950s and about 35 percent as late as 1963 (only in the late 1960s did this proportion of support trail off significantly, although absolute amounts still increased). The federal government supported projects and ideas the private sector would not fund, whether for national security, to build up human capital, or to explore the capabilities of a complex, expensive technology whose long-term impact and use were uncertain. Many federally supported projects put in place prototype hardware on which researchers could do exploratory work.

Establishment of Organizations

The successful development projects of World War II, particularly radar and the atomic bomb, left policymakers asking how to maintain the technological momentum in peacetime. Numerous new government organizations arose, attempting to sustain the creative atmosphere of the famous wartime research projects and to enhance national leadership in science and technology. Despite Vannevar Bush's efforts to establish a new national research foundation to support research in the nation's universities, political difficulties prevented the bill from passing until 1950, and the National Science Foundation (NSF) did not become a significant player in computing until later in that decade. During the 15 years immediately after World War II, research in computing and communications was supported by mission agencies of the federal government, such as DOD, the Department of Energy (DOE), and NASA. In retrospect, it seems that the nation was experimenting with different models for supporting this intriguing new technology, which required a subtle mix of scientific and engineering skill.

Military Research Offices

Continuity in basic science was provided primarily by the Office of Naval Research (ONR), created in 1946 explicitly to perpetuate the contributions scientists made to military problems during World War II.
In computing, the agency took a variety of approaches simultaneously. First, it supported basic intellectual and mathematical work, particularly in numerical analysis. These projects proved instrumental in establishing a sound mathematical basis for computer design and computer processing. Second, ONR supported intellectual infrastructure in the infant field of computing, sponsoring conferences and publications for information dissemination. Members of ONR participated in founding the Association for Computing Machinery in 1947. ONR's third approach to computing was to sponsor machine design
and construction. It ordered a computer for missile testing through the National Bureau of Standards from Raytheon, which became known as the Raydac machine, installed in 1952 (Rees, 1982). ONR supported Whirlwind, MIT's first digital computer and the progenitor of real-time command-and-control systems (Redmond and Smith, 1980). John von Neumann built a machine, known as the IAS computer, at Princeton's Institute for Advanced Study with support from ONR and other agencies (Goldstine, 1972; Rees, 1982). The project produced significant advances in computer architecture, and the design was widely copied by both government and industrial organizations. Other military services created offices on a model similar to that of ONR. The Air Force Office of Scientific Research was established in 1950 to manage U.S. Air Force R&D activities. Similarly, the U.S. Army established the Army Research Office to manage and promote Army programs in science and technology.

National Bureau of Standards

Arising out of its role as arbiter of weights and measures, the National Bureau of Standards (NBS) had long maintained its own laboratories and technical expertise and had long served as a technical advisor to other government agencies. In the immediate postwar years, NBS sought to expand its advisory role and help U.S. industry develop wartime technology for commercial purposes. NBS, through its National Applied Mathematics Laboratory, acted as a kind of expert agent for other government agencies, selecting suppliers and overseeing construction and delivery of new computers. For example, NBS contracted for the three initial Univac machines—the first commercial, electronic, digital, stored-program computers—one for the Census Bureau and two for the Air Materiel Command. NBS also got into the business of building machines. When the Univac order was plagued by technical delays, NBS built its own computer in-house.
The Standards Eastern Automatic Computer (SEAC) was built for the Air Force and dedicated in 1950, the first operational, electronic, stored-program computer in this country. NBS built a similar machine, the Standards Western Automatic Computer (SWAC), for the Navy on the West Coast (Huskey, 1980). Numerous problems were run on SEAC, and the computer also served as a central facility for diffusing expertise in programming to other government agencies. Despite this significant hardware, however, NBS's bid to be a government center for computing expertise ended in the mid-1950s. Caught up in postwar debates over science policy and a controversy over battery additives, NBS research
funding was radically reduced, and NBS lost its momentum in the field of computing (Akera, 1996).

Atomic Energy Commission

Nuclear weapons design and research have from the beginning provided impetus to advances in large-scale computation. The first atomic bombs were designed only with desktop calculators and punched-card equipment, but continued work on nuclear weapons provided some of the earliest applications for the new electronic machines as they evolved. The first computation job run on the ENIAC in 1945 was an early calculation for the hydrogen bomb project "Super." In the late 1940s, the Los Alamos National Laboratory built its own computer, MANIAC, based on von Neumann's design for the Institute for Advanced Study computer at Princeton, and the Atomic Energy Commission (AEC) funded similar machines at Argonne National Laboratory and Oak Ridge National Laboratory (Seidel, 1996; Goldstine, 1980). In addition to building their own computers, the AEC laboratories were significant customers for supercomputers. The demand created by AEC laboratories for computing power gave companies an incentive to design more powerful computers with novel architectures. In the early 1950s, IBM built its 701, the Defense Calculator, partly with the assurance that Los Alamos and Livermore would each buy at least one. In 1955, the AEC laboratory at Livermore, California, commissioned Remington Rand to design and build the Livermore Automatic Research Computer (LARC), the first supercomputer. The mere specification for LARC advanced the state of the art, as the bidding competition required the use of transistors instead of vacuum tubes (MacKenzie, 1991). IBM developed improved ferrite-core memories and supercomputer designs with funding from the National Security Agency, and designed and built the Stretch supercomputer for the Los Alamos Scientific Laboratory, beginning it in 1956 and installing it in 1961. Seven more Stretch supercomputers were built.
Half of the Stretch supercomputers sold were used for nuclear weapon research and design (Pugh, 1995, pp. 222-223). The AEC continued to specify and buy newer and faster supercomputers, including the Control Data 6600, the STAR 100, and the Cray 1 (although developed without AEC funds), practically ensuring a market for continued advancements (Pugh, 1995, p. 192). AEC and DOE laboratories also developed much of the software used in high-performance computing, including operating systems, numerical analysis software, and matrix evaluation routines (Flamm, 1987, p. 82). In addition to stimulating R&D in industry, the AEC laboratories also developed a large talent pool on which the computer industry and academia could draw. In fact,
the head of IBM's Applied Science Department, Cuthbert Hurd, came directly to IBM in 1949 from the AEC's Oak Ridge National Laboratory (Hurd, 1994). Physicists worked on national security problems, and government support provided demand, specifications, and technical input, as well as dollars, for industry to make significant advances in computing technology.

Private Organizations

Not all the new organizations created by the government to support computing were public. A number of new private organizations also sprang up, with innovative charters, government encouragement, and prospects of initial funding support. In 1951, at the request of the Air Force, the Massachusetts Institute of Technology (MIT) created Project Lincoln, now known as the Lincoln Laboratory, with a broad charter to study problems in air defense to protect the nation from nuclear attack. The Lincoln Laboratory then oversaw the construction of the Semi-Automatic Ground Environment (SAGE) air-defense system (Box 4.1) (Bashe et al., 1986, p. 262). In 1946, the Air Force and Douglas Aircraft created a joint venture, Project RAND, to study intercontinental warfare. In the following year RAND separated from Douglas and became the independent, nonprofit RAND Corporation. RAND worked only for the Air Force until 1956, when it began to diversify to other defense and defense-related clients, such as the Advanced Research Projects Agency and the Atomic Energy Commission, and provided, for a time, what one researcher called "in some sense the world's largest installation for scientific computing [in 1950]."1 RAND specialized in developing computer systems, such as the Johnniac, based on the IAS computer, which made RAND the logical source for the programming on SAGE. While working on SAGE, RAND trained hundreds of programmers, eventually leading to the spin-off of RAND's Systems Development Division and Systems Training Program into the Systems Development Corporation.
Computers made a major impact on the systems analysis and game-theoretic approaches that RAND and other similar think tanks used in attempts to model nuclear and conventional warfighting strategies. Engineering Research Associates (ERA) represented yet another form of government support: the private contractor growing out of a single government agency. With ERA, the Navy effectively privatized its wartime cryptography organization and was able to maintain civilian expertise through the radical postwar demobilization. ERA was founded in St. Paul, Minnesota, in January 1946 by two engineers who had done cryptography for the Navy and their business partners (Cohen and Tomash, 1979).
BOX 4.1 Project Whirlwind and SAGE

Two closely connected computing projects, Whirlwind and SAGE, demonstrate the influence of federal research and development programs during the early days of computing. They not only generated technical knowledge and human resources, but they also forged a unique relationship among government, universities, and industry. The Whirlwind computer was originally intended to be part of a general-purpose flight simulator, but it evolved into the first real-time, general-purpose digital computer. SAGE, an air-defense system designed to protect against enemy bombers, made several important contributions to computing in areas as diverse as computer graphics, time-sharing, digital communications, and ferrite-core memories. Together, these two projects shared a symbiotic relationship that strengthened the early computer industry. Whirlwind originated in 1944 as part of the Navy's Airplane Stability and Control Analyzer (ASCA) project. At that time, the Navy made extensive use of flight simulators to test new aircraft designs and train pilots; however, each new aircraft design required a separate, specially built computer. ASCA was intended to eliminate the need to build individual computers for the flight simulators by serving as a general-purpose simulator that could emulate any design programmed into it. Jay Forrester, the leader of the computer portion of the ASCA project, soon recognized that analog computers (which were typically used in aircraft simulators) would not be fast enough to operate the trainer in real time. Learning of work on electronic digital computing as part of ENIAC at the University of Pennsylvania, Forrester began investigating the potential of real-time digital computers for Whirlwind.
By early 1946, Forrester decided to pursue the digital route, expanding the goal of the Whirlwind program from building a generalizable aircraft simulator to designing a real-time, general-purpose digital computer that could serve many functions other than flight simulation. Pursuing a digital computer required dramatic increases in computing speeds and reliability, both of which hinged on development of improved computer memory—an innovation that was also needed to handle large amounts of data about incoming airplanes. Mercury delay-line memories, which used sonic pulses to record information and were being pursued by several other research centers, were too slow for the machine Forrester envisioned. He decided instead to use electrostatic storage tubes in which bits of information could be stored as an electrical charge and which claimed read-and-write times of a few milliseconds. Such tubes proved to be expensive, limited in storage capacity, and unreliable. Looking for a new memory alternative, Forrester came across a new magnetic ceramic called Deltamax and began working on the first magnetic core memory, a project to which he later assigned a graduate student, Bill Papian. The expansion of Whirlwind's technical objectives resulted in expanding project budgets that eventually undermined support for the project. Forrester originally planned Whirlwind as a 2-year, $875,000 program, but he increased his cost estimate for the Whirlwind computer itself to $1.9 million in March 1946 and to almost $3 million by 1947 (Campbell-Kelly and Aspray, 1996, pp. 161-163). By 1949, Whirlwind made up nearly 65 percent of the Office of Naval Research (ONR) mathematics research budget and almost 10 percent of ONR's entire contract research
budget (Edwards, 1996, p. 79). As part of a general Department of Defense initiative to centralize computer research in 1951, ONR planned to reduce Whirlwind's annual budget from $1.15 million to $250,000, threatening the viability of the project (Edwards, 1996, p. 91). Support for the project was salvaged only after George Valley, Jr., a professor of physics at the Massachusetts Institute of Technology (MIT) and chairman of the Air Defense Systems Engineering Committee, realized that Whirlwind might play a critical role in a new air-defense program, SAGE, and convinced the Air Force to provide additional funding for the project, thereby adding to its credibility. In 1949, Valley began lobbying the Air Force to improve U.S. air-defense capability in the face of the nation's growing vulnerability to Soviet bombers (Freeman, 1995, p. 2). Valley was put in charge of the Air Defense Systems Engineering Committee to investigate possible solutions. The resulting Project Charles summer study group recommended that the Air Force ask MIT to build a laboratory to carry out the experimental and field research necessary to develop a system to safeguard the United States (Freeman, 1995, p. 6). In response, MIT created Project Lincoln, now known as Lincoln Laboratory, to develop the Semi-Automatic Ground Environment, or SAGE, system. Through SAGE, the Air Force became the major sponsor of Whirlwind, enabling the project to move toward completion. By late 1951, a prototype ferrite-core memory system was demonstrated, and by 1953, Whirlwind's entire memory was replaced with core memory boasting a 9-microsecond access time, effectively ending the research phase of the program. The Air Force subsequently purchased production versions of the computer (designed in a cooperative effort between MIT and IBM) to equip each of its 23 Direction Centers.
Each center had two IBM-manufactured versions of Whirlwind: one operating live and one in standby mode for additional reliability. The machines accepted input from over 100 different information sources (typically ground, air, and seaborne radars) and displayed relevant information on cathode-ray-tube displays for operators to track and identify aircraft. The first SAGE Direction Center was activated in 1958, and deployment continued until 1963, when the final deployment of 23 centers was completed at an estimated cost of $8 billion to $12 billion. Although a technical success, SAGE was already outdated by the time of its completion. The launch of Sputnik shifted the most feared military threat to the United States from long-range bombers to intercontinental ballistic missiles. SAGE command centers continued to operate into the middle of the 1980s, but with reduced urgency. All told, ONR spent roughly $3.6 million on Whirlwind; the Air Force, $13.8 million. In return, Whirlwind and SAGE generated a score of innovations. On the hardware side, Whirlwind and SAGE pioneered magnetic-core memory, digital phone-line transmission and modems, the light pen (one of the first graphical user interfaces), and duplexed computers. In software, they pioneered the use of real-time software; concepts that later evolved into assemblers, compilers, and interpreters; software diagnosis programs; time-shared operating systems; structured program modules; table-driven software; and data description techniques. Five years after its introduction in Whirlwind, ferrite-core memory replaced every other type of computer memory, and it remained the dominant form of computer memory until 1973. Royalties to MIT from nongovernment sales amounted to $25 million, as MIT licensed the technology broadly.1 In addition, SAGE accelerated the transfer of these technologies throughout the nascent computer industry. While Lincoln Laboratory was given primary responsibility for SAGE, the project also involved several private firms, such as IBM, RAND, Systems Development Corporation (the spin-off from RAND), Burroughs, Western Electric, RCA, and AT&T.2 Through this complex relationship among academia, industry, and the military, SAGE technologies worked their way into commercial products and helped establish the industry leaders. SAGE was a driving force behind the formation of the American computer and electronics industry (Freeman, 1995, p. 33). IBM built 56 computers for SAGE, earning over $500 million, which helped contribute to its becoming the world's largest computer manufacturer (Edwards, 1996, pp. 101-102; Freeman, 1995, p. 33). At its peak, between 7,000 and 8,000 IBM employees worked on the project. SAGE technology contributed substantially to the SABRE airline reservation system marketed by IBM in 1964, which later became the backbone of the airline industry (Edwards, 1996, p. 102). Kenneth Olsen, who worked on Whirlwind before founding Digital Equipment Corporation, called Whirlwind the first minicomputer and said that his company was based entirely on Whirlwind technology (Old Associates, 1981, p. 23). SAGE also contributed to formalizing the programming profession. While developing software for the system, the RAND Corporation spun off the Systems Development Corporation (SDC) to handle the software for SAGE. SDC trained thousands of programmers who eventually moved into the workforce. Numerous computer engineers from both IBM and SDC started their own firms with the knowledge they acquired from SAGE.
SAGE also established an influential precedent for organizational management. Lincoln Laboratory was structured in the same style in which MIT had run the Radiation Laboratory during World War II, with much less management involvement than in other equivalent organizations. As a result, researchers had a large amount of freedom to pursue their own solutions to the problems at hand. Norman Taylor, one of the key individuals who designed SAGE at Lincoln Laboratory, credited the management style for the project's successes:

I think Bob [Everett] put his finger on one important thing: the freedom to do something without approval from top management. Take the case of the 65,000 word memory. . . . We built that big memory, and we didn't go to the steering committee to get approval for it. We didn't go up there and say, "Now, here's what we ought to do, it's going to cost this many million dollars, it's going to take us this long, and you must give us approval for it." We just had a pocket of money that was for advanced research. We didn't tell anybody what it was for; we didn't have to. (Freeman, 1995, p. 20)

This management style contrasted with the more traditional bureaucratic style of most American corporations of the time. It was subsequently adopted by Digital Equipment Corporation (under Kenneth Olsen's leadership) and eventually imitated
by many—if not most—of the information technology firms that dot the suburban Boston and Silicon Valley landscapes. Although not the first to pioneer this management style and the organizational ethos it engendered, Lincoln Laboratory demonstrated its effectiveness in the development of large computing systems.

1 MIT licensed the technology for core memories to several computer companies—IBM, Univac, RCA, General Electric, Burroughs, NCR, Lockheed, and Digital Equipment Corporation—and memory suppliers, including Ampex, Fabri-Tek, Electronic Memory & Magnetics, Data Products, General Ceramics, and Ferroxcube. See Old Associates (1981), Figure 2 and p. 3.

2 Although AT&T is a private company, much of its research was supported through a tax on customers. Hence, its research is often considered quasi-public.

The Navy moved its Naval Computing Machine Laboratory from Dayton to St. Paul, and ERA essentially became the laboratory (Tomash, 1973; Parker, 1985, 1986). ERA did some research, but it primarily worked on task-oriented, cost-plus contracts. As one participant recalled, "It was not a university atmosphere. It was 'Build stuff. Make it work. How do you package it? How do you fix it? How do you document it?'" (Tomash, 1973). ERA built a community of engineering skill, which became the foundation of the Minnesota computer industry. In 1951, for example, the company hired Seymour Cray for his first job out of the University of Minnesota (ERA, 1950; Cohen, 1983; Tomash, 1973). As noted earlier, the RAND Corporation had contracted in 1955 to write much of the software for SAGE owing to its earlier experience in air defense and its large pool of programmers. By 1956, the Systems Training Program of the RAND Corporation, the division assigned to SAGE, was larger than the rest of the corporation combined, and it was spun off into the nonprofit Systems Development Corporation (SDC). SDC played a significant role in computer training.
As described by one of the participants, "Part of SDC's nonprofit role was to be a university for programmers. Hence our policy in those days was not to oppose the recruiting of our personnel and not to match higher salary offers with an SDC raise." By 1963, SDC had trained more than 10,000 employees in the field of computer systems. Of those, 6,000 had moved to other businesses across the country (Baum, 1981, pp. 47-51).

Observations

In retrospect, the 1950s appear to have been a period of institutional and technological experimentation. This diversity of approaches, while it
BOX 4.5 Computer Engineering at the National Science Foundation

Not all of the National Science Foundation's (NSF's) support for computing-related research came through the computer directorate. In 1973, NSF created the Electrical Sciences and Analysis Section in its Engineering Division to fund electrical engineering research. Over the course of the next 10 years, the section's budget grew from $7.4 million to $23.7 million in 1984 as NSF incorporated new programs. In 1979, the section was renamed the Division of Electrical, Computer, and Systems Engineering when it began to support computer engineering. The division supported research in very large scale integrated circuit technology, fiber-optic communications networks, and computer-aided drafting. In 1986, many of the division's programs, including the Computer Engineering program, the Instrumentation, Sensing, and Measurement program, and the Automation, Instrumentation, and Sensing program, were shifted into the new Microelectronics Information Processing Systems division of the Computer and Information Science and Engineering Directorate. The communications programs were left behind, as most of their work focused on voice and video communication, rather than data networks.1

1 Personal communication from Gordon Bell, former director of the Computer and Information Science and Engineering Directorate at the National Science Foundation, July 1998.

". . . all of the support for basic research [in Software Engineering] . . . 60 percent of the support for basic research [in Computer Systems Design]" (NSF, 1979, p. B-II-3). NSF was also beginning to increase its support of intelligent systems as DARPA's support for basic AI declined. Computer research support at NSF took on its current form in 1986. That year, NSF director Erich Bloch announced the creation of a new directorate entirely for computing, the Computer and Information Science and Engineering (CISE) Directorate (CSTB, 1992, p. 223).
To lead the new directorate, Bloch recruited Gordon Bell, a pioneering system architect at Digital Equipment Corporation, who had been pushing NSF for several years to increase funding for computer science. Bell, like others in the computer industry, was still concerned that universities were not training enough Ph.D.s in computer science to continue advancing the field. He believed that the creation of CISE could help alleviate this problem.17 Unlike the more recent organizational changes in computing at NSF, CISE was more than a change of name and bureaucratic position. Much like the creation of OCA, CISE consolidated all the computer initiatives in NSF into one entity. The Division of Computer Research was combined
with the computing portions of the Electrical, Computer, and Systems Engineering Division. CISE also absorbed the Office of Advanced Scientific Computing and the Division of Information Science and Technology. Monetary support for computing expanded immediately: CISE's 1986 budget was over $100 million, almost three times the Division of Computer Research's budget in 1984, and CISE constituted 7 percent of the entire NSF budget, as opposed to 3 percent in 1985.18 In addition, attaining the level of NSF directorate symbolically marked the end of the uncertain position of computing within NSF. Computer science was formally on a par with the biological sciences, the physical sciences, and the other directorates of NSF.

Between 1987 and 1996, the CISE budget more than doubled, from $117 million to $259 million, growing at about the same rate as NSF overall and remaining relatively constant at 7 to 8 percent of NSF's total budget. While all divisions within CISE grew during this period, the Division of Advanced Scientific Computing and the Division of Networking and Communications Research received the majority of the absolute dollar increases, reflecting the growing importance of NSF's infrastructure programs (Table 4.3). The Advanced Scientific Computing Division's budget increased from $42 million to $87 million between 1987 and 1996, making it by far the largest division within CISE and accounting for about 35 percent of CISE's budget during that time. The Networking Division's budget increased from approximately 8 percent to almost 20 percent of the entire CISE budget, largely as a result of the NSFNET program and related networking infrastructure programs, which grew from $6.5 million in 1987 to $41.6 million in 1996 (NSFNET and the Advanced Scientific Computing program are discussed in Chapter 3).
As a result, infrastructure programs grew from 42 percent to 50 percent of the CISE budget.19

Starting in 1989, CISE also began supporting a number of science and technology centers (STCs) whose goal was to promote collaborative, interdisciplinary research related to computer science. They include centers for computer graphics and scientific visualization, discrete mathematics and theoretical computer science, parallel computing, and research in cognitive science. These centers are supported not only by NSF but also by several other federal agencies, universities, and members of industry. Reviews of the STC program in 1995 and 1996 were highly supportive of the centers (National Academy of Public Administration, 1995; National Research Council, 1996).

Other Federal Agencies in the 1970s and 1980s

DARPA and NSF, of course, did not represent all federal funding of computer research during the 1970s and 1980s, though they clearly played
TABLE 4.3 Growth in the National Science Foundation's Computer and Information Science and Engineering Directorate Budget (millions of dollars), 1987-1996

                                                 1987 1988 1989 1990 1991 1992 1993 1994 1995 1996
Computer and computation research                  19   22   25   26   29   34   28   30   32   33
Cross-disciplinary activities                      16   16   16   18   19   23   22   23   23   27
Advanced scientific computing                      43   46   61   71   74   76   75   82   87   88
Information, robotics, and intelligent systems     17   17   18   19   22   25   26   29   30   31
Networking and communications research             10   11   16   20   29   34   39   50   56   54
Computer and information engineering                6    6    6    6    7    8    8    8    9    9
Microelectronics and information
  processing systems                                6    7    8   10   11   13   13   15   16   17
TOTAL                                             117  124  152  169  190  210  212  236  254  259

NOTE: Totals may not add because of rounding.
SOURCE: Personal communication from Vernon Ross, NSF Office of Budget, Finance, and Award Management, July 1997.
a dominant role. Although SCI was formulated prior to and independent of the Strategic Defense Initiative (SDI), it could not but be partially absorbed (especially in the minds of the public) by the latter, despite the efforts of DARPA management to keep the programs distinct (Edwards, 1996, pp. 293-299). Reagan's $35 billion SDI program pumped tens of millions of dollars annually into computing.20 SDI relied critically on command-and-control systems for its effectiveness, and doubts about software testing and reliability proved a major hurdle in implementation. SDI also supported work in parallel architectures, optical computing, and new semiconductor materials.

The VHSIC (Very High Speed Integrated Circuits) program, launched in 1980, focused on transferring technology from the commercial semiconductor industry into the largely separate military electronics industry. The long procurement cycle of military electronics meant that it was perpetually behind rapidly changing commercial technology. Under the VHSIC program, DOD, through the Office of the Secretary of Defense, spent more than $900 million over the course of the decade, but few new chips actually made their way into military systems. As one analyst wrote, "R&D could not solve a procurement problem" (Alic et al., 1992, p. 269).

The Office of the Secretary of Defense (OSD) spent significant funds on the development of the Ada programming language, intended to be the standard for all DOD computer applications. While Ada displaced a number of other programming languages in DOD applications, it did not achieve the hoped-for broad acceptance in the commercial marketplace.21 OSD also made a significant investment in software production and maintenance techniques aimed at improving productivity and reliability ($60 million in 1984) (Flamm, 1987, p. 76).

NASA support for computing has varied considerably over the years.
Overall, NASA has been more of a development than a research agency in computing: it has focused on hardware and applications rather than basic research. In hardware, the agency built highly rugged and reliable machines to run its spacecraft, but with conservative rather than cutting-edge technology. Although NASA tended to have little effect on computer architecture and design (though some significant impact on packaging), its software work in redundant and fault-tolerant computers, simulation, and program verification made significant contributions to programming practice. The Saturn V computer pioneered triple redundancy and voter circuits (Tomayko, 1985, pp. 7-18). Some of this technology has been transferred to transaction processing in commercial units. Funding began to decline rapidly after the peak of the space program in the late 1960s and had virtually halted by 1972, at which point NASA's only computing program was the ILLIAC IV. Funding took off again in the early 1980s, focusing on image processing and supercomputers for modeling of
aerostructures. The NASTRAN software package for finite element modeling of physical structures has become the most widely accepted such program in industry (Flamm, 1987, p. 85).

Also during the 1980s, the National Institutes of Health (NIH) was a small but increasingly important player in developing computer applications for medicine and biology, particularly in innovative applications of expert systems (see Chapters 9 and 10 for a description of NIH's support for expert systems and virtual reality technology). The National Library of Medicine, along with DARPA and the National Institute of Standards and Technology (NIST), also supported work on information retrieval that has influenced the development of Internet search engines. Similarly, the Department of Energy invested in high-end and parallel machines, at about $7 million per year (Flamm, 1987, p. 93).

SEMATECH

In 1987, 14 U.S. semiconductor companies formed a not-for-profit venture, SEMATECH, to improve domestic semiconductor manufacturing. The joint nature of the effort, combined with member companies' willingness to put significant funds into SEMATECH and concerns over the nation's growing dependence on foreign suppliers for semiconductor devices, helped convince Congress to support the effort as well: in 1988, it appropriated $100 million annually for 5 years to match the industrial funding. The federal dollars for SEMATECH were funneled through DARPA because semiconductor manufacturing was seen as vital to the defense technology base. In the words of one analyst, "the half-billion-dollar federal commitment marks a major shift in U.S. technology policy: a turn toward explicit support for commercially oriented R&D carried out in the private sector" (Alic et al., 1992, p. 277).
SEMATECH originally planned to develop in-house new production processes for manufacturing next-generation semiconductor devices but soon decided to concentrate its efforts on strengthening the supplier base for the semiconductor industry. At the time, Japanese semiconductor manufacturing equipment suppliers were gaining market share at a rate of 3.1 percentage points a year, and U.S. semiconductor manufacturers planned to purchase the majority of their equipment from Japanese suppliers (SEMATECH, 1991).

Over the next several years, SEMATECH made several notable advances. It established partnerships with U.S. equipment suppliers to help them develop next-generation production tools, and it helped semiconductor manufacturers develop consensus regarding their future needs, especially those related to manufacturing equipment. These achievements allowed equipment manufacturers to meet one set of industry specifications rather than a variety of company specifications. SEMATECH also funded research and development efforts at supplier companies, helping them improve their equipment and develop systems to make more advanced semiconductor devices. Perhaps most important, SEMATECH helped establish improved communication links between semiconductor manufacturers and their suppliers, allowing freer exchanges of information among users and suppliers of manufacturing equipment.

These efforts and others began to show benefits soon thereafter. Semiconductor equipment manufacturers regained market share against the Japanese, boasting 53 percent of the world market in 1992 versus 38 percent for Japanese suppliers (VLSI Research, 1992). Production yields for U.S. semiconductor manufacturers improved from 60 percent in 1987 to 84 percent in 1992, and U.S. market share in semiconductor devices also improved (GAO, 1992, p. 10). Clearly, other factors played a role, not the least of which was the relative rise of the market for microprocessors—in which U.S. firms developed a strong competitive advantage—versus memory chips. Nevertheless, SEMATECH has been cited as a factor in the resurgence of U.S. semiconductor equipment manufacturers. DARPA program managers also considered the effort successful, noting that many of DARPA's objectives were reflected in SEMATECH's strategic plan, including efforts to rapidly convert manufacturing technology into practice and to develop technology for more flexible semiconductor production (OTA, 1993, p. 128).

DARPA continued its investment in SEMATECH beyond the original deadline, but, in 1995, SEMATECH announced that it would wean itself from public assistance. In doing so, it recognized that it had achieved most of its original objectives and believed it could remain self-sustaining on industry funds alone.
Doing so would also give it greater freedom in setting its research agenda, insulate it from continued uncertainty over federal funding, and reduce concerns about collaboration with foreign companies. In 1998, SEMATECH announced the establishment of SEMATECH International, a division of SEMATECH that would allow participation by foreign-owned companies.

High-Performance Computing

The late 1980s saw a new theme emerge in government support of computing research: coordination among federal research agencies. The most visible example of this coordination, which also accounts for a significant percentage of today's federal support for computing R&D, is the High Performance Computing and Communications Initiative (HPCCI). Although this program focused on the highest-end computers and applications, it has had much broader impact. The pace of progress in microelectronics means
that the evolution of a given capability (hardware and software) from supercomputer to desktop requires about a decade. Thus, today's high-performance applications are a glimpse into the future of computing.

In keeping with its traditional role of providing facilities for computer science in universities, NSF in 1984 asked Congress to set up supercomputer centers so that academic researchers could access state-of-the-art supercomputers. The result was the National Centers for Supercomputing Applications. NSF then established a high-speed network backbone to connect these centers, which itself became the seed of the high-speed Internet backbone. In 1988, the Office of Science and Technology Policy (OSTP) and the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET) created the National Research and Education Network, a new system that built on earlier projects within NSF, DOE, NASA, and DOD that supported advanced scientific computing and human resource development for computer science. The result was the High Performance Computing Program, which also included an emphasis on communications. In 1989, OSTP produced a formal program plan for high-performance computing. OSTP provided a vehicle for interagency coordination among the initial players, DOE, NASA, and NSF; the National Security Agency (NSA) has also been an influential player, although not a formal member. Economies of scale and scope could thus be realized by avoiding duplication of effort across research agencies.

Congress passed the High Performance Computing Act in 1991 as a 5-year program. This affirmed the interagency character of HPCCI, which by then had 10 federal agencies participating, including the Environmental Protection Agency, the National Library of Medicine (a branch of NIH), NIST, the National Oceanic and Atmospheric Administration, and, later, the Department of Education, NSA, and the Veterans Administration.
Originally, HPCCI aimed at meeting several "Grand Challenges," including scientific modeling, weather forecasting, aerospace vehicle design, and earth biosphere research. These goals have since been expanded to "National Challenges," which include digital libraries, electronic commerce, health care, and improvement of the information infrastructure (CSTB, 1995a). Overall, the program achieved a number of notable results. The success of some applications and programming paradigms convinced people that parallel computing could be made to work. The program created and disseminated technologies to speed the pace of innovation, enhance national security, promote education, and better understand the global environment (see Chapter 3 for a discussion of some of the results of the high-performance computing effort).
1990 and Beyond

The 1990s have seen the continued evolution of computing and communications technology and a changing environment for federal support. The technological side has been characterized by an explosion in the use of computers and the Internet. Personal computers have continued to penetrate businesses and homes: by 1998, approximately 40 percent of U.S. households had at least one computer, and a growing number had a connection to the Internet. Building on decades of federal research and development, the Internet itself emerged as a major force, with the number of servers growing exponentially. With the emergence of the World Wide Web and browser technologies (also derivatives of federally sponsored research—see Chapter 7), the Internet has become a medium for disseminating information and conducting business. Companies such as Amazon.com formed solely as virtual entities, and many established firms created a presence on the Web to conduct business.

Development of networking technologies has also created opportunities for new kinds of computing hardware and software. A number of companies developed and began offering network computers, machines designed specifically for use over the Internet and other corporate networks. Such machines rely on the network for much of their infrastructure, including application programs, rather than storing such files locally. Although it is not yet clear how well such computers will fare in the marketplace, especially as PC manufacturers expand their offerings of low-cost, scaled-down computers, network computers demonstrate the kinds of innovation that expansion of the Internet can motivate.

Component software also emerged as a new programming modality in the 1990s. Epitomized by the Java programming language, component software allows programs to be assembled from components that can run on a wide variety of computing platforms.
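The dynamic-loading mechanics behind this model can be sketched in Java, the language named above. Everything in the sketch is illustrative: the Greeter component, the Supplier interface, and the temporary directory standing in for a component repository are assumptions of the example, not details from any program described in this chapter (a 1990s deployment would fetch precompiled bytecode from an http: URL rather than compiling a component on the spot).

```java
import javax.tools.ToolProvider;
import java.io.File;
import java.io.FileWriter;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.util.function.Supplier;

public class ComponentDemo {
    // Load a "component" from a repository location and run it, returning its output.
    public static String loadAndRun() throws Exception {
        // Stand-in for a network component repository: a temporary directory.
        File repo = Files.createTempDirectory("components").toFile();

        // Source of a hypothetical component; a real component system would
        // ship precompiled, platform-neutral bytecode instead.
        File src = new File(repo, "Greeter.java");
        try (FileWriter w = new FileWriter(src)) {
            w.write("public class Greeter implements java.util.function.Supplier<String> {"
                  + " public String get() { return \"hello from a component\"; } }");
        }
        // Compile the component in place (requires a JDK, not just a JRE).
        ToolProvider.getSystemJavaCompiler().run(null, null, null, src.getPath());

        // Fetch the component's bytecode through a URL-based class loader
        // and invoke it; the bytecode itself is platform independent.
        try (URLClassLoader loader = new URLClassLoader(new URL[] { repo.toURI().toURL() })) {
            Supplier<?> component = (Supplier<?>) loader.loadClass("Greeter")
                    .getDeclaredConstructor().newInstance();
            return (String) component.get();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(loadAndRun());
    }
}
```

Because URLClassLoader accepts http: URLs as readily as file: URLs, the same loading code works when the component lives on a remote server, which is what makes the "assemble programs from components over the network" model possible.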
Applications can be accessed, downloaded, and run over the network (e.g., the Internet) as needed for computations.

Along with these technological changes have come changes in the environment for federal research funding. With the fall of the Berlin Wall in 1989 and the subsequent demise of the Soviet Union, defense budgets began a slow, steady decline, placing additional pressure on defense research and development spending. At the same time, growing sentiment to reduce the federal deficit further squeezed federal budgets for science and technology generally in the first half of the decade. By 1997, the prospect of budget surpluses raised the possibility of expanding budgets for science and technology spending and renewed attempts to develop a new framework for federal participation in the innovation process. Senator Phil Gramm, along with Senators Joseph Lieberman,
Pete Domenici, and Jeff Bingaman, introduced a bipartisan bill in October 1997 to double federal spending for nondefense scientific, medical, and precompetitive engineering research over 10 years (the bill, S.1305, is called the National Research Investment Act of 1998). In early 1998, Congressman Vern Ehlers of the House Science Committee initiated a national science policy study to review the nation's science policy and develop a new, long-range science and technology policy that is "concise, comprehensive, and coherent" (Ehlers, 1998).

The structure of federal support for computing and communications also underwent modification in the 1990s. In place of the FCCSET committee, the Clinton administration established the National Science and Technology Council in 1993 to coordinate federal programs in science, technology, and space. Its Committee on Computing, Information, and Communications (CCIC), through the Subcommittee on Computing, Information, and Communications R&D, coordinates computing- and communications-related R&D programs conducted by the 12 federal departments and agencies in cooperation with academia and industry. This group has restructured and expanded upon the HPCCI to organize programs in five areas: (1) high-end computing and computation; (2) large-scale networking; (3) high-confidence systems; (4) human-centered systems; and (5) education, training, and human resources. Further, in February 1997, President Clinton established an Advisory Committee on High Performance Computing and Communications, Information Technology, and the Next-Generation Internet. The committee's charge is to assist the administration in accelerating the development and adoption of information technology vital to the nation's future (NSTC, 1997).

Federal support for computing and communications infrastructure also changed in the 1990s. After opening the Internet to commercial use in 1992, NSF effectively privatized the network in 1995.
Nevertheless, NSF and other federal agencies are pursuing development and deployment of the Next-Generation Internet (NGI), which will boast data rates 100 times those of today's Internet. The NGI initiative will create an experimental, wide-area, scalable testbed for developing networking applications critical to national missions, such as defense and health care. Further, in December 1995, NSF began restructuring its support of the national supercomputing centers, forming a new Partnerships for Advanced Computational Infrastructure program. The program will concentrate its resources on two groups of organizations, each with a leading-edge facility and several collaborators. One group, the National Partnership for Advanced Computational Infrastructure, will have the San Diego Supercomputer Center in California as its leading-edge site. The other group, the National Computational Science Alliance, will have the National Center for Supercomputing Applications at Urbana-Champaign,
Illinois, as its leading-edge site. The objective is to equip these sites with high-end computing systems one to two orders of magnitude more capable than those typically available at major research universities. They will work in partnership with other organizations that are expected to contribute to access; to education, outreach, and training; and to software development that will facilitate and enhance both the overall infrastructure and access to it (Cutter, 1997).

Funding for research in computer science weathered these changes reasonably well, with basic and applied research posting real gains between 1989 and 1995 (see Chapter 3). Nevertheless, the research community expressed concerns that such funding may not be adequate to support the continuing growth of the field (and the rising number of researchers in academia and industry) and that the nature of such research is changing. Many researchers claim that federal funding is increasingly focused on near-term objectives and less radical innovation. Calls for greater accountability in the research enterprise, they claim, have led agencies to favor work that is less risky and that exploits existing knowledge, despite its potentially lesser payback. The implications of such changes are not yet clear, but they will become evident over the next several years and beyond.22

Notes

1. Quoted in Edwards (1996), p. 122.
2. As President Eisenhower declared in the 1958 State of the Union message, "Some of the important new weapons which technology has produced do not fit into any existing service pattern. They cut across all services, involve all services, and transcend all services, at every stage from development to operation. In some instances they defy classification according to branch of service."
3. Quoted in Barber Associates (1975), pp. V-51 to V-52.
4. Quoted in Norberg (1996), pp. 40-53.
5. Quoted in Norberg and O'Neill (1996), p. 31.
6.
Figure based on data for 1960-1968 in the National Science Foundation's annual Budget Request to Congress (1960-1969) and for 1968-1970 in its annual publication Grants and Awards (1968-1970). Both are available from the National Science Foundation.
7. Figure based on data from the 1968, 1969, and 1970 editions of the National Science Foundation's Grants and Awards for the Fiscal Year Ended June 30.
8. The fundamental discoveries of computability and complexity theory show precisely that the details of the computing machine do not matter in analyzing the most important properties of the function to be computed. The science of computing is the study of the consequences of certain basic assumptions about the nature of computation (spelled out most clearly in Turing's famous 1936 paper), not the study of particular artifacts. Of course, problems arising from the construction and use of actual computers are a main source of questions for the
science of computing, in the same way as problems in the physical sciences and engineering have been a main source of ideas and questions in mathematics.
9. Blue, quoted in Norberg and O'Neill (1996), p. 37.
10. Many of the details contained in this section derive from case studies of the VLSI program and MOSIS contained in Van Atta et al. (1991a), although the interpretation here differs in some respects.
11. Silicon Graphics, Inc. had sales of $3.1 billion and employed over 9,800 workers in 1998.
12. Charles Seitz in a presentation to the study committee, February 28, 1997, Stanford, Calif.
13. In order for the program to benefit U.S. industry more than its foreign competitors, there was a general understanding that investigators would delay open publication of results for roughly 1 year, during which time results would be circulated quickly within the community of DARPA-sponsored VLSI researchers (Van Atta et al., 1991a, pp. 17-10 and 17-13, based largely on comments by Robert Kahn on August 7, 1990).
14. Charles Seitz in a presentation to the study committee, February 28, 1997, Stanford, Calif.
15. John L. Hennessy in a briefing to the study committee, February 28, 1997, Stanford, Calif.
16. Data from "Compilation of Data" from the National Science Foundation's annual Summary of Awards between 1973 and 1985.
17. Personal communication from Gordon Bell, July 1998.
18. Personal communication from Vernon Ross, NSF Office of Budget, Finance, and Award Management, July 1997.
19. Personal communication from Vernon Ross, NSF Office of Budget, Finance, and Award Management, July 1997.
20. SDI budgets for computing are difficult to discern with accuracy, as they were buried within other types of contracts. One estimate is between $50 million and $225 million annually from 1985 to 1994 (Edwards, 1996, p. 292).
21. For a discussion of Ada and its use in military and civilian applications, see CSTB (1997a).
22.
The Computer Science and Telecommunications Board of the National Research Council has a project under way to document changes in support for information technology research in industry and government and to evaluate their implications. For more information on this project, "Information Technology Research in a Competitive World," see <http://www4.nas.edu/cp.nsf>.