1 Introduction

The security of our nation, the viability of our economy, and the health and well-being of our citizens rely today on infrastructures for communication, finance, energy distribution, and transportation. All of these infrastructures depend increasingly on networked information systems. That dependence, with its new levels and kinds of vulnerabilities, is attracting growing attention from government and industry. Within the last 2 years, the Office of Science and Technology Policy in the White House, the President's National Security Telecommunications Advisory Committee, the President's Commission on Critical Infrastructure Protection, the Defense Science Board, and the General Accounting Office have each issued reports on the vulnerabilities of networked information systems.[1] Congressional hearings,[2] articles in the popular press, and concern about the impending year 2000 problem have further heightened public awareness. Most recently, Presidential Decision Directive 63[3] has called for a national effort to assure the security of our increasingly vulnerable critical infrastructures.

[1] See Cybernation: The American Infrastructure in the Information Age: A Technical Primer on Risks and Reliability (Executive Office of the President, 1997), Reports from the Eight NSTAC Subcommittee Investigations (NSTAC, 1997), Critical Foundations: Protecting America's Infrastructures (PCCIP, 1997), Report of the Defense Science Board Task Force on Information Warfare Defense (IW-D) (Defense Science Board, 1996), and Information Security: Computer Attacks at Department of Defense Pose Increasing Risks: A Report to Congressional Requesters (U.S. GAO, 1996).
[2] Such as testimony titled "Weak Computer Security in Government: Is the Public at Risk?" presented before the Senate Governmental Affairs Committee on May 19, 1998, and testimony titled "Future Threats to the Department of Defense Information Systems: Y2K & Frequency Spectrum Reallocation," presented before the Senate Armed Services Committee on June 4, 1998.
[3] Available online at .
Although proposals for action are being advanced, their procedural emphasis reflects the limitations of available knowledge and technologies for tackling the problem. These limitations constrain effective decision making in an area that is clearly vital to all sectors of society.

Creating a broader range of choices and more robust tools for building trustworthy networked information systems is essential. To accomplish this, new research is required. And since research takes time to bear fruit, the nation's dependence on networked information systems will greatly exceed their trustworthiness unless this research is initiated soon. Articulating an agenda for that research is the primary goal of this study; that detailed agenda and its rationale constitute the core of this report.

TRUSTWORTHY NETWORKED INFORMATION SYSTEMS

Networked information systems (NISs) integrate computing systems, communications systems, and people (both as users and operators). The defining elements are interfaces to other systems along with algorithms to coordinate those systems. Economics dictates the use of commercial off-the-shelf (COTS) components wherever possible, which means that developers of an NIS have neither control over nor detailed information about many system components. The use of system components whose functionality can be changed remotely and while the system is running is increasing. Users and designers of an NIS built from such extensible system components thus cannot know with any certainty what software has entered system components or what actions those components might take. (Appendix E contains a detailed discussion of likely developments in software for those readers unfamiliar with current trends.)

A trustworthy NIS does what people expect it to do and not something else despite environmental disruption, human user and operator errors, and attacks[4] by hostile parties. Design and implementation errors must be avoided, eliminated, or somehow tolerated. It is not sufficient to address only some of these dimensions, nor is it sufficient simply to assemble components that are themselves trustworthy. Trustworthiness is holistic and multidimensional.

[4] In the computer security literature, "vulnerability," "attack," and "threat" are technical terms. A vulnerability is an error or weakness in the design, implementation, or operation of a system. An attack is a means of exploiting some vulnerability in a system. A threat is an adversary that is motivated and capable of exploiting a vulnerability.
Trustworthy NISs are challenging systems to build, operate, and maintain. There is the intrinsic difficulty of understanding what can and cannot happen within any complex system and what can be done to control the behavior of such a system. With the environment only partially specified, one can never know what kinds of attacks will be launched or what manifestations failures may take. Modeling and planning for the behavior of a sentient adversary are especially hard.

The trustworthiness of an NIS encompasses correctness, reliability, security (conventionally including secrecy, confidentiality, integrity, and availability), privacy, safety, and survivability (see Appendix K for definitions of these terms). These dimensions are not independent, and care must be taken so that one is not obtained at the expense of another. For example, protection of confidentiality or integrity by denying all access trades one aspect of security (availability) for others. As another example, replication of components enhances reliability but may increase exposure to attack owing to the larger number of sites and the vulnerabilities implicit in the protocols to coordinate them. Integrating the diverse dimensions of trustworthiness and understanding how they interact are central challenges in building a trustworthy NIS.
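The replication trade-off just described can be made concrete with a small calculation. The sketch below is ours, not the committee's: the failure and attack probabilities are hypothetical, and it assumes independence across replicas, which real systems rarely achieve. It compares a single component against 3-way and 5-way replication with majority voting:

    # Illustration of the replication trade-off: majority voting masks
    # independent failures, but each added replica is another site to attack.
    from math import comb

    def p_majority_fails(p_fail: float, n: int) -> float:
        """Probability that a strict majority of n independent replicas fail."""
        k_min = n // 2 + 1
        return sum(comb(n, k) * p_fail**k * (1 - p_fail)**(n - k)
                   for k in range(k_min, n + 1))

    def p_any_penetrated(p_attack: float, n: int) -> float:
        """Probability that at least one of n replicas is penetrated."""
        return 1 - (1 - p_attack)**n

    # Hypothetical per-replica probabilities: 1 percent chance of failure,
    # 1 percent chance that a given attack succeeds against a given site.
    for n in (1, 3, 5):
        print(n, p_majority_fails(0.01, n), p_any_penetrated(0.01, n))

With these hypothetical numbers, 3-way replication cuts the chance of a majority failure from 1 in 100 to roughly 3 in 10,000, while the chance that at least one site is penetrated nearly triples, which is precisely the tension between reliability and security noted above.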
Various isolated dimensions of trustworthiness have become defining themes within professional communities and government programs:

- Correctness stipulates that proper outputs are produced by the system for each input.
- Availability focuses on ensuring that a system continues to operate in the face of certain anticipated events (failures) whose occurrences are uncorrelated.
- Security is concerned with ensuring that a system resists potentially correlated events (attacks) that can compromise the secrecy, integrity, or availability of data and services.

While individual dimensions of trustworthiness are certainly important, building a trustworthy system requires more. Consequently, a new term, "trustworthiness," and not some extant technical term (with its accompanying intellectual baggage of priorities), was selected for use in this report. Of ultimate concern is how people perceive and engage a system. People place some level of trust in any system, although they may neither think about that trust explicitly nor gauge the amount realistically. Their trust is based on an aggregation of dimensions, not on a few narrowly defined or isolated technical properties. The term "trustworthiness" herein denotes this aggregation.

To be labeled as trustworthy, a system not only must behave as expected but also must reinforce the belief that it will continue to produce expected behavior and will not be susceptible to subversion. The question of how to achieve assurance has been the target of several research programs sponsored by the Department of Defense and others. Yet currently practiced and proposed approaches for establishing assurance are still imperfect and/or impractical. Testing can demonstrate only that a flaw exists, not that all flaws have been found; deductive and analytical methods are practical only for certain small systems or specific properties.[5] Moreover, all existing assurance methods are predicated on an unrealistic assumption that system designers and implementers know what it means for a system to be "correct" before and during development.[6] The study committee believes that progress in assurance for the foreseeable future will most likely come from figuring out (1) how to combine multiple approaches and (2) how best to leverage add-on technologies and other approaches to enhance existing imperfect systems. Improved assurance, without any pretense of establishing a certain or a quantifiable level of assurance, should be the aim.

[5] See Chapter 3 for a more detailed discussion.
[6] Requirements invariably change through the development process, and the definition of system correctness changes accordingly.

WHAT ERODES TRUST

The extent to which an NIS comes to be regarded as trustworthy is influenced, in large part, by people's experiences in using that system. However, generalizations from individual personal experience can be misleading. The collection of incidents in Neumann (1995) and its associated online database suggests something about the lay of the land, although many kinds of attacks are not chronicled there (for various reasons). Other compilations of information on the trustworthiness of specific infrastructures can be found at the CERT/CC Web site[7] and other sources. But absent scientific studies that measure dominant detractors of NIS trustworthiness, it is hard to know what vulnerabilities are the most significant or how resources might best be allocated in order to enhance a system's trustworthiness. Rigorous empirical studies of system outages and their causes are a necessary ingredient of any research agenda intended to further NIS trustworthiness. Empirical studies of normal system operations are also important, because having baseline data can be helpful for detecting failures and attacks by monitoring usage (Ware, 1998).

[7] The Computer Emergency Response Team (CERT)/Coordination Center (CC) is an element of the Networked Systems Survivability Program in the Software Engineering Institute at Carnegie Mellon University. See .
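As one illustration of why baseline data matter, consider the simplest possible usage monitor. This is a toy sketch of our own rather than anything proposed in the report; the traffic figures are invented, and a real monitor would have to cope with trends, seasonality, and attackers who deliberately mimic normal load:

    # Toy baseline monitor: flag usage measurements that deviate sharply
    # from historical norms. Data and threshold are hypothetical.
    from statistics import mean, stdev

    def flag_anomalies(baseline, observed, threshold=3.0):
        """Return observations more than `threshold` standard deviations
        from the baseline mean."""
        mu, sigma = mean(baseline), stdev(baseline)
        return [x for x in observed if abs(x - mu) > threshold * sigma]

    # Hypothetical hourly login counts: a week of normal operation, then
    # a burst that might indicate an attack (or merely an upstream failure).
    baseline = [120, 115, 130, 125, 118, 122, 127]
    print(flag_anomalies(baseline, [121, 119, 540, 124]))  # -> [540]

Without the baseline, the burst is just a number; with it, the deviation is detectable, which is the point of the empirical studies called for above.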
But perceptions of trustworthiness are just that and, therefore, can be shaped by the popular press and information from organizations that have particular advocacy agendas. A predominant cause of NIS outages might not be a good topic for newspaper stories, although anecdotes of attacks perpetrated by hackers seem to be.[8]

Trust in an NIS is not unduly eroded when catastrophic natural phenomena in a region, such as earthquakes or storms, disrupt the operation of NISs only in that region. But when environmental disruption has disproportionate consequences, trust is eroded. Regional and long-distance telephone outages caused by a backhoe accidentally severing a fiber-optic cable (Neumann, 1995) and a power outage disrupting Internet access in the Silicon Valley area as a result of rodents chewing cable insulation (Neumann, 1996) are just two illustrations. The good news is that the frequency and scope of accidental man-made and natural disruptions are not likely to change in the foreseeable future. Building a trustworthy NIS for tomorrow that can tolerate today's levels of such disruptions should suffice.

Errors made in the operation of a system also can lead to systemwide disruption. NISs are complex, and human operators err: an operator installing a corrupted top-level domain name server database at Network Solutions effectively wiped out access to roughly a million sites on the Internet in July 1997 (Wayner, 1997); an employee's uploading of an incorrect set of translations into a Signaling System 7 processor led to a 90-minute network outage for AT&T toll-free telephone service in September 1997 (Perillo, 1997). Automating the human operator's job is not necessarily a solution, for it simply exchanges one vulnerability (human operator error) for another (design and implementation errors in the control automation).

Controlling a complex system is difficult, even under the best of circumstances. Whether or not human operators are involved, the geographic scope and the speed at which an NIS operates mean that assembling a current and consistent view of the system is not possible. The control theory that characterizes the operation of such systems (if known at all) is likely to be fraught with instabilities and to be highly nonlinear. When operators are part of the picture, details of the system's operating status must be distilled into a form that can be understood by humans. Moreover, there is the difficulty of designing an operator interface that facilitates human intervention and control.

[8] The classification and restricted distribution of many government studies about vulnerability and the frequency of hostile attacks, rather than informing the public about real risks, serve mostly to encourage speculation.
The challenge of implementing software that satisfies its specification is well known, and failing to meet that challenge invariably compromises system trustworthiness. NIS software is no exception. An oft-cited example is the January 1990 9-hour-long outage (blocking an estimated 5 million calls) that AT&T experienced due to a programming error in software for its electronic switching systems (Neumann, 1995). More recently, software flaws caused an April 1998 outage in the AT&T frame-relay network (a nationwide high-speed data network used by business) (Mills, 1998), and in February 1998 the operation of the New York Mercantile Exchange and telephone service in several major East Coast cities were interrupted by a software failure in Illuminet, a private carrier (Kalish, 1998). The challenges of developing software can also be responsible for project delays and cost overruns. Problems associated with software thus can undermine confidence and trust in a system long before the system has been deployed. NIS software is especially difficult to write, because it typically integrates geographically separated system components that execute concurrently, have idiosyncratic interfaces, and are sensitive to execution timings.

Finally, there are the effects of hostile attacks on NIS trustworthiness and on perceptions of NIS trustworthiness. Evidence abounds that the Internet and the public telephone networks not only are vulnerable to attacks but also are being penetrated with some frequency. In addition, hackers seeking the challenge and insiders seeking personal gain or revenge have been successful in attacking business and critical infrastructure computing systems. Accounts of successful attacks on computer systems at military sites are perhaps the most disturbing, since tighter security might be expected there; Box 1.1 contains just a few examples of recent attacks on both critical and noncritical DOD computers. The Defense Information Systems Agency (DISA) estimates that DOD may have experienced as many as 250,000 attacks on its computer systems in a recent year and that the number of such attacks may be doubling[9] each year (U.S. GAO, 1996). The exact number of attacks is not known because DISA's own penetration attempts on these systems indicate that only about 1 in 150 attacks is actually detected and reported (U.S. GAO, 1996).

[9] Specifically, defense installations reported 53 attacks in 1992, 115 in 1993, 255 in 1994, and 559 in 1995.
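The arithmetic behind such estimates is easy to reproduce. The following sketch is ours: it simply scales the reported counts in footnote [9] by the 1-in-150 detection rate, assuming (as the cited figures alone do not establish) that the rate is uniform across years:

    # Back-of-the-envelope version of the DISA/GAO estimate: scale reported
    # incident counts by the assumed detection-and-reporting rate.
    reported = {1992: 53, 1993: 115, 1994: 255, 1995: 559}
    detection_rate = 1 / 150   # about 1 in 150 attacks detected and reported

    for year, count in reported.items():
        print(year, count, round(count / detection_rate))  # 1995 -> 83,850

    # Year-over-year growth in reported attacks, consistent with the
    # "doubling each year" characterization.
    counts = list(reported.values())
    print([round(b / a, 2) for a, b in zip(counts, counts[1:])])  # [2.17, 2.22, 2.19]

Note that this simple scaling yields roughly 84,000 attacks for 1995, well short of the 250,000 figure cited above, so the published estimate evidently rests on additional assumptions beyond the detection rate alone.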
Similarly troubling statistics about private-sector computer break-ins have been reported (Hardy, 1996; Power, 1996; War Room Research LLC, 1996).

Attacks specifically directed at NISs running critical infrastructures are not frequent at present, but they do occur. According to FBI Director Louis Freeh, speaking at the March 1997 Computer Crime Conference in New York City, a Swedish hacker shut down a 911 emergency call system in Florida for an hour (Milton, 1997). And in March of 1997, a series of commands sent from a hacker's personal computer disabled vital services to the Federal Aviation Administration control tower at the Worcester, Massachusetts, airport (Boston Globe, 1998).
To a first approximation, "everything" is becoming interconnected. The June 1997 Pentagon cyber-war game "Eligible Receiver" (Gertz, 1998; Myers, 1998) demonstrated that computers controlling electric power distribution are, in fact, accessible from the Internet. It is doubtless only a matter of time before the control network for the public telephone network is discovered to be similarly connected; having just one computer connected (directly or indirectly) to both networks suffices. Thus, the Internet will ultimately give ever larger numbers of increasingly sophisticated attackers access to the computer systems that control critical infrastructures. The study committee therefore concluded that resisting attack is a dimension of trustworthiness that, although not a significant source of disruption today, has the potential to become a significant cause of outages in the future.

Interconnection within and between critical infrastructures further amplifies the consequences of disruptions, making the trustworthiness of one system conditional on that of another. The lesson of the Northeast power blackout in the late 1960s was that disruptions can propagate through a system with catastrophic consequences. Three decades later, in July 1996, a tree shorting a power line running to a power plant in Idaho brought about cascading outages that ultimately took down all three of the main California-Oregon transmission trunks and interrupted service for 2 million customers (Sweet and Geppert, 1997). Was the lesson learned?

The interdependence of critical infrastructures also enables disruption to propagate. An accidental fiber cut in January 1991 (Neumann, 1995) blocked 60 percent of the long-distance calls into and out of New York City but also disabled air traffic control functions in New York, Washington, D.C., and Boston (because voice and data links to air traffic control centers use telephone circuits) and disrupted the operation of the New York Mercantile Exchange and several commodity exchanges (because buy and sell orders, as well as pricing information, are communicated using those circuits). The impact of such a disruption could easily extend to national defense functions.[10] Furthermore, a climate of deregulation is promoting cost control and product enhancements in electric power distribution, telecommunications (Board on Telecommunications and Computer Applications, 1989), and other critical infrastructures, actions that increase vulnerability to disruption by diminishing the cushions of reserve capacity and increasing the complexity of these systems.

[10] In March 1997, DISA disclosed that a contract had been awarded to Sprint for a global telecommunications network designed primarily to carry signal intelligence data to Fort Meade (Brewin, 1997). According to the Defense Science Board (1996), the U.S. government procures more than 95 percent of its domestic telecommunications network services from U.S. commercial carriers.
THIS STUDY IN CONTEXT

Network security, information warfare, and critical-infrastructure protection have already been the subject of other national studies. The most visible of these studies, summarized in Appendix F, have focused on the expected shape and consequences of widespread networking, defending against information warfare and other cyber-threats, the coordination of federal and private-sector players in such a defense, and national policies affecting the availability of certain technological building blocks (e.g., cryptography). The absence of needed technology has been noted, and aggressive programs of research to fill broadly characterized gaps are invariably recommended.

A Computer Science and Telecommunications Board study almost a decade ago anticipated the role networked computers would play in our society along with the problems that they could create (CSTB, 1991). Its opening paragraph summarized the situation then and today with remarkable clarity:

    We are at risk. Increasingly, America depends on computers. They control power delivery, communications, aviation, and financial services. They are used to store vital information, from medical records to business plans to criminal records. Although we trust them, they are vulnerable to the effects of poor design and insufficient quality control, to accident, and perhaps most alarmingly, to deliberate attack. The modern thief can steal more with a computer than with a gun. Tomorrow's terrorist may be able to do more damage with a keyboard than with a bomb.

More recently, in October 1997, the President's Commission on Critical Infrastructure Protection released a report (PCCIP, 1997) that discusses the vulnerability of U.S. infrastructures to physical as well as cyber-threats. Based substantially on the commission's recommendations and findings, Presidential Decision Directive 63 (White House National Security Council, 1998) outlines a procedure and administrative structure for developing a national infrastructure protection plan. The directive orders immediate federal government action, with the goal that, within 5 years, our nation's critical infrastructures will be protected from intentional acts that would diminish the functioning of government, public services, the orderly functioning of the economy, and the delivery of essential telecommunications, energy, financial, and transportation services. Among the directive's general principles and guidelines is a request that research for protecting critical infrastructures be undertaken.
The present study offers a detailed agenda for that research. It is an agenda that was developed by analyzing current approaches to trustworthiness and by identifying science and technology that currently do not, but could, play a significant role. The agenda thus fills the gap left by predecessor studies, with their focus on infrastructure vulnerabilities and the wider consequences. Articulating a research agenda is a necessary first step in obtaining better methods of infrastructure protection.

The research agenda should be of interest to researchers, who will ultimately execute the agenda, and to funders of research, who will want to give priority to research problems that are urgent and approaches that are promising. The research agenda should also be of interest to policymakers who, in formulating legislation and initiating other actions, will profit from knowing which technical problems do have solutions, which will have solutions if research is supported, and which cannot have solutions. NIS operators can profit from the agenda in much the same way as policymakers will. And product developers should be interested in the research agenda for its predictions of market needs and promising directions to address those needs.

SCOPE OF THIS STUDY

The premise of this report is that a "trust gap" is emerging between the expectations of the public (along with parts of government) and the capabilities of NISs. The report is organized around an agenda and call for research aimed at improving the trustworthiness of NISs and thereby narrowing this gap. To develop this agenda, the study committee surveyed the state of the art, current practice, and trends with respect to computer networking and software. The committee also studied connections between these technical topics and current economic and political forces; those investigations, too, are summarized in the report.

Some of the research problems in the proposed agenda are new. Others are not new but warrant revisiting in light of special requirements and circumstances that NIS developers and operators face. The networked environment imposes novel constraints, enables new types of solutions, and changes engineering trade-offs. Characteristic elements of NISs (COTS software, extensible components, and evolution by accretion) affect software development practices. And the need to simultaneously support all of the dimensions of trustworthiness invites reconsidering known approaches for individual dimensions of trustworthiness with an eye toward possible interactions.

The Internet and public telephone network figured prominently in the study committee's thinking, and that emphasis is reflected in Chapter 2 of this report. The attention is justified on two grounds. First, the Internet and public telephone network are themselves large and complex NISs. Studying extant NISs is an obvious way to understand the technical problems that will be faced by developers and operators of future NISs. Second, the high cost of building a global communications infrastructure from the ground up implies that one or both of these two networks is likely to furnish communications services for most other NISs.[11] With such a pivotal role, the trustworthiness and vulnerabilities of these communications fabrics need to be understood.

[11] For example, during the Persian Gulf conflict, the Internet was used to disseminate intelligence and counterintelligence information. Moreover, defense experts believe that public messages originating within regions of conflict will, in the future, provide warnings of significant political and military developments earlier than normal intelligence gathering. These experts also envision the Internet as a back-up communications medium if other conventional channels are disrupted during conflicts (U.S. GAO, 1996).
Commercial software packages and systems, and not systems custom-built from scratch, are also a central subject of this report, as is most evident in Chapter 3 on software development. This focus is sensible given the clear trend in government and military procurement to adapt and depend on commodities and services intended for the mass market.[12] Research that ignores COTS software could have little impact on trustworthiness for tomorrow's NISs.[13] In the past, computer science research programs serving military needs could safely ignore commercial software products and practices; that course now invites irrelevance.

[12] According to the Report of the Defense Science Board Task Force on Information Warfare Defense (IW-D) (Defense Science Board, 1996), COTS systems constitute over 90 percent of the information systems procured by DOD. Moreover, the widespread use of COTS systems in military systems for the coming century is urged in National Defense Panel (1997).
[13] Research that takes into account COTS commodities and services is likely to be applicable to the development of custom-designed systems as well. Methods suitable for systems built from scratch, however, may not apply in the presence of the added constraints that COTS purchases impose.

Chapter 4 concerns security. The extensive treatment of this single dimension of trustworthiness merits comment, especially given the relative infrequency with which attacks today are responsible for NIS outages. A research agenda must anticipate tomorrow's needs. Hostile attacks are the fastest-growing source of NIS disturbances. Indications are that this trend will continue[14] and that, because they can be coordinated, attacks are potentially the most destabilizing form of trustworthiness breach. Furthermore, the study committee found that past approaches to security (i.e., the "Orange Book" [U.S. DOD, 1985] and its brethren) are less and less relevant to building a trustworthy NIS: inappropriate disclosure of information is only one of many security policies of concern, and custom construction and/or complete analysis of an entire NIS or even significant parts of an NIS is impractical. The typically complex trust relationships that exist among the parts of an NIS add further complication.

[14] The present study was conducted without access to classified material. Unclassified studies, such as U.S. General Accounting Office (1996), point to the growing incentive to attack infrastructure and defense computing systems, as these systems become more critical, and to the expanding base of potential attackers that is accompanying the growth of the Internet.

The "holy grail" for developers of trustworthy systems is technology to build trustworthy systems from untrustworthy components. The subject of Chapter 5, this piece of the research agenda is the most ambitious. What is being sought can be achieved today for single dimensions of trustworthiness, lending some credibility to the vision being articulated. For example, highly reliable computing systems are routinely constructed from unreliable components (by using replication). As another example, firewalls enable networks of insecure processors to be protected from certain forms of attack, as the sketch below illustrates. And new algorithmic paradigms and system architectures could result in the emergence of desirable system behavior from seemingly random behaviors of system components. Without further research, though, it is impossible to know whether approaches like these will actually bear fruit for NIS trustworthiness. Fleshing out highly speculative research directions with details is impossible without actually doing some of the research, so the discussions in Chapter 5 are necessarily brief.
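To make the firewall example concrete, here is a minimal default-deny packet filter. The sketch is our own; the rule set, addresses, and ports are hypothetical, and production firewalls are far more elaborate:

    # Toy default-deny packet filter: a small, analyzable component that
    # shields less trustworthy hosts behind it. Rules are hypothetical.
    from ipaddress import ip_address, ip_network

    ALLOW_RULES = [
        (ip_network("10.0.0.0/8"), 25),   # internal hosts may reach the mail relay
        (ip_network("0.0.0.0/0"), 80),    # anyone may reach the public web server
    ]

    def permit(src: str, dst_port: int) -> bool:
        """Admit a packet only if some allow rule matches; otherwise drop."""
        return any(ip_address(src) in net and dst_port == port
                   for net, port in ALLOW_RULES)

    assert permit("10.1.2.3", 25)        # internal mail traffic passes
    assert permit("192.0.2.7", 80)       # external web traffic passes
    assert not permit("192.0.2.7", 23)   # telnet from outside is dropped

The protection comes from structure rather than from trusting the hosts: the filter itself is small enough to analyze, even though the network behind it is not.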
The viability of technological innovations is invariably determined by the economic and political context, the subject of Chapter 6. The economics of building, selling, and operating trustworthy systems is discussed, because economics determines the extent to which technologies for trustworthiness can be embraced by system developers and operators, and it determines whether users can justify investments in supporting trustworthiness. The dynamics of the COTS marketplace and an implied limited diversity have become important for trustworthiness, so they, too, are discussed. Risk avoidance is but a single point in a spectrum of risk management strategies; for NISs (because of their size and complexity) it is most likely an unrealistic one. Thus, alternatives to risk avoidance are presented in the hope of broadening the perspectives of NIS designers and operators. Finally, since there is more to getting research done than articulating an agenda, the chapter reviews the workings of DARPA and NSA (likely candidates to administer this agenda), U.S. cryptography policy, and the general climate in government regarding regulation and trustworthiness.

REFERENCES

Associated Press. 1997. "Fifteen Year Old Hacker Discusses How He Accessed U.S. Military Files," Associated Press, March 1.
Board on Telecommunications and Computer Applications, National Research Council. 1989. Growing Vulnerability of the Public Switched Networks: Implications for National Security Emergency Preparedness. Washington, DC: National Academy Press.
Boston Globe. 1998. "Youth Faces Computer Crime Charges: U.S. Attorney Says Federal Case Is First Involving a Juvenile," Boston Globe, March 18. Available online at .
Brewin, Bob. 1997. "DISA Discloses Secret NSA Pact with Sprint," Federal Computer Week, March 10. Available online at .
Computer Science and Telecommunications Board (CSTB), National Research Council. 1991. Computers at Risk: Safe Computing in the Information Age. Washington, DC: National Academy Press.
Defense Science Board. 1996. Report of the Defense Science Board Task Force on Information Warfare Defense (IW-D). Washington, DC: Office of the Under Secretary of Defense for Acquisition and Technology, November 21.
Executive Office of the President, Office of Science and Technology Policy. 1997. Cybernation: The American Infrastructure in the Information Age: A Technical Primer on Risks and Reliability. Washington, DC: Executive Office of the President.
Gertz, Bill. 1998. "'Infowar' Game Shut Down U.S. Power Grid, Disabled Pacific Command," Washington Times, April 16, p. A1.
Hardy, Quentin. 1996. "Many Big Firms Hurt by Break-ins," Wall Street Journal, November 21, p. B4.
Kalish, David E. 1998. "Phone Outage Hits East Coast," Associated Press, February 25. Available online at .
Mills, Mike. 1998. "AT&T High Speed Network Fails; Red Cross, Banks Scramble to Adjust," Washington Post, April 14, p. C1.
Milton, Pat. 1997. "FBI Director Calls for Effort to Fight Growing Danger of Computer Crime," Associated Press, March 4.
Myers, Laura. 1998. "Pentagon Has Computers Hacked," Associated Press, April 16.
National Defense Panel. 1997. Transforming Defense: National Security in the 21st Century. Arlington, VA: National Defense Panel, December.
National Security Telecommunications Advisory Committee (NSTAC). 1997. Reports from the Eight NSTAC Subcommittee Investigations. Tysons Corner, VA: NSTAC, December 10-11. Available online at .
Neumann, Peter G. 1995. Computer-Related Risks. New York: ACM Press.
Neumann, Peter G. 1996. "Rats Take Down Stanford Power and Silicon Valley Internet Service," RISKS Digest, Vol. 18, Issue 52, October 12. Available online at .
Perillo, Robert J. 1997. "AT&T Database Glitch Caused '800' Phone Outage," Telecom Digest, Vol. 17, Issue 253, September 18. Available online at .
Power, Richard G. 1996. Testimony of Richard G. Power, Computer Security Institute, before the Permanent Subcommittee on Investigations, Committee on Government Affairs, U.S. Senate, Washington, DC, June 5.
President's Commission on Critical Infrastructure Protection (PCCIP). 1997. Critical Foundations: Protecting America's Infrastructures. Washington, DC: PCCIP, October.
Schultz, Gene. 1997. "Crackers Obtained Gulf War Military Secrets," RISKS Digest, Vol. 18, Issue 96, March 31. Available online at .
Sweet, William, and Linda Geppert, eds. 1997. "Main Event: Power Outages Flag Technology Overload, Rule-making Gaps," IEEE Spectrum, 1997 Technology Analysis and Forecast.
U.S. Department of Defense (DOD). 1985. Trusted Computer System Evaluation Criteria, Department of Defense 5200.28-STD, the "Orange Book." Ft. Meade, MD: National Computer Security Center, December.
U.S. General Accounting Office (GAO). 1996. Information Security: Computer Attacks at Department of Defense Pose Increasing Risks: A Report to Congressional Requesters. Washington, DC: U.S. GAO, May.
War Room Research LLC. 1996. 1996 Information Systems Security Survey. Baltimore, MD: War Room Research LLC, November 21.
Ware, Willis H. 1998. The Cyber-posture of the National Information Infrastructure. Washington, DC: RAND Critical Technologies Institute. Available online at .
Wayner, Peter. 1997. "Human Error Cripples the Internet," New York Times, July 17. Available online at .
White House National Security Council. 1998. White Paper: The Clinton Administration's Policy on Critical Infrastructure Protection: Presidential Decision Directive 63. Washington, DC: The White House, May 22.
Zuckerman, M.J. 1996. "Post-Cold War Hysteria or a National Threat," USA Today, June 5, p. 1A.