
3
Improving the Nation’s Cybersecurity Posture

Given the scope and nature of the cybersecurity threat as discussed in Chapter 2, what should the nation do about it? This chapter begins with a committee-developed “Cybersecurity Bill of Rights” (Section 3.1) that characterizes what it would mean for cyberspace to be safe and secure. Building on this characterization, Section 3.2 describes the information technology (IT) landscape into which cybersecurity research flows. It describes the twin needs for research that would lead to improved deployment of today’s cybersecurity technologies and the emergence of new cybersecurity technologies in the future. Section 3.3 explains the rationale for cybersecurity research, placing such research in the larger context of the cybersecurity problem, and Section 3.4 concludes the chapter with five principles that should guide that research.

3.1
THE CYBERSECURITY BILL OF RIGHTS

The Cybersecurity Bill of Rights (CBoR) describes a vision for a safe and more secure cyberspace. In the most general sense, individual users, organizations, and society at large are entitled to use and rely on information technologies whose functionality does not diminish even when these technologies are under attack. Although there are 10 provisions in the CBoR that articulate desirable security properties of information technology writ large, it is likely that as information technology evolves, other provisions will need to be added and the existing ones modified.

3.1.1
Introduction to the Cybersecurity Bill of Rights

The Cybersecurity Bill of Rights is a statement of security goals or expectations—what is it that society should reasonably expect in the way of security in its information technologies, and what should technologists and organizations strive to achieve? Since many or most of today’s information technologies are not designed or implemented with the goals of the CBoR in mind, the Cybersecurity Bill of Rights also illustrates the enormous gap between what information technologies should do and what they now do. Serious efforts directed at achieving these goals would greatly decrease—but never eliminate—the security risks associated with using information technology. As importantly, the availability of information technologies designed and implemented with these goals in mind would expand the policy choices available to society about the functionality that it deserves and should expect from its technologies.

As a statement of expectations, the security provisions of the CBoR are neither absolute nor unconditional. When an information technology system or component does not embed a provision that should be provided, users have a right to know that the technology they are using does not meet that expectation so that they can act accordingly. Moreover, the way in which the provisions of the CBoR are realized for any given system will depend on many contextual factors. For example, the cybersecurity needs of an individual end user are different from those of a bank or the electric power grid.

In constructing the CBoR, the committee derived the provisions by considering four categories that are important to cybersecurity. These categories involve the following: (1) holistic systems properties relating to availability, recoverability, and control of systems; (2) traditional security properties relating to confidentiality, authentication, and authorization; (3) crosscutting properties such as safe access to information, confident invocation of important transactions, including those that will control physical devices, and knowledge of what security will be available; and (4) matters relating to jurisprudence: that is, appropriate justice for victims of cyberattack. (Some of the categories and provisions within them overlap.)

Finally, the CBoR is user-centric, but “user” should be interpreted broadly. Users include individual end users, organizations, and—most importantly—programs and system components that use (invoke or call on) other information technology systems or components. But taken together and viewed overall, the CBoR should be seen as a societal bill of rights, because the use of information technology in society has ramifications reaching far beyond a single individual or organization. Because critical societal functions depend on information technology, the security
of the information technologies involved in those functions is of paramount importance.

3.1.2
The Provisions of the Cybersecurity Bill of Rights
  • The first three provisions relate to holistic systems properties including availability, recoverability, and control of systems:

    I. Availability of system and network resources to legitimate users.

      Users of information technology systems (from individuals to groups to society, and including programs and applications1) should be able to use the computational resources to which they are entitled and systems that depend on those resources. Attacks intended to deny, seriously degrade, or reduce the timeliness of information technology-based services should not succeed.

    II. Easy and convenient recovery from successful attacks.

      Because cybersecurity measures will sometimes fail, recovery from a security compromise will be necessary from time to time. When necessary, such recovery should be easy and convenient for individual users, systems administrators, and other operators. Recovery is also an essential element of survivability and fault tolerance. Recovery should be construed broadly to include issues related to long-term availability in the face of “bit rot” and incompatible upgrades.2

    III. Control over and knowledge of one’s own computing environment.

      Users expect to be in control of events and actions in their own immediate environment, where control refers to taking actions that influence what happens in that environment. Knowledge refers to knowing how events in that environment compare with the user’s expectations of what should be happening. To the extent that events and actions not initiated by the user are occurring, a breach in security may be under way.

1 Groups and societies are effectively aggregations of users, and computer programs and applications are proxies of users.

2 “Bit rot” refers to the phenomenon in which a program (or features of a program) will suddenly stop working after a long time, even though “nothing has changed” in the environment. In fact, the environment has changed, although perhaps in subtle and unnoticed ways.

    • The next three provisions relate to the traditional security properties of confidentiality, authentication (and its extension, provenance), and authorization:

      IV. Confidentiality of stored information and information exchange.

        One central function of information technology is the communication and storage of information. Just as most people engage in telephone conversations and store paper files with some reasonable assurance that the content will remain private even without their taking explicit action, users should expect electronic systems to communicate and store information in accordance with clear confidentiality policies and with reasonable and comprehensible default behavior. Systems for application in a particular problem domain should be able to support the range of privacy policies relevant to that domain.

        As for systems that communicate with one another, some or all of the information that they pass among themselves belongs to someone, at least in the sense that someone has a confidentiality interest in it. In other cases, the information may not be particularly sensitive, but there is almost never any affirmative reason for that information to be shared with other parties unbeknownst to the owner—suggesting that external access to normally confidential data should occur only with explicit permission.

        As a particularly important way of ensuring confidentiality, responsible parties should have the technical capability to delete or expunge selected information that should not be permanently stored. This is important in the context of removing erroneous personal information from cyberspace. Today, electronically recorded information can be difficult to remove from the databases in which it is stored. For example, “deleted” information may be retained in a backup—and it should be possible to delete information from backups as well as from the original recording medium.

        Whether or not—in a particular situation—it is appropriate to delete all instances of a given datum is a policy issue. But even if a policy choice were made that asserted that such deletions were appropriate, the technology of today is largely incapable of supporting that choice.

      V. Authentication and provenance.

        Mutual authentication of the senders and receivers involved in an information exchange is an essential part of maintaining confidentiality, since passing information to the wrong party or device is an obvious way in which confidentiality might be violated.

    As an extension of traditional authentication, users should have access to reliable and relevant provenance (that is, knowledge of the responsible parties) for any electronic information or electronic event, commensurate with their need for security and assurance.

    Provision V does not rule out anonymous speech, for example—but it does mean that any given user should be able to refuse to accept information from or participate in events initiated by anonymous parties. Information originating from untrustworthy sources should not be able to masquerade as information originating from known trustworthy sources. When information has no explicit provenance, users and their software agents should be able to determine this fact and make decisions regarding trust accordingly. Information sources and events in cyberspace should be construed broadly, so that deliberately hostile or antisocial sources and actions should have provenance as well. Provenance should be reliable and nonrepudiable.

    VI. The technological capability to exercise fine-grained control over the flow of information in and through systems.

      Authorized parties should be technically able to exercise fine-grained control over flows of information. For example, it should be technologically possible for an individual to conduct certain online transactions with technologically guaranteed anonymity, and for putative partners in such transactions to decline to participate if anonymity is offered. It should also be technologically possible for individuals to know who collects what information about them. And, they should have the technical ability to restrict the types, amounts, and recipients of personal information.

      Access privileges determine the functionality that an information technology system or network offers to a user or other entity. Circumstances may change in such a way that privileges need to be revoked—for example, when a user is terminated or determined to be a threat, or when a service has been compromised. Revocation of privileges at various granularities is a necessary security capability.

      Whether or not individuals should have legal rights to exercise fine-grained control over the flow of information in and through systems is a policy issue. But even if a policy choice were made that asserted the propriety of such legal rights, the technology of today is largely incapable of supporting that choice.

    • The next three provisions relate to crosscutting properties of systems such as safe access to information, confident invocation of important transactions, including those that will control physical devices, and knowledge of what security will be available:

      VII. Security in using computing directly or indirectly in important applications, including financial, health care, and electoral transactions and real-time remote control of devices that interact with physical processes.

        Security is especially important in certain kinds of transactions, such as those involving financial, medical, or electoral matters. Further, computational devices increasingly control physical processes as well as information processes, and such devices may have the potential to act dangerously in the physical world. It is thus especially important that cyberattackers be precluded from impairing the safe operation of physical devices.

        In this context, security refers to availability, integrity, appropriate privacy controls on information, guarantees about the identities of the involved parties sufficient to prevent masquerading and other attacks, and nonrepudiation guarantees so that parties can be assured of their interactions.

      VIII. The ability to access any source of information (e.g., e-mail, Web page, file) safely.

        Today, many security vulnerabilities are exploited as the result of some user action in accessing some source of information. In this context, safe access means that nothing unexpected happens and that nothing happens to compromise the expected confidentiality, integrity, and availability of the user’s information or computational resources. Safety cannot be assured with 100 percent certainty under any circumstances (for example, a user may take an allowed but unintended action that results in compromised confidentiality), but with proper attention to technology and to usability, the accessing of information can be made much less risky than it is today.

      IX. Awareness of what security is actually being delivered by a system or component.

        Users generally have expectations about the security-relevant behavior of a system, even if these expectations are implicit, unstated, or unfounded. System behavior that violates these expectations is often responsible for security problems. Thus, users have a right to know what security policies and assurances are actually
    being delivered by a system or component so that they can adjust their own expectations and subsequent behavior accordingly. As an illustration, nonexpert users need to know how security settings map onto policies being enforced, as well as how settings need to be specified in order to achieve a particular policy.

    Such awareness also implies the ability to make informed judgments about the degree of security that different systems provide. If individuals and organizations are to improve their cybersecurity postures, they need to know how to compare the security of different systems and the impact of changes on those systems. To a great degree, quantitative risk assessments, rational investment strategies, and cybersecurity insurance all depend on the ability to characterize the security of systems.

    • The last provision relates to justice:

      X. Justice for security problems caused by another party.

        In most of society, there is an expectation that victims of harm are entitled to some kind of justice—such as appropriate punishment of the perpetrator of that harm. But today in cyberspace there is no such expectation, owing largely to the difficulty of identifying perpetrators and the lack of a legal structure for pursuing them. In addition, individuals who are victimized or improperly implicated because of cybersecurity problems should have access to due process that would make them whole. Society in its entirety should also have the ability to impose legal penalties on cyberattackers regardless of where they are located.

    3.1.3
    Concluding Comments

    Every set of rights has responsibilities associated with it. Because the CBoR defines a set of security expectations for information technology, it has implications for every party that creates or uses information technology. Designers and developers of information technologies for end users will have obligations to produce systems whose security behavior is consistent with the CBoR unless otherwise explicitly noted to be inconsistent. Designers and developers of information technology systems and components on which other systems depend are also affected, because the CBoR defines for system designers and developers a set of expectations for what can happen on either side of an interface between two components. That is, because information technology systems today are crafted and deployed in a modular fashion, the CBoR also has design and implementation implications for the functionality of
    those two components, regardless of the side of the interface on which each resides. To the extent that the CBoR can be relied on to set security expectations for components developed by different parties, the result will be a more orderly world that supports composability of the building blocks in the IT infrastructure. The CBoR would also require end users to be sufficiently knowledgeable to ascertain whether and to what extent the information technology that they use in fact delivers on the CBoR’s security obligations.

    How should the goals of the CBoR be achieved? As the discussion in the remainder of this report indicates, a new way of thinking about security—a drastic cultural shift—will be necessary regarding the ways in which secure systems are designed, developed, procured, operated, and used. In the long run, such a shift will entail new directions in education, training, development practice, operational practice, oversight, liability laws, and government regulation.

    3.2
    REALIZING THE VISION

    Compared to what is available today, the foregoing vision of a secure cyberspace is quite compelling. However, for two distinct though related reasons, we are a long way away from meeting this goal. The first reason is that there is much about cybersecurity technologies and practices that is known but not put into practice. As an example, according to the senior information security officer at a major financial institution, the codification and dissemination of best practices in cybersecurity policy at the level of the chief executive officer or the chief information officer have been particularly challenging, because incentives and rewards for adopting best practices are few. Box 3.1 indicates the limited scope of threats against which certain common commercial products defend.

    The second reason is that even assuming that everything known today were immediately put into practice, the resulting cybersecurity posture—though it would be stronger and more resilient than it is today—would still be inadequate against today’s threats, let alone tomorrow’s. Closing this gap—a gap of knowledge—will require research, as discussed below.

    3.3
    THE NECESSITY OF RESEARCH

    Framing the issue of necessary research requires understanding the larger context of which such research is a part. Today, the vast majority of actual cybersecurity efforts are devoted to a reactive catch-up game that fixes problems as they are discovered (either in anticipation of attack as

    BOX 3.1

    What Firewalls and Antivirus Products Protect Against

    Firewalls—whether implemented with hardware or software—are used to prevent malicious or unwanted traffic from reaching protected resources or to allow only authorized traffic (e.g., from specific network addresses). Antivirus products generally scan files or file systems looking for known computer viruses or malicious code, usually relying on a frequently updated virus definition file.

    Below is a short list of some of the vulnerabilities that firewalls and antivirus products attempt to address; a brief illustrative sketch of rule-based filtering follows the list:

    • Worms. Both firewalls and antivirus products can be used to identify and slow (or halt) the propagation of computer worms, which, unlike viruses, can act independently once released.

    • Viruses. Antivirus products can scan for, remove, and often repair damage done by viruses obtained from opening infected e-mails or other means.

    • Trojans. Antivirus products can identify and remove Trojan horse software (i.e., malicious software that masquerades as legitimate software), while firewalls can be used to spot and prevent network traffic associated with Trojan horse software.

    • Vulnerability scans. Firewalls can be used to prevent automated port-scanning tools outside the firewall from uncovering open ports on (or otherwise learning about) potentially vulnerable machines behind the firewall.

    • Denial-of-service attacks. Firewalls can often assist in mitigating denial-of-service attacks by blocking traffic from offending network addresses.

    • Insider misbehavior. Firewalls are often used to block specific kinds of network traffic (or requests) from those inside the firewall as well—for example, by not allowing traffic over specific ports used by applications deemed inappropriate for a given setting (e.g., P2P file-sharing applications in an office setting) or by blocking access to specific Web sites that an organization has deemed inappropriate for a given setting.
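
To make the rule-based filtering described in Box 3.1 concrete, the short sketch below shows, in Python, how a packet filter might admit or reject traffic on the basis of source address and destination port. It is an illustrative toy with assumed rule values (the addresses come from reserved documentation ranges), not a description of how any particular firewall product works.

    from ipaddress import ip_address, ip_network

    # Hypothetical rule set: block a known-bad address range, allow only selected
    # destination ports, and deny everything else by default.
    BLOCKED_SOURCES = [ip_network("203.0.113.0/24")]   # reserved documentation range
    ALLOWED_DEST_PORTS = {80, 443, 25}                 # e.g., HTTP, HTTPS, SMTP

    def permit(src_addr: str, dst_port: int) -> bool:
        """Return True if a packet from src_addr to dst_port should pass."""
        src = ip_address(src_addr)
        if any(src in net for net in BLOCKED_SOURCES):
            return False        # e.g., mitigating a known denial-of-service source
        if dst_port not in ALLOWED_DEST_PORTS:
            return False        # closes ports that scans or P2P applications would use
        return True             # default-deny: only explicitly listed traffic passes

    if __name__ == "__main__":
        print(permit("203.0.113.7", 80))     # False: source address is on the block list
        print(permit("198.51.100.9", 443))   # True: allowed port, unblocked source
        print(permit("198.51.100.9", 6346))  # False: port not on the allow list

Real firewalls implement the same default-deny idea in the network stack or in dedicated hardware, and over many more attributes of the traffic than source address and destination port.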

    the good guys find them or in response as the bad guys find them). Moreover, end users often do not avail themselves of known cybersecurity technologies and practices that could significantly improve their individual resistance to cyberattack of various kinds. For example, they often do not install patches to systems that could close known security holes in their design, implementation, or configuration. Vendors of IT products and services often do not use technologies and development practices that could reduce the number of security vulnerabilities embedded in them. For example, they do not use known technologies that might prevent the buffer overflows that continue to account for roughly half of all Computer Emergency Response Team Coordination Center (CERT/CC) advisories.3

    3 For more on the CERT/CC advisories, see http://www.cert.org/advisories/.

    Reactive efforts are essential because it is impossible to replace the existing IT infrastructure in one fell swoop (and even if it were possible, we would not know what to replace it with) and because the security of any given system will require upgrading throughout its life cycle as new threats emerge and new vulnerabilities are found. Still, continuously reacting to cybersecurity problems—without new approaches to developing and deploying a stronger and more secure technological foundation—is a poor way to make progress against escalating or new threats. By their very nature, reactive efforts are incremental; vulnerabilities that flow from basic system design and architectural concepts cannot be fixed by such means, and often patching introduces additional security flaws. A focus on patching also tends to draw interest and attention away from more fundamental architectural problems that cannot be simply fixed with a patch.

    Security add-ons will always be necessary to fix individual security problems as they arise, and R&D is needed to develop improved tools and techniques for dealing with near-term fixes (e.g., configuration management, audit, patch management), but ultimately there is no substitute for system- or network-wide security that is architected from initial design through deployment, easy to use, and minimally intrusive from the user’s standpoint.

    Furthermore, for all practical purposes, the cybersecurity risks (the combination of adversary threats and technical or procedural vulnerabilities) of the future are impossible to predict in any but the most general terms. Because it is difficult to anticipate innovation (which changes the architecture or implementation underlying specific systems) and to comprehend complex systems (which makes understanding the systems in place today very hard), it is almost guaranteed that unforeseen applications will result in unforeseen security concerns and human beings will be unable to anticipate all of the security issues that accompany complex systems.

    In short, in many ways security is an emergent property of a complex IT system that depends on both the underlying system architecture and its implementation. Consider, for example, the relatively common practice of building an application on top of an off-the-shelf operating system. Although the applications builder can in principle know all there is to know about the application, its relationship to the operating system is known only through the various application programming interfaces (APIs) of the operating system. But since the input-output behavior of these APIs is usually incompletely specified (e.g., it may not be documented how the system responds when inputs are provided that are not of the expected variety), the overall relationship between application and operating system cannot be known completely. Much research is needed on the properties, practices, and disciplines to drive this emergence—just as research in the nascent complexity sciences is addressing similar problems of understanding emergence in other problem domains characterized by sensitive dependence on initial conditions.
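
A deliberately simplified sketch of this point follows; the function and field names are invented for illustration and are not drawn from any real operating system interface. An upstream check written against only the documented behavior of a parsing routine misses a case that the routine's undocumented behavior accepts, producing a security-relevant outcome that neither component exhibits on its own.

    # Toy illustration of emergent behavior at an interface boundary.
    # parse_record()'s documented contract: parse "key=value" pairs separated by ";".
    # An undocumented detail: surrounding whitespace in keys is stripped.
    def parse_record(raw: str) -> dict:
        fields = {}
        for item in raw.split(";"):
            if "=" in item:
                key, value = item.split("=", 1)
                fields[key.strip()] = value   # whitespace handling is unspecified
        return fields

    # An application-level check written against the documented behavior only:
    # it tries to block privilege escalation by rejecting "role=admin" requests.
    def filter_ok(raw: str) -> bool:
        return "role=admin" not in raw

    def is_admin(raw: str) -> bool:
        return filter_ok(raw) and parse_record(raw).get("role") == "admin"

    if __name__ == "__main__":
        print(is_admin("user=alice;role=admin"))    # False: the filter rejects it
        print(is_admin("user=alice;role =admin"))   # True: the filter misses it, but the
                                                    # parser strips the space, a hole that
                                                    # neither component has on its own

Each component behaves reasonably in isolation; the vulnerability arises only from their composition across an incompletely specified interface.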

    This does not mean that it is impossible to identify areas of focus, but it does imply that within those areas of focus the nation’s research strategy should seek to develop a broad and diverse technological foundation that would enable more rapid responses to new and currently unforeseen threats as they emerge as well as to yield unanticipated advances.

    As for the character of the research needed, both traditional and unorthodox approaches will be necessary. Traditional research is problem-specific, and there are many cybersecurity problems for which good solutions are not known. (A good solution to a cybersecurity problem is one that is effective, is robust against a variety of attack types, is inexpensive and easy to deploy, is easy to use, and does not significantly reduce or cripple other functionality in the system of which it is made a part.) Research is and will be needed to address these problems.

    But problem-by-problem solutions, or even problem-class by problem-class solutions, are highly unlikely to be sufficient to close the gap by themselves. Unorthodox, clean-slate approaches will also be needed to deal with what might be called a structural problem in cybersecurity research now, and these approaches will entail the development of new ideas and new points of view that revisit the basic foundations and implicit assumptions of security research.

    Addressing both of these reasons for the lack of security in cyberspace is important, but it is the second—closing the knowledge gap—that is the primary goal of cybersecurity research and the primary focus of this report.

    3.4
    PRINCIPLES TO SHAPE THE RESEARCH AGENDA

    This section describes a set of interrelated principles that the committee believes should shape the research agenda. Some are principles intended to drive specific components of the research agenda, while others are intended to change the mind-set with which the agenda is carried out. Individually, none of these principles is new, but in toto they represent the committee’s best understanding of what should constitute a sound philosophical foundation for cybersecurity research.

    3.4.1
    Principle 1: Conduct cybersecurity research as though its application will be important.
    3.4.1.1
    The Rationale

    The committee’s view on conducting cybersecurity research is shaped by two essential points. First, much of today’s cybersecurity research is limited to creating “building blocks” for security that could be incorporated into various applications. Today’s dominant perspective is that basic research entails the creation or in-principle demonstration of a new cybersecurity concept or mechanism, and that bringing this concept or mechanism into real-world use is somehow less demanding or intellectually less worthy than the “basic” or “fundamental” research that led to the innovative concept or mechanism.

    But research that results only in a proof of concept or a feasibility demonstration is often far from practical application, and an innovation, original though it may be, is not a tool or a system. Indeed, there is an enormous distance between the development of a good idea and its widespread use (whether by end users or by system designers and developers), and traversing that distance often entails additional research activity that is significant in its own right.

    For example, the committee believes that the likelihood of a good idea succeeding in the marketplace is enhanced if it is scalable, adoptable, and composable.

    • A scalable idea works on real-world problems of reasonable size in reasonable time.

    • An adoptable idea is one whose benefits can easily be seen by its potential users, and it can be easily used by parties other than its creator.

    • A composable idea can be integrated into a system without necessitating full-scale re-analysis and retesting. Composability is desirable because any system of significant size is usually developed in pieces by separate groups and at separate times, and matters are complicated further by the fact that users may configure a system so that different combinations of components are active. Without composability, the “complete” system must be tested as one big and maximally complex lump.

    Much additional research may be necessary to make a given concept scalable, composable, and adoptable. However, such considerations are often not taken seriously in the basic science stage, as many researchers believe they can defer such issues until the technology is ready for delivery. This attitude has inhibited the development of practical tools even though the underlying science had promise.

    Formal verification methods provide an example. Formal methods such as model checking have been successfully applied to hardware on an economically significant scale. Nonetheless, much of the early work on formal verification methods for software resulted in technologies that required large amounts of training or radical changes to engineering practice, or that were based on unrealistic ideas about requirements gathering, or that were too costly and unable to interoperate, or that required hardware capabilities for undertaking verification that were not easily available. In many cases, these methods could operate only when the entire program to be verified was available, and could not operate on well-defined subunits. Researchers on formal methods did not have the benefit of effective metrics for assessing the benefits that might flow from adopting these methods.

    In recent years, some of these problems have been overcome, with the result that formal methods do have some genuine utility in software development, and the use of formal methods is a hotbed of activity in research and in companies like Microsoft. One example is Microsoft’s Static Driver Verifier (SDV) tool. The SDV is a static code-analysis tool for formally verifying that device drivers comply with various application programming interface rules about how the driver interfaces with an operating system. Box 3.2 provides more details.
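
The toy sketch below is not SLAM or the SDV; it is a minimal Python illustration, under strong simplifying assumptions (straight-line code, a single invented locking rule, and invented function names), of what it means to check an API-usage rule statically rather than by running the code.

    import ast

    # Greatly simplified rule checker in the spirit of rule-based driver verification.
    # The invented rule: every acquire_lock() must be matched by release_lock(),
    # with no nesting. Only straight-line sequences of calls are handled.
    def check_lock_rule(source: str) -> list:
        violations, held = [], False
        for stmt in ast.parse(source).body:
            if not (isinstance(stmt, ast.Expr) and isinstance(stmt.value, ast.Call)):
                continue
            call = stmt.value
            name = call.func.id if isinstance(call.func, ast.Name) else None
            if name == "acquire_lock":
                if held:
                    violations.append(f"line {stmt.lineno}: acquire while lock already held")
                held = True
            elif name == "release_lock":
                if not held:
                    violations.append(f"line {stmt.lineno}: release without a matching acquire")
                held = False
        if held:
            violations.append("end of code: lock acquired but never released")
        return violations

    GOOD = "acquire_lock()\ndo_io()\nrelease_lock()\n"
    BAD = "acquire_lock()\ndo_io()\nacquire_lock()\n"   # double acquire, never released

    if __name__ == "__main__":
        print(check_lock_rule(GOOD))   # []
        print(check_lock_rule(BAD))    # reports two violations

Note that the checker never executes acquire_lock() or do_io(); like the tools described here, it reasons about the code as data, which is what allows violations to be flagged before the driver ever runs.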

    Second, the committee believes that a view of cybersecurity research as being devoted only to the creation of building blocks is far too narrow, and is one of the primary reasons that the benefits of past cybersecurity research have not been fully realized. While the creation of new cybersecurity building blocks is an essential and primary component of any research agenda in cybersecurity, the span of cybersecurity research must be broadened in several interrelated dimensions to encompass—indeed, embrace—the application of known and future approaches to specific application domains, development of cybersecurity tools for every part of the IT life cycle, and multidisciplinary approaches to cybersecurity problems.

    A related point is that focusing research attention on questions of deployment will help to reduce the time needed to deploy innovations. Large-scale deployments of any kind inevitably take a long time, and even small reductions in lead time could make a big difference should the need arise for the deployment of cybersecurity measures in an emergency.

    In addition, it is important for research to consider and decision makers to take into account the enormous political pressures to “do something” in the wake of a catastrophe. Indeed, it is not unknown that measures hastily put into place after a disaster have subsequently proven to be ineffective, or even worse, harmful. It is thus appropriate to focus some research attention on how to sensibly deploy emergency measures

    BOX 3.2

    Lessons Learned from the Technology-Transfer Effort Associated with Microsoft’s Static Driver Verifier

    The Static Driver Verifier (SDV) systematically analyzes Windows device drivers against a set of rules that define what it means for a device driver to properly interact with the Windows operating system kernel. The SDV is based on a code-analysis engine known as SLAM, which incorporates type checking, model checking, program analysis, and automated deduction. SLAM was the result of research to create methodologies and tools to check the correctness of partial specification of program behavior—specifically the use of the device driver interface to the Windows kernel. The SDV provides an automated environment for running SLAM that incorporates rules for the Windows Driver Model; a well-articulated environment model of the Windows kernel and other drivers; scripts to configure the SDV with driver-specific information; and a graphical user interface to present results.

    Intellectually, the primary lesson learned in the transition from the SLAM research to the working SDV tool is to focus on problems rather than technology. The problem must be recognized as critical by product developers and end users, and not just technically interesting to researchers. It must also be bounded sufficiently to provide a tangible solution with measurable success criteria. All parties involved, including product developers, end users, and researchers, must see clearly the link between the problem at hand and the solution—which is what the implementation of the SDV framework made clear in the case of the SLAM research.

    From an organizational point of view, the primary lesson is that leaving the scaling up of a research prototype as an exercise for the development group is likely to result in lack of acceptance and adoption, since the development group will not necessarily make the “obvious” leap from technology solution to useful and viable product. Successful technology transfer is, at least in part, a research team responsibility, and involves considerable effort on the part of researchers to understand how product teams operate, how they allocate resources, how they make decisions, and what it takes to turn a prototype into a product.


    SOURCE: Adapted from Thomas Ball, Byron Cook, Vladimir Levin, and Sriram K. Rajamani, SLAM and Static Driver Verifier: Technology Transfer of Formal Methods Inside Microsoft, Microsoft Research Technical Report, MSR-TR-2004-08, January 2004.

    under such circumstances. In addition, because post-catastrophe deployments often change the boundaries of what is politically feasible, research should also consider what sensible things might be done if and when such opportunities arise.

    3.4.1.2
    New Computing Paradigms and Applications Domains

    Cybersecurity problems in an environment of large-scale distributed computing, embedded computing, batch processing and mainframe computing, desktop computing, Web services (see Section 8.4.3), and pervasive computing (see Section 8.4.4) may be different from one another, even when meaningful analogs among these paradigms can be identified. Contexts of use matter as well: Internet services support Web browsing over HTTP and remote log-in, but the security issues associated with Web browsing are far greater than those associated with remote log-in simply because the former is used far more than the latter.

    At a deep technical level, the types of attacks that may be launched in these different environments are not so different from one another, and the fundamental research issues needed to address these attacks were identified in the early 1970s and have not changed significantly since then. But these environments do differ significantly in their exposure to a wide range of anonymous attackers. As a result, the opportunities for launching different types of attack do vary significantly, suggesting the need for research on the scope and nature of those opportunities in the different environments and how those opportunities might be limited or circumscribed.

    But what is less well appreciated is that similar issues apply in applications domains, and understanding how a particular cybersecurity approach is relevant to a particular application domain can be and often is as challenging as developing that approach in the first place. Cybersecurity research is most likely to be relevant to an application domain if it is conducted with deep knowledge of and insight into the issues that arise in that domain. An explicit consideration of the application domain serves both to inspire cybersecurity research based on the security problems associated with the domain and to increase the likelihood that the research will be used to solve real problems in the application domain. Examples of such application domains include cybersecurity for health care applications (see Section 8.4.1) and for the electric power grid (see Section 8.4.2).

    Since most cybersecurity researchers do not have domain-specific expertise, collaboration with others who do becomes a sine qua non for success in this kind of research. Moreover, these collaborations must be undertaken as enterprises among co-equals—and in particular the computer scientist as cybersecurity researcher cannot view the problem domain as “merely” the applications domain, must refrain from jumping to conclusions about the problem domain, must be willing to learn the facts and contemplate realities and paradigms in the problem domain seriously, and must not work solely on the refined abstract problem that characterizes much of computer science research. Similarly, applications experts cannot view security as a mere annoyance to be brushed aside as quickly as possible, must refrain from jumping to conclusions about cybersecurity, must be willing to learn the facts and contemplate realities
    and paradigms in cybersecurity seriously, and must not work in complete isolation from the abstractions of computer science research.

    The need for collaboration between domain experts and cybersecurity specialists can also be seen in the issue of how to make security functionality more usable by nonspecialists. Addressed at greater length in Section 6.1, the research area of usable security entails the development of security technologies that can be integrated seamlessly into how people already do their work, thereby increasing the likelihood that they will actually be used in everyday life.

    3.4.1.3
    Attending to Security Throughout a System’s Life Cycle

    For many years, tensions between security and other desirable system attributes or functionality have generally not been resolved in ways that have improved security. While these kinds of tension may never disappear, and indeed in some cases (e.g., in the absence of a serious observed threat) it can make good economic and business sense to resolve these tensions in such a manner, the committee believes strongly that cybersecurity must be regarded as an essential element throughout the entire life cycle of an IT product or service and that cybersecurity efforts should focus much more on creating inherently secure products. Security products that retroactively attempt to apply security to systems will always be needed, and security-related afterthoughts will always be necessary (simply because the good guys cannot anticipate every possible move by the bad guys), but the reality of security is that it is important in every phase of a system’s life cycle, including requirements specification, design, development and implementation, testing and evaluation, operations, maintenance, upgrade, and even retirement. Whether different foci of research are needed to address security issues in each of these phases is an open question, but it is clear that the needs for security are not identical in each phase—and so researchers and funders should be open to the idea of phase-specific cybersecurity research.

    As an example of thinking implied by this principle, consider a search for alternatives to the notion of perimeter defense, which has been a common approach to security for many years. Under perimeter defense, what is “inside” a vital information system or network is protected from an outside “attacker” who may try to “penetrate” the system to gain access to or acquire control over data and system resources on the inside.

    Perimeter defense has the major advantage of being scalable. That is, defensive perimeters such as firewalls are deployed because it is much easier to secure one machine than several thousand. Scalability comes from the fact that adding a machine inside the perimeter imposes little if any additional burden on the defense.

    However, in practice, perimeter defense is often implemented in ways that require no changes to systems on the inside of the perimeter. That is, defensive efforts are focused primarily on one perimeter—the perimeter that encompasses the entire system—with little defensive attention to components inside. (One familiar example of this is a firewall that protects all of the computers on a local area network—with the result that an attacker who compromises the firewall has rendered all of the computers on that network vulnerable.) The mind-set of perimeter defense is that “those inside the perimeter need not be concerned about security in any significant way.”

    In a world of increasingly interconnected and numerous computers and networks, this notion of perimeter defense is no longer realistic (if it ever was). Definitions of “inside” and “outside” change fluidly with business strategy and partnerships, and yesterday’s business partner may be tomorrow’s insider threat. In coalition operations involving U.S. military forces, an ally today may be an adversary tomorrow—implying that the implementation of security policies must be continually updated, since the categories of friend and foe are fluid rather than fixed. The growing proliferation of wireless technologies and the reliance on employees working from home or while traveling make the notion of “outside” a slippery concept. Trusted insiders may also be compromised. Most importantly, when the perimeter is breached (whether by virtue of a technical weakness such as buffer overflow or an operational weakness such as an employee being bribed to reveal a password), the attacker has entirely free rein inside.
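
A schematic contrast, with invented service and token names, between a component that trusts anything reaching it from “inside” the perimeter and one that authorizes every request itself is sketched below; it illustrates the idea rather than prescribing any particular architecture.

    # Schematic contrast between perimeter-only trust and per-request checks.
    # All names and tokens here are invented for illustration.
    AUTHORIZED_TOKENS = {"token-for-alice": "alice"}    # stand-in for real credentials

    def payroll_perimeter_only(request: dict) -> str:
        # Assumes the firewall already kept bad actors out: anyone who can reach
        # this function gets the data. A single perimeter breach exposes everything.
        return f"payroll data for {request['user']}"

    def payroll_defense_in_depth(request: dict) -> str:
        # Re-checks authorization on every request, even from "inside" callers,
        # so a breached perimeter does not by itself grant free rein.
        user = AUTHORIZED_TOKENS.get(request.get("token"))
        if user is None or user != request.get("user"):
            raise PermissionError("request not authorized")
        return f"payroll data for {user}"

    if __name__ == "__main__":
        attacker = {"user": "alice", "token": "stolen-or-missing"}
        print(payroll_perimeter_only(attacker))         # succeeds once inside the perimeter
        try:
            print(payroll_defense_in_depth(attacker))
        except PermissionError as err:
            print("blocked:", err)                      # the internal check still holds

The second handler costs more to build and operate, which is one reason the perimeter-only mind-set persists; the point of the contrast is that interior checks limit what an attacker gains from a single breach.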

    3.4.1.4
    Engaging a Multidisciplinary Approach to Cybersecurity

    Any meaningful cybersecurity research program should be understood as a highly multidisciplinary enterprise for two related reasons. First, adversaries can focus their efforts on any weak point in a system, whether that weak point is technological, organizational, sociological, or psychological. Interactions related to these factors may influence the technical agenda (e.g., consideration of how to make audit trails valuable evidence in court proceedings), but a technical agenda—that is, one limited to technology alone—will almost certainly be insufficient to solve real-world problems. Put differently, cybersecurity must be regarded holistically if real-world security is to be improved. Second, solutions to cybersecurity problems may also have some relationship to law enforcement authorities, insurance companies, customers, users, international governments, and so on. Solutions developed without recognizing these relationships may prove to be unusable for practical purposes in the face of real-world deployment problems.

    Understanding why certain “technically promising” research may be inadequate or unusable is necessarily multidisciplinary, involving matters of economics, law and regulation, organization theory, psychology, and sociology, as well as deep insights into technology. To illustrate, consider that applications-in-practice require attention to a range of nontechnical issues:

    • Persuading operators and developers to adopt best practices in areas such as patch management, configuration management, audit and logging, organizational and management processes, software engineering techniques, architecture, and network configuration, through awareness, codification of those practices, and education programs.

    • Developing the value proposition and business case for the deployment of security, which includes economic models and measurement techniques to facilitate models for estimating costs and benefits, testbeds, field trials, and case studies to demonstrate and assess value when in situ. This point is discussed further in Section 6.4.

    • Easing changes to established business and engineering practices that may be associated with the introduction of cybersecurity functionality.

    • Ensuring that the application-in-practice is organizationally scalable. For example, a small pilot program to test the suitability of a security application may not reveal the range of exceptional cases that must be handled when the application is deployed throughout the organization. Large-scale deployments are almost always organizationally stressful, and procedures tested in a small-scale environment often need debugging and optimization when an application is scaled up.

    • Providing incremental benefit for incremental deployment. It is difficult to adopt cybersecurity solutions that provide benefit only when they are widely deployed, if only because the burden of proof is large under these circumstances. Conversely, “early gratification”—that is, when an increment of additional work or attention to cybersecurity results in some relatively immediate reward that relates to the current ongoing development activity—can obviate or dramatically reduce the need to use a manager-imposed “force majeure” that coerces the development team into adopting a security measure or technology.

    • Ensuring robustness against changing attacks. A specific cybersecurity solution may protect against the exploitation of a particular vulnerability, but be rendered ineffective by a small change in the nature of that exploit. Unless the nature of that change can be kept secret (a very hard condition to meet), such “solutions” will be rendered ineffective very quickly as attackers seek to counter them.

    • Managing tensions between security and operational resilience. Although certain tensions between security and other desirable properties have often been noted (e.g., tensions between security and ease of use), the connection between security and organizational resilience has often been overlooked. For example, operational compliance with any given organizational security policy is facilitated by standardization, but standardization often increases the risk of common-mode failures. Security is often enhanced by physical security—sensitive activities being undertaken in protected locations—but organizational resilience in crisis often relies on distribution of processing and mobile access to information. Security is enhanced by encryption and tight access controls, but in crisis or emergency, decryption keys and the small number of individuals with the necessary access are often unavailable.

    These points suggest a need for problem-oriented research in addition to traditional discipline-oriented research. The latter tends to characterize research in most computer science academic departments and universities. Problem-oriented research, on the other hand, will require close collaboration among cybersecurity researchers and experts from other disciplines and, as suggested in Section 3.4.1.2, with application domain experts as well.

    Because of the stovepiped nature of many academic disciplines, including computer science, special efforts will be needed to nurture problem-oriented interdisciplinary efforts that encourage and incentivize academic cybersecurity researchers to interact with researchers in other specialties, both in other university departments and in nonacademic research institutes.

    3.4.2
    Principle 2: Hedge against uncertainty in the nature of the future threat.

    It is unknown whether a significant high-end cyberthreat will in fact emerge into public view, and judgments about the likelihood of such an emergence vary. But given the potential damage that such an adversary could inflict, it seems prudent to take a balanced approach that provides a hedge against that possibility. In the absence of substantial evidence about the existence of a high-end threat, a “Manhattan Project” approach to strengthening the nation’s cybersecurity posture is likely unwarranted because of the enormous cost of such an effort, to say nothing of how one would know whether such an effort had been successful.

    At the same time, it is reasonable to construct a research agenda in cybersecurity that is both broader and deeper than might be required if
    only low-end threats were at issue. The development of stronger technological foundations for computer and network security is, of course, highly relevant to threats across the entire spectrum, but because a high-end threat may well be capable of undertaking more sophisticated or more subtle technical attacks, the technological research agenda must be correspondingly deeper. Because high-end adversaries would be perfectly happy to target nontechnological elements of a system, a broader research agenda will be needed to develop approaches to defending those elements as well.

    Note that this hedge against uncertainty refers to R&D rather than deployment. That is, deployment costs are often large—and organizations may have sound reasons for not deploying various cybersecurity measures if a threat has not obviously manifested itself. Whatever the downside of a reactive approach, decision makers are often reactive because they do not see the value of certain proactive measures in the absence of a manifestly obvious threat. But it is undeniable that should a threat become manifestly obvious, decision makers will want to have options “off the shelf” that can be deployed in a short time so as to minimize the possible damage—and the very purpose of R&D is to expand the number of options available should high-end threats materialize.

    Of course, the term “short” is a relative one—and the time in question is “shorter than would be possible if R&D had not been conducted.” Research results cannot be deployed instantaneously, nor on a wide scale in less than a few years. In the face of the sudden emergence of a manifestly obvious high-end threat, it might be possible to deploy research prototypes on a scale of a few weeks or months for critical systems (and the likelihood of being able to do so would be higher if research had been conducted in accordance with Principle 1). For the majority of other systems, an emergency response might well be to put into place draconian procedural and technical measures that would mitigate the high-end attack but also would have the effect of drastically reducing the operational utility of those systems. As relevant research results were deployed to protect these systems, the original mitigation measures could be scaled back and the original operational utility of these systems gradually returned to normal.

    3.4.3
    Principle 3: Ensure programmatic continuity in the research agenda.

    A research program should support a substantial effort in areas with a long time horizon for payoff. Such support would necessarily extend for timescales long enough to make meaningful progress on hard problems (5 years to investigate a promising technology is not unreasonable, for example) and in sufficient amounts that new technologies might be developed and tested in reasonably realistic operating environments.4 Historically, such investigations have been housed most often in academia, which can conduct research with fewer pressures for immediate delivery on a bottom line.

    This is not to say that long-term research cannot have intermediate milestones, though the purpose of such milestones should be to enable midcourse corrections rather than to impose go/no-go decisions that can demoralize researchers and make them overly conservative. Long-term research can also involve collaboration, early and often, with technology-transition stakeholders and can engage both academic and industrial actors, even in the basic science stages. Such stakeholders gain an early planning view and an opportunity to influence the course of research and development.

    Private industry has important roles to play as well. Today, industrial research and development in cybersecurity is a significant component of the nation’s cybersecurity R&D efforts, and meaningful cybersecurity results emerge from this effort. In addition, industrial participation, or at least the involvement of product developers, is essential for developing prototypes and mounting field demonstrations. Thus, it is highly appropriate to support academic/industrial cooperation in efforts oriented toward development.

    Possible synergies between government and academia/private industry deserve support as well. For example, both the National Institute of Standards and Technology and the National Security Agency (NSA) have very deep expertise regarding certain aspects of cybersecurity that could be valuable in the conduct of even unclassified research undertaken in the civilian sector.

    Finally, program managers—and more importantly, funders—of such research must be tolerant of research directions that do not appear to promise immediate applicability. Research programs, especially in IT, are often—even generally—more “messy” than research managers would like. The desire to terminate unproductive lines of inquiry is understandable, and sometimes entirely necessary, in a constrained budget environment. On the other hand, it is frequently very hard to distinguish between (A) a line of inquiry that will never be productive and (B) one that may take some time and determined effort to become productive. While an intellectually robust research program must be expected to go down some blind alleys occasionally (indeed, even frequently), the current political environment punishes such blind alleys as being of Type A, with little apparent regard for the possibility that they might be of Type B.5

    4. Note, however, that it is a long way from a prototype or conceptual proof-of-principle that is usable only by its creator to a tool that might be tested or deployed in such environments. In his classic text The Mythical Man-Month (Reading, Mass.: Addison-Wesley, 1995), Frederick P. Brooks, Jr., estimates that the effort necessary to create a programming systems product from a program is an order of magnitude larger than the effort needed to create the program itself.

    Most researchers, regardless of field, would argue that programmatic continuity is needed in any research program. But such continuity is particularly relevant to cybersecurity. As noted in Section 2.6, cybersecurity problems will endure as long as bad guys have incentives to compromise the security of IT-based systems and networks, and thus cybersecurity research will always be needed to deal with some new and unanticipated exploit. Moreover, because the underlying technology evolves (quite rapidly, in fact), solutions crafted in one IT environment may well no longer be useful in a different one.

    3.4.4
    Principle 4: Respect the need for breadth in the research agenda.

    One of the most frequent complaints from federal policy makers regarding reports that lay out research agendas is that such reports do not set priorities. Policy makers argue that in an environment of limited financial resources, they look to the research community to set priorities so that limited dollars can be spent most effectively. The committee understands the persuasiveness of and rationale for this perspective, and for this reason it has identified important areas of research focus (grouped into six categories and explored in detail in Chapters 4 through 9). Nevertheless, the committee is still quite concerned that an excessively narrow focus on priority areas would result in other important topics or promising avenues being neglected and that such a focus would run significant risks of leaving the nation unprepared for a rapidly changing cybersecurity environment.

    Broad research agendas are often regarded as “peanut butter spread”—a pejorative term used among policy makers to refer to spreading resources more or less evenly among a large number of programs or efforts. It is pejorative because the implication is that no thought has gone into deciding whether these efforts are necessary at all, and that the “spread” simply reflects the unwillingness of the agenda’s creators to set priorities. But the need for breadth in this case reflects the simple reality that there is no silver bullet, or even a small number of silver bullets, that will solve “the cybersecurity problem,” and a broad research agenda helps to ensure that good ideas are not overlooked.

    5. National Research Council. 2003. Information Technology for Counterterrorism: Immediate Actions and Future Possibilities. The National Academies Press, Washington, D.C.


    The basic canon of priority setting is that one identifies the most important problems and allocates resources preferentially to solve those problems. “Importance” is related both to the frequency of occurrence and to the severity of the impact of any given occurrence. But severity is very difficult to ascertain in general, as it depends on the details and the significance of the particular systems attacked. As for frequency, the deployment of a defense that addresses the threat of a highly likely Attack A may well lead to a subsequent increase in the likelihood of a previously less likely Attack B. In short, adversaries may not behave in accordance with expectations based on static probability distributions, and thus it is very difficult to prioritize a research program for countering terrorism in the same way that one might, for example, prioritize a program for dealing with natural disasters. (Section 6.4.2 describes some of the issues related to quantitative risk assessment.)

    The fundamental asymmetry between attacker and defender also affects the research agenda. The cyberdefender must be successful at every point in the defense, whereas the cyberattacker must succeed only once. Even if one vulnerability is closed, a serious attacker will seek another vulnerability to exploit. This search will cost the attacker some time, and this other vulnerability may be more difficult to exploit—these factors make it worthwhile to close the original vulnerability. But there is no sense in which closing the original vulnerability can be said to be a final solution.

    Consequently, new exploitations of vulnerabilities can appear with very little warning. In many cases, these new exploitations are merely variations on a theme, and the defense can easily adjust to the new attack. But in other cases, these new exploitations are qualitatively different, of a nature and character not seen before. Although such cases are hopefully rare, it is safe to bet that the rate at which they appear will not be zero. If qualitatively new attacks suddenly manifest themselves, considerable time will elapse before techniques and technologies can be developed to handle them. Conducting a broad research agenda is likely to decrease significantly the time needed to develop countermeasures against these new attacks when they appear.

    Cybersecurity is analogous to developing a defense against con men and fraudsters, who are infinitely creative, or at least very clever, in adapting old attacks to new technologies. There are, of course, basic principles that enable one to guard against con men and fraudsters. But it is not realistic to imagine that one or even a few promising approaches will prevent or even substantially mitigate fraud in the future. Rather, a good cybersecurity research agenda is more like a good strategy for investing in the stock market, both of which are driven by a multitude of unpredictable factors. Although there are basic principles of investment about which any investor should be cognizant, ultimately a diversified portfolio of investments is a foundational element of any good overall strategy—even if one is willing to place bets on a few very promising stocks.

    These comments should not be taken to mean that all topics are equally important in an absolute sense—only that the committee believes that any top-down articulation of research priorities is bound to be overtaken by events (e.g., new technologies, new threats, new kinds of exploits) very rapidly. Rather, decisions about what areas or topics should be supported should be made by those in a position to respond most quickly to the changing environment—namely, the research constituencies that provide peer review and the program managers of the various research-supporting agencies.

    Finally, notions of breadth and diversity in the cybersecurity research agenda should themselves be interpreted broadly. A great deal of experience suggests that cybersecurity considerations are not easily separated from other engineering issues and, in particular, go hand in hand with the design and engineering of secure systems. Cybersecurity is relevant to research, education, and practice for every component of the IT system development life cycle, and research focused on these components should itself include a cybersecurity dimension. By tacitly accepting the current practice of fencing off “cybersecurity research” into separate programs, research programs tend to focus primarily on those areas that are more “purely cybersecurity,” such as cryptographic protocols and other aspects of cybersecurity that are easily separable from basic system design and implementation, and to neglect those areas where integration is a principal concern, chiefly the engineering of software and cyber-physical systems. Integrating cybersecurity considerations into related programs (software and systems engineering, operating systems, programming languages, networks, Web applications, and so on) will help program managers in these areas to better situate cybersecurity in the overall engineering context. Because perfection in engineering practice is unattainable, it is necessary to pursue, simultaneously, a wide variety of interventions across a broad front. Section 4.3 (Software and Systems Assurance) explores these comments in somewhat greater depth.

    3.4.5
    Principle 5: Disseminate new knowledge and artifacts.

    University research activities are an important crucible in which new ideas are discovered, invented, and explored. But publication or other dissemination of research results is also a sine qua non for progress, and it is necessary to disseminate results to a community broader than one’s own research laboratory for research to have a wide impact, a point that argues for cybersecurity research to be conducted on an unclassified basis as much as possible. As argued in the 2005 cybersecurity report of the President’s Information Technology Advisory Committee, “the vast majority of the Nation’s academic researchers do not hold the security clearances needed to undertake classified work [and furthermore] many research universities regard classified research as incompatible with their role as producers of knowledge benefiting society as a whole.”6 Almost by definition, broad dissemination is incompatible with classified research. (See also the discussion in Section B.6.4.2.)

    As a logical matter, two courses are possible: expanding the number of researchers with clearances or making more research unclassified. Although the committee acknowledges that there are some circumstances in which cybersecurity research should be classified, it also believes that these circumstances are narrow. Furthermore, a significant expansion in the number of cybersecurity researchers with security clearances does not seem feasible in the present political environment. Thus, the committee believes that, as a general rule, the nation would be better served by the latter course.

    A related point is that the cybersecurity expertise and talent developed in the classified world are likely to be quite relevant to the civilian world, and mechanisms to share ideas about technology and training with the public, and in particular with students in the field, should be encouraged. A notable example of such technology sharing is the National Security Agency’s Domestic Technology Transfer Program, established for the purpose of openly sharing NSA-developed technologies with the non-NSA community.7 NSA has also worked with at least one major IT vendor to enhance the security of its products.8

    It is also worth noting that the declassifying of cybersecurity research has some parallels with the 1990s debate—since resolved—over restricting the export of strong cryptography.9 Under the restrictions in effect at the time, the export of products embedding strong cryptography, and even of basic knowledge about cryptography, was regulated as part of the munitions trade. The rationales for those export controls related to the undesirability of allowing strong cryptography to be used by adversaries, both nation-states and criminals. But ultimately, decision makers realized that national security and economic security needs could not be easily disentangled, and that in an increasingly globalized economic environment, the ability of commercial firms to keep information confidential was important indeed. Beginning in the late 1990s, export controls on cryptography were gradually relaxed.

    6. President’s Information Technology Advisory Committee, Cyber Security: A Crisis of Prioritization, National Coordination Office for Information Technology Research and Development, Washington, D.C., February 2005; available at www.nitrd.gov/pitac/reports/20050301_cybersecurity/cybersecurity.pdf.

    7. For more information on this program, see http://www.nsa.gov/techtrans/index.cfm for a description of the program and http://www.nsa.gov/techtrans/techt00004.cfm for a description of tools and technologies related to cybersecurity.

    8. Alec Klein and Ellen Nakashima, “For Windows Vista Security, Microsoft Called in Pros,” Washington Post, January 9, 2007; available at http://www.washingtonpost.com/wp-dyn/content/article/2007/01/08/AR2007010801352.html.

    9. National Research Council. 1996. Cryptography’s Role in Securing the Information Society, Kenneth W. Dam and Herbert S. Lin (eds.). National Academy Press, Washington, D.C.

    Finally, in many fields of scientific research, the primary means of disseminating discoveries is through presentations at conferences or publication in refereed journals. However, in much of computer science, important research knowledge and insight are conveyed through the dissemination and use of software and/or hardware artifacts. Because cybersecurity has experimental dimensions, those responsible for academic human resource decisions should expect that significant research results in cybersecurity will be broadly disseminated through software downloads at least as much as through published papers or conference proceedings.10

    10. National Research Council. 1994. Academic Careers for Experimental Computer Scientists and Engineers. National Academy Press, Washington, D.C. This report addresses the conflict between standard academic metrics of merit (i.e., published papers) and the practice of disseminating artifacts as is done in experimental computer science.


