The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.

Executive Summary

As communications and computation technologies become increasingly pervasive in our lives, individuals are asked to authenticate themselves, to verify their identities, in a variety of ways. Activities ranging from electronic commerce to physical access to buildings to e-government have driven the development of increasingly sophisticated authentication systems. Yet despite the wide variety of authentication technologies and the great range of activities for which some kind of authentication is required, virtually all involve the use of personal information, raising privacy concerns. The development, implementation, and broad deployment of authentication systems require that issues surrounding identity and privacy be thought through carefully.

This report explores the interplay between authentication and privacy. It provides a framework for thinking through policy choices and decisions related to authentication systems. Authentication's implications for privacy do not necessarily equate to violations of privacy, but understanding the distinctions requires being aware of how privacy can be affected by the process of authentication. Such awareness is usually absent, however, because authentication tends to be thought about more narrowly, in connection with security. In deciding how to design, develop, and deploy authentication systems, it is necessary to weigh privacy, security, cost, user convenience, and other interests. A key point is that all of these factors are subject to choice: whether any given system violates privacy depends on how it is designed and implemented. Changes in technology and practice make this the time for broader, more rigorous analyses of options in authentication.

The complexity of the interplay between authentication and privacy becomes clear when one tries to define authentication, which can take multiple forms:

- Individual authentication is the process of establishing an understood level of confidence that an identifier refers to a specific individual.
- Identity authentication is the process of establishing an understood level of confidence that an identifier refers to an identity. The authenticated identity may or may not be linkable to an individual.
- Attribute authentication is the process of establishing an understood level of confidence that an attribute applies to a specific individual.

A common understanding and consistent use of these and other terms defined in the report are a prerequisite for informed discussion. The three variants above illustrate that authentication is not a simple concept: As the committee's first report on nationwide identity systems argued, grappling with these issues and their implications is just not that easy (Box ES.1).

This summary of the report includes the findings and recommendations of the authoring Committee on Authentication Technologies and Their Privacy Implications. Each of these findings and recommendations, which are more fully developed and supported in the body of the report, is followed by the number of the finding or recommendation in parentheses. This number corresponds to the chapter where the finding or recommendation is found and its order of appearance in that chapter.

SECURITY, AUTHENTICATION, AND PRIVACY

Authentication is not an end in itself. In general, people are authenticated so that their requests to do something can be authorized and/or so that information useful in holding them accountable can be captured. Authentication systems are deployed when control of access and/or protection of resources, both key functions of security, are necessary.
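The distinction drawn above between attribute authentication and individual authentication can be made concrete with a small sketch. The scheme, the token format, and all names below are hypothetical illustrations, not from the report: a trusted issuer signs a bare attribute token (here, "over 21") that carries no identifier, so a verifier can gain confidence in the attribute without learning who holds it.

```python
import hmac
import hashlib
import os

# Hypothetical sketch: an issuer signs an attribute-only token with a
# keyed hash. The token confirms a property of the holder while
# disclosing no identity, in contrast to individual authentication.

ISSUER_KEY = os.urandom(32)  # secret held by the (hypothetical) issuer

def sign(message: bytes) -> bytes:
    return hmac.new(ISSUER_KEY, message, hashlib.sha256).digest()

def issue_attribute_token(over_21: bool) -> tuple[bytes, bytes]:
    # Attribute-only: the token says nothing about *who* holds it.
    msg = b"over_21=" + (b"1" if over_21 else b"0")
    return msg, sign(msg)

def verify(msg: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(msg), tag)

msg, tag = issue_attribute_token(True)
assert verify(msg, tag)    # the attribute is confirmed
assert b"name" not in msg  # no identity was disclosed
```

An individual-authentication variant would bind an identifier into the signed message, which is exactly the extra disclosure the report's attribute-first guidance seeks to avoid.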
The three generic means of authentication that tend to be used in practice can be described loosely as "something you know," "something you have," or "something you are." The systems discussed in this report, based on technologies such as passwords, public key infrastructures (PKI), smart cards, and biometrics, among others (see Boxes ES.2, ES.3, and ES.4), generally implement one or a combination of these approaches.

1. Computer Science and Telecommunications Board, National Research Council. IDs Not That Easy: Questions About Nationwide Identity Systems. Washington, D.C.: National Academy Press, 2002.
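Combining factors is a common way these approaches are used together. A minimal sketch, under assumed conventions (function names and parameters are illustrative, not from the report): "something you know" is a password kept server-side only as a salted, iterated hash, and "something you have" is a device-held secret that answers a fresh challenge.

```python
import hashlib
import hmac
import os

# Hypothetical two-factor sketch: a knowledge factor (password,
# stored only as a derived verifier) plus a possession factor
# (device secret answering a one-time challenge).

def password_verifier(password: str, salt: bytes) -> bytes:
    # The server keeps only this derived value, never the password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def device_response(device_secret: bytes, challenge: bytes) -> bytes:
    # Possession is demonstrated by answering a fresh challenge.
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

def authenticate(password: str, salt: bytes, verifier: bytes,
                 challenge: bytes, response: bytes,
                 device_secret: bytes) -> bool:
    knows = hmac.compare_digest(password_verifier(password, salt), verifier)
    has = hmac.compare_digest(device_response(device_secret, challenge),
                              response)
    return knows and has

# Enrollment (once) and one login attempt:
salt, device_secret = os.urandom(16), os.urandom(32)
verifier = password_verifier("correct horse battery", salt)
challenge = os.urandom(16)
response = device_response(device_secret, challenge)
assert authenticate("correct horse battery", salt, verifier,
                    challenge, response, device_secret)
```

A "something you are" factor (biometrics) would add a third check, with the distinct privacy considerations the report discusses.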

Finding: Core authentication technologies are generally more neutral with respect to privacy than is usually believed. How these technologies are designed, developed, and deployed in systems is what most critically determines their privacy implications. (5.6)

But what kind of security is necessary, and is authentication required? When authentication is needed, which types might serve best? For example, when accountability is required, individual authentication may be necessary; otherwise, attribute authentication (or no authentication) may suffice.

Finding: Authorization does not always require individual authentication or identification, but most existing authorization systems perform one of these functions anyway. Similarly, a requirement for authentication does not always imply that accountability is needed, but many authentication systems generate and store information as though it were. (2.1)

The use of authentication when it is not needed to achieve an appropriate level of security could threaten privacy. Overall, privacy protection, like security, is poor in most systems, in large part because systems builders are not motivated to improve it.

There is an inherent tension between authentication and privacy, because the act of authentication involves some disclosure and confirmation of personal information. Establishing an identifier or attribute for use within an authentication system, creating transactional records, and revealing information used in authentication to others with unrelated interests all have implications for privacy. The many possible impacts of authentication may not be considered by system designers, whose choices strongly influence how privacy is affected, and they may not be appreciated by the public. Most individuals do not understand the privacy and security aspects of the authentication systems they are required to use in interactions with commercial and government organizations. As a result, individuals may behave in ways that compromise their own privacy and/or undermine the security of the authentication systems.

Finding: Authentication can affect decisional privacy, information privacy, communications privacy, and bodily integrity privacy interests. The broader the scope of use of an authentication system, the greater its potential impact on privacy. (3.1)

The tension between security and privacy does not mean that they must be viewed as opposites. The relationship between the two is complex: Security is needed in order to protect data (among other things), and in many circumstances the data being protected are privacy-sensitive. At the same time, authentication may require the disclosure of personal information by a user. If many have access to that personal information, the value of the information for authentication is decreased, and the decreased privacy of the information through others' access can also compromise security.

A critical factor in understanding the privacy implications of authentication technologies is the degree to which an authentication system is decentralized. A centralized password system, public key system, or biometric system is much more likely to pose security and privacy hazards than a decentralized version of any of these. The scope and scale of an authentication system also bear on these issues.

Finding: Scale is a major factor in the implications of authentication for privacy and identity theft.
The bulk compromise of private information (which is more likely to occur when such information is accessible online), or the compromise of a widely relied-on document-issuing system, can lead to massive issuance or use of fraudulent identity documents. The result would adversely affect individual privacy and private- and public-sector processes. (6.4)

Usability is a significant concern when determining how authentication systems should be deployed and used in practice. Such systems will fail if they do not incorporate knowledge of human strengths and limitations. Users need to be aware when an authentication (and hence possibly privacy-affecting) event is taking place. In addition, user understanding of the security and privacy implications of certain technologies and certain modes of use plays a major role in the effectiveness of the technologies. For example, without a clear understanding of the security and privacy threats to the system, users may behave in ways that undermine the protections put in place by the designers.

Finding: People either do not use systems that are not designed with human limitations in mind or they make errors in using them; these actions can compromise privacy. (4.1)

Recommendation: User-centered design methods should be integral to the development of authentication schemes and privacy policies. (4.2)

There are ways to lessen the impacts that authentication systems have on privacy. Guidelines include the following:

Recommendation: When designing an authentication system or selecting an authentication system for use, one should
- Authenticate only for necessary, well-defined purposes;
- Minimize the scope of the data collected;
- Minimize the retention interval for data collected;
- Articulate what entities will have access to the collected data;
- Articulate what kinds of access to and use of the data will be allowed;
- Minimize the intrusiveness of the process;
- Overtly involve the individual to be authenticated in the process;
- Minimize the intimacy of the data collected;
- Ensure that the use of the system is audited and that the audit record is protected against modification and destruction; and
- Provide means for individuals to check on and correct the information held about them that is used for authentication. (3.2)

More generally, systems should be designed, developed, and deployed with more attention to reconciling authentication and privacy goals.

Recommendation: The strength of the authentication system employed in any system should be commensurate with the value of the resources (information or material) being protected. (2.1)

Recommendation: In designing or choosing an authentication system, one should begin by articulating a threat model in order to make an intelligent choice among competing technologies, policies, and management strategies. The threat model should encompass all of the threats applicable to the system. Among the aspects that should be considered are the privacy implications of the technologies. (4.1)

Recommendation: Individual authentication should not be performed if authorization based on nonidentifying attributes will suffice. That is, where appropriate, authorization technologies and systems that use only nonidentifying attributes should be used in lieu of individual authentication technologies. When individual authentication is required, the system should be subject to the guidelines in Recommendation 3.2 (above). (2.3)

Recommendation: Systems that demand authentication for purposes other than accountability, and that do not themselves require accountability, should not collect accountability information. (2.2)

Recommendation: System designers, developers, and vendors should improve the usability and manageability of authentication mechanisms, as well as their intrinsic security and privacy characteristics. (4.5)

Recommendation: Organizations that maintain online-accessible databases containing information used to authenticate large numbers of users should employ high-quality information security measures to protect that information. Wherever possible, authentication servers should employ mechanisms that do not require the storage of secrets. (6.2)

MULTIPLE IDENTITIES, LINKAGE, AND SECONDARY USE

Who do you find when you authenticate someone? There is no single identity, identifier, or role associated with each person that is globally unique and meaningful to all of the organizations and individuals with whom that person interacts.

Finding: Most individuals maintain multiple identities as social and economic actors in society. (1.1)

People invoke these identities under different circumstances. They may identify themselves as named users of computer systems, employees, frequent fliers, citizens, students, members of professional societies, licensed drivers, holders of credit cards, and so on. These multiple identities allow people to maintain boundaries and protect privacy. That capacity diminishes with the number of identifiers used.

Finding: The use of a single identifier or small number of identifiers across multiple systems facilitates record linkage. Accordingly, if a single identifier is relied on across multiple institutions, its fraudulent or inappropriate use (and subsequent recovery actions) could have far greater ramifications than if it were used in only a single system. (4.3)

The networking of information systems makes it easier to link information across different, even unrelated, systems. Consequently, many different transactions can be linked to the same individual. Systems that facilitate linkages among an individual's different identities, identifiers, and attributes pose challenges to the goal of privacy protection. Once data have been collected (such as from an authentication event or subsequent transactions), dossiers may be created.

Finding: The existence of dossiers magnifies the privacy risks of authentication systems that come along later and retroactively link to or use dossiers. Even a so-called de-identified dossier constitutes a privacy risk, in that identities often can be reconstructed from de-identified data. (4.2)

Secondary use of authentication systems (and the identifiers and/or identities associated with them) is related to linkage. Many systems are used in ways that were not originally intended by the system designers. The obvious example is the driver's license: Its primary function is to certify that the holder is authorized to operate a motor vehicle. However, individuals are now asked to present their driver's licenses as proof of age, proof of address, and proof of name in a variety of circumstances. As discussed in IDs Not That Easy and in this report, the primary use of an authentication system may require security and privacy considerations very different from those appropriate for subsequent secondary uses. (For example, a driver's license that certifies one is capable of driving a motor vehicle is a far cry from certification that one is not a threat to airline travel.) Given the difficulty of knowing all the ways in which a system might be used, care must be taken to prevent secondary use of a system, as such use can easily lead to privacy and security risks.
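One technical approach to limiting the record linkage described here is to give each relying system a different, unlinkable identifier for the same person. A minimal sketch, assuming a hypothetical identity provider that derives per-service pseudonyms with a keyed hash (the key, names, and derivation scheme are illustrative assumptions, not from the report):

```python
import hmac
import hashlib

# Illustrative: an identity provider holds one master key and derives a
# distinct pseudonym per (user, service) pair. Without the key, pseudonyms
# seen by different services cannot be linked back to the same person.

MASTER_KEY = b"example-master-key-keep-secret"  # hypothetical secret

def pseudonym(user_id: str, service: str) -> str:
    msg = f"{service}:{user_id}".encode()
    return hmac.new(MASTER_KEY, msg, hashlib.sha256).hexdigest()[:16]

alice_library = pseudonym("alice", "library")
alice_clinic = pseudonym("alice", "clinic")

# Each service sees a stable identifier for the same user...
assert alice_library == pseudonym("alice", "library")
# ...but identifiers from different services do not match,
# which hinders cross-system record linkage.
assert alice_library != alice_clinic
```

This is one way to realize the report's later guidance that systems should minimize linking of user information across systems unless linkage is their express purpose.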

Finding: Current authentication technology is not generally designed to prevent secondary uses or mitigate their effects. In fact, it often facilitates secondary use without the knowledge or consent of the individual being authenticated. (4.4)

Finding: Secondary uses of authentication systems, that is, uses for which the systems were not originally intended, often lead to privacy and security problems. They can compromise the underlying mission of the original system user by fostering inappropriate usage models, creating security concerns for the issuer, and generating additional costs. (4.5)

At the extreme end of the identity spectrum is the concept of anonymity. Anonymity continues to play an important role in preserving the smooth functioning of society, and it helps to protect privacy. The widespread use of authentication implies less anonymity.

Finding: Preserving the ability of citizens to interact anonymously with other citizens, with business, and with the government is important because it avoids the unnecessary accumulation of identification data that could deter free speech and inhibit legitimate access to public records. (6.7)

Linkage and secondary uses of information and systems can be lessened.

Recommendation: A guiding principle in the design or selection of authentication technologies should be to minimize the linking of user information across systems unless the express purpose of the system is to provide such linkage. (4.3)

Recommendation: Future authentication systems should be designed to make secondary uses difficult, because such uses often undermine privacy, pose a security risk, create unplanned-for costs, and generate public opposition to the issuer. (4.4)

THE UNIQUE ROLES OF GOVERNMENT

Government institutions play multiple roles in the area where authentication and privacy intersect. Their approaches to authentication and privacy protection may differ from those of private sector entities for structural and legal reasons.

Finding: Electronic authentication is qualitatively different for the public sector and the private sector because of a government's unique relationship with its citizens:
a. Many of the transactions are mandatory.
b. Government agencies cannot choose to serve only selected market segments. Thus, the user population with which they must deal is very heterogeneous and may be difficult to serve electronically.
c. Relationships between governments and citizens are sometimes cradle to grave but characterized by intermittent contacts, which creates challenges for technical authentication solutions.
d. Individuals may have higher expectations for government agencies than for other organizations when it comes to protecting the security and privacy of personal data. (6.2)

As a provider of services, the government has been seeking ways to more easily authenticate users who require such services. In some cases, interagency and intergovernmental solutions may conflict with the fundamental principles espoused in the Privacy Act of 1974.

Finding: Many agencies at different levels of government have multiple, and sometimes conflicting, roles in electronic authentication. They can be regulators of private sector behavior, issuers of identity documents or identifiers, and also relying parties for service delivery. (6.1)

Finding: Interagency and intergovernmental authentication solutions that rely on a common identifier create a fundamental tension with the privacy principles enshrined in the Privacy Act of 1974, given the risks associated with data aggregation and sharing. (6.8)

Government plays a special role in issuing identity documents (driver's licenses, birth certificates, passports, Social Security cards) that are foundational documents relied upon to establish identity in numerous authentication systems. However, the processes used to produce these foundational documents are not necessarily sufficiently secure to serve their stated function. Further, although states issue driver's licenses and the federal government issues passports, each may depend on the other for reissuance or replacement; no single entity has a complete authoritative database. While on the one hand the lack of easy linkage can be seen as a privacy boon, on the other, the relative ease with which some foundational documents can be forged means that fraud is more likely and that security and privacy risks (including identity theft) are great.

Finding: Many of the foundational identification documents used to establish individual user identity are very poor from a security perspective, often as a result of having been generated by a diverse set of issuers that may lack an ongoing interest in ensuring the documents' validity and reliability. Birth certificates are especially poor as base identity documents, because they cannot be readily tied to an individual. (6.3)

Recommendation: Birth certificates should not be relied upon as the sole base identity document. Supplemented with supporting evidence, birth certificates can be used when proof of citizenship is a requirement. (6.1)

MOVING FORWARD

When people express concerns about privacy, they speak about intrusion into personal affairs, disclosure of sensitive personal information, and improper attribution of actions to individuals. The more personal the information that is collected and circulated, the greater the reason for these concerns, and the proliferation of authentication activity implies more collection and circulation of personal information. There are choices to be made: Is authentication necessary? If so, how should it be accomplished? What should happen to the information that is collected?

It is time to be more thoughtful about authentication technologies and their implications for privacy. Some of this thinking must happen among technologists, but it is also needed among business and policy decision makers. The tension between authentication and privacy and the need for greater care in choosing how to approach authentication will grow in the information economy.
In addition to the management control concerns associated with security, the economic value of understanding the behavior of customers and others is a strong motivator for capturing personal information. It is also a strong motivator for misusing such information, even if it is captured only through authentication systems. The decision about where and when to deploy identity authentication systems, whether only where confirmation of identity is already required today or in a greater range of circumstances, will shape society in both obvious and subtle ways. The role of attribute authentication in protecting privacy is underexplored. In addition, establishing practices and technical measures that protect privacy costs money at the outset. Many privacy breaches are easy to conceal or are unreported; therefore, failing to protect privacy may cost less than the initial outlay required to establish sound procedural and technical privacy protections. If the individuals whose information has been compromised and the agencies that are responsible for enforcing privacy laws were to become aware of privacy breaches, the incentive for proactive implementation of technologies and policies that protect privacy would be greater.

Finding: Privacy protection, like security, is very poor in many systems, and there are inadequate incentives for system operators and vendors to improve the quality of both. (4.6)

Finding: Effective privacy protection is unlikely to emerge voluntarily unless significant incentives to respect privacy emerge to counterbalance the existing incentives to compromise privacy. The experience to date suggests that market forces alone are unlikely to sufficiently motivate effective privacy protection. (4.7)

Even if the choice is made to institute authentication systems only where people today attempt to discern identity, the creation of reliable, inexpensive systems will inevitably invite function creep and unplanned-for secondary uses unless action is taken to avoid these problems. Thus, the privacy consequences of both the intended design and deployment and the unintended uses of authentication systems must be taken into consideration by vendors, users, policy makers, and the general public.

Recommendation: Authentication systems should not infringe upon individual autonomy and the legal exercise of expressive activities. Systems that facilitate the maintenance and assertion of separate identities in separate contexts aid in this endeavor, consistent with existing practices in which individuals assert distinct identities for the many different roles they assume.
Designers and implementers of such systems should respect informational, communications, and other privacy interests as they seek to support requirements for authentication actions. (3.1)

The federal government has passed numerous laws and regulations that place constraints on the behavior of private sector parties as well as on government agencies. Among them are the Family Educational Rights and Privacy Act, the Financial Services Modernization Act, the Health Insurance Portability and Accountability Act of 1996, and, in 1974, the Privacy Act, which regulates the collection, maintenance, use, and dissemination of personal information by federal government agencies. Given the plethora of privacy-related legislation and regulation, making sense of government requirements can be daunting.

TOOLKIT

With a basic understanding of authentication, privacy interests and protections, and related technologies, it is possible to consider how one might design an authentication system that limits privacy intrusions while still meeting its functional requirements. This report provides a toolkit for examining the privacy implications of the various decisions that must be made when an authentication system is being contemplated. As mentioned previously, most of these decisions can be made irrespective of the particular technology under consideration.

The kind of authentication to be performed (attribute, identity, or individual) is an initial choice that will bear on the privacy implications. Viewed without regard to the resource that they are designed to protect, attribute authentication systems present the fewest privacy problems and individual authentication systems the most. Despite the fact that it raises more privacy concerns, in some instances individual authentication may be appropriate for privacy, security, or other reasons.

In the process of developing an authentication system, several questions must be answered early. Decisions will have to be made about which attributes to use, which identifiers will be needed, which identity will be associated with the identifier, and how the level of confidence needed for authentication will be reached. The answers to each of these questions will have implications for privacy. Chapter 7 elaborates on four types of privacy (information, decisional, bodily integrity, and communications) and on how they are affected by the answers to each of the preceding questions. The analysis proposed is technology-independent, for the most part, and can be applied to almost any proposed authentication system.