3
Conclusions and Recommendations

3.1
BASIC PREMISES

The committee’s work was informed by a number of basic premises. These premises framed the committee’s perspective in developing this report, and they can be regarded as the assumptions underlying the committee’s analysis and conclusions. The committee recognizes that others may have their own analyses with different premises, and so for analytical rigor, it is helpful to lay out explicitly the assumptions of the committee.


Premise 1. The United States faces two real and serious threats from terrorists. The first is from terrorist acts themselves, which could cause mass casualties, severe economic loss, and social dislocation to U.S. society. The second is from the possibility of inappropriate or disproportionate responses to the terrorist threat that can do more damage to the fabric of society than terrorists would be likely to do.


The events of September 11, 2001, provided vivid proof of the damage that a determined terrorist group can inflict on U.S. society. All evidence to date suggests that the United States continues to be a prime target for such terrorist groups as Al Qaeda, and future terrorist attacks could cause







major casualties, severe economic loss, and social disruption.1 The danger of future terrorist attacks on the United States is both real and serious.

At the same time, inappropriate or disproportionate responses to the terrorist threat also pose serious dangers to society. History demonstrates that measures taken in the name of improving national security, especially in response to new threats or crises, have often proven to be both ineffective and offensive to the nation’s values and traditions of liberty and justice.2 So the danger of unsuitable responses to the terrorist threat is also real and serious.

Given the existence of a real and serious terrorist threat, it is a reasonable public policy goal to focus on preventing attacks before they occur—a goal that requires detecting the planning for such attacks prior to their execution. Given the possibility of inappropriate or disproportionate responses, it is also necessary that programs intended to prevent terrorist attacks be developed and operated without undue compromises of privacy.

Premise 2. The terrorist threat to the United States, serious and real though it is, does not justify government authorities conducting activities or operations that contravene existing law.

The longevity of the United States as a stable political entity is rooted in large measure in the respect that government authorities have had for the rule of law. Regardless of the merits or inadequacies of any legal regime, government authorities are bound by its requirements until the legal regime is changed, and, in the long term, public confidence and trust in government depend heavily on a belief that the government is indeed adhering to the laws of the land. The premises above would not change even if the United States were facing exigent circumstances.
1. For example, the National Intelligence Estimate of the terrorist threat to the U.S. homeland provides a judgment that “the U.S. Homeland will face a persistent and evolving terrorist threat over the next three years. The main threat comes from Islamic terrorist groups and cells, especially al-Qa’ida, driven by their undiminished intent to attack the Homeland and a continued effort by these terrorist groups to adapt and improve their capabilities.” See The Terrorist Threat to the U.S. Homeland, National Intelligence Estimate, July 2007, available from the Office of the Director of National Intelligence, Washington, D.C.

2. Consider, for example, the 1942 internment of U.S. citizens of Japanese origin in the wake of the Pearl Harbor attack. The United States formally apologized to the Japanese American community for this act in 1988 and, beginning in 1990, paid reparations to surviving internees.

If existing legal authorities (including any emergency action provisions, of which there are many) are inadequate or unclear to deal with a given situation

or contingency, government authorities should seek to change the law rather than to circumvent or disobey it.

A willingness of U.S. government authorities to circumvent or disobey the law in times of emergency is not unprecedented. For example, recently declassified Central Intelligence Agency (CIA) documents indicate widespread violations of the agency’s charter and applicable law in the 1960s and 1970s, when the CIA, under both Democratic and Republican presidents, conducted surveillance operations on U.S. citizens that fell outside its charter.3

The U.S. Congress has also changed laws that guaranteed confidentiality in order to gain access to individual information collected under such guarantees. For example, Section 508 of the USA Patriot Act, passed in 2001, allows the U.S. Department of Justice (DOJ) to gain access to individual information originally collected by the National Center for Education Statistics under a pledge of confidentiality. In earlier times, the Second War Powers Act of 1942 retrospectively overrode the confidentiality provisions of the Census Bureau, and it is now known that bureau officials shared individually identifiable census information with other government agencies for the purposes of detaining foreign nationals.4

Today, many laws provide statutory protection for privacy. Conforming to such protections is not only obligatory, but it also builds necessary discipline into counterterrorism efforts that serves other laudable purposes. By making the government stop and justify its effort to a senior official, a congressional committee, or a federal judge, warrant requirements and other privacy protections often help bring focus and precision to law enforcement and national security efforts. In point of fact, courts rarely refuse requests for judicial authorization to conduct surveillance.
As government officials often note, one reason for these high approval rates is the careful internal decision making that the requirement to obtain judicial authorization instills.

3. M. Mazzetti and T. Weiner, “Files on illegal spying show C.I.A. skeletons from Cold War,” New York Times, June 27, 2007.

4. W. Seltzer and M. Anderson, “Census confidentiality under the second War Powers Act (1942-1947),” paper prepared for the Annual Meeting of the Population Association of America, March 30, 2007, Population Association of America, New York, available at http://www.uwm.edu/~margo/govstat/Seltzer-AndersonPAA2007paper3-12-2007.doc.

Premise 3. Challenges to public safety and national security do not warrant fundamental changes in the level of privacy protection to which nonterrorists are entitled.

The United States is a strong nation for many reasons, not the least of which is its commitment to the rule of law, civil liberties, and respect

for diversity. Especially in times of challenge, it is important that this commitment remain strong and unwavering. New technological circumstances may necessitate an update of existing privacy laws and policy, but privacy and surveillance law already includes means of dealing with national security matters as well as criminal law investigations. As new technologies become more commonly used, these means will inevitably require extension and updating, but greater government access to private information does not trump the commitment to the bedrock civil liberties of the nation.

Note that the term “privacy” has multiple meanings depending on context and interpretation. Appendix L (“The Science and Technology of Privacy Protection”) explicates a technical definition of the term, and the term is often used in this report, as in everyday discourse, with a variety of informal meanings that are more or less consistent with the technical definition.

Premise 4. Exploitation of new science and technologies is an important dimension of national counterterrorism efforts.

Although the committee recognizes that other sciences and technologies are relevant as well, the terms of reference call for this report to focus on information technologies and behavioral surveillance techniques. The committee believes that when large amounts of information, personal and otherwise, are determined to be needed for the counterterrorist mission, the use of information technologies will be necessary and counterterrorist authorities will need to collect, manage, and analyze such information. Furthermore, it believes that behavioral surveillance techniques may have some potential for inferring intent from observed behavior if the underlying science proves sound—a capability that could be very useful in counterterrorist efforts “on the ground” if realized in the future.

Premise 5. To the extent reasonable and feasible, counterterrorist programs should be formulated to provide secondary benefits to the nation in other domains.

Counterterrorism programs are often expensive and controversial. In some cases, however, a small additional expenditure or programmatic adjustment may enable them to provide benefits that go beyond their role in preventing terrorism; they would then be useful to the nation even if terror attacks do not occur. For example, hospital emergency reporting systems can improve medical care through prompt reporting of influenza, food poisoning, or other health problems, as well as by alerting officials to bioterrorist and chemical attacks.

At the same time, policy makers must be aware of the phenomenon of “statutory mission creep”—in which the goals and missions of a program are expanded explicitly as the result of a specific policy action, such as congressional amendment of an existing law—and avoid its snares. In some instances, such as hospital emergency reporting systems, privacy interests may not be seriously compromised by their application to multiple missions. But in others, such as the use of systems designed for screening terrorists to identify ordinary criminals, privacy interests may be deeply implicated because of the vast and voluminous new data sets that must be brought to bear on the expanded mission. Mission creep may also go beyond the original understandings of policy makers regarding the scope and nature of a program that they initially approve, and thus effectively circumvent careful scrutiny. In some cases, a sufficient amount of mission creep may even result in a program whose operation is not strictly legal.

3.2 CONCLUSIONS REGARDING PRIVACY

The rich digital record that is made of people’s lives today provides many benefits to most people in the course of everyday life. Such data may also have utility for counterterrorist and law enforcement efforts. However, the use of such data for these purposes also raises concerns about the protection of privacy and civil liberties. Improperly used, programs that do not explicitly protect the rights of innocent individuals are likely to create second-class citizens whose freedoms to travel, engage in commercial transactions, communicate, and practice certain trades will be curtailed—and under some circumstances, they could even be improperly jailed.

3.2.1 Protecting Privacy

Conclusion 1. In the counterterrorism effort, some degree of privacy protection can be obtained through the use of a mix of technical and procedural mechanisms.

The primary goal of the nation’s counterterrorism effort is to prevent terrorist acts. In such an effort, identification of terrorists before they act becomes an important task, one that requires the accurate collection and analysis of their personal information. However, an imperfect understanding of which characteristics to search for, not to mention imperfect and inaccurate data, will necessarily draw unwarranted attention to many innocent individuals. Thus, records containing personal information of terrorists cannot be

examined without violating the privacy of others, and so absolute privacy protection—in the sense that the privacy of nonterrorists cannot be compromised—is not possible if terrorists are to be identified.

This technical reality does not preclude putting into place strong mechanisms that provide substantial privacy protection. In particular, restrictions on the use of personal information can help ensure that innocent individuals are strongly protected during the examination of their personal information, and strong and vigorous oversight and audit mechanisms can help to ensure that these restrictions are obeyed.

How much privacy protection is afforded by technical and procedural mechanisms depends on critical design features of both the technology and the organization that uses it. Two examples of relevant technical mechanisms are encryption of all data transports, to protect against accidental loss or compromise, and individually logged5 audit records that retain details of all queries, including those made by fully authorized individuals, to protect against unauthorized use.6 But the mere presence of such mechanisms does not ensure that they will be used, and such mechanisms should be regarded as one enabler—one set of necessary but not sufficient tools—for the robust independent program oversight described in Recommendation 1c below.

5. “Individually logged” refers to audit records designed to monitor system usage by individual users and maintain individual accountability. For example, consider a personnel office in which users have access to those personnel records for which they are responsible. Individually logged audit trails can reveal that an individual is printing far more records than the average user, which could indicate the selling of personal data.

6. Note that audit records documenting accesses to a database are conceptually distinct from the data contained within the database. An audit record typically identifies the party that took some specific action now captured in the audit record and the nature of the data involved in that action, but it does not specify the content of the data involved. (For example, a database of financial transactions is routinely updated to include all of the credit card purchases of John Smith for the last year. Since today is April 14, 2008, the database contains all of his purchases from April 14, 2007, to April 13, 2008. An audit record relevant to those records might include the fact that on January 17, 2004, Agent Mary Doe viewed John Smith’s credit card purchases—that is, she looked at his purchases from January 17, 2003, to January 16, 2004.) One result of this distinction is that the data within a database may be purged within a short period of time in accordance with a specified data retention policy, but audit records describing accesses to those data may be kept for the entire life of the database.

Relevant procedural mechanisms include restrictions on data collection and restrictions on use. In general, such mechanisms govern important dimensions of information collection and use, including an explication of what data are collected, whether collection is done openly or covertly, how widely the data are disseminated, how long they are retained, the decisions for which they are used, whether the processing is

performed by computer or human, and who has the right to grant permissions for subsequent uses.

Historically, privacy from government intrusion has been protected by limiting what information the government can collect: voice conversations collected through wiretapping, e-mail collected through access to stored data (authorized by the Electronic Communications Privacy Act, passed in 1986 and codified at 18 U.S.C. 2510), among others. However, in many cases today, the data in question have already been collected, and access to them, under the third-party business records doctrine, will be readily granted with few strings attached. As a result, there is great potential for privacy intrusion arising from analysis of data that are accessible to government investigators with little or no restriction or oversight. In other words, powerful investigative techniques with significant privacy impact proceed in full compliance with existing law—but with significant unanswered privacy questions and associated concerns about data quality.

Analytical techniques that may be justified for the purpose of national security or counterterrorism investigations, even given their potential power for privacy intrusion, must come with assurances that the inferences drawn against an individual will not then be used for normal domestic criminal law enforcement purposes. Hence, what is called for, in addition to procedural safeguards for data quality, are usage limitations that provide for full exploitation of new investigative tools when needed (and justified) for national security purposes but that prevent those same inferences from being used in criminal law enforcement activity.

An example—for illustration only—of the latter is the use of personal data for airline passenger screening.
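A usage limitation of this kind can be made concrete as an access check on the declared purpose of a query, with refusals themselves logged for oversight. The sketch below is illustrative only: the function names, purpose labels, and the idea of a single allowed-purposes set are assumptions of this example, not features of any system discussed in this report.

```python
ALLOWED_PURPOSES = {"counterterrorism"}  # the usage-limiting rule itself
denial_log = []  # every refusal is retained for audit and oversight


def lookup(query):
    # Stand-in for the real passenger-database back end (assumed).
    return f"results for {query}"


def screen_query(requester, purpose, query):
    """Grant or refuse a database query based on its declared purpose of use."""
    if purpose not in ALLOWED_PURPOSES:
        # The refusal itself is logged, so oversight can see attempted misuse.
        denial_log.append((requester, purpose, query))
        raise PermissionError(f"use for {purpose!r} is prohibited by the usage rule")
    return lookup(query)
```

Note that the rule restrains use, not collection: the data remain in the database, and the protection is only as good as the auditing that verifies the denials and approvals recorded here.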
Privacy advocates have often expressed concern that government use of large-scale databases to identify passengers who pose a potential risk to the safety of an airplane could turn into a far-reaching enforcement mechanism for all manner of offenses, such as overdue tax bills or child support payments. One way of dealing with this privacy concern would be to apply a usage-limiting privacy rule that allows the use of such databases for the purpose of counterterrorism but prohibits the use of these same databases and analyses for domestic law enforcement. Those suspicious of government intentions are likely to find a rule limiting usage rather less comforting than a rule limiting collection, out of concern that government authorities will find it easier to violate a rule limiting usage than one limiting collection. Nevertheless, well-designed and diligently enforced auditing and oversight processes may help over time to provide reassurance that the rule is being followed, as well as some actual protection for individuals.

Finally, in some situations, improving citizen privacy can have the

result of improving their security, and vice versa. For example, improvements in the quality of data (i.e., more complete, more accurate data) used in identifying potential terrorists are likely to increase security by enhancing the effectiveness of information-based programs to identify terrorists and to decrease the adverse consequences that may occur, due to confidentiality violations, for the vast majority of innocent individuals. In addition, strong audit controls that record the details of all accesses to sensitive personal information serve both to protect the privacy of individuals and to reduce barriers to information sharing between agencies or analysts. (Agencies or analysts are often reluctant to share information, even among themselves, because they feel a need to protect sources and methods, and audit controls that limit information access provide a greater degree of reassurance that sensitive information will not be improperly distributed.)

Conclusion 2. Data quality is a major issue in the protection of the privacy of nonterrorists.

As noted in Chapter 1, the issue of data quality arises internally as a result of measurement errors within databases and also as a consequence of efforts to link data or records across databases in the absence of clear, unique identifiers. Sharing personal information across agencies, even with “names” attached, offers no assurance that the linked data are sufficiently accurate for counterterrorism purposes; indeed, no metrics for accuracy appear to be systematically used to assess such linking efforts.

Data of poor quality severely limit the value of data mining in a number of ways. First, the actual characteristics of individuals are often collected in error for a wide array of reasons, including definitional problems, identity theft, and misresponse on surveys. These errors could obviously result in individuals being inaccurately represented by data mining algorithms as a threat when they are not (with the consequence that personal and private information about them might be inappropriately released for wider scrutiny). Second, poor data quality can be amplified during file matching, resulting in the erroneous merging of information for different individuals into a single file. Again, the result can be improper treatment of individuals as terrorist threats, but here the error is compounded, since entire clusters of information are now in error with respect to the individual who is linked to the information in the merged file.

Such problems are likely to be quite common and could greatly limit the utility of data mining methods used for counterterrorism. There are no obvious mechanisms for rectifying the current situation, other than collecting similar information from multiple sources and using the duplicative nature of the information to correct inaccuracies. However, given that alternate sources exist relatively infrequently today, correcting individual errors will be extraordinarily difficult.

3.2.2 Distinctions Between Capability and Intent

Conclusion 3. Inferences about intent and/or state of mind implicate privacy issues to a much greater degree than do assessments or determinations of capability.

Although it is true that capability and intent are both needed to pose a real threat, determining intent on the basis of external indicators is inherently a much more subjective enterprise than determining capability. Determining intent or state of mind is inherently an inferential process, usually based on indicators such as whom one talks to, what organizations one belongs to or supports, or what one reads or searches for online. Assessing capability is based on such indicators as purchase or other acquisition of suspect items, training, and so on. Recognizing that the distinction between capability and intent is sometimes unclear, it is nevertheless true that placing people under suspicion because of their associations and intellectual explorations is a step toward abhorrent government behavior, such as guilt by association and thought crime. This does not mean that government authorities should be categorically proscribed from examining indicators of intent under all circumstances—only that special precautions should be taken when such examination is deemed necessary.

3.3 CONCLUSIONS REGARDING THE ASSESSMENT OF COUNTERTERRORISM PROGRAMS

Conclusion 4. Program deployment and use must be based on criteria more demanding than “it’s better than doing nothing.”

In the aftermath of a disaster or terrorist incident, policy makers come under intense political pressure to respond with measures intended to prevent the event from occurring again. The policy impulse to do something (by which is usually meant something new) under these circumstances is understandable, but it is simply not true that doing something new is always better than doing nothing. Indeed, policy makers may deploy new information-based programs hastily, without full consideration of (a) the actual usefulness of the program in distinguishing people or characteristic patterns of interest for follow-up from those not of interest, (b) the potential privacy impacts resulting from the use of the program, (c) the procedures and processes of the organization that will use the program, and (d) countermeasures that terrorists might use to foil the program.

The committee developed the framework presented in Chapter 2 to help decision makers determine the extent to which a program is effective in achieving its intended goals, compliant with the laws of the nation, and reflective of the values of society, especially with regard to the protection of data subjects’ privacy. This framework is intended to be applied by taking into account the organizational and human contexts in which any given program will be embedded, as well as the countermeasures that terrorists might take to foil the program. The framework is discussed in greater detail in Chapter 2.

3.4 CONCLUSIONS REGARDING DATA MINING7

7. Additional observations about data mining are contained in Appendix H.

3.4.1 Policy and Law Regarding Data Mining

Conclusion 5. The current policy regime does not adequately address violations of privacy that arise from information-based programs using advanced analytical techniques, such as state-of-the-art data mining and record linkage.

The current privacy policy regime was established prior to today’s world of broadband communications, networked computers, and enormous databases. In particular, it relies largely on limitations imposed on the collection and use of certain kinds of information, and it is essentially silent on the use of techniques that could process and analyze already-collected information in ways that might compromise privacy.

For example, an activity for counterterrorist purposes, possibly a data mining activity, is likely to require the linking of data found in multiple databases. The literature on record linkage suggests that, even assuming the data found in any given database to be of high quality, the data derived from linkages (the “mosaic” consisting of the collection of linked data) are likely to be error-prone. Certainly, the better the quality of the individual lists, the fewer the errors that will be made in record linkage, but even with high-quality lists, the percentage of false matches and false nonmatches may still be uncomfortably high. Certain data mining algorithms are less sensitive to record linkage errors as inputs, because they use redundant information in a way that can, at times, identify such errors and downweight or delete them. Even so, in the best circumstances, such problems are currently extremely

difficult to overcome. Error-prone data are, of course, both a threat to privacy (as innocent individuals are mistakenly associated with terrorist activity) and a threat to effectiveness (as terrorists are overlooked because errors in the data have hidden what would have suggested a terrorist connection).

The committee also notes that the use of analytical techniques such as data mining is not limited to government purposes; private parties, including corporations, criminals, divorce lawyers, and private investigators, also have access to such techniques. The large-scale availability of data and advanced analytical techniques to private parties carries clear potential for abuses of various kinds that might lead to adverse consequences for some individuals, but a deep substantive examination of this issue is outside the primary focus of this report on government policy.

3.4.2 The Promise and Limitations of Data Mining

Chapter 1 (in Section 1.6.1) notes that data mining covers a wide variety of analytical approaches for using large databases for counterterrorist purposes, and in particular it should be regarded as being much broader than the common notion of a technology underlying automated terrorist identification.

Conclusion 6. Because data mining has proven to be valuable in private-sector applications, such as fraud detection, there is reason to explore its potential uses in countering terrorism. However, the problem of detecting and preempting a terrorist attack is vastly more difficult than the problems addressed by such commercial applications.

As illustrated in Appendix H (“Data Mining and Information Fusion”), data mining has proven valuable in a number of private-sector applications. But the data used by analysts to track sales, by banks to assess loan applications, by credit card companies to detect fraud, and by telephone companies to detect fraud are fundamentally different from counterterrorism data. For example, private-sector applications generally have access to a substantial amount of relatively complete and structured data. In some cases, their data are more accurate than government data; in others, large volumes of relevant data enable statistical techniques to compensate8 to some extent for data of lower quality—either way reducing the data-cleaning effort required. In addition, a few false positives and false negatives are acceptable in private-sector

8. A fact that underlies the ability of Internet search engines to propose correct spellings of many incorrectly spelled words.
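The difficulty named in Conclusion 6 can be seen in a back-of-the-envelope base-rate calculation. All numbers below are assumptions for illustration, not estimates from this report: even a hypothetical detector that is 99 percent accurate drowns its true hits in false alarms when the target population is tiny.

```python
# Assumed, illustrative numbers for a population-scale screening problem.
population = 300_000_000      # people screened (assumed)
terrorists = 3_000            # true positives available to find (assumed, generous)
true_positive_rate = 0.99     # detector sensitivity (assumed)
false_positive_rate = 0.01    # detector false-alarm rate (assumed)

true_hits = terrorists * true_positive_rate
false_hits = (population - terrorists) * false_positive_rate
precision = true_hits / (true_hits + false_hits)

# Roughly 3,000 real hits are swamped by about 3 million false alarms,
# so fewer than 1 in 1,000 flagged individuals is an actual terrorist.
```

In a commercial fraud application, a comparable false-alarm rate means some declined transactions; in counterterrorism, it means millions of innocent people drawn into scrutiny, which is exactly the privacy concern raised above.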

OCR for page 67
 PROTECTING INDIVIDUAL PRIVACY IN THE STRUGGLE AGAINST TERRORISTS entail a practice of using the same data mining technologies to “mine the miners and track the trackers.” In practice, operational monitoring is generally the responsibility of the program managers and operational personnel. But as discussed in Appendix G (“The Jurisprudence of Privacy Law and the Need for Inde- pendent Oversight”), oversight is necessary to ensure that actual opera- tions have been conducted in accordance with stated policies. The reason is that, in many cases, decision makers formulate poli- cies in order to balance competing imperatives. For example, the public demands both a high degree of effectiveness in countering terrorism and a high degree of privacy. Program administrators themselves face multiple challenges: motivating high performance, adhering to legal requirements, staying within budget, and so on. But if operational personnel adhere to some elements of a policy and not to others, the balance that decision makers intended to achieve will not be realized in practice. The committee emphasizes that independent oversight is necessary to ensure that commitments to minimizing privacy intrusions embedded in policy statements are realized in practice. The reason is that losses of privacy are easy to discount under the pressure of daily operations, and those elements of policy intended to protect privacy are more likely to be ignored or compromised. Without effective oversight mechanisms in place, public trust is less likely to be forthcoming. In addition, oversight can support continuous improvement and guide administrators in mak- ing organizational change. For example, program oversight is essential to ensure that those responsible for the program do not bypass procedures or technolo- gies intended to protect privacy. 
Noncompliance with existing privacy-protecting laws, regulations, and best practices diminishes public support and creates an environment in which counterterrorism programs may be curtailed or eliminated. Indeed, even if shortcuts and bypasses increase effectiveness in a given case, in the long run scandals and public outcry about perceived abuses will reduce the political support for the programs or systems involved—and may deprive the nation of important tools useful in the counterterrorist mission. Even if a program is effective in the laboratory and expected to be so in the field, its deployment must be accompanied by strong technical and procedural safeguards to ensure that the privacy of individuals is not placed at undue risk.

Oversight is also needed to protect against abuse and mission creep. Experience and history indicate that in many programs that collect or use personal information, some individuals may violate safeguards intended to protect individual privacy. Hospital clerks have been known to examine the medical records of celebrities without having a legitimate reason for doing so, simply because they are curious. Police officers have been known to examine the records of individuals in motor vehicle information systems to learn about the personal lives of individuals with whom they interact in the course of daily business. And, of course, compromised insiders have been known to use the information systems of law enforcement and intelligence agencies to further nefarious ends.

The phenomenon of mission creep is illustrated by the Computer-Assisted Passenger Prescreening System II (CAPPS II) program, initially described in congressional testimony as an aviation security tool and not a law enforcement tool but which morphed in a few months to a system that would analyze information on persons “with [any] outstanding state or federal arrest warrants for crimes of violence.”18

To guard against such practices, the committee advocates program oversight that mines the miners and tracks the trackers. That is, all operation and command histories and all accesses to data-based counterterrorism information systems should be logged on an individual basis, audited, and mined with the same technologies and the same zeal that are applied to combating terrorists. If, for example, such practices had been in place during Robert Hanssen’s tenure at the Federal Bureau of Investigation (FBI), his use of its computer systems for unauthorized purposes might have been discovered sooner.

Finally, the committee recognizes the phenomenon of statutory mission creep, as defined above in the discussion of Premise 5. It occurs, for example, because in responding to a crisis, policy makers will naturally focus on adapting existing programs and capabilities rather than creating new ones. On one hand, if successful, adaptation often promises to be less expensive and faster than creating a new program or capabilities from scratch.
On the other hand, because an existing program is likely to be highly customized for specific purposes, adapting that program to serve other purposes effectively may prove difficult—perhaps even more difficult than creating a program from scratch. As importantly, adapting an existing program to new purposes may well be contrary to agreements and understandings established in order to initiate the original program in the first place.

18 An initial description of CAPPS II by Deputy Secretary of DHS Admiral James Loy, then administrator of the Transportation Security Administration, assured Congress that CAPPS II was intended to be an aviation security tool, not a law enforcement tool. Testimony of Admiral James Loy before House Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census (May 6, 2003). Morphed system—Interim Final Privacy Act Notice, 68 Fed. Reg. 45265 (August 1, 2003).
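The “mine the miners” practice recommended above can be sketched in miniature: log every access on an individual basis, then screen the audit log for anomalous usage. The analyst identifiers, record names, and median-ratio threshold below are invented for illustration; a deployed system would use tamper-evident logs and far richer anomaly models (time of day, record sensitivity, peer-group comparison):

```python
import statistics
from collections import Counter

# Hypothetical audit log of (analyst_id, record_id) accesses. In a deployed
# system, every query against the counterterrorism data store would be
# written to a tamper-evident log with the user's identity attached.
ACCESS_LOG = (
    [("analyst_a", f"rec{i}") for i in range(20)]
    + [("analyst_b", f"rec{i}") for i in range(25)]
    + [("analyst_c", f"rec{i}") for i in range(240)]  # anomalously heavy use
)

def flag_heavy_users(log, ratio=3.0):
    """Flag users whose access volume exceeds ratio x the median volume.

    The median-ratio rule is a deliberately simple stand-in for the kind of
    data mining the committee envisions applying to the miners themselves.
    """
    counts = Counter(user for user, _ in log)
    median = statistics.median(counts.values())
    return sorted(user for user, n in counts.items() if n > ratio * median)

print(flag_heavy_users(ACCESS_LOG))  # → ['analyst_c']
```

Even this crude screen would have surfaced a user whose query volume was an order of magnitude above that of peers; the point is that the same analytic machinery applied to terrorists can be turned on the system’s own operators.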

The committee does not oppose expanding the goals and missions of a program under all circumstances. Nevertheless, it cautions that such expansion should not be undertaken hastily in response to crisis. In the committee’s view, following diligently the framework presented in Chapter 2 is an important step in exercising such caution.

Recommendation 1d. Counterterrorism programs should provide meaningful redress to any individuals inappropriately harmed by their operation.

Programs that are designed to balance competing interests (in the case of counterterrorism, collective security and individual privacy and civil liberties) will naturally be biased in one direction or another if their incentive/penalty structure is not designed to reflect this balance. The availability of redress to the individual harmed thus acts to promote the goal of compliance with stated policy—as does the operational oversight mentioned in Recommendation 1c—and to provide incentives for the government to improve the policies, technologies, and data underlying the operation of the program.

Although the committee makes no specific recommendation concerning the form of redress that is appropriate for any given privacy harm suffered by innocent individuals as the result of a counterterrorism program, it notes that many forms of redress are possible in principle, ranging from apology to monetary compensation. The most appropriate form of redress is likely to depend on the nature and purpose of the specific counterterrorism program involved. However, the committee believes that, at a minimum, an innocent individual should always be provided with at least an explicit acknowledgment of the harm suffered and an action that reduces the likelihood that such an incident will ever be repeated, such as correcting erroneous data that might have led to the harm.
Note that responsibilities for correction should apply to the holder of erroneous data, regardless of whether the holder is the government or a third party.

The availability of redress might, in principle, enable terrorists to manipulate the system in order to increase their chances of remaining undetected. However, as noted in Item 7 of the committee’s framework on effectiveness, information-based programs should be robust and not easily circumvented by adversary countermeasures, and thus the possibility that terrorists might manipulate the system is not a sufficient argument against the idea of redress.

3.7.2 Periodic Review of U.S. Law, Policy, and Procedures for Protection of Privacy

Recommendation 2. The U.S. government should periodically review the nation’s laws, policies, and procedures that protect individuals’ private information for relevance and effectiveness in light of changing technologies and circumstances. In particular, Congress should reexamine existing law to consider how privacy should be protected in the context of information-based programs (e.g., data mining) for counterterrorism.

The technological environment in which policy is embedded is constantly changing. Although technological change is not new, the pace of technological change has dramatically increased in the digital age. As noted in Engaging Privacy and Information Technology in a Digital Age, advances in information technology make it easier and cheaper by orders of magnitude to gather, retain, and analyze information, and other trends have enabled access to new kinds of information that previously would have been next to impossible to gather about another individual.19 Furthermore, new information technologies have eroded the privacy protection once provided through obscurity or the passage of time. Today, it is less expensive to store information electronically than to decide to get rid of it, and new and more powerful data mining techniques and technologies make it much easier to extract and identify personally identifiable patterns that were previously protected by the vast amounts of data “noise” around them.

The security environment is also constantly changing. New adversaries emerge, and counterterrorist efforts must account for the fact that new practices and procedures for organizing, training, planning, and acquiring resources may emerge as well. Most importantly, new attacks appear. The number of potential terrorist targets in the United States is large,20 and

19 National Research Council, Engaging Privacy and Information Technology in a Digital Age, The National Academies Press, Washington, D.C., 2007.

20 Analysts and policy makers have debated the magnitude of this number. In one version of the Total Information Awareness program, the number of important and plausible terrorist targets was estimated at a few hundred, while other informed estimates place the number at a few thousand. Still other analysts argue that the number is virtually unlimited, since terrorists could, in principle, seek to strike anywhere in their attempts to sow terror. There is evidence on both sides of this point. Some point to Al Qaeda planning documents and other intelligence information to suggest that it continues to be very interested in large-scale strikes on targets that are media-worthy around the world, such as targets associated with air transportation. Others point out that the actual history of terrorist attacks around the world has for the most part involved attacks on relatively soft and undefended targets, of which there are very many.

although the different types of attack on these targets may be limited, attacks might be executed in myriad ways.

As an example of a concern ripe for examination and possible action, the committee found common ground in the proposition that policy makers should seriously consider restrictions on how personal information is used in addition to restrictions on how records are collected and accessed. Usage restrictions could be an important and useful supplement to access and collection limitation rules in an era in which much of the personal information that can be the basis for privacy intrusion is already either publicly available or easily accessible on request without prior judicial oversight. Privacy protection in the form of information usage restrictions can provide a helpful tool that balances the need to use powerful investigative tools, such as data mining, for counterterrorism purposes and the imperative to regulate privacy intrusions of such techniques through accountable adherence to clearly stated privacy rules. (Appendix G elaborates on this aspect of the recommendation.)

Such restrictions can serve an important function in helping to ensure that programs created to address a specific area stay focused on the problem that the programs were designed to address and in guarding against unauthorized or unconsidered expansion of government surveillance power. They also help to discourage mission creep, which often expands the set of purposes served by the program without explicit legislative authorization and into areas that are poorly matched by the original program’s structure and operation. An example of undesirable mission creep would be the use of personal data collected from the population for counterterrorist purposes to uncover tax evaders or parents who have failed to make child support payments. This is not to say that finding such individuals is not a worthy social goal, but rather that the mismatch between such a goal and the intrusiveness of data collection measures for counterterrorist purposes is substantial indeed. Without clear legal rules defining the boundaries for use between counterterrorism and inappropriate law enforcement uses, debates over mission creep are likely to continue without constructive resolution.

A second example of a concern that may be ripe for legislative action involves the current legal uncertainty surrounding private-sector liability for cooperation with government data mining programs. Such uncertainty creates real risk in the private sector, as indicated by the present variety of private lawsuits against telecommunications service providers,21 and private-sector responsibilities and rights must be clarified along

21 For example, the Electronic Frontier Foundation filed a class-action lawsuit against AT&T on January 31, 2006, claiming that AT&T violated the law and the privacy of its customers by collaborating with the National Security Agency in the Terrorist Surveillance Program.

with government powers and privacy protections. What exists today is a mix of law, regulation, and informal influence in which the legal rights and responsibilities of private-sector entities are highly uncertain and not well understood.

A coherent, comprehensive legal regime regulating information-intensive surveillance such as government data mining would do much to reduce such uncertainty. As one example, such a regime might address the issue of liability limitation for private-sector data sources (database providers, etc.) that provide privacy-intrusive information to the government.

Without spelling out the precise scope and coverage of the comprehensive regime, the committee believes that to the extent that the government legally compels a private party to provide data or a private party otherwise complies with an apparently legal requirement to disclose information, it should not be subject to liability simply for the act of complying with the government compulsion or legal requirement. Any such legal protection should not extend to the content of the information it supplies, and the committee also believes that the regime should allow incentives for data providers to invest reasonable effort in ensuring the quality of the data they provide. Furthermore, the regime should provide effective legal remedies for those individuals who suffer harm as a result of provider negligence, and it would necessarily preserve the ability of individuals to challenge the constitutionality of the underlying data access statute.

Listed below are other examples of how the adequacy of privacy-related law might be called into question by a changing environment (Appendix F elaborates on these examples).

• Conducting general searches. On one hand, the Fourth Amendment forbids general searches—that is, searches that are not limited as to the location of the search or the type of evidence the government is seeking—by requiring that all searches and seizures must be reasonable and that all warrants must state with particularity the item to be seized and the place to be searched. On the other hand, machine-aided searching of enormous digital transaction records is in some ways analogous to a general search. Such a search can be a dragnet that sweeps through millions or billions of records, often containing highly sensitive information. Much like a general search in colonial times was not limited to a particular person or place, a machine-aided search through digital databases can be very broad. How, if at all, should database searches be regulated by the Fourth Amendment or by statute?

A related issue is that the historical difficulty of physical access to ostensibly public information has provided a degree of privacy protection

for that information—what might be known as privacy through obscurity. But a search-enabled digital world erodes some of these previously inherent protections against invasions of privacy, changing the technological milieu that surrounds privacy jurisprudence.

• Increased access to data; searches and surveillance of U.S. persons outside the United States. The Supreme Court has not yet addressed whether the Fourth Amendment applies to searches and surveillance for national security and intelligence purposes that involve U.S. persons22 who are connected to a foreign power or that are conducted wholly outside the United States.23 Lower courts, however, have found that there is an exception to the Fourth Amendment’s warrant requirement for searches conducted for intelligence purposes within the United States that involve only non-U.S. persons or agents of foreign powers.24 The Supreme Court has yet to rule on this important issue, and Congress has not supplied any statutory language to fill the gap.

• Third-party records. Two Supreme Court cases (United States v. Miller, 1976, and Smith v. Maryland, 1979)25 have established the precedent that there is no constitutionally based reasonable expectation of privacy for information held by a third party, and thus the government today has access unrestricted by the Fourth Amendment to private-sector records on every detail of how people live their lives. Today, these third-party transactional records are available to the government subject to a very low threshold—through subpoenas that can be written by almost any government agency without prior judicial oversight—and are one of the primary data feeds for a variety of counterterrorist data mining activities. Thus, the public policy response to privacy erosion as a result of data mining used with these records will have to address some combination of the scope of use for the data mining results, the legal standards for access to and use of transactional information, or both.26 (See also Appendix G for

22 A U.S. person is defined by law and Executive Order 12333 to mean “a United States citizen, an alien known by the intelligence agency concerned to be a permanent resident alien, an unincorporated association substantially composed of United States citizens or permanent resident aliens, or a corporation incorporated in the United States, except for a corporation directed and controlled by a foreign government or governments.”

23 J.H. Smith and E.L. Howe, “Federal legal constraints on electronic surveillance,” pp. 133-148 in Protecting America’s Freedom in the Information Age, Markle Foundation Task Force on National Security in the Information Age, Markle Foundation, New York, N.Y., 2002.

24 See United States v. Bin Laden, 126 F. Supp. 2d 264, 271-72 (S.D.N.Y. 2000).

25 United States v. Miller, 425 U.S. 435 (1976); Smith v. Maryland, 442 U.S. 735 (1979).

26 Transactional information is the data collected on individuals from their interactions (transactions) with outside entities, such as businesses (e.g., travel and sales records), public facilities and organizations (e.g., library loans), and Web sites (e.g., Internet usage). Aggregate information, in contrast, is information in summary form (e.g., total visitors and sales) that does not contain data that would permit the identification of a specific individual.

discussion of how usage limitations can fill gaps in current regulation of the confidentiality of third-party records.)

• Electronic surveillance law. Today’s law regarding electronic surveillance is complex. Some of the complexity is due to the fact that the situations and circumstances in which electronic surveillance may be involved are highly varied, and policy makers have decided that different situations call for different regulations. But it is an open question as to whether these differences, noted and established in one particular set of circumstances, can be effectively maintained over time. Although there is broad agreement that today’s legal regime is not optimally aligned with the technological and circumstantial realities of the present, there is profound disagreement about whether the basic principles underlying today’s regime continue to be sound as well as in what directions changes to today’s regime ought to occur.

In making Recommendation 2, the committee intends the government’s reexamination of privacy law to cover the issues described above but not be limited to them. In short, Congress and the president should work together to ensure that the law is clear, appropriate, up to date, and responsive to real needs.

Greater clarity and coherence in the legal regime governing information-based programs would have many benefits, both for privacy protection and for the counterterrorist mission. It is perhaps obvious that greater clarity helps to protect privacy by eliminating what might be seen as loopholes in the law—ambiguities that can be exploited by well-meaning national security authorities, thereby overturning or circumventing the intent of previously established policy that balanced competing interests. But the benefits of greater clarity from the standpoint of improving the ability of the U.S. government to prosecute its counterterrorism responsibilities are less obvious and thus deserve some elaboration.

First and most importantly from this perspective, greater legal clarity would help to reduce public controversy over potentially important tools that might be used for counterterrorist purposes. Although many policy makers might wish that they had a free hand in pursuing the counterterrorist mission and that public debate and controversy would just go away, the reality is that public controversy does result when the government is seen as exploiting ambiguities and loopholes.

As discussed in Appendix I (“Illustrative Government Data Mining Programs and Activity”), a variety of government programs have been shut down, scaled back, delayed, or otherwise restricted over privacy considerations: TIA, CAPPS II for screening airline passengers, MATRIX (Multistate Anti-Terrorism Information Exchange) for linking law enforcement records across states with other government and private-sector

databases, and a number of data-sharing experiments between the U.S. government and various airlines. Public controversy about these efforts may have prematurely compromised counterterrorism tools that might have been useful. In addition, such controversy has made the government more wary of national security programs that involve data matching and made the private sector more reluctant to share personal information with the government in the future.

In this regard, this first rationale for greater clarity is consistent with the conclusion of the Technology and Privacy Advisory Committee: “[privacy] protections are essential so that the government can engage in appropriate data mining when necessary to fight terrorism and defend our nation. And we believe that those protections are needed to provide clear guidance to DOD personnel engaged in anti-terrorism activities.”27

Second, greater legal clarity and coherence can enhance the effectiveness of certain information-based programs. For example, the Privacy Act of 1974 requires that personal data used by federal agencies be accurate, relevant, timely, and complete. On one hand, these requirements increase the likelihood that high-quality data are stored, thus enhancing the effectiveness of systems that use data subject to those requirements. On the other hand, both the FBI’s National Crime Information Center and the passenger screening database of the Transportation Security Administration have exemptions from some of these requirements;28 to the extent that these exemptions result in lower-quality data, these systems are likely to perform less well.

Third, the absence of a clear legal framework is likely to have a profound effect on the innovation and research that are necessary to improve the accuracy and effectiveness of information-based programs.
Such clarity is necessary to support the investment of financial, institutional, and human resources in often risky research that may not pay dividends for

27 Technology and Privacy Advisory Committee, Safeguarding Privacy in the Fight Against Terrorism, U.S. Department of Defense, Washington, D.C., March 2004, p. 48, available at http://www.cdt.org/security/usapatriot/20040300tapac.pdf.

28 The Department of Justice and the Transportation Security Administration have published notices on these programs in the Federal Register, exempting them from certain provisions of the Privacy Act that are allowed under the act. In March 2003, the DOJ exempted the FBI’s National Crime Information Center from the Privacy Act’s requirements that data be “accurate, relevant, timely and complete,” Privacy Act of 1974; Implementation, 68 Federal Register 14140 (2003) (DOJ, final rule). In August 2003, the Department of Homeland Security exempted the TSA’s passenger screening database from the Privacy Act’s requirements that government records include only “relevant and necessary” personal information, Privacy Act of 1974: Implementation of Exemption, 68 Federal Register 49410 (2003) (DHS, final rule). Outside these exceptions, the Privacy Act otherwise applies to these programs. (Under the act, exemptions have to be published to be effective, and so the committee assumes that there are no “secret” exemptions.)

decades. But that type of research is essential to counterterrorism efforts and to finding better ways of protecting privacy.

Finally, a clear and coherent legal framework will almost certainly be necessary to realize the potential of new technologies to fight terrorism. Because such technologies will operate in the political context of an American public concerned about privacy, the public—and congressional decision makers—will have to take measures that protect privacy when new technologies are deployed. All technological solutions will require a legal framework within which to operate, and there will always be gaps left by technological protections, which law will be essential to fill. Consequently, a lack of clarity in that framework may not only slow their development and deployment, as described above, but also make technological solutions entirely unworkable.
