

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




OCR for page 75
2 The Value and Importance of Health Information Privacy

Ethical health research and privacy protections both provide valuable benefits to society. Health research is vital to improving human health and health care. Protecting patients involved in research from harm and preserving their rights is essential to ethical research. The primary justification for protecting personal privacy is to protect the interests of individuals. In contrast, the primary justification for collecting personally identifiable health information for health research is to benefit society. But it is important to stress that privacy also has value at the societal level, because it permits complex activities, including research and public health activities, to be carried out in ways that protect individuals' dignity. At the same time, health research can benefit individuals, for example, when it facilitates access to new therapies, improved diagnostics, and more effective ways to prevent illness and deliver care.

The intent of this chapter1 is to define privacy and to delineate its importance to individuals and society as a whole. The value and importance of health research will be addressed in Chapter 3.

CONCEPTS AND VALUE OF PRIVACY

Definitions

Privacy has deep historical roots (reviewed by Pritts, 2008; Westin, 1967), but because of its complexity, privacy has proven difficult to
1 Sections of this chapter were adapted from a background paper by Pritts (2008).

define and has been the subject of extensive, and often heated, debate by philosophers, sociologists, and legal scholars. The term "privacy" is used frequently, yet there is no universally accepted definition of the term, and confusion persists over the meaning, value, and scope of the concept of privacy. At its core, privacy is experienced on a personal level and often means different things to different people (reviewed by Lowrance, 1997; Pritts, 2008). In modern society, the term is used to denote different, but overlapping, concepts such as the right to bodily integrity or to be free from intrusive searches or surveillance. The concept of privacy is also context specific, and acquires a different meaning depending on the stated reasons for the information being gathered, the intentions of the parties involved, as well as the politics, conventions, and cultural expectations (Nissenbaum, 2004; NRC, 2007b). Our report, and the Privacy Rule itself, are concerned with health informational privacy.

In the context of personal information, concepts of privacy are closely intertwined with those of confidentiality and security. However, although privacy is often used interchangeably with the terms "confidentiality" and "security," they have distinct meanings. Privacy addresses the question of who has access to personal information and under what conditions. Privacy is concerned with the collection, storage, and use of personal information, and examines whether data can be collected in the first place, as well as the justifications, if any, under which data collected for one purpose can be used for another (secondary)2 purpose. An important issue in privacy analysis is whether the individual has authorized particular uses of his or her personal information (Westin, 1967).

Confidentiality safeguards information that is gathered in the context of an intimate relationship.
It addresses the issue of how to keep information exchanged in that relationship from being disclosed to third parties (Westin, 1976). Confidentiality, for example, prevents physicians from disclosing information shared with them by a patient in the course of a physician–patient relationship. Unauthorized or inadvertent disclosures of data gained as part of an intimate relationship are breaches of confidentiality (Gostin and Hodge, 2002; NBAC, 2001).

Security can be defined as "the procedural and technical measures required (a) to prevent unauthorized access, modification, use, and dissemination of data stored or processed in a computer system, (b) to prevent any deliberate denial of service, and (c) to protect the system in its entirety from physical harm" (Turn and Ware, 1976). Security helps keep health
2 The National Committee on Vital and Health Statistics has noted that the term "secondary uses" of health data is ill defined and therefore urged abandoning it in favor of precise description of each use. Consequently, the IOM committee has chosen to minimize use of the term in this report.

records safe from unauthorized use. When someone hacks into a computer system, there is a breach of security (and also, potentially, a breach of confidentiality). No security measure, however, can prevent invasion of privacy by those who have authority to access the record (Gostin, 1995).

The Importance of Privacy

There are a variety of reasons for placing a high value on protecting the privacy, confidentiality, and security of health information (reviewed by Pritts, 2008). Some theorists depict privacy as a basic human good or right with intrinsic value (Fried, 1968; Moore, 2005; NRC, 2007a; Terry and Francis, 2007). They see privacy as being objectively valuable in itself, as an essential component of human well-being. They believe that respecting privacy (and autonomy) is a form of recognition of the attributes that give humans their moral uniqueness.

The more common view is that privacy is valuable because it facilitates or promotes other fundamental values, including ideals of personhood (Bloustein, 1967; Gavison, 1980; Post, 2001; Solove, 2006; Taylor, 1989; Westin, 1966), such as:

• Personal autonomy (the ability to make personal decisions)
• Individuality
• Respect
• Dignity and worth as human beings

The bioethics principle of nonmaleficence3 requires safeguarding personal privacy. Breaches of privacy and confidentiality not only may affect a person's dignity, but can cause harm. When personally identifiable health information, for example, is disclosed to an employer, insurer, or family member, it can result in stigma, embarrassment, and discrimination. Thus, without some assurance of privacy, people may be reluctant to provide candid and complete disclosures of sensitive information even to their physicians.
Ensuring privacy can promote more effective communication between physician and patient, which is essential for quality of care, enhanced autonomy, and preventing economic harm, embarrassment, and discrimination (Gostin, 2001; NBAC, 1999; Pritts, 2002). However, it should also be noted that perceptions of privacy vary among individuals and various groups. Data that are considered intensely private by one person may not be by others (Lowrance, 2002).

But privacy has value even in the absence of any embarrassment or
3 The ethical principle of doing no harm, based on the Hippocratic maxim primum non nocere, first do no harm.

tangible harm. Privacy is also required for developing interpersonal relationships with others. Although some emphasize the need for privacy to establish intimate relationships (Allen, 1997), others take a broader view of privacy as being necessary to maintain a variety of social relationships (Rachels, 1975). By giving us the ability to control who knows what about us and who has access to us, privacy allows us to alter our behavior with different people so that we may maintain and control our various social relationships (Rachels, 1975). For example, people may share different information with their boss than they would with their doctor.

Most discussions on the value of privacy focus on its importance to the individual. Privacy can be seen, however, as also having value to society as a whole (Regan, 1995). Privacy furthers the existence of a free society (Gavison, 1980). For example, preserving privacy from widespread surveillance can be seen as protecting not only the individual's private sphere, but also society as a whole: Privacy contributes to the maintenance of the type of society in which we want to live (Gavison, 1980; Regan, 1995).

Privacy can foster socially beneficial activities like health research. Individuals are more likely to participate in and support research if they believe their privacy is being protected. Protecting privacy is also seen by some as enhancing data quality for research and quality improvement initiatives. When individuals avoid health care or engage in other privacy-protective behaviors, such as withholding information, inaccurate and incomplete data are entered into the health care system. These data, which are subsequently used for research, public health reporting, and outcomes analysis, carry with them the same vulnerabilities (Goldman, 1998).
The bioethics principle of respect for persons also places importance on individual autonomy, which allows individuals to make decisions for themselves, free from coercion, about matters that are important to their own well-being. U.S. society also places a high value on individual autonomy, and one way to respect persons and enhance individual autonomy is to ensure that people can make the choice about when, and whether, personal information (particularly sensitive information) can be shared with others.

Public Views of Health Information Privacy

American society places a high value on individual rights, personal choice, and a private sphere protected from intrusion. Medical records can include some of the most intimate details about a person's life. They document a patient's physical and mental health, and can include information on social behaviors, personal relationships, and financial status (Gostin and Hodge, 2002). Accordingly, surveys show that medical privacy is a major concern for many Americans, as outlined below (reviewed by Pritts, 2008;

Westin, 2007). As noted in Chapter 1, however, there are some limits to what can be learned from surveys (Tourangeau et al., 2000; Wentland, 1993; Westin, 2007). For example, how the questions and responses are worded and framed can significantly influence the results and their interpretation. Also, responses are biased when respondents self-report measures of attitudes, behavior, and feelings in such a way as to represent themselves favorably.

In a 1999 survey of consumer attitudes toward health privacy, three out of four people reported that they had significant concerns about the privacy and confidentiality of their medical records (Forrester Research, 1999). In a more recent survey, conducted in 2005 after the implementation of the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, 67 percent of respondents still said they were concerned about the privacy of their medical records, suggesting that the Privacy Rule had not effectively alleviated public concern about health privacy. Ethnic and racial minorities showed the greatest concern among the respondents. Moreover, the survey showed that many consumers were unfamiliar with the HIPAA privacy protections. Only 59 percent of respondents recalled receiving a HIPAA privacy notice, and only 27 percent believed they had more rights than they had before receiving the notice (Forrester Research, 2005). One out of eight respondents also admitted to engaging in behaviors intended to protect their privacy, even at the expense of risking dangerous health effects. These behaviors included lying to their doctors about symptoms or behaviors, refusing to provide information or providing inaccurate information, paying out of pocket for care that is covered by insurance, and avoiding care altogether (Forrester Research, 2005).
A series of polls conducted by Harris Interactive suggests, however, that the privacy of health information has improved since implementation of the Privacy Rule. Prior to its creation, a 1993 survey by Harris Interactive showed that 27 percent of Americans believed their personal medical information had been released improperly in the past 3 years. In contrast, 14 percent and 12 percent of respondents believed this had happened to them in 2005 and 2007, respectively (Harris Interactive, 2005, 2007). In the 2005 survey, about two-thirds of respondents reported having received a HIPAA privacy notice, and of these people, 67 percent said the privacy notice increased their confidence that their medical information is being handled properly (Harris Interactive, 2005).

Responses to other questions on recent public opinion polls conducted by Harris Interactive only partially corroborate these findings. In one survey, 70 percent of respondents indicated that they are generally satisfied with how their personal health information is handled with regard to privacy protections and security. Nearly 60 percent of the respondents reported that they believe the existing federal and state health privacy protection laws provide a reasonable level of privacy protection for their health information (Harris Interactive, 2005). Nonetheless, half of the respondents also believed that "[P]atients have lost all control today over how their medical records are obtained and used by organizations outside the direct patient health care such as life insurers, employers, and government health agencies." In another survey, 83 percent of respondents reported that they trust health care providers to protect the privacy and confidentiality of their personal medical records and health information (Westin, 2007). However, in that survey, 58 percent of respondents believed the privacy of personal medical records and health information is not protected well enough today by federal and state laws and organizational practices.

A number of studies suggest that the relative strength of privacy, confidentiality, and security protections can play an important role in people's concerns about privacy (reviewed by Pritts, 2008). When presented with the possibility of a nationwide system of electronic medical records, one survey found that 70 percent of respondents were concerned that sensitive personal medical record information might be leaked because of weak data security, 69 percent expressed concern that there could be more sharing of medical information without the patient's knowledge, and 69 percent were concerned that strong enough data security would not be installed in the new computer system.

Confidentiality is particularly important to adolescents who seek health care. When adolescents perceive that health services are not confidential, they report that they are less likely to seek care, particularly for reproductive health matters or substance abuse (Weddle and Kokotailo, 2005).
In addition, the willingness of a person to make the self-disclosures necessary for mental health and substance abuse treatment may decrease as the perceived negative consequences of a breach of confidentiality increase (Petrila, 1999; Roback and Shelton, 1995; Taube and Elwork, 1990). These studies show that protecting the privacy of health information is important for ensuring that individuals seek and obtain quality care.

The potential for economic harm resulting from discrimination in health insurance and employment is also a concern for many people (reviewed by Pritts, 2008). Polls consistently show that people are most concerned about insurers and employers accessing their health information without their permission (Forrester Research, 2005; PSRA, 1999). This concern arises from fears about employer and insurer discrimination. Concerns about employer discrimination based on health information, in particular, increased 16 percentage points between 1999 and 2005, with 52 percent of respondents in the later survey expressing concern that their information might be seen by an employer and used to limit job opportunities (Forrester Research, 2005; PSRA, 1999). Reports alleging that major employers such as Wal-Mart base

some of their hiring decisions on the health of applicants suggest that these concerns may be justified (Greenhouse and Barbaro, 2005).

Studies show that individuals are especially concerned about genetic information being used inappropriately by their insurers and employers (reviewed by Pritts, 2008). Even health care providers appear to be affected by these concerns. In a survey of cancer-genetics specialists, more than half indicated that they would pay out of pocket rather than bill their insurance companies for genetic testing, for fear of genetic discrimination (Hudson, 2007). Although surveys do not reveal a significant percentage of individuals who have experienced such discrimination, geneticists have reported that approximately 550 individuals were refused employment, fired, or denied life insurance based on their genetic constitution (NBAC, 1999). In addition, a study in the United Kingdom suggested that life insurers in that country do not have a full grasp of the meaning of genetic information and do not assess or act in accord with the actuarial risks presented by the information (Low et al., 1998). There is, therefore, some legitimate basis to individuals' concerns about potential economic harm and the need to protect the privacy of their genetic information. Recent passage of the Genetic Information Nondiscrimination Act in the United States will hopefully begin to address some of these concerns.4

Patient Attitudes About Privacy in Health Research

Ideally, there would be empirical evidence regarding the privacy value of all the specific Privacy Rule provisions that impact researchers, but there are only limited data on this topic from the consumer/patient perspective. A few studies have attempted to examine the public's attitudes about the use of health information in research.
However, few have attempted to do so with respect to the intricacies of the protections afforded by the Privacy Rule or the Common Rule,5 which are likely not well known to the public.

A review by Westin of 43 national surveys with health privacy questions fielded between 1993 and September 2007 identified 9 surveys6 with one or more questions about health research and privacy (Westin, 2007). In some, the majority of respondents were not comfortable with their health
4 The Genetic Information Nondiscrimination Act of 2008 establishes some protections to prevent discrimination based on a patient's genetic background.
5 The "Common Rule" is the term used by 18 federal agencies that have adopted the same regulations governing the protection of human subjects of research. See Chapter 3 for a detailed description of the rule.
6 These surveys were undertaken by a wide range of sponsors (Markle Foundation, Equifax, Institute for Health Freedom, Geneforum, Privacy Consulting Group) and a wide range of surveyors (Harris Interactive, Public Opinion Strategies, Genetics and Public Policy Center).

information being provided for health research except with notice and express consent. But in others, a majority of respondents were willing to forgo notice and consent if various safeguards were in place and specific types of research were involved. For example, a recent Harris Poll found that 63 percent of respondents would give general consent to the use of their medical records for research, as long as there were guarantees that no personally identifiable health information would be released from such studies (Harris Interactive, 2007). This is similar to the percentage of people willing to participate in a "clinical research study" (Research!America, 2007; Woolley and Propst, 2005) (see also Chapter 3). A 2006 British survey also found strong support for the use of personally identifiable information without consent for public health research and surveillance, via the National Cancer Registry (Barrett et al., 2007).

Westin noted that opinions varied in the surveys according to developments on the health care scene and with consumer privacy trends. He concluded from this review that the majority of consumers are positive about health research and, if asked in general terms, support their medical information being made available for research. However, he also noted that most of these surveys presented the choice in ways that did not articulate the key permission process, and that there was much ambiguity in who the "researchers" are, what kind of "health research" is involved, and how the promised protection of personal identities would be ensured (Westin, 2007).
Reviewing the handful of detailed studies examining patient views of the use of their medical information in research through surveys, structured interviews, or focus groups, Pritts determined that a number of common themes emerge (reviewed by Pritts, 2008):

• Patients were generally very supportive of research, provided safeguards were established to protect the privacy and security of their medical information (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004; Westin, 2007; Willison et al., 2007).
• Patients were much more comfortable with the use of anonymized data (e.g., where obvious identifiers have been removed) than fully identifiable data for research (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004; Whiddett et al., 2006).
• Patients were less comfortable with sharing information about "sensitive" conditions, such as mental health, with researchers (Damschroder et al., 2007; Robling et al., 2004).

In studies where patients were able to provide unstructured comments, they expressed concern about the potential that anonymized data would be reidentified. They were also concerned that insurers, employers, or others who could discriminate against subjects could potentially access information maintained by researchers (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004). Some feared that researchers would sell information to drug companies or other third parties (Damschroder et al., 2007).

Although supportive of research, the majority of patients in these studies expressed a desire to be consulted before their information was released for research (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004; Westin, 2007; Whiddett et al., 2006; Willison et al., 2007). Some surveys also show that even if researchers would receive no directly identifying information (e.g., name, address, and health insurance number), the majority of respondents still wanted to have some input before their medical records were disclosed (Damschroder et al., 2007; Robling et al., 2004; Willison et al., 2007). For example, in a 2005 Australian survey, 67 percent of respondents indicated they would be willing to allow their deidentified health records to be used for medical research purposes, but 81 percent wanted to be asked first (Flannery and Tokley, 2005).

Studies indicate that public support for research and willingness to share health information can vary with the purpose or type of activity being conducted (reviewed by Pritts, 2008). Studies have found there was less support for activities that were primarily for a commercial purpose, or that might be used in a manner that would not help patients (Damschroder et al., 2007; Willison et al., 2007). Some participants expressed concern that some researchers were motivated by monetary rewards and that decision makers would act out of self-interest (Damschroder et al., 2007). One recent study suggests that the biggest predictor of whether patients are willing to share their medical records with researchers is the patients' trust that their information will be kept private and confidential (Damschroder et al., 2007).
In this study, the patients who most trusted the Veterans Affairs system to keep their medical records private were more likely to accept less stringent requirements for informed consent. Thirty-four percent of veterans who participated in intensive focus groups using deliberative democracy were willing to allow researchers associated with the Veterans Health Administration to use their medical records without any procedures for patient input, subject to Institutional Review Board (IRB) approval, and another 17 percent reported that patients should have to ask for their medical records to be excluded from research studies (opt-out).

But participants in focus groups have also expressed a desire to be informed of how their health information was used for research. This desire was tied to a sense of altruism—they wanted to know that their information was useful and that they may have contributed to helping others by allowing their medical records to be used for research (Damschroder et al., 2007; Robling et al., 2004). The veterans also recommended methods to give research participants more control over how their medical records are used in research. These recommendations included requiring that participants be fully informed about how their medical records are being used

in research; providing assurances that the research being conducted will benefit fellow veterans; updating research participants about findings and ongoing research; and setting out clear and consistent consequences for anyone who violates a patient's privacy (Damschroder et al., 2007).

The recent Harris poll7 commissioned by the Institute of Medicine (IOM) committee for this study found that 8 percent of respondents had been asked to have their medical information used in research, but declined. When asked why, 30 percent indicated they were concerned about the privacy and confidentiality of their personal information, but many other reasons were also commonly cited (ranging from 5 to 24 percent of respondents), including worry that participation would be risky, painful, or unpleasant; lack of trust in the researchers; or belief that it would not help their condition or their family (Westin, 2007).

Some studies also suggest that individuals' attitudes toward the use of their medical records in research may be influenced by their state of health. Although the commissioned Harris Poll found that people who are in only fair health, who have a disability, or who had taken a genetic test were slightly more concerned than the public about health researchers seeing their medical records (55 percent versus 50 percent), other data suggest that people with health concerns may be more supportive of using medical records in research. For example, qualitative market research by the National Health Council showed that individuals with chronic conditions have a very favorable attitude toward the implementation of electronic personal health records (EPHRs). During the focus group discussions, participants noted that EPHRs could be very advantageous in medical research and were supportive of this use even though many had expressed concern about the privacy and confidentiality of EPHRs (Balch et al., 2005, 2006).
Although the Council did not specifically ask about attitudes toward health research and privacy, these results suggest that individuals with chronic conditions may be more likely to grant researchers access to their medical records, and to place less emphasis on protecting privacy, than members of the general population.

Also, a Johns Hopkins University survey of patients having, or at risk for, serious medical conditions examined these patients' attitudes about the use of their medical records in research, and compared those results to polls of the general population. Thirty-one percent of respondents stated that medical researchers should have access to their medical records without their permission if it would help to advance medical knowledge. In contrast, the recent Harris poll of the public found that 19 percent of respondents would be willing to forgo consent to the use of personal medical and health information, as long as the study never revealed their identity
7 The survey was conducted online by Harris Interactive between September 11 and 18, 2007, with 2,392 respondents. The methodology for the survey is described in Appendix B.

and it was supervised by an IRB (Westin, 2007). An additional 8 percent indicated they would be willing to give general consent in advance to have personally identifiable medical or health information used in future research projects without the researchers having to contact them, and 1 percent said researchers should be free to use their personal medical and health information without their consent at all. Thus, 28 percent of respondents would be willing to grant researchers access to their medical records without giving specific consent for each research project. Thirty-eight percent believed they should be asked to consent to each research study seeking to use their personally identifiable medical or health information, and 13 percent did not want researchers to contact them or to use their personal or health information under any circumstances. However, those who preferred not to be contacted at all were actually less likely to have declined participation in a research study than those who would grant conditional permission. Notably, 20 percent of respondents were unsure how to respond to the question about notice and consent for research.

Among the 38 percent who said they wanted notice and consent, 80 percent indicated that they would want to know the purpose of the research, and 46 percent wanted to know specifically whether the research could help their health condition or that of family members. Sixty-two percent indicated that knowing about the specific research study and who would be running it would allow them to decide whether to trust the researchers. A little more than half of the respondents (54 percent) said they would be worried that their personally identifiable information might be disclosed outside the study.
Among those 54 percent, three-quarters agreed with the statement "I would feel violated and my trust in the researchers betrayed." Between 39 and 67 percent were concerned about discrimination in a government program, by an employer, or in obtaining life or health insurance (Westin, 2007). However, about 70 percent of all respondents indicated that they trusted health researchers to protect the privacy and confidentiality of the medical records and health information they obtain about research participants. Furthermore, among respondents who had participated in health research, only 2 percent reported that any of their personally identifiable medical information used in a study was given to anyone outside the research staff, and half of those disclosures were actually made to other researchers or research institutions (Westin, 2007).

In summary, very limited data are available to assess the privacy value of the Privacy Rule provisions that impact researchers. Surveys indicate that the public is deeply concerned about the privacy and security of personal health information, and that the HIPAA Privacy Rule has perhaps reduced—but not eliminated—those concerns. Patients were generally very supportive of research, provided safeguards were established to protect the privacy and security of their medical information, although some surveys

BEYOND THE HIPAA PRIVACY RULE

research applications, and then encourage and facilitate broader use of such standards in the health research community.

POTENTIAL TECHNICAL APPROACHES TO HEALTH DATA PRIVACY AND SECURITY

The security of data will continue to grow in importance as the health care industry moves toward greater implementation of electronic health records, and Congress has already proposed numerous bills to facilitate and regulate that transition (see also Chapter 6). Advances in information technology will likely make it easier to implement measures such as audit trails and access controls in the future. Although the committee does not recommend a specific technology solution, at least four technological approaches to enhancing data privacy and security have been proposed by others as having the potential to be particularly influential in health research: (1) privacy-preserving data mining and statistical disclosure limitation, (2) personal electronic health record devices, (3) independent consent management tools, and (4) pseudonymization. Each seeks to minimize or eliminate the transfer of personally identifiable data (Burkert, 2001). The advantages, limitations, and current feasibility of each are described briefly below.

Privacy-preserving data mining and statistical disclosure limitation. In recent years, a number of techniques have been proposed for modifying or transforming data so as to preserve privacy while still permitting statistical analysis (reviewed in Aggarwal and Yu, 2008; NRC, 2000, 2005, 2007b,c). Typically, such methods reduce the granularity of representation in order to protect confidentiality. There is, however, a natural trade-off between information loss and confidentiality protection, because this reduction in granularity diminishes the accuracy and utility of the data and of the methods used to analyze them.
Thus, a key issue is to maintain maximum utility of the data without compromising the underlying privacy constraints. In addition, there are a very large number of definitions of privacy and its protection in the statistical disclosure limitation and privacy-preserving data mining literatures, in part because of their varying goals. Examples of statistical disclosure limitation and privacy-preserving data mining methods include perturbation methods such as noise addition, which attempts to mask the identifiable attributes of individual records; aggregation methods such as k-anonymity, which attempts to reduce the granularity of representation of the data in such a way that a given record cannot be distinguished from at least (k – 1) other records; the release of summary statistics that can be used for actual statistical analyses, such as marginal

totals from contingency tables; and various approaches to the generation of synthetic data. Several of these are reviewed in Aggarwal and Yu (2008). Other technologies include cryptographic methods for distributed privacy protection, which operate by allowing researchers to query various databases online using cryptographic algorithms (Brands, 2007; reviewed in Aggarwal and Yu, 2008); query auditing techniques; and output perturbation using a methodology known as differential privacy (many of these techniques are reviewed in Aggarwal and Yu, 2008, and Dwork, 2008). These technologies aim to protect privacy by minimizing the outflow of information to researchers, because the providers of the databases do not make any of the actual data available to the researchers. The principal drawback of many of these methods is the potentially limited utility of the released information, especially for secondary analyses not planned in advance. Each of the methods referred to above has strengths and weaknesses for specific kinds of statistical analyses. Precisely how this body of developing methodologies may be effectively used in the types of health research envisioned in this report remains an open question, and this is an area of active research. Thus, alternative mechanisms for data protection, going beyond the removal of obvious identifiers and the application of limited modifications of data elements, are required. These mechanisms need to be backed up by legal penalties and sanctions.

Personal electronic health record devices. The use of personal electronic health record devices requires that all individuals possess a personal electronic device, such as a personal digital assistant (PDA) or personal computer, to manage their health information. The electronic device is intended to be used by individuals to aggregate all of their health information in one location (i.e., the electronic device).
The infrastructure for implementing this privacy-enhancing technology exists, but there are several serious problems with relying on it in health research. First, it is unclear who would provide individuals with the devices, how the devices would be maintained, and who would bear the cost of that maintenance. Second, it is impossible for researchers to query every single individual for permission to access his or her personal electronic health record device in order to determine whether he or she meets the criteria for the relevant study. Only individuals who are on the Internet and are involved in health research could easily be queried. Third, the use of personal electronic devices would make it almost impossible to aggregate data, because of the difficulty of accessing data from multiple sources. These problems are sufficiently serious that this technology is unlikely to offer a satisfactory solution to the privacy and security concerns in health research (Brands, 2007).

Independent consent management tools. The independent consent management tool (or infomediary) relies on a health trust to store all of an individual's health data. When researchers are interested in accessing an individual's health information for a study, they must contact the health trust. The health trust then approaches the individual and asks whether he or she is willing to give consent for the research. Examples of this technology include Microsoft's HealthVault, Google Health, and Revolution Health. Independent consent management tools allow individuals to make blanket consents for their health information to be released for certain types of research. For example, an individual can have a standing consent that his or her information can be released to all researchers at the Mayo Clinic, or for all research on cancer, and so on. Thus, the use of a health trust allows an individual to retain the power of consent for all uses of his or her health information without requiring a specific consent in every instance (Brands, 2007). Some privacy advocates view this technology very favorably because they see it as a way to give patients complete control over who can see and use their health information (PPR, 2008). However, the use of this technology in health research has several major problems. The first is that the health trust in this system becomes a "honey pot" (i.e., the health trust holds all of an individual's data). This creates serious trust and security issues, because a person's entire health record is stored with a single entity (Brands, 2007). A 2006 survey of global financial services institutions found that nearly 50 percent of all reported security breaches were the result of an internal failure (e.g., a virus or worm originating inside the organization, insider fraud, or inadvertent leakage of consumer data) (Melek and MacKinnon, 2006). Many security breaches in health care are likely also the result of internal failures.
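The standing-consent model just described can be made concrete with a small sketch. The rule format and function below are hypothetical illustrations, not features of HealthVault, Google Health, or any other product named here.

```python
# Hypothetical standing-consent rules held by a health trust. A rule with a
# missing field acts as a wildcard: {"institution": "Mayo Clinic"} covers any
# study at the Mayo Clinic, and {"topic": "cancer"} covers any cancer study.
def consent_covers(rules, institution, topic):
    """Return True if any standing rule covers the requesting study."""
    for rule in rules:
        institution_ok = rule.get("institution") in (None, institution)
        topic_ok = rule.get("topic") in (None, topic)
        if institution_ok and topic_ok:
            return True
    return False

standing_consents = [
    {"institution": "Mayo Clinic"},  # all researchers at the Mayo Clinic
    {"topic": "cancer"},             # all research on cancer, anywhere
]
```

Under these rules, any study at the Mayo Clinic or any cancer study would be released without a fresh consent; any other request would trigger the trust to contact the individual.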
In addition, these organizations are currently not regulated by the HIPAA Privacy Rule, so no federal privacy restrictions prevent these entities from releasing individuals' data to the government, marketing companies, or others, and no data security requirements are mandatory. New legislation or regulation making health trusts liable for security breaches may be necessary before the public is willing to trust these organizations to store personal health data (Metz, 2008).

The second major impediment to the widespread adoption of independent consent management tools is the difficulty of providing individuals with secure online access to view their health information. The companies marketing this technology need to develop a mechanism by which individuals can access their medical information held by the health trust without endangering its security and privacy. The current methods for individual authentication online do not work well (NRC, 2003), although the use of a strong authentication system in a single domain may solve this problem. The companies will also need to address the fact that a significant portion of the population does not have online access at all (Brands, 2007).
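As one illustration of what authentication stronger than a static password can look like, the sketch below implements a time-based one-time password (TOTP, RFC 6238, built on the HOTP construction of RFC 4226) using only the Python standard library. This is an illustrative sketch of one well-known technique, not a mechanism drawn from any product discussed here.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    # HMAC-SHA1 over the big-endian 8-byte counter
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """Time-based variant (RFC 6238): counter is the current time step."""
    return hotp(secret, int(time.time()) // period, digits)
```

Because the code changes every 30 seconds and is derived from a shared secret, a stolen password alone is not enough to impersonate the account holder.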

The final problem with using independent consent management systems in health research is the inability to ensure the authenticity and integrity of responses. There is no existing method by which the health trusts can guarantee to researchers that the information contained in their database is accurate. If data are authenticated using existing methods, such as digital signing, then it is impossible to fully protect the privacy of the individuals whose information is being disclosed (NRC, 2003). Cryptographic selective disclosure techniques may eventually solve this problem, but the technology does not yet exist (Brands, 2007).

Pseudonymization. Pseudonymization is a method "used to replace the true identities (nominative) of individuals or organizations in databases by pseudo-identities (pseudo-IDs) that cannot be linked directly to their corresponding nominative identities" (Claerhout and De Moor, 2005). The benefit of using pseudonymization in health research is that it protects individuals' identities while allowing researchers to link personal data across time and place by relying on the pseudo-IDs. Most pseudonymization methods use a trusted third party to perform the pseudonymization process. This results in at least three entities being involved in the creation of each database: the data source, which has access to nominative personal data (e.g., protected health information); the trusted third party; and the data register, which uses the pseudonymized data for research. Two methods of pseudonymization are batch data collection and interactive data collection. In batch data collection, the data supplier splits the data into two parts: (1) the identifiers that relate to a specific person (e.g., Social Security number, name), and (2) the payload data, which include all the nonidentifiable data associated with each individual.
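The batch split just described can be sketched in a few lines. The field names and keys below are hypothetical, and keyed hashing stands in for the pseudonymization functions of Claerhout and De Moor's protocol; a real deployment would follow that protocol in full.

```python
import hashlib
import hmac

SOURCE_KEY = b"data-source-secret"       # hypothetical key held by the data source
TTP_KEY = b"trusted-third-party-secret"  # hypothetical key held by the third party

def split_record(record, identifier_fields):
    """Split a record into identifying fields and nonidentifiable payload."""
    identifiers = {k: v for k, v in record.items() if k in identifier_fields}
    payload = {k: v for k, v in record.items() if k not in identifier_fields}
    return identifiers, payload

def prepseudonymize(identifiers):
    """Data source replaces identifiers with a keyed hash (the prepseudonym)."""
    canonical = "|".join(f"{k}={identifiers[k]}" for k in sorted(identifiers))
    return hmac.new(SOURCE_KEY, canonical.encode(), hashlib.sha256).hexdigest()

def finalize_pseudo_id(prepseudonym):
    """Trusted third party converts the prepseudonym into the final pseudo-ID."""
    return hmac.new(TTP_KEY, prepseudonym.encode(), hashlib.sha256).hexdigest()
```

Because the same identifiers always map to the same pseudo-ID, the data register can link a patient's records across time and place without ever seeing a name or Social Security number.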
The data are prepseudonymized at the data source and transferred to the trusted third party, which converts the prepseudonyms into final pseudo-IDs. Both the final pseudo-IDs and the payload data are transferred to the data register, where they are stored and used for research; no data are stored with the trusted third party. Privacy concerns are minimized because the only version of the data available to researchers is the pseudonymized data.

Interactive data collection is used in situations where neither the data supplier nor the data register needs local storage of the data. All the data are stored by the trusted third party in pseudonymous form, and both the data supplier and the data register must query the trusted third party to access the data (Claerhout and De Moor, 2005; De Moor et al., 2003).

It is unclear how technologies relying on pseudonymization would be implemented under the requirements of the HIPAA Privacy Rule. In order for information to be considered deidentified, the HIPAA Privacy Rule specifically states that covered entities can assign a code or other means of

record identification (such as a pseudo-ID), but the code cannot be derived from, or related to, information about the subject of the information.15 This means that any pseudo-IDs created using this technology must be based entirely on nonpersonal information. Alternatively, any researchers using the pseudonymized data must go through the normal IRB/Privacy Board review process.

CONCLUSIONS AND RECOMMENDATIONS

Based on its review of the information described in this chapter, the committee agreed on an overarching principle to guide the formation of recommendations. The committee affirms the importance of maintaining and improving the privacy of health information. In the context of health research, privacy includes the commitment to handle the personal information of patients and research participants with meaningful privacy protections, including strong security measures, transparency, and accountability.16 These commitments extend to everyone who collects, uses, or has access to the personally identifiable health information of patients and research participants. Practices of security, transparency, and accountability take on extraordinary importance in the health research setting: Researchers and other data users should disclose clearly how and why personal information is being collected, used, and secured, and should be subject to legally enforceable obligations to ensure that personally identifiable information is used appropriately and securely. In this manner, privacy protection will help to ensure research participation and public trust and confidence in medical research.

As part of the process of implementing this principle in the federal oversight regime for health research, the committee recommends that all institutions in the health research community that are involved in the collection, use, and disclosure of personally identifiable health information take strong measures to safeguard the security of health data.
For example, institutions could:

• Appoint a security officer responsible for assessing data protection needs and implementing solutions and staff training.
• Make greater use of encryption and other techniques for data security.
• Include data security experts on IRBs.
• Implement a breach notification requirement, so that patients can take steps to protect their identity in the event of a breach.
• Implement layers of security protection to eliminate single points of vulnerability to security breaches.

In addition, the federal government should support the development and use of:

• Genuine privacy-enhancing techniques that minimize or eliminate the collection of personally identifiable data.
• Standardized self-evaluations, security audits, and certification programs to help institutions achieve the goal of safeguarding the security of personal health data.

Effective health privacy protections require effective data security measures. The HIPAA Security Rule (which entails a set of regulatory provisions separate from the Privacy Rule) already sets a floor for data security standards within covered entities, but not all institutions that conduct health research are subject to HIPAA regulations. Also, the survey data presented in this chapter show that neither the HIPAA Privacy Rule nor the HIPAA Security Rule has directly improved public confidence that personal health information will be kept confidential. Therefore, all institutions conducting health research should undertake measures to strengthen data protections. For example, given the recent spate of lost or stolen laptops containing patient health information, encryption should be required for all laptops and removable media containing such data. In general, however, given the differences among the missions and activities of institutions in the health research community, some flexibility in the implementation of specific security measures will be necessary. Enhanced security would reduce the risk of data theft and reinforce the public's trust in the research community by diminishing anxiety about the potential for unintentional disclosure of information.

15 Standards for Privacy of Individually Identifiable Health Information: Final Rule, 67 Fed. Reg. 53182, 53232 (2002).
16 This is derived from the principles of fair information practices (see Chapter 2 for more detail).
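One of the measures recommended in this chapter, the audit trail, can be made tamper-evident with a simple hash chain: each log entry's hash covers the previous entry, so a retroactive edit breaks the chain. The sketch below is an illustration of the idea only, not drawn from any particular product or standard.

```python
import hashlib
import json

class AuditTrail:
    """Append-only access log; each entry's hash covers the previous entry."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # sentinel hash for the chain head

    def record(self, user: str, action: str, record_id: str) -> None:
        entry = {"user": user, "action": action, "record_id": record_id,
                 "prev": self._last_hash}
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        self.entries.append(entry)
        self._last_hash = digest

    def verify(self) -> bool:
        """Recompute the chain; any retroactive edit makes this return False."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

An auditor can detect whether any recorded access was altered or deleted after the fact, which supports the accountability and breach-detection goals described above.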
The publication of best practices and outreach to all stakeholders by HHS, combined with a cooperative approach to compliance with security standards, such as self-evaluation and audit programs, would promote progress in this area. Research sponsors could also play a role in fostering the adoption of best practices in data security.

REFERENCES

Aggarwal, C. C., and P. S. Yu, eds. 2008. Privacy-preserving data mining: Models and algorithms. Boston, MA: Kluwer Academic Publishers.
AHIMA (American Health Information Management Association). 2006. The state of HIPAA privacy and security compliance. http://www.ahima.org/emerging_issues/2006StateofHIPAACompliance.pdf (accessed April 20, 2008).

Allen, A. 1997. Genetic privacy: Emerging concepts and values. In Genetic secrets: Protecting privacy and confidentiality in the genetic era, edited by M. Rothstein. New Haven, CT: Yale University Press. Pp. 31–59.
Balch, G. I., L. Doner, M. K. Hoffman, and E. Macario. 2005. An exploration of how patients and family caregivers think about counterfeit drugs and the safety of prescription drug retail outlets for the National Health Council. Oak Park, IL: Balch Associates.
Balch, G. I., L. M. A. Doner, M. K. Hoffman, M. P. Merriman, E. Monroe-Cook, and G. Rathjen. 2006. Concept and message development research on engaging communities to promote electronic personal health records for the National Health Council. Oak Park, IL: Balch Associates.
Barrett, G., J. A. Cassell, J. L. Peacock, and M. P. Coleman. 2007. National survey of British public's view on use of identifiable medical data by the National Cancer Registry. British Medical Journal 332(7549):1068–1072.
Bloustein, E. 1967. Privacy as an aspect of human dignity: An answer to Dean Prosser. New York Law Review 39:34.
Bodger, J. A. 2006. Note, taking the sting out of reporting requirements: Reproductive health clinics and the constitutional right to informational privacy. Duke Law Journal 56:583–609.
Burkert, H. 2001. Privacy-enhancing technologies: Typology, critique, vision. In Technology and privacy: The new landscape, edited by P. E. Agre and M. Rotenberg. Cambridge, MA: The MIT Press. Pp. 125–142.
Claerhout, B., and G. J. E. De Moor. 2005. Privacy protection for clinical and genomic data: The use of privacy-enhancing techniques in medicine. Journal of Medical Informatics 74:257–265.
Conn, J. 2008. CMS' HIPAA watchdog presents potential conflict. Modern Healthcare. http://www.modernhealthcare.com (accessed July 28, 2008).
Damschroder, L. J., J. L. Pritts, M. A. Neblo, R. J. Kalarickal, J. W. Creswell, and R. A. Hayward. 2007. Patients, privacy and trust: Patients' willingness to allow researchers to access their medical records. Social Science & Medicine 64(1):223–235.
De Moor, G. J. E., B. Claerhout, and F. De Meyer. 2003. Privacy enhancing techniques: The key to secure communication and management of clinical and genomic data. Methods of Information in Medicine 42:148–153.
Dwork, C. 2008. An ad omnia approach to defining and achieving private data analysis. In Proceedings of the First SIGKDD International Workshop on Privacy, Security, and Trust in KDD (invited). Lecture Notes in Computer Science 4890.
Feld, A. D. 2005. The Health Insurance Portability and Accountability Act (HIPAA): Its broad effect on practice. American Journal of Gastroenterology 100(7):1440–1443.
Flannery, J., and J. Tokley. 2005. AMA poll shows patients are concerned about the privacy and security of their medical records. Australian Medical Association. http://www.ama.com.au/web.nsf/doc/WEEN-6EG7LY (accessed December 10, 2007).
Forrester Research. 1999. National survey: Confidentiality of medical records. http://www.chcf.org (accessed February 12, 2007).
Forrester Research. 2005. National consumer health privacy survey 2005. http://www.chcf.org/topics/view.cfm?itemID=115694 (accessed February 12, 2007).
Fried, C. 1968. Privacy. Yale Law Journal 77:475–493.
GAO (Government Accountability Office). 2007. Personal information: Data breaches are frequent, but evidence of resulting identity theft is limited. Washington, DC: GAO.
GAO. 2008a. Although progress reported, federal agencies need to resolve significant deficiencies: Statement of Gregory C. Wilshusen, Director, Information Security Issues. Washington, DC: GAO.

GAO. 2008b. Information security: Progress reported, but weaknesses at federal agencies persist: Statement of Gregory C. Wilshusen, Director, Information Security Issues. Washington, DC: GAO.
Gavison, R. 1980. Privacy and the limits of the law. Yale Law Journal 89:421–471.
Gellman, R. 2008. Fair information practices: A basic history. http://bobgellman.com/rg-docs/rg-FIPshistory.pdf (accessed April 15, 2008).
Goldman, J. 1998. Protecting privacy to improve health care. Health Affairs 17(6):47–60.
Gostin, L. O. 1995. Health information privacy. Cornell Law Review 80:101–184.
Gostin, L. 2001. Health information: Reconciling personal privacy with the public good of human health. Health Care Analysis 9:321.
Gostin, L. 2008. Surveillance and public health research: Personal privacy and the "right to know." In Public health law: Power, duty, restraint. 2nd ed. Berkeley, CA: University of California Press.
Gostin, L. O., and J. G. Hodge. 2002. Personal privacy and common goods: A framework for balancing under the national health information Privacy Rule. Minnesota Law Review 86:1439.
Greenhouse, S., and M. Barbaro. 2005. Walmart memo suggests ways to cut employee benefit costs. The New York Times. http://www.nytimes.com/2005/10/26/business/26walmart.ready.html?pagewanted=1&_r=1 (accessed April 14, 2008).
Harris Interactive. 2005. Health Information Privacy (HIPAA) notices have improved public's confidence that their medical information is being handled properly. http://www.harrisinteractive.com/news/printerfriend/index.asp?NewsID=849 (accessed April 3, 2007).
Harris Interactive. 2007. Many U.S. adults are satisfied with use of their personal health information. http://www.harrisinteractive.com/harris_poll/index.asp?PID=743 (accessed May 15, 2007).
HEW (Department of Health, Education and Welfare). 1973. Records, computers and the rights of citizens: Report of the Secretary's Advisory Committee on Automated Personal Data Systems. http://aspe.hhs.gov/datacncl/1973privacy/tocprefacemembers.htm (accessed July 12, 2008).
Hodge, J. G., Jr., L. O. Gostin, and P. D. Jacobson. 1999. Legal issues concerning electronic health information: Privacy, quality, and liability. JAMA 282(15):1466–1471.
Hudson, K. L. 2007. Prohibiting genetic discrimination. New England Journal of Medicine 356:2021.
IOM (Institute of Medicine). 2000. Protecting data privacy in health services research. Washington, DC: National Academy Press.
ITRC (Identity Theft Resource Center). 2006. 2006 disclosures of U.S. data incidents. http://idtheftmostwanted.org/ITRC%20Breach%20Report%202006.pdf (accessed July 7, 2008).
ITRC. 2007. 2007 breach list. http://idtheftmostwanted.org/ITRC%20Breach%20Report%202007.pdf (accessed July 7, 2008).
ITRC. 2008. Security breaches. http://www.idtheftcenter.org/artman2/publish/lib_survey/ITRC_2008_Breach_List_printer.shtml (accessed July 22, 2008).
Kass, N. E., M. R. Natowicz, S. C. Hull, R. R. Faden, L. Plantinga, L. O. Gostin, and J. Slutsman. 2003. The use of medical records in research: What do patients want? Journal of Law, Medicine & Ethics 31:429–433.
Low, L., S. King, and T. Wilkie. 1998. Genetic discrimination in life insurance: Empirical evidence from a cross sectional survey of genetic support groups in the United Kingdom. British Medical Journal 317:1632–1635.
Lowrance, W. W. 1997. Privacy and health research: A report to the U.S. Secretary of Health and Human Services. http://aspe.hhs.gov/DATACNCL/PHR.htm (accessed May 10, 2008).

Lowrance, W. W. 2002. Learning from experience: Privacy and the secondary use of data in health research. London: The Nuffield Trust.
Magnussen, R. 2004. The changing legal and conceptual shape of health care privacy. The Journal of Law, Medicine & Ethics 32:681.
Melek, A., and M. MacKinnon. 2006. Deloitte global security survey. http://www.deloitte.com/dtt/cda/doc/content/us_fsi_150606globalsecuritysurvey(1).pdf (accessed July 23, 2008).
Metz, R. 2008. Google makes health service publicly available. Associated Press. http://biz.yahoo.com/ap/080519/google_health.html (accessed August 13, 2008).
Moore, A. 2005. Intangible property: Privacy, power and information control. In Information ethics: Privacy, property, and power, edited by A. Moore. Seattle, WA: University of Washington Press.
NBAC (National Bioethics Advisory Commission). 1999. Research involving human biological materials: Ethical issues and policy guidance, report and recommendations. Vol. 1. Rockville, MD: NBAC.
NBAC. 2001. Ethical and policy issues in research involving human participants. Rockville, MD: NBAC.
NCSL (National Conference of State Legislatures). 2008. Privacy protections in state constitutions. http://www.ncsl.org/programs/lis/privacy/stateconstpriv03.htm (accessed June 10, 2008).
Nissenbaum, H. 2004. Privacy as contextual integrity. Washington Law Review 79:101–139.
NRC (National Research Council). 2000. Improving access to and confidentiality of research data: Report of a workshop. Washington, DC: National Academy Press.
NRC. 2003. Who goes there?: Authentication through the lens of privacy. Washington, DC: The National Academies Press.
NRC. 2005. Expanding access to research data: Reconciling risks and opportunities. Washington, DC: The National Academies Press.
NRC. 2007a. Engaging privacy and information technology in a digital age. Washington, DC: The National Academies Press.
NRC. 2007b. Privacy and information technology in a digital age. Washington, DC: The National Academies Press.
NRC. 2007c. Putting people on the map: Protecting confidentiality with linked social-spatial data. Washington, DC: The National Academies Press.
OCR (Office for Civil Rights). 2008. HIPAA compliance and enforcement. http://www.hhs.gov/ocr/privacy/enforcement/ (accessed August 13, 2008).
OECD. 1980. Guidelines on the protection of privacy and transborder flows of personal data. http://www.oecd.org/document/0,2340,en_2649_34255_1815186_1_1_1_1,00.html (accessed August 13, 2008).
OIG (Office of Inspector General). 2008. Nationwide review of the Centers for Medicare & Medicaid Services Health Insurance Portability and Accountability Act of 1996 oversight. Washington, DC: Department of Health and Human Services.
OTA (Office of Technology Assessment). 1993. Protecting privacy in computerized medical information. Washington, DC: OTA.
Petrila, J. 1999. Medical records confidentiality: Issues affecting the mental health and substance abuse systems. Drug Benefit Trends 11:6–10.
Post, R. 2001. Three concepts of privacy. Georgetown Law Journal 89:2087–2089.
PPR (Patient Privacy Rights). 2008 (October 4). Press release: Microsoft raises the bar for privacy in electronic health record solutions. http://www.patientprivacyrights.org/site/PageServer?pagename=HealthVault_PressRelease/ (accessed August 13, 2008).
PRC (Privacy Rights Clearinghouse). 2008. A chronology of data breaches. http://www.privacyrights.org/ar/ChronDataBreaches.htm (accessed July 8, 2008).

Pritts, J. L. 2002. Altered states: State health privacy laws and the impact of the federal health Privacy Rule. Yale Journal of Health Policy, Law & Ethics 2(2):327–364.
Pritts, J. 2008. The importance and value of protecting the privacy of health information: Roles of HIPAA Privacy Rule and the Common Rule in health research. http://www.iom.edu/CMS/3740/43729/53160.aspx (accessed March 15, 2008).
Privacy Protection Study Commission. 1977. Personal privacy in an information society. http://epic.org/privacy/ppsc1977report/ (accessed April 21, 2008).
PSRA (Princeton Survey Research Associates). 1999. Medical privacy and confidentiality survey. http://www.chcf.org/topics/view.cfm?itemID=12500 (accessed August 11, 2008).
Rachels, J. 1975. Why privacy is important. Philosophy and Public Affairs 4:323–333.
Regan, P. 1995. Legislating privacy: Technology, social values, and public policy. Chapel Hill, NC: University of North Carolina Press.
Research!America. 2007. America speaks: Poll summary. Vol. 7. Alexandria, VA: United Health Foundation.
Richards, N. M., and D. J. Solove. 2007. Privacy's other path: Recovering the law of confidentiality. Georgetown Law Journal 96:124.
Roback, H., and M. Shelton. 1995. Effects of confidentiality limitations on the psychotherapeutic process. Journal of Psychotherapy Practice and Research 4:185–193.
Robling, M. R., K. Hood, H. Houston, R. Pill, J. Fay, and H. M. Evans. 2004. Public attitudes towards the use of primary care patient record data in medical research without consent: A qualitative study. Journal of Medical Ethics 30:104–109.
Saver, R. 2006. Medical research and intangible harm. University of Cincinnati Law Review 74:941–1012.
Solove, D. J. 2006. A taxonomy of privacy. University of Pennsylvania Law Review 154:516–518.
Taube, D. O., and A. Elwork. 1990. Researching the effects of confidentiality law on patients' self-disclosures. Professional Psychology: Research and Practice 21:72–75.
Taylor, C. 1989. Sources of the self: The making of modern identity. Cambridge, MA: Harvard University Press.
Terry, N. P., and L. P. Francis. 2007. Ensuring the privacy and confidentiality of electronic health records. University of Illinois Law Review 2007(2):681–736.
Tourangeau, R., L. J. Rips, and K. Rasinski. 2000. The psychology of survey response. Cambridge, UK: Cambridge University Press.
Turn, R., and W. H. Ware. 1976. Privacy and security issues in information systems. The RAND Paper Series. Santa Monica, CA: The RAND Corporation.
Weddle, M., and P. Kokotailo. 2005. Confidentiality and consent in adolescent substance abuse: An update. Virtual Mentor, American Medical Association Journal of Ethics. http://virtualmentor.ama-assn.org/2005/03/pdf/pfor1-0503.pdf (accessed August 1, 2008).
Wentland, E. J. 1993. Survey responses: An evaluation of their validity. San Diego, CA: Academic Press.
Westin, A. 1966. Science, privacy and freedom. Columbia Law Review 66(7):1205–1253.
Westin, A. 1967. Privacy and freedom. New York: Atheneum.
Westin, A. 1976. Computers, health records, and citizen rights. http://eric.ed.gov/ERICWebPortal/custom/portlets/recordDetails/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED143358&ERICExtSearch_SearchType_0=no&accno=ED143358 (accessed July 30, 2008).
Westin, A. 2007. How the public views privacy and health research. Institute of Medicine. http://www.iom.edu/Object.File/Master/48/528/%20Westin%20IOM%20Srvy%20Rept%2011-1107.pdf (accessed November 11, 2007).

Whiddett, R., I. Hunter, J. Engelbrecht, and J. Handy. 2006. Patients' attitudes towards sharing their health information. International Journal of Medical Informatics 75(7):530–541.
Willison, D. J., L. Schwartz, J. Abelson, C. Charles, M. Swinton, D. Northrup, and L. Thabane. 2007 (September 25–28). Alternatives to project-specific consent for access to personal information for health research: What do Canadians think? Paper presented at the 29th International Conference of Data Protection and Privacy Commissioners, Montreal, Canada.
Woolley, M., and S. M. Propst. 2005. Public attitudes and perceptions about health related research. JAMA 294:1380–1384.