5
Healthcare Data as a Public Good: Privacy and Security

INTRODUCTION

Any consideration of clinical data as a public good raises questions concerning the safety and security of individual patient records. Maintaining the confidentiality of data records is of paramount importance. Public perceptions of privacy in the context of medical records link directly to the trust the public places in the entire healthcare establishment and factor significantly into discussions of health data sharing. This complex issue has many challenging dimensions, from what happens after the initial intake of an individual's data to what happens in data aggregation and secondary use. This chapter provides commentary from four experts who consider key legal and social challenges to privacy from a variety of perspectives, including public opinion, the implications of the Health Insurance Portability and Accountability Act (HIPAA), and institutions' experiences inside and outside of health care.

To provide insight into the public's views on privacy issues in health care, Alan Westin, professor emeritus of public law and government at Columbia University and principal of the Privacy Consulting Group, presents outcomes of the 2007 national Harris/Westin survey, which evaluates public attitudes toward the current state of health information privacy and security protection.1 The survey examines attitudes about the handling of sensitive patient information, health research activities involving individual patient data, and

1 This survey was commissioned by the Institute of Medicine as part of the work of the IOM Committee on Health Research and the Privacy of Health Information.



Copyright © National Academy of Sciences. All rights reserved.

opinions on the extent to which trust is accorded to health researchers by the public. The results indicate that the public holds strong privacy concerns about how their personal health information is handled, especially uses of data not directly relevant to providing care. The survey also indicates that current laws and organizational practices may not provide adequate privacy protection for patients. Westin suggests that patient-controlled privacy policies, such as those offered through repositories of personal health records, might help with gaining traction on the issues of clinical data, privacy, and security with the public. He also recommends a scope of activities related to health privacy, patient notice, and public education on privacy and compliance as opportunities to advance evidence-based medicine (EBM).

Balancing patient privacy protections with advancing data-driven clinical research and care delivery is an ongoing challenge for many healthcare organizations. In 2003, the HIPAA Privacy Rule took effect, and early changes to the Rule permitted sharing healthcare data for restricted purposes, essentially easing some limitations on providers and health plans related to health services research. With the increased incorporation of electronic health records (EHRs) into care delivery and research, the growing volumes of valuable data for evidence-based research and care may eventually force significant changes to strike a balance between privacy and advancement. Marcy Wilder, a partner in the law firm of Hogan & Hartson, LLP, and former deputy general counsel at the Department of Health and Human Services (HHS), where she helped to develop HIPAA, comments on some important remaining legal barriers to effectively using clinical data for research.
In particular, Wilder highlights the growing opportunity to address, through policy, the confluence of future, unspecified research and individual rights regarding the use of individual data. Also notable are her suggestions of formally reviewing HIPAA deidentification standards, safe harbor requirements, and the distribution of liability burdens across covered and noncovered entities.

Providing examples of other sectors' approaches to striking a balance between privacy and security and research innovation, Elliott Maxwell, a fellow in the communications program at Johns Hopkins University and distinguished research fellow at Pennsylvania State University, discusses the notion of data openness as demonstrated through projects such as the Human Genome Project. Examples of greater openness are also prevalent in the public registration of clinical trials and open-access journals. Greater digital openness has the potential to transform the use and application of clinical data in EBM, Maxwell suggests, but it must be tempered with determinations of the appropriate level of openness for given purposes. Maxwell provides an overview of the Committee for Economic Development's report Harnessing Openness to Transform American Health Care, including recommendations on patient consent requirements, electronic filing of device

and drug approvals, and EHR adoption incentives. The report advocates for increased federal support for large clinical databases to accelerate advancements in EBM and standards development.

The quality of clinical care and access to care services are ubiquitous issues in American health care. The public demands higher-quality care at lower costs with greater access. Healthcare data are uniquely positioned to provide deep insights into care delivery processes and outcomes. Simultaneously, provider organizations must secure individual patient health information and improve the coordination and quality of care. The tension between access to insight-generating data and the security of health data continues to create significant barriers for organizations striving to provide clinical services. Alexandra Eremia, associate general counsel and corporate privacy officer at MedStar Health, Inc., discusses perceived and actual privacy or security hurdles experienced at healthcare delivery organizations nationwide. She elaborates on the opportunities for building trust in patients, structuring organizational policies and strategies to avoid adverse legal outcomes, and making strategic fiscal decisions associated with data retrieval and release for research. Addressing these opportunities through financial, strategic, regulatory, and public initiatives may advance access to healthcare data for research and EBM purposes.

PUBLIC VIEWS

Alan Westin, Ph.D., L.L.B.
Principal, Privacy Consulting Group

Based on a national Harris/Westin survey in 2007 sponsored by an IOM project, this paper describes public attitudes toward the current state of health information privacy and security protection; health provider handling of patient data; health research activities; and trust in health researchers.
The public is segmented into persons who have participated in health research projects, those who have been invited but declined (and why), and those never invited. Members of the public are identified who believe their personal health information has been disclosed improperly, and by whom. Explaining the benefits and risks involved in having one's personally identified health records used in health research, the paper explores what kinds of advance patient/consumer notice and consent mechanisms are desired by various subsets of the public. Potential privacy harms are documented that patients foresee if their health records are used without notice and choice mechanisms, or disclosed improperly. The findings are applied to emerging large-scale health data systems, especially new online personal health record repositories and health data-mining programs. In terms of

positive actions suggested by these survey results, the paper discusses updated federal health privacy rights in legislation supporting information technology/EHR programs; national educational campaigns on the value of health research under robust health privacy rules and procedures; and new software tools to put direct control over the uses of health records into the hands of individual patients, through an individually driven "switch" mechanism between health data providers and health-research data seekers.

Privacy is pervasive in terms of the future of health information technology (HIT). How the public feels about privacy issues links directly to the trust level that people have in the entire healthcare establishment, and factors significantly in the move to EHRs, personal health records, interoperability exchanges, and so forth. Trust is a fragile commodity. Anything that profoundly threatens the trust that patients have in the healthcare system and in health researchers is a very dangerous step. We need to be careful, and my hope is that the survey data reported here will document this.

A national survey sponsored by an IOM working committee (Committee on Health Research and the Privacy of Health Information: The HIPAA Privacy Rule) investigated how the public feels about privacy in health care and the use of their information across the spectrum of healthcare operations. The survey's sample was 2,392 respondents who were 18 years of age and older. The data were adjusted to represent the entire population of 255 million persons age 18 years and older. We could analyze survey results not only by the majority, but also by health status groups, by standard demographics, by people who reported on their personal experiences in healthcare use, and by their policy attitudes.
This paper presents only top-level results; the full 2007 survey project report is available through the Public Access Records Office of The National Academies (publicac@nas.edu).

The survey formulated four statements and asked people to agree or disagree with each. The first statement concerned how much people trusted their own healthcare providers—doctors and hospitals—to protect the privacy and confidentiality of their personal medical records and health information. A significant 83 percent expressed such trust, a result confirmed by many other surveys. (See Appendix D in the full 2007 survey project report, available from the IOM as noted above.) These surveys have shown high trust in the healthcare provider establishment as manifested in the direct relationships among the patient, doctor, labs, hospital, and so forth.

However, when we asked people whether a healthcare provider had ever disclosed their personally identified medical or health information in a way they believed was improper, 12 percent said yes. That represents roughly 27 million adults. The survey report shows how many said the information was disclosed by their doctor, their hospital, their pharmacy, their lab, their insurer, and others. This response indicates that a significant segment

of the public is really not comfortable with the way even their healthcare providers have handled their confidential information.

The second question asked how much people agreed with this statement: "Health researchers can generally be trusted to protect the privacy and confidentiality of the medical records and health information they get about research subjects." Sixty-nine percent agreed with that statement; fewer than for the healthcare providers, but still a two-thirds majority endorsement of the health research function as seen by the public.

Our third statement asked for agreement or disagreement with this presentation: "The privacy of personal medical records and health information is not protected well enough today by federal and state laws and organizational practices." In previous health and consumer privacy surveys, we have worded this statement both ways: sometimes we asked people to agree or disagree with the statement that privacy is "well enough protected," and the results come out the same. Fifty-eight percent of the public in this IOM survey said they do not believe there is adequate protection today for their health information, either from laws or from organizational practice. This suggests that HIPAA has not created a sense of comfort and security in the majority of the population. My sense is that this judgment is being driven in part by the constant reporting of health data breaches, such as theft of laptops with medical information, improper disposal of hard-copy medical records, and insiders leaking medical information. Such losses may not occur at the same incidence level as the theft of financial information or identity theft through capture of consumer data. But reporting of medical data breaches contributes, in my view, to the judgment of a national majority that their medical information is not effectively secured today.
Finally, we asked people to agree or disagree with this statement: "Even if nothing that identifies me were ever published or given to an organization making consumer or employee decisions about me, I still worry about a professional health researcher seeing my medical records." The public is split right down the middle: 50/50. Half agree with the sense that there is an exposure that worries them, and half are comfortable. Underlying this finding was probably the feeling that "if strangers are looking at my sensitive medical information, I am not quite comfortable with that." The full report shows that this is more strongly felt by people who have potentially stigmatizing health conditions, such as those who use mental health services, have HIV or sexually transmitted diseases, have taken a genetic test, and so forth. Demographics and health status would give some subsets of the public an even stronger than 50 percent concern about this.

Given the mission of the IOM committee that sponsored the survey, our prime focus was on how people would relate to health research per se. Consequently, we asked people how interested they would be in reading or hearing about the results of new health research studies, the causes and prevention of diseases, and the effectiveness of new medications and treatments. We cast the net widely and did not limit it to narrow, clinical trial-type health research. Matching other surveys, about three-quarters of the public (78 percent) said they were interested in tracking that kind of health research.

Perhaps the single most important focus of our study was when we asked people whether they were ready to have their personally identified health information used by health researchers and, if so, what kind of notice and consent they would want to have provided. The fact that this was an online survey enabled us to ask a detailed and carefully crafted question that described how health research is done and gave the arguments of health researchers in favor of general advance consent, or consents based on promises of confidentiality and human subject or Privacy Board oversight. We also put in comments of "some people" that only notices describing the researchers, the research topic, and the research result uses would ensure adequate privacy protection.

Having presented our lengthy question, we asked people to choose one of five alternatives that best expressed their view. These were randomly presented to mitigate any presentation-order bias. A minuscule 1 percent said that researchers should be free to use their personal medical and health information without their consent at all. We might characterize this group as "let it all hang out." Eight percent said they would be willing to give general consent in advance to have their personally identified medical or health information used in future research projects without the researchers having to contact them.
This small group might be characterized as a segment of the national population having "high trust in the research establishment." Nineteen percent said their consent to have their personal medical and health information used for health research would not be needed as long as the study never revealed their personal identity and was supervised by an Institutional Review Board (IRB). These respondents were ready to trust such general researcher assurances.

The largest group, 38 percent, equivalent to about 97 million adults in the population, chose the following response: "I would want each research study seeking to use my personally identified medical or health information to first describe the study to me and get my specific consent for such use." Clearly what is on the mind of this group is an insistence on knowing who is doing the research, what the topic is, and how the information is going to be used.

Finally, 13 percent said they do not want researchers to contact them or use their personal health information under any circumstances. This might be called the "no trust at all" segment of the public.

However, one in five, or 20 percent, of respondents simply could not make up their minds. The fact that they could not choose one of the five

alternatives suggests that a large group out there needs to be better informed, or to have the choices put to them in a way that they can recognize and then make a choice. A 20 percent nonresponse rate is quite unusual in policy-related survey research of this kind.

We asked those people who would require notice and express consent why they were adopting this position, providing four possible reasons. As one might expect, 80 percent chose "I would want to know what the purposes of the research are before I consent." Sixty-two percent said that knowing about the specific research study and who would be running it would allow them to decide whether they trusted the researchers. Fifty-four percent said they would be worried that their personally identified medical or health information would be disclosed outside the study, and 46 percent would want to know whether disclosing such information would help them or their family.

When we turned to what kind of harm the 38 percent believed could take place if personally identified health information were disclosed outside the study group, the answers primarily focused on discrimination. Privacy and discrimination values have always been closely linked: one claims privacy in order to protect oneself against being discriminated against in some benefit or opportunity. Here, results showed that people worry that distribution of their medical data could affect their health insurance, their ability to get life insurance, or their employment, or that it could result in their being discriminated against in a government program. The smallest number (33 percent) worried about embarrassment in front of friends, associates, or the public.

Now, here are some overall impressions about the survey results.
First, this survey confirms, as many surveys have shown, that large majorities of the public hold strong concerns over the privacy and handling of their personal health information, especially concerning secondary uses of the data outside the direct-care setting. A strong majority, 58 percent, do not believe that current laws and organizational practices provide adequate privacy protection. The majority generally trust health researchers (albeit researchers undefined as to what kind they are) to maintain confidentiality, but what some researchers might hope for—that a promise of nonidentification and IRB review would persuade the public to give advance general consent—is not where the majority of the public is ready to come out at the present time. Also, even though we told people that researchers were concerned about the heavy costs of obtaining advance notice and consent, or that this requirement might compromise the statistical validity of samples, that was not enough to persuade a majority. However, it is fair to say that surveys would get somewhat different numbers if different kinds of researchers and topics were specified, so this is a variable to be understood.

What are the implications of our survey for expanded health data uses? Clearly, we are in transition from a part-paper and part-electronic record

realm to an interoperable world of electronic health records, personal health records, and huge new online personal health data pools. This opens some potentially valuable public-good possibilities for health researchers. Privacy, however, is a make-or-break issue for whether we are going to be able to achieve those advantages from large-scale health data research through electronic communication and transmission.

Of course, privacy is not an absolute. Rather, privacy is a matter of balance and judgment, and it is very contextual. Still, unless we can create what the National Committee on Vital and Health Statistics called a new data stewardship responsibility for health data holders and secondary users, we are going to lose the balanced-privacy battle, with the risk of sharp limits being placed on using personal health data for very important health research.

What elements would provide a positive health privacy context for health research? First, we need new legislation. HIPAA is outdated, as many people have said. The late Senator Edward Kennedy proposed support for HIT and EHR systems but, already, bills have been introduced by Senator Patrick Leahy and Representative Ed Markey to add strong privacy protections to any bill that will support the health information technology cause. Without my endorsing any of those bills specifically, it is clear we will have to write a new code of privacy, confidentiality, and security into the legislation that is going to help to organize and finance EHRs.

Second, excellent models of voluntary patient-controlled privacy policies are being offered by some new repositories of personal health records. Microsoft's HealthVault is one example; Google Health has indicated it will do the same when it issues its health product shortly. Such models need to be encouraged and emulated by many others.

Third, we need independent health privacy audits and compliance verification processes.
Although no instrument is ready now to carry this out in the health information technology field, new organizations with the right mixture of nonprofit, for-profit, government, and consumer groups could be developed. Such meaningful audit and verification mechanisms are absolutely necessary for public acceptance and trust of the new large-scale health research enterprises.

Fourth, there are some new, easy-to-use technologies for implementing patient notice and choice—not "trust me, I am going to store your data, I will only give it to the people you want," but rather some new "switch, not store" programs. These will register patients and collect their privacy preferences. Then, they will connect data seekers—such as health researchers—with the data holders (providers, insurers, Regional Health Information Organizations, etc.) and facilitate the exchange of that information, without the data content ever being kept by the switch. This interesting idea

could revolutionize the ability of patients to make informed decisions about the use of their personal information in health research.

Fifth, we need to conduct serious field research into how privacy is unfolding in the EHR programs being developed. Researchers need to survey the patients involved in EHR programs as well as to talk, onsite and face to face, about what experiences they have had, what worries them, and whether and how those worries have been resolved. Otherwise, one is back at 10,000 feet, talking abstract principles about EHR programs and privacy satisfaction. It would be highly valuable to fund and manage a program of empirical studies of the impacts of EHR systems on privacy, confidentiality, and security values.

Finally, the health establishment needs to sponsor a major national educational campaign to promote privacy-compliant, evidence-based health research. Without such a national campaign, the danger is that the balance side—the public-good aspect of sharing patient medical data—will not be fully appreciated by the current privacy-sensitive public.

HIPAA IMPLICATIONS AND ISSUES

Marcy Wilder, J.D.
Partner, Hogan & Hartson, LLP

This paper addresses the HIPAA Privacy Rule (45 C.F.R. Part 164) and its effect on data research. As healthcare and HIT systems evolve, experience suggests that modifications are needed to strike the proper balance between protecting patient privacy and making data available for research to improve healthcare quality and to lower costs. Early advocacy efforts by the research community resulted in changes to the Privacy Rule that lightened some of the administrative burdens on healthcare providers and plans associated with making data available for research purposes. In addition, HHS revised the Rule to permit disclosures of limited datasets for research purposes.
Identifying and developing policy alternatives for addressing the most significant barriers that remain, including those related to future unspecified research and data deidentification, will be essential to promoting the research enterprise.

The HIPAA Privacy Rule was the first comprehensive federal health privacy regulation. At the time of its drafting, HHS was focused on protecting privacy and ensuring that information would continue to be available within the healthcare system for appropriate uses. HHS set a baseline, making clear that health information could be used freely for treatment, payment, and healthcare operations. Policy makers were also clear that before health information could be used for marketing, an individual's

authorization would be required. The extent to which health information should, as a policy matter, be made available for research was far less clear. HHS, other federal agencies involved in the HIPAA rulemaking, healthcare stakeholders, and consumer advocates did not agree among themselves or with each other. Many believed research should not be placed in the same category as treatment, payment, and healthcare operations. But at the same time, they did not believe that individual authorization should always be required before protected health information (PHI) could be used for research purposes.

Some in the research community argued that HIPAA does not and should not regulate research per se and that the Privacy Rule should simply exempt research uses and disclosures.2 For nearly 25 years the Common Rule for Protection of Human Subjects ("Common Rule")3 had regulated research privacy. IRBs were already tasked with determining whether protocols contained provisions adequate to protect the privacy of subjects and the confidentiality of data. The notion was to leave the current Food and Drug Administration (FDA), Office for Human Research Protections, and state regulatory frameworks in place and undisturbed by HIPAA. This argument, however, was ultimately rejected by regulators. HIPAA restricted access by researchers to PHI, which at that time was held by healthcare providers and health plans. These HIPAA-covered entities would need guidance on how they were to treat uses and disclosures of PHI for research purposes. In addition, although longstanding protections were in place, some privacy advocates believed current protections were not sufficient. When HHS ultimately did address issues related to research uses and disclosures, it did not attempt to harmonize HIPAA with the existing regulatory framework for human subjects' protection. It simply added yet another layer of regulation.
By 2002, 2 years after the Final Rule was issued, there was enough experience to suggest that the HIPAA Privacy Rule was unnecessarily creating barriers to medical research and that some provisions needed to change. The research community focused a great deal of effort on the deidentification safe harbor and the fact that data stripped of all requisite fields were not useful for many types of important research. The Department's response was to add provisions permitting the disclosure of limited datasets for research, provided that a HIPAA-compliant data use agreement was in effect. Under HIPAA, as initially promulgated, before information could be freely used for research, it needed to be deidentified under strict standards.

2 67 Fed. Reg. 14776, 14793 (Mar. 27, 2002).
3 45 C.F.R. § 46.101.
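The distinction between full deidentification and the limited-dataset disclosure described above can be sketched in a few lines of code. This is a hypothetical illustration only: the field names and identifier sets are invented for the example and are not the Rule's actual enumerated categories, and a real limited-dataset release would also require the prescribed data use agreement.

```python
# Hypothetical sketch of the two deidentification paths discussed above.
# Field names and identifier sets are illustrative, not the regulation's
# actual enumerated categories.

# Direct identifiers removed under either approach.
DIRECT_IDENTIFIERS = {"name", "address", "ssn", "phone", "email", "mrn"}

# Fields a limited dataset may retain (subject to a data use agreement):
# ZIP Codes, dates of service, and other dates related to the individual.
LIMITED_DATASET_RETAINED = {"zip_code", "service_date", "birth_date"}

def strip_identifiers(record, limited=False):
    """Return a copy of `record` with restricted fields removed.

    limited=False approximates full safe-harbor-style deidentification;
    limited=True approximates a limited dataset (ZIP and dates kept).
    """
    to_strip = set(DIRECT_IDENTIFIERS)
    if not limited:
        to_strip |= LIMITED_DATASET_RETAINED
    return {k: v for k, v in record.items() if k not in to_strip}

record = {
    "name": "Jane Doe",
    "mrn": "0012345",
    "zip_code": "20001",
    "service_date": "2007-03-14",
    "diagnosis": "J45.40",
}

deidentified = strip_identifiers(record)           # clinical fields only
limited = strip_identifiers(record, limited=True)  # also keeps ZIP and date
```

The point of the sketch is the trade-off the chapter describes: the fully stripped record loses the geographic and date fields that many studies need, while the limited dataset retains them at the cost of a contractual obligation to protect the data.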

In response to concerns expressed by the research community, HHS introduced the notion of a limited dataset, which is essentially deidentified data plus ZIP Codes, dates of service, and other dates related to the individual.4 If a party wanted to use this partially deidentified information for research, it could enter into a data use agreement, the contents of which are prescribed by the regulation, promising to protect the information. Once the agreement was executed, the dataset could be released for research purposes. These provisions did enable researchers to obtain health data more easily. Although there is a question as to whether these provisions are sufficient, they clearly helped.

In addition, in 2002 HHS provided an alternative to the accounting-of-disclosures requirement.5 That requirement mandates that when covered entities such as hospital systems and health plans disclose information for research purposes pursuant to an IRB waiver, they must keep an accounting of these disclosures and make it available to individuals on request. Keeping individualized records about which records were disclosed for which research protocols operating under an IRB waiver of consent was seen as quite burdensome by the covered entities. As a result, many covered entities, and in particular smaller hospitals and those not affiliated with an academic institution, were restricting access to data.

HHS came up with an alternative: instead of keeping track of every time data were disclosed pursuant to an IRB waiver, an institution could keep a list of all the research protocols for which information was disclosed pursuant to an IRB waiver for research purposes. Anyone requesting an accounting of disclosures would be given the entire list, which for institutions such as an academic medical center could be voluminous and burdensome to maintain.
On request for an accounting of disclosures, the list would be provided and the individual would, in effect, be told that perhaps his or her information had been disclosed for one of the protocols on the list. The extent to which this is privacy protective or helpful to the individual is questionable at best. It seems to be an example of a privacy requirement that imposes cost and burden yet does not deliver any meaningful privacy protection. Nonetheless, that is the current standard.

Experience over the past few years has helped highlight the need for further changes. The landscape surrounding research data has changed considerably, due in large part to significant technological changes that permit data aggregation on a scale that was previously unimaginable. In addition, emerging technology used by Google, Microsoft HealthVault, Dossia,

4 45 C.F.R. § 165.514(e).
5 45 C.F.R. § 164.508(c)(1).
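The two accounting approaches described above can be sketched roughly as follows. This is a simplified illustration only; the patient identifiers, dates, and protocol numbers are invented, and a real accounting must capture considerably more detail than this:

```python
from collections import defaultdict

# Approach 1: a per-disclosure log -- one entry for every disclosure made
# under an IRB waiver, retrievable per patient on request.
per_disclosure_log = defaultdict(list)  # patient_id -> [(date, protocol), ...]

def log_disclosure(patient_id, date, protocol):
    per_disclosure_log[patient_id].append((date, protocol))

# Approach 2: the 2002 alternative -- keep only the list of protocols for
# which any information was disclosed; every requester receives the whole list.
protocol_list = set()

def log_protocol(protocol):
    protocol_list.add(protocol)

log_disclosure("pt-001", "2007-05-01", "IRB-2007-17")
log_protocol("IRB-2007-17")
log_disclosure("pt-002", "2007-06-12", "IRB-2007-22")
log_protocol("IRB-2007-22")

# Accounting for pt-001: approach 1 is precise, while approach 2 tells the
# patient only that one of the listed protocols might have used the data.
print(per_disclosure_log["pt-001"])  # [('2007-05-01', 'IRB-2007-17')]
print(sorted(protocol_list))         # ['IRB-2007-17', 'IRB-2007-22']
```

The sketch shows the trade-off the text describes: the protocol list is far cheaper for a covered entity to maintain, but it tells the individual almost nothing about his or her own records.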

important element of patient satisfaction. At the same time, healthcare providers are rich sources of data, which, when properly used in research, have the potential to greatly enhance the quality of clinical care and may result in better clinical outcomes, improved efficiencies, cost savings, or other medical advances.

Healthcare providers have an interest in each of these goals, but perceived and actual privacy or security hurdles, patient trust considerations, potential legal consequences, and actual costs associated with retrieval of data pose barriers to releasing data for research purposes. In particular, healthcare providers often find the privacy and security requirements of HIPAA confusing, and health information data custodians and researchers sometimes have limited awareness of HIPAA's data access and disclosure requirements.

Furthermore, even when access and disclosure are permitted under HIPAA, minimum necessary standards, accounting for disclosure obligations, and other patient considerations may impede the willingness to make certain disclosures of identifiable information. In addition, it is often costly for healthcare providers to divert resources and personnel away from their primary clinical care activities to attend to administering system and records access/disclosure activities for research purposes. Although technological solutions have the potential to mitigate some of these costs and resource burdens, at the current time, few such tools adequately address all of a healthcare provider's privacy requirements. In fact, often the implementation of new information technology brings with it additional complexities with respect to the ability to properly control research-related access.
As a result, healthcare providers are often more motivated to protect patient privacy, to respect physician-patient relationships, to minimize the administrative impact on data retrieval, and to minimize legal risks and customer complaints than they are to accommodate the needs of researchers. Absent adequate financial or strategic incentives, regulatory amendment, and greater appreciation of the public benefits of research, access to identifiable data for research will remain a challenge.

MedStar Health is the largest provider of healthcare services in the mid-Atlantic area, composed of eight hospitals, including community-based hospitals and academic medical centers, as well as numerous satellite clinics and outpatient facilities. In the District of Columbia, we own and operate Georgetown University Hospital, the National Rehabilitation Hospital, and Washington Hospital Center. Collectively, our system has about 25,000 employees and at least 5,000 affiliated physicians. System wide, we annually serve some 158,000 individual inpatients, have 787,000 inpatient days, treat 1,561,000 individuals on an outpatient basis, and make 208,000 home health visits. Therefore, the MedStar Health community is a rich source of diverse data that are potentially of great use to research. In

that context, this paper will reflect on some of the institutional challenges that we have balancing patient privacy interests with providing access for research purposes.

At MedStar Health we have a vision of being the "trusted leader in caring for people and advancing health," and we have long had a commitment and philosophy of putting the "patient first." As a result, our leadership feels strongly that beyond what the law says, we are devoted to protecting the interests of our patients and their information, and we are committed to promoting the trust of our patients by protecting their privacy. At the same time, we have a strong commitment to innovation, the promotion of research, and a shared vision of "advancing health" through our education, technology, and research capabilities.

As a part of MedStar's operations, we regularly create and maintain a number of databases and record sets into which patient information is placed, processed, and stored. Given the wide range of services provided by MedStar Health and the diverse patient base we serve, both the volume and the variety of data within these resources are large. We therefore are approached on a regular basis by researchers outside of our covered entity who request both large data sets as well as ongoing open access to patient information.

One area of significant concern, therefore, is how to most appropriately release and provide access to information to researchers who are not members of our workforce. Because these databases, repositories, and record sets are usually created primarily for treatment, healthcare operational purposes, or billing and financial purposes—not for research purposes—they often lack a built-in framework for addressing the needs and requirements associated with research-related access as well as the obligations we have for research-related disclosures.
Furthermore, even when record sets are created in anticipation of potential research, they are often not designed to adequately facilitate compliance with privacy and security requirements.

Consistent with trends across the healthcare industry, MedStar is in the process of transitioning from being a largely paper-based organization to one with electronic records. We actually have four or five separate, unique, stand-alone, traditional electronic medical records. In addition, MedStar has also developed a product, which was ultimately bought by Microsoft, that aggregates data from disparate systems, and which has led to an ongoing development relationship between MedStar Health and Microsoft. Although these systems have greatly facilitated our healthcare activities in many ways, one of the largest challenges we have with respect to research interests is getting information out of these systems in a cost-effective format that is useful to researchers.

Data that we collect, capture, and hold—whether in electronic or paper format—are generally meant for our own internal operational and clinical

purposes, and often are not easily retrievable in a format that is usable for research purposes. Even when the data are in electronic format, in many cases the way in which information is accessed and used for normal operations purposes (e.g., the types of queries made, the specific data points that need to be viewed, and the actual ways in which the data will be employed after retrieval) differs greatly from the manner in which researchers wish to interact with it. Because these systems have generally been designed and implemented with operational needs in mind, as opposed to the needs of researchers, the sort of retrieval, aggregation, and analysis tools necessary to researchers are often not readily available within these systems. As a result, the extraction of data from our electronic systems often requires a fairly manual and laborious manipulation process. To optimally meet researchers' needs and to contain costs for covered entities would, for some systems, require the development of new software interfaces and tools, all of which require investments of time and resources.

Furthermore, some of the information that MedStar collects or creates is proprietary information that we are unwilling to share in unfiltered/unredacted format, if at all. For example, we often get research requests for our billing and coding information. Although it may be possible to remove such confidential, proprietary information from a dataset intended for research use, it can be difficult to dissociate this information from what we are willing to share. In many cases this removal would require intelligent software capable of making fine-grained discriminations and would be very costly.

Moreover, concerns are sometimes raised by healthcare administrators that the goals of research are incompatible with the goals of being a leading community-based healthcare provider.
Some of the goals of research—such as furthering scientific progress, translating research into improved clinical care, improving society, maintaining scientific integrity, and perhaps pursuing technology transfer opportunities—are obviously all valuable, and no one denies that having appropriate information available to use for research is a public good. Nonetheless, such goals can sometimes run counter to the immediate goals of healthcare providers, which are fundamentally to provide quality health care to patients that results in high levels of satisfaction, trust, and confidence, and to do this all on increasingly slim operational margins. For many healthcare providers, these goals (or at least the processes involved in achieving these goals) appear incompatible.

Beyond logistical barriers and differences in goals, moreover, HIPAA poses further obstacles to sharing information with outside parties for research purposes. The Privacy Rule continues to be confusing to many healthcare providers, who often view its requirements as arbitrary and overly complex. Healthcare administrators often face the burden of too many forms and policies that are generated as a result of our responsibilities to protect

patient privacy. Our administrators complain that they have inadequate resources to review requests and to assist in providing requested information, and this potentially results in reduced access to records and data. Furthermore, as with any large workforce, we experience frequent staff turnover, which results in a continual challenge of adequately educating our administrators and record custodians about how and when they can appropriately release health information for research-related activities. Similarly, researchers and their staff often do not understand or fully appreciate the requirements that we must fulfill with respect to the control of health information or the complex documentation requirements relating to the release of health information.

As an example, researchers may understand that in order to review record sets for the purpose of identifying prospective participants, they must obtain a waiver of the requirement for authorization from an IRB, but they are rarely aware (or may not be concerned) that any access to PHI by non-MedStar research personnel that occurs under this waiver triggers an accounting of disclosure requirement on the part of the records custodian. As a result, they may not take proper steps to ensure that they limit the scope of their requests, limit which other persons receive the screening information, or adequately notify the records custodian of the involvement of external personnel and take steps to facilitate our accounting requirements. Although IRBs can play an important role in ensuring that researchers properly address not just their own privacy requirements, but those of the information provider(s), IRBs need not be affiliated with the Covered Entity to grant a waiver of the authorization requirement and may not be entirely concerned with the Covered Entity's obligations.
In addition, many IRBs also have regular turnover and have many members, including unaffiliated community representatives, who sometimes do not understand the requirements for protecting patient privacy. These issues of affiliation and education—whether of staff members, researchers, or IRB members—add to overall concerns in maintaining the trust placed in us by our patients.

To provide a few concrete examples of the challenges posed by the intersection of privacy concerns and research interests, I would like to focus briefly on a few specific issues that MedStar has encountered: (1) the need to adhere to different standards depending on who is requesting the information; (2) the potential for needing to honor patient restrictions; and (3) issues related to the relationship between an individual physician and his or her patient and the hospital in which the physician practices. These cases illustrate some of the difficulties inherent in trying to bridge the tensions between these two interests.

As touched on briefly above, depending on the specific relationship

between an individual researcher and the Covered Entity whose health information he or she wishes to access, HIPAA requirements associated with access differ. Although researchers are permitted to access PHI for research purposes without an authorization under the Privacy Rule, any time a researcher who is not a member of that entity's workforce does so, it is considered to be a disclosure that the entity must track and be able to account for on request. This is extremely burdensome for healthcare providers, particularly in the paper world, and often necessitates physically placing a marker or informational sheet in each record accessed. One might think this would be easier in an electronic world, but in reality it is not! Most of our electronic systems (especially our billing and other operational systems) do not include functionality that allows the adequate tracking of these disclosures with the level of associated information and detail required by the Privacy Rule.

HIPAA's alternative accounting mechanism, which provides for group or bulk accounting in cases where more than 50 disclosures are made for an individual study, is not really a viable alternative for a large, decentralized, and integrated healthcare organization. Without a central clearinghouse for evaluating data requests and/or registering the individual studies for which requests are made, it is difficult to confirm which studies may have had information released. For instance, for appropriate clinical efficiencies, some of our clinical systems allow physicians to access health information regardless of where the patients were seen in our system. As a result, it is possible that a researcher in Baltimore could request and access patient information from a system accessible at his or her location (i.e., in one of the Baltimore facilities) relating to a patient who was not seen in that Baltimore facility.
Consequently, if a patient from the non-Baltimore facility requested an accounting of disclosures, it may be challenging to determine whether an accountable disclosure was made by the Baltimore facility. Dealing with this situation effectively would require the centralization of all research and other requests, so that all requests are handled by one central administrator. Unfortunately, this would be extremely burdensome and is not currently a viable option for us, because we have received potentially thousands of separate requests from thousands of different studies, resulting in hundreds of thousands of research-related disclosures over the course of the prior 6 years.

The issue of accounting for disclosures is one where researchers themselves could do much to help institutions in meeting the burdens associated with their privacy requirements and, in so doing, increase institutions' willingness to provide information for research purposes. Among the ways this can be done are: (1) developing or subsidizing the development of disclosure tracking software; (2) subsidizing staff positions dedicated to meeting accounting requirements (records custodians are often severely overworked

and unable to shoulder this); and (3) personally providing required information sheets or disclosure data where necessary. These strategies and "unforeseen" costs associated with data screening and recruitment should be considered by researchers when calculating the costs of conducting research at the time of grant application or protocol development.

Another difficulty associated with the research use of patient information involves the potential for "patient restrictions"—restrictions that patients place on the use or disclosure of their own health information. Under the Privacy Rule, patients are permitted to request a restriction on how their health information may be used or disclosed. However, the Privacy Rule does not require a Covered Entity to accept that restriction request and, in fact, most healthcare providers try not to, because it is extremely burdensome to honor these requests. Even if such restrictions are accepted, healthcare providers are not necessarily culpable under HIPAA if the release of information is for research purposes.8 Nonetheless, we believe that if we make a commitment to our patients, we are ethically obligated to try to fulfill it.

Though most Notices of Privacy Practices require that any request for restrictions be placed in writing, and though most Covered Entities try to educate their staff not to accept a restriction unless it is in writing and clearly agreed to, it is possible that physicians or other staff members occasionally and informally make commitments and promises to their patients that their health information will not be used for any purposes except their own treatment unless the patients otherwise consent. In some cases, the physician or staff member may actually sequester or flag a file in an attempt to limit access to the information.
Unfortunately, because billing systems, registration systems, and other clinical systems are often highly integrated, it is often difficult for healthcare providers to completely restrict who accesses and uses the patient's identifiable health information. When the patient is contacted by an outside researcher (even if the researcher legally and properly obtained the patient's information), the patient will obviously feel betrayed and lose confidence in his or her healthcare provider. Given the number of employees that can potentially access any given patient's records, it is difficult to ensure that a pledged restriction made by one staff member or physician is known and adhered to by others.

This issue, furthermore, is inherently resistant to a centralized solution because of the individual nature of the patient–provider relationship. Even with a centralized office for accepting and implementing patient restrictions in place, individual physicians could still make personal agreements or commitments with patients that do not get propagated across the system. This challenge is, similarly, more difficult for researchers themselves to help mitigate than, for example, the accounting of disclosures

8 45 C.F.R. 164.522(a)(1)(v).

requirement, because the researcher has little ability to discern where restrictions may be in place if they have not been adequately marked by those who accepted the restriction. As a result, completely confirming that healthcare providers are not violating any individualized commitments prior to making a research-related disclosure would literally require confirming such with each individual treating provider (obviously an insurmountably burdensome task).

Finally, another example of a problematic barrier has to do with the physician–patient relationship. A small but vocal community of our physicians has strongly objected to our allowing health information about "their" patients to be accessed by researchers. Some feel strongly that researchers are effectively trying to "cherry pick" their patients, because some of these researchers are also clinicians. They have also argued that this violates the trust of their patients, because patients may not understand why some outside researcher with whom they have no existing relationship is contacting them. It is argued that this may be perceived as similar to providing their contact information for "cold-calling" purposes. An additional concern is that these patients might get enrolled in trials that contraindicate the care their personal physician advocates. In fact, these objections run so deeply in some cases that some referring physicians have suggested, "If you do not protect my patients' information, I am not going to refer patients to your hospitals any longer."

This, again, is an area in which researchers can play a personal role in mitigating concerns. If researchers are prepared to engage in meaningful discussion with treating physicians about the value and benefit of proposed research and accept the expressed concerns, they can help to work around these potential barriers.
For instance, rather than screening patient records without the knowledge of treating physicians and contacting patients themselves, researchers can work with physicians to identify potentially eligible patients and then ask the physicians to speak with them about the proposed research. This can alleviate both the potentially invasive feeling patients may have on being contacted by a stranger about research and physicians' concerns that their patients may be recruited, without the physicians' knowledge, into research that they do not believe is commensurate with the care they provide.

Patient attitudes also play a key role in determining whether health information can or should be released for research purposes. Some patients are altruistic and have no difficulty sharing all their identifiable health information if it will better serve the community. Others are much more protective of their individual information because of fears over misuse, discrimination, or social stigma. Some patients are comfortable releasing some, but not all, of their health information for research purposes. However, although this could be a means of balancing privacy interests against

research interests, many researchers do not view this as an effective option because it potentially distorts the available data sources and could skew data results. Moreover, even in an electronic world, technical limitations can function as barriers to even this limited type of research access. As discussed above, many systems do not have built-in abilities to easily capture data in a format useful for research purposes. Additionally, many systems lack the functionality that would be necessary to allow a patient to partially opt out of disclosures for research purposes (e.g., for portions of the record related to mental health or substance abuse).

Unfortunately, most healthcare providers have no cost-effective way of protecting just limited portions of the patient record, even when individuals feel comfortable that the rest of their file could be used for research purposes. Eventually, we may get to a point where we can make such distinctions, but for now such requests put us in the untenable position as a Covered Entity of having to assume the burden and cost of basically pulling records or reports, reviewing eligibility criteria, and spending the necessary time to compile all this information to be used for research purposes. Under such circumstances, many administrators legitimately question what benefit these burdens provide to our patients and to our institutions. If researchers are willing to expend the time, effort, and costs necessary to enhance these systems to better meet these needs, they can potentially go a long way toward increasing institutional support for research disclosures.

All of this is not to suggest that healthcare providers do not have any commitment to research at all. Healthcare providers recognize the value and the public good of research. They are committed to research, especially when it is consistent with their own mission or values or when there is a direct benefit to them.
Obviously, however, they do not want it to interfere with patient care. They do not want it to be overly burdensome or costly, thereby detracting from the resources available for activities more directly related to patient care. They do not want it to interfere with their relationships with their physicians or the relationships between the patients and the physicians. In addition, of course, all healthcare providers have to be concerned about legal risks and compliance with applicable laws.

Recognizing the importance of research in furthering the practice of health care, and in improving society as a whole, MedStar has undertaken a number of different efforts to try to accommodate researchers in a fashion that balances our privacy concerns against the administrative burdens associated with research-related requests. One thing we have done is agree in some cases to effectively perform screening and recruiting activities on behalf of researchers. This includes screening participants—assuming there are no objections from patients or physicians—in order to obtain authorization on behalf of the researcher or to simply provide the subject with information about the research project and let him or her contact the researcher

directly. In theory, this avoids the accounting obligation, and could be more sensitive to some of our patients, but it requires time, training, and effort from our staff. In some cases, we have asked researchers for compensation to offset some of those costs. This is obviously not preferred by all researchers, but it is a step toward closer engagement between us, as a healthcare provider, and the research community as we work to foster coordinated EHR user organization evidence development work.

Another approach we have tried with limited success is to engage the researcher effectively as a business associate to handle all the screening, recruitment, and internal administrative processes that we have in place. This allows the researcher to recruit patients directly, avoids the accounting of disclosure obligations, and shifts the cost burden to the researcher. Depending on the HIPAA mechanism the researcher is using, the PHI may need to remain within our property as a Covered Entity. If the researcher does not obtain an authorization, the PHI would need to be returned or destroyed. This solution, unfortunately, is not appropriate or viable in all situations and, again, is not always palatable to researchers themselves.

Other data access options include Limited Dataset/Data Use Agreements. This option would generally permit researchers to have a limited set of identifiable health information, without a patient authorization and without the accounting of disclosures responsibility, but it still requires resources of the Covered Entity to create the Limited Dataset and to negotiate the Data Use Agreement with the researcher. Our experience has shown that this option currently has limited effectiveness for the majority of research conducted at our facilities because most of our research requests are for more complete, identifiable datasets.
As a result, the Limited Dataset will not be a truly useful or viable option for us absent systems that can cost-effectively produce the data and absent an amendment to the Privacy Rule that greatly expands the number of identifiers available through this vehicle.

Going forward, we see several potential avenues for progress. We would like to see HIPAA amended to accommodate the needs of researchers while minimizing the burdens on Covered Entities. Eliminating the accounting of disclosures obligation would go a long way toward reducing our costs and burdens. Expansion of the Limited Dataset concept could potentially assist both researchers and Covered Entities if the Covered Entity has systems that can cost-effectively produce data and the Limited Dataset vehicle is greatly expanded to include identifiers that would permit screening and recruitment activities.

In addition, as vendors and suppliers of our data systems and electronic medical records systems become more sophisticated in the potential applications of this information, the design of operational databases

and electronic records will allow us to more generally protect the patient information that needs to be protected due to applicable laws or commitments to our patients, while making available information that can and should be available for research purposes. With respect to the physician–patient relationship, continued work is necessary to build communication and trust between all parties, and opportunities exist to further educate treating physicians about research opportunities. With respect to technology, interoperable data exchange may ease some of the technological burdens we face and could result in greater access to health information by researchers, but the details and potential barriers associated with access to data exchanges remain uncertain and may require further legal clarifications. Perhaps most importantly, an increased awareness and sensitivity on the part of researchers to the requirements, burdens, and costs associated with healthcare providers' provision of information, and a willingness to share in those costs and burdens, can greatly aid in overcoming the obstacles that currently impede research efforts.

REFERENCES

CED (Committee for Economic Development). 2008. Harnessing openness to transform American health care. http://www.ced.org/docs/report/report_healthcare2007dcc.pdf (accessed August 21, 2008).

Westin, A. E. 2008. How the public views privacy and health research: Results of a national survey commissioned by the Institute of Medicine Committee on Health Research and the Privacy of Health Information: The HIPAA Privacy Rule. http://www.patientprivacyrights.org/site/DocServer/Westin_IOM_Srvy_Rept_2008_.doc?docID=3181 (accessed August 20, 2008).
