

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.



Below is uncorrected machine-read (OCR) text of the first 10 and last 10 pages of this chapter. Because it is uncorrected material, it should be considered a useful but insufficient proxy for the authoritative book pages.

Professional Codes and Guidelines in Data Sharing

Robert F. Boruch and David S. Cordray

INTRODUCTION

This paper reviews available information about professional codes and guidelines that are pertinent to data sharing. Our working definition of codes here includes statements of principle, conduct, or rule that bear on the rights and responsibilities of the parties involved in data sharing. Parties at interest here include primary and secondary analysts and the professional societies and associations to which they may belong; federal, state, and local agencies that sponsor or conduct research; and the editors of professional journals. While this is not a complete listing of those involved in data sharing, nor is the definition of professional practice satisfactory, we believe it suffices for this discussion.

Robert F. Boruch is a professor in the Department of Psychology and the School of Education and codirector of the Center for Statistics and Probability, Northwestern University. David S. Cordray is an assistant professor of psychology in the Division of Methodology and Evaluation Research, Department of Psychology, Northwestern University. Background research for this paper was supported by a stipend from the National Science Foundation to Northwestern University, Center for Statistics and Probability.

International codes are discussed first, followed by specialized codes generated by disciplinary associations. Laws and regulations constitute a backdrop for any professional code, so they too are discussed briefly. Some of the illustrations of actual practice mentioned here are discussed in greater detail in the other papers in this volume. The conclusion of the paper is dedicated to a brief discussion of the adequacy of professional guidelines.

The discussion in this paper focuses on codes that concern data sharing. These codes often include provisions regarding privacy of individuals or of institutions, standards for reporting, and other related topics; they are given far less attention here than those bearing directly on data sharing. More thorough reviews of privacy-related codes and guidelines are discussed elsewhere, e.g., Boruch and Cecil (1979).

INTERNATIONAL COMMITTEES AND ORGANIZATIONS

Bellagio Principles

The broadest set of guidelines on data sharing appears to be the so-called Bellagio principles (see Exhibit A). The principles evolved from a conference among university and government scholars and bureaucrats from five countries: the United States, United Kingdom, West Germany, Sweden, and Canada. The meetings, organized by historian and lawyer David Flaherty at Bellagio, Italy, resulted in a statement of 18 general principles on which all participants could agree (see Flaherty, 1978). The principles have been published in at least five journals. Though they have no legal standing, they do serve as a framework for international agreements and codes of good practice.

The Bellagio principles endorse the idea of provision of government data to individual researchers or research institutions for legitimate research purposes. They were not designed to apply to the individual researcher's sharing his or her data with others, but might be considered for adoption to this as well.
They cover three general aspects of data sharing: the conditions under which sharing should occur, modes of data release, and responsibilities of those who receive data. With respect to conditions, the principles endorse the idea of the broadest practicable access to government data by nongovernment researchers or research organizations, recognize the legitimacy of limited constraints on access necessary to achieve a balance between public and researcher interests, and endorse the idea of statutory privilege for data collected primarily for research purposes. Eight of the principles focus on modes of data release. They include suggestions about distribution of data at the lowest possible level of aggregation when microdata are required and distribution of public-use sample tapes with any individual identifiers removed.

Two principles consider the fact that records on identifiable individuals and complete data rather than public-use samples are essential for certain kinds of research. The use of special techniques to protect against deductive disclosure, through customized user service procedures, for example, is acknowledged. One principle addresses the matter of linking files from independent sources when privacy is an issue, and one considers the distinction between administrative and research records.

The remaining principles focus attention on the responsibilities of researchers and other parties. They address the need for statisticians and researchers to contribute to policy and legal definitions of privacy and enumerate simple ways to meet public concerns about privacy and confidentiality in the collection and utilization of individual data. (Those ways include informed consent, public education, and provisions for public knowledge of data uses, among others.) One principle urges professional societies to devise codes of conduct. Another is devoted to ensuring that access is not discriminatory and that appeals processes are available in the event of conflict over access. The final principle places responsibility for proper conduct on users of microdata by encouraging researchers to sign written agreements for the protection of confidentiality.

Statements of Government-Related International Organizations

Various government-related international organizations have considered, although not necessarily adopted, guidelines on transnational data flow. Many of those guidelines are very general in that they do not distinguish between commercial exchanges or use of data and exchanges of research data.
Many existing guidelines are less relevant for physical science and engineering data, since those data do not concern records on identifiable individuals and the guidelines stem invariably from concerns about privacy and confidentiality. The practices and procedures specified by three organizations are summarized briefly here.

The Organization for Economic Cooperation and Development (OECD) has accepted for consideration a set of guidelines on data protection submitted by the United States. The guidelines constitute a set of principles of fair practice regarding records on individuals, and they are designed primarily to protect individuals' interests within member states. So, for example, the principles state that individuals should have a right of access to their records in personal data record-keeping systems and a right of correction, and that there should be explicit limits on methods of collection, use, and external disclosure of records. Personal implications of transnational data flows are considered in recommendations that privacy law or policy be created and enforced and that

codes of conduct be fostered. The OECD statements that are most pertinent to data sharing include the following (Organization for Economic Cooperation and Development, 1981):

(5) Governments should undertake to ensure, to the greatest extent possible, the uninterrupted free flow of information.

(6) Governments should undertake to avoid the unjustified disruption of international trade patterns and the creation of nontariff barriers that would interfere with transnational data flow.

(7) Governments should refrain from restricting the import and export of data unless doing so is essential to national security.

The other statements call for more cooperation on policy and law governing individual privacy. Though the nature and function of the data being considered is not explicit, the material supplied by expert groups justifies interest in the area by arguing that transborder movements of personal data are essential to economic, scientific, educational, and social development (Katzan, 1980:148).

The Council of Europe's preliminary draft convention on automated records focuses on maintenance and privacy protection for automated administrative records. There is no explicit distinction between research and administrative functions of records, no reference to sharing such data for research, and no recognition of exchanges among researchers. The emphasis appears to be on commercial uses despite the lack of distinctions (e.g., a German firm's processing an Italian firm's records or vice versa).

The European Science Foundation's (1980) "Statement Concerning the Protection of Privacy and the Use of Personal Data for Research" contains six basic principles dealing with privacy. The statement itself was drawn up partly in reaction to OECD and Council of Europe directives that emphasized restrictions on data access in the interests of institutional and individual privacy. Section 1.4 of the statement (p. 5) directs special attention to data sharing: "Freedom of research presupposes the broadest possible access to information. Legislation should therefore, besides specifying the conditions under which personal data may be used for research, ensure access to the information needed." Reuse of such data is considered in two sections (2.10, 2.11) that bear on secure storage in a centralized research archive.

As can be seen, the statements considered or issued by international organizations such as the ones discussed here vary greatly. Except for the statement of the European Science Foundation, they do not recognize any special characteristics of scientific data or records and provide no special guidance on sharing this kind of information. This omission is likely to lead to problems, as evidenced by difficulties engendered by Sweden's Privacy Act of 1973 and the U.S. Privacy Act of 1974: needless restrictions on collection and disclosure of records whose function is solely research (see Mochmann and Muller, 1979).

INFLUENCE OF LAW ON DATA SHARING

Law and government rules within a country constitute broad limits on the conduct of researchers. They can impede or enhance the sharing of data generated in individual projects. More generally, they affect the extent to which independent researchers can obtain access to records maintained by government for research, policy, or management purposes. The influence of the U.S. Privacy Act and Freedom of Information Act (examined by Cecil and Griffin, in this volume) and of the Tax Reform Act and other legislation illustrates the complex nature of the problems. Here we confine attention to several recent studies of relevant law in developed countries. Rules and guidelines issued by specific agencies are considered later in the paper.

Flaherty's (1979) study, supported by the Ford Foundation, examined five countries: Britain, the United States, Canada, Germany, and Sweden. It covered the legal framework and theory underlying privacy law for each country, paying special attention to legal restrictions on access to government data by social scientists. The legal and institutional mechanisms for disseminating public-use tapes and other microdata are described. The examination of U.S. rules is based on the law, regulation, and practices of the National Center for Health Statistics and the Office of Research and Statistics of the Social Security Administration (both a part of the U.S. Department of Health and Human Services). The Mochmann and Muller (1979) monograph includes reports from the five countries covered by Flaherty as well as Norway, Denmark, Italy, Holland, and Belgium.
The country reports in both of these works give a general description of privacy legislation and law governing researchers' access to data and indicate whether such laws distinguish between administrative and statistical records, define anonymous and identifiable records, and provide for researchers' use of data (including for sampling of individuals as a data base, linkage with other archives, etc.). Law, regulation, or practice on conditions of access is also discussed for each country.

One of Flaherty's (1979) major conclusions is that statistical information is often guarded on privacy grounds to such an extent that research needs in many fields are not being met. His conclusions and recommendations cover policy, rules and regulation, public relations, dissemination of data, and other matters.

The largest compendium of state laws that bear on access to data for research purposes has been developed by Robbin and Jozefacki (1982). It covers vital, health, and social services records, the laws governing privacy of individuals on whom records are kept, and researchers' access to those records.

About 350 statutes are included for all 50 states. Work by Sasfy and Siegel (1982) is similar in spirit but focuses on criminal justice agencies and their record practices, including disclosure of police, court, and other records to researchers. It also covers all the states and more than 130 agencies.

As one might expect, attention to disclosure for research purposes varies considerably. Most privacy-related statutes do not include explicit provision for access by researchers. The minority that do most often provide for access by delegating access authority to an agency director's discretion. According to agency officials interviewed in these studies, they base their judgments on law and the nature of the research for which the records are requested (e.g., quality of research design and relevance to the agency's missions).

Some laws are explicit, partly as a result of federal models. California's Information Practices Act, for example, permits legitimate researchers access to medical, psychiatric, and psychological records, provided that the identified information is essential to the research; further disclosure is forbidden. Similarly, researchers involved in mental health work are granted access under certain restrictions to state agencies providing relevant services.

The works by Robbin, Sasfy, and others demonstrate that there are many fewer federal and state laws that provide for researcher access to data than there are laws that govern privacy of individuals, collecting and storing information, and so on. When disclosure is made legally possible, it is most often discretionary. Laws are only occasionally explicit in specifying that records may be disclosed for legitimate research purposes; however, most researcher requests appear to be honored.
To judge by Robbin's (1982) surveys, few archivists are aware of the laws that provide access; to judge by the lack of coverage of the topic in such journals as American Statistician, American Sociologist, and others, many professional groups may also not be informed.

PROFESSIONAL SOCIETIES AND ORGANIZATIONS

A variety of professional organizations and societies have issued codes of ethics or professional conduct that address data sharing at least indirectly. This section discusses the extent to which societies have explicitly acknowledged data sharing practices and summarizes the character of standards, guidelines, or codes issued by societies or professional groups.2

American Association for the Advancement of Science Professional Ethics Project

In December 1980 the American Association for the Advancement of Science (AAAS) issued a report on professional ethics activities in scientific and engineering societies affiliated with AAAS (Chalk et al., 1980). At the time of

the survey, 241 science and engineering societies were affiliated with AAAS. The data reported by Chalk et al. concern roughly 74 percent of the societies and cover a broad range of disciplines and society characteristics (large and small membership, new and established, etc.). While there is some ambiguity as to the number of societies that have adopted ethical rules or codes of conduct, it appears that between 50 and 60 societies have either done so or have issued advisory opinions.

Chalk et al. (1980) identify 191 distinct rules of conduct. Appendix J of their report enumerates statements appearing in these documents and the frequency of each. We have analyzed the contents of 74 statements issued by 57 societies to provide a crude characterization of the extent to which data sharing is considered by professional societies, and we reproduce below the ones that are relevant to data sharing (the frequency with which each statement appears is given in parentheses; the most pertinent statements are italicized):3

Members shall disseminate knowledge and share experience with other colleagues and be honest, realistic and clear in presenting findings. (19)

Members shall avoid and/or discourage sensational, exaggerated, false and unwarranted statements. (21)

Members shall refrain from or exercise due care in criticizing another professional's work in public, recognizing that the Association provides a proper forum for technical discussion and criticism. (2)

Members should not communicate their findings secretly to some and withhold them from others. (1)

Members should clarify in advance with employers or sponsors expectations for sharing and utilizing data and/or the ownership of materials or patents. (9)

Funding agencies should include in grants a stipulation that data gathered under the grants be made available to scholars at cost after a specific time. (1)

Members shall protect clients from the misuse of information collected about them. (1)

Members shall respect the privacy of their clients. (1)

Information gained from research participants shall be held in confidence unless the subject's consent to release information is obtained. (5)

Solicitation of research subjects should make clear the obligations, rewards, and consequences to research subjects for their participation. (4)

As can be seen, the most frequently appearing statements in codes are those directed at honesty and balanced reporting (statements 1 and 2); 40 of the 57 societies offer some advice on this matter. The frequency of sanctions against criticism or concealment, reflected in statements 3 and 4, is considerably lower. Of particular relevance to the topic of data sharing are statements 5 and 6. Only 10 instances of statements pertinent to sharing are reported, despite the frequency with which honesty and balanced reporting are advocated. The remaining statements apply to privacy and confidentiality and are more frequent, judging from the list in Chalk et al. Each of them emphasizes the roles and responsibility of the research practitioner.

In the codes of conduct issued by professional societies, less explicit emphasis is placed on conditions for release of information. Rather, the stress is on conducting honest and objective research. The completeness of any given professional society statement cannot be assessed using the Chalk et al. report, but the data are available for reanalysis. (The report includes excerpts from ethics statements of a selected group of professional societies.)

Guidelines Bearing on Statistical Research and Data Sharing

Since 1980 four professional groups have issued standards and guidelines bearing at least partly on sharing statistical research data. They are remarkable in that each dedicates explicit attention to providing access to data used as a basis for reports. The Joint Committee on Standards for Educational Evaluation (1981) issued professional standards and guidelines for evaluating educational programs, projects, and material; the Evaluation Research Society (1980) recently issued a parallel document for a wide variety of disciplines, including education (also see Rossi, 1982); the American Statistical Association (1980; 1983) has independently issued a draft code of conduct bearing on the topic of data sharing (also see Ellenberg, 1983); and the American Sociological Association (1982) issued a draft code of ethics to its members.4

The Evaluation Research Society (ERS) (1980) explicitly states (guideline number 7) that any restrictions on access to data generated as part of an evaluation should be established at the outset. Similarly, the Joint Committee on Standards for Evaluation (1981) acknowledges the need to negotiate access to data as part of the planning process. The ERS standards are more explicit than the joint committee's on when access is and is not negotiable.
Specifically, access is not negotiable when the evaluation is subject to conditions specified by the Freedom of Information Act or when it is understood that results are in the public domain. The ERS standards note that the sponsor or the evaluator is obligated to point this out. For privately sponsored research, "the client may rightfully expect confidentiality of the findings to be maintained" (ERS, guideline 7). To facilitate reanalysis of data to which access has been obtained, the ERS standards specify that a description of analysis procedures, including their assumptions and relevance to the data, should be provided (see guideline 32). Documentation should be sufficient to make the analysis replicable (guidelines 36 and 49), and methods and circumstances of data collection should be recorded for each data item (guideline 29). As in the standards report on impact evaluations of the U.S. General Accounting Office (1978), the persons responsible for release of the data should be identified.

The ERS and joint committee standards are similar in their treatment of the issues related to access and the factors that facilitate or impede access (Cordray, 1982). They differ in organization and detail. The joint committee statements bearing on these issues are spread throughout the volume, reflecting a need to consider access and limitations on access in the research design, during data collection and processing, and in reporting. There are explicit guidelines addressing access to data records (C5-I); identification of right-to-know audiences to whom summary information is to be provided (A6-B, A6-C); agreement on to whom identified data should be released (A6-H); release of evaluation procedures, data, and reports so that they can be examined (judged) by other independent evaluators (C2-2 and D3-I); and a general statement (D4-G) regarding making data, procedures, and records of analysis available for responsibly planned reviews. When anonymity is promised, procedures are to be devised to protect subject anonymity (C5-J). A caveat acknowledges the need to avoid making promises of confidentiality when it cannot be guaranteed and to avoid a guarantee that information will not be used beyond its stated purposes when there is the possibility that it may be released (e.g., through a court order). Under accuracy guidelines, analysts are urged to adopt and implement standard procedures for storing and retrieving data (D7-D) and to implement checks for errors in processing and reporting data (D7-E), and weaknesses in the data are to be described and their impact on conclusions assessed (D8-B). Seven guidelines are offered on what should be reported (D4-A to D4-G).

The ERS guidelines explicitly acknowledge the need to identify those individuals who are authorized to release the data. In this respect, they are similar to guidelines issued 2 years earlier by the U.S.
General Accounting Office (GAO) (1978) for assessing the quality of federal program evaluations. The GAO also suggests that data files, stripped of identifiers, should be released as soon as possible after an evaluation is completed. The joint committee is less direct, treating this as a point of negotiation unless contractual or legal constraints apply. The joint committee hedges a bit by prefacing its recommendation with the phrase "make available for responsibly planned reviews" (p. 108), implying that some requests could be justifiably denied. All of the guidelines share a concern that pledges of confidentiality be offered only when necessary, that researchers should not make assurances of confidentiality that cannot be honored, and that restrictions on access due to pledges of confidentiality should be avoided.

The Committee on Code of Conduct of the American Statistical Association (ASA) has proposed an interim set of guidelines on a 3-year trial basis (see Amstat News, 1981; American Statistician 37(1), 1983). Several items in the guidelines bear directly on data sharing and are summarized below (American Statistical Association, 1983).

First, the code recommends that statisticians "make data sources available for analysis by other responsible parties with appropriate safeguards for privacy concerns" (p. 6). Second, it recommends that "statisticians establish their intentions where pertinent to protect the confidentiality of information . . . to ensure that the means are adequate to protect confidentiality to the extent pledged . . . and to insure that transfers of data are in conformity with pledges" (p. 5). Third, it recommends that the statistician document data sources used in an inquiry and known inaccuracies.

These American Statistical Association proposals articulate the spirit of suggestions made by Bentley Glass (1965) to the general scientific community. His view is that the scientist is obligated to "publish his methods and his results so clearly and in such detail that another may confirm and extend his work" (p. 1258). For some sciences, it is only by getting hold of the raw data that confirmation and extension are possible.

Some of the commentaries on the ASA guidelines pertain to data sharing and reanalysis. For E.A. Gehan (1983), a biostatistician, a crucial ethical concern is how certain subgroups of patients in clinical trials are analyzed: the subgroups one compares may produce very misleading estimates of the effect of a drug or surgical technique. Bross (1983) registers a related concern that major government-sponsored research is analyzed in ways that produce artificially favorable results. Mosteller's (1983) concern, which is also related, is that the guidelines should not lead to relaxation of vigilance against fraudulent activity by the statistician. None of the commentators recognizes that the ASA guidelines may enhance an independent analyst's ability to reanalyze data to produce a less misleading or at least a more balanced view.
Mosteller notes the ostensible internal conflict between a guideline that advocates disclosure of data and one that warns against disclosure of a client's private information. For Kish (1983), the ostensible conflict lies between a guideline that urges researchers to collect only information that is necessary and the contemporary emphasis on omnibus surveys and data banks, in which what is "necessary" often cannot be clear for some time.

The American Sociological Association issued a draft code of ethics in 1980 and a revised code in 1981. Its membership voted approval of a final draft in 1983; enforcement procedures are still under review. The code maintains (American Sociological Association, 1982:2):

Sociologists are obligated to report findings fully and without omission of significant data . . . disclose details of theory, methods, and research designs that might bear upon interpretation of research findings. . . . Consistent with the spirit of full disclosure of method and analysis, sociologists should make their data available to other qualified social scientists at reasonable

costs, after they have completed their analyses, except in cases where confidentiality or the claims of a field worker to the privacy of personal notes would be violated in doing so. The timeliness of this obligation is critical, especially where the research is perceived to have policy implications.

The code is remarkable in several respects. For instance, timeliness of disclosure is recognized by no other codes that we are aware of. Other statements in the code make it plain that data generated in other countries, as well as in the United States, should be stripped of identifiers and made available for reanalysis. It also appears to be the only code that is explicit about methods for ensuring privacy of respondents (American Sociological Association, 1982):

To the extent possible . . . researchers should anticipate potential threats to confidentiality. Such means as removal of identifiers, the use of randomized responses, and other statistical solutions to problems of privacy should be used where appropriate.

The proposed code is terse, but covers a variety of other topics.

Arguments Against Codes

Whether the various science organizations should adopt a code of conduct at all, much less one that takes a position on data sharing, has been debated periodically. During the 1950s and 1960s, the arguments against codes included the view that scientific ethics, although not codified, are adhered to nonetheless and that unwritten ethics should not be codified and put to a vote (Lanz, 1963; Fosberg, 1963). Opponents of guidelines argued that such codes are usually created for legalistic reasons, and the remoteness of science from legal settings obviated this justification for codes (Cranberg, 1963). The recent court cases over access to data (such as Forsham v.
Harris), government suspension of grants in cases of research fraud in medical research, and similar problems imply that this argument is no longer true (see Cecil and Griffin, in this volume). The recent comments on the American Statistical Association guidelines by many prominent statisticians are also instructive in this respect. For many, the guidelines are a promising and fundamental vehicle for education in the profession, a set of reminders about what traps one ought to be aware of and how to avoid them (Martin, 1983; Gehan, 1983; Rice, 1983; Mosteller, 1983). For some, however, acceptance of the idea of guidelines is reluctant: they would have hoped guidelines to be unnecessary (Greenhouse, 1983; Kish, 1983). Still others believe such guidelines are, at best, gratuitous and will, at worst, be dangerous insofar as they invite sanctions against a politically unpopular view (Solomon, 1983) or detract from the development of per-

-ble to pressure as original analyses. Access to data can be impeded by politicians, bureaucrats, and scientists. In the absence of a formal policy, occasional refusals to release data will continue. Furthermore, the normal turnover of staff of agencies, contractors, and advisory boards should not affect access to data, and we believe policy can have a stabilizing influence. Of course, any policy has to be monitored to ensure that it meets the needs of those who request data.

Regulations of Operating Agencies

Rules and regulations issued by federal operating agencies as a means of implementing legislation are also pertinent to data access and sharing for research purposes. For example, federal regulations on evaluating Title I compensatory education programs have required local education agencies and state education agencies to retain all of the data used to develop their reports for a period of five years or until any pending federal audit has been completed (Federal Register, 1979:44). For local education agencies, "all individual scores with an identifying code" are to be maintained. However, the regulations are not explicit as to who should bear fiscal responsibility for data storage or about the nature and scope of the documentation. Trochim (1982) was successful in securing data on Title I evaluations from such agencies to produce useful reports on the impact of Title I. But the information was often poorly documented and not in machine-readable form; considerable communication between parties was required in order to use the acquired data successfully.

As described in the paper by Boruch, other government agencies foster data sharing through a variety of contractual rules. For example, contracts issued by the National Center for Educational Statistics that are designed to support long-term longitudinal studies generally require the production of public-use data tapes by the contractor.
Similar contract provisions have been created for large-scale evaluation studies, such as the graduated work incentive experiments supported by the U.S. Department of Health and Human Services in Seattle, Denver, New Jersey, and elsewhere, and the housing allowance experiments supported by the U.S. Department of Housing and Urban Development. State and local examples are much less visible and doubtless less frequent; although they do exist, we have no hard information on them.

PROFESSIONAL JOURNALS AND POLICY ON DATA ACCESS

In preparing this paper, an effort was made to identify when and how journal policy is structured with respect to data access issues. Time and resources
did not permit an exhaustive review. We did uncover instances of explicit policies on access and other situations in which the issue has never arisen. The prevalence of either cannot be determined at this time.

Journal of Personality and Social Psychology

Anthony Greenwald's editorial policy for the Journal of Personality and Social Psychology makes plain his position on data access. Two aspects of data sharing are considered. First, to aid in the editorial evaluation of a manuscript, the author is instructed to supply one copy of the summary tables for the major analyses reported in the manuscript. The second aspect is more pertinent to data sharing (Greenwald, 1976:5):

Submission of a research report to JPSP will be interpreted as an implicit assurance that the author has records of exact procedures and of data in unanalyzed form, and that both of these types of information shall be available to investigators who would like to replicate the research or reanalyze its data, respectively. . . . When a manuscript is accepted for publication, the author will be asked to provide assurance that (a) the data in unanalyzed form and the exact details of the procedures will be available to other investigators for at least 5 years after publication and (b) ethical problems have been handled in accordance with current APA code unless indicated otherwise in the published article.

Greenwald has since stepped down as editor of this journal. We were unable to determine in this case the extent to which the policy was implemented, nor were we able to determine whether the data made available under his editorship were actually used by secondary analysts.
Journal of the American Statistical Association (JASA)

Stephen Fienberg's policy on data sharing while editor of the applications section of JASA is similar to Greenwald's policy: authors' submissions were to be accompanied by the data so that referees could "check calculations or carry out alternative ones" and so that, once a manuscript was published, others could conduct reanalyses.

Fienberg notes certain obstacles (massive data sets, confidentiality restrictions) he encountered in instituting the policy, but he reports that provisions for summary tables rather than microdata, or statements indicating that the data are available from the author, usually solve the size problem. He reports few instances in which authors were reluctant to comply with these requests.

As indicated earlier, the American Statistical Association Committee on Code of Conduct has proposed an interim code, to be followed on a 3-year trial basis. It is more explicit about what is to be documented than is Fienberg's policy, but the code of conduct does not apply to the process of
submitting articles to journals. This seems to be an area that is left to the discretion of the individual editors.

American Chemical Society Journal Practices

Authors of papers appearing in the Journal of the American Chemical Society, Analytical Chemistry, and others published by the American Chemical Society (ACS) can and often do make their data available for reanalysis. In particular, ACS provides a regular subscription to a supplement, which appears annually for the Journal, that contains auxiliary and raw data pertinent to a selection of published articles. The interested reader can also request supplementary material about a particular article. Both the annual supplement and the supplementary material are on microfiche.

According to Charles Birch, head of the journals department of the ACS, not all authors provide data to the journal for the supplement; the decision is made by the author and journal editor. So, for example, in 1978-1980 more than 13,000 articles were published by the Journal of the American Chemical Society, of which about 1,400 had supplements provided. Analytical Chemistry published 14 articles during the same period, and all were issued with supplements. Not all supplements are used or even requested. Of the 1,400 articles in 1978-1980 in the Journal for which material was available, there were requests for 235 articles. According to Birch, the request rate varies considerably by journal, though, partly because of topic. The Journal often includes articles on crystallography, for which the request rate is not high. On the other hand, requests have been made for all supplements for all articles carrying them in Analytical Chemistry.

The practice of making supplementary material available is discussed briefly in the Author's Handbook for ACS publications, but there appears to be no readily accessible formal document on the topic or the history of the practice.
Birch suggested that the development of a system of microfiche supplements came about because publication of raw data in the journals themselves was too expensive.

Journals With No Policy on Data Access

Of the journals contacted, a few reported not having an explicit policy on data sharing. Managing editor Robert V. Ormes of Science indicated that no written policy was in effect; instead, authors are expected to disclose data should such a request be made. He suggested that Science and the American Association for the Advancement of Science would facilitate data access if a primary researcher were reluctant to release the data reported in a published manuscript. In his 20 years at Science, Ormes did not recall any instance in
which such action was necessary. The issue of confidentiality pertaining to ideas, findings, and procedures reported in unpublished manuscripts receives explicit attention by the staff of Science in its instructions to reviewers.

Discussions with various staff at the American Medical Association (AMA) and the American Anthropological Association revealed no policy on data sharing for their major publications. In the case of the AMA, there is a policy on release of their own physician data, but it does not apply to the data reported in the Journal of the American Medical Association or any of the specialty journals. For both associations, the staff suggested that individual editors may prescribe their own policies.

The American Economic Association (AEA) has adopted no formal code of ethics or conduct bearing on data sharing or related topics, such as privacy and confidentiality. Nor do the journals published under AEA auspices appear to devote any special attention to the matter. A major reason for this situation is that many articles are based on published statistical data. Major articles, however, are careful to cite references to each source (see the illustration describing Feldstein's work, in Boruch, in this volume), but problems of detail do appear (Bowers and Pierce, 1975, 1981). Whether reanalyses of earlier work are published depends on the journal editor, the quality of the paper, and other factors, just as it does in the other disciplines.

MONITORING COMPLIANCE WITH CODES

Some professional societies, agencies, regulations, and journal policies have advocated the need to make data available to other scholars. But with a few exceptions, procedures for handling violations are not specified.
For example, one rule proposed in the Bellagio Principles is that data sharing should be equitable and that provisions should be made for hearing and adjudicating complaints of unfair practice or charges of unfair restrictions on data access. The principle does not state how this should be carried out or by whom. The National Science Foundation's policy is more explicit: it states that the NSF will resolve any conflicts among parties over access.

In their review of professional ethics for scientific and engineering societies, Chalk et al. (1980) provide some information about how violations are handled. Roughly 16 percent of the societies responding to their survey have appeals procedures and a variety of sanctions, the most frequently mentioned being expulsion, formal censure, and informal reprimand. Over a 10-year period, 76 societies applied available sanctions 249 times; the most frequent kind (162) was an informal reprimand. These figures are for violations of any element in codes of ethics, and the reader should recognize that data sharing is a minor part of such codes, if it appears at all. The interesting
aspect of the Chalk et al. survey results is the infrequency with which action is possible, taken, or needed. Review by professional societies represents at least one option for monitoring compliance.

Failure to comply with specified standards and guidelines for conduct should not necessarily be viewed as a transgression. Because laws, regulations, public sentiment, and the like change over time, some codes may require modification. The Joint Committee on Educational Evaluation and the Evaluation Research Society, among others, maintain standing committees for review and modification of the guidelines they have issued.

EVALUATION AND MODIFICATION OF CODES

Modifications of standards and codes of conduct are necessary at times, and the organizations that have codes also have mechanisms for their change, at least in principle. For example, the American Statistical Association's Committee on Code of Conduct proposed a set of practices and a 3-year trial period, which will presumably be reviewed during and at the close of the time period. The following criteria for evaluating codes and standards of practice have been proposed by Chalk et al. (1980:51).

1. Applicability: This refers to the responsiveness of the rules to specific problems. What is elegant in theory can sometimes be elusive in practice. How effectively can the rules be applied to real-world problems? Are some ethical problems not likely to be resolved by an approach based on rules?

2. Clarity: Are the rules sufficiently clear to provide a basis for the responsible exercise of professional authority? Ambiguity is likely to breed confusion and frustration and, as a consequence, may invite neglect. Moreover, clarity is especially important in close cases where the rules are expected to play a role in the adjudication of grievances.

3. Consistency: Are the rules internally consistent? Are there logical contradictions within or between rules?

4. Ordering: Does the statement of ethical rules provide a means for setting priorities between two or more rules which, although not prima facie inconsistent, when applied in practice will require the professional to choose between conflicting obligations?

5. Coverage: This refers to the scope of actions and situations addressed by the rules. Are the rules silent on matters of serious ethical concern? Do they overemphasize matters of convenience, etiquette, or expedience at the expense of more pressing issues?

6. Acceptability: Do the rules express proper ideals? Should they be accepted as ethically prescriptive?

NOTES

1. The need for international scientific exchanges of knowledge (though not necessarily raw data) has been reiterated periodically by working groups of the American Association for the Advancement of Science (AAAS): see AAAS Special Committee on Civil Liberties for Scientists (1949) in defense against political calls for secrecy and loyalty oaths; the AAAS Committee on Social Aspects of Science (1957) on information transfer; and the AAAS Committee on Science in the Promotion of Human Welfare (1960) on international aspects of science.

2. The history of scientific codes of conduct, especially codes that bear on sharing information, seems not to have been well documented. Yet the transformation of ethics codes from questions of etiquette through accepted tradition and codification and training seems interesting enough to warrant the historian's attention. Pigman and Carmichael (1950) made a beginning in their appeal for a code that would improve professional relations and so "better morale and increase productivity among research men" (p. 644). Perhaps changing expectations about such codes also warrant attention.

3. The authors note that not all similar rules were identically phrased in the documents; some editorial discretion was used in the preparation of this list.

4. Some organizations that one might expect to have developed codes or guidelines relating to data sharing have not. They include the American Economic Association and others listed in the Chalk et al. (1980) report.

REFERENCES

American Association for the Advancement of Science, Committee on Science in the Promotion of Human Welfare
1960 Science and human welfare. Science 132:68-73.
American Association for the Advancement of Science, Committee on the Social Aspects of Science
1957 Social aspects of science. Science 125:143-147.
American Association for the Advancement of Science, Special Committee on Civil Liberties for Scientists
1949 Civil liberties for scientists. Science 110:177-179.
American Sociological Association
1982 Code of Ethics. Washington, D.C.: American Sociological Association.
American Statistical Association, Ad Hoc Committee on Professional Ethics
1983 Ethical guidelines for statistical practice: report of the ad hoc committee. American Statistician 37:5~0.
American Statistical Association
1980 Interim Code of Conduct for the ASA: General Guidelines. Washington, D.C.: American Statistical Association.
Boruch, R. F., and Cecil, J. S.
1979 Assuring the Confidentiality of Social Research Data. Philadelphia: University of Pennsylvania Press.
Boruch, R. F., Cordray, D. S., and Wortman, P. M.
1981 Secondary analysis: why, when and how. In R. F. Boruch, P. M. Wortman, and D. S. Cordray, eds., Reanalyzing Program Evaluations. San Francisco: Jossey-Bass.
Bowers, W., and Pierce, G.
1975 The illusion of deterrence in Isaac Ehrlich's research on capital punishment. Yale Law Journal 85:187-208.
1981 Capital punishment as deterrent: challenging Isaac Ehrlich's research. Pp. 237-261 in R. F. Boruch, P. M. Wortman, and D. S. Cordray, eds., Reanalyzing Program Evaluations. San Francisco: Jossey-Bass.
Bross, I. D. J.
1983 Comment. American Statistician 37:12-13.
Chalk, R., Frankel, M. S., and Chafer, S. B.
1980 AAAS Professional Ethics Project: Professional Ethics Activities in the Scientific and Engineering Societies. Washington, D.C.: American Association for the Advancement of Science.
Cordray, D. S.
1982 An assessment of the utility of the ERS standards. New Directions for Program Evaluation: Standards for Educational Practice 15:67-81.
Cranberg, L.
1963 Ethical code for scientists? Science 141:1242.
Deming, W. E.
1972 Code of professional conduct. International Statistical Review 40:215-219.
Ellenberg, J. H.
1983 Ethical guidelines for statistical practice: a historical perspective. American Statistician 37:1~.
European Science Foundation
1980 Statement Concerning the Protection of Privacy and the Use of Personal Data for Research. Strasbourg, France: European Science Foundation.
Evaluation Research Society
1980 Standards for Program Evaluation. Potomac, Md.: Evaluation Research Society.
Flaherty, D. H.
1978 The Bellagio conference on privacy, confidentiality, and the use of government microdata. New Directions in Program Evaluation 4:1~30.
Flaherty, D. H.
1979 Privacy and Government Data Banks: An International Perspective. London: Mansell.
Fosberg, F. R.
1963 Letter to the editor. Science 141:916.
Garner, J.
1981 National Institute of Justice access and secondary analysis. Pp. 43~9 in R. F. Boruch, P. M. Wortman, and D. S. Cordray, eds., Reanalyzing Program Evaluations. San Francisco: Jossey-Bass.
Gehan, E. A.
1983 Comment. American Statistician 37:~9.
Glass, B.
1965 The ethical basis of science. Science 150:1250-1261.
Greenhouse, S. W.
1983 Comment.
American Statistician 37:15-16.
Greenwald, A.
1976 An editorial. Journal of Personality and Social Psychology 33:1-7.
Joint Committee on Standards for Educational Evaluation
1981 Standards for Evaluations of Educational Programs, Projects, and Materials. New York: McGraw-Hill.
Katz, S., et al.
n.d. Plan for Analysis of Existing Long-Term Data Relative to Distribution and Mix of Functional Impairment, and Effects of Care. Department of Community Health Science, College of Medicine, Michigan State University.
Katzan, H. S.
1980 Multinational Computer Systems: An Introduction to Transnational Data Flow and Data Regulation. New York: Van Nostrand.
Kish, L.
1983 Comment. American Statistician 37:17.
Lanz, H.
1963 Letter to the editor. Science 141:916.
Martin, M. E.
1983 Comment. American Statistician 37:~7.
Mochmann, E., and Muller, P. J., eds.
1979 Data Protection and Social Science Research. Frankfurt: Campus Verlag.
Mosteller, F.
1983 Comment. American Statistician 37:10-11.
National Science Foundation
1983 Grant General Conditions. NSF-FL 200(~83). Washington, D.C.: National Science Foundation.
Organization for Economic Cooperation and Development
1981 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. Paris: Organization for Economic Cooperation and Development.
Pigman, W., and Carmichael, E. B.
1950 An ethical code for scientists. Science 111:643-647.
President's Reorganization Project
1981 Federal statistical system: access and dissemination. Pp. 21-33 in R. F. Boruch, P. M. Wortman, and D. S. Cordray, eds., Reanalyzing Program Evaluations. San Francisco: Jossey-Bass.
Rice, D. P.
1983 Comment. American Statistician 37:9.
Robbin, A.
1978 Ethical standards and the archivist. Secondary Analysis 4:7-18.
1981 Technical guidelines for preparing and documenting statistical data for secondary analysis. Pp. 8~143 in R. F. Boruch, P. M. Wortman, and D. S. Cordray, eds., Reanalyzing Program Evaluations. San Francisco: Jossey-Bass.
1982 Ambiguity, Value Choice, and Administrative Discretion: When Policy and Practice Diverge in Public Organizations. Unpublished manuscript, University of Wisconsin.
Robbin, A., and Josefacki, L.
1982 Public Policy on Health and Welfare Information: Compendium of State Legislation on Privacy and Access. Madison, Wisc.: University of Wisconsin.
Roberts, H. V.
1983 Comment. American Statistician 37:18.
Rossi, P., ed.
1982 New Directions for Program Evaluation: Standards for Educational Practice 15.
Sasfy, J., and Siegel, L.
1982 A Study of Research Access to Confidential Criminal Justice Agency Data. McLean, Va.: Mitre Corp.
Society of American Archivists
1980 A code of ethics for archivists. American Archivist 43:414-420.
Trochim, W. M. K.
1982 Methodologically based discrepancies in compensatory education evaluations.
Evaluation Review 6:443-480.
U.S. General Accounting Office
1978 Assessing Social Program Impact Evaluations: A Checklist Approach. PAD-79-2. Washington, D.C.: U.S. General Accounting Office.

Exhibit A
The Bellagio Principles

1. National statistical offices should provide researchers both inside and outside government with the broadest practicable access to information within the bounds of accepted notions of privacy and legal requirements to preserve confidentiality.

2. Legal and social constraints on the dissemination of microdata are appropriate when they reflect the interests of respondents and the general public in an equitable manner. These constraints should be re-examined when they result in the protection of vested interests or the failure to disseminate information for statistical and research purposes (i.e., without direct consequences for a specific individual).

3. All copies of government data collected or used for statistical purposes should be rendered immune from compulsory legal process by statute.

4. In making data available to researchers, national statistical offices should provide some means to ensure that decisions on selective access are subject to independent review and appeals.

5. The distinction between a research file, in the sense of a statistical record (as defined in the 1977 report of the U.S. Privacy Protection Study Commission), and other micro files is fundamental in discussions of privacy and dissemination of microdata. All dissemination of government microdata discussed in connection with the Bellagio Principles is assumed to be a transfer of data to research files for use exclusively for research and statistical purposes.

6. There are valid and socially significant fields of research for which access to microdata is indispensable. Statistical agencies are one of the prime sources of government microdata.

7. Public use samples of anonymized individual data are one of the most useful ways of disseminating microdata for research and statistical purposes.

8. Techniques now exist that permit preparation of public use samples of value for research purposes within the constraints imposed by the need for confidentiality. Countries with strict statutes on confidentiality have prepared public use samples.

9. There are legitimate research purposes requiring the use of individual data for which public use samples are inadequate.

10. There are legitimate research uses which require the utilization of identifiable data within the framework of concern for confidentiality.

11. Other techniques of extending to approved research the same rights and obligations of access enjoyed by officers of the government agency need to be considered in terms of better access.

12. There is considerable potential for development of more economical
and responsive customized-user services, such as (1) record linkage under the protection of the statistical office, (2) special tabulations, and (3) public use samples for special purposes. Such services must often involve some form of cost recovery.

13. Some research and statistical activities require the linking of individual data for research and statistical purposes. The methods that have been developed to permit record linkage without violating law or social custom regarding privacy should be used whenever possible.

14. Professional or national organizations should have codes of ethics for their disciplines concerning the utilization of individual data for research and statistical purposes. Such ethical codes should furnish mutually agreeable standards of behavior governing relations between providers and users of governmental data.

15. Users of microdata should be required to sign written undertakings for the protection of confidentiality.

16. Considerable efforts should be made to explain to the general public the procedures in force for the protection of the confidentiality of microdata collected and disseminated for research and statistical purposes.

17. The right of privacy is evolving rather than static, and it is closely related to how statistics and research are perceived. Therefore, statisticians and researchers have a responsibility to contribute to policy and legal definitions of privacy.

18. Public concern about privacy and confidentiality in the collection and utilization of individual data can be addressed in part as follows:
(1) voluntary data collection, whenever practicable;
(2) advance general notice to respondents and informed consent, whenever practicable;
(3) provisions for public knowledge of data uses;
(4) public education on the distinction between administrative and research uses of information.

Source: Flaherty (1978:1~27).