Ensuring the Quality, Credibility, and Relevance of U.S. Justice Statistics (2009)

Suggested Citation: National Research Council. 2009. Ensuring the Quality, Credibility, and Relevance of U.S. Justice Statistics. Washington, DC: The National Academies Press. doi: 10.17226/12671.

–5–

Principles and Practices: BJS as a Principal U.S. Federal Statistical Agency

Our charge directs us to provide guidance to the Bureau of Justice Statistics (BJS) regarding its strategic priorities and goals. Before doing this, it is important to consider the functions—and expectations—of BJS from a higher, agency-level perspective.

One important filter through which to view the priorities and operations of BJS is its role as one of the principal statistical agencies in the U.S. federal statistical system. Relative to other countries, the U.S. federal statistical system is highly decentralized. Whereas other countries vest the primary authority for collection and dissemination of statistical data in a single agency—the Australian Bureau of Statistics, Statistics Canada, and Statistics Netherlands, for example—authority for production of official statistics in the United States is divided across numerous agencies.[1] These agencies are by no means equal in terms of their staffing levels and budgetary resources.

[1] The statistical system of the United Kingdom is also frequently cited as an example of centralization; it is currently in a state of change. Effective as of April 2008, a new Statistics and Registration Service Act formally abolished the legal role of the Office for National Statistics (ONS), previously the United Kingdom's dominant statistical agency. ONS functions continue, but the office is now a subsidiary of the Statistics Board, created as an independent corporate body as the arbiter and producer of official statistics in the country. As discussed in our panel's interim report (National Research Council, 2008b), the British Crime Survey is an example of a United Kingdom data collection that is not collected by ONS or the Statistics Board; it is administered by the Home Office.

As shown in Figure 5-1, the three largest statistical agencies—the Census Bureau, the Bureau of Labor Statistics, and the National Center for Education Statistics—dominate the others in terms of resources even though the subject-matter portfolios of the smaller agencies—justice, transportation, agriculture, and so forth—are undeniably important.

[Figure 5-1 Estimated direct funding levels for principal federal statistical agencies, fiscal year 2008. NOTES: NSF, National Science Foundation. Including the costs associated with the decennial census would add $797.1 million to the Census Bureau total. SOURCE: U.S. Office of Management and Budget (2007:Table 1).]

It is appropriate, in the panel's judgment, to evaluate BJS in the context of the larger federal statistical system, especially the principal statistical agencies whose primary mission is the collection and dissemination of statistical information. (There are 60–70 other federal agencies that spend more than $500,000 per year on statistical information dissemination, but whose program duties outweigh their statistical focus.)

The panel benefited from a preexisting, fully vetted set of evaluative criteria for a federal statistical agency. The observations that we make in this chapter are generally structured around the Principles and Practices for a Federal Statistical Agency, a white paper of the Committee on National Statistics (CNSTAT) (National Research Council, 2005b).

Principles and Practices articulates the basic functions that are expected of a unit of the U.S. federal statistical system; it also outlines ideals for the relationship between individual statistical agencies and their parent departments. As such, it has been widely used by various statistical agencies in their interactions with Congress and with officials in their departments. Indeed, BJS has already embraced such evaluative criteria: a summary version of the Principles and Practices is featured prominently on the front page of BJS's current strategic plan (Bureau of Justice Statistics, 2005a) and as a top-level link on BJS's website (under "BJS Statistical Principles and Practices").

The two sections of this chapter assess BJS, its products, and its performance relative to the Principles and Practices for a Federal Statistical Agency; each subsection begins with a précis of the relevant descriptive text from the fourth edition of Principles and Practices (National Research Council, 2009). In the course of this review, we provide extended discussion of two "flashpoints" in recent BJS experience—the circumstances surrounding release of data from the 2002 Police-Public Contact Survey (PPCS) that led to the dismissal of a BJS director and the reporting requirements imposed by the Prison Rape Elimination Act of 2003—that are particularly relevant to examination of the major principles of a statistical agency. We defer conclusions and assessments based on the chapter as a whole to Chapter 6, a more comprehensive statement on strategic goals for BJS.

5–A PRINCIPLES OF A FEDERAL STATISTICAL AGENCY

5–A.1 Trust Among Data Providers

    A federal statistical agency must have the trust of those whose information it obtains. Data providers, such as respondents to surveys and custodians of administrative records, must be able to rely on the word of a statistical agency that the information they provide about themselves or others will be used only for statistical purposes. An agency earns the trust of its data providers by appropriately protecting the confidentiality of responses. Such protection, in particular, precludes the use of individually identifiable information maintained by a statistical agency—whether derived from survey responses or another agency's administrative records—for any administrative, regulatory, or law enforcement purpose. (National Research Council, 2009:5–6)

In a democracy, government statistical agencies depend on the willing cooperation of resident respondents to provide information about themselves and their activities. This willingness requires assurance that their information will not be used to intervene in their lives in any way. When respondents to BJS data collections provide data to the agency (or contractors representing the agency), they are told that their individual data will never be used to harm them; they are told that the purposes of the data collection will be fulfilled by publicly available statistical results; they are informed that the agency is an independent statistical organization transcending the current administration in power; they are informed that their data will be kept confidential.

Agencies that fulfill such pledges build over time with their data providers a sense of trust that the agency's intentions are benign and that the agency respects their rights as data providers. When political interference is suspected by data providers, the trust that their reports are being used appropriately—solely to create statistical information—can be shaken, and restoring that trust can be a much slower process than its destruction.

BJS has many different target populations of data providers. Some are very large (e.g., the entire U.S. household population for the National Crime Victimization Survey (NCVS) or the full set of state courts of general jurisdiction) whereas others are quite small (e.g., state-level departments of correction or federal prisons). Small populations of data providers generally are repeatedly asked for data in ongoing BJS series. In turn these data providers are often more interested in the outcome of the data collections and may use the statistical information for their own purposes.

From its review of BJS documents, knowledge of its data sets, and interactions with its respondents, the panel concludes that BJS and its data collection agents are generally very diligent in preserving the confidentiality of responses from its respondents. This is particularly true for the NCVS, the effectiveness of which is wholly predicated on building trust and rapport between interviewer and respondent in order to obtain full and accurate accounts of victimization incidents. The use of respondents' data solely for statistical purposes is generally well known and clearly presented. However, we note that this is only "generally" true in that there exists a flagrant exception.

In the judgment of the panel, the reporting requirements of the Prison Rape Elimination Act of 2003 (PREA) oblige BJS to violate the principle of trust among its institutional data providers. Specifically, the provision of information to a statistical agency is fundamentally different from the provision of information to a regulatory or enforcement agency. Regulatory agencies, by their very nature, have the goal of intervening in individual activities that are found to violate prescriptive standards sanctioned by the government. The crux of the problem is that the PREA reporting requirements assign to BJS a quasi-regulatory role, directly using data collected from responding institutions to impose sanctions. In the remainder of this section, we document this breach of principle by reviewing the history and implementation of the PREA reporting requirement.

Historical Development of the Prison Rape Elimination Act

In Farmer v. Brennan (511 U.S. 825 [1994]), the U.S. Supreme Court ruled that "deliberate indifference" to serious health and safety risks by prison officials constitutes a violation of the Eighth Amendment protection against cruel and unusual punishment.

The particular case in Farmer involved a preoperative transsexual prisoner who was raped and beaten shortly after transfer to a federal penitentiary; the Court's ruling vacated lower court rulings that rejected the plaintiff's argument on the grounds that prison officials had not been demonstrated to be criminally reckless.

By 2002–2003, the general problem of sexual assault in prison drew legislative interest in Congress. Ultimately, the legislative initiative produced PREA. The final act is lengthy, including specification of grant monies targeted at reduction strategies, the establishment of a national commission, and adoption of national standards. However, in this section, we focus on the specific demands put on BJS by a section of the act covering "national prison rape statistics, data, and research"—reporting requirements that, in certain respects, run counter to the proper and accepted role of a federal statistical agency.

In the 107th Congress, identical versions of a proposed "Prison Rape Reduction Act" were introduced in both houses (H.R. 4943 and S. 2619). On the occasion of the introduction of the measure in the Senate, cosponsor Sen. Edward Kennedy (D-Mass.) described what little was known quantitatively about the extent of sexual assault in U.S. prisons:[2]

    Prison rape is a serious problem in our Nation's prisons, jails, and detention facilities. Of the two million prisoners in the United States, it is conservatively estimated that one in ten has been raped. According to a 1996 study, 22 percent of prisoners in Nebraska had been pressured or forced to have sex against their will while incarcerated [(Struckman-Johnson et al., 1996)].[3] Human Rights Watch recently reported "shockingly high rates of sexual abuse" in U.S. prisons [(Human Rights Watch, 2001)].[4]

Cosponsor Sen. Jeff Sessions (R-Ala.) concurred, and briefly described the statistical analysis section of the bill:[5]

    Some studies have estimated that over 10 percent of the inmates in certain prisons are subject to rape. I hope that this statistic is an exaggeration. . . .

    [This] bill will require the Department of Justice to conduct statistical surveys on prison rape for Federal, State, and local prisons and jails. Further, the Department of Justice will select officials in charge of certain prisons with an incidence of prison rape exceeding the national average by 30 percent to come to Washington and testify to the Department about the prison rape problem in their institution. If they refuse to testify, the prison will lose 20 percent of certain Federal funds.

[2] Congressional Record, June 13, 2002, p. S5337.

[3] Struckman-Johnson et al. (1996:69–70) distributed questionnaires (for response by mail) to all inmates and staff at two maximum security men's prisons, one minimum security men's prison, and one women's facility, all of which are "in the state prison system of a rural Midwestern state." The state is not explicitly identified, but later discussions of the results included the acknowledgment of Nebraska as the survey site. In all, 1,801 prisoners and 714 staff members at these facilities were eligible to participate; 528 inmates and 264 staff members responded.

[4] No formal survey or statistical data collection was used by Human Rights Watch (2001); instead, the report's observations were based on written reports from about 200 prisoners, responding to announcements in publications and leaflets.

[5] Congressional Record, June 13, 2002, pp. S5337, S5338.

In both chambers, the legislation was referred to Judiciary subcommittees and no further action was taken (save that the Senate Judiciary Committee held a hearing on the bill on July 31, 2002).

In the 108th Congress, legislation identical to the previous bill was introduced by Rep. Frank Wolf (R-Va.) and Rep. Bobby Scott (D-Va.) in the House as H.R. 1707 on April 9, 2003.[6] However, deliberations between members and staff in both chambers were progressing toward a revised, bipartisan proposal, and these deliberations resulted in rapid passage of the bill. On June 11, 2003, the House Subcommittee on Crime, Terrorism, and Homeland Security replaced the existing text of H.R. 1707 with substitute language and favorably reported it to the full Judiciary Committee. In turn, the Judiciary Committee approved the revised bill on July 9. The Judiciary Committee's report on the bill, H.Rept. 108-219, offers no explanation for the revised wording in the BJS data collection section of the act. On July 21, Sen. Sessions introduced S. 1435—consistent with the revised House language, but now bearing the name "Prison Rape Elimination Act."[7] Upon introduction, the bill was immediately passed by unanimous consent without debate or amendment; the House took up the Senate bill on July 25 and passed it without objection; and the bill was signed on September 4, becoming Public Law 108-79.

[6] A variant on the same bill, with the same reporting requirements on BJS, was introduced on April 10, 2003, as H.R. 1765 but progressed no further than referral to committee.

[7] Judiciary Committee Chairman James Sensenbrenner (R-Wisc.) described the Senate bill as "substantively identical to H.R. 1707" in his floor remarks on passage of the act (Congressional Record, July 25, 2003, p. 7765).

Text of the Act and Reporting Requirements

Box 5-1 shows the alterations to the section of PREA concerning BJS data collection between its original introduction in the 107th Congress and final passage. Both the original and final versions of the bill establish a Review Panel on Prison Rape; the original would have administratively housed the Review Panel in BJS while the final version makes it an organ of the Justice Department. To be clear, it is important to note that the Review Panel is more limited in scope than the National Prison Rape Elimination Commission created by other sections of the act. The Review Panel's work is structured around the BJS work, while the formally appointed Commission has a broader charge to develop national standards for the detection and prevention of sexual violence in correctional facilities.

Box 5-1 Statistical Reporting Provisions of Original and Final Versions of the Prison Rape Elimination Act

The following excerpt compares text from Section 2 of H.R. 4943 (107th Congress) and Section 4 of S. 1435 (108th Congress), the latter of which was enacted as Public Law 108-79. Subsections (d) and (e) on contracts and authorization of appropriations are omitted. In this rendering, deletions from the earlier version are enclosed in [square brackets]; additions in the newer version are enclosed in {curly braces}.

NATIONAL PRISON RAPE STATISTICS, DATA, AND RESEARCH.

(a) ANNUAL COMPREHENSIVE STATISTICAL REVIEW-

(1) IN GENERAL- The Bureau of Justice Statistics of the Department of Justice (in this section referred to as the 'Bureau') shall carry out, for each calendar year, a comprehensive statistical review and analysis of the incidence and effects of prison rape. The statistical review and analysis shall include, but not be limited to the identification of the common characteristics of—

(A) [inmates who have been involved with prison rape, both victims and perpetrators] {both victims and perpetrators of prison rape}; and

(B) prisons and prison systems with a high incidence of prison rape.

{(2) CONSIDERATIONS- In carrying out paragraph (1), the Bureau shall consider—

(A) how rape should be defined for the purposes of the statistical review and analysis;

(B) how the Bureau should collect information about staff-on-inmate sexual assault;

(C) how the Bureau should collect information beyond inmate self-reports of prison rape;

(D) how the Bureau should adjust the data in order to account for differences among prisons as required by subsection (c)(3);

(E) the categorization of prisons as required by subsection (c)(4); and

(F) whether a preliminary study of prison rape should be conducted to inform the methodology of the comprehensive statistical review.

(3) SOLICITATION OF VIEWS- The Bureau of Justice Statistics shall solicit views from representatives of the following: State departments of correction; county and municipal jails; juvenile correctional facilities; former inmates; victim advocates; researchers; and other experts in the area of sexual assault.}

[(2)] {(4)} SAMPLING TECHNIQUES- The analysis under paragraph (1) shall be based on a random sample, or other scientifically appropriate sample, of not less than 10 percent of all Federal, State, and county prisons, and a representative sample of municipal prisons. The selection shall include at least one prison from each State. {The selection of facilities for sampling shall be made at the latest practicable date prior to conducting the surveys and shall not be disclosed to any facility or prison system official prior to the time period studied in the survey. Selection of a facility for sampling during any year shall not preclude its selection for sampling in any subsequent year.}

[(3)] {(5)} SURVEYS- In carrying out the review [required by this subsection] {and analysis under paragraph (1)}, the Bureau shall, in addition to such other methods as the Bureau considers appropriate, use surveys and other statistical studies of current and former inmates from a sample of Federal, State, county, and municipal prisons. The Bureau shall ensure the confidentiality of each survey participant.

{(6) PARTICIPATION IN SURVEY- Federal, State, or local officials or facility administrators that receive a request from the Bureau under subsection (a)(4) or (5) will be required to participate in the national survey and provide access to any inmates under their legal custody.}

(b) REVIEW PANEL ON PRISON RAPE-

(1) ESTABLISHMENT- To assist the Bureau in carrying out the review and analysis under subsection (a), there is established, within the [Bureau] {Department of Justice}, the Review Panel on Prison Rape (in this section referred to as the 'Panel').

(2) MEMBERSHIP-

(A) COMPOSITION- The Panel shall be composed of 3 members, each of whom shall be appointed by the Attorney General, in consultation with the Secretary of Health and Human Services.

(B) QUALIFICATIONS- Members of the Panel shall be selected from among individuals with knowledge or expertise in matters to be studied by the Panel.

(3) PUBLIC HEARINGS-

(A) IN GENERAL- The duty of the Panel shall be to carry out, for each calendar year, public hearings concerning the operation of [each entity identified in a report under clause (ii) or (iii) of subsection (c)(2)(B)] {the three prisons with the highest incidence of prison rape and the two prisons with the lowest incidence of prison rape in each category of facilities identified under subsection (c)(4). The Panel shall hold a separate hearing regarding the three Federal or State prisons with the highest incidence of prison rape}. The purpose of these hearings shall be to collect evidence to aid in the identification of common characteristics of [inmates who have been involved in prison rape, both victims and perpetrators] {both victims and perpetrators of prison rape}, and the identification of common characteristics of prisons and prison systems [with a high incidence of prison rape] {that appear to have been successful in deterring prison rape}.

(B) TESTIMONY AT HEARINGS-

(i) PUBLIC OFFICIALS- In carrying out the hearings required under subparagraph (A), the Panel shall request the public testimony of Federal, State, and local officials (and organizations that represent such officials){, including the warden or director of each prison, who bears responsibility for the prevention, detection, and punishment of prison rape at each entity, and the head of the prison system encompassing such prison} [, who bear responsibility for the prevention, detection, and punishment of prison rape at each entity].

(ii) VICTIMS- The Panel may request the testimony of prison rape victims, organizations representing such victims, and other appropriate individuals and organizations.

[(C) FAILURE TO TESTIFY- If, after receiving a request by the Panel under subparagraph (B)(i), a State or local official declines to testify at a reasonably designated time, the Federal funds provided to the entity represented by that official pursuant to the grant programs designated by the Attorney General under section 9 shall be reduced by 20 percent and reallocated to other entities. This reduction shall be in addition to any other reduction provided under this Act.]

{(C) SUBPOENAS-

(i) ISSUANCE- The Panel may issue subpoenas for the attendance of witnesses and the production of written or other matter.

(ii) ENFORCEMENT- In the case of contumacy or refusal to obey a subpoena, the Attorney General may in a Federal court of appropriate jurisdiction obtain an appropriate order to enforce the subpoena.}

(c) REPORTS-

(1) IN GENERAL- Not later than [March] {June} 30 of each year, the [Bureau] {Attorney General} shall submit a report on the activities of the Bureau [(including the Review Panel)] {and the Review Panel}, with respect to prison rape, for the preceding calendar year to—

(A) Congress; and

[(B) the Attorney General; and]

[(C)] {(B)} the Secretary of Health and Human Services.

(2) CONTENTS- The report required under paragraph (1) shall include—

(A) with respect to the effects of prison rape, statistical, sociological, and psychological data; and

(B) with respect to the incidence of prison rape—

(i) statistical data aggregated at the Federal, State, prison system, and prison levels;

[(ii) an identification of the Federal Government, if applicable, and each State and local government (and each prison system and institution in the representative sample) where the incidence of prison rape exceeds the national median level by not less than 30 percent; and

(iii) an identification of jail and police lockup systems in the representative sample where the incidence of prison rape is significantly avoidable.]

{(ii) a listing of those institutions in the representative sample, separated into each category identified under subsection (c)(4) and ranked according to the incidence of prison rape in each institution; and

(iii) an identification of those institutions in the representative sample that appear to have been successful in deterring prison rape; and

(C) a listing of any prisons in the representative sample that did not cooperate with the survey conducted pursuant to section 4.}

(3) DATA ADJUSTMENTS- In preparing the information specified in paragraph (2), the [Bureau shall, not later than the second year in which surveys are conducted under this Act,] {Attorney General shall} use established statistical methods to adjust the data as necessary to account for [exogenous factors, outside of the control of the State, prison system, or prison, which have demonstrably contributed to the incidence of prison rape] {differences among institutions in the representative sample, which are not related to the detection, prevention, reduction and punishment of prison rape, or which are outside the control of the State, prison, or prison system, in order to provide an accurate comparison among prisons. Such differences may include the mission, security level, size, and jurisdiction under which the prison operates.} For each such adjustment made, the [Bureau] {Attorney General} shall identify and explain such adjustment in the report.

{(4) CATEGORIZATION OF PRISONS- The report shall divide the prisons surveyed into three categories. One category shall be composed of all Federal and State prisons. The other two categories shall be defined by the Attorney General in order to compare similar institutions.}

It also appears that one of the intended roles of the Review Panel was to "assist" BJS in its data collection efforts (as is explicitly stated in both versions of the bill). This assistance function is consistent with concerns expressed at a congressional hearing on the bill, arguing that BJS should have an advisory group to work out definitional issues in measuring prison rape (U.S. House of Representatives, Committee on the Judiciary, 2003:19).

The critical difference in the legislative texts in Box 5-1 lies in the reporting requirements to support public hearings by the Review Panel. The original proposal called for public hearings with officials from institutions with high and low incidences of prison rape (facilities "where the incidence of prison rape exceeds the national median level by not less than 30 percent" and facilities "where the incidence of prison rape is significantly avoidable"). However, the final law directs that—each year, for different facility types—the facilities with the three highest and two lowest incidence rates be summoned to appear at hearings. As a comparison of the different versions of section (b)(3)(C) in Box 5-1 shows, the original version of the act threatened institutions that refused to testify before the Review Panel with a 20 percent reduction in federal grant monies; the final version of the bill removed that threat but granted the Review Panel full subpoena power.[8] In addition to identifying the highest- and lowest-ranked institutions, the final legislative text also required the Review Panel (presumably using BJS's work) to provide a complete listing of all the facilities in the sample, "ranked according to the incidence of prison rape."

[8] Although the text does not appear in H.R. 4943 in Box 5-1, Corlew (2006) notes that the bill as originally proposed "would have granted a ten percent funding increase to prison systems that, because of their high percentages of prison rape, were required to provide testimony to the Review Panel." Both the American Correctional Association and the Association of State Correctional Administrators objected that this provision "appeared to reward undeserving systems"—another reason for the change to subpoena authority in the final bill text.

The original designation of "high" prison rape incidence—a value more than 30 percent greater than the national median—was a curious and intriguing one. Depending on the distribution of incidence rates across facilities, the criterion might have obliged the Review Panel to hear from an unworkably high number of parties (a possibility illustrated in the sketch below), and perhaps that consideration drove the revision. Alternatively, singling out "the" highest-rate facilities may have been viewed by legislators as more consistent with the themes of accountability and action (as with the change in nomenclature from a "Prison Rape Reduction" to a "Prison Rape Elimination" Act). From the record, it is unclear exactly how and why the change came about. Indeed, both Rep. Scott's prepared statement for the Judiciary Committee markup of the bill on July 9, 2003 (H.Rept. 108-219, p. 114), and his floor statement on the Senate bill on July 25 (Congressional Record, p. H7764) refer to "conduct[ing] public reviews of institutions where the rate of prison rape is 30% above the national average rate"—even though that provision no longer existed in the revised language.
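To make the distributional point concrete, the following is a small hypothetical sketch; the distribution, its parameters, and the facility count are invented for illustration and are not drawn from the report or the legislative record. It shows how a right-skewed spread of facility rates could place a large share of facilities above the original bill's 30-percent-above-median threshold.

```python
# Hypothetical illustration: with a right-skewed distribution of
# facility incidence rates, far more than a handful of facilities can
# exceed 1.3 times the national median. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
rates = rng.lognormal(mean=1.0, sigma=0.8, size=500)  # hypothetical facility rates

threshold = 1.3 * np.median(rates)  # "exceeds the national median by 30 percent"
flagged = int((rates > threshold).sum())
print(f"{flagged} of {rates.size} facilities exceed 1.3x the median rate")
# For these parameters, roughly 35-40 percent of facilities are flagged,
# versus exactly three "highest" facilities per category under the
# enacted language.
```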

The principal congressional hearing on the bill was held before the House Judiciary Subcommittee on Crime, Terrorism, and Homeland Security on April 29, 2003. Because it was a House hearing, the bill referred to was the original version of the legislation, with the 30-percent-above-median reporting requirement. At that hearing, the only discussion of BJS's reporting role was raised by then-principal deputy attorney general for the Office of Justice Programs (OJP) Tracy Henke, and that concern came as a brief ending to her opening statement. Although Henke's remarks hinted at the inappropriateness of BJS's use of data for administrative and regulatory purposes, the specific objection concerned the original bill's vague definition of low-prevalence facilities (U.S. House of Representatives, Committee on the Judiciary, 2003:13–14):

    I know my time is up, but real quickly, sir, another concern to the Department is that the Department believes that [it is of the utmost importance that][9] the integrity of the statistical collection and analysis by the Bureau of Justice Statistics be preserved. The legislation currently requires BJS not only to collect but also to analyze data and produce reports on that analysis in a very short timeframe. We recognize the need for quick access to this information, but it must be balanced by providing BJS the opportunity to accurately and sufficiently analyze the data collected. Finally, the law authorizing BJS prohibits BJS from gathering data for any use other than statistical or research purposes. By requiring BJS to identify facilities "where the incidence of prison rape is significantly avoidable," the legislation calls for BJS to make judgments about what level of prison rape is "significantly avoidable". This responsibility goes beyond BJS's authorized statistical role.

[9] This grammatical insertion uses the wording of Henke's prepared statement, printed in the hearing record after the spoken remarks (U.S. House of Representatives, Committee on the Judiciary, 2003:16).

BJS Data Collections and Reports in Support of the Act

In response to the enactment of PREA, BJS organized a series of data collection efforts, summarized in Bureau of Justice Statistics (2004c), that have been characterized as "a quantum leap in methodology and our knowledge about the problem" of prison rape (Dumond, 2006). The main efforts in the PREA-related data collections are an annual administrative-records-based inventory dubbed the Survey of Sexual Violence (SSV) and a recurring National Inmate Survey program.

For the SSV, BJS contracted with the U.S. Census Bureau's Governments Division to collect records-based counts of reported incidents from federal and state prisons and a sample of local jails and private correctional facilities. Self-report personal interviewing contracts for the National Inmate Surveys were established with three separate contractors, corresponding to the specific populations and facility types envisioned by the act: RTI International (adult prisons and jails), Westat (juvenile facilities), and the National Opinion Research Center (soon-to-be-released and former prisoners). In all of these self-report options, BJS settled on the use of audio computer-assisted self-interviewing (ACASI) as the best means to obtain personally sensitive information such as that called for in the inmate survey of sexual victimization. Under ACASI methods, respondents complete a questionnaire on a computer, following instructions played through earphones; in this way, respondents do not have to divulge embarrassing or sensitive information directly to another person, facilitating more accurate responses. Particularly for the adult prison populations, backup strategies for collection were also developed, including forms for administration to inmates considered too dangerous to interact with survey staff.

Beck and Hughes (2005) issued the first report on SSV data on victimization incidents reported to correctional facilities, corresponding to data collected in 2004. New reports on the SSV for 2005 and 2006 have since been issued. At this writing, three reports from the National Inmate Surveys have been released: a December 2007 report (Beck and Harrison, 2007) described the results from interviewing at a sample of 146 state and federal prisons; a June 2008 report covered interview results at a sample of 282 local jails (Beck and Harrison, 2008); and a July 2008 report summarized results from interviews at juvenile correctional facilities (Beck et al., 2008).

Cognizant of BJS's legal reporting requirements, both the prison and jail releases from the National Inmate Surveys identified the names of institutions with high rates of offending; however, both have explicitly described an inability to identify the three highest-rate and two lowest-rate facilities as prescribed by the law. Table 5-1 reproduces the key table from Beck and Harrison (2007) on federal and state prisons, identifying 10 high-rate facilities. Noting the standard errors calculated for the estimates, the report carefully explains that, "statistically, the NIS is unable to identify the facility with the highest prevalence rate" or "provide an exact ranking for all facilities as required" under PREA "as a consequence of sampling error" (Beck and Harrison, 2007:3). The report is accompanied by spreadsheets tabulating facility-specific estimates and standard errors of reported sexual victimization for the full sample; the entries are presented alphabetically by state rather than in the strict ranking suggested by the text of the act. In the body of the report, BJS chose to tabulate the top 10 results.

Table 5-1 Prison Facilities with Highest and Lowest Prevalence of Sexual Victimization, National Inmate Survey, 2007

                                          Number of        Response   Percent of Inmates Reporting
                                                                      Sexual Victimization[a]
Facility Name                             Respondents[b]   Rate (%)   Weighted        Standard
                                                                      Percent[c]      Error[d]

U.S. total                                23,398           72          4.5            0.3

10 highest
Estelle Unit, TX                             197           84         15.7            2.6
Clements Unit, TX                            142           59         13.9            2.9
Tecumseh State Corr. Inst., NE                85           39         13.4            4.0
Charlotte Corr. Inst., FL                    163           73         12.1            2.7
Great Meadow Corr. Fac., NY                  144           62         11.3            2.7
Rockville Corr. Fac., IN[e]                  169           79         10.8            2.4
Valley State Prison for Women, CA[e]         181           78         10.3            2.3
Allred Unit, TX                              186           71          9.9            2.2
Mountain View Unit, TX[e]                    154           80          9.5            1.9
Coffield Unit, TX                            194           76          9.3            2.1

6 lowest[f]
Ironwood State Prison, CA                    141           60          0.0            —
Penitentiary of New Mexico, NM                83           38          0.0            —
Gates Corr. Ctr., NC                          52           74          0.0            —
Bennettsville-Camp, BOP                       77           69          0.0            —
Big Spring Corr. Inst., BOP                  155           66          0.0            —
Schuylkill Fed. Corr. Inst., BOP             174           70          0.0            —

[a] Percent of inmates reporting one or more incidents of sexual victimization involving another inmate or facility staff in past 12 months or since admission to the facility, if shorter.
[b] Number of respondents selected for the National Inmate Survey on sexual victimization.
[c] Weights were applied so that inmates who responded accurately reflected the entire population of each facility on selected characteristics, including age, gender, race, time served, and sentence length.
[d] Standard errors may be used to construct confidence intervals around the weighted survey estimates. For example, the 95% confidence interval around the total percent is 4.5% plus or minus 1.96 times 0.3% (or 3.9% to 5.1%).
[e] Female facility.
[f] Facilities in which no incidents of sexual victimization were reported by inmates.
NOTES: —, Not applicable. BOP, Bureau of Prisons.
SOURCE: Reproduced from Beck and Harrison (2007:Table 1).

In a ranked list by weighted percentage of sexual victimization incidents, the 11th-ranked facility (the Hays State Prison in Georgia) is the first whose difference from the highest-ranked facility (the Estelle Unit in Texas) is statistically significant (α = 0.05). Hence, the top 10 results constitute a group whose overall sexual violence victimization rates are high relative to others, even if they are not statistically distinguishable from each other.
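As a rough check on the comparison just described, the following sketch computes the implied test statistics, assuming independent estimates and a simple two-sample z-test on the weighted percentages; the report does not state the exact procedure used, so the form of the test is our assumption. The rank-11 rate and standard error are taken from the table in Box 5-2 below.

```python
# Compare the highest-ranked facility with the 10th- and 11th-ranked
# facilities, using the (rate %, standard error) pairs from Table 5-1
# and Box 5-2. The two-sample z-test form is an assumption.
from math import sqrt

estelle = (15.7, 2.6)   # rank 1:  Estelle Unit, TX
coffield = (9.3, 2.1)   # rank 10: Coffield Unit, TX
rank11 = (9.1, 1.9)     # rank 11: Hays State Prison, GA (per Box 5-2)

def z_stat(a, b):
    # z-statistic for the difference of two independent estimates
    return (a[0] - b[0]) / sqrt(a[1] ** 2 + b[1] ** 2)

print(round(z_stat(estelle, coffield), 2))  # 1.91 < 1.96: not significant
print(round(z_stat(estelle, rank11), 2))    # 2.05 > 1.96: significant at alpha = 0.05
```

Consistent with the text, the 10th-ranked facility cannot be distinguished from the highest-ranked facility at the 5 percent level under these assumptions, while the 11th-ranked facility can.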

Following the same logic, Beck and Harrison (2008) tabulated results for 18 high-rate local jails; in compliance with PREA requirements, Beck and Harrison (2008) also list sampled jails that declined to participate and permit interviewing in the survey. BJS's report on the survey administration in juvenile facilities (Beck et al., 2008) differs from the other reports in the series in that it does not attempt any tabular listing of specific facilities or ranking of highest-offense facilities, instead reporting summary statistics from the sample as a whole. However, the report does identify those juvenile facilities that declined to participate as well as those that reported no victimization incidents.

In addition to our panel's concern about the use of BJS data for regulatory or administrative uses, we are also critical of the procedures used for this part of reporting pursuant to PREA. Specifically, we are concerned that the approach greatly understates the variability inherent in the data; see Box 5-2.

Developments Following the First PREA Report Releases

Following the release of Beck and Harrison (2007), the Review Panel on Prison Rape, established by PREA within the Department of Justice (DOJ), held 7 days of hearings in March 2008, in Washington, DC, and Houston, Texas, to obtain testimony from each of the adult federal and state prisons identified in Table 5-1. The National Prison Rape Elimination Commission issued press releases on the occasion of the BJS report releases and the start of the Review Panel hearings. The Commission's June 25, 2008, release noted that:

    Even with margins of error, the study reveals that these facilities have extraordinarily high rates of sexual assault, highlighting the severity of this national problem. . . . We welcome BJS's stated willingness to adjust future surveys to gather additional information. We hope the agency will develop more questions about inmate reporting efforts, the response of officials and factors that may play into reporting, such as threats of retaliation.

Since the enactment of PREA, similar legislative calls for expanded data collection on inmate health conditions have been introduced in Congress but have not advanced beyond referral to committee. For instance, the proposed Justice for the Unprotected Against Sexually Transmitted Infections Among the Confined and Exposed (JUSTICE) Act, introduced as H.R. 178 in January 2007, would require an annual survey of correctional facilities. In addition to PREA-type queries on the incidence of sexual assault, the proposed data collection would require information on facility policy on testing for sexually transmitted diseases and data on test results that are sufficiently detailed to support disaggregation by disease type, race and ethnicity, age, and gender.

Box 5-2
Critique of the Reported Rankings in the Prison Rape Elimination Act Inmate Surveys

In its reports on the PREA inmate surveys in prisons and jails (Beck and Harrison, 2007, 2008), the Bureau of Justice Statistics (BJS) did what it could to convey the basic idea that the survey sample sizes are too small (and the underlying phenomenon being measured is sufficiently "rare") to preclude identification of high-rate facilities with the precision called for by the law. However, the panel observes that BJS's chosen approach is, itself, partly inaccurate in that it understates the variability inherent in the data.

As it stands, BJS has ranked the correctional facilities solely on the basis of sample-based estimates of prison rape rates. Each correctional facility among the top 10 is associated both with its name and with its ranking. This is not a fully valid approach given that many of these rates have large standard errors. The standard errors reflect the uncertainty due to observing only a portion of the prison population; the estimated rates could have differed if a different sample had been selected. This sampling uncertainty must be taken into consideration in developing such rankings. In other words, a facility's name is fixed, but its ranking is affected by the sampling error in the estimated rates.

There are simple procedures to account for such sampling uncertainty. For example, consider the 20 facilities with the highest rates and their standard errors. How fair is it to label only the top 10 as "bad" (call this top-10 group "Tier A") when the bottom 10 ("Tier B") could easily have been in Tier A purely by chance? This question can be answered using a simple bootstrap procedure that simulates what the estimated prison rape rates, and the resulting rankings, of these 20 facilities could have been purely by chance. We used 10,000 parametric bootstrap draws from the sampling distribution of the estimated overall prevalence rates for prison rape in Beck and Harrison (2007) and ranked them. We then computed the proportion of times a facility labeled Tier A in BJS's report would have been placed in Tier B in the simulation, and vice versa. The following table gives these estimated probabilities.

Rank   Tier   Prison Rape    Standard     Estimated Probability of
              Rate (%)       Error (%)    Being in the Other Tier (%)
 1      A       15.7           2.6            0.2
 2      A       13.9           2.9            3.6
 3      A       13.4           4.0           12.1
 4      A       12.1           2.7           12.3
 5      A       11.3           2.7           20.3
 6      A       10.8           2.4           23.7
 7      A       10.3           2.3           30.4
 8      A        9.9           2.2           37.2
 9      A        9.5           1.9           42.4
10      A        9.3           2.1           47.2
11      B        9.1           1.9           49.1
12      B        8.7           2.2           46.8
13      B        8.5           2.4           39.1
14      B        8.2           2.0           32.4
15      B        8.1           2.0           30.1
16      B        8.0           2.2           29.6
17      B        8.0           2.1           29.6
18      B        7.9           2.1           28.2
19      B        7.9           1.9           25.0
20      B        7.9           1.7           24.5

NOTES: Entries are bootstrap estimates of the probability of misclassification when tiers are assigned purely on the basis of the point estimates, ignoring the standard errors. For facilities ranked in Tier A, the entry is the estimated probability of being placed in Tier B, and vice versa.

The misclassification rates are disturbingly high, as is to be expected given the large standard errors, which are due in part to inadequate sample size. If error rates of 5 percent or more are not acceptable, then only the first two facilities stand out, in that they would have remained in the Tier A set with high probability had a different sample been obtained. For many facilities, the decision to label them Tier A or Tier B is quite arbitrary and made purely by chance. This amply illustrates the problems of using statistical data for regulatory purposes.
(continued)
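The tier-misclassification probabilities reported in the table above can be approximated with a short simulation. The sketch below is our own reconstruction of the calculation, not the panel's original program: it treats each published rate and standard error as the mean and standard deviation of a normal sampling distribution, draws 10,000 parametric bootstrap replicates, re-ranks the 20 facilities within each replicate, and tallies how often each facility lands in the opposite tier. Because the draws are random, the output only approximates the probabilities shown above.

```python
import numpy as np

rng = np.random.default_rng(seed=2007)

# Published estimates for the 20 highest-rate facilities (Beck and Harrison,
# 2007), in ranked order: positions 0-9 are "Tier A," positions 10-19 "Tier B."
rates = np.array([15.7, 13.9, 13.4, 12.1, 11.3, 10.8, 10.3, 9.9, 9.5, 9.3,
                  9.1, 8.7, 8.5, 8.2, 8.1, 8.0, 8.0, 7.9, 7.9, 7.9])
ses = np.array([2.6, 2.9, 4.0, 2.7, 2.7, 2.4, 2.3, 2.2, 1.9, 2.1,
                1.9, 2.2, 2.4, 2.0, 2.0, 2.2, 2.1, 2.1, 1.9, 1.7])

B = 10_000  # number of parametric bootstrap replicates

# Each row of `draws` is one simulated survey: a fresh draw of all 20 rates.
draws = rng.normal(loc=rates, scale=ses, size=(B, rates.size))

# Rank facilities within each replicate (rank 0 = highest simulated rate).
ranks = np.argsort(np.argsort(-draws, axis=1), axis=1)
in_tier_a = ranks < 10  # True where a facility falls in the top 10

for j in range(rates.size):
    tier = "A" if j < 10 else "B"
    # Share of replicates in which facility j falls in the *other* tier.
    crossed = (~in_tier_a[:, j]).mean() if j < 10 else in_tier_a[:, j].mean()
    print(f"rank {j + 1:2d} (Tier {tier}): rate {rates[j]:4.1f}, "
          f"SE {ses[j]:3.1f}, P(other tier) = {100 * crossed:4.1f}%")
```

A normal sampling model is the simplest defensible choice given only the published rates and standard errors; with access to the facility-level microdata, one could instead resample respondents directly.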

Box 5-2 (continued)

Alternatively, all possible comparisons between Tier A rates and Tier B rates may be considered simultaneously, and Bonferroni bounds may be used to conservatively determine significance levels for all possible pairwise comparisons. Such an approach has been used, in particular, in other applications where extreme ranks on some variable carry particular political sensitivity. For instance, the National Center for Education Statistics has developed such approaches for ranking states according to sample-based educational testing results, as described by Wainer (1996).

Assessment

Whatever the reasons for the change in reporting requirements, both the original and the final versions of the PREA bill violated the expected principles and practices of a federal statistical agency. BJS directly contributed to regulatory activities affecting individual data providers: explicitly singling out individual facilities to receive a summons to public hearings. Arguably, that direct summons to appear before the Review Panel is a somewhat lesser burden than a compulsory appearance before a congressional committee or the fuller National Prison Rape Elimination Commission. Nonetheless, the provision does explicitly direct the usage of data reported to BJS for nonstatistical purposes, a basic violation of the role of federal statistical data. The final language of the act exacerbated this violation by putting undue weight on point estimates of incidences of sexual assault—estimates of (ideally) a relatively low-probability phenomenon based on a small sample—without accounting for the inherent variability in the estimates.

BJS and its major constituencies and stakeholders—chief among them Congress and the administration—must be mindful of the extensive legal mandates placed on the agency and how they correspond to the resources provided to BJS; it is crucial that BJS not be assigned duties that violate fundamental principles for statistical agency conduct.

Finding 5.1: Under the terms of the Prison Rape Elimination Act of 2003, BJS was required to release the identity of selected responding institutions (i.e., facilities with the highest and lowest rates of sexual violence against inmates) for later regulatory action as part of a statistical program.

Recommendation 5.1: Congress and the Department of Justice should not require, and BJS should not provide, individually identified data in support of regulatory functions that compromise the independence of BJS or require BJS to violate any of the principles of a federal statistical agency.

To be sure, criticism of the reporting requirements of PREA should not be mistaken for criticism of the study of prison rape; the problem of sexual violence in correctional facilities is a valid and important one for inquiry. BJS's work in developing the suite of inmate prison rape surveys also had the benefit of pushing the agency to make major methodological improvements, relative to other federal surveys, in the use of techniques such as ACASI. However, it is also important to note that implementing PREA involves major opportunity costs to BJS, over and above the concerns about the regulatory flavor of the work. The separate and highly sensitive nature of PREA interviewing makes it infeasible for BJS to conduct its standard inmate interviewing programs at the same time. Further, it is still too early to assess whether the PREA interviewing has any chilling effect on response to BJS's conventional corrections data series; although one possible reason for a dampening effect on response to regular corrections series might be resentment at being "singled out" for inclusion in the PREA sample, another is simply the time and resource burden of brokering BJS access to inmates on a more frequent basis. To its credit, BJS has taken steps to convey PREA's requirements to individual facilities and has elicited comments and feedback from facilities and administrators, including participation in relevant professional association meetings and the conduct of stakeholder workshops.

PREA data providers face the risk of public display of their estimates in an active attempt at regulatory intervention. Within a short period of time, BJS will ask the same facilities for data for other purposes. The potential impact of BJS's participation in regulatory actions is that data providers may no longer trust that other data they report will not also be used in such a manner. Once data providers lose trust that cooperation with BJS will not lead to harmful actions against them individually, the agency faces serious problems.

5–A.2 Strong Position of Independence

A federal statistical agency must have a strong position of independence within the government. To be credible and unhindered in its mission to provide objective, useful, high-quality information, a statistical agency must not only be distinct from those parts of a department that carry out law enforcement and policy-making activities but also have a widely acknowledged position of independence. It must be able to execute its mission without being subject to pressures to advance a political agenda.

It must be impartial and avoid even the appearance that its collection, analysis, and reporting processes might be manipulated for political purposes or that individually identifiable data might be turned over for administrative, regulatory, or law enforcement purposes. (National Research Council, 2009:6)

The establishment and maintenance of an independent, objective, and credible voice is a central principle for statistical agency operations. To maintain that objectivity and credibility, a statistical agency is obliged to keep apart from the policy-making sphere of the executive branch; its products inform the development of policy, but they must not themselves be policy statements. Maintaining this arm's-length distance from policy development is particularly difficult for statistical agencies that are administratively housed with program agencies of the executive branch, whose purpose is the furtherance of specific objectives.

In recent years, administrative layering of statistical agencies has become a subtle, but increasingly common, threat to the position of independence of federal statistical agencies. Agencies are diminished in their perceived importance, their claim to budgetary resources, and their attention from departmental policy makers through placement further down in a department's organizational hierarchy. In 2002, the National Center for Education Statistics was redesignated by P.L. 107-279 as a unit of a new Institute of Education Sciences. In 2004, the National Center for Health Statistics—already administratively removed from the main Department of Health and Human Services by administrative placement in the Atlanta-based Centers for Disease Control and Prevention (CDC)—was placed under the further administrative layer of a "coordinating center," as part of a broader CDC reorganization. Also in 2004, the Bureau of Transportation Statistics was converted by P.L. 108-426 to become a unit under the new Research and Innovative Technology Administration, and its director was changed from a presidential appointee with Senate confirmation to a career appointee designated by the Secretary of Transportation. Of the current members of the Interagency Council on Statistical Policy (see Box 1-3), only the Bureau of Labor Statistics and the Energy Information Administration have direct reporting authority to their respective cabinet secretary or department head. BJS, through its administrative placement in OJP, is not the most heavily layered of statistical agencies, but it ranks among them.

In the panel's judgment, the principle of a strong position of independence of a statistical agency was seriously violated in BJS's recent past by the circumstances surrounding the release of data from the 2002 PPCS. This particular "flashpoint" in BJS's recent history centered on the wording of a press release to accompany the data release. In the balance of this section, we describe the path toward this breach in principle and the corrective measures that have since been taken.

OJP and Press Release Policy

In a 1991 U.S. House subcommittee hearing on criminal justice statistics, Acting BJS Director Joseph Bessette was asked about policy on the press releases accompanying new BJS data releases and, specifically, the role of the Justice Department in clearing those releases. Bessette answered that:

There has never been a case in my time [at BJS (5 years, at that point)], and people there tell me never before then as well, of the Department interfering in any way with the reports, with the accuracy, with the nature of the numbers, anything of that sort. So, in that respect, we have been functioning as a kind of semiautonomous statistical agency quite well. However, the BJS press releases—I use the term "BJS press releases," but, actually, they are Department of Justice press releases officially, and they have always gone to the Department for clearance. We draft them in BJS [and they] go up the chain of command for clearance, and that has been the case right along. So, in that respect, the policy hasn't changed.

Pressed further, he noted that "last year, for the first time, the Attorney General was quoted in a BJS press release commenting on the numbers and recommending public policy. That happened that one time; that has not happened since" (U.S. House of Representatives, Committee on the Judiciary, 1991:216).10

Over the next decade, the protocol for issuing BJS press releases evolved into the flow pattern illustrated in Figure 5-2. BJS staff would typically take the lead in developing the press release, in cooperation with the OJP Office of Communications. In all, the typical approval process required signoff from five noncareer appointees in DOJ and OJP (including the presidentially appointed BJS director). Figure 5-3 illustrates the general formatting of the standard notice and page posted to BJS's website upon the release of a new product, and Figure 5-4 shows the formatting of a formal press release, for one recent BJS product for which OJP and DOJ elected to issue a press release.

The 2002 Police-Public Contact Survey

In August 2005, the New York Times, followed by other media outlets, reported on a string of events over the previous 4 months that culminated in the removal of BJS Director Lawrence Greenfeld (Lichtblau, 2005a; Eggen, 2005; Sniffen, 2005). The removal was precipitated by disputes within the Justice Department over the statement of findings from the PPCS supplement to the NCVS.

10 In response, the questioner—then-Rep. Charles Schumer—commented before moving on to the next line of questioning: "I think it is a good idea to keep the two separate. The Attorney General should comment on policy but not in the statistical press releases" (U.S. House of Representatives, Committee on the Judiciary, 1991:216).

Figure 5-2 Review, approval, and dissemination process for BJS survey press releases, 2007 [flowchart not reproduced; it traces a BJS press release from preparation (BJS author/supervisor with OJP Office of Communications staff) through review and approval (the BJS director and a chain of OJP and DOJ officials, including the OJP assistant attorney general and the DOJ director of public affairs) to dissemination to Congress, media, and executive department press offices and posting to the BJS website]
NOTE: BJS, Bureau of Justice Statistics. DOJ, Department of Justice. OJP, Office of Justice Programs. White boxes indicate career employees; grey boxes denote noncareer appointees.
SOURCE: Adapted from U.S. Government Accountability Office (2007:Fig. 3), based on information from BJS and OJP.

As described in Section 3–C.4, the PPCS was first fielded on a pilot basis in 1996, followed by full-scale implementation in 1999, 2002, and 2005. The events of 2005 concerned the release of information from the 2002 administration of the supplement (Durose et al., 2005). As indicated in the abstract of the report on the BJS website (http://www.ojp.usdoj.gov/bjs/abstract/cpp02.htm), highlights [from the 2002 PPCS] include the following:

• About 25% of the 45.3 million persons with a face-to-face contact indicated the reason for the contact was to report a crime or other problem.
• In 2002 about 1.3 million residents age 16 or older—2.9% of the 45.3 million persons with contact—were arrested by police.
• The likelihood of being stopped by police in 2002 did not differ significantly between white (8.7%), black (9.1%), and Hispanic (8.6%) drivers.
• During the traffic stop, police were more likely to carry out some type of search on a black (10.2%) or Hispanic (11.4%) than a white (3.5%).

Figure 5-3 Example summary and links to report and data on Bureau of Justice Statistics website [screenshot not reproduced]

Figure 5-4 Excerpt from example Office of Justice Programs press release accompanying new Bureau of Justice Statistics data release [excerpt not reproduced]

After BJS staff developed a press release, the draft release was forwarded to OJP and then-Assistant Attorney General Tracy Henke. It was the inclusion of the last highlighted point—the finding of disparate levels of search (and related findings on use of force) by race and ethnicity—that led to a dispute. As Lichtblau (2005a) recounts,

The planned announcement noted that the rate at which whites, blacks and Hispanics were stopped was "about the same," and that finding was left intact by Ms. Henke's office, according to a copy of the draft obtained by The New York Times. But the references in the draft to higher rates of searches and use of force for blacks and Hispanics were crossed out by hand, with a notation in the margin that read, "Do we need this?" A note affixed to the edited draft, which the officials said was written by Ms. Henke, read "Make the changes," and it was signed "Tracy." That led to a fierce dispute after Mr. Greenfeld refused to delete the references, officials said. . . . Mr. Greenfeld refused to delete the racial references, arguing to his supervisors that the omissions would make the public announcement incomplete and misleading.

The report was publicly released—posted to the agency's website and disseminated through usual means—but without any accompanying news release or publicity. This decision "all but assured [that] the report would get lost amid the avalanche of studies issued by the government"; indeed, "a computer search of news articles [in August 2005] found no mentions of the study" (Lichtblau, 2005a). However, the study—and the dispute over the press release—garnered considerable press attention after the New York Times story on the circumstances surrounding Greenfeld's dismissal as BJS director.

In the wake of these incidents, the U.S. Government Accountability Office (GAO) initiated a review of the conduct of the various administrations of the PPCS and the release of those data. Responding to a draft report, Assistant Attorney General Regina Schofield asserted that some of GAO's findings of interference were erroneous because they were "predicated on GAO's assumption that a press release is a statistical product." However, she continued (quoted in U.S. Government Accountability Office, 2007:48):

We respectfully disagree with GAO's assumption. A press release simply is not a statistical product and thus should not be treated as a statistical product at all—let alone one that is somehow covered by the [CNSTAT guidelines in Principles and Practices for a Federal Statistical Agency]. A press release, rather, is a public relations announcement issued to encourage media coverage. The mere presence of statistics in a press release does not transform a press release into a statistical product.

Combining this argument with the legally nonbinding nature of the Principles and Practices, Schofield concluded (quoted in U.S. Government Accountability Office, 2007:50):

By statute, 42 U.S.C. § 3732(b), the Director of BJS "shall be responsible for the integrity of data and statistics." In the exercise of such authority, he may elect to follow the NRC guidelines, but he is not and cannot be legally bound to do so, in the absence of some supervening statute. [Thus,] even if the [CNSTAT] written guidelines did apply to press releases (and they do not), the Director would and does decline, in the exercise of his statutory authority, to apply them to BJS press releases.

In response, the GAO stood by its assumption, in large part for the simple reason that "the Police-Public Contact Survey press release was made up almost entirely of survey statistics, indicating to us that it was a statistical product" and that "the content of the press release was a more important determinant than the label attached to it" (U.S. Government Accountability Office, 2007:23–24). The GAO observed that "the role that certain noncareer appointees outside BJS have the ability to play, pursuant to Department of Justice policy, in the product issuance process" means that "BJS was not in a position to fully follow all guidelines related to agency independence," thus creating the potential for "future actual or perceived political interference" in BJS product releases (U.S. Government Accountability Office, 2007:14).11

Later, but too late to affect the DOJ actions, the U.S. Office of Management and Budget (OMB) issued formal guidance in early 2008 to clarify the gray-area dispute as to whether a press release constitutes a statistical product. In the March 7, 2008, Federal Register, OMB published Statistical Policy Directive 4 on the release and dissemination of products from the federal statistical agencies. Defining a "statistical press release" as one of the product types covered by the directive, OMB "encouraged" agencies to issue press releases to accompany the issuance of new data and reports. The directive does not speak directly to the issue of administrative review of the content of such press releases, advising only that:

to maintain a clear distinction between statistical data and policy interpretations of such data, the statistical press release must be produced and issued by the statistical agency and must provide a policy-neutral description of the data; it must not include policy pronouncements.

11 Reacting, most likely, to the GAO report, U.S. House appropriators issued an even stronger statement in the explanatory statement accompanying the fiscal year 2008 Commerce, Justice, and Science appropriations bill (H.Rept. 110-240):

Ensuring objective BJS studies—The Committee directs that any statistical studies undertaken by the Bureau of Justice Statistics, as well as press releases describing the results of these studies, shall be publicly released by the Bureau without alteration or clearance by persons outside of the Bureau.

However, this provision was not repeated in the explanatory statement for the consolidated appropriations act that eventually funded BJS and DOJ.

The issuance of this guidance appears to have improved the release process for BJS products in recent months, even though the guidance emphasizes the need to "coordinate with public affairs officials from the parent organization" in those "cases in which the statistical unit currently relies on the parent agency for the public affairs function."

Aftermath

In 2007, when data from the 2005 administration of the supplement were made available, the report (Durose et al., 2007) was accompanied by a press release (http://www.ojp.usdoj.gov/bjs/pub/press/cpp05pr.htm). Entitled "Police Stop White, Black, and Hispanic Drivers at Similar Rates According to Department of Justice Report," the release observed:

The 2002 and 2005 surveys found that white, blacks and Hispanics were stopped at similar rates. . . . In both 2002 and 2005 police searched about 5 percent of stopped drivers. . . . While the survey found that black and Hispanic drivers were more likely than whites to be searched, such racial disparities do not necessarily demonstrate that police treat people differently based on race or other demographic characteristics. This study did not take into account other factors that might explain these disparities.

This press release—like others issued in recent years, even subsequent to the March 2008 OMB guidance—was issued on OJP letterhead.

Immediately following the dispute over the 2002 PPCS press release, BJS Director Greenfeld resigned. As the narrative description by Lichtblau (2005a) continues:

Amid the debate over the traffic stop study, Mr. Greenfeld was called to the office of Robert D. McCallum Jr., then the third-ranking Justice Department official, and questioned about his handling of the matter, people involved in the episode said. Some weeks later, he was called to the White House, where personnel officials told him he was being replaced as director and was urged to resign, six months before he was scheduled to retire with full pension benefits, the officials said. After Mr. Greenfeld invoked his right as a former senior executive to move to a lesser position, the administration agreed to allow him to seek another job, and he is likely to be detailed to the Bureau of Prisons, the officials said.

After the appearance of the Times article, numerous newspapers ran editorials critical of Greenfeld's departure (see, e.g., Hartford Courant, 2005; Houston Chronicle, 2005; Joiner, 2005; Love, 2005; Miami Herald, 2005; Tennessee Tribune, 2005). Although some members of Congress called for Greenfeld to be reinstated (Lichtblau, 2005b), no such reversal was made.12

12 At the time of his dismissal, Greenfeld authored a farewell letter to members of the Justice Research and Statistics Association (JRSA) that noted a positive aspect of the flare-up over press release language. "There is a good reason that more than 20,000 people a day turn to BJS for information on crime and the administration of justice; there is a good reason that no Congressional bill on crime and justice ever ignores our data on a subject and that we are repeatedly asked to gather even more data; there is a good reason that hundreds of thousands of newspaper and electronic media citations and numerous court decisions refer to BJS findings; there is a good reason that Office of Management and Budget regards our activities as the 'most effective' in all of the Department of Justice; and finally, there is a good reason that so many have expressed such concern about a few lines in a BJS press release, evidence of the importance of what we say" (Greenfeld, 2005).

The wave of publicity concerning these events reinforced the perception that BJS's position of independence had been threatened.

Assessment

One immediate recommendation that is appropriate in light of the PPCS incident is to express formal concurrence with the OMB guidance that eventually followed. The press release associated with a new statistical series or the latest release of data from a continuing series is, properly, a statistical product. Taking care always to be policy-neutral, the press release is a statistical agency's first chance (and sometimes its only and best chance) to highlight findings from the data, to flag any methodological concerns that the new data may raise, and to promote accurate reporting and publicity of new results. Accordingly, press releases should share the same protections from interference as other BJS reports and releases.

Finding 5.2: The appearance of political interference in release of statistical information undermines public trust in that information and in the entire agency.

Recommendation 5.2: The Department of Justice review of any BJS statistical product and related communications should not require changes to the content, the release schedule, or the mode of dissemination planned by BJS.

The promulgation of the OMB guidance solves, or at least ameliorates, the immediate cause of this most glaring violation of BJS's position of independence, but a larger problem remains. Independence is an ever-present tension that exists when a statistical agency is administratively nested in a program agency, as BJS is within OJP. The OMB guidance is a useful safeguard but, by its nature and the nature of the decentralized statistical system, it is necessarily somewhat passive and advisory. That is, its successful implementation in BJS's case hinges on the compliance and goodwill of the leadership of BJS, OJP, and the broader DOJ to ensure that boundaries are not blurred. Though it is very welcome, the guidance makes no specific reference to the circumstances that befell BJS concerning the 2002 PPCS release and, accordingly, falls short of a forceful statement by the statistical system

that OJP's intervention in the PPCS press release violated the basic practices of an official statistical system.

The panel concludes that the current organizational arrangement, under which BJS is administratively housed in a program agency (OJP) and its director serves at the pleasure of the president, is a continuing and pressing threat to BJS's position of independence as a provider of objective statistical information. It is critically important that, whatever organizational structures or reporting requirements may apply, BJS function independently and be allowed to function independently. We also recognize that there exists no organizational arrangement that—on its own—can completely shield a statistical agency from threats to its independence and guarantee freedom from political or structural interference (or the appearance thereof). However, in our assessment, the continuing threat to BJS's independence is sufficiently dire—and the past violations sufficiently severe—as to warrant what we believe to be the strongest possible corrective actions and deterrents to incursions on independent functioning: moving BJS out of OJP and fixing the term of service of the BJS director.

BJS and the Office of Justice Programs  BJS's functions are unique in its parent branch, OJP, with respect to both mission and technical requirements. Since grantmaking overwhelmingly drives the OJP organization and service-delivery infrastructure, OJP is ill-suited to address the needs of BJS to produce data and statistical reports and provides minimal support for carrying out these functions (although it does, with contributions from BJS and other OJP bureaus, operate the National Criminal Justice Reference Service for dissemination of BJS results). BJS's administrative placement within OJP is doubly a hindrance on BJS's effective function as the principal data-gathering unit within the Justice Department: first, by putting it into competition for funds and resources with popular grantmaking functions that provide assistance to state and local law enforcement and, second, by diminishing BJS's position within the Department. Other Justice Department divisions perform fairly major statistical and data collection functions—among them, the Civil Rights Division, the Justice Management Division, the Executive Office for U.S. Attorneys, and the Federal Bureau of Investigation (FBI). These units utilize statistical analysis for performance measurement, examination of voting issues, review of discrimination concerns, and so forth—major issues in which BJS's ability to offer advice or coordination is impaired by BJS's positioning within the department.

Although the historical reason for BJS being positioned within OJP is fairly clear, inheriting as both entities do from the Law Enforcement Assistance Administration (LEAA), the administrative positioning raises technical and practical concerns. The basic purpose of OJP is to promote certain activities, strategies, or interventions related to crime, primarily through financial assistance to state and local authorities. Statistics should serve as an independent way of assessing those practices by measuring whether crime problems are worsening or improving; that statistical activities are under the direction and funding of OJP creates the appearance and, at times, the reality of conflict and questionable integrity.

Moreover, BJS's placement within OJP forces it to compete for resources with grant monies that are popular with and coveted by local authorities and congressional representatives alike. In terms of budget, the Justice Department tends to view BJS as a small line entry in an overall OJP appropriation. The general process is such that OJP is budgeted or appropriated at a certain funding level and largely makes the internal distribution among component agencies; it is the assistant attorney general for OJP, and not the director of BJS, who is permitted to testify before congressional appropriations committees. Put into head-to-head competition with grant programs to "put cops on the street" or fund crime assistance programs, sustaining the growing costs of BJS statistical programs becomes a lower-order concern. Worse, in recent years, OJP has taken steps to make explicit BJS's subservience within a larger OJP appropriation, undercutting BJS's presence as even a simple line item in annual spending bills. In the 2003 House appropriations subcommittee report for the fiscal year 2004 Commerce, Justice, and Science spending bill, appropriators took note of a change in the budget request received from the Justice Department (H.Rept. 108-221, p. 36):

The fiscal year 2004 budget request proposed merging all programs administered by the Office of Justice Programs (OJP) under the Justice Assistance heading. The Committee recommendation retains the account structure used in previous years and funds State and local law enforcement programs under seven appropriation accounts.

House-Senate conferees on the final consolidated appropriations bill for fiscal year 2004 also noted that they "do not adopt the Administration's proposal to consolidate all [OJP] activities" under the single "Justice Assistance" heading (H.Rept. 108-401, p. 533). Similar attempts to consolidate accounts were noted by House appropriators in the fiscal year 2005 submission (H.Rept. 108-576, pp. 33–34; H.Rept. 108-792, p. 738). The attempt to consolidate OJP funding into a single pool has continued in each subsequent year, including submissions for fiscal year 2009 (e.g., Senate appropriators commented that "the Committee again rejects the Department's proposed merger of all OJP programs under this heading and instead has maintained the [previous] account structure"; S.Rept. 110-397, p. 64).

The problem of BJS funding as it is currently situated within OJP is analogous to problems encountered in other governmental programs where new initiatives often receive greater attention than existing responsibilities—fixing potholes often takes a back seat to more glamorous new construction projects. In the case of BJS, after passage of the multibillion-dollar Violent Crime Control and Law Enforcement Act of 1994, OJP funding expanded greatly and external grants flowed freely, yet BJS received no enhancements to its appropriated funding and, indeed, had difficulty even securing additional funding to cover cost-of-living adjustment increases payable to the Census Bureau for data collection. On one hand, statistical data collection activities should be seen as long-term activities requiring predictable funding so that they may be carried out on recurring schedules. On the other hand, the grant programs of the larger OJP have impermanence in both mission and appropriations; BJS's base function is jeopardized by being tied to an administrative parent whose resources can rise or fall dramatically and whose local-assistance grants are more popular funding targets than continuing statistical activities.

Statistical analysis and research have also been strikingly undervalued by OJP, as evidenced by attempts to "outsource" most BJS staff positions and functions. In August 2002, OJP was said to have issued a directive stating that jobs within BJS would be turned over for competitive bid to the private sector (Butterfield, 2002). Under the terms of the Federal Activities Inventory Reform (FAIR) Act, positions within a government agency must be characterized as either "commercial" or "inherently governmental"; in late 2002, OMB was in the process of revising its Circular A-76 to more directly require that those positions classified as "commercial" be opened to competitive bid with private-sector companies. As described by the Consortium of Social Science Associations (2002:5) in February 2003, the FAIR Act inventory developed by OJP classified 51 out of 57 positions as "commercial" and thus designated for outsourcing. Several statistician positions within BJS were classified in the inventory as being "grants monitoring and evaluation"; 20 of 23 jobs labeled "statistical analysis" and 18 of 20 "grants monitoring" positions were labeled commercial. This classification drew protest from several social science organizations, including the American Society of Criminology, whose executive board passed a resolution in November 2002 arguing that "the compilation, analysis, interpretation, reporting, monitoring, and management of crime and justice statistics . . . are inherently governmental functions" (http://www.asc41.com/boardmin.annual022.htm).

This outsourcing effort was blocked by congressional appropriators: in explaining the fiscal year 2003 omnibus appropriations bill, House-Senate conferees insisted that the appropriations committees "must be assured that effectiveness is improved and savings are attained" through the OJP outsourcing plan before proceeding with changes (H.Rept. 108-10, p. 635),

a provision repeated by House appropriators the following year (H.Rept. 108-221, p. 40). In the fiscal year 2006 appropriations round, House-Senate conferees specifically directed that "any action taken by OJP relating to [OMB's] Circular A-76 shall be subject to" a general provision requiring advance notice and special justification to Congress for program changes that, among other conditions, would reduce the personnel of an agency by 10 percent or more (H.Rept. 109-272, pp. 46, 86). However, implementation of outsourcing is still possible, and it would still be damaging to BJS. The Justice Department's most recent publicly posted FAIR Act inventory listed commercial and inherently governmental activities for 2007 (http://www.usdoj.gov/jmd/pe/preface.htm); this roster lists 32 of 57 BJS positions (and 20 of 33 "statistical analysis" positions) as commercial, with the reason for classification as commercial listed as "pending an agency approved restructuring decision (e.g., closure, realignment)." In our assessment, the collection and analysis of statistical data by federal statistical agencies is an essential government function; that OJP has not more fully realized this point suggests a continued incompatibility of functions between BJS and its administrative parent.

Exacerbating this mismatch in functions between OJP as a program agency and BJS as a statistical agency, two threads of legislative text that have developed since the late 1990s have suggested attempts to tether BJS closer to OJP objectives and diminish BJS's functional independence. Both of these threads have involved wording changes that may appear short and subtle but have great meaning, and both require some detailed attention to legislative history to be fully understood.

The first of these threads began in 1997, when House appropriators expressed concern that "the current structure of administration of grants within [OJP] produces a fragmented and possibly duplicative approach to disseminating information to State and local agencies on law enforcement programs and developing coordinated law enforcement strategies." Noting a 213 percent growth in overall OJP grant program funding since 1995, the appropriators directed the assistant attorney general (AAG) for OJP to prepare a report recommending actions "that will ensure coordination and reduce the possibility of duplication and overlap among the various OJP divisions" (H.Rept. 105-207, pp. 43–44). This language was preserved in the House-Senate conference on the fiscal year 1998 spending bill (H.Rept. 105-405) that became law. The AAG issued the requested report in January 1998; on the basis of the report, House and Senate appropriations conferees inserted a provision into the fiscal year 1999 omnibus spending act asserting an oversight role for the AAG in finalizing grants (Congressional Record, October 19, 1998, p. H11310). Specifically, the final act read (P.L. 105-277; 112 Stat. 2681-67; compressing a first clause that gives the AAG grantmaking authority):

Notwithstanding any other provision of law, during fiscal year 1999, the Assistant Attorney General for the Office of Justice Programs of the Department of Justice [shall] have final authority over all grants, cooperative agreements, and contracts made, or entered into, for the Office of Justice Programs and the component organizations of that Office.

Though it left intact language from BJS's creation in 1979 giving the BJS director "final authority for all grants, cooperative agreements, and contracts awarded by the Bureau" (93 Stat. 1176; 42 USC § 3732(b)), this provision made OJP's "final authority" for grants primary to BJS's "final authority"—albeit only for fiscal year 1999.

BJS briefly won exemption from this provision when new appropriations language changed the effective date from fiscal year 1999 to 2000 but added a caveat that the AAG's final authority did not apply to grants made under certain sections of law (113 Stat. 1501A-20), including the section asserting BJS's "final authority" for its own grants. In 2000, appropriations language made no further changes to the text but indicated that it "shall apply hereafter" (114 Stat. 2762A-68), which led to the language being codified as 42 USC § 3715. However, section 614 of the 2001 USA PATRIOT Act (P.L. 107-56; 115 Stat. 370) made two critical changes:

• By adding three words, the revised law gave the AAG "final authority over all functions, including any grants" (emphasis added), a much wider sweep of authority over BJS and other OJP-component offices.
• The revised law amended "component organizations of that Office" to read "component organizations of that Office (including, notwithstanding any contrary provision of law (unless the same should expressly refer to this section), any organization that administers any program established in title 1 of Public Law 90-351)"—a rather convoluted way of making explicit that OJP's "final authority" supersedes BJS's (which still exists, albeit as an "other provision of law").

One year later, in September 2002, this perceived takeover of BJS authority was exacerbated by one final small but sweeping change included in reauthorization language for the Department of Justice. Reference to the AAG was stricken and the text amended to read that, "during any fiscal year, the Attorney General" shall have final authority—asserting strong Justice Department control over BJS and other OJP offices (P.L. 107-273; 116 Stat. 1778).

The second legal thread deals with a clause in the enumerated powers of the AAG. The Justice System Improvement Act of 1979 that created BJS also created OJP's predecessor, the Office of Justice Assistance, Research, and Statistics (OJARS), but did so in an interesting way: defining the LEAA, BJS, and the National Institute of Justice (NIJ) up front in parts A–C but only specifying OJARS in a catch-all Part H on "Administrative Provisions."

Specifically, section 802(b) of the Act (93 Stat. 1201) directed that (emphasis added):

The Office of Justice Assistance, Research, and Statistics shall directly provide staff support to, and coordinate the activities of, the National Institute of Justice, the Bureau of Justice Statistics, and the Law Enforcement Assistance Administration.

The Justice Assistance Act of 1984 substantially rewrote and reorganized the existing law, creating OJP in its current form and pointedly giving it primacy by defining it in Part A (where the LEAA was previously defined). The 1984 act also made explicit that "the Director [of BJS] shall report to the Attorney General through the [AAG]" (98 Stat. 2079). In place of the above-quoted 1979 language, the 1984 act specified duties of the AAG including (98 Stat. 2078; emphasis added):

The Assistant Attorney General shall . . . (5) provide staff support to coordinate the activities of the Office and the Bureau of Justice Assistance, the National Institute of Justice, the Bureau of Justice Statistics, and the Office of Juvenile Justice and Delinquency Prevention. . . .

The Homeland Security Act of 2002 (P.L. 107-296; 116 Stat. 2162) made a small but telling change to point (5), simply inserting the words "coordinate and" at the beginning to give the phrase its current form (42 USC § 3712(a), emphasis added):

The Assistant Attorney General shall . . . (5) coordinate and provide staff support to coordinate the activities of the Office [and the] Bureau of Justice Statistics. . . .

In isolation, these legislative changes might appear to be relatively innocuous. In terms of strict legislative text, the 2002 Homeland Security Act's provision did nothing but restore a "coordination" function held by OJARS at its (and BJS's) founding in 1979—at which point it was arguably a worse situation for BJS, given OJARS's more weakly defined position. However, in context and in combination, the changes convey an intent by OJP to take a more heavy-handed role in BJS activities. A press account at the height of this legislative activity in 2002 noted a statement by then-AAG Deborah Daniels, suggesting that stronger OJP control over BJS and NIJ was desirable in order to ensure that DOJ "speaks with one voice" on crime and justice issues (Butterfield, 2002:33). This rationale is antithetical to the position of independence that statistical and research agencies must have in order to be most effective; statistical agencies must have the latitude to release findings that run counter to the policy of their parent departments, if those findings are borne out by the data. Consequently, taken together, OJP's legislative assertion of "final authority" over BJS functions and its intent to "coordinate" BJS activities constitute dangerous infringements of BJS's proper function.

Conceptually, the current organizational structure, under which BJS is housed within OJP along with other research and subject-matter bureaus,

does have certain advantages. If heavy-handed "coordination" gave way to real synergy—full collaboration between BJS and sister bureaus such as the Office of Juvenile Justice and Delinquency Prevention or the Office for Victims of Crime—BJS data and analysis could meaningfully inform OJP policy development. Likewise, in such a truly synergistic environment, the AAG for OJP could provide strong and visible advocacy for BJS concerns. However, we believe that such an effective and beneficial implementation of the status quo organizational arrangement hinges critically on the priorities and temperaments of the AAG and other top officials in the Justice Department and on the strength of the BJS director to function independently. In our assessment, the inherent conflicts between the priorities of a program office such as OJP and a statistical agency such as BJS—and the too-fine line between synergistic work by OJP offices and attempts to make those offices "speak with one voice"—make the status quo untenable in the long run. On the basis of these arguments, we conclude that BJS's administrative placement in OJP is detrimental:

Finding 5.3: The placement of BJS within the Office of Justice Programs has harmed the agency's ability to innovate in data collections and expand the efficiency of achieving its statistical mission. It suffers from a zero-sum game in competition with programs of direct financial benefit to states and localities.

In the panel's assessment, a BJS that is better established as an independent structure within the DOJ infrastructure would have an enhanced ability to support and sustain statistical programs. We also expect that a higher-placed BJS—ideally as a direct report to the attorney general or the deputy attorney general—would have a powerful effect on the timeliness of information released by BJS, because it would be called upon to provide more contemporaneous information to the highest levels in the department. Such an administrative move would make clear the permanence of data-gathering functions and the need to use the resulting information in policy development and review; it would also provide a clear separation from competing interests that wish to advocate for certain programs or initiatives. In terms of data collection, a more prominent and higher-profile BJS would also be helpful in dealing with balky or resistant data suppliers. To be sure, administrative attachment of BJS to the office of the attorney general runs the risk of politicization—far from the intended effect. However, in our judgment, such a high-level attachment would afford BJS the most prominence and stature and, hence, be the strongest corrective remedy for past breaches of BJS's independence. Accordingly, we recommend:

Recommendation 5.3: BJS should be administratively moved out of the Office of Justice Programs, reporting to the attorney general or deputy attorney general.

It follows that this administrative change involves removing the legislative language asserting a strong OJP oversight role over BJS functions. To this general recommendation, we add two corollaries:

• In forgoing ties to OJP, it is important for BJS to retain the capacity for letting contracts. In particular, it is vital that BJS retain full ability to administer grants such as those that maintain the state Statistical Analysis Center (SAC) network and that support development of and improvement to source criminal justice databases, as described in Chapter 4.

• The problems faced by BJS in its administrative nesting within a program agency are similar to those faced by some other OJP units, notably NIJ: a research agency embedded within a program agency. In November 2008, John Jay College of Criminal Justice president and former NIJ director Jeremy Travis issued an open letter to the membership of the American Society of Criminology urging the creation of an Office of Justice Research within DOJ. This new office would include BJS and NIJ, elevating NIJ's Office of Science and Technology to become the National Institute of Justice Technology; all three agencies would report to an assistant attorney general for justice research, appointed by the president with Senate confirmation. Relevant excerpts from this letter are shown in Box 5-3. Determining the administrative placement of NIJ is beyond this panel's scope; a parallel National Research Council panel is currently evaluating NIJ's research program, and NIJ's structure is more the province of that panel. However, we note that an approach by which both BJS and NIJ report to an assistant attorney general for research is certainly consistent with our own recommendation; our guidance in this report is intended to speak to a choice between BJS remaining in OJP versus moving out of OJP, and the Travis proposal would also achieve the result we think is best for BJS. A separate office including both a research agency and a statistical agency would also be uniquely poised to develop research programs on justice-related issues that have received relatively little rigorous empirical treatment, such as the extent to which forensic evidence (e.g., fingerprints or firearm-related toolmarks) is introduced in judicial proceedings (and the effectiveness of that evidence) or the perceived fairness of court verdicts.

Term of Appointment of BJS Director  To provide an added measure of insularity, the panel further concludes that BJS would benefit from the designation of the BJS directorship as a fixed-term appointment by the president, with the advice and consent of the Senate.

Box 5-3
Excerpts from Travis (2008) Open Letter on an Office of Justice Research

I propose that the Congress create, with support from the new Administration, a new office in the Department of Justice, called the Office of Justice Research, to be headed by an Assistant Attorney General for Justice Research. This office would be separate from the Office of Justice Programs, which would continue to administer the funding programs that support reform efforts by state and local law enforcement and criminal justice agencies. . . .

The argument for creation of the new Office of Justice Research, separate from the Office of Justice Programs, is very straightforward: if the research, statistics, and scientific development functions of the federal government are located within an office that is primarily responsible for the administration of assistance programs, three risks are created. First, the scientific integrity of the research functions is vulnerable to compromise. Second, the research and development function will never be given the priority treatment that is needed to meet the enormous crime challenges facing the country. Third, the research agenda on crime and justice will more likely reflect short-term programmatic needs rather than the long-term need to develop a better understanding of the phenomenon of crime in America and the best ways to prevent and respond to crime. . . .

[As part of this new office,] the Bureau of Justice Statistics would continue all of the functions currently carried out by BJS. [But] the current constellation of data collection systems on crime and justice are fragmented and incomplete. To remedy this situation—and to provide the nation the capability to track crime trends in a timely manner—the mandate of BJS should be expanded significantly. First, BJS should be authorized to work closely with the Federal Bureau of Investigation to improve the timeliness and completeness of the Uniform Crime Reports. Similarly, responsibility for the ADAM program [(see Section 2–C.4)] should be transferred from ONDCP (it was originally housed at NIJ), and responsibility for the statistical series on juvenile justice should be transferred from the Office of Juvenile Justice and Delinquency Prevention (a component of OJP). But the new BJS would be more than a manager of existing statistical series. It should also develop new initiatives to track crime trends, drawing on capabilities of police departments that now post crime trends close to real time. It would develop new protocols for tracking critical crime issues, such as the level of illegal drug selling activity, public confidence in the criminal justice system, the operations of the federal law enforcement agencies, etc. This expanded portfolio would clearly require additional funding, but there are compelling arguments for creating a robust national capacity to improve our understanding of crime trends. . . .

If we were designing a federal research and development capacity on crime and justice today, we would probably not propose the current structure that houses NIJ and BJS within the Office of Justice Programs, three levels below the Attorney General, with a focus on state and local criminal justice. Rather, we would create a scientific branch of government that operates under scientific principles reporting directly to the Attorney General.
We would recognize that crime is now a transnational phenomenon and we need to understand human trafficking, drug smuggling, immigration trends and terrorism. We would examine the many systems of justice—civil justice, immigration courts, the federal justice system, in addition to state and local justice systems. We would develop a modern capacity to understand local crime conditions using high-tech surveys. We would develop creative ways to measure

non-traditional crimes, such as identity theft, corporate and white collar crime, and transnational crime. We would design a research and development program that would harness the power of technology so the agencies that enforce the law can benefit from the scientific and technological revolution. This ambitious agenda clearly requires additional resources. But it also requires a new structure within the Department of Justice, a structure that guarantees both scientific integrity and policy relevance.

SOURCE: Excerpted from Travis (2008:1, 4, 5); emphasis in the original.

Finding 5.4: Under current law, the director of the Bureau of Justice Statistics serves at the pleasure of the president; the director is nominated to an unspecified term by the president, with the advice and consent of the Senate (42 USC § 3732(b)).

It is worth noting that fixed-term appointments are relatively rare in the federal statistical system. Currently, only two of the nation's principal statistical agencies—the Bureau of Labor Statistics and the National Center for Education Statistics—have heads who are appointed and confirmed to fixed terms of 4 and 6 years, respectively (29 USC § 3 and 20 USC § 9517(b)).13 The heads of BJS, the Census Bureau, and the Energy Information Administration are appointees (with Senate confirmation) who serve at the pleasure of the president; the other nine heads of Interagency Council on Statistical Policy member organizations are career employees and departmental appointees. Bills to create a termed appointment for the director of the Census Bureau have been introduced, but not enacted, in recent Congresses—most recently, one that would fix the term at 5 years (at the same time that it would remove the Census Bureau from the Department of Commerce and establish it as an independent executive agency).14

The range of models for the term of appointment of a BJS director can be expressed simply:

• Presidential appointment with Senate confirmation, at pleasure (the status quo);
• Presidential appointment with Senate confirmation, fixed term; and
• Career employee, appointed by the president, cabinet secretary, or other official.

13 Ironically, the same legislation that positioned the National Center for Education Statistics under a new administrative layer—the Institute of Education Sciences—also extended the length of the fixed term for the commissioner of education statistics. Prior to 2003, commissioners served a 4-year term rather than a 6-year term.

14 See H.R. 7069, introduced by Rep. Carolyn Maloney (D-N.Y.) on September 25, 2008, in the 110th Congress.

In the right environment—with a strong and well-defined position of independence and the latitude for innovation—the career employee directorship is an attractive option that has the added advantage of ensuring that a director is well versed in the agency’s existing work and subject-matter domain. Indeed, among BJS’s fellow statistical agencies, career employee appointments such as the directorship of the Bureau of Economic Analysis rank among the most effective leadership models. However, as we described in arguing for an administrative move out of OJP, BJS does not enjoy such an environment. We view a presidential appointment with Senate confirmation as a necessity for the BJS directorship, carrying with it the stature to interact effectively with the appointees at the top ranks of the Justice Department.

The events of 2005 demonstrated that BJS can be and has been harmed by the current arrangement by which the BJS director serves strictly at the pleasure of the administration. The circumstances of Director Greenfeld’s dismissal—in the immediate aftermath of refusing to alter a press release to address political concerns—fostered the appearance of formal and structural interference in BJS’s operations. In our assessment, a fixed-term appointment for the BJS directorship would be the best and strongest palliative measure to put some distance between BJS and its political superiors in the Justice Department (whether BJS remains in OJP or not). The model of the directorship of the Bureau of Labor Statistics is the one that we find most compelling for BJS: In our judgment, it makes sense for the federal officer directly tasked with reporting key indicators of social justice in America to have stature, political insularity, and a term of service commensurate with the federal officer directly responsible for reporting key economic indicators such as unemployment and job growth.15 The director of BJS must have the capability to objectively report both good news and bad news—to provide information on crime and justice in the United States, even when the findings are politically inconvenient or unappealing. We believe that a presidential appointment with confirmation provides the appropriate stature for such a position, and that the specification of a fixed term of service prevents the kinds of attempted interference that have harmed BJS in recent years. Accordingly, we recommend:

Recommendation 5.4: Congress and the administration should make the BJS director a fixed-term presidential appointee with the advice and consent of the Senate. To insulate the BJS director from political interference, the term of service should be no less than 4 years.

15 Though the jobs are obviously much different in scope, it is worth noting that the other principal federal officer tasked with reporting statistics on crime in the United States—the director of the FBI, reporting results from the Uniform Crime Reporting program—holds the relative insularity of a 10-year fixed-term appointment, nonrenewable, with Senate confirmation (P.L. 90-351 § 1101).

It would make sense for the term to be about 6 years, because a term of that length would carry the director into a new administration or into the second term of an incumbent administration.

5–A.3 Relevance to Policy Issues

A federal statistical agency must be in a position to provide objective information that is relevant to issues of public policy. A statistical agency must be knowledgeable about the issues and requirements of public policy and federal programs and able to provide objective information that is relevant to policy and program needs. . . . In establishing priorities for statistical programs for this purpose, a statistical agency must work closely with the users of such information in the executive branch, Congress, and interested nongovernmental groups. (National Research Council, 2009:4)

This principle has implications, both for the parent department of a statistical agency and for the actions of the agency itself. The parent department must take the agency seriously. Statistical units, when best used by their parent agency, are the window into the performance of that agency in addressing key issues facing the society. When intelligently used, the statistical agency can measure the prevalence and importance of different issues tasked to the department. It can serve as the management dashboard to guide the allocation of budget to different activities. It can assemble information about likely trends of future phenomena within the mission of the department.

However, achievement of such a role is not merely dependent on outreach by the leadership of the parent department. Rarely are the government officials appointed to departmental leadership aware of the utility of statistical information to guide the work of the department. The director and senior staff of the statistical agency have an obligation to be outwardly focused, to become expert in the program mission of the agency. Only with such substantive expertise can the department’s statistical agency produce optimally relevant statistical information for the policy makers of the department. Statistical agencies are part of the management information system for policy making in program departments. Senior statistical staff must have the skills, time resources, and mandate to develop relationships with the policy-making units to provide information relevant (not necessarily supporting, but relevant) to the policy makers’ tasks.

In the judgment of the panel, BJS’s ability to carry out this role of providing policy-relevant data is impaired by its relatively low profile within the department. At one of its plenary meetings, the panel met with senior DOJ officials and discussed past and current roles of BJS within DOJ policy-making activities; from those discussions, it was apparent that BJS was not viewed as a relevant player in many of the key initiatives of DOJ.

Indeed, there did not seem to be high awareness of the range of BJS activities or the ways in which data could be brought to bear in broader DOJ activities. In the panel’s view, BJS has not been perceived as an important asset in assembling relevant information for key policy initiatives; fault for this is undoubtedly shared by BJS (for limited “promotion” of its work within the department) and by higher officials in DOJ.

There are two potential solutions, the first of which looks at BJS activities within DOJ. The panel believes that the BJS director should be a very visible and active promoter of the value of objective statistical information for use in policy decisions within DOJ. Every budget initiative of DOJ is a potential opportunity for enriched statistical information about the status of the justice system. The BJS director and his or her senior staff should increase their outreach to sister DOJ units.

Recommendation 5.5: The BJS director needs to reach out to other agencies within DOJ, forming partnerships to propose initiatives for information collection that are relevant to policy needs.

Recommendation 5.6: The Department of Justice should build provisions for BJS collection of data and statistical information into its program initiatives aimed at crime reduction. These are not intended as program evaluation funds, but rather as funds for the basic monitoring and assessment of the phenomena targeted by the initiative.

Although this recommendation is a necessary step to achieve more relevance to DOJ, the panel believes that it may not be sufficient. Effective outreach by BJS depends on willingness to receive such outreach and respect for BJS expertise. The visibility of BJS within DOJ and in the legislature appears to be quite low. On budget initiatives, the BJS director rarely meets directly with legislative staff; the BJS budget is reviewed as part of the OJP budget, and so those discussions are held at the OJP level. Hence, our previous recommendation to administratively move BJS out of OJP—giving the BJS director the authority (and the duty) to interact directly with congressional appropriators and overseers—would also contribute greatly to BJS’s ability to provide policy-relevant data. In Section 5–B.8 below, we discuss the need for an effective research program as another means of bolstering the relevance of BJS and its data products.

5–A.4 Credibility Among Data Users

A federal statistical agency must have credibility with those who use its data and information. . . . To have credibility, an agency must be free—and must be perceived to be free—of political interference and policy advocacy. Also important for credibility is for an agency to follow such practices as wide dissemination of data on an equal basis to all users, openness about the data provided, and a commitment to quality and professional practice. (National Research Council, 2009:5)

Credibility is a reputational attribute of a statistical agency. It is frequently argued that the credibility of statistical products derives partly from sound statistical properties (high precision and low statistical bias) and partly from perceptions that the source of the information has no point of view or ideological lens on the information (National Research Council, 2005b:5). Thus credibility is enhanced by sound professional practice and widespread recognition of this professionalism. It is also enhanced by demonstrated independence from policy viewpoints.

Panel members and staff were active observers in a workshop of users of BJS data, conducted by the Council of Professional Associations for Federal Statistics (COPAFS) with BJS sponsorship, in February 2008. Attendees at the workshop included members of BJS’s state SAC network, academic researchers, representatives of police chiefs, representatives of state courts, and others, along with BJS staff and officials. There was general high praise for BJS, along with some calls for increased timeliness of BJS data (for enhanced law enforcement management purposes) and for finer granularity of estimates for local uses. To some panel members in the audience, parts of the law enforcement community appeared to be asking for almost real-time event data—a goal that is difficult for any statistical agency to achieve. Despite these types of critiques, panel after panel at the workshop expressed great belief that the BJS data series were credible, valued, and relevant to their work.

Finding 5.5: BJS enjoys high credibility but is often critiqued for lacking fine-grained data by geography or time.

5–B PRACTICES OF A FEDERAL STATISTICAL AGENCY

5–B.1 Clearly Defined and Well-Accepted Mission

An agency’s mission should include responsibility for all elements of its programs for providing statistical information—determining sources of data, measurement methods, efficient methods of data collection and processing, and appropriate methods of analysis—and ensuring the public availability not only of the data, but also of documentation of the methods used to obtain the data and their quality. (National Research Council, 2009:7)

That BJS’s mission and basic functions are clearly defined is virtually indisputable. We have frequently referred to Box 1-2, BJS’s extensive list of authorized activities under its enabling legislation, which is testament to the detail in BJS’s defining mission.

Whether that mission and those functions are well accepted is quite another matter. As we discussed in Section 5–A.3, the panel was disappointed by the apparent lack of understanding of BJS’s role and its potential when it met with higher-level Justice Department officials. Although expressions of support were plentiful, an understanding of the importance of high-quality data for shaping policy was generally lacking.

BJS’s recent history in the appropriations process is also, potentially, evidence that its range of existing data collections—and the cost of data collection, generally—is not well understood in important places. In summer 2006, the appropriations committees in both houses of Congress processed BJS’s budget request of about $60 million. While the House sought to keep BJS funding at about fiscal year 2006 levels ($36 million, compared to a final 2006 allocation of $34.6 million; H.Rept. 109-520), the Senate’s mark came in considerably lower, at $20 million (S.Rept. 109-280). (No final appropriations bill for DOJ was passed for fiscal year 2007; like many other federal agencies, it was funded through a series of continuing resolutions at fiscal year 2006 levels, with some exceptions.) A brief explanatory note in the Senate committee’s report acknowledged BJS’s role in collecting the NCVS and other data programs but did not explain the reason for the reduction. The problem was exacerbated in the fiscal year 2008 appropriations process: House appropriators provided $45 million for BJS (H.Rept. 110-240), but the Senate appropriators, with no explanatory statement whatsoever, included only $10 million for BJS: a funding level that would have terminated the NCVS, if not much of BJS’s activities. Inquiries by the Consortium of Social Science Associations yielded the explanation from Senate subcommittee staff that the $10 million figure was a “misprint” that would be corrected and replaced by “full funding” later in the process (Consortium of Social Science Associations, 2007:3). It was not corrected in the version of the bill that finally passed the Senate; in the final consolidated appropriations bill that included DOJ, BJS funding came closer to the House mark than the Senate mark.16

As before, the panel concludes that a clear separation between BJS and OJP and placement of BJS elsewhere in the DOJ hierarchy would help clarify the mission of BJS and strengthen its profile as a principal statistical agency. Given congressional stalemate and the inability to pass most individual appropriations bills, the particular budget climate in recent fiscal years would be difficult for any organizational configuration of BJS within DOJ. Still, the story of the varying appropriations marks suggests that, in at least one important circle, knowledge of the basic cost of data collection and the value (and cost) of BJS’s flagship data collection was sufficiently weak as to put BJS’s viability at stake.

16 For fiscal year 2009, Senate appropriators recommended $40 million for BJS (S.Rept. 110-397).

BJS’s mission is not well served by having its interests solely represented and managed by OJP in the budget and planning arenas, precisely because BJS’s own mission is not well articulated by OJP’s general mission “to increase public safety and improve the fair administration of justice across America through innovative leadership and programs” (U.S. Department of Justice, Office of Justice Programs, 2006:3), principally through financial assistance.

5–B.2 Continual Development of More Useful Data

Statistical agencies must continually look to improve their data systems to provide information that is accurate, timely, and relevant for changing public policy needs. They should also continually seek to improve the efficiency of their programs for collecting, analyzing, and disseminating statistical information. (National Research Council, 2009:7)

The February 2008 data users workshop, sponsored by BJS and conducted by COPAFS, was a good step for BJS in carrying out the practice of improving and modifying its data collections to be more useful and relevant. The session suggested both useful analyses and extracts that could be made from existing data series (e.g., tailoring analyses and sponsoring research on the NCVS; Heimer, 2008) and wholesale revisions to collection methodologies to improve timeliness or relevance (e.g., an NCVS-type survey of experiences in civil justice matters; Eisenberg, 2008). As we observed in Chapter 4, BJS’s state SACs, and its coordination through JRSA, provide it with a mechanism for ready communication and interaction with state-level practitioners, all of which contributes to reevaluation of individual BJS programs and reports.

Although BJS has done well on this score, we encourage it to push further and develop the tools that other statistical agencies use to inform themselves of the changing data needs of their user bases. Specifically:

1. As BJS staff indicated at the time, the February 2008 users workshop should be seen as a first step and not a one-time conversation. BJS could sponsor an annual users conference, perhaps drawing from a larger base of downstream users than JRSA’s annual research conference. These user meetings could be similar to those routinely held by the National Center for Health Statistics, CDC (for the Behavioral Risk Factor Surveillance System), and the Census Bureau.

2. Through JRSA, BJS sponsors a journal (Justice Research and Policy), much as the Bureau of Transportation Statistics has done for its related fields. BJS’s role in such a journal or statistical publication—and knowledge of strengths and weaknesses in BJS data—could be enhanced by encouraging BJS staff or grantees to seek publication in the journal or by developing “special issues” on specific user constituency needs.

3. Consistent with item 21 in BJS’s legally authorized duties (Box 1-2), BJS could convene meetings of official justice statisticians from other countries, charged with missions similar to that of BJS, to apprise itself of international comparability.

4. BJS could commission small “white papers” from key leaders in the justice systems about future data needs.

5. BJS should continue, and interact with, informal advisory mechanisms that have developed over the years, such as the Committee on Law and Justice Statistics of the American Statistical Association.

Historically, BJS has convened periodic expert workshops as a first step in scoping out new work. McEwen (1996) summarized the 1995 workshop on police use of force that contributed to the development of the PPCS, and BJS partnered with SEARCH, the National Consortium for Justice Information and Statistics, on a series of workshops on law enforcement databases such as criminal history records and sex offender registries (Bureau of Justice Statistics, 1995, 1997b, 1998a). However, such workshops have become rarer events in light of funding constraints. As suggested by the first point in our list above, we think that these workshops are an important mechanism that would have the added benefit of addressing concerns about the timeliness of content in BJS data collections; they would provide for regular input and feedback on emerging problems and views. One possible topic on which such a stakeholder workshop could be beneficial is a review of content in the correctional data series and the NCVS to ensure that definitions and concepts of “mental health” are consistent with current practitioner usage.

Recommendation 5.7: To effectively get input on contemporaneous topics of interest, BJS should regularly convene ad hoc stakeholder workshops to suggest areas of immediate data needs.

However, we also believe that BJS would strongly benefit from a more formal means of obtaining user input: therefore, we recommend that BJS establish a standing technical advisory committee, appointed under the terms of the Federal Advisory Committee Act (5 USC App. 1). The legislation that created BJS, the Justice System Improvement Act of 1979, originally mandated a 21-member BJS Advisory Board, with members appointed to 3-year terms by the attorney general; this board was directed to review and make recommendations on BJS programs as well as to recommend candidates in the event of a vacancy in the BJS directorship (93 Stat. 1178–1179). However, this provision for an advisory board was removed in the 1984 reauthorization (see notes at 42 USC § 3734). Although BJS receives valuable advice through informal means, we conclude that there would be real value in having a standing advisory committee, including members with substantive expertise, operating staff within justice system institutions, statistical experts, and others who could articulate future needs.

It is important that such an advisory board contain high-level policy makers and justice system practitioners as well as methodologists and statisticians, so that detailed research-specific recommendations are paired with input on the timeliness and usefulness of the data in the field.17

The Census Bureau organizes several such advisory committees (including, for instance, groups specifically focused on input from diverse race and ethnicity groups and on advice from relevant professional associations); another model is the Board of Scientific Counselors of the National Center for Health Statistics. Both of these advisory structures in the statistical system provide written recommendations to their respective agencies and, in the case of the Board of Scientific Counselors, undertake program reviews of parts of the agency’s portfolio; this kind of regular feedback would greatly benefit BJS operations.

Recommendation 5.8: BJS should establish an Advisory Group under the Federal Advisory Committee Act to provide guidance to BJS on the addition of new data collection efforts and the modification of current ones in light of needs identified by the group. Membership in the group should include, at a minimum, leaders and practitioners from each of the major subject matters covered by BJS data, as well as those with statistical and other types of academic expertise in these subject matters. The members of the group should be selected by the BJS director, and the group should provide the director with at least two reports each year that contain its recommendations.

This recommendation is consistent with, but more fully articulated than, Recommendation 5.1 in our interim report (National Research Council, 2008b). A standing advisory committee could be designed with subgroups of topic specialties in mind so that, for instance, the committee is poised to render NCVS-specific methodological advice without having to convene separate committees for each major collection. By having both coverage and depth in topic areas, a standing advisory committee would be useful as a means for suggesting new directions for research. One specific example where a formal advisory committee would be useful is in revisiting content in the Law Enforcement Management and Administrative Statistics (LEMAS) survey, as part of implementing a core-supplement design.

17 For reference, the original BJS Advisory Board specified in the Justice System Improvement Act was to have members including “representatives of States and units of local government, representatives of police, prosecutors, defense attorneys, courts, corrections, experts in the area of victim and witness assistance, and other components of the justice system at all levels of government, representatives of professional organizations, members of the academic, research, and statistics community, officials of neighborhood and community organizations, members of the business community, and the general public” (93 Stat. 1178).

By its nature, LEMAS is an establishment survey that is targeted at a wide variety of individual law enforcement agencies. However, these agencies may differ in their usage and basic definition of terms; for example, depending on the prevailing definition of “community-oriented policing,” some departments might consider themselves to follow that practice whereas others (possibly confounding the term with specific grant/funding streams) may think that they do not. Regular review of the basic language used in the data collection is important to avoid the perception that questions are overly blunt or confusing.

In developing its outreach to its user base, it is important that BJS not neglect the needs and interests of a critical user constituency: members of Congress and their staffs. Steps to assess the issues of interest to the House and Senate Judiciary committees would be useful to build awareness of and interest in BJS products, promote a clearer understanding of what is and is not possible in statistical data collections (as did not seem to occur in developing the PREA reporting requirements), and gain critical support for new and continuing data collections.

Recommendation 5.9: DOJ should take steps to ensure that congressional staff are aware of BJS data that could be used in developing legislation; DOJ and BJS should learn from congressional staff how their data are needed to inform and support legislation, so that they can improve the utility of their current data and develop new data sets that could enhance policy development.

5–B.3 Openness About Sources and Limitations of Data

A statistical agency should be open about its data and their strengths and limitations, taking as much care to understand and explain how its statistics may fall short of accuracy as it does to produce accurate data in the first place. Data releases from a statistical program should be accompanied by a full description of the purpose of the program; the methods and assumptions used for data collection, processing, and reporting; what is known and not known about the quality and relevance of the data; sufficient information for estimating variability in the data; appropriate methods for analysis that take account of variability and other sources of error; and the results of research on the methods and data. (National Research Council, 2009:8)

In general, the panel believes that the BJS staff is fully open regarding the strengths and weaknesses of its data series. Its house style for report preparation ensures that even short reports contain a fairly detailed section on methodology; these sections generally do a good job of presenting synopses of the design of data collections.

The recent episodes concerning the 2006 and 2007 releases of data from the NCVS—culminating in the conclusion that the 2006 data constituted a “break in series” (see Section 3–A.3)—are illustrative in this regard. Recognizing the presence of a problem, BJS staff sought external opinions and worked closely with the Census Bureau to try to understand what had occurred. The declaration of a “break in series” was not an easy one to make, but BJS’s descriptions of the circumstances in its reports (and the documentation accompanying the archived data file) are certainly candid about the limitations of the data.

However, the “break in series” incident also illustrates a point that we make later in this chapter concerning the technical skill mix of the BJS staff. In such an incident, it would be useful for BJS to have more in-house staff with advanced technical skills, to more completely understand how design changes and sample size reductions combine to produce discrepant effects. BJS shares with other federal statistical agencies a fundamental problem: it has insufficient numbers of technical staff whose primary job is to focus on evaluation of the quality of data collected by and for BJS. Because of this absence, outside users of BJS data have no set of working papers, methodological briefs, or quality profiles that may be consulted to inform themselves of the characteristics of particular data sets or the potential strengths and weaknesses for their specific uses of the data.

The lack of routine evaluation and quality assessment of BJS data is problematic because of the wide variety of sources from which BJS data series are drawn; BJS’s correctional data provide a useful example. Much of the correctional data are collected from agencies and institutions that rely on varied local systems of record-keeping. Heterogeneity in record-keeping standards produces heterogeneity in responses to administrative surveys. For some data collections, such as the National Corrections Reporting Program (NCRP), states may have varying definitions of the race, ethnicity, and schooling of admitted and released prisoners. Detailed instructions for classification and measurement would improve the quality of corrections data reporting; in practice, such instructions function as documented recodes from each state’s scheme to a common standard, as sketched below.

Recommendation 5.10: To improve the utility and accuracy of the National Corrections Reporting Program (NCRP), BJS should work with correctional agencies to develop their own internal records to promote consistent data collections and expand coverage beyond the 41 states covered in the most recent NCRP.

It follows that the same kind of evaluation of the raw data provided by state and local authorities, coupled with work to promote consistent reporting, would also benefit BJS’s other correctional, law enforcement, and adjudication data series.
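To make the recoding idea concrete, the following minimal sketch (in Python, with entirely hypothetical state codes and field names; nothing here is drawn from actual NCRP documentation) shows how a documented recode map can harmonize state-specific codes onto a standard classification and flag values that the published instructions do not cover:

# Hypothetical state-specific race/ethnicity coding schemes; a real program
# would carry one documented map per reporting state.
STATE_RECODE_MAPS = {
    "AL": {"W": "white", "B": "black", "H": "hispanic", "O": "other"},
    "NY": {"1": "white", "2": "black", "3": "hispanic", "4": "other"},
}

def harmonize(record):
    """Map one state-coded admission record onto the standard classification,
    flagging any value the published instructions do not cover."""
    recode = STATE_RECODE_MAPS.get(record["state"], {})
    out = dict(record)
    out["race_std"] = recode.get(record["race_raw"], "UNMAPPED")
    return out

admissions = [
    {"state": "AL", "race_raw": "B"},
    {"state": "NY", "race_raw": "3"},
    {"state": "NY", "race_raw": "9"},   # undocumented code, surfaced for review
]

for rec in admissions:
    print(harmonize(rec))

The value of such a map lies less in the code itself than in the documentation discipline it imposes: every undocumented state code surfaces as an explicit "UNMAPPED" flag rather than disappearing into an apparently clean file.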

5–B.4 Wide Dissemination of Data

A statistical agency should strive for the widest possible dissemination of the data it compiles. . . . Elements of an effective dissemination program [include] a variety of avenues for data dissemination [including, but] not limited to, an agency’s Internet website, government depository libraries, conference exhibits and programs, newsletters and journals, e-mail address lists, and the media for regular communication of major findings. (National Research Council, 2009:9)

BJS deserves great credit for its data dissemination efforts, several of which are described in Box 1-1. It makes good use of public use data set archiving through the National Archive of Criminal Justice Data (NACJD); its own website and the OJP-sponsored National Criminal Justice Reference Service provide ready access to an extensive backfile of reports; and its website entries for individual reports generally provide the reports in text or print formats and typically include either plain text or spreadsheet tables corresponding to key data tables. As noted in Chapter 4, the state SAC network also provides a means for the dissemination of BJS data and products (and SAC analyses thereof) to local audiences. All of these steps have been a great service to the user community and represent shrewd use of partnerships with outside groups whose specific expertise in-house BJS staff could not replicate in isolation. The coupling of the public data archive with the regular instructional workshops conducted by the Inter-university Consortium for Political and Social Research is a very valuable service, opening BJS resources to new researchers.

Timeliness of Data Release

Although we laud BJS for its work in data dissemination, this principle does suggest three areas where some further comment is necessary, the first of which concerns the timeliness of data release. Once a report is prepared and new data are ready for release, BJS is very good at executing the release; the problem is that the lag between data collection and the release of reports and data can be considerable, sometimes running to several years, which hurts the freshness and timeliness of the new results. This is, of course, a fundamental problem that applies to statistical agencies other than BJS: timely release of data is essential for those data to be useful in policy formulation and in research, yet the process of collecting high-quality data, ensuring that quality, and protecting the confidentiality of responses takes time and is not one that can readily be rushed without overburdening respondents.

Finding 5.6: A recurring criticism of BJS data products is that their quality is highly valued but that they are not sufficiently timely to meet user needs. All statistical agencies are attempting to grapple with new data collection designs that offer more timely estimates.

Delays in the release of data arise—and can be particularly pronounced—in those circumstances where BJS is dependent on other agencies, especially the Census Bureau. By this, we do not impugn the Census Bureau but merely note that it has its own privacy protection protocols and data quality procedures that, combined with BJS’s own review, can add substantially to processing time. In some instances where the Census Bureau has been the data collection agent, release of data can be obstructed because of post hoc determinations that a particular release format would threaten confidentiality. This has been the case with the collection and coding of the industry and occupation data from the Workplace Risk Supplement, for which the Census Bureau has opposed release because the cell sizes for certain occupations are too small. Negotiations with the Census Bureau have continued for years, to the extent that these data, collected in 2002, have not yet (as of late 2008/early 2009) been released.

Another case of the Census Bureau restricting or impeding the timely availability of data is the removal of the area-identified NCVS from Census Analysis Centers. These data were available in analysis centers around the nation for a number of years but were subsequently withdrawn amidst concerns about confidentiality and documentation. Similar issues have barred the release of a special area-identified data file from the NCVS. Such a file is critical to studying the prospects for local-area estimation from the NCVS, and the file was once made available through BJS’s data archive, but it has now been offline and unavailable for about 4 years. Delays of this extent suggest that something is broken in the relationship between BJS and the Census Bureau, and that this is obstructing the timely release of these data.

In cases where other agencies provide the funding for a data collection, such as supplements to the NCVS, the release of the data can be delayed because both BJS and the other agency must issue a “first” release and because there can be ambiguity with regard to which agency has “control” over the data. All of these factors delay release of the data and should be scrutinized to see if there could be joint “first” releases or other streamlining of this process. Agreements on supplements or other joint ventures with other agencies could include time limits on the release of data and clearer lines of authority for release.

Maximizing the use of BJS data requires that the data be released in a timely and equitable fashion and in formats that facilitate their use, while protecting the confidentiality of the data and furthering the goals of the agency. These objectives are often conflicting, and balancing them is no simple matter.

It would benefit BJS to track the processing that occurs after data collection is complete and to document the times of data collection, report preparation, report release, and data archival, in order to study which components of processing are most time-consuming (and which may be made more efficient).

More generally, BJS should work to confront the challenge of timely data release in creative ways. One mechanism to consider is the issuance of preliminary estimates—labeled as such and clearly noted as being subject to future revision—that could be issued quickly and separately from a fuller and more detailed report containing final estimates. Another (and more elaborate) idea that is worthy of consideration is the adoption of continuous data collection designs. These designs spread the sample thinly over time: information is collected from a smaller number of respondents at any given time, but collectors are in the field on as continual a basis as possible. These designs have the advantage of avoiding the startup costs of reinventing survey design machinery and sample from scratch every time a new round of data is collected, and their continuous streams of data can be combined and pooled to produce more timely estimates. With the Census Bureau’s introduction of the American Community Survey (ACS), the U.S. public will become accustomed to interpreting period estimates that span several time periods (e.g., 3-year or 5-year averages); opportunities to present BJS data in similar structures should be considered, and a simple illustration of such pooling follows the recommendation below.

Recommendation 5.11: BJS should evaluate each of its data programs to inquire whether more timely estimates might be obtained by (a) making discrete data collections into more continuous operations and (b) issuing preliminary estimates, to be followed by final estimates.
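As a minimal illustration of the pooling idea, the following sketch (in Python, with synthetic data and invented sample sizes that do not reflect any actual NCVS design parameters) simulates a small continuous monthly collection and publishes rolling 12-month period estimates as each window closes:

import random

random.seed(1)
TRUE_RATE = 0.05   # assumed underlying victimization rate (invented)
MONTHLY_N = 400    # hypothetical small monthly sample, fielded continuously

# Simulate 36 months of small monthly collections.
monthly = []
for month in range(36):
    victims = sum(random.random() < TRUE_RATE for _ in range(MONTHLY_N))
    monthly.append((victims, MONTHLY_N))

# Publish a pooled trailing-12-month period estimate as each window closes
# (printed at 6-month intervals here for brevity; a new estimate is in fact
# available every month).
for month in range(11, 36, 6):
    window = monthly[month - 11 : month + 1]
    victims = sum(v for v, _ in window)
    n = sum(n for _, n in window)
    print(f"months {month - 10:2d}-{month + 1:2d}: pooled rate = {victims / n:.3f} (n = {n})")

The design choice mirrors the ACS logic noted above: no single month's sample is reliable on its own, but the pooled window trades some temporal specificity for timeliness and precision.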
Equitable Release of Data

A second area of discussion about data dissemination is the equitable release of data, meaning that all of the public should generally have access to data releases at the same time, in formats that are conducive to use and interpretation. There may be instances in which some individuals outside of BJS should have access to some data before general release because it furthers the goals of the agency, such as evaluating and maintaining data quality. In those cases, priority access should be available and granted. Joint publications in which BJS staff collaborate with persons outside the agency may also be an acceptable form of early release. Except in circumstances such as these, statistical agencies such as BJS should strive to ensure that individuals’ access to data files is on an equal footing.

The formats in which BJS data are released should facilitate the use of those data. Here, format includes the medium by which the data are made available as well as the content of the releases.
BJS, like all statistical agencies, has different formats for different user communities. Written reports and electronic versions of written reports are available for readers who do not wish to manipulate the data. Spreadsheet versions of key tables and some Web-based tools for simple online analysis are provided for users who want to manipulate the data but may lack the sophistication to do so in a complex way. The full data sets are available for the most sophisticated users, who are interested in manipulating the microdata substantially.

It would be helpful for BJS to be more direct in spelling out the logic and connectedness of its product lines and formats. It would be useful for the website for a LEMAS report to indicate that users can go to a separate part of the BJS website to access online analysis options if they cannot find the particular rate or cross-tabulation in the hard-copy and electronic reports; this clue is not immediately obvious. If the online analytical capabilities cannot answer their question, then consumers should be explicitly referred to the NACJD, where they can download data sets. This search logic may be obvious to some but not to others who visit the BJS website, and it is not clear that the formats and product lines currently available reflect a coherent and integrated dissemination plan or strategy.

The increasing sophistication of the public with regard to electronic access to information may warrant a reevaluation of the mix of media used to disseminate BJS data. BJS has already taken steps to reduce the number of reports produced in hard copy by emphasizing online distribution as Portable Document Format (PDF) files; a next step would be to consider ways to reduce paper format even more and to make better use of hyperlink facilities in the PDF files to point users to related reports. Some BJS publications, such as the Firearm Inquiry Statistics summarizing background checks for handgun purchases, have moved to a release format where the “report” consists almost entirely of data tables, with minimal prose. Finding additional avenues for this format would have the dual benefit of potentially providing more timely release (as discussed above) and freeing staff to spend less time on standard report writing and more on innovation and evaluation; that said, careful prose summaries are also very important, and we do not want to be construed as saying that the standard written reports should be abandoned.

These suggestions for improving the dissemination of BJS data will put more strain on an overworked staff. Some of the format changes may free up some resources if they reduce the amount of time required in the editorial process. In the short run, the agency may consider making greater use of the NACJD to develop some of the format changes mentioned in the foregoing paragraphs.

Figure 5-5 Bureau of Justice Statistics home page, July 2008
NOTE: URL for home page is http://www.ojp.usdoj.gov/bjs/; this version accessed July 21, 2008.

BJS Web Presence

A third, and final, discussion topic under the general heading of data dissemination concerns an essential tool for such dissemination: BJS’s presence on the World Wide Web, the current front page of which is illustrated in Figure 5-5. As suggested by our comments earlier in this section, BJS recognizes the importance of its Web presence to the spread of its information. Former BJS Director Jeffrey Sedgwick (2008:2) commented that:

Over the past year, we have continued to develop a new website that will more effectively connect our users with the information they need. The website restructures the way our information is presented, giving users a more intuitive way to retrieve the data they need. Future development will include enhancing our ability to generate custom data tables and other interactive products online.

No website design is perfect in the eyes of every user. It is unclear how useful specific design suggestions would be, though we have indicated preferences for some additional topic pages throughout this report (e.g., summarizing what data are and are not available concerning white-collar crime; Section 2–C.1). However, one point that we do want to raise as BJS revamps its Web presence is to suggest emphasis on data sourcing and external collaboration.

It is worthwhile to frame this discussion by stating a conclusion that is consistent with the principles expected of a federal statistical agency:

Finding 5.7: The credibility of BJS’s products is a function of its quality review procedures.

It follows that the BJS “brand”—explicitly being labeled as a BJS product—carries weight and is a meaningful distinction. Hence, there is a need to take care in what gets designated, and explicitly linked to, as a BJS product. BJS’s collaborative projects, such as the Federal Justice Statistics program with the Urban Institute and the Court Statistics Project with the National Center for State Courts, are prone to ambiguity and confusion on this score: BJS’s website is sometimes abrupt in linking users to the Urban Institute-hosted Web hub for the federal system statistics. Likewise, the Court Statistics reports sometimes carry a BJS logo, but BJS’s sponsorship role (and use of some of the data) is not immediately apparent. To be clear, we do not argue that the reports and portals on non-BJS Web servers are bad in any sense or that the BJS “brand” is being misused by these external placements. Quite to the contrary, the hope is for both BJS and its data collection partners to receive appropriate credit for good work. Accordingly, we conclude and recommend as follows:

Finding 5.8: Several BJS data series are collected and maintained by external organizations linked to the BJS website (e.g., Federal Justice System statistics). It is not clear why some data and reports reside on external websites, rather than on the BJS website. It is unclear whether such data and reports achieve the quality standards used by BJS. It is not apparent why some websites are permitted to use the BJS label (http://fjsrc.urban.org).

Recommendation 5.12: BJS should articulate why some data collections are housed on external websites and describe the process by which links to external websites are allowed. BJS should articulate and justify the use of its insignia on external websites.

We also endorse BJS’s efforts to develop the capability for users to perform custom tabulations and data summaries directly through the BJS website, as envisioned in former Director Sedgwick’s comments above. By doing so, BJS would establish a more full-fledged Web presence rather than serving principally as a document repository. The current model, under which (generally) some set tabulations are available as spreadsheets but more advanced data users are directed to download raw data files through the NACJD, may actually be said to minimize BJS’s presence somewhat: the precise information is available but not (directly) from BJS. Some of the larger federal statistical agencies—notably the Bureau of Labor Statistics and the Census Bureau (the latter through its “American FactFinder” interface)—have made considerable efforts in permitting website users to tabulate (and even to plot on a map) their own queries of interest.

Clearly, the same level of interactive features cannot be expected without commensurate resources, but developing means by which steady streams of researchers, reporters, students, or congressional staff could readily obtain BJS information directly from the BJS site would ultimately be beneficial.

5–B.5 Cooperation with Data Users

[A statistical agency should] seek advice on data concepts, statistical methods, and data products from data users as well as from other professional and technical subject-matter and methodological experts, using a variety of formal and informal means of communication that are appropriate to the types of input sought. (National Research Council, 2009:9–10)

We have described BJS’s existing programs for outreach to and feedback from user groups and key constituencies in Section 5–B.2, in the context of the continual search to provide more useful data. Hence, our comments in this section are brief: BJS deserves credit for implementing a variety of outreach venues, and the discussion at the February 2008 users workshop provided ample testimony that there is widespread appreciation of BJS among the user base. BJS’s performance is certainly within the norms of other principal statistical agencies, and we suggest that it could be improved still further through the recommendations we offer in the earlier section.

5–B.6 Fair Treatment of Data Providers

[Fair treatment practices include] policies and procedures to maintain the confidentiality of data, whether collected directly or obtained from administrative record sources, [and to] inform data providers of the purposes of data collection and the anticipated uses of the information. . . . [They also include] respecting the privacy of respondents by minimizing the contribution of time and effort asked of them, consistent with the purposes of the data collection activity. (National Research Council, 2009:10)

Fair treatment practice is largely synonymous with the principle of establishing a relationship of mutual respect and trust with data providers, described in detail in Section 5–A.1. The same general messages apply: BJS is generally very diligent and fair in its relationships with both establishment (state agency or individual facility) and person respondents. However, in our assessment, the PREA reporting requirements to which BJS is currently subject constitute a direct violation of this practice. The relationship of trust within which BJS collects information from its data providers is threatened by PREA because this data collection directly threatens and sanctions the data providers in ways that other collections do not. When a data provider perceives direct harm from PREA participation, the other BJS data collections are threatened.

Fair treatment of data providers is one of the foundations of trust; violating this practice can have consequences that take decades to undo.

5–B.7 Commitment to Quality and Professional Standards of Practice

A statistical agency should:

• use modern statistical theory and sound statistical practice in all technical work.
• develop strong staff expertise in the disciplines relevant to its mission, in the theory and practice of statistics, and in data collection, processing, analysis, and dissemination techniques.
• develop an understanding of the validity and accuracy of its data and convey the resulting measures of quality to users in ways that are comprehensible to nonexperts. . . . (National Research Council, 2009:11)

As indicated at several points in this chapter, in our judgment, BJS has high standards for quality that are generally well understood. For this, BJS deserves considerable credit; having expressed the point already, we do not reiterate it at length here.

In the area of using modern statistical techniques and data collection practices, we worry that BJS is somewhat out of touch with current developments in statistical data collection. For instance, as described in Box 5-2, the PREA reporting requirements put BJS in a position where the inherent variability in estimates is such that it could not identify the highest- and lowest-ranked facilities as specified by the act (flawed and inappropriate though that requirement is). Instead, BJS chose to list a group of high-incidence facilities that, in some sense, are indistinguishable from each other. Yet this approach still has the effect of suggesting a level of precision that the estimates simply do not support; though we recognize that BJS faced difficult choices in issuing its PREA reports and that it was undoubtedly correct not to try to match the exact letter of the requirements in the law, the release would have benefited from very rigorous review of other approaches for presenting high-sensitivity data and attention to issues of multiple comparisons (a problem illustrated by the simulation sketch below).
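The multiple-comparisons concern can be shown with a small simulation (in Python; all facility counts, sample sizes, and rates are invented and bear no relation to the actual PREA data): when many facilities are estimated from modest samples, the facility that tops the estimated ranking is frequently not the facility with the highest true incidence.

import random

random.seed(7)
N_FACILITIES = 50      # hypothetical number of facilities being ranked
SAMPLE_SIZE = 60       # hypothetical respondents sampled per facility
TRIALS = 2000          # simulated replications of the whole survey

# Assumed true incidence rates, spread evenly between 2% and 6%.
true_rates = [0.02 + 0.04 * i / (N_FACILITIES - 1) for i in range(N_FACILITIES)]
truly_highest = max(range(N_FACILITIES), key=true_rates.__getitem__)

top_pick_correct = 0
for _ in range(TRIALS):
    # Each facility's estimated rate from its small sample.
    estimates = [
        sum(random.random() < true_rates[i] for _ in range(SAMPLE_SIZE)) / SAMPLE_SIZE
        for i in range(N_FACILITIES)
    ]
    if max(range(N_FACILITIES), key=estimates.__getitem__) == truly_highest:
        top_pick_correct += 1

print(f"top-ranked facility is truly the highest in only "
      f"{100 * top_pick_correct / TRIALS:.0f}% of simulated surveys")

With 50 simultaneous comparisons and small within-facility samples, the top of the list is dominated by sampling noise; this is the sense in which a published ranking suggests precision that the estimates do not support.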
Similarly, in our interim report (National Research Council, 2008b:119) we expressed concern about the lack of mathematical statistics and survey practitioner expertise on the BJS staff; in its recent problems with the NCVS and the possible “break in series,” BJS was possibly too dependent on the Census Bureau’s (unfortunately post hoc) analyses of the effects of design changes and sample size reductions on the final NCVS estimates. Subsequent to our interim report, BJS has created a “senior leader” position among its top management with the idea of bolstering its survey management expertise.
This is a very positive development, yet we still suggest that the absence of a chief mathematical statistician is troubling, because such a post (as well as a chief survey methodologist) tends to focus the agency’s attention on continual statistical improvement over time. (We return to the issue of staff expertise in Section 5–B.8.)

One way to judge professionalism is to look at methodological contributions made by an agency’s staff with the intent of making it easier for users to correctly use and interpret data. One major contribution in this regard was BJS’s sponsorship of the development of a “crosswalk” data set by NACJD staff between the FBI’s Originating Agency Identifier (ORI) codes and more standard geographic constructs such as cities and counties (Bureau of Justice Statistics, 2004d). The service populations of law enforcement agencies with ORI codes do not necessarily correspond neatly with official geographies and, in many cases, may overlap each other. The crosswalk file approximates the service populations to facilitate some direct comparisons between the FBI’s UCR data and other data sources; a sketch of how such a crosswalk is used appears below.
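The following minimal sketch (in Python with pandas; the ORI codes, counts, and populations are invented for illustration and are not taken from the actual crosswalk file) shows the basic pattern: agency-level counts are joined to the crosswalk and then aggregated to a standard geography.

import pandas as pd

# Agency-level offense counts, keyed by ORI code (values invented).
ucr = pd.DataFrame({
    "ori": ["AL0010000", "AL0010100", "AL0020000"],
    "offenses": [120, 45, 310],
})

# Crosswalk rows map each ORI to a county FIPS code and an approximate
# service population, since agency jurisdictions rarely align with counties.
crosswalk = pd.DataFrame({
    "ori": ["AL0010000", "AL0010100", "AL0020000"],
    "county_fips": ["01001", "01001", "01003"],
    "service_pop": [30000, 8000, 95000],
})

# Join the counts to the geography, then aggregate to the county level.
merged = ucr.merge(crosswalk, on="ori", how="left")
by_county = merged.groupby("county_fips").agg(
    offenses=("offenses", "sum"),
    population=("service_pop", "sum"),
)
by_county["rate_per_1000"] = 1000 * by_county["offenses"] / by_county["population"]
print(by_county)

Because service populations can overlap, the aggregated denominators are approximations rather than exact county populations, which is precisely why the crosswalk is described as approximating, not defining, the service areas.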
Other user-oriented methodology contributions include the summary by Langan and Levin (1999) of differences in state prisoner counts when prison records (NCRP) or court records (National Judicial Reporting Program) are used, and a series of clear, approachable pieces on the conceptual differences between the UCR and the NCVS (U.S. Department of Justice, Bureau of Justice Statistics, 2004).

The panel also requested of BJS a summary of the professional activities of its staff, in an effort to evaluate whether the staff was connected with networks that would alert them to new developments in statistical design, data collection, and estimation. BJS staff are frequent participants in interagency working groups drawing staff from the range of federal statistical agencies. Several of these activities are topic working groups of the Federal Committee on Statistical Methodology, itself an interagency working group coordinated by OMB. Other interagency groups to which BJS contributes members are the Federal Interagency Forum on Aging Related Statistics, the Interagency Forum on Child and Family Statistics, and the Interagency Subcommittee on Disability Statistics. As a stakeholder and sponsor of the Census Bureau’s demographic surveys program, it also participates in several interagency working groups organized by the Census Bureau, specifically those on the ACS and Sample Survey Redesign (updating sample and addresses for demographic surveys based on new census results). On the international level, BJS staff have also participated in relevant statistical programs of the United Nations Economic Commission for Europe (UNECE) and the Organisation for Economic Co-operation and Development, including specific UNECE task forces on victimization surveys and on statistical dissemination and communication. However, the bulk of its staff professional and working group activities are internal to DOJ, ranging from membership on NIJ committees on drugs and crime and evaluation of justice on American Indian reservations and tribal lands to membership on the Bureau of Prisons’ institutional review board.
Box 5-4
Review Process for an Information Collection by a Federal Agency

The U.S. Office of Management and Budget (OMB) is responsible for reviewing and approving any information collection activity—not only surveys for statistical purposes, but any form or application—that will be administered to 10 or more respondents (44 USC § 3502(3)(A)(1)). This “clearance” process can be time-consuming, because it must include two postings in the Federal Register for public comment as well as time for OMB’s Office of Information and Regulatory Affairs to render its decision.

Agencies develop and submit an Information Collection Review (ICR) request or, more colloquially, a “clearance package,” to OMB. In this process, surveys and any information collection making use of statistical methodology (for editing, imputation, or sample selection) are held to a higher standard. All ICRs must include a Part A, giving a detailed justification for the collection, indicating how and for what purpose the data will be used (or, if the ICR is reauthorizing an existing collection, how the data have been used); Part A also includes cost and time burden estimates. Statistical collections must also include a Part B report, which must include details on the sampling strategy for the collection and procedures for handling nonresponse, as well as descriptions of any tests to be conducted prior to full fielding of a collection. Names and contact information of any person consulted on the design of the collection are also required in Part B. OMB maintains a publicly accessible database of pending and completed ICRs, including links to agency-submitted supporting statements, at http://www.reginfo.gov.

Collectively, these efforts suggest attempts to build ties and outreach to other units in DOJ—and hence to increase BJS’s relevance to DOJ, which we encouraged and recommended above. However, the range of these activities is largely insular to the Justice Department and the executive branch; this bolsters the importance of the outreach efforts, including an advisory panel, suggested above.

Though there is much to commend in BJS’s professional standards of practice, there is one area in which BJS often displays, publicly, a marked weakness: the preparation of supporting statements for its information collections. As described in Box 5-4, all federal agency requests to collect information from 10 or more respondents must be cleared with OMB, in compliance with the Paperwork Reduction Act. For collections involving statistical methodology, the bar for approval is set higher; the Information Collection Review (ICR) packages submitted to OMB must include a “Part B” return providing details on sample construction, procedures for collecting and processing information, and pretests of survey instruments. Public versions of the ICRs are browseable online at http://www.reginfo.gov by searching for data collections listed under OJP.

On one hand, preparation of ICR supporting statements could be seen as no more and no less than clearing a bureaucratic hurdle.

On the other, however, a reading of many of BJS's submissions over the past few years suggests a surprising and disappointing lack of specificity, as well as less-than-compelling arguments for the necessity and utility of the studies. Questions on the justification for the information collection are usually answered along strictly legal lines, citing BJS's general mandate to "collect and analyze statistical information, concerning the operations of the criminal justice system at the Federal, State, and local levels" (see Box 1-2) and usually including a copy of that section of the U.S. Code as an attachment. Rarely does the justification section indicate how the collection fits with, supplants, or is superior to existing data series, and information on the uses to which the data will be put is sparing. As strong as the methodology sections of BJS's final reports are, its technical specifications in the information collection requests—language that ought to be, effectively, the first draft of the technical documentation for a new data set—are strikingly weak. Examples of these ICR packages, and deficiencies in their supporting documentation, are described in Box 5-5.

Box 5-5
Problems in Bureau of Justice Statistics Information Collection Requests

BJS's Information Collection Request (ICR) package for the proposed Census of Law Enforcement Aviation Units (ICR 200708-1121-002) is a useful example. The abstract mentions that the collection is "part of the BJS Law Enforcement Management and Administrative Statistics program," and the statement on the necessity of the collection references 2003 LEMAS data:

    It is estimated that about 250 law enforcement aviation units are in operation among State and local agencies in the United States. These units operate an estimated 1,000 aircraft, including about 600 helicopters and 450 fixed-wing aircraft. The 2007 Census of Law Enforcement Aviation Units will be a census of all agencies, sampled in the 2004 LEMAS survey, which reported having either a fixed-wing aircraft or helicopter. It will be the most comprehensive study conducted in this area to date. The data collection will include detailed items on the functions, personnel, equipment, record keeping, expenditures, and safety records of these units.

The basic cited need for the collection is homeland security-tinged—"it is important to know the location and nature of available assets that could be mobilized in the event of large-scale regional or National emergencies"—with the add-on mention that "this information is also critical to law enforcement policy development, planning, and budgeting at all levels of government." The description is muddled as to whether the data are intended to draw some inference about characteristics of agencies that maintain aviation units (e.g., through the detailed items on equipment and safety records) or to serve as a convenient directory of relevant agencies (for mobilization purposes). The statement is further unclear about how the collection fits with the broader LEMAS program, whether the information is sufficiently important that it should be collected on a regular basis, and whether there is any auxiliary information to evaluate the accuracy of the 2003 estimate that about 250 agencies have such units.

Most disappointing in this ICR, however, is the Part B return on statistical methods. Save for BJS contact information, what is supposed to be a fairly detailed technical specification of data collection techniques and planned methodologies runs about half a page, as follows:

    Universe and Respondent Selection: This data collection will be a census of law enforcement aviation units from among agencies with 100 or more officers. No sampling is involved with this collection.

    Procedures for Collecting Information: The census will be conducted initially by mailout. The address mailing list will be updated prior to mailout in order to maintain a current list of the respondents. Personal telephone interviews will be conducted for non-respondents.

    Methods to Maximize Response: We will do everything possible to maximize response, including telephone facsimile transmission, telephone interviews, and on-site assistance. Response rates for prior BJS law enforcement surveys and censuses have typically been 95% and above.

    Testing of Procedures: The census instrument has been pretested in three selected jurisdictions by individuals that will be receiving the final census instrument. Comments received as a result of that testing have been incorporated into the census instrument accompanying this ICR.

The grounds for criticism of this extremely scant statement are numerous:

• The proposed collection shares with its fellow special-agency censuses a lack of clarity over whether the collection is intended as a "survey" or a "census." Throughout the rest of the document, and in the title of the collection, "survey" had been used; in the Universe and Respondent Selection section, "census" suddenly becomes the preferred choice.

• Regardless of the "survey" or "census" label, the primary source of contact information is the existing LEMAS survey; even if the aviation unit study is meant as a census, the method of construction of its frame/address list (from LEMAS) should be described in more detail. Part A suggests that the LEMAS listings would be supplemented by listings from the Airborne Law Enforcement Association and the International Association of Chiefs of Police; coverage properties for either of those lists are missing, as is any hint of how many additional units might be added through reference to those lists. The restriction to agencies with 100 or more officers is not mentioned previously, nor is it described further.

• The statement gives no indication of whether and how the contact strategy differs from that of the main LEMAS collection or, indeed, of who will carry out the collection. Likewise, any formal connection to the basic LEMAS survey (e.g., whether the results of the aviation-specific study might be used to revise questions on the main survey) is unspecified.

• The reference to providing "on-site assistance" is vague—does it refer to follow-up by a field interviewer?

• The reference to response rates in previous law enforcement surveys is interesting but unpersuasive; a better point of comparison might be similarly scoped attempts to canvass special units within departments rather than the main LEMAS survey.

• The final section, on testing of procedures, is particularly uninformative. How were the pilot jurisdictions chosen? Were there any difficulties encountered with the questionnaire, such as terminology usage? Did specific comments from the pilot respondents lead to changes in the contact strategy?

The Law Enforcement Aviation Unit ICR is an example of particularly weak justification and technical specification statements, but a reading of other BJS-prepared ICRs shows similar deficiencies. BJS's request for clearance of the 2007 Survey of Law Enforcement Gang Units (ICR 200705-1121-001) shared some gross features of the aviation unit ICR, again using the "survey" nomenclature but describing the effort as a "nationwide census of all law enforcement gang units operating within police agencies of 100 or more officers." The supporting statement for the gang unit study does not explain whether any other data sources besides previous LEMAS returns are to be used to build the frame of dedicated gang units, leaving it unclear whether the collection is indeed a census (a canvass of all known gang units) or a survey (probability sample). In another example, the section on testing of procedures in the ICR for the 2007/2008 National Survey of Prosecutors says that "the survey instrument was previously pretested with 310 jurisdictions during the 2005 data collection whereby BJS received a 99% response rate" (ICR 200704-1121-004). However, other portions of the statement make clear that the newer 2007/2008 version was purposely designed as a complete census of prosecutor offices, meaning that questions were revised and the number of questions was scaled back. Since this makes the newer survey different in scope and character from the 2005 version, the 2005 response rate—though impressive—fails to answer the question of experience in pretesting the questionnaire.

Though they are, functionally, a bureaucratic step, the ICRs that BJS develops to obtain clearance from OMB are also a first opportunity to carefully explain the rationale for data collections from the substantive and technical viewpoints. They are also first drafts of the technical documentation for new data series and templates for actual data collection efforts. On these dimensions, neither new nor continuing BJS data collections are helped by having weak and deficient supporting statements made for them in a public (if not widely viewed) forum.

5–B.8 Active Research Program

    A statistical agency should have a research program that is integral to its activities. Because smaller agencies may not be able to afford as extensive a research program as larger agencies, agencies should share research results and methods. Agencies can also augment their staff resources for research by obtaining the services of experts not on the agency's staff through consulting or other arrangements as appropriate. (National Research Council, 2009:11)

Some of the estimates produced from BJS data have acquired a status as national benchmarks that should be preserved, and the agency's products are known for their quality standards and objectivity. To be sure, maintenance of series continuity is, properly, a high priority; this is because estimates of change, and especially change over a relatively long period of time, are among the most important pieces of information that these long-term data resources can provide.

At the same time, it is important for statistical agencies to ensure that their product lines are current both substantively and methodologically.
As is true of other statistical agencies facing tight resources, BJS has been forced into an overriding focus on basic production of a set of data series and standard reports, at the expense of research, development, and innovation. As we discussed in Section 3–F.2, the performance measures in BJS's strategic plan are largely ones of volume and throughput—counts of file accesses on the NACJD, numbers of reports and supporting materials accessible on the BJS website, numbers of data collections performed or updated each year—that lack a forward-looking focus on improvements in methodology and options for improving content.

A statistical agency should be among the most intensive and creative users of its own data, both to formally evaluate the quality and properties of its data series and to understand the findings from those data and shape future refinements. BJS's "Special Reports" series has, in the past, gone into depth on topics not routinely studied in the agency's standard reports or has taken unique looks at BJS data, such as age effects in intimate partner violence (Rennison, 2001), the interaction between alcohol and criminal behavior (Greenfeld, 1998; Greenfeld and Henneberg, 2001), and the prevalence of ever having served time in prison among the U.S. population (Bonczar and Beck, 1997; Bonczar, 2003). These reports have also provided some opportunity for BJS analysts to make use of multiple BJS data sets or to combine BJS data with non-BJS data sets in interesting ways:

• To study educational attainment in the correctional population, Harlow (2003) drew on data from BJS's prisoner and jail inmate surveys, its 1995 Survey of Adults on Probation, the Current Population Survey of the Bureau of Labor Statistics, and the 1992 National Adult Literacy Survey of the National Center for Education Statistics.

• Zawitz and Strom (2000) combined data from the NCVS and multiple data series from the National Center for Health Statistics to describe both lethal and nonlethal violent crime incidents involving firearms.

• Greenfeld (1997) combined information from the UCR, the NCVS, and BJS's corrections and adjudications data series to summarize the state of quantitative information on sex offenses, including rape and sexual assault.

Moreover, in fairness, BJS deserves credit for several innovative tacks that it has taken. Although full use of electronic questionnaires took considerable time, BJS and the NCVS were, through the agency's work with the Census Bureau, relatively early adopters of computer-assisted methods in major federal household surveys. And, though we have argued at length that the reporting requirements are inappropriate, BJS's work on data collections in support of PREA led the agency to make great strides in the use of ACASI and other techniques for interviewing on sensitive topics. BJS has also demonstrated itself to be effective and innovative in developing data collection instruments to confront very tough methodological problems: identity theft, hate crimes, police-public contact, and crimes against the developmentally disabled.

But innovative in-house data analyses by BJS have slowed in recent years as the focus on production has increased and resources have tightened; major methodological innovations such as the use of ACASI were possible because PREA carried with it substantial funding. BJS's need to update long-standing products and keep activities in place, for basic organizational survival, has too frequently trumped innovative research and intensive exploration of new and emerging topic areas. Indeed, the principal means for identifying "emerging data needs" cited in BJS's strategic plan is not examination of the criminological literature or frequent interaction with criminal justice practitioner communities, but rather "emerging data needs as expressed through Attorney General priorities and Congressional mandates" (Bureau of Justice Statistics, 2005a:32).18 In our assessment, the lack of a research program (and of the capacity for one) puts BJS and its data products at risk of growing stagnant and becoming less relevant.

    18 "In addition," the plan notes shortly thereafter, "BJS staff meet regularly with Federal, State, and local officials to identify emerging data needs or desirable modifications to existing collection and reporting programs" (Bureau of Justice Statistics, 2005a:32).

Finding 5.9: The active investigation of new ways of measuring and understanding crime and criminal justice issues is a critical responsibility of BJS. The agency has lacked the resources needed to fully meet this responsibility and, for some issues, has fallen behind in developing such innovations.

Finding 5.10: BJS has lacked the resources to sufficiently produce new topical reports with the data it currently gathers. It also lacks the resources and staff to routinely conduct methodological analyses of changes in the quality of its existing data series and to fully document those issues. Instead, the BJS production portfolio is primarily limited to a routine set of annual, biannual, and periodic reports and, for some topics, the posting of updated data points in online spreadsheets.

In our interim report, we made specific recommendations to stimulate research directly related to the NCVS, specifically calling for BJS to initiate studies of changes in survey reference period, improvements to sample efficiency, effects of mixed-mode data collection, and studies of nonresponse bias (National Research Council, 2008b:Recs. 4.2, 4.7, 4.8, 4.9). In response, BJS quickly issued requests for proposals for external researchers to conduct such studies, and it has also signaled its intent to conduct a survey design competition to evaluate broad redesign options (Rec. 5.8 in National Research Council, 2008b).
This is a laudable reaction and a step toward laying out more concrete options for, and future activities related to, the NCVS, BJS's largest data program, but a fuller research program is critical to future-oriented option development for BJS's non-NCVS programs. It is also critical to avoiding implementation problems such as those experienced in the 2006 administration of the NCVS. As we noted in our interim report, "design changes made (or forced) in the name of fiscal expediency, without grounding in testing and evaluation, are highly inadvisable" (National Research Council, 2008b:83). To this end, a short recommendation that we offered in our interim report (National Research Council, 2008b:Rec. 4.1) is worth formally restating here:

Recommendation 5.13: BJS should carefully study changes in the NCVS survey design before implementing them.

It follows that this guidance can be applied to changes to other BJS data collections, and that such evaluative studies are not possible without the resources necessary to make innovative research a priority for the agency. Congress and the administration cannot reasonably expect BJS to shoulder daunting data collection requests without the agency engaging in ongoing research, development, and evaluation. Going forward, a key priority should be a detailed error analysis of the NCVS to get a sense of how large a problem survey nonobservation may be in specific socioeconomic subgroups, as the basis for understanding where improvements may most properly be made.
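As a reminder of why subgroup-level error analysis matters, the standard textbook decomposition of the bias of an unadjusted respondent mean is worth restating; this is generic survey methodology, not a formula specific to BJS or the NCVS. With \(\bar{y}_r\) and \(\bar{y}_{nr}\) the respondent and nonrespondent means, \(\bar{y}_n\) the full-sample mean, and \(n_{nr}/n\) the nonresponse rate,

    \[
      \operatorname{bias}(\bar{y}_r)
        = \bar{y}_r - \bar{y}_n
        = \frac{n_{nr}}{n}\left(\bar{y}_r - \bar{y}_{nr}\right).
    \]

Nonresponse thus damages an estimate only insofar as respondents and nonrespondents differ on the survey variable, and victimization plausibly differs most in exactly those subgroups where response propensities are lowest; quantifying that gap is what subgroup-specific error analysis would accomplish.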
On a related matter, BJS research activities should also be directed at improving outreach to, and data collection coverage of, groups that are traditionally hard to reach by survey methods; such groups include new immigrant groups, persons and households where English is not the primary spoken language, young minorities in urban centers, and the homeless.

Recommendation 5.14: BJS should study the measurement of emerging or hard-to-reach groups and should develop more appropriate approaches to sampling and measurement of these populations.

In what follows, we suggest a few selected areas for a BJS research program. These should not necessarily be interpreted as the only or the most pressing research priorities, but we believe they are all important directions.

In terms of methodological innovations, BJS should consider greater use of model-based estimation. In our interim report, we recommended investigation of such modeling for the generation of subnational estimates from the NCVS (National Research Council, 2008b:Rec. 4.5); improving the spatial and, perhaps, temporal resolution of estimates from the NCVS remains the highest priority in this regard, but the methodology could be brought to bear in other areas. The development of small-area estimates is particularly pressing because the agency is often criticized for not being able to speak to subnational areas. Modeling can also refer to the use of multivariate analyses to control for factors that mask real changes in the phenomenon of interest. Just as many economic indicators are adjusted for inflation or seasonal fluctuation, it would make sense to adjust crime rates for factors that mask important variation. Age-adjusting crime rates, for example, would help separate the effects of macro-level social changes (over which one has little control) from more troubling and actionable changes in the incidence of crime. The same can be said of incarceration rates: adjusting admission rates for the volume of crime would provide a perspective on the use of incarceration not available in simple population-based rates.
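One concrete form such an adjustment could take, offered here as an illustration rather than as a method BJS has adopted, is direct standardization as long used for mortality statistics: age-specific rates are averaged with weights taken from a fixed standard population,

    \[
      r_{\mathrm{adj}} = \sum_{a} w_a \, r_a,
      \qquad
      r_a = \frac{\text{events in age group } a}{\text{population in age group } a},
      \qquad
      \sum_{a} w_a = 1,
    \]

where the weights \(w_a\) are the age-group shares of the standard population. Movements in \(r_{\mathrm{adj}}\) over time then reflect changes in age-specific risk rather than shifts in the age composition of the population.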
Modeled data should surely be used when we know that right or left censoring makes the data incomplete and inaccurate. For years, BJS published estimates of time served in prison based on exiting cohorts even though it knew that this approach seriously underestimated time served. This is a case in which model-based estimates would almost certainly have been more accurate than purely data-based estimates.
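The exit-cohort problem is easy to demonstrate. The simulation below is a stylized sketch with invented numbers, not a reconstruction of any BJS series: it draws admission cohorts whose sentence lengths grow over time and then computes mean time served among one year's releases. Because that exit cohort mixes short spells from the current regime with the tail of older, shorter-sentence regimes, it understates the time that current admissions will ultimately serve.

    # Stylized sketch of exit-cohort bias in time-served estimates;
    # all quantities are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(7)
    records = []
    for year in range(30):
        mean_term = 2.0 + 4.0 * year / 29.0   # policy severity rises over time
        terms = rng.exponential(mean_term, 10_000)
        entries = year + rng.uniform(0.0, 1.0, 10_000)
        records.append(np.column_stack([entries, terms]))

    data = np.vstack(records)                 # columns: entry time, term length
    release = data[:, 0] + data[:, 1]

    # Exit-cohort estimate: mean completed term among year-29 releases.
    exit_terms = data[(release >= 29) & (release < 30), 1]
    # Benchmark: mean term handed to the most recent admission cohort.
    current_terms = data[data[:, 0] >= 29, 1]

    print(f"exit-cohort mean time served:  {exit_terms.mean():.2f} years")
    print(f"current admission-cohort mean: {current_terms.mean():.2f} years")

Under these invented inputs, the exit-cohort mean falls well short of the current admission-cohort mean, which is the direction of bias described in the text; a model-based estimator would instead project completed terms for the current cohort directly.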
However, greater use of model-based estimates must be approached with caution, for several reasons. One is the challenge of interpretation: modeling may not be understood by many consumers of BJS data. This may be largely a presentational problem that can be solved by presenting the estimates simply and then providing the detailed description of the modeling elsewhere. The use of double-decrement life tables by Bonczar and Beck (1997) (later updated by Bonczar, 2003) is a good illustration of how modeling could be presented in BJS reports. Another challenge is that models are always based on assumptions, assumptions that can be more or less accurate or robust (and there can be wide disagreement over what is accurate or robust). Hence, situations where the choice of assumptions may be interpreted as reflecting political or other bias should be avoided.

A more basic methodological development, though still a complex research effort, would be for BJS to invest in the creation and revision of basic classifications and typologies for crime and criminal justice matters. Its role in coordinating information from a variety of justice-related agencies and promoting standards through National Criminal History Improvement Program-type grants for improvement of source databases gives BJS unique advantages in taking on such an effort. The classification of "index crimes" used in the UCR has changed little in 80 years and remains the nation's major crime classification; its implications for which crimes are most serious are central to the definitions used in the NCVS and other BJS collections. Yet the interest in crime and the amount of information available on crime have changed greatly over those 80 years, and the basic classification of crime should be revisited to keep pace with these changes.

BJS should also invest some effort in developing denominators for risk rates that are more reflective of the at-risk population. Major cities, for example, are disadvantaged in the annual crime rankings of jurisdictions based on UCR data because their rates are based upon their residential population—a base that excludes the commuters, shoppers, and culture seekers who contribute to the numerators of the rates.
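A small numerical illustration, with invented figures, shows how much the choice of denominator can matter. Suppose a city of 600,000 residents absorbs 300,000 commuters and visitors on a typical day and records 6,000 violent offenses in a year. Then

    \[
      \frac{6{,}000}{600{,}000} \times 100{,}000 = 1{,}000
      \quad \text{(per resident population)}
      \qquad \text{versus} \qquad
      \frac{6{,}000}{900{,}000} \times 100{,}000 \approx 667
      \quad \text{(per population at risk)},
    \]

a one-third reduction in the city's apparent rate per 100,000 from nothing more than a more realistic base.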
Likewise, incarceration rates based on the entire population are technically correct but may be otherwise misleading, because the very young and the very old are not at risk of incarceration. The generation of risk rates should not be restricted to the data generated by BJS but should use other data as long as the quality and periodicity of those data are acceptable. To report estimates from BJS's inmate surveys as proportions of the prison population misses a great opportunity to understand much better how the nation uses its prison resources; incarceration rates reflecting the general household population (as in Bonczar, 2003) may be uniquely informative.
5–B.9 Strong Internal and External Evaluation Program

    Statistical agencies that fully follow [this set of prescribed practices] will likely be in a good position to make continuous assessments of and improvements in the relevance and quality of their data collection systems. . . . Regular, well-designed program evaluations, with adequate budget support, are key to ensuring that data collection programs do not deteriorate. (National Research Council, 2009:47, 48)

The practice of instituting a strong internal and external evaluation program is a new addition to the fourth edition of Principles and Practices for a Federal Statistical Agency. It is similar to the practice of maintaining an ongoing research program (Section 5–B.8) but has slightly different connotations, emphasizing not only continuous quality assessment of individual data collection programs but also periodic examination of the quality and relevance of an agency's entire data collection portfolio. It is very much to BJS's credit with respect to this practice that it has periodically sought the advice of external users and methodologists on specific methodological problems, that it engaged in the intensive rounds of testing and evaluation that led to the redesigned NCVS in the early 1990s, that it regularly receives feedback on data quality from its state SAC network and JRSA, and that it actively sought and encouraged this panel's review of the full BJS portfolio.

Like other small statistical agencies, BJS is limited by available resources in its ability to mount large-scale evaluation efforts. Still, attention to internal and external evaluation is critical. Indeed, some of the guidance we offer in this report—for instance, on emphasizing the flows from step to step in the justice system within existing BJS data sets and facilitating linkage between current data sets (Section 3–F.1)—depends critically on careful evaluation of the strengths and limitations of current data collections and structures as a first step.

One general direction for improvement by statistical agencies, including BJS, is greater attention to known data quality issues and comparisons with other data resources as part of the general documentation of data sets.
BJS reports are generally careful to include a concise methodology section, and the public-use data files that are accessible at the NACJD typically include additional detail in their codebooks. Still, as a general practice, BJS should work to find ways to improve the documentation of its major data holdings that is directly accessible from BJS. This could include developing and making available technical reports based on specific user experiences, and providing direct links to technical reports by the Census Bureau (and other BJS-contracted data collection agents) on the development of specific survey instruments.

As part of an evaluation program, it would also be useful for BJS to move beyond examinations of individual series and approach critiques of the relative quality of multiple sources. This work should be done in partnership with other statistical agencies or data users, as we describe below in Section 5–B.11 for comparing BJS's prison and jail censuses with the data quality and resolution provided by the Census Bureau's ACS. Other examples of multiple-source evaluation include the following (a minimal sketch of one such comparison appears after this list):

• Examination of differences between homicide rates computed from the UCR data and those from the cause-of-death data coded in the vital statistics compiled by the National Center for Health Statistics;

• Reconciliation of the number of gunshot victims known to the police (or measured in emergency room admissions data) with the number of self-reported gunshot victims in the NCVS (see, e.g., Cook, 1985); and

• Examination of the reasons why serious-violence victimization rates from the NCVS and School Crime Supplement differ from those derived from CDC's Youth Risk Behavior Surveillance System.
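As a sketch of what the first comparison on this list might look like in practice, the fragment below aligns two annual homicide series and reports their relative gap. The counts shown are placeholders rather than actual UCR or NCHS figures, and a real evaluation would first reconcile definitional differences (for example, the treatment of justifiable homicides and deaths of undetermined intent) before interpreting any gap.

    # Sketch of a two-source series comparison; all counts are placeholders.
    import pandas as pd

    ucr = pd.Series({2001: 16_000, 2002: 16_200, 2003: 16_500}, name="ucr")
    nchs = pd.Series({2001: 17_100, 2002: 17_300, 2003: 17_600}, name="nchs")

    comparison = pd.concat([ucr, nchs], axis=1)
    comparison["pct_gap"] = 100.0 * (comparison["nchs"] / comparison["ucr"] - 1.0)
    print(comparison.round(1))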
5–B.10 Professional Advancement of Staff

    To develop and maintain a high-caliber staff, a statistical agency must recruit and retain qualified people with the relevant skills for its efficient and effective operation, including analysts in fields relevant to its mission (e.g., demographers, economists), statistical methodologists who specialize in data collection and analysis, and other specialized staff (e.g., computer specialists). (National Research Council, 2009:12)

At the panel's request, BJS supplied biographical information for its staff members as of fall 2008. A total of 32 of the 53 staff members hold positions with labels connoting direct statistical work (statistician, senior statistician, or branch chief); 12 have doctoral degrees (with an additional five listed as Ph.D. candidates) and nearly all list master's degrees. However, none holds a doctoral or master's degree in statistics, although two statisticians have completed master's degrees in the Joint Program in Survey Methodology of the University of Maryland, the University of Michigan, and Westat.

Indeed, the only formal statistics degree on the full BJS staff is a bachelor's degree, held by a specialist on the support staff. Not surprisingly, advanced degrees in criminology (or criminal justice) and sociology abound, though other fields such as social psychology, social welfare, and public affairs are also represented. The statistician ranks in BJS also include one holder of a law degree.

Our review of the staff biographies—and of BJS's publications, throughout this report—suggests a very capable and dedicated staff, with a median length of service of about 8 years and including several career staff members of 20 years or more. Our intent is not to impugn the good work of the BJS staff. However, in Section 5–B.7 and in our interim report, we commented on the need for more highly skilled technical leaders within BJS; we think this is necessary to put BJS on a better footing in dealing with its external data collection agents, to cultivate a climate of research and innovation, and to safeguard the continued credibility and quality of BJS data. Going further, we suggest that BJS would benefit from additional staff expertise in mathematical and survey statistics; expertise in computer science and database management is also notably deficient, given the agency's role in executing grants to improve criminal justice databases and the importance of record linkage for conducting longitudinal studies of flows in the justice system.

Recommendation 5.15: BJS must improve the technical skills of its staff, including mathematical statisticians, computer scientists, survey methodologists, and criminologists.

At the same time, the panel notes that the problem of recruiting technical staff is a large one for all statistical agencies. The agencies in the federal statistical system that seem to do better on this score are those that actively support advanced degrees among their junior staff—that is, making human capital investments in bachelor's-level staff and assisting their graduate studies to yield more technically astute staff in 2–4 years. In addition, agencies have sponsored dissertation fellowships on their own data, using the contact with the Ph.D. candidate to recruit talented staff.

5–B.11 Coordination and Cooperation with Other Statistical Agencies

    Although agencies differ in their subject-matter focus, there is overlap in their missions and a common interest in serving the public need for credible, high-quality statistics gathered as efficiently and fairly as possible. When possible and appropriate, federal statistical agencies should cooperate not only with each other, but also with state and local statistical agencies in the provision of data for subnational areas. (National Research Council, 2009:13)

There are some valuable and mutually productive partnerships between BJS and other statistical agencies. These include relatively long-term arrangements, such as the National Center for Education Statistics' sponsorship of the School Crime Supplement, as well as one-time collaborations, such as a joint report by BJS and CDC staff on findings from the NCVS on injuries sustained in the course of violent crime victimizations (Simon et al., 2001). BJS has also enjoyed some collaborative work with the National Center for Health Statistics, including use of vital statistics data collected from state public health departments and registrars. BJS has also, on occasion, worked with agencies that are not principal statistical agencies but that do conduct statistical work; for instance, BJS sponsored the Consumer Product Safety Commission to add a Survey of Injured Victims of Violence as a module to the commission's National Electronic Injury Surveillance System—a sample of hospitals that provide their emergency department records for coding and analysis (Rand, 1997).

Of course, BJS's most intensive relationship with another statistical agency is with the Census Bureau. Although there are some cooperative aspects of the partnership between the two agencies, the panel believes that there are some fundamental strains in the relationship. One is that, as noted in the preceding section, BJS has lacked the strong statistical expertise to fully engage with the Census Bureau staff on design (and redesign) issues, and so its role in modifying the NCVS to fit within budgetary constraints has largely been one of deciding which Census Bureau–developed cost-saving options are least objectionable. Another element of strain is discussed in our interim report (National Research Council, 2008b:Sec. 5–D): the failure of the Census Bureau to provide transparency in its costs and charges for data collection to BJS (or its other federal agency sponsors), which makes assessments of the trade-offs between survey costs and errors impossible.

Agencies that contract out much of their work—and BJS is one of the extreme cases within the statistical system in that regard—can easily evolve into ones where contract management is the dominant focus. While more (and more sophisticated) technical staff will not solve BJS's budget problems, they can make BJS a stronger partner to the other statistical agencies with which it works.

On substantive grounds, an important area in which a healthy BJS–Census Bureau relationship and collaboration would be beneficial is in reconciling BJS's corrections data series with the Census Bureau's measures of the correctional institution population. The American correctional apparatus has grown enormously since the mid-1970s; there are now on the order of 2.3 million persons in prison or jail, and the incarceration rate has grown fourfold since 1980. Another 800,000 people are on parole, and 4.2 million are on probation. Virtually all the growth in incarceration since 1980 has been among those with less than a high school education.

In this context, the BJS data collections are a valuable supplement to the large Census Bureau household surveys, which are drawn exclusively (or nearly so) from the noninstitutional household population. BJS collections on the population under correctional supervision are not just an important part of an accounting of the criminal justice system but an increasingly important part of the nation's accounting for the population as a whole. Those groups overrepresented in prison populations—minority men under age 40 with little schooling—are also significantly undercounted in household surveys and other data collections from the general population.

The Census Bureau's ACS now contains the detailed social, demographic, and economic questions that were traditionally asked of a sample of the population through the "long form" questionnaire of the decennial census. When the ACS entered full-scale collection earlier this decade, it also included coverage of the group quarters (nonhousehold) population, including prisoners. The first 3-year-average estimates from the ACS, for areas with populations of 20,000–65,000, only became available in 2008, and the first 5-year-average estimates for all geographic areas (including those with populations under 20,000) are not slated for release until 2010. Hence, the properties of these estimates—much less their accuracy for segments of the relatively small group quarters population—are only beginning to be studied and understood. Going forward, an important question will be how the most accurate picture of the prison and jail population can be derived, balancing the ACS estimates against the annual count (and basic demographic) information from BJS's prison and jail censuses and the detailed information available from BJS's inmate surveys.

5–C SUMMARY

The panel believes that BJS and DOJ should continually examine BJS's fulfillment of the principles and practices of a federal statistical agency. Our panel's review found that the perceived independence of the agency was severely shaken by recent events. We found that the trust of data providers is threatened by BJS directly assisting regulatory activities. We also found that a renewed emphasis on increasing the technical and research skills of BJS's staff is needed.
