5
Principles and Practices: BJS as a Principal U.S. Federal Statistical Agency

OUR CHARGE DIRECTS US to provide guidance to the Bureau of Justice Statistics (BJS) regarding its strategic priorities and goals. Before doing this, it is important to consider the functions, and expectations, of BJS from a higher, agency-level perspective.

One important filter through which to view the priorities and operations of BJS is its role as one of the principal statistical agencies in the U.S. federal statistical system. Relative to other countries, the U.S. federal statistical system is highly decentralized. Whereas other countries vest the primary authority for collection and dissemination of statistical data in a single agency (the Australian Bureau of Statistics, Statistics Canada, and Statistics Netherlands, for example), authority for production of official statistics in the United States is divided across numerous agencies.[1] These agencies are by no means equal in terms of their staffing levels and budgetary resources. As shown in Figure 5-1, the three largest statistical agencies (the Census Bureau, the Bureau of Labor Statistics, and the National Center for Education Statistics) dominate the others in terms of resources, even though the subject-matter portfolios of the smaller agencies (justice, transportation, agriculture, and so forth) are undeniably important.

[1] The statistical system of the United Kingdom is also frequently cited as an example of centralization; it is currently in a state of change. Effective as of April 2008, a new Statistics and Registration Service Act formally abolished the legal role of the Office for National Statistics (ONS), previously the United Kingdom's dominant statistical agency. ONS functions continue, but the office is now a subsidiary of the Statistics Board, created as an independent corporate body as the arbiter and producer of official statistics in the country. As discussed in our panel's interim report (National Research Council, 2008b), the British Crime Survey is an example of a United Kingdom data collection that is not collected by ONS or the Statistics Board; it is administered by the Home Office.

Figure 5-1 Estimated direct funding levels for principal federal statistical agencies, fiscal year 2008
NOTES: NSF, National Science Foundation. Including the costs associated with the decennial census would add $797.1 million to the Census Bureau total.
SOURCE: U.S. Office of Management and Budget (2007:Table 1).

It is appropriate, in the panel's judgment, to evaluate BJS in the context of the larger federal statistical system, especially the principal statistical agencies whose primary mission is the collection and dissemination of statistical information. (There are 60-70 other federal agencies that spend more than $500,000 per year on statistical information dissemination, but whose program duties outweigh their statistical focus.)

The panel benefited from a preexisting, fully vetted set of evaluative criteria for a federal statistical agency. The observations that we make in this chapter are generally structured around the Principles and Practices for a Federal Statistical Agency, a white paper of the Committee on National Statistics (CNSTAT) (National Research Council, 2005b). Principles and Practices articulates the basic functions that are expected of a unit of the U.S. federal statistical system; it also outlines ideals for the relationship between individual statistical agencies and their parent departments. As such, it has been widely used by various statistical agencies in their interactions with Congress and with officials in their departments. Indeed, BJS has already embraced such evaluative criteria: a summary version of the Principles and Practices is featured prominently on the front page of BJS's current strategic plan (Bureau of Justice Statistics, 2005a) and as a top-level link on BJS's website (under "BJS Statistical Principles and Practices").

The two sections of this chapter assess BJS, its products, and its performance relative to the Principles and Practices for a Federal Statistical Agency; each subsection begins with a précis of the relevant descriptive text from the fourth edition of Principles and Practices (National Research Council, 2009). In the course of this review, we provide extended discussion of two "flashpoints" in recent BJS experience (the circumstances surrounding release of data from the 2002 Police-Public Contact Survey (PPCS) that led to the dismissal of a BJS director, and the reporting requirements imposed by the Prison Rape Elimination Act of 2003) that are particularly relevant to examination of the major principles of a statistical agency. We defer conclusions and assessments based on the chapter as a whole to Chapter 6, a more comprehensive statement on strategic goals for BJS.

5-A PRINCIPLES OF A FEDERAL STATISTICAL AGENCY

5-A.1 Trust Among Data Providers

    A federal statistical agency must have the trust of those whose information it obtains.
    Data providers, such as respondents to surveys and custodians of administrative records, must be able to rely on the word of a statistical agency that the information they provide about themselves or others will be used only for statistical purposes. An agency earns the trust of its data providers by appropriately protecting the confidentiality of responses. Such protection, in particular, precludes the use of individually identifiable information maintained by a statistical agency, whether derived from survey responses or another agency's administrative records, for any administrative, regulatory, or law enforcement purpose. (National Research Council, 2009:5-6)

In a democracy, government statistical agencies depend on the willing cooperation of resident respondents to provide information about themselves and their activities. This willingness requires assurance that their information will not be used to intervene in their lives in any way. When respondents to BJS data collections provide data to the agency (or contractors representing the agency) they are told that their individual data will never be used to harm them; they are told that the purposes of the data collection will be
fulfilled by publicly available statistical results; they are informed that the agency is an independent statistical organization transcending the current administration in power; they are informed that their data will be kept confidential. Agencies that fulfill such pledges build over time with their data providers a sense of trust that the agency's intentions are benign and that the agency respects their rights as data providers. When political interference is suspected by data providers, the trust that their reports are being used appropriately, solely to create statistical information, can be shaken, and restoring that trust can be a much slower process than its destruction.

BJS has many different target populations of data providers. Some are very large (e.g., the entire U.S. household population for the National Crime Victimization Survey (NCVS) or the full set of state courts of general jurisdiction) whereas others are quite small (e.g., state-level departments of correction or federal prisons). Small populations of data providers generally are repeatedly asked for data in ongoing BJS series. In turn these data providers are often more interested in the outcome of the data collections and may use the statistical information for their own purposes.

From its review of BJS documents, knowledge of its data sets, and interactions with its respondents, the panel concludes that BJS and its data collection agents are generally very diligent in preserving the confidentiality of responses from its respondents. This is particularly true for the NCVS, the effectiveness of which is wholly predicated on building trust and rapport between interviewer and respondent in order to obtain full and accurate accounts of victimization incidents. The use of respondents' data solely for statistical purposes is generally well known and presented. However, we note that this is only "generally" true in that there exists a flagrant exception.
In the judgment of the panel, the reporting requirements of the Prison Rape Elimination Act of 2003 (PREA) oblige BJS to violate the principle of trust among its institutional data providers. Specifically, the provision of information to a statistical agency is fundamentally different from the provision of information to a regulatory or enforcement agency. Regulatory agencies, by their very nature, have the goal of intervention in individual activities when those activities are found to violate prescriptions sanctioned by the government. The crux of the problem is that the PREA reporting requirements assign to BJS a quasi-regulatory role, directly using data collected from responding institutions to impose sanctions. In the remainder of this section, we describe this breach of principle by describing the history and the implementation of the PREA reporting requirement.

Historical Development of the Prison Rape Elimination Act

In Farmer v. Brennan (511 U.S. 825 [1994]), the U.S. Supreme Court ruled that "deliberate indifference" to serious health and safety risks by
prison officials constitutes a violation of the Eighth Amendment protection against cruel and unusual punishment. The particular case in Farmer involved a preoperative transsexual prisoner who was raped and beaten shortly after transfer to a federal penitentiary; the Court's ruling vacated lower court rulings that rejected the plaintiff's argument on the grounds that prison officials had not been demonstrated to be criminally reckless.

By 2002-2003, the general problem of sexual assault in prison drew legislative interest in Congress. Ultimately, the legislative initiative produced PREA. The final act is lengthy, including specification of grant monies targeted at reduction strategies, the establishment of a national commission, and adoption of national standards. However, in this section, we focus on the specific demands put on BJS by a section of the act covering "national prison rape statistics, data, and research": reporting requirements that, in certain respects, run counter to the proper and accepted role of a federal statistical agency.

In the 107th Congress, identical versions of a proposed "Prison Rape Reduction Act" were introduced in both houses (H.R. 4943 and S. 2619). On the occasion of the introduction of the measure in the Senate, cosponsor Sen. Edward Kennedy (D-Mass.) described what little was known quantitatively about the extent of sexual assault in U.S. prisons:[2]

    Prison rape is a serious problem in our Nation's prisons, jails, and detention facilities. Of the two million prisoners in the United States, it is conservatively estimated that one in ten has been raped. According to a 1996 study, 22 percent of prisoners in Nebraska had been pressured or forced to have sex against their will while incarcerated [(Struckman-Johnson et al., 1996)].[3] Human Rights Watch recently reported "shockingly high rates of sexual abuse" in U.S. prisons [(Human Rights Watch, 2001)].[4]

Cosponsor Sen. Jeff Sessions (R-Ala.)
concurred, and briefly described the statistical analysis section of the bill:[5]

    Some studies have estimated that over 10 percent of the inmates in certain prisons are subject to rape. I hope that this statistic is an exaggeration. . . .

[2] Congressional Record, June 13, 2002, p. S5337.
[3] Struckman-Johnson et al. (1996:69-70) distributed questionnaires (for response by mail) to all inmates and staff at two maximum security men's prisons, one minimum security men's prison, and one women's facility, all of which are "in the state prison system of a rural Midwestern state." The state is not explicitly identified, but later discussions of the results included the acknowledgment of Nebraska as the survey site. In all, 1,801 prisoners and 714 staff members at these facilities were eligible to participate; 528 inmates and 264 staff members responded.
[4] No formal survey or statistical data collection was used by Human Rights Watch (2001); instead, the report's observations were based on written reports from about 200 prisoners, responding to announcements in publications and leaflets.
[5] Congressional Record, June 13, 2002, pp. S5337, S5338.
    [This] bill will require the Department of Justice to conduct statistical surveys on prison rape for Federal, State, and local prisons and jails. Further, the Department of Justice will select officials in charge of certain prisons with an incidence of prison rape exceeding the national average by 30 percent to come to Washington and testify to the Department about the prison rape problem in their institution. If they refuse to testify, the prison will lose 20 percent of certain Federal funds.

In both chambers, the legislation was referred to Judiciary subcommittees and no further action was taken (save that the Senate Judiciary Committee held a hearing on the bill on July 31, 2002).

In the 108th Congress, legislation identical to the previous bill was introduced by Rep. Frank Wolf (R-Va.) and Rep. Bobby Scott (D-Va.) in the House as H.R. 1707 on April 9, 2003.[6] However, deliberations between members and staff in both chambers were progressing toward a revised, bipartisan proposal, and these deliberations resulted in rapid passage of the bill. On June 11, 2003, the House Subcommittee on Crime, Terrorism, and Homeland Security replaced the existing text of H.R. 1707 with substitute language and favorably reported it to the full Judiciary Committee. In turn, the Judiciary Committee approved the revised bill on July 9. The Judiciary Committee's report on the bill, H.Rept. 108-219, offers no explanation for the revised wording in the BJS data collection section of the act. On July 21, Sen. Sessions introduced S. 1435, consistent with[7] the revised House language, but now bearing the name "Prison Rape Elimination Act." Upon introduction, the bill was immediately passed by unanimous consent without debate or amendment; the House took up the Senate bill on July 25 and passed it without objection; and the bill was signed on September 4, becoming Public Law 108-79.
Text of the Act and Reporting Requirements

Box 5-1 shows the alterations to the section of PREA concerning BJS data collection between its original introduction in the 107th Congress and final passage. Both the original and final versions of the bill establish a Review Panel on Prison Rape; the original would have administratively housed the Review Panel in BJS while the final version makes it an organ of the Justice Department. To be clear, it is important to note that the Review Panel is more limited in scope than the National Prison Rape Elimination Commission created by other sections of the act. The Review Panel's work is structured around the BJS work, while the formally appointed Commission

[6] A variant on the same bill, with the same reporting requirements on BJS, was introduced on April 10, 2003, as H.R. 1765 but progressed no further than referral to committee.
[7] Judiciary Committee Chairman James Sensenbrenner (R-Wisc.) described the Senate bill as "substantively identical to H.R. 1707" in his floor remarks on passage of the act (Congressional Record, July 25, 2003, p. 7765).
Box 5-1 Statistical Reporting Provisions of Original and Final Versions of the Prison Rape Elimination Act

The following excerpt compares text from Section 2 of H.R. 4943 (107th Congress) and Section 4 of S. 1435 (108th Congress), the latter of which was enacted as Public Law 108-79. Subsections (d) and (e) on contracts and authorization of appropriations are omitted. Deletions from the earlier version are shown in {braces}; additions in the newer version are shown between _underscores_.

NATIONAL PRISON RAPE STATISTICS, DATA, AND RESEARCH.
(a) ANNUAL COMPREHENSIVE STATISTICAL REVIEW-
(1) IN GENERAL- The Bureau of Justice Statistics of the Department of Justice (in this section referred to as the "Bureau") shall carry out, for each calendar year, a comprehensive statistical review and analysis of the incidence and effects of prison rape. The statistical review and analysis shall include, but not be limited to the identification of the common characteristics of-
(A) {inmates who have been involved with prison rape, both victims and perpetrators} _both victims and perpetrators of prison rape_; and
(B) prisons and prison systems with a high incidence of prison rape.
_(2) CONSIDERATIONS- In carrying out paragraph (1), the Bureau shall consider-
(A) how rape should be defined for the purposes of the statistical review and analysis;
(B) how the Bureau should collect information about staff-on-inmate sexual assault;
(C) how the Bureau should collect information beyond inmate self-reports of prison rape;
(D) how the Bureau should adjust the data in order to account for differences among prisons as required by subsection (c)(3);
(E) the categorization of prisons as required by subsection (c)(4); and
(F) whether a preliminary study of prison rape should be conducted to inform the methodology of the comprehensive statistical review._
_(3) SOLICITATION OF VIEWS- The Bureau of Justice Statistics shall solicit views from representatives of the following: State departments of correction; county and municipal jails; juvenile correctional facilities; former inmates; victim advocates; researchers; and other experts in the area of sexual assault._
{(2)}_(4)_ SAMPLING TECHNIQUES- The analysis under paragraph (1) shall be based on a random sample, or other scientifically appropriate sample, of not less than 10 percent of all Federal, State, and county prisons, and a representative sample of municipal prisons. The selection shall include at least one prison from each State. The selection of facilities for sampling shall be made at the latest practicable date prior to conducting the surveys and shall not be disclosed to any facility or prison system official prior to the time period studied in the survey. Selection of a facility for sampling during any year shall not preclude its selection for sampling in any subsequent year.
{(3)}_(5)_ SURVEYS- In carrying out the review {required by this subsection} _and analysis under paragraph (1)_, the Bureau shall, in addition to such other methods as the Bureau considers appropriate, use surveys and other statistical studies of current and former inmates from a sample of Federal, State, county, and municipal prisons. The Bureau shall ensure the confidentiality of each survey participant.
_(6) PARTICIPATION IN SURVEY- Federal, State, or local officials or facility administrators that receive a request from the Bureau under subsection (a)(4) or (5) will be required to participate in the national survey and provide access to any inmates under their legal custody._
(b) REVIEW PANEL ON PRISON RAPE-
(1) ESTABLISHMENT- To assist the Bureau in carrying out the review and analysis under subsection (a), there is established, within the {Bureau} _Department of Justice_, the Review Panel on Prison Rape (in this section referred to as the "Panel").
(2) MEMBERSHIP-
(A) COMPOSITION- The Panel shall be composed of 3 members, each of whom shall be appointed by the Attorney General, in consultation with the Secretary of Health and Human Services.
(B) QUALIFICATIONS- Members of the Panel shall be selected from among individuals with knowledge or expertise in matters to be studied by the Panel.
(3) PUBLIC HEARINGS-
(A) IN GENERAL- The duty of the Panel shall be to carry out, for each calendar year, public hearings concerning the operation of {each entity identified in a report under clause (ii) or (iii) of subsection (c)(2)(B)} _the three prisons with the highest incidence of prison rape and the two prisons with the lowest incidence of prison rape in each category of facilities identified under subsection (c)(4). The Panel shall hold a separate hearing regarding the three Federal or State prisons with the highest incidence of prison rape._ The purpose of these hearings shall be to collect evidence to aid in the identification of common characteristics of {inmates who have been involved in prison rape, both victims and perpetrators} _both victims and perpetrators of prison rape_, and the identification of common characteristics of prisons and prison systems {with a high incidence of prison rape} _that appear to have been successful in deterring prison rape_.
(B) TESTIMONY AT HEARINGS-
(i) PUBLIC OFFICIALS- In carrying out the hearings required under subparagraph (A), the Panel shall request the public testimony of Federal, State, and local officials _(and organizations that represent such officials), including the warden or director of each prison, who bears responsibility for the prevention, detection, and punishment of prison rape at each entity, and the head of the prison system encompassing such prison,_ {who bear responsibility for the prevention, detection, and punishment of prison rape at each entity}.
(ii) VICTIMS- The Panel may request the testimony of prison rape victims, organizations representing such victims, and other appropriate individuals and organizations.
{(C) FAILURE TO TESTIFY- If, after receiving a request by the Panel under subparagraph (B)(i), a State or local official declines to testify at a reasonably designated time, the Federal funds provided to the entity represented by that official pursuant to the grant programs designated by the Attorney General under section 9 shall be reduced by 20 percent and reallocated to other entities. This reduction shall be in addition to any other reduction provided under this Act.}
_(C) SUBPOENAS-
(i) ISSUANCE- The Panel may issue subpoenas for the attendance of witnesses and the production of written or other matter.
(ii) ENFORCEMENT- In the case of contumacy or refusal to obey a subpoena, the Attorney General may in a Federal court of appropriate jurisdiction obtain an appropriate order to enforce the subpoena._
(c) REPORTS-
(1) IN GENERAL- Not later than {March} _June_ 30 of each year, the {Bureau} _Attorney General_ shall submit a report on the activities of the Bureau {(including the Review Panel)} _and the Review Panel_, with respect to prison rape, for the preceding calendar year to-
(A) Congress; and
{(B) the Attorney General; and}
{(C)}_(B)_ the Secretary of Health and Human Services.
(2) CONTENTS- The report required under paragraph (1) shall include-
(A) with respect to the effects of prison rape, statistical, sociological, and psychological data; and
(B) with respect to the incidence of prison rape-
(i) statistical data aggregated at the Federal, State, prison system, and prison levels;
{(ii) an identification of the Federal Government, if applicable, and each State and local government (and each prison system and institution in the representative sample) where the incidence of prison rape exceeds the national median level by not less than 30 percent; and
(iii) an identification of jail and police lockup systems in the representative sample where the incidence of prison rape is significantly avoidable.}
_(ii) a listing of those institutions in the representative sample, separated into each category identified under subsection (c)(4) and ranked according to the incidence of prison rape in each institution; and
(iii) an identification of those institutions in the representative sample that appear to have been successful in deterring prison rape; and
(C) a listing of any prisons in the representative sample that did not cooperate with the survey conducted pursuant to section 4._
(3) DATA ADJUSTMENTS- In preparing the information specified in paragraph (2), the {Bureau shall, not later than the second year in which surveys are conducted under this Act,} _Attorney General shall_ use established statistical methods to adjust the data as necessary to account for {exogenous factors, outside of the control of the State, prison system, or prison, which have demonstrably contributed to the incidence of prison rape} _differences among institutions in the representative sample, which are not related to the detection, prevention, reduction and punishment of prison rape, or which are outside the control of the State, prison, or prison system_, in order to provide an accurate comparison among prisons. _Such differences may include the mission, security level, size, and jurisdiction under which the prison operates._ For each such adjustment made, the {Bureau} _Attorney General_ shall identify and explain such adjustment in the report.
_(4) CATEGORIZATION OF PRISONS- The report shall divide the prisons surveyed into three categories. One category shall be composed of all Federal and State prisons. The other two categories shall be defined by the Attorney General in order to compare similar institutions._
has a broader charge to develop national standards for the detection and prevention of sexual violence in correctional facilities. It also appears that one of the intended roles of the Review Panel was to "assist" BJS in its data collection efforts (as is explicitly stated in both versions of the bill). This assistance function is consistent with concerns expressed at a congressional hearing on the bill, arguing that BJS should have an advisory group to work out definitional issues in measuring prison rape (U.S. House of Representatives, Committee on the Judiciary, 2003:19).

The critical difference in the legislative texts in Box 5-1 lies in the reporting requirements to support public hearings by the Review Panel. The original proposal called for public hearings with officials from institutions with high and low incidences of prison rape (facilities "where the incidence of prison rape exceeds the national median level by not less than 30 percent" and facilities "where the incidence of prison rape is significantly avoidable"). However, the final law directs that, each year and for each category of facility, the facilities with the three highest and two lowest incidence rates be summoned to appear at hearings. Comparing the different versions of section (b)(3)(C) in Box 5-1, the original version of the act threatened institutions that refused to testify before the Review Panel with a 20 percent reduction in federal grant monies. The final version of the bill removed that threat but granted the Review Panel full subpoena power.[8] In addition to identifying the highest- and lowest-ranked institutions, the final legislative text also required the Review Panel (presumably using BJS's work) to provide a complete listing of all the facilities in the sample, "ranked according to the incidence of prison rape."
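The act's sampling rule, quoted in Box 5-1, imposes two constraints at once: a sample of not less than 10 percent of all facilities, and at least one prison from each state. A minimal sketch of selecting under both constraints is below; the facility list, the one-per-state-then-top-up procedure, and the function name are our own illustrative assumptions, not the sample design BJS actually used for the National Inmate Survey.

```python
# Sketch of the act's sampling constraint: a random sample of at least
# 10 percent of facilities that includes at least one prison per state.
# The facility roster here is hypothetical.
import math
import random

def prea_sample(facilities, rate=0.10, seed=0):
    """facilities: list of (facility_name, state) tuples."""
    rng = random.Random(seed)
    # First satisfy the per-state floor: one randomly chosen facility per state.
    by_state = {}
    for name, state in facilities:
        by_state.setdefault(state, []).append(name)
    sample = {rng.choice(names) for names in by_state.values()}
    # Then top up at random until the 10 percent floor is also met.
    target = max(len(sample), math.ceil(rate * len(facilities)))
    remaining = [name for name, _ in facilities if name not in sample]
    rng.shuffle(remaining)
    sample.update(remaining[: target - len(sample)])
    return sample

facilities = [(f"Facility {i}", f"State {i % 50}") for i in range(600)]
s = prea_sample(facilities)
print(len(s))  # 60: the 10 percent floor, with every state represented
```

A real design would draw probability-proportional-to-size strata rather than this equal-chance top-up, but the sketch shows why the two floors interact: with few facilities, the one-per-state requirement, not the 10 percent rate, determines the sample size.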
The original designation of "high" prison rape incidence (a value more than 30 percent greater than the national median) was a curious and intriguing one. Depending on the distribution of incidence rates across facilities, the criterion might have obliged the Review Panel to hear from an unworkably high number of parties, and perhaps that consideration drove the revision. Alternatively, singling out "the" highest-rate facilities may have been viewed by legislators as more consistent with the themes of accountability and action (as with the change in nomenclature from a "Prison Rape Reduction" to a "Prison Rape Elimination" Act). From the record, it is unclear exactly how and why the change came about. Indeed, both Rep. Scott's prepared statement for the Judiciary Committee markup of the bill on July 9, 2003 (H.Rept. 108-219, p. 114), and floor statement on the Senate bill

[8] Although the text does not appear in H.R. 4943 in Box 5-1, Corlew (2006) notes that the bill as originally proposed "would have granted a ten percent funding increase to prison systems that, because of their high percentages of prison rape, were required to provide testimony to the Review Panel." Both the American Correctional Association and the Association of State Correctional Administrators objected that this provision "appeared to reward undeserving systems," another reason for the change to subpoena authority in the final bill text.
on July 25 (Congressional Record, p. H7764), refer to "conduct[ing] public reviews of institutions where the rate of prison rape is 30% above the national average rate," even though that provision no longer existed in the revised language.

The principal congressional hearing on the bill was held before the House Judiciary Subcommittee on Crime, Terrorism, and Homeland Security on April 29, 2003. Being a House hearing, the bill referred to at the hearing was the original version of the legislation with the 30-percent-above-median reporting requirement. At that hearing, the only discussion of BJS's reporting role was raised by then-principal deputy attorney general for the Office of Justice Programs (OJP) Tracy Henke, and that concern came as a brief ending to her opening statement. Although Henke's remarks hinted at the inappropriateness of BJS's use of data for administrative and regulatory purposes, the specific objection was raised to the original bill's vague definition of low-prevalence facilities (U.S. House of Representatives, Committee on the Judiciary, 2003:13-14):

    I know my time is up, but real quickly, sir, another concern to the Department is that the Department believes that [it is of the utmost importance that[9]] the integrity of the statistical collection and analysis by the Bureau of Justice Statistics be preserved. The legislation currently requires BJS not only to collect but also to analyze data and produce reports on that analysis in a very short timeframe. We recognize the need for quick access to this information, but it must be balanced by providing BJS the opportunity to accurately and sufficiently analyze the data collected. Finally, the law authorizing BJS prohibits BJS from gathering data for any use other than statistical or research purposes.
    By requiring BJS to identify facilities "where the incidence of prison rape is significantly avoidable," the legislation calls for BJS to make judgments about what level of prison rape is "significantly avoidable." This responsibility goes beyond BJS's authorized statistical role.

BJS Data Collections and Reports in Support of the Act

In response to the enactment of PREA, BJS organized a series of data collection efforts, summarized in Bureau of Justice Statistics (2004c), that have been characterized as "a quantum leap in methodology and our knowledge about the problem" of prison rape (Dumond, 2006). The main efforts in the PREA-related data collections are an annual administrative-records-based inventory dubbed the Survey of Sexual Violence (SSV) and a recurring National Inmate Survey program. For the SSV, BJS contracted with the U.S.

[9] This grammatical insertion uses the wording of Henke's prepared statement, printed in the hearing record after the spoken remarks (U.S. House of Representatives, Committee on the Judiciary, 2003:16).
Census Bureau's Governments Division to collect records-based counts of reported incidents from federal and state prisons and a sample of local jails and private correctional facilities. Self-report personal interviewing contracts for the National Inmate Surveys were established with three separate contractors, corresponding to the specific populations and facility types envisioned by the act: RTI International (adult prisons and jails), Westat (juvenile facilities), and the National Opinion Research Center (soon-to-be released and former prisoners). In all of these self-report options, BJS settled on the use of audio computer-assisted self-interviewing (ACASI) as the best means to obtain personally sensitive information such as that called for in the inmate survey of sexual victimization. Under ACASI methods, respondents complete a questionnaire on a computer, following instructions played through earphones from the computer; in this way, respondents do not have to divulge embarrassing or sensitive information directly to another person, facilitating a more accurate response. Particularly for the adult prison populations, backup strategies for collection were also developed, including forms for administration to inmates considered too dangerous to interact with survey staff.

Beck and Hughes (2005) issued the first report on SSV data on victimization incidents reported to correctional facilities, corresponding to data collected in 2004. New reports on the SSV for 2005 and 2006 have since been issued. At this writing, three reports from National Inmate Surveys have been released.
A December 2007 report (Beck and Harrison, 2007) described the results from interviewing at a sample of 146 state and federal prisons, a June 2008 report covered interview results at a sample of 282 local jails (Beck and Harrison, 2008), and a July 2008 report summarized results from interviews at juvenile correctional facilities (Beck et al., 2008).

Cognizant of BJS's legal reporting requirements, both the prison and jail releases from the National Inmate Surveys identified the names of institutions with high rates of offending; however, both have explicitly described an inability to identify the three highest-rate and two lowest-rate facilities as prescribed by the law. Table 5-1 reproduces the key table from Beck and Harrison (2007) on federal and state prisons, identifying 10 high-rate facilities. Noting the standard errors calculated for the estimates, the report carefully explains that, "statistically, the NIS is unable to identify the facility with the highest prevalence rate" or "provide an exact ranking for all facilities as required" under PREA "as a consequence of sampling error" (Beck and Harrison, 2007:3). The report is accompanied by spreadsheets tabulating facility-specific estimates and standard errors of reported sexual victimization for the full sample; the entries are presented alphabetically by state rather than in the strict ranking suggested in the text of the act. In the body of the report, BJS chose to tabulate the top 10 results.

Table 5-1 Prison Facilities with Highest and Lowest Prevalence of Sexual Victimization, National Inmate Survey, 2007
(The weighted percent and standard error columns give the percent of inmates reporting sexual victimization.^a)

Facility Name                           Number of        Response   Weighted    Standard
                                        Respondents^b    Rate (%)   Percent^c   Error^d
U.S. total                              23,398           72         4.5         0.3
10 highest
  Estelle Unit, TX                      197              84         15.7        2.6
  Clements Unit, TX                     142              59         13.9        2.9
  Tecumseh State Corr. Inst., NE        85               39         13.4        4.0
  Charlotte Corr. Inst., FL             163              73         12.1        2.7
  Great Meadow Corr. Fac., NY           144              62         11.3        2.7
  Rockville Corr. Fac., IN^e            169              79         10.8        2.4
  Valley State Prison for Women, CA^e   181              78         10.3        2.3
  Allred Unit, TX                       186              71         9.9         2.2
  Mountain View Unit, TX^e              154              80         9.5         1.9
  Coffield Unit, TX                     194              76         9.3         2.1
6 lowest^f
  Ironwood State Prison, CA             141              60         0.0         —
  Penitentiary of New Mexico, NM        83               38         0.0         —
  Gates Corr. Ctr., NC                  52               74         0.0         —
  Bennettsville-Camp, BOP               77               69         0.0         —
  Big Spring Corr. Inst., BOP           155              66         0.0         —
  Schuylkill Fed. Corr. Inst., BOP      174              70         0.0         —

a Percent of inmates reporting one or more incidents of sexual victimization involving another inmate or facility staff in past 12 months or since admission to the facility, if shorter.
b Number of respondents selected for the National Inmate Survey on sexual victimization.
c Weights were applied so that inmates who responded accurately reflected the entire population of each facility on selected characteristics, including age, gender, race, time served, and sentence length.
d Standard errors may be used to construct confidence intervals around the weighted survey estimates. For example, the 95% confidence interval around the total percent is 4.5% plus or minus 1.96 times 0.3% (or 3.9% to 5.1%).
e Female facility.
f Facilities in which no incidents of sexual victimization were reported by inmates.
NOTES: —, Not applicable. BOP, Bureau of Prisons.
SOURCE: Reproduced from Beck and Harrison (2007:Table 1).

In a ranked list by weighted percentage of sexual victimization incidents, the 11th-ranked facility (the Hays State Prison in Georgia) is the first whose difference from the highest-ranked facility (the Estelle Unit in Texas) is statistically significant (α = 0.05). Hence, the top 10 results constitute a group whose overall sexual violence victimization rates are high relative to others, even if they are not statistically distinguishable from each other. Following the same logic, Beck and Harrison (2008) tabulated results for 18 high-rate local jails; in compliance with PREA requirements, Beck and Harrison (2008) also list sampled jails that declined to participate and permit interviewing in the survey. BJS's report on the survey administration in juvenile facilities (Beck et al., 2008) differs from the other reports in the series in that it does not attempt any tabular listing of specific facilities or ranking of highest-offense facilities, instead reporting summary statistics from the sample as a whole. However, the report does identify those juvenile facilities that declined to participate as well as those that reported no victimization incidents.

In addition to our panel's concern about the use of BJS data for regulatory or administrative uses, we are also critical of the procedures used for this part of reporting pursuant to PREA. Specifically, we are concerned that the approach greatly understates the variability inherent in the data; see Box 5-2.

Developments Following the First PREA Report Releases

Following the release of Beck and Harrison (2007), the Review Panel on Prison Rape established by PREA in the Department of Justice (DOJ) held 7 days of hearings in March 2008, in Washington, DC, and Houston, Texas, to obtain testimony from each of the adult federal and state prisons identified in Table 5-1. The National Prison Rape Elimination Commission issued press releases on the occasion of the BJS report releases and the start of the Review Panel hearings.
The Commission's June 25, 2008, release noted that:

Even with margins of error, the study reveals that these facilities have extraordinarily high rates of sexual assault, highlighting the severity of this national problem. . . . We welcome BJS's stated willingness to adjust future surveys to gather additional information. We hope the agency will develop more questions about inmate reporting efforts, the response of officials and factors that may play into reporting, such as threats of retaliation.

Since the enactment of PREA, similar legislative calls for expanded data collection on inmate health conditions have been introduced in Congress but have not advanced beyond referral to committee. For instance, the proposed Justice for the Unprotected Against Sexually Transmitted Infections Among the Confined and Exposed (JUSTICE) Act, introduced as H.R. 178 in January 2007, requires an annual survey of correctional facilities. In addition to
Box 5-2 Critique of the Reported Rankings in the Prison Rape Elimination Act Inmate Surveys

In its reports on the PREA inmate surveys in prisons and jails (Beck and Harrison, 2007, 2008), the Bureau of Justice Statistics (BJS) did what it could to convey the basic idea that the survey sample sizes are too small (and the underlying phenomenon being measured is sufficiently "rare") to preclude identification of high-rate facilities with the precision called for by the law. However, the panel observes that BJS's chosen approach is, itself, partly inaccurate in that it understates the variability inherent in the data.

As it stands, BJS has ranked the correctional facilities solely on the basis of sample-based estimates of prison rape rates. Each correctional facility among the top 10 is associated both with its name and its ranking. This is not a fully valid approach given that many of these rates have large standard errors. The standard errors reflect the uncertainty due to observing only a portion of the prison population. The estimated rates could have differed if a different sample of institutions were selected. This sampling uncertainty must be taken into consideration while developing such rankings. In other words, a facility's name is fixed, but its ranking is affected by the sampling error in the estimated rates.

There are simple procedures to account for such sampling uncertainty. For example, consider the 20 facilities with the highest rates and their standard errors. How fair is it to label only the top 10 as "bad" (let us call this top-10 group "Tier A") when the bottom 10 ("Tier B") could easily have been in Tier A purely by chance? This question can be answered using a simple bootstrap procedure, simulating what the estimated prison rape rates and their rankings for these 20 facilities could have been purely by chance.
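The parametric bootstrap just described can be sketched in a few lines of code. The following is an illustrative reconstruction, not the panel's actual program: the normal approximation to each facility's sampling distribution, the random seed, and the function name are our assumptions, and the rates and standard errors are the 20 Tier A/Tier B values tabulated in this box.

```python
import random

# Point estimates and standard errors for the 20 highest-rate facilities
# (percent of inmates reporting sexual victimization), ranks 1-20.
rates = [15.7, 13.9, 13.4, 12.1, 11.3, 10.8, 10.3, 9.9, 9.5, 9.3,
         9.1, 8.7, 8.5, 8.2, 8.1, 8.0, 8.0, 7.9, 7.9, 7.9]
ses = [2.6, 2.9, 4.0, 2.7, 2.7, 2.4, 2.3, 2.2, 1.9, 2.1,
       1.9, 2.2, 2.4, 2.0, 2.0, 2.2, 2.1, 2.1, 1.9, 1.7]

def tier_flip_probabilities(rates, ses, draws=10_000, seed=12345):
    """Estimate, for each facility, the probability that resampling would
    move it out of its published tier (Tier A = top 10, Tier B = next 10)."""
    rng = random.Random(seed)
    n = len(rates)
    flips = [0] * n
    for _ in range(draws):
        # Parametric bootstrap: redraw each rate from N(estimate, se).
        sim = [rng.gauss(r, s) for r, s in zip(rates, ses)]
        # Re-rank the simulated rates, highest first.
        order = sorted(range(n), key=lambda i: sim[i], reverse=True)
        top10 = set(order[:10])
        for i in range(n):
            # Facilities are listed in published rank order, so i < 10
            # means "published in Tier A"; count disagreements.
            if (i < 10) != (i in top10):
                flips[i] += 1
    return [f / draws for f in flips]

probs = tier_flip_probabilities(rates, ses)
for rank, p in enumerate(probs, start=1):
    print(f"rank {rank:2d}: estimated probability of switching tiers = {p:.3f}")
```

Run against the tabulated estimates, a simulation of this form reproduces the qualitative pattern reported in this box: the top-ranked facility almost never leaves Tier A, while facilities near the rank-10/rank-11 boundary switch tiers close to half the time.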
We used 10,000 parametric bootstrap draws from the sampling distribution of the estimated overall prevalence rates for prison rape in Beck and Harrison (2007) and ranked them. We then computed the proportion of times a facility labeled Tier A in BJS's report would have been placed in Tier B in the simulation, and vice versa. The following table gives these estimated probabilities.

          Prison Rape   Standard    Estimated Probability             Prison Rape   Standard    Estimated Probability
Tier A    Rate (%)      Error (%)   of Being in Tier B (%)   Tier B   Rate (%)      Error (%)   of Being in Tier A (%)
 1        15.7          2.6          0.2                      11       9.1          1.9         49.1
 2        13.9          2.9          3.6                      12       8.7          2.2         46.8
 3        13.4          4.0         12.1                      13       8.5          2.4         39.1
 4        12.1          2.7         12.3                      14       8.2          2.0         32.4
 5        11.3          2.7         20.3                      15       8.1          2.0         30.1
 6        10.8          2.4         23.7                      16       8.0          2.2         29.6
 7        10.3          2.3         30.4                      17       8.0          2.1         29.6
 8         9.9          2.2         37.2                      18       7.9          2.1         28.2
 9         9.5          1.9         42.4                      19       7.9          1.9         25.0
10         9.3          2.1         47.2                      20       7.9          1.7         24.5

NOTES: Bootstrap estimate of the error, or misclassification, of rates based purely on the point estimates and ignoring the standard error.

The misclassification rates are disturbingly high, as expected given the large standard errors, which are due partly to inadequate sample size. If error rates of 5 percent or more are not acceptable, then only the first two facilities stand out in that they would have remained in the Tier A set with high probability had a different sample been obtained. For many facilities the decision to label them as Tier A or Tier B is quite arbitrary and made purely by chance. This amply illustrates the problems of using statistical data for regulatory purposes.
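A multiple-comparison check of the Tier A/Tier B split can also be sketched directly. The fragment below is our own illustration, not taken from the report: the two-sided normal test and the overall α = 0.05 are our assumptions. It applies a Bonferroni correction to all 100 Tier A versus Tier B pairwise comparisons, using the rates and standard errors from the table above.

```python
from math import sqrt
from statistics import NormalDist

# Tier A (ranks 1-10) and Tier B (ranks 11-20) estimated rates and
# standard errors, in percent, from the bootstrap table in Box 5-2.
rates = [15.7, 13.9, 13.4, 12.1, 11.3, 10.8, 10.3, 9.9, 9.5, 9.3,
         9.1, 8.7, 8.5, 8.2, 8.1, 8.0, 8.0, 7.9, 7.9, 7.9]
ses = [2.6, 2.9, 4.0, 2.7, 2.7, 2.4, 2.3, 2.2, 1.9, 2.1,
       1.9, 2.2, 2.4, 2.0, 2.0, 2.2, 2.1, 2.1, 1.9, 1.7]

tier_a, tier_b = range(10), range(10, 20)
m = len(tier_a) * len(tier_b)  # 100 pairwise comparisons
alpha = 0.05
# Two-sided critical value after dividing alpha across all m comparisons.
crit = NormalDist().inv_cdf(1 - (alpha / m) / 2)

significant = [
    (i + 1, j + 1)  # report 1-based ranks
    for i in tier_a
    for j in tier_b
    if abs(rates[i] - rates[j]) / sqrt(ses[i] ** 2 + ses[j] ** 2) > crit
]
print(f"Bonferroni critical z = {crit:.2f}; "
      f"{len(significant)} of {m} pairs significantly different")
```

With these estimates, no Tier A/Tier B pair clears the Bonferroni threshold: the largest z statistic (rank 1 versus rank 20) is about 2.5 against a critical value near 3.5, which reinforces the point that the published tiers are not statistically separable.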
Box 5-2 (continued)

Alternatively, all possible comparisons between Tier A rates and Tier B rates may be considered simultaneously, and Bonferroni bounds may be used to conservatively determine significance levels for all possible pairwise comparisons. Such an approach has been used, in particular, in other applications where extreme ranks on some variable carry particular political sensitivity. For instance, the National Center for Education Statistics has developed such approaches for ranking states according to sample-based educational testing results, as described by Wainer (1996).

PREA-type queries on the incidence of sexual assault, the proposed data collection would require information on facility policy on testing for sexually transmitted diseases and data on test results that are sufficiently detailed to support disaggregation by disease type, race and ethnicity, age, and gender.

Assessment

Whatever the reasons for the change in reporting requirements, both the original and the final versions of the PREA bill violated the expected principles and practices of a federal statistical agency. BJS directly contributed to regulatory activities affecting individual data providers: explicitly singling out individual facilities to receive a summons to public hearings. Arguably, that direct summons to appear before the Review Panel is a somewhat lesser burden than a compulsory appearance before a congressional committee or the fuller National Prison Rape Elimination Commission. Nonetheless, the provision does explicitly direct the use of data reported to BJS for nonstatistical purposes, a basic violation of the role of federal statistical data. The final language of the act exacerbated this violation by putting undue weight on point estimates of incidences of sexual assault, estimates of (ideally) a relatively low-probability phenomenon based on a small sample, without accounting for the inherent variability in the estimates.
BJS and its major constituencies and stakeholders, chief among them Congress and the administration, must be mindful of the extensive legal mandates placed on the agency and how they correspond to the resources provided to BJS; it is crucial that BJS not be assigned duties that violate fundamental principles for statistical agency conduct.

Finding 5.1: Under the terms of the Prison Rape Elimination Act of 2003, BJS was required to release the identity of selected responding institutions (i.e., facilities with the highest and lowest rates of sexual violence against inmates) for later regulatory action as part of a statistical program.
Recommendation 5.1: Congress and the Department of Justice should not require, and BJS should not provide, individually identified data in support of regulatory functions that compromise the independence of BJS or require BJS to violate any of the principles of a federal statistical agency.

To be sure, criticism of the reporting requirements of PREA should not be mistaken for criticism of the study of prison rape; the problem of sexual violence in correctional facilities is a valid and important one for inquiry. BJS's work in developing the suite of inmate prison rape surveys also had the benefit of pushing the agency to make major methodological improvements, relative to other federal surveys, in the use of techniques such as ACASI. However, it is also important to note that implementing PREA involves major opportunity costs to BJS, over and above the concerns about the regulatory flavor of the work. The separate and highly sensitive nature of PREA interviewing makes it infeasible for BJS to conduct its standard inmate interviewing programs at the same time. Further, it is still too early to assess whether the PREA interviewing has any chilling effect on response to BJS's conventional corrections data series; although one possible reason for a dampening effect on response to the regular corrections series might be resentment at being "singled out" for inclusion in the PREA sample, another is simply the time and resource burden of brokering BJS access to inmates on a more frequent basis. To its credit, BJS has taken steps to convey PREA's requirements to individual facilities and has elicited comments and feedback from facilities and administrators, including participation in relevant professional association meetings and conduct of stakeholder workshops.

The PREA data providers face the risk of public display of their estimates in an active attempt at regulatory intervention.
Within a short period of time, BJS will ask the same facilities for data for another purpose. The potential impact of BJS's participation in regulatory actions is that the data providers will no longer believe that other data requested will not also be used in such a manner. Again, once data providers lose trust that cooperation with BJS will not lead to harmful actions against them individually, the agency faces large problems.

5-A.2 Strong Position of Independence

A federal statistical agency must have a strong position of independence within the government. To be credible and unhindered in its mission to provide objective, useful, high-quality information, a statistical agency must not only be distinct from those parts of a department that carry out law enforcement and policy-making activities but also have a widely acknowledged position of independence. It must be able to execute its mission without being subject to pressures to advance a political
agenda. It must be impartial and avoid even the appearance that its collection, analysis, and reporting processes might be manipulated for political purposes or that individually identifiable data might be turned over for administrative, regulatory, or law enforcement purposes. (National Research Council, 2009:6)

The establishment and maintenance of an independent, objective, and credible voice is a central principle for statistical agency operations. To maintain that objectivity and credibility, a statistical agency is obliged to keep apart from the policy-making sphere of the executive branch; its products inform the development of policy, but they must not themselves be policy statements. Maintaining this arm's-length distance from policy development is particularly difficult for statistical agencies that are administratively housed with program agencies of the executive branch, whose purpose is the furtherance of specific objectives.

In recent years, administrative layering of statistical agencies has become a subtle, but increasingly common, threat to the position of independence of federal statistical agencies. Agencies are diminished in their perceived importance, their claim to budgetary resources, and their attention from departmental policy makers through placement further down in a department's organizational hierarchy. In 2002, the National Center for Education Statistics was redesignated by P.L. 107-279 as a unit of a new Institute of Education Sciences. In 2004, the National Center for Health Statistics, already administratively removed from the main Department of Health and Human Services by administrative placement in the Atlanta-based Centers for Disease Control and Prevention (CDC), was placed under the further administrative layer of a "coordinating center," as part of a broader CDC reorganization.
Also in 2004, the Bureau of Transportation Statistics was converted by P.L. 108-426 to become a unit under the new Research and Innovative Technology Administration, and its director was changed from a presidential appointee with Senate confirmation to a career appointee designated by the Secretary of Transportation. Of the current members of the Interagency Council on Statistical Policy (see Box 1-3), only the Bureau of Labor Statistics and the Energy Information Administration have direct reporting authority to their respective cabinet secretary or department head. BJS, through its administrative placement in OJP, is not the most heavily layered of statistical agencies, but it ranks among them.

In the panel's judgment, the principle of a strong position of independence of a statistical agency was seriously violated in BJS's recent past by the circumstances surrounding the release of data from the 2002 PPCS. This particular "flashpoint" in BJS's recent history centered on the wording of a press release to accompany the data release. In the balance of this section, we describe the path toward this breach in principle and the corrective measures that have since been taken.
OJP and Press Release Policy

In a 1991 U.S. House subcommittee hearing on criminal justice statistics, Acting BJS Director Joseph Bessette was asked about policy on the press releases accompanying new BJS data releases and, specifically, the role of the Justice Department in clearing those releases. Bessette answered that:

There has never been a case in my time [at BJS (5 years, at that point)], and people there tell me never before then as well, of the Department interfering in any way with the reports, with the accuracy, with the nature of the numbers, anything of that sort. So, in that respect, we have been functioning as a kind of semiautonomous statistical agency quite well. However, the BJS press releases—I use the term "BJS press releases," but, actually, they are Department of Justice press releases officially, and they have always gone to the Department for clearance. We draft them in BJS [and they] go up the chain of command for clearance, and that has been the case right along. So, in that respect, the policy hasn't changed.

Pressed further, he noted that "last year, for the first time, the Attorney General was quoted in a BJS press release commenting on the numbers and recommending public policy. That happened that one time; that has not happened since" (U.S. House of Representatives, Committee on the Judiciary, 1991:216).10

10 In response, the questioner, then-Rep. Charles Schumer, commented before moving on to the next line of questioning: "I think it is a good idea to keep the two separate. The Attorney General should comment on policy but not in the statistical press releases" (U.S. House of Representatives, Committee on the Judiciary, 1991:216).

Over the next decade, the protocol for issuing BJS press releases evolved into the flow pattern illustrated in Figure 5-2. BJS staff would typically take the lead in developing the press release, in cooperation with the OJP Office of Communications. In all, the typical approval process required signoff from five noncareer appointees in DOJ and OJP (including the presidentially appointed BJS director).

Figure 5-3 illustrates the general formatting of the standard notice and page posted to BJS's website upon the release of a new product, and Figure 5-4 shows the formatting of a formal press release, for one recent BJS product for which OJP and DOJ elected to issue a press release.

The 2002 Police-Public Contact Survey

In August 2005, the New York Times, followed by other media outlets, reported on a string of events over the previous 4 months that culminated in the removal of BJS Director Lawrence Greenfeld (Lichtblau, 2005a; Eggen, 2005; Sniffen, 2005). The removal was precipitated by disputes within the Justice Department over the statement of findings from the
PPCS supplement to the NCVS. As described in Section 3-C.4, the PPCS was first fielded on a pilot basis in 1996, followed by full-scale implementation in 1999, 2002, and 2005. The events of 2005 concerned the release of information from the 2002 administration of the supplement (Durose et al., 2005). As indicated in the abstract of the report on the BJS website (http://www.ojp.usdoj.gov/bjs/abstract/cpp02.htm):

Highlights [from the 2002 PPCS] include the following:
• About 25% of the 45.3 million persons with a face-to-face contact indicated the reason for the contact was to report a crime or other problem.
• In 2002 about 1.3 million residents age 16 or older—2.9% of the 45.3 million persons with contact—were arrested by police.
• The likelihood of being stopped by police in 2002 did not differ significantly between white (8.7%), black (9.1%), and Hispanic (8.6%) drivers.
• During the traffic stop, police were more likely to carry out some type of search on a black (10.2%) or Hispanic (11.4%) than a white (3.5%).

[Figure 5-2 Review, approval, and dissemination process for BJS survey press releases, 2007. NOTE: BJS, Bureau of Justice Statistics. DOJ, Department of Justice. OJP, Office of Justice Programs. White boxes indicate career employees; grey boxes denote noncareer appointees. SOURCE: Adapted from U.S. Government Accountability Office (2007:Fig. 3), based on information from BJS and OJP.]
[Figure 5-3 Example summary and links to report and data on Bureau of Justice Statistics website]

[Figure 5-4 Excerpt from example Office of Justice Programs press release accompanying new Bureau of Justice Statistics data release]
After BJS staff developed a press release, the draft release was forwarded to OJP and then-Assistant Attorney General Tracy Henke. It was the inclusion of the last highlighted point, the finding of disparate levels of search (and related findings on use of force) by race and ethnicity, that led to a dispute. As Lichtblau (2005a) recounts:

The planned announcement noted that the rate at which whites, blacks and Hispanics were stopped was "about the same," and that finding was left intact by Ms. Henke's office, according to a copy of the draft obtained by The New York Times. But the references in the draft to higher rates of searches and use of force for blacks and Hispanics were crossed out by hand, with a notation in the margin that read, "Do we need this?" A note affixed to the edited draft, which the officials said was written by Ms. Henke, read "Make the changes," and it was signed "Tracy." That led to a fierce dispute after Mr. Greenfeld refused to delete the references, officials said. . . . Mr. Greenfeld refused to delete the racial references, arguing to his supervisors that the omissions would make the public announcement incomplete and misleading.

The report was publicly released (posted to the agency's website and disseminated through the usual means) but without any accompanying news release or publicity. This decision "all but assured [that] the report would get lost amid the avalanche of studies issued by the government"; indeed, "a computer search of news articles [in August 2005] found no mentions of the study" (Lichtblau, 2005a). However, the study, and the dispute over the press release, garnered considerable press attention after the New York Times story on the circumstances surrounding Greenfeld's dismissal as BJS director.

In the wake of these incidents, the U.S. Government Accountability Office (GAO) initiated a review of the conduct of the various administrations of the PPCS and the release of those data.
Responding to a draft report, Assistant Attorney General Regina Schofield asserted that some of GAO's findings of interference were erroneous because they were "predicated on GAO's assumption that a press release is a statistical product." However, she continued (quoted in U.S. Government Accountability Office, 2007:48):

We respectfully disagree with GAO's assumption. A press release simply is not a statistical product and thus should not be treated as a statistical product at all—let alone one that is somehow covered by the [CNSTAT guidelines in Principles and Practices for a Federal Statistical Agency]. A press release, rather, is a public relations announcement issued to encourage media coverage. The mere presence of statistics in a press release does not transform a press release into a statistical product.

Combining this argument with the legally nonbinding nature of the Principles and Practices, Schofield concluded (quoted in U.S. Government Accountability Office, 2007:50):
By statute, 42 U.S.C. § 3732(b), the Director of BJS "shall be responsible for the integrity of data and statistics." In the exercise of such authority, he may elect to follow the NRC guidelines, but he is not and cannot be legally bound to do so, in the absence of some supervening statute. [Thus,] even if the [CNSTAT] written guidelines did apply to press releases (and they do not), the Director would and does decline, in the exercise of his statutory authority, to apply them to BJS press releases.

In response, the GAO stood by its assumption, in large part for the simple reason that "the Police-Public Contact Survey press release was made up almost entirely of survey statistics, indicating to us that it was a statistical product" and that "the content of the press release was a more important determinant than the label attached to it" (U.S. Government Accountability Office, 2007:23-24). The GAO observed that "the role that certain noncareer appointees outside BJS have the ability to play, pursuant to Department of Justice policy, in the product issuance process" means that "BJS was not in a position to fully follow all guidelines related to agency independence," thus creating the potential for "future actual or perceived political interference" in BJS product releases (U.S. Government Accountability Office, 2007:14).11

Later, but too late to affect the DOJ actions, the U.S. Office of Management and Budget (OMB) issued formal guidance in early 2008 to clarify the gray-area dispute as to whether a press release constitutes a statistical product. In the March 7, 2008, Federal Register, OMB published Statistical Policy Directive 4 on the release and dissemination of products from the federal statistical agencies. Defining a "statistical press release" as one of the product types covered by the directive, OMB "encouraged" agencies to issue press releases to accompany the issuance of new data and reports.
The directive does not speak directly to the issue of administrative review of the content of such press releases, advising only that:

to maintain a clear distinction between statistical data and policy interpretations of such data, the statistical press release must be produced and issued by the statistical agency and must provide a policy-neutral description of the data; it must not include policy pronouncements.

11 Reacting, most likely, to the GAO report, U.S. House appropriators issued an even stronger statement in its explanatory statement accompanying the fiscal year 2008 Commerce, Justice, and Science appropriations bill (H.Rept. 110-240): "Ensuring objective BJS studies—The Committee directs that any statistical studies undertaken by the Bureau of Justice Statistics, as well as press releases describing the results of these studies, shall be publicly released by the Bureau without alteration or clearance by persons outside of the Bureau." However, this provision was not repeated in the explanatory statement for the consolidated appropriations act that eventually funded BJS and DOJ.
The issuance of this guidance appears to have improved the release process for BJS products in recent months, even though the guidance emphasizes the need to "coordinate with public affairs officials from the parent organization" in those "cases in which the statistical unit currently relies on the parent agency for the public affairs function."

Aftermath

In 2007, when data from the 2005 administration of the supplement were made available, the report (Durose et al., 2007) was accompanied by a press release (http://www.ojp.usdoj.gov/bjs/pub/press/cpp05pr.htm). Entitled "Police Stop White, Black, and Hispanic Drivers at Similar Rates According to Department of Justice Report," the release observed:

The 2002 and 2005 surveys found that whites, blacks and Hispanics were stopped at similar rates. . . . In both 2002 and 2005 police searched about 5 percent of stopped drivers. . . . While the survey found that black and Hispanic drivers were more likely than whites to be searched, such racial disparities do not necessarily demonstrate that police treat people differently based on race or other demographic characteristics. This study did not take into account other factors that might explain these disparities.

This press release, like others issued in recent years and even subsequent to the March 2008 OMB guidance, was issued on OJP letterhead.

Immediately following the dispute over the 2002 PPCS press release, BJS Director Greenfeld resigned. As the narrative description by Lichtblau (2005a) continues:

Amid the debate over the traffic stop study, Mr. Greenfeld was called to the office of Robert D. McCallum Jr., then the third-ranking Justice Department official, and questioned about his handling of the matter, people involved in the episode said.
    Some weeks later, he was called to the White House, where personnel officials told him he was being replaced as director and was urged to resign, six months before he was scheduled to retire with full pension benefits, the officials said.

    After Mr. Greenfeld invoked his right as a former senior executive to move to a lesser position, the administration agreed to allow him to seek another job, and he is likely to be detailed to the Bureau of Prisons, the officials said.

After the appearance of the Times article, numerous newspapers ran editorials critical of Greenfeld's departure (see, e.g., Hartford Courant, 2005; Houston Chronicle, 2005; Joiner, 2005; Love, 2005; Miami Herald, 2005; Tennessee Tribune, 2005). Although some members of Congress called for Greenfeld to be reinstated (Lichtblau, 2005b), no such reversal was made.12

12 At the time of his dismissal, Greenfeld authored a farewell letter to members of the Justice Research and Statistics Association (JRSA) that noted a positive aspect of the flare-up over press release language. "There is a good reason that more than 20,000 people a day turn to BJS for information on crime and the administration of justice; there is a good reason that no Congressional bill on crime and justice ever ignores our data on a subject and that we are repeatedly asked to gather even more data; there is a good reason that hundreds of thousands of newspaper and electronic media citations and numerous court decisions refer to BJS findings; there is a good reason that the Office of Management and Budget regards our activities as the 'most effective' in all of the Department of Justice; and finally, there is a good reason that so many have expressed such concern about a few lines in a BJS press release, evidence of the importance of what we say" (Greenfeld, 2005).

The wave of publicity concerning these events reinforced the perception that BJS's position of independence had been threatened.

Assessment

One immediate recommendation that is appropriate in light of the PPCS incident is to express formal concurrence with the OMB guidance that eventually followed. The press release associated with a new statistical series or the latest release of data from a continuing series is, properly, a statistical product. Taking care always to be policy-neutral, the press release is the agency's first chance (and sometimes the only and best chance) to highlight findings from the data, to note any methodological concerns that the new data may raise, and to promote accurate reporting and publicity of new results. Accordingly, press releases should share the same protections from interference as other BJS reports and releases.

Finding 5.2: The appearance of political interference in release of statistical information undermines public trust in that information and in the entire agency.

Recommendation 5.2: The Department of Justice review of any BJS statistical product and related communications should not require changes to the content, the release schedule, or the mode of dissemination planned by BJS.

The promulgation of the OMB guidance solves, or at least ameliorates, the immediate cause of this most glaring violation of BJS's position of independence, but a larger problem remains. Independence is an ever-present tension when a statistical agency is administratively nested in a program agency, as BJS is within OJP. The OMB guidance is a useful safeguard but, by its nature and the nature of the decentralized statistical system, it is necessarily somewhat passive and advisory. That is, its successful implementation in BJS's case hinges on the compliance and goodwill of the leadership of BJS, OJP, and the broader DOJ to ensure that boundaries are not blurred. Though it is very welcome, the guidance makes no specific reference to the circumstances that befell BJS concerning the 2002 PPCS release and, accordingly, falls short of a forceful statement by the statistical system
that OJP's intervention in the PPCS press release violated the basic practices of an official statistical system.

The panel concludes that the current organizational arrangement, under which BJS is administratively housed in a program agency (OJP) and its director serves at the pleasure of the president, is a continuing and pressing threat to BJS's position of independence as a provider of objective statistical information. It is critically important that, whatever organizational structures or reporting requirements may apply, BJS function independently and be allowed to function independently. We also recognize that there exists no organizational arrangement that—on its own—can completely shield a statistical agency from threats to its independence and guarantee freedom from political or structural interference (or the appearance thereof). However, in our assessment, the continuing threat to BJS's independence is sufficiently dire—and the past violations sufficiently severe—as to warrant what we believe to be the strongest possible corrective actions and deterrents to incursions on independent functioning: moving BJS out of OJP and fixing the term of service of the BJS director.

BJS and the Office of Justice Programs

BJS's functions are unique within its parent branch, OJP, with respect to both mission and technical requirements. Since grantmaking overwhelmingly drives the OJP organization and service-delivery infrastructure, OJP is ill-suited to address BJS's needs to produce data and statistical reports and provides minimal support for carrying out these functions (although it does, with contributions from BJS and other OJP bureaus, operate the National Criminal Justice Reference Service for dissemination of BJS results).
BJS's administrative placement within OJP is doubly a hindrance to BJS's effective function as the principal data-gathering unit within the Justice Department: first, by putting it into competition for funds and resources with popular grantmaking functions that provide assistance to state and local law enforcement and, second, by diminishing BJS's position within the Department. Other Justice Department divisions perform fairly major statistical and data collection functions—among them, the Civil Rights Division, the Justice Management Division, the Executive Office for U.S. Attorneys, and the Federal Bureau of Investigation (FBI). These units use statistical analysis for performance measurement, examination of voting issues, review of discrimination concerns, and so forth—major issues in which BJS's ability to offer advice or coordination is impaired by its positioning within the department.
Although the historical reason for BJS being positioned within OJP is fairly clear—both entities inherit from the Law Enforcement Assistance Administration (LEAA)—the administrative positioning raises technical and practical concerns. The basic purpose of OJP is to promote certain activities, strategies, or interventions related to crime, primarily through financial assistance to state and local authorities. Statistics should serve as an independent way of assessing those practices by measuring whether crime problems are worsening or improving; that statistical activities are under the direction and funding of OJP creates the appearance and, at times, the reality of conflict and questionable integrity. Moreover, BJS's placement within OJP forces it to compete for resources with grant monies that are popular with and coveted by local authorities and congressional representatives alike.

In terms of budget, the Justice Department tends to view BJS as a small line entry in an overall OJP appropriation. The general process is such that OJP is budgeted or appropriated at a certain funding level and largely makes the internal distribution among component agencies; it is the assistant attorney general for OJP, and not the director of BJS, who is permitted to testify before congressional appropriations committees. Put into head-to-head competition with grant programs to "put cops on the street" or fund crime assistance programs, sustaining the growing costs of BJS statistical programs becomes a lower-order concern. Worse, in recent years, OJP has taken steps to make explicit BJS's subservience within a larger OJP appropriation, undercutting BJS's presence as even a simple line item in annual spending bills.
In the 2003 House appropriations subcommittee report for the fiscal year 2004 Commerce, Justice, and Science spending bill, appropriators took note of a change in the budget request it received from the Justice Department (H.Rept. 108-221, p. 36):

    The fiscal year 2004 budget request proposed merging all programs administered by the Office of Justice Programs (OJP) under the Justice Assistance heading. The Committee recommendation retains the account structure used in previous years and funds State and local law enforcement programs under seven appropriation accounts.

House-Senate conferees on the final consolidated appropriations bill for fiscal year 2004 also noted that they "do not adopt the Administration's proposal to consolidate all [OJP] activities" under the single "Justice Assistance" heading (H.Rept. 108-401, p. 533). Similar attempts to consolidate accounts were noted by House appropriators in the fiscal year 2005 submission (H.Rept. 108-576, pp. 33–34; H.Rept. 108-792, p. 738). The attempt to consolidate OJP funding into a single pool has continued in each subsequent year, including submissions for fiscal year 2009 (e.g., Senate appropriators commented that "the Committee again rejects the Department's proposed merger of all OJP programs under this heading and instead has maintained the [previous] account structure"; S.Rept. 110-397, p. 64).
The problem of BJS funding as it is currently situated within OJP is analogous to problems encountered in other governmental programs, where new initiatives often receive greater attention than existing responsibilities—fixing potholes often takes a back seat to more glamorous new construction projects. In the case of BJS, after passage of the multibillion-dollar Violent Crime Control and Law Enforcement Act of 1994, OJP funding expanded greatly and external grants flowed freely, yet BJS received no enhancements to its appropriated funding and, indeed, had difficulty even securing additional funding to cover cost-of-living adjustment increases payable to the Census Bureau for data collection. On one hand, statistical data collection activities should be seen as long-term activities requiring predictable funding so that they may be carried out on recurring schedules. On the other hand, the grant programs of the larger OJP have impermanence in both mission and appropriations; BJS's base function is jeopardized by being tied to an administrative parent whose resources can rise or fall dramatically and whose local-assistance grants are more popular funding targets than continuing statistical activities.

Statistical analysis and research have also been strikingly undervalued by OJP, as evidenced by attempts to "outsource" most BJS staff positions and functions. In August 2002, OJP was said to have issued a directive stating that jobs within BJS would be turned over for competitive bid to the private sector (Butterfield, 2002). Under the terms of the Federal Activities Inventory Reform (FAIR) Act, positions within a government agency must be characterized as either "commercial" or "inherently governmental"; in late 2002, OMB was in the process of revising its Circular A-76 to more directly require that positions classified as "commercial" be opened to competitive bid with private-sector companies.
As described by the Consortium of Social Science Associations (2002:5), in February 2003 the FAIR Act inventory developed by OJP classified 51 out of 57 positions as "commercial" and thus designated for outsourcing. Several statistician positions within BJS were classified in the inventory as being "grants monitoring and evaluation"; 20 of 23 jobs labeled "statistical analysis" and 18 of 20 "grants monitoring" positions were labeled commercial. This classification drew protest from several social science organizations, including the American Society of Criminology, whose executive board passed a resolution in November 2002 arguing that "the compilation, analysis, interpretation, reporting, monitoring, and management of crime and justice statistics . . . are inherently governmental functions" (http://www.asc41.com/boardmin.annual022.htm).

This outsourcing effort was blocked by congressional appropriators: in explaining the fiscal year 2003 omnibus appropriations bill, House-Senate conferees insisted that the appropriations committees "must be assured that effectiveness is improved and savings are attained" through the OJP outsourcing plan before proceeding with changes (H.Rept. 108-10, p. 635),
a provision repeated by House appropriators the following year (H.Rept. 108-221, p. 40). In the fiscal year 2006 appropriations round, House-Senate conferees specifically directed that "any action taken by OJP relating to [OMB's] Circular A-76 shall be subject to" a general provision requiring advance notice and special justification to Congress for program changes that, among other conditions, would reduce the personnel of an agency by 10 percent or more (H.Rept. 109-272, pp. 46, 86). However, implementation of outsourcing is still possible, and it would still be damaging to BJS. The Justice Department's most recent publicly posted FAIR Act inventory listed commercial and inherently governmental activities for 2007 (http://www.usdoj.gov/jmd/pe/preface.htm); this roster lists 32 of 57 BJS positions (and 20 of 33 "statistical analysis" positions) as commercial, with the reason for classification as commercial listed as "pending an agency approved restructuring decision (e.g., closure, realignment)." In our assessment, the collection and analysis of statistical data by federal statistical agencies is an essential government function; that OJP has not more fully realized this point suggests a continued incompatibility of functions between BJS and its administrative parent.

Exacerbating this mismatch in functions between OJP as a program agency and BJS as a statistical agency, two threads of legislative text that have developed since the late 1990s suggest attempts to tether BJS closer to OJP objectives and diminish BJS's functional independence. Both of these threads have involved wording changes that may appear short and subtle but carry great meaning, and both require some detailed attention to legislative history to be fully understood.
The first of these threads began in 1997, when House appropriators expressed concern that "the current structure of administration of grants within [OJP] produces a fragmented and possibly duplicative approach to disseminating information to State and local agencies on law enforcement programs and developing coordinated law enforcement strategies." Noting a 213 percent growth in overall OJP grant program funding since 1995, the appropriators directed the assistant attorney general (AAG) for OJP to prepare a report recommending actions "that will ensure coordination and reduce the possibility of duplication and overlap among the various OJP divisions" (H.Rept. 105-207, pp. 43–44). This language was preserved in the House-Senate conference on the fiscal year 1998 spending bill (H.Rept. 105-405) that became law. The AAG issued this requested report in January 1998; on the basis of the report, House and Senate appropriations conferees inserted a provision into the fiscal year 1999 omnibus spending act asserting an oversight role for the AAG in finalizing grants (Congressional Record, October 19, 1998, p. H11310). Specifically, the final act read (P.L. 105-277; 112 Stat. 2681-67; compressing a first clause that gives the AAG grantmaking authority):
    Notwithstanding any other provision of law, during fiscal year 1999, the Assistant Attorney General for the Office of Justice Programs of the Department of Justice [shall] have final authority over all grants, cooperative agreements, and contracts made, or entered into, for the Office of Justice Programs and the component organizations of that Office.

Though it left intact language from BJS's creation in 1979 giving the BJS director "final authority for all grants, cooperative agreements, and contracts awarded by the Bureau" (93 Stat. 1176; 42 USC § 3732(b)), this provision made OJP's "final authority" for grants primary to BJS's "final authority"—albeit only for fiscal year 1999.

BJS briefly won exemption from this provision when new appropriations language changed the effective date from fiscal year 1999 to 2000 but added a caveat that the AAG's final authority did not apply to grants made under certain sections of law (113 Stat. 1501A-20), including the section asserting BJS's "final authority" for its own grants. In 2000, appropriations language made no further changes to the text but indicated that it "shall apply hereafter" (114 Stat. 2762A-68), which led to the language being codified as 42 USC § 3715. However, section 614 of the 2001 USA PATRIOT Act (P.L. 107-56; 115 Stat. 370) made two critical changes:

• By adding three words, the revised law gave the AAG "final authority over all functions, including any grants" (emphasis added), a much wider sweep of authority over BJS and other OJP-component offices.
⢠The revised law amended âcomponent organizations of that Ofï¬ceâ to read âcomponent organizations of that Ofï¬ce (including, notwith- standing any contrary provision of law (unless the same should ex- pressly refer to this section), any organization that administers any program established in title 1 of Public Law 90-351)ââa rather con- voluted way of making explicit that OJPâs âï¬nal authorityâ supersedes BJSâs (which still exists, albeit as an âother provision of lawâ). One year later in September 2002, this perceived takeover of BJS authority was exacerbated by one ï¬nal small but sweeping change included in reau- thorization language for the Department of Justice. Reference to the AAG was stricken and the text amended to read that, âduring any ï¬scal year, the Attorney Generalâ shall have ï¬nal authorityâasserting strong Justice De- partment control over BJS and other OJP ofï¬ces (P 107-273; 116 Stat. .L. 1778). The second legal thread deals with a clause in the enumerated powers of the AAG. The Justice System Improvement Act of 1979 that created BJS also created OJPâs predecessor, the Ofï¬ce of Justice Assistance, Research, and Statistics (OJARS), but did so in an interesting way: deï¬ning the LEAA, BJS, and the National Institute of Justice (NIJ) up front in sections AâC but only specifying OJARS in a catch-all Part H on âAdministrative Provisions.â
Specifically, section 802(b) of the Act (93 Stat. 1201) directed that (emphasis added):

    The Office of Justice Assistance, Research, and Statistics shall directly provide staff support to, and coordinate the activities of, the National Institute of Justice, the Bureau of Justice Statistics, and the Law Enforcement Assistance Administration.

The Justice Assistance Act of 1984 substantially rewrote and reorganized the existing law, creating OJP in its current form and pointedly giving it primacy by defining it in Part A (where the LEAA was previously defined). The 1984 act also made explicit that "the Director [of BJS] shall report to the Attorney General through the [AAG]" (98 Stat. 2079). In place of the above-quoted 1979 language, the 1984 act specified duties of the AAG including (98 Stat. 2078; emphasis added):

    The Assistant Attorney General shall . . . (5) provide staff support to coordinate the activities of the Office and the Bureau of Justice Assistance, the National Institute of Justice, the Bureau of Justice Statistics, and the Office of Juvenile Justice and Delinquency Prevention. . . .

The Homeland Security Act of 2002 (P.L. 107-296; 116 Stat. 2162) made a small but telling change to point (5), simply inserting the words "coordinate and" at the beginning to give the phrase its current form (42 USC § 3712(a), emphasis added):

    The Assistant Attorney General shall . . . (5) coordinate and provide staff support to coordinate the activities of the Office [and the] Bureau of Justice Statistics. . . .

In isolation, these legislative changes might appear to be relatively innocuous. In terms of strict legislative text, the 2002 Homeland Security Act's provision did nothing but restore a "coordination" function held by OJARS at its (and BJS's) founding in 1979—at which point it was arguably a worse situation for BJS, given OJARS's more weakly defined position.
However, in context and in combination, the changes convey an intent by OJP to take a more heavy-handed role in BJS activities. A press account at the height of this legislative activity in 2002 noted a statement by then-AAG Deborah Daniels suggesting that stronger OJP control over BJS and NIJ was desirable in order to ensure that DOJ "speaks with one voice" on crime and justice issues (Butterfield, 2002:33). This rationale is antithetical to the position of independence that statistical and research agencies must have in order to be most effective; statistical agencies must have the latitude to release findings that run counter to the policy of their parent departments, if those findings are borne out by the data. Consequently, taken together, OJP's legislative assertion of "final authority" over BJS functions and its intent to "coordinate" BJS activities constitute dangerous infringements of BJS's proper function.

Conceptually, the current organizational structure under which BJS is housed within OJP along with other research and subject-matter bureaus
does have certain advantages. If heavy-handed "coordination" gave way to real synergy—full collaboration between BJS and sister bureaus such as the Office of Juvenile Justice and Delinquency Prevention or the Office for Victims of Crime—BJS data and analysis could meaningfully inform OJP policy development. Likewise, in such a truly synergistic environment, the AAG for OJP could provide strong and visible advocacy for BJS concerns. However, we believe that such an effective and beneficial implementation of the status quo organizational arrangement hinges critically on the priorities and temperaments of the AAG and other top officials in the Justice Department and on the strength of the BJS director to function independently. In our assessment, the inherent conflicts between the priorities of a program office such as OJP and a statistical agency such as BJS—and the too-fine line between synergistic work by OJP offices and attempts to make those offices "speak with one voice"—make the status quo untenable in the long run. On the basis of these arguments, we conclude that BJS's administrative placement in OJP is detrimental:

Finding 5.3: The placement of BJS within the Office of Justice Programs has harmed the agency's ability to innovate in data collections and to expand the efficiency of achieving its statistical mission. It suffers from a zero-sum game in competition with programs of direct financial benefit to states and localities.

In the panel's assessment, a BJS that is better established as an independent structure within the DOJ infrastructure would have an enhanced ability to support and sustain statistical programs. We also expect that a higher-placed BJS—ideally as a direct report to the attorney general or the deputy attorney general—would have a powerful effect on the timeliness of information released by BJS, because it would be called upon to provide more contemporaneous information to the highest levels in the department.
Such an administrative move would make clear the permanence of data-gathering functions and the need to use the resulting information in policy development and review; it would also provide a clear separation from competing interests that wish to advocate for certain programs or initiatives. In terms of data collection, a more prominent and higher-profile BJS would also be helpful in dealing with balky or resistant data suppliers. To be sure, administrative attachment of BJS to the office of the attorney general runs the risk of politicization—far from the intended effect. However, in our judgment, such a high-level attachment would afford BJS the most prominence and stature and, hence, be the strongest corrective remedy for past breaches of BJS's independence. Accordingly, we recommend:

Recommendation 5.3: BJS should be administratively moved out of the Office of Justice Programs, reporting to the attorney general or deputy attorney general.
It follows that this administrative change involves removing the legislative language asserting a strong OJP oversight role over BJS functions. To this general recommendation, we add two corollaries:

• In forgoing ties to OJP, it is important for BJS to retain the capacity for letting contracts. In particular, it is vital that BJS retain full ability to administer grants such as those that maintain the state Statistical Analysis Center (SAC) network and that support development of and improvement to source criminal justice databases, as described in Chapter 4.

• The problems faced by BJS in its administrative nesting within a program agency are similar to those faced by some other OJP units, notably NIJ: a research agency embedded within a program agency. In November 2008, John Jay College of Criminal Justice president and former NIJ director Jeremy Travis issued an open letter to the membership of the American Society of Criminology urging the creation of an Office of Justice Research within DOJ. This new office would include BJS and NIJ, elevating NIJ's Office of Science and Technology to become the National Institute of Justice Technology; all three agencies would report to an assistant attorney general for justice research, appointed by the president with Senate confirmation. Relevant excerpts from this letter are shown in Box 5-3. Determining the administrative placement of NIJ is beyond this panel's scope; a parallel National Research Council panel is currently evaluating NIJ's research program, and NIJ's structure is more the province of that panel. However, we note that an approach by which both BJS and NIJ report to an assistant attorney general for research is certainly consistent with our own recommendation; our guidance in this report is intended to speak to a choice between BJS remaining in OJP versus moving out of OJP, and the Travis proposal would also achieve the result we think is best for BJS.
A separate office including both a research agency and a statistical agency would also be uniquely poised to develop research programs on justice-related issues that have received relatively little rigorous empirical treatment, such as the extent to which forensic evidence (e.g., fingerprints or firearm-related toolmarks) is introduced in judicial proceedings (and the effectiveness of that evidence) or the perceived fairness of court verdicts.

Term of Appointment of BJS Director

To provide an added measure of insularity, the panel further concludes that BJS would benefit from the designation of the BJS directorship as a fixed-term appointment by the president, with the advice and consent of the Senate.
Box 5-3 Excerpts from Travis (2008) Open Letter on an Office of Justice Research

I propose that the Congress create, with support from the new Administration, a new office in the Department of Justice, called the Office of Justice Research, to be headed by an Assistant Attorney General for Justice Research. This office would be separate from the Office of Justice Programs, which would continue to administer the funding programs that support reform efforts by state and local law enforcement and criminal justice agencies. . . .

The argument for creation of the new Office of Justice Research, separate from the Office of Justice Programs, is very straightforward: if the research, statistics, and scientific development functions of the federal government are located within an office that is primarily responsible for the administration of assistance programs, three risks are created. First, the scientific integrity of the research functions is vulnerable to compromise. Second, the research and development function will never be given the priority treatment that is needed to meet the enormous crime challenges facing the country. Third, the research agenda on crime and justice will more likely reflect short-term programmatic needs rather than the long-term need to develop a better understanding of the phenomenon of crime in America and the best ways to prevent and respond to crime. . . .

[As part of this new office,] the Bureau of Justice Statistics would continue all of the functions currently carried out by BJS. [But] the current constellation of data collection systems on crime and justice are fragmented and incomplete. To remedy this situation—and to provide the nation the capability to track crime trends in a timely manner—the mandate of BJS should be expanded significantly. First, BJS should be authorized to work closely with the Federal Bureau of Investigation to improve the timeliness and completeness of the Uniform Crime Reports.
Similarly, responsibility for the ADAM program [(see Section 2–C.4)] should be transferred from ONDCP (it was originally housed at NIJ), and responsibility for the statistical series on juvenile justice should be transferred from the Office of Juvenile Justice and Delinquency Prevention (a component of OJP). But the new BJS would be more than a manager of existing statistical series. It should also develop new initiatives to track crime trends, drawing on capabilities of police departments that now post crime trends close to real time. It would develop new protocols for tracking critical crime issues, such as the level of illegal drug selling activity, public confidence in the criminal justice system, the operations of the federal law enforcement agencies, etc. This expanded portfolio would clearly require additional funding, but there are compelling arguments for creating a robust national capacity to improve our understanding of crime trends. . . .

If we were designing a federal research and development capacity on crime and justice today, we would probably not propose the current structure that houses NIJ and BJS within the Office of Justice Programs, three levels below the Attorney General, with a focus on state and local criminal justice. Rather, we would create a scientific branch of government that operates under scientific principles, reporting directly to the Attorney General. We would recognize that crime is now a transnational phenomenon and we need to understand human trafficking, drug smuggling, immigration trends and terrorism. We would examine the many systems of justice—civil justice, immigration courts, the federal justice system, in addition to state and local justice systems. We would develop a modern capacity to understand local crime conditions using high-tech surveys. We would develop creative ways to measure
non-traditional crimes, such as identity theft, corporate and white collar crime, and transnational crime. We would design a research and development program that would harness the power of technology so the agencies that enforce the law can benefit from the scientific and technological revolution. This ambitious agenda clearly requires additional resources. But it also requires a new structure within the Department of Justice, a structure that guarantees both scientific integrity and policy relevance.

SOURCE: Excerpted from Travis (2008:1, 4, 5); emphasis in the original.

Finding 5.4: Under current law, the director of the Bureau of Justice Statistics serves at the pleasure of the president; the director is nominated to an unspecified term by the president, with the advice and consent of the Senate (42 USC § 3732(b)).

It is worth noting that fixed-term appointments are relatively rare in the federal statistical system. Currently, only two of the nation's principal statistical agencies—the Bureau of Labor Statistics and the National Center for Education Statistics—have heads who are appointed and confirmed to fixed terms of 4 and 6 years, respectively (29 USC § 3 and 20 USC § 9517(b)).13 The heads of BJS, the Census Bureau, and the Energy Information Administration are appointees (with Senate confirmation) who serve at the pleasure of the president; the other nine heads of Interagency Council on Statistical Policy member organizations are career employees and departmental appointments.
Bills to create a termed appointment for the director of the Census Bureau have been introduced, but not enacted, in recent Congresses – most recently, one that would fix the term at 5 years (at the same time that it would remove the Census Bureau from the Department of Commerce and establish it as an independent executive agency).14 The range of models for the term of appointment of a BJS director can be expressed simply:

• Presidential appointment with Senate confirmation, at pleasure (the status quo);
• Presidential appointment with Senate confirmation, fixed term; and
• Career employee, appointed by the president, cabinet secretary, or other official.

13 Ironically, the same legislation that positioned the National Center for Education Statistics under a new administrative layer – the Institute of Education Sciences – also extended the length of the fixed term for the commissioner of education statistics. Prior to 2003, commissioners served a 4-year term rather than a 6-year term.

14 See H.R. 7069, introduced by Rep. Carolyn Maloney (D-N.Y.) on September 25, 2008, in the 110th Congress.
In the right environment – with a strong and well-defined position of independence and the latitude for innovation – the career employee directorship is an attractive option that has the added advantage of ensuring that a director is well versed in the agency's existing work and subject-matter domain. Indeed, among BJS's fellow statistical agencies, career employee appointments such as the directorship of the Bureau of Economic Analysis rank among the most effective leadership models. However, as we described in arguing for an administrative move out of OJP, BJS does not enjoy such an environment. We view a presidential appointment with Senate confirmation as a necessity for the BJS directorship, carrying with it the stature to interact effectively with the appointees at the top ranks of the Justice Department.

The events of 2005 demonstrated that BJS can be and has been harmed by the current arrangement by which the BJS director serves strictly at the pleasure of the administration. The circumstances of Director Greenfeld's dismissal – in the immediate aftermath of refusing to alter a press release to address political concerns – fostered the appearance of formal and structural interference in BJS's operations. In our assessment, a fixed-term appointment for the BJS directorship would be the best and strongest palliative measure to put some distance between BJS and its political superiors in the Justice Department (whether BJS remains in OJP or not).
The model of the directorship of the Bureau of Labor Statistics is the one that we find most compelling for BJS: In our judgment, it makes sense for the federal officer directly tasked with reporting key indicators of social justice in America to have stature, political insularity, and term of service commensurate with the federal officer directly responsible for reporting key economic indicators such as unemployment and job growth.15 The director of BJS must have the capability to objectively report both good news and bad news – to provide information on crime and justice in the United States, even when the findings are politically inconvenient or unappealing. We believe that a presidential appointment with confirmation provides the appropriate stature for such a position, and that the specification of a fixed term of service prevents the kinds of attempted interference that have harmed BJS in recent years. Accordingly, we recommend:

Recommendation 5.4: Congress and the administration should make the BJS director a fixed-term presidential appointee with the advice and consent of the Senate. To insulate the BJS director from political interference, the term of service should be no less than 4 years.

15 Though the jobs are obviously much different in scope, it is worth noting that the other principal federal officer tasked with reporting statistics on crime in the United States – the director of the FBI, reporting results from the Uniform Crime Reporting program – holds the relative insularity of a 10-year fixed-term appointment, nonrenewable, with Senate confirmation (P.L. 90-351 § 1101).
It would make sense for the term to be about 6 years because that would take the director to a new administration or to a second term of an incumbent administration.

5–A.3 Relevance to Policy Issues

A federal statistical agency must be in a position to provide objective information that is relevant to issues of public policy. A statistical agency must be knowledgeable about the issues and requirements of public policy and federal programs and able to provide objective information that is relevant to policy and program needs. . . . In establishing priorities for statistical programs for this purpose, a statistical agency must work closely with the users of such information in the executive branch, Congress, and interested nongovernmental groups. (National Research Council, 2009:4)

This principle has implications, both for the parent department of a statistical agency and for the actions of the agency itself. The parent department must take the agency seriously. Statistical units, when best used by their parent agency, are the window into the performance of their agency in addressing key issues facing the society. When intelligently used, the statistical agency can measure the prevalence and importance of different issues tasked to the department. When intelligently used, they can be the management dashboard to guide allocation of budget to different activities. When intelligently used, they can assemble information about likely trends of future phenomena within the mission of the department.

However, achievement of such a role is not merely dependent on outreach by the leadership of the parent department. Rarely are the government officials appointed to departmental leadership aware of the utility of statistical information to guide the work of the department. The director and senior staff of the statistical agency have an obligation to be outwardly focused, to become expert in the program mission of the agency.
Only with such substantive expertise can the department's statistical agency produce optimally relevant statistical information for the policy makers of the department. Statistical agencies are part of the management information system for policy making in program departments. Senior statistical staff must have the skills, time resources, and mandate to develop relationships with the policy-making units to provide information relevant (not necessarily supporting, but relevant) to the policy makers' tasks.

In the judgment of the panel, BJS's ability to carry out this role of providing policy-relevant data is impaired by its relatively low profile within the department. At one of its plenary meetings, the panel met with senior DOJ officials and discussed past and current roles of BJS within DOJ policy-making activities; from those discussions, it was apparent that BJS was not viewed
as a relevant player in many of the key initiatives of DOJ. Indeed, there did not seem to be high awareness of the range of BJS activities or the ways in which data could be brought to bear in broader DOJ activities. In the panel's view, BJS has not been perceived as an important asset in assembling relevant information for key policy initiatives; fault for this is undoubtedly shared by BJS (for limited "promotion" of its work within the department) and by higher officials in DOJ.

There are two potential, relevant solutions, the first of which looks at BJS activities within DOJ. The panel believes that the BJS director should be a very visible and active promoter of the value of objective statistical information for use in policy decisions within DOJ. Every budget initiative of DOJ is a potential opportunity for enriched statistical information about the status of the justice system. The BJS director and his or her senior staff should increase their outreach to sister DOJ units.

Recommendation 5.5: The BJS director needs to reach out to other agencies within DOJ, forming partnerships to propose initiatives for information collection that are relevant to policy needs.

Recommendation 5.6: The Department of Justice should build provisions for BJS collection of data and statistical information into its program initiatives aimed at crime reduction. These are not intended as program evaluation funds, but rather as funds for the basic monitoring and assessment of the phenomena targeted by the initiative.

Although this recommendation is a necessary step to achieve more relevance to DOJ, the panel believes that it may not be sufficient. Effective outreach by BJS depends on willingness to receive such outreach and respect for BJS expertise. The visibility of BJS within DOJ and in the legislature appears to be quite low.
On budget initiatives, the BJS director rarely meets directly with legislative staff; the BJS budget is reviewed as part of the OJP budget, and so those discussions are held at the OJP level. Hence, our previous recommendation to administratively move BJS out of OJP – giving the BJS director the authority (and the duty) to interact directly with congressional appropriators and overseers – would also contribute greatly to BJS's ability to provide policy-relevant data. In Section 5–B.8 below, we discuss the need for an effective research program as another means of bolstering the relevance of BJS and its data products.

5–A.4 Credibility Among Data Users

A federal statistical agency must have credibility with those who use its data and information. . . . To have credibility, an agency must be free –
and must be perceived to be free – of political interference and policy advocacy. Also important for credibility is for an agency to follow such practices as wide dissemination of data on an equal basis to all users, openness about the data provided, and a commitment to quality and professional practice. (National Research Council, 2009:5)

Credibility is a reputational attribute of a statistical agency. It is frequently argued that the credibility of the statistical products is partly derived from sound statistical properties (high precision and low statistical bias) and from perceptions that the source of the information has no point of view or ideological lens on the information (National Research Council, 2005b:5). Thus credibility is enhanced by sound professional practice and widespread recognition of this professionalism. It is also enhanced by demonstration of independence from influence from policy viewpoints.

Panel members and staff were active observers in a workshop of users of BJS data, conducted by the Council of Professional Associations for Federal Statistics (COPAFS) with BJS sponsorship, in February 2008. Attendees at the workshop included members of BJS's state SAC network, academic researchers, representatives of police chiefs, representatives of state courts, and others, along with BJS staff and officials. There was general high praise for BJS, some calls for increased timeliness of BJS data (for enhanced law enforcement management purposes), and calls for finer granularity of estimates for local uses. To some panel members in the audience, parts of the law enforcement community seemed to be asking for almost real-time event data – a goal that is difficult for any statistical agency to achieve. Despite these types of critiques of BJS, panel after panel at the workshop expressed great belief that the BJS data series were credible, valued, and relevant to their work.
Finding 5.5: BJS enjoys high credibility but often is critiqued for missing fine-grained data by geography or time.

5–B PRACTICES OF A FEDERAL STATISTICAL AGENCY

5–B.1 Clearly Defined and Well-Accepted Mission

An agency's mission should include responsibility for all elements of its programs for providing statistical information – determining sources of data, measurement methods, efficient methods of data collection and processing, and appropriate methods of analysis – and ensuring the public availability not only of the data, but also of documentation of the methods used to obtain the data and their quality. (National Research Council, 2009:7)

That BJS's mission and basic functions are clearly defined is virtually indisputable. We have frequently referred to Box 1-2, BJS's extensive list of
authorized activities under its enabling legislation, which is testament to the detail in BJS's defining mission. Whether they are well accepted is quite another matter. As we discussed in Section 5–A.3, the panel was disappointed by the apparent lack of understanding of BJS's role and its potential when it met with higher-level Justice Department officials. Although expressions of support were plentiful, an understanding of the importance of high-quality data for shaping policy was generally lacking.

BJS's recent history in the appropriations process is also, potentially, evidence that its range of existing data collections – and the cost of data collection, generally – is not well understood in important places. In summer 2006, the appropriations committees in both houses of Congress processed BJS's budget request of about $60 million. While the House sought to keep BJS funding at about fiscal year 2006 levels ($36 million, compared to the final 2006 allocation of $34.6 million; H.Rept. 109-520), the Senate's mark came in considerably lower at $20 million (S.Rept. 109-280). (No final appropriations bill for DOJ was passed for fiscal year 2007; like many other federal agencies, it was funded through a series of continuing resolutions at fiscal year 2006 levels, with some exceptions.) A brief explanatory note in the Senate committee's report acknowledged BJS's role in collecting the NCVS and other data programs but did not explain the reason for the reduction. The problem was exacerbated in the fiscal year 2008 appropriations process: House appropriators provided $45 million for BJS (H.Rept. 110-240), but the Senate appropriators, with no explanatory statement whatsoever, included only $10 million for BJS: a funding level that would have terminated the NCVS, if not much of BJS's activities.
Inquiries by the Consortium of Social Science Associations yielded the explanation from Senate subcommittee staff that the $10 million figure was a "misprint" that would be corrected and replaced by "full funding" later in the process (Consortium of Social Science Associations, 2007:3). It was not corrected in the version of the bill that finally passed the Senate; in the final consolidated appropriations bill that included DOJ, BJS funding came closer to the House mark than the Senate mark.16

As before, the panel concludes that a clear separation between BJS and OJP and placement of BJS elsewhere in the DOJ hierarchy would help clarify the mission of BJS and strengthen its profile as a principal statistical agency. Given congressional stalemate and the inability to pass most individual appropriations bills, the particular budget climate in recent fiscal years would be difficult for any organizational configuration of BJS within DOJ. Still, the story of the varying appropriations marks suggests that, in at least one important circle, knowledge of the basic cost of data collection and the value

16 For fiscal year 2009, Senate appropriators recommended $40 million for BJS (S.Rept. 110-397).
(and cost) of BJS's flagship data collection was sufficiently weak as to put BJS's viability at stake. BJS's mission is not well served by having its interests solely represented and managed by OJP in the budget and planning arenas, precisely because BJS's own mission is not well articulated by OJP's general mission "to increase public safety and improve the fair administration of justice across America through innovative leadership and programs" (U.S. Department of Justice, Office of Justice Programs, 2006:3), principally through financial assistance.

5–B.2 Continual Development of More Useful Data

Statistical agencies must continually look to improve their data systems to provide information that is accurate, timely, and relevant for changing public policy needs. They should also continually seek to improve the efficiency of their programs for collecting, analyzing, and disseminating statistical information. (National Research Council, 2009:7)

The February 2008 data users workshop, sponsored by BJS and conducted by COPAFS, was a good step for BJS in carrying out the practice of improving and modifying its data collections to be more useful and relevant. The session suggested both useful analyses and extracts that could be made from existing data series (e.g., tailoring analyses and sponsoring research on the NCVS; Heimer, 2008) and wholesale revisions to collection methodologies to improve timeliness or relevance (e.g., an NCVS-type survey of experiences in civil justice matters; Eisenberg, 2008). As we observed in Chapter 4, BJS's state SACs, and its coordination through JRSA, provide it with a mechanism for ready communication and interaction with state-level practitioners, all of which contribute to reevaluation of individual BJS programs and reports.
Although BJS has done well on this score, we encourage it to push further and develop the tools that other statistical agencies use to inform themselves of the changing data needs of their user bases. Specifically:

1. As BJS staff indicated at the time, the February 2008 users workshop should be seen as a first step and not a one-time conversation. BJS could sponsor an annual users conference, perhaps drawing from a larger base of downstream users than JRSA's annual research conference. These user meetings could be similar to those routinely held by the National Center for Health Statistics, CDC (for the Behavioral Risk Factor Surveillance System), and the Census Bureau.

2. Through JRSA, BJS sponsors a journal (Justice Research and Policy), much as the Bureau of Transportation Statistics has done for its related fields. BJS's role in such a journal or statistical publication – and knowledge of strengths and weaknesses in BJS data – could be enhanced by encouraging BJS staff or grantees to seek publication in
the journal or by developing "special issues" on specific user constituency needs.

3. Consistent with item 21 in BJS's legally authorized duties (Box 1-2), BJS could convene meetings of official justice statisticians from other countries, charged with missions similar to that of BJS, to apprise itself of international comparability.

4. BJS could commission small "white papers" from key leaders in the justice systems about future data needs.

5. BJS should continue, and interact with, informal advisory mechanisms that have developed over the years, such as the Committee on Law and Justice Statistics of the American Statistical Association.

Historically, BJS has convened periodic expert workshops as a first step in scoping out new work. McEwen (1996) summarized the 1995 workshop on police use of force that contributed to the development of the PPCS, and BJS partnered with SEARCH, the National Consortium for Justice Information and Statistics, on a series of workshops on law enforcement databases such as criminal history records and sex offender registries (Bureau of Justice Statistics, 1995, 1997b, 1998a). However, such workshops have become rarer events in light of funding resources. As suggested by the first point in our list above, we think that these workshops are an important mechanism that would have the added benefit of easing concerns about the timeliness of content in BJS data collections; they would provide for regular input and feedback on emerging problems and views. One possible topic on which such a stakeholder workshop could be beneficial is a review of content in the correctional data series and the NCVS to ensure that definitions and concepts of "mental health" are consistent with current practitioner usage.

Recommendation 5.7: To effectively get input on contemporaneous topics of interest, BJS should regularly convene ad hoc stakeholder workshops to suggest areas of immediate data needs.
However, we also believe that BJS would strongly benefit from a more formal means of obtaining user input; therefore, we recommend that BJS establish a standing technical advisory committee, appointed under the terms of the Federal Advisory Committee Act (5 USC App. 1). The legislation that created BJS, the Justice System Improvement Act of 1979, originally mandated a 21-member BJS Advisory Board, with members appointed to 3-year terms by the attorney general; this board was directed to review and make recommendations on BJS programs as well as to recommend candidates in the event of a vacancy in the BJS directorship (93 Stat. 1178–1179). However, this provision for an advisory board was removed in the 1984 reauthorization (see notes at 42 USC § 3734). Although BJS receives valuable advice through informal means, we conclude that there would be real
value in having a standing advisory committee, including members with substantive expertise, operating staff within justice system institutions, statistical experts, and others who could articulate future needs. It is important that such an advisory board contain high-level policy makers and justice system practitioners as well as methodologists and statisticians so that detailed research-specific recommendations are paired with input on the timeliness and usefulness of the data in the field.17

The Census Bureau organizes several such advisory committees (including, for instance, groups specifically focused on input from diverse race and ethnicity groups and on advice from relevant professional associations); another model is the Board of Scientific Counselors of the National Center for Health Statistics. Both of these advisory structures in the statistical system provide written recommendations to their respective agencies and, in the case of the Board of Scientific Counselors, undertake program reviews of parts of the agency's portfolio; this kind of regular feedback would greatly benefit BJS operations.

Recommendation 5.8: BJS should establish an Advisory Group under the Federal Advisory Committee Act to provide guidance to BJS on the addition of new data collection efforts and the modification of current ones in light of needs identified by the group. Membership in the group should include, at a minimum, leaders and practitioners from each of the major subject matters covered by BJS data, as well as those with statistical and other types of academic expertise in these subject matters. The members of the group should be selected by the BJS director, and the group should provide the director with at least two reports each year that contain its recommendations.

This recommendation is consistent with, but more fully articulated than, Recommendation 5.1 in our interim report (National Research Council, 2008b).
A standing advisory committee could be designed with subgroups of topic specialties in mind so that, for instance, the committee is poised to render NCVS-specific methodological advice without having to convene separate committees for each major collection. By having both coverage and depth in topic areas, a standing advisory committee would be useful as a means for suggesting new directions for research. One specific example

17 As reference, the original BJS Advisory Board specified in the Justice System Improvement Act was to have members including "representatives of States and units of local government, representatives of police, prosecutors, defense attorneys, courts, corrections, experts in the area of victim and witness assistance, and other components of the justice system at all levels of government, representatives of professional organizations, members of the academic, research, and statistics community, officials of neighborhood and community organizations, members of the business community, and the general public" (93 Stat. 1178).
where a formal advisory committee would be useful would be in revisiting content in the Law Enforcement Management and Administrative Statistics (LEMAS) survey, as part of implementing a core-supplement design. By its nature, LEMAS is an establishment survey that is targeted at a wide variety of individual law enforcement agencies. However, these agencies may differ in their usage and basic definition of terms; for example, depending on the prevailing definition of "community-oriented policing," some departments might consider themselves to follow that practice whereas others (possibly confounding the term with specific grant/funding streams) may think that they do not. Regular review of the basic language used in the data collection is important to avoid the perception that questions are overly blunt or confusing.

In developing its outreach to its user base, it is important that BJS not neglect the needs and interests of a critical user constituency: members of Congress and their staffs. Steps to assess the issues of interest to the House and Senate Judiciary committees would be useful to build awareness of and interest in BJS products, promote a clearer understanding of what is and is not possible in statistical data collections (as did not seem to occur in developing the PREA reporting requirements), and gain critical support for new and continuing data collections.

Recommendation 5.9: DOJ should take steps to ensure that congressional staff are aware of BJS data that could be used in developing legislation; DOJ and BJS should learn from congressional staff how their data are needed to inform and support legislation so that they can improve the utility of their current data and develop new data sets that could enhance policy development.
5–B.3 Openness About Sources and Limitations of Data

A statistical agency should be open about its data and their strengths and limitations, taking as much care to understand and explain how its statistics may fall short of accuracy as it does to produce accurate data in the first place. Data releases from a statistical program should be accompanied by a full description of the purpose of the program; the methods and assumptions used for data collection, processing, and reporting; what is known and not known about the quality and relevance of the data; sufficient information for estimating variability in the data; appropriate methods for analysis that take account of variability and other sources of error; and the results of research on the methods and data. (National Research Council, 2009:8)

In general, the panel believes that the BJS staff is fully open regarding the strengths and weaknesses of its data series. Its house style for report preparation ensures that even short reports contain a fairly detailed
section on methodology; these sections generally do a good job of presenting synopses of the design of data collections. The recent episodes concerning the 2006 and 2007 releases of data from the NCVS – culminating in the conclusion that the 2006 data constituted a "break in series" (see Section 3–A.3) – are illustrative in this regard. Recognizing the presence of a problem, BJS staff sought external opinions and worked closely with the Census Bureau to try to understand what had occurred. The declaration of a "break in series" was not an easy one to make, but BJS's descriptions of the circumstances in its reports (and the documentation accompanying the archived data file) are certainly candid about the limitations of the data.

However, the "break in series" incident also illustrates a point that we make later in this chapter concerning the technical skill mix of the BJS staff. In such an incident, it would be useful for BJS to have more in-house staff with advanced technical skills, to more completely understand how design changes and sample size reductions combine to produce discrepant effects. BJS shares with other federal statistical agencies a fundamental problem: it has insufficient numbers of technical staff whose primary job is to focus on evaluation of the quality of data collected by and for BJS. Because of this absence, outside users of BJS data have no set of working papers, methodological briefs, or quality profiles that may be consulted to inform themselves of the characteristics of particular data sets or the potential strengths and weaknesses for their specific uses of the data.

The lack of routine evaluation and quality assessment of BJS data is problematic because of the wide variety of sources from which BJS data series are drawn; BJS's correctional data provide a useful example. Much of the correctional data are collected from agencies and institutions that rely on varied local systems of record-keeping.
Heterogeneity in record-keeping standards produces heterogeneity in responses to administrative surveys. For some data collections, such as the National Corrections Reporting Program (NCRP), states may have varying definitions of the race, ethnicity, and schooling of admitted and released prisoners. Detailed instructions for classification and measurement would improve the quality of corrections data reporting.

Recommendation 5.10: To improve the utility and accuracy of the National Corrections Reporting Program (NCRP), BJS should work with correctional agencies to develop their own internal records to promote consistent data collections and expand coverage beyond the 41 states covered in the most recent NCRP.

It follows that the same kind of evaluation of the raw data provided by state and local authorities, coupled with work to promote consistent reporting,
would also benefit BJS's other correctional, law enforcement, and adjudication data series.

5–B.4 Wide Dissemination of Data

A statistical agency should strive for the widest possible dissemination of the data it compiles. . . . Elements of an effective dissemination program [include] a variety of avenues for data dissemination [including, but] not limited to, an agency's Internet website, government depository libraries, conference exhibits and programs, newsletters and journals, e-mail address lists, and the media for regular communication of major findings. (National Research Council, 2009:9)

BJS deserves great credit for its data dissemination efforts, several of which are described in Box 1-1. It makes good use of public use data set archiving through the National Archive of Criminal Justice Data (NACJD); its own website and the OJP-sponsored National Criminal Justice Reference Service provide ready access to an extensive backfile of reports; its website entries for individual reports generally provide the reports in text or print formats and typically include either plain text or spreadsheet tables corresponding to key data tables. As noted in Chapter 4, the state SAC network also provides a means for the dissemination of BJS data and products (and SAC analyses thereof) to local audiences. All of these steps have been a great service to the user community and represent shrewd use of partnerships with outside groups with specific expertise that in-house BJS staff could not do in isolation. The coupling of the public data archive with the regular instructional workshops conducted by the Inter-university Consortium for Political and Social Research is a very valuable service, opening BJS resources to new researchers.
Timeliness of Data Release

Although we laud BJS for its work in data dissemination, this principle does suggest three areas where some further comment is necessary, the first of which concerns the timeliness of data release. Once a report is prepared and new data are ready for release, BJS is very good at executing the release; the problem is that the lag times between data collection and the time of report and data release can be considerable, sometimes taking several years, which hurts the freshness and timeliness of the new results. This is, of course, a fundamental problem that applies to statistical agencies other than BJS: timely release of data is essential for those data to be useful in policy formulation and in research, yet the process of collecting high-quality data, ensuring that quality, and protecting the confidentiality of responses takes time and is not one that can readily be rushed without overburdening respondents.
Finding 5.6: A recurring criticism of BJS data products is that their quality is highly valued but that they are not sufficiently timely to meet user needs. All statistical agencies are attempting to grapple with new data collection designs that offer more timely estimates.

Delays in the release of data arise, and can be particularly pronounced, in those circumstances where BJS is dependent on other agencies, especially the Census Bureau. By this, we do not impugn the Census Bureau but merely note that it has its own privacy protection protocols and data quality procedures that, combined with BJS's own review, can add substantially to processing time. In some instances where the Census Bureau has been the data collection agent, release of data can be obstructed by post hoc determinations that a particular release format would threaten confidentiality. This has been the case with the collection and coding of the industry and occupation data from the Workplace Risk Supplement, for which the Census Bureau has opposed release because the cell sizes for certain occupations are too small. Negotiations with the Census Bureau have continued for so long that these data, collected in 2002, have not yet (as of late 2008/early 2009) been released.

Another case of the Census Bureau restricting or impeding the timely availability of data is the removal of the area-identified NCVS from Census Analysis Centers. These data were available in analysis centers around the nation for a number of years but were subsequently withdrawn amid concerns about confidentiality and documentation. Similar issues have barred the release of a special area-identified data file from the NCVS. Such a file is critical to studying the prospects for local-area estimation from the NCVS, and the file was once made available through BJS's data archive, but it has now been offline and unavailable for about 4 years.
Delays of this extent suggest that something is broken in the relationship between BJS and the Census Bureau and that this breakdown is obstructing the timely release of these data.

In cases where other agencies provide the funding for a data collection, such as supplements to the NCVS, release of the data can be delayed both because BJS and the other agency must each issue a "first" release and because there can be ambiguity over which agency has "control" of the data. All of these factors delay release of the data and should be scrutinized to see whether there could be joint "first" releases or other streamlining of the process. Agreements on supplements or other joint ventures with other agencies could include time limits on the release of data and clearer lines of authority for release.

Maximizing the use of BJS data requires that they be released in a timely and equitable fashion and in formats that facilitate their use, while protecting the confidentiality of the data and furthering the goals of the agency. These
objectives are often conflicting, and balancing them is no simple matter. It would benefit BJS to track the processing that occurs after data collection is complete and to document the times of data collection, report preparation, report release, and data archiving, in order to study which components of processing are most time-consuming (and which may be made more efficient).

More generally, BJS should work to confront the challenge of timely data release in creative ways. One mechanism to consider is the issuance of preliminary estimates, labeled as such and clearly noted as being subject to future revision, that could be released quickly and separately from a fuller, more detailed report containing final estimates. Another (and more elaborate) idea worthy of consideration is the adoption of continuous data collection designs. These designs spread, or diffuse, the sample over time: information is collected from a smaller number of respondents at any given time, but data collectors are in the field on as nearly continuous a basis as possible. These designs have the advantage of avoiding the startup costs of reinventing survey design machinery and sample from scratch every time a new round of data is collected, and their continuous streams of data can also be combined and pooled to produce more timely estimates. With the Census Bureau's introduction of the American Community Survey (ACS), the U.S. public will become accustomed to interpreting period estimates that span several time periods (e.g., 3-year or 5-year averages); opportunities to present BJS data in similar structures should be considered.

Recommendation 5.11: BJS should evaluate each of its data programs to determine whether more timely estimates might be obtained by (a) making discrete data collections into more continuous operations and (b) issuing preliminary estimates, to be followed by final estimates.
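The pooling logic behind such continuous designs is straightforward. A minimal sketch, using synthetic monthly figures rather than any actual BJS series, of how rolling multiyear period estimates in the spirit of the ACS might be computed:

```python
from statistics import fmean

def period_estimates(monthly_values, window):
    """Rolling averages over `window` consecutive months of a
    continuously collected series: each pooled estimate draws on
    many more respondents than any single month, trading some
    currency for precision and stability."""
    return [fmean(monthly_values[i:i + window])
            for i in range(len(monthly_values) - window + 1)]

# Synthetic monthly rates (per 1,000 persons) over five years;
# the gentle upward trend is illustrative only.
monthly = [20 + 0.1 * m for m in range(60)]
three_year = period_estimates(monthly, 36)  # 3-year period estimates
```

Each additional month of fieldwork updates the newest pooled estimate, so results can be refreshed continually rather than awaiting the end of a discrete collection cycle.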
Equitable Release of Data

A second area of discussion about data dissemination is the equitable release of data, meaning that all of the public should generally have access to data releases at the same time, in formats that are conducive to use and interpretation. There may be instances in which some individuals outside of BJS should have access to some data before general release because it furthers the goals of the agency, such as evaluating and maintaining data quality. In those cases, priority access should be available and granted. Joint publications in which BJS staff collaborate with persons outside the agency may also be an acceptable form of early release. Except in circumstances such as these, statistical agencies such as BJS should strive to ensure that individuals' access to data files is on an equal footing.

The formats in which BJS data are released should facilitate the use of those data. Here, format includes the medium by which the data are
made available as well as the content of the releases. BJS, like all statistical agencies, has different formats for different user communities. Written reports and electronic versions of written reports are available for readers who do not wish to manipulate the data. Spreadsheet versions of key tables and some Web-based tools for simple online analysis are provided for users who want to manipulate the data but may lack the sophistication to do so in a complex way. The full data sets are available for the most sophisticated users, who are interested in manipulating the microdata substantially.

It would be helpful for BJS to be more direct in spelling out the logic and connectedness of its product lines and formats. It would be useful for the website for a LEMAS report to indicate that users can go to a separate part of the BJS website to access online analysis options if they cannot find the particular rate or cross-tabulation in the hard-copy and electronic reports; this clue is not immediately obvious. If the online analytical capabilities cannot answer their question, then consumers should be explicitly referred to the NACJD, where they can download data sets. This search logic may be obvious to some but not to others who visit the BJS website, and it is not clear that the formats and product lines currently available reflect a coherent and integrated dissemination plan or strategy.

The increasing sophistication of the public with regard to electronic access to information may warrant a reevaluation of the mix of media used to disseminate BJS data. BJS has already taken steps to reduce the number of reports produced in hard copy by emphasizing online distribution as Portable Document Format (PDF) files; a next step would be to consider ways to reduce paper formats even further and to make better use of hyperlink facilities in PDF files to point users to related reports.
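For concreteness, the "simple online analysis" offered to the middle tier of users amounts largely to cross-tabulation of microdata. A minimal sketch of that operation, with synthetic records and hypothetical field names rather than the actual BJS tools:

```python
from collections import Counter

# Synthetic microdata records; field names are hypothetical.
records = [
    {"region": "South", "offense": "burglary"},
    {"region": "South", "offense": "assault"},
    {"region": "West",  "offense": "burglary"},
    {"region": "West",  "offense": "burglary"},
]

def crosstab(rows, row_field, col_field):
    """Count records for each pair of category values, the basic
    operation behind simple Web-based tabulation tools."""
    return Counter((r[row_field], r[col_field]) for r in rows)

table = crosstab(records, "region", "offense")
print(table[("West", "burglary")])  # prints 2
```

Users who need rates or breakdowns beyond what such a tool offers are the ones who must be pointed onward to the full microdata at the NACJD.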
Some BJS publications, such as the Firearm Inquiry Statistics series summarizing background checks for handgun purchases, have moved to a release format in which the "report" consists almost entirely of data tables, with minimal prose. Finding additional avenues for this format would have the dual benefit of potentially providing more timely release (as discussed above) and freeing staff to spend less time on standard report writing and more on innovation and evaluation. That said, careful prose summaries are also very important, and we do not want to be construed as saying that standard written reports should be abandoned.

These suggestions for improving the dissemination of BJS data will put more strain on an overworked staff. Some of the format changes may free up resources if they reduce the amount of time required in the editorial process. In the short run, the agency might consider making greater use of the NACJD to develop some of the format changes mentioned in the foregoing paragraphs.
Figure 5-5 Bureau of Justice Statistics home page, July 2008
NOTE: URL for home page is http://www.ojp.usdoj.gov/bjs/; this version accessed July 21, 2008.

BJS Web Presence

A third, and final, discussion topic under the general heading of data dissemination concerns an essential tool for such dissemination: BJS's presence on the World Wide Web, the front page of which is illustrated in Figure 5-5. As suggested by our comments earlier in this section, BJS recognizes the importance of its Web presence to the spread of its information. Former BJS Director Jeffrey Sedgwick (2008:2) commented that:

Over the past year, we have continued to develop a new website that will more effectively connect our users with the information they need. The website restructures the way our information is presented, giving users a more intuitive way to retrieve the data they need. Future development will include enhancing our ability to generate custom data tables and other interactive products online.

No website design is perfect in the eyes of every user. It is unclear how useful specific design suggestions would be, though we have indicated preferences for some additional topic pages throughout this report (e.g., summarizing what data are and are not available concerning white-collar crime; Section 2–C.1). However, one point that we do want to raise as BJS revamps its Web presence is to suggest an emphasis on data sourcing and external collaboration. It is worthwhile to frame this discussion by stating a conclusion that is consistent with the principles expected of a federal statistical agency:

Finding 5.7: The credibility of BJS's products is a function of its quality review procedures.

It follows that the BJS "brand" (explicitly being labeled as a BJS product) carries weight and is a meaningful distinction. Hence, there is a need to take care in what gets designated and explicitly linked to as a BJS product. BJS's collaborative projects, such as the Federal Justice Statistics program with the Urban Institute and the Court Statistics Project with the National Center for State Courts, are prone to ambiguity and confusion on this score: BJS's website is sometimes abrupt in linking users to the Urban Institute-hosted Web hub for the federal system statistics. Likewise, the Court Statistics reports sometimes carry a BJS logo, but BJS's sponsorship role (and use of some of the data) is not immediately apparent. To be clear, we do not argue that the reports and portals on non-BJS Web servers are bad in any sense or that the BJS "brand" is being misused by these external placements. Quite to the contrary, the hope is for both BJS and its data collection partners to receive appropriate credit for good work. Accordingly, we conclude and recommend as follows:

Finding 5.8: Several BJS data series are collected and maintained by external organizations linked to the BJS website (e.g., Federal Justice System statistics). It is not clear why some data and reports reside on external websites rather than on the BJS website. It is unclear whether such data and reports achieve the quality standards used by BJS. It is not apparent why some websites are permitted to use the BJS label (http://fjsrc.urban.org).

Recommendation 5.12: BJS should articulate why some data collections are housed on external websites and describe the process by which links to external websites are allowed.
BJS should articulate and justify the use of its insignia on external websites.

We also endorse BJS's efforts to develop the capability for users to perform custom tabulations and data summaries directly through the BJS website, as envisioned by former Director Sedgwick's comments above. By doing so, BJS would establish a more full-fledged Web presence rather than serving principally as a document repository. The current model, under which (generally) some set tabulations are available as spreadsheets but more advanced data users are directed to download raw data files through the NACJD, may actually be said to minimize BJS's presence somewhat: the precise information is available, but not (directly) from BJS. Some of the larger federal statistical agencies, notably the Bureau of Labor Statistics and the Census Bureau (the latter through its "American FactFinder" interface), have made considerable efforts in permitting website users to tabulate (and even to plot
on a map) their own queries of interest. Clearly, the same level of interactive features cannot be expected without commensurate resources, but developing means by which steady streams of researchers, reporters, students, or congressional staff could readily obtain BJS information directly from the BJS site would ultimately be beneficial.

5–B.5 Cooperation with Data Users

[A statistical agency should] seek advice on data concepts, statistical methods, and data products from data users as well as from other professional and technical subject-matter and methodological experts, using a variety of formal and informal means of communication that are appropriate to the types of input sought. (National Research Council, 2009:9–10)

We have described BJS's existing programs for outreach to and feedback from user groups and key constituencies in Section 5–B.2, in the context of the continual search to provide more useful data. Hence, our comments in this section are brief: BJS deserves credit for implementing a variety of outreach venues, and the discussion at the February 2008 users workshop provided ample testimony that there is widespread appreciation of BJS among the user base. BJS's performance is certainly within the norms of other principal statistical agencies, and we suggest that it could be improved still further through the recommendations we offer in the earlier section.

5–B.6 Fair Treatment of Data Providers

[Fair treatment practices include] policies and procedures to maintain the confidentiality of data, whether collected directly or obtained from administrative record sources, [and to] inform data providers of the purposes of data collection and the anticipated uses of the information. . . . [They also include] respecting the privacy of respondents by minimizing the contribution of time and effort asked of them, consistent with the purposes of the data collection activity.
(National Research Council, 2009:10)

Fair treatment practice is largely synonymous with the principle of establishing a relationship of mutual respect and trust with data providers, described in detail in Section 5–A.1. The same general messages apply: BJS is generally very diligent and fair in its relationships with both establishment (state agency or individual facility) and person respondents. However, in our assessment, the PREA reporting requirements to which BJS is currently subject constitute a direct violation of this practice. The relationship of trust within which BJS collects information from its data providers is threatened by PREA because this data collection directly threatens and sanctions the data providers in ways that others do not. When there is direct harm
from PREA participation perceived by a data provider, the other BJS data collections are threatened. Fair treatment of data providers is one of the foundations of trust; violating this practice can have consequences that take decades to undo.

5–B.7 Commitment to Quality and Professional Standards of Practice

A statistical agency should:

• use modern statistical theory and sound statistical practice in all technical work.

• develop strong staff expertise in the disciplines relevant to its mission, in the theory and practice of statistics, and in data collection, processing, analysis, and dissemination techniques.

• develop an understanding of the validity and accuracy of its data and convey the resulting measures of quality to users in ways that are comprehensible to nonexperts. . . . (National Research Council, 2009:11)

As indicated at several points in this chapter, in our judgment, BJS has high standards for quality that are generally well understood. For this, BJS deserves considerable credit; having expressed the point already, we do not reiterate it at length here.

In the area of using modern statistical techniques and data collection practices, we worry that BJS is somewhat out of touch with current developments in statistical data collection. For instance, as described in Box 5-2, the PREA reporting requirements put BJS in a position where the inherent variability in estimates is such that it could not identify the highest- and lowest-ranked facilities as specified by the act (flawed and inappropriate though that requirement is). Instead, BJS chose to list a group of high-incidence facilities that, in some sense, are indistinguishable from each other.
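The underlying difficulty, that sampling variability can swamp the true differences between facilities and make any ranking unstable, can be illustrated with a small simulation; all numbers here are synthetic, not drawn from the PREA collections:

```python
import random

random.seed(7)

# Ten hypothetical facilities whose true incidence rates differ by
# less than the sampling error of their survey estimates.
true_rates = {f"Facility {i}": 0.030 + 0.001 * i for i in range(10)}
se = 0.005  # standard error, comparable in size to the rate spread

def apparent_highest():
    """One simulated round of estimation: true rate plus noise."""
    estimates = {k: random.gauss(v, se) for k, v in true_rates.items()}
    return max(estimates, key=estimates.get)

# Across 1,000 simulated surveys, many different facilities take the
# "highest-ranked" spot, so publishing a single ranking conveys a
# precision the estimates do not support.
leaders = {apparent_highest() for _ in range(1000)}
print(len(leaders))
```

A review attentive to multiple comparisons would ask exactly this question: across plausible realizations of the sampling error, how stable is the published ordering?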
Yet this approach still has the effect of suggesting a level of precision that the estimates simply do not support; though we recognize that BJS faced difficult choices in issuing its PREA reports and that it was undoubtedly correct not to try to match the exact letter of the requirements in the law, the release would have benefited from very rigorous review of other approaches for presenting high-sensitivity data and from attention to issues of multiple comparisons.

Similarly, in our interim report (National Research Council, 2008b:119) we expressed concern about the lack of mathematical statistics and survey practitioner expertise on the BJS staff; in its recent problems with the NCVS and the possible "break in series," BJS was possibly too dependent on the Census Bureau's (unfortunately post hoc) analyses of the effects of design changes and sample size reductions on the final NCVS estimates. Subsequent to our interim report, BJS has created a "senior leader" position among its top management with the idea of bolstering its survey management expertise. This is a very positive development, yet we still suggest that the absence of a chief mathematical statistician is troubling, because such a post (as well as a chief survey methodologist) tends to direct the agency's attention to continual statistical improvement over time. (We return to the issue of staff expertise in Section 5–B.8.)

One way to judge professionalism is to look at methodological contributions made by an agency's staff with the intent of making it easier for users to correctly use and interpret data. One major contribution in this regard was BJS's sponsorship of the development by NACJD staff of a "crosswalk" data set between the FBI's Originating Agency Identifier (ORI) codes and more standard geographic constructs such as cities and counties (Bureau of Justice Statistics, 2004d). The service populations of law enforcement agencies with ORI codes do not necessarily correspond neatly to official geographies and, in many cases, may overlap each other. The crosswalk file approximates the service populations to facilitate some direct comparisons between the FBI's UCR data and other data sources. Other user-oriented methodology contributions include the summary by Langan and Levin (1999) of differences in state prisoner counts when prison records (NCRP) or court records (National Judicial Reporting Program) are used and a series of clear, approachable pieces on the conceptual differences between the UCR and the NCVS (U.S. Department of Justice, Bureau of Justice Statistics, 2004).

Box 5-4 Review Process for an Information Collection by a Federal Agency

The U.S. Office of Management and Budget (OMB) is responsible for reviewing and approving any information collection activity (not only surveys for statistical purposes, but any form or application) that will be administered to 10 or more respondents (44 USC § 3502(3)(A)(1)). This "clearance" process can be time-consuming because it must include two postings in the Federal Register for public comment as well as time for OMB's Office of Information and Regulatory Affairs to render its decision. Agencies develop and submit an Information Collection Review (ICR) request or, more colloquially, a "clearance package," to OMB. In this process, surveys and any information collection making use of statistical methodology (for editing, imputation, or sample selection) are held to a higher standard. All ICRs must include a Part A, giving a detailed justification for the collection and indicating how and for what purpose the data will be used (or, if the ICR is reauthorizing an existing collection, how the data have been used); Part A also includes cost and time burden estimates. Statistical collections must also include a Part B report, which must include details on the sampling strategy for the collection and procedures for handling nonresponse, as well as descriptions of any tests to be conducted prior to full fielding of a collection. Names and contact information of any person consulted on the design of the collection are also required in Part B. OMB maintains a publicly accessible database of pending and completed ICRs, including links to agency-submitted supporting statements, at http://www.reginfo.gov.

The panel also requested of BJS a summary of the professional activities of its staff, in an effort to evaluate whether the staff was connected with networks that would alert them to new developments in statistical design, data collection, and estimation. BJS staff are frequent participants in interagency working groups drawing staff from the range of federal statistical agencies. Several of these activities are topic working groups of the Federal Committee on Statistical Methodology, itself an interagency working group coordinated by OMB. Other interagency groups to which BJS contributes members are the Federal Interagency Forum on Aging-Related Statistics, the Interagency Forum on Child and Family Statistics, and the Interagency Subcommittee on Disability Statistics. As a stakeholder and sponsor of the Census Bureau's demographic surveys program, it also participates in several interagency working groups organized by the Census Bureau, specifically those on the ACS and Sample Survey Redesign (updating sample and addresses for demographic surveys based on new census results). On the international level, BJS staff have also participated in relevant statistical programs of the United Nations Economic Commission for Europe (UNECE) and the Organisation for Economic Co-operation and Development, including specific UNECE task forces on victimization surveys and statistical dissemination and communication. However, the bulk of its staff professional and working group activities are internal to DOJ, ranging from membership on NIJ committees on drugs and crime and on the evaluation of justice on American Indian reservations and tribal lands to membership on the Bureau of Prisons' institutional review board. Collectively, these efforts suggest attempts to build ties and outreach to other units in DOJ, and hence to increase BJS's relevance to DOJ, which we encouraged and recommended above.
However, the range of these activities is largely insular to the Justice Department and the executive branch; this bolsters the importance of the outreach efforts, including an advisory panel, suggested above.

Though there is much to commend in BJS's professional standards of practice, there is one area where BJS often displays, publicly, a marked weakness: the preparation of supporting statements for its information collections. As described in Box 5-4, all federal agency requests to collect information from 10 or more respondents must be cleared with OMB, in compliance with the Paperwork Reduction Act. For collections involving statistical methodology, the bar for approval is set higher; the Information Collection Review (ICR) packages submitted to OMB must include a "Part B" return providing details on sample construction, procedures for collecting and processing information, and pretests of survey instruments. Public versions of the ICRs are browsable online at http://www.reginfo.gov by searching for data collections listed under OJP.

On one hand, preparation of ICR supporting statements could be seen as no more and no less than clearing a bureaucratic hurdle. On the other,
however, a reading of many of BJS's submissions over the past few years suggests a surprising and disappointing lack of specificity, as well as less-than-compelling arguments for the necessity and utility of the studies. Questions on the justification for an information collection are usually answered along strictly legal lines, citing BJS's general mandate to "collect and analyze statistical information, concerning the operations of the criminal justice system at the Federal, State, and local levels" (see Box 1-2) and usually including a copy of that section of the U.S. Code as an attachment. Rarely does the justification section indicate how the collection fits with, supplants, or is superior to existing data series, and information on the uses to which the data will be put is sparing. As strong as the methodology sections of BJS's final reports are, its technical specifications in the information collection requests, language that ought to be, effectively, the first draft of the technical documentation for a new data set, are strikingly weak. Examples of these ICR packages, and deficiencies in their supporting documentation, are described in Box 5-5.

Though they are, functionally, a bureaucratic step, the ICRs that BJS develops to obtain clearance from OMB are also a first opportunity to carefully explain the rationale for data collections from the substantive and technical viewpoints. They are also first drafts of the technical documentation for new data series and templates for actual data collection efforts. On these dimensions, neither new nor continuing BJS data collections are helped by having weak and deficient supporting statements made for them in a public (if not widely viewed) forum.

5–B.8 Active Research Program

A statistical agency should have a research program that is integral to its activities.
Because smaller agencies may not be able to afford as extensive a research program as larger agencies, agencies should share research results and methods. Agencies can also augment their staff resources for research by obtaining the services of experts not on the agency's staff through consulting or other arrangements as appropriate. (National Research Council, 2009:11)

Some of the estimates produced from BJS data have acquired a status as national benchmarks that should be preserved, and the agency's products are known for their quality standards and objectivity. To be sure, maintenance of series continuity is, properly, a high priority; this is because estimates of change, and especially change over a relatively long period of time, are among the most important pieces of information that these long-term data resources can provide.

At the same time, it is important for statistical agencies to ensure that their product lines are current both substantively and methodologically. As
Box 5-5 Problems in Bureau of Justice Statistics Information Collection Requests

BJS's Information Collection Request (ICR) package for the proposed Census of Law Enforcement Aviation Units (ICR 200708-1121-002) is a useful example. The abstract mentions that the collection is "part of the BJS Law Enforcement Management and Administrative Statistics program," and the statement on the necessity of the collection references 2003 LEMAS data:

It is estimated that about 250 law enforcement aviation units are in operation among State and local agencies in the United States. These units operate an estimated 1,000 aircraft, including about 600 helicopters and 450 fixed-wing aircraft. The 2007 Census of Law Enforcement Aviation Units will be a census of all agencies, sampled in the 2004 LEMAS survey, which reported having either a fixed-wing aircraft or helicopter. It will be the most comprehensive study conducted in this area to date. The data collection will include detailed items on the functions, personnel, equipment, record keeping, expenditures, and safety records of these units.

The basic cited need for the collection is homeland security-tinged ("it is important to know the location and nature of available assets that could be mobilized in the event of large-scale regional or National emergencies"), with the add-on mention that "this information is also critical to law enforcement policy development, planning, and budgeting at all levels of government." The description is muddled as to whether the data are intended to draw some inference about characteristics of agencies that maintain aviation units (e.g., through the detailed items on equipment and safety records) or to serve as a convenient directory of relevant agencies (for mobilization purposes).
The statement is further unclear about how the collection fits with the broader LEMAS program, whether the information is sufficiently important that it should be collected on a regular basis, and whether there is any auxiliary information to evaluate the accuracy of the 2003 estimate that about 250 agencies have such units.

Most disappointing in this ICR, however, is the Part B return on statistical methods. Save for BJS contact information, what is supposed to be a fairly detailed technical specification of data collection techniques and planned methodologies runs about half a page, as follows:

Universe and Respondent Selection: This data collection will be a census of law enforcement aviation units from among agencies with 100 or more officers. No sampling is involved with this collection.

Procedures for Collecting Information: The census will be conducted initially by mailout. The address mailing list will be updated prior to mailout in order to maintain a current list of the respondents. Personal telephone interviews will be conducted for non-respondents.

Methods to Maximize Response: We will do everything possible to maximize response, including telephone facsimile transmission, telephone interviews, and on-site assistance. Response rates for prior BJS law enforcement surveys and censuses have typically been 95% and above.

Testing of Procedures: The census instrument has been pretested in three selected jurisdictions by individuals that will be receiving the final census instrument. Comments received as a result of that testing have been incorporated into the census instrument accompanying this ICR.
The grounds for criticism of this extremely scant statement are numerous:

• The proposed collection shares with its fellow special-agency censuses a lack of clarity over whether the collection is intended as a "survey" or a "census." Throughout the rest of the document, and in the title of the collection, "survey" had been used; in the Universe and Respondent Selection section, "census" suddenly becomes the preferred choice.

• Regardless of the "survey" or "census" label, the primary source of contact information is the existing LEMAS survey; even if the aviation unit study is meant as a census, the method of construction of its frame/address list (LEMAS) should be described in more detail. Part A suggests that the LEMAS listings would be supplemented by listings from the Airborne Law Enforcement Association and the International Association of Chiefs of Police; coverage properties for either of those lists are missing, as is any hint of how many additional units might be added through reference to those lists. The restriction to agencies with 100 or more officers is not previously mentioned or described further.

• The statement gives no notion of whether and how the contact strategy differs from that of the main LEMAS collection or, indeed, of who will carry out the collection. Likewise, any formal connection to the basic LEMAS survey (e.g., whether the results of the aviation-specific study might be used to revise questions on the main survey) is unspecified.

• The reference to providing "on-site assistance" is vague: does it refer to follow-up by a field interviewer?

• The reference to response rates in previous law enforcement surveys is interesting but unpersuasive; a better point of comparison might be similarly scoped attempts to canvass special units within departments rather than the main LEMAS survey.

• The final section, on testing of procedures, is particularly uninformative.
How were the pilot jurisdictions chosen? Were there any difficulties encountered in the questionnaire, such as terminology usage? Did specific comments from the pilot respondents lead to changes in the contact strategy?

The Law Enforcement Aviation Unit ICR is an example of particularly weak justification and technical specification statements, but a reading of other BJS-prepared ICRs shows similar deficiencies. BJS's request for clearance of the 2007 Survey of Law Enforcement Gang Units (ICR 200705-1121-001) shared some gross features of the aviation unit ICR, again using the "survey" nomenclature but describing the effort as a "nationwide census of all law enforcement gang units operating within police agencies of 100 or more officers." The supporting statement for the gang unit study does not explain whether any other data sources besides previous LEMAS returns are to be used to build the frame of dedicated gang units, leaving it unclear whether the collection is indeed a census (a canvass of all known gang units) or a survey (probability sample). In another example, the section on testing of procedures in the ICR for the 2007/2008 National Survey of Prosecutors says that "the survey instrument was previously pretested with 310 jurisdictions during the 2005 data collection whereby BJS received a 99% response rate" (ICR 200704-1121-004). However, other portions of the statement make clear that the newer 2007/2008 version was purposely designed as a complete census of prosecutor offices, meaning that questions were revised and the number of questions was scaled back. Since this makes the newer survey different in scope and character from the 2005 version, the 2005 response rate, though impressive, fails to answer the question of experience in pretesting the questionnaire.
As is true of other statistical agencies facing tight resources, BJS has been forced into an overriding focus on basic production of a set of data series and standard reports, at the expense of research, development, and innovation. As we discussed in Section 3-F.2, the performance measures in BJS's strategic plan are largely ones of volume and throughput (counts of file access on the NACJD, number of reports and supporting material accessible on the BJS website, number of data collections performed or updated each year) that lack a forward-looking focus on improvements in methodology and options for improving content.

A statistical agency should be among the most intensive and creative users of its own data, both to formally evaluate the quality and properties of its data series and to understand the findings from those data and shape future refinements. BJS's "Special Reports" series has, in the past, gone into depth on topics not routinely studied by the agency's standard reports or has taken unique looks at BJS data, such as age effects in intimate partner violence (Rennison, 2001), the interaction between alcohol and criminal behavior (Greenfeld, 1998; Greenfeld and Henneberg, 2001), and the prevalence of ever having served time in prison among the U.S. population (Bonczar and Beck, 1997; Bonczar, 2003). These reports have also provided some opportunity for BJS analysts to make use of multiple BJS data sets or combine BJS data with non-BJS data sets in interesting ways:

• To study educational attainment in the correctional population, Harlow (2003) studied data from BJS's prisoner and jail inmate surveys, its 1995 Survey of Adults on Probation, the Current Population Survey of the Bureau of Labor Statistics, and the 1992 National Adult Literacy Survey of the National Center for Education Statistics.
⢠Zawitz and Strom (2000) combined data from the NCVS and multiple data series from the National Center for Health Statistics to describe both lethal and nonlethal violent crime incidents involving ï¬rearms. ⢠Greenfeld (1997) combined information from the UCR, the NCVS, and BJSâs corrections and adjudications to summarize the state of quantitative information on sex offenses including rape and sexual as- sault. Moreover, in fairness, BJS deserves credit for several innovative tacks that it has taken. Although full use of electronic questionnaires took consid- erable time, BJS and the NCVS were (through its work with the Census Bu- reau) relatively early adopters of computer-assisted methods in major federal household surveys. And, though we have argued at length that the reporting requirements are inappropriate, BJSâs work on data collections in support of PREA led the agency to make great strides in the use of ACASI and other techniques for interviewing on sensitive topics. BJS has also demonstrated itself to be effective and innovative in developing data collection instruments
to confront very tough methodological problems: identity theft, hate crimes, police-public contact, and crimes against the developmentally disabled.

But innovative in-house data analyses by BJS have slowed in recent years as the focus on production has increased and resources have tightened; major methodological innovations such as the use of ACASI were possible because PREA carried with it substantial funding. BJS's need to update long-standing products and keep activities in place, for basic organizational survival, has too frequently trumped innovative research and intensive exploration of new and emerging topic areas. Indeed, the principal means for identifying "emerging data needs" cited in BJS's strategic plan is not examination of the criminological literature or frequent interaction with criminal justice practitioner communities, but rather "emerging data needs as expressed through Attorney General priorities and Congressional mandates" (Bureau of Justice Statistics, 2005a:32).18 In our assessment, the lack of a research program (and the capacity for a research program) puts BJS and its data products at risk of growing stagnant and becoming less relevant.

Finding 5.9: The active investigation of new ways of measuring and understanding crime and criminal justice issues is a critical responsibility of BJS. The agency has lacked the resources needed to fully meet this responsibility and, for some issues, has fallen behind in developing such innovations.

Finding 5.10: BJS has lacked the resources to sufficiently produce new topical reports with the data it currently gathers. It also lacks the resources and staff to routinely conduct methodological analyses of changes in the quality of its existing data series and to fully document those issues.
Instead, the BJS production portfolio is limited primarily to a routine set of annual, biannual, and periodic reports and, for some topics, the posting of updated data points in online spreadsheets.

In our interim report, we made specific recommendations to stimulate research directly related to the NCVS, specifically calling for BJS to initiate studies of changes in survey reference period, improvements to sample efficiency, effects of mixed-mode data collection, and studies of nonresponse bias (National Research Council, 2008b:Recs. 4.2, 4.7, 4.8, 4.9). In response, BJS quickly issued requests for proposals for external researchers to conduct such studies, and has also signaled its intent to conduct a survey design competition to evaluate broad redesign options (Rec. 5.8 in National Research Council, 2008b). This is a laudable reaction that is a step toward laying out more concrete options for and future activities related to the NCVS, BJS's largest data program, but a fuller research program is critical to future-oriented option development for BJS's non-NCVS programs. It is also critical to avoid implementation problems such as those experienced in the 2006 administration of the NCVS. As we noted in our interim report, "design changes made (or forced) in the name of fiscal expediency, without grounding in testing and evaluation, are highly inadvisable" (National Research Council, 2008b:83). To this end, a short recommendation that we offered in our interim report (National Research Council, 2008b:Rec. 4.1) is worth formally restating here:

Recommendation 5.13: BJS should carefully study changes in the NCVS survey design before implementing them.

It follows that this guidance can be applied to changes to other BJS data collections, and that such evaluative studies are not possible without the resources necessary to make innovative research a priority for the agency. Congress and the administration cannot reasonably expect BJS to shoulder daunting data collection requests without the agency engaging in ongoing research, development, and evaluation. Going forward, a key priority should be detailed error analysis of the NCVS to get a sense of how big a problem survey nonobservation may be in specific socioeconomic subgroups, as the basis for understanding where improvements may most properly be made. On a related matter, BJS research activities should also be directed at improving outreach and data collection coverage of groups that are traditionally hard to reach by survey methods; such groups include new immigrant groups, persons and households where English is not the primary spoken language, young minorities in urban centers, and the homeless.

Recommendation 5.14: BJS should study the measurement of emerging or hard-to-reach groups and should develop more appropriate approaches to sampling and measurement of these populations.

18 "In addition," the plan notes shortly thereafter, "BJS staff meet regularly with Federal, State, and local officials to identify emerging data needs or desirable modifications to existing collection and reporting programs" (Bureau of Justice Statistics, 2005a:32).
In the following, we suggest a few selected areas for a BJS research program. These should not necessarily be interpreted as the only or as the most pressing research priorities, but we believe they are all important directions.

In terms of methodological innovations, BJS should consider greater use of model-based estimation. In our interim report, we recommended investigation of such modeling for the generation of subnational estimates from the NCVS (National Research Council, 2008b:Rec. 4.5); improving the spatial and, perhaps, temporal resolution of estimates from the NCVS remains the highest priority in this regard, but the methodology could be brought to bear in other areas. The development of small-area estimates is particularly pressing because the agency is often criticized for not being able to speak to subnational areas. Modeling can also refer to the use of multivariate analyses to control for factors that mask real changes in the phenomenon of
interest. Just as many economic indicators are adjusted for inflation or seasonal fluctuation, it would make sense to adjust crime rates for factors that mask important variation. Age-adjusting crime rates, for example, would help separate the effects of macro-level social changes (over which one has little control) from more troubling and actionable changes in the incidence of crime. The same can be said of incarceration rates: adjusting admission rates for the volume of crime would provide a perspective on the use of incarceration not available in simple population-based rates. Modeled data should surely be used when we know that right or left censoring of data makes data incomplete and inaccurate. For years BJS published estimates of time served in prison using exiting cohorts when it knew that this seriously underestimated the time served. This is a case where model-based estimates would most certainly have been more accurate than data-based estimates.

However, greater use of model-based estimates must be approached with caution, for several reasons. One is the challenge of interpretation: modeling may not be understood by many consumers of BJS data. This may be largely a presentational problem that can be solved by presenting the estimates simply and then providing the detailed description of the modeling elsewhere. The use of double-decrement life tables by Bonczar and Beck (1997) (later updated by Bonczar, 2003) is a good illustration of how modeling could be presented in BJS reports. Another challenge is that models are always based on assumptions, assumptions that can be more or less accurate or robust (and there can be wide disagreement over what is accurate or robust). Hence, situations where the choice of assumptions may be interpreted as reflecting political or other bias should be avoided.
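The age-adjustment idea can be made concrete with direct standardization, the same device used for mortality statistics: apply each period's age-specific rates to a fixed reference age distribution, so that shifts in the population's age mix do not masquerade as changes in crime. The sketch below is a minimal illustration; every rate and population share in it is hypothetical, not an actual NCVS or UCR figure.

```python
# Minimal sketch of direct age standardization of a crime rate.
# All rates and population shares below are hypothetical.

# Age-specific victimization rates (per 1,000 persons), assumed unchanged
# between the two years being compared.
rates = {"12-24": 60.0, "25-49": 30.0, "50+": 10.0}

# Hypothetical age distributions: year 2's population is older.
shares_y1 = {"12-24": 0.30, "25-49": 0.45, "50+": 0.25}
shares_y2 = {"12-24": 0.22, "25-49": 0.45, "50+": 0.33}

# A fixed "standard" age distribution used for both years.
standard = {"12-24": 0.26, "25-49": 0.45, "50+": 0.29}

def overall_rate(age_rates, age_shares):
    """Population-level rate implied by age-specific rates and an age mix."""
    return sum(age_rates[a] * age_shares[a] for a in age_rates)

crude_y1 = overall_rate(rates, shares_y1)  # about 34 per 1,000
crude_y2 = overall_rate(rates, shares_y2)  # about 30: an apparent decline
adj_y1 = overall_rate(rates, standard)     # about 32
adj_y2 = overall_rate(rates, standard)     # about 32: no real change
# The crude rates fall only because the population aged; the standardized
# rates correctly show that age-specific victimization did not change.
```

The same arithmetic applies to the incarceration example: dividing admissions by the volume of crime rather than by population is simply a different choice of denominator, one that isolates the use of incarceration from changes in the amount of crime.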
A more basic methodological development, but still a complex research effort, would be for BJS to invest in the creation and revision of basic classifications and typologies for crime and criminal justice matters. Its role in coordinating information from a variety of justice-related agencies and promoting standards through National Criminal History Improvement Program-type grants for improvement of source databases gives BJS unique advantages in taking on such an effort. The classification of "index crimes" used in the UCR has changed little in 80 years and remains the nation's major crime classification; its implications for what crimes are most serious are central to the definitions used in the NCVS and other BJS collections. Yet the interest in crime and the amount of information available on crime have changed greatly over those 80 years, and the basic classification of crime should be revisited to keep pace with these changes.

BJS should also invest some effort in getting denominators for risk rates that are more reflective of the at-risk population. Major cities, for example, are disadvantaged in the annual crime rankings of jurisdictions based on UCR data because their rates are based upon their residential population, a
base that excludes the commuters, shoppers, and culture seekers who contribute to the numerators of the rates. Likewise, incarceration rates based on the entire population are technically correct but may be otherwise misleading, because very young and very old populations are not at risk. The generation of risk rates should not be restricted to the data generated by BJS but should use other data as long as the quality and periodicity of those data are acceptable. To report estimates from BJS's inmate surveys as proportions of the prison population misses a great opportunity to understand much better how the nation uses its prison resources; incarceration rates reflecting the general household population (as in Bonczar, 2003) may be uniquely informative.

5-B.9 Strong Internal and External Evaluation Program

Statistical agencies that fully follow [this set of prescribed practices] will likely be in a good position to make continuous assessments of and improvements in the relevance and quality of their data collection systems. . . . Regular, well-designed program evaluations, with adequate budget support, are key to ensuring that data collection programs do not deteriorate. (National Research Council, 2009:47, 48)

The practice of instituting a strong internal and external evaluation program is a new addition to the fourth edition of Principles and Practices of a Federal Statistical Agency. It is similar to the practice of an ongoing research program (Section 5-B.8) but has slightly different connotations, emphasizing not only the continuous quality assessment of individual data collection programs but also periodic examination of the quality and relevance of an agency's entire data collection portfolio.
With respect to this practice, it is very much to BJS's credit that it has periodically sought the advice of external users and methodologists on specific methodological problems, that it engaged in the intensive rounds of testing and evaluation that led to the redesigned NCVS in the early 1990s, that it regularly receives feedback on data quality from its state SAC network and JRSA, and that it actively sought and encouraged this panel's review of the full BJS portfolio. Like other small statistical agencies, BJS is limited by available resources in its ability to mount large-scale evaluation efforts. Still, attention to internal and external evaluation is critical. Indeed, some of the guidance we offer in this report (for instance, on emphasizing the flows from step to step in the justice system within existing BJS data sets and facilitating linkage between current data sets; Section 3-F.1) depends critically on careful evaluation of the strengths and limitations of current data collections and structures as a first step.

One general direction for improvement by statistical agencies, including BJS, is greater attention to known data quality issues and comparisons with
other data resources as part of the general documentation of data sets. BJS reports are generally careful to include a concise methodology section, and the public-use data files that are accessible at the NACJD typically include additional detail in their codebooks. Still, as a general practice, BJS should work to find ways to improve the documentation on its major data holdings that is directly accessible from BJS. This could include developing and making available technical reports based on specific user experiences and providing direct links to Census Bureau (and other BJS-contracted data collection agents) technical reports on developing specific survey instruments.

As part of an evaluation program, it would also be useful for BJS to move beyond individual series examinations and approach critiques of the relative quality of multiple sources. This work should be done in partnership with other statistical agencies or data users, as we describe below in Section 5-B.11 for comparing BJS's prison and jail censuses with the data quality and resolution provided by the Census Bureau's ACS. Other examples for multiple-source evaluation include:

• Examination of differences between homicide rates computed from the UCR data and those from the cause-of-death data coded in the vital statistics that are compiled by the National Center for Health Statistics;

• Reconciliation of the number of gunshot victims known to the police (or measured in emergency room admissions data) with the number of self-reported gunshot victims in the NCVS (see, e.g., Cook, 1985); and

• Examination of the reasons why serious-violence victimization rates from the NCVS and School Crime Supplement differ from those derived from CDC's Youth Risk Behavior Surveillance System.
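As a minimal sketch of what the first of these comparisons involves, counts from the two systems can be put on a common per-100,000 basis and the residual gap flagged for explanation (coverage, definitional differences, timing of recording). Every number below is invented for illustration and is not an actual UCR or NCHS figure.

```python
# Hypothetical two-source check of a homicide rate, in the spirit of
# comparing UCR offense counts with NCHS cause-of-death data.
# Every count and population here is invented for illustration.

def rate_per_100k(count, population):
    """Events per 100,000 population."""
    return count / population * 100_000

population = 300_000_000   # hypothetical national population
ucr_homicides = 16_000     # homicides known to police (hypothetical)
nchs_homicides = 17_200    # deaths coded as homicide (hypothetical)

ucr_rate = rate_per_100k(ucr_homicides, population)    # about 5.3 per 100,000
nchs_rate = rate_per_100k(nchs_homicides, population)  # about 5.7 per 100,000

# The ratio of the two counts summarizes the discrepancy that a
# reconciliation study would need to explain (jurisdictional reporting
# gaps, definitional differences, timing of death versus offense).
coverage_ratio = ucr_homicides / nchs_homicides        # about 0.93
```

A real reconciliation would, of course, align reference years, geographic coverage, and victim-versus-offense counting rules before attributing the residual gap to either system.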
5-B.10 Professional Advancement of Staff

To develop and maintain a high-caliber staff, a statistical agency must recruit and retain qualified people with the relevant skills for its efficient and effective operation, including analysts in fields relevant to its mission (e.g., demographers, economists), statistical methodologists who specialize in data collection and analysis, and other specialized staff (e.g., computer specialists). (National Research Council, 2009:12)

At the panel's request, BJS supplied biographical information for its staff members as of fall 2008. A total of 32 of the 53 staff members hold positions with labels connoting direct statistical work (statistician, senior statistician, or branch chief); 12 have doctoral degrees (with an additional five listed as being Ph.D. candidates) and nearly all list master's degrees. However, none holds a doctoral or master's degree in statistics, although two statisticians have completed master's degrees in the Joint Program in Survey Methodology of the University of Maryland, the University of Michigan, and Westat.
Indeed, the only formal statistics degree on the full BJS staff is a bachelor's degree, held by a specialist on the support staff. Not surprisingly, advanced degrees in criminology (or criminal justice) or sociology abound, though other fields such as social psychology, social welfare, and public affairs are also represented. The statistician ranks in BJS also include one holder of a law degree.

Our review of the staff biographies, and of BJS's publications throughout this report, suggests a very capable and dedicated staff, with a median length of service of about 8 years and including several career staff members of 20 years or more. Our intent is not to impugn the good work of the BJS staff. However, in Section 5-B.7 and our interim report, we commented on the need for more highly skilled technical leaders within BJS; we think this is necessary to put BJS on a better footing in dealing with its external data collection agents, to cultivate a climate of research and innovation, and to safeguard the continued credibility and quality of BJS data. Going further, we suggest that BJS would benefit from additional staff expertise in mathematical and survey statistics; computer science and database management are also notable deficiencies in staff expertise, given the agency's role in executing grants to improve criminal justice databases and the importance of record linkage for conducting longitudinal studies of flows in the justice system.

Recommendation 5.15: BJS must improve the technical skills of its staff, including mathematical statisticians, computer scientists, survey methodologists, and criminologists.

At the same time, the panel notes that the recruitment problem for technical staff is a large one for all statistical agencies.
The agencies in the federal statistical system that seem to do better on this score are those that actively support advanced degrees among their junior staff: making human capital investments in bachelor's-level staff and assisting their graduate studies to yield more technically astute staff in 2-4 years. In addition, agencies have sponsored dissertation fellowships on their own data, using the contact with the Ph.D. candidate to recruit talented staff.

5-B.11 Coordination and Cooperation with Other Statistical Agencies

Although agencies differ in their subject-matter focus, there is overlap in their missions and a common interest in serving the public need for credible, high-quality statistics gathered as efficiently and fairly as possible. When possible and appropriate, federal statistical agencies should cooperate not only with each other, but also with state and local statistical agencies in the provision of data for subnational areas. (National Research Council, 2009:13)
There are some valuable and mutually productive partnerships between BJS and other statistical agencies. These include relatively long-term arrangements, such as the National Center for Education Statistics' sponsorship of the School Crime Supplement, as well as one-time collaborations, such as a joint report by BJS and CDC staff on findings from the NCVS on injuries sustained in the course of violent crime victimizations (Simon et al., 2001). BJS has also enjoyed some collaborative work with the National Center for Health Statistics, including use of vital statistics data collected from state public health departments and registrars. BJS has also, on occasion, worked with agencies that are not principal statistical agencies but that do conduct statistical work; for instance, BJS sponsored the Consumer Product Safety Commission to add a Survey of Injured Victims of Violence as a module to the commission's National Electronic Injury Surveillance System, a sample of hospitals that provide their emergency department records for coding and analysis (Rand, 1997).

Of course, BJS's most intensive relationship with another statistical agency is with the Census Bureau. Although there are some cooperative aspects of the partnership between the two agencies, the panel believes that there are some fundamental strains in the relationship. One is that, as noted in the preceding section, BJS has lacked the strong statistical expertise to fully engage with the Census Bureau staff on design (and redesign) issues, and so its role in modifying the NCVS to fit within budgetary constraints has largely been one of deciding which Census Bureau-developed cost-saving options are least objectionable. Another element of strain is discussed in our interim report (National Research Council, 2008b:Sec. 5-D): the failure of the Census Bureau to provide transparency in its costs and charges for data collection to BJS (or its other federal agency sponsors), which makes assessments of the trade-offs between survey costs and errors impossible.

Agencies that contract out much of their work, and BJS is one of the extreme cases within the statistical system in that regard, can easily evolve into ones where contract management is the dominant focus. While more (and more sophisticated) technical staff will not solve the BJS budget problems, they can make BJS a stronger partner to the other statistical agencies with which it works.

On substantive grounds, an important area in which a healthy BJS-Census Bureau relationship and collaboration would be beneficial is in reconciling BJS's corrections data series with the Census Bureau's measures of the correctional institution population. The American correctional apparatus has grown enormously since the mid-1970s; there are now on the order of 2.3 million persons in prison or jail, and the incarceration rate has grown fourfold since 1980. Another 800,000 people are on parole, and 4.2 million are on probation. Virtually all the growth in incarceration since 1980 has been among those with less than a high school education. In this context, the
BJS data collections are a valuable supplement to the large Census Bureau household surveys, which are drawn exclusively (or nearly so) from the noninstitutional household population. BJS collections on the population under correctional supervision are not just an important part of an accounting of the criminal justice system, but an increasingly important part of the nation's accounting for the population as a whole. Those groups overrepresented in prison populations (minority men under age 40 with little schooling) are also significantly undercounted in household surveys and other data collections from the general population.

The Census Bureau's ACS now contains the detailed social, demographic, and economic questions that were traditionally asked of a sample of the population through the "long form" questionnaire of the decennial census. When the ACS entered full-scale collection earlier this decade, it also included coverage of the group quarters (nonhousehold) population, including prisoners. The first 3-year-average estimates from the ACS for areas with populations of 20,000-65,000 only became available in 2008, and the first 5-year-average estimates for all geographic areas (including those under 20,000 population) are only slated for release in 2010. Hence, the properties of these estimates, much less their accuracy for segments of the relatively small group quarters population, are only beginning to be studied and understood. Going forward, an important question will be how the most accurate picture of the prison and jail population can be derived, balancing the ACS estimates with the annual count (and basic demographic) information from BJS's prison and jail censuses and the detailed information available from BJS's inmate surveys.

5-C SUMMARY

The panel believes that BJS and DOJ should conduct continual examination of BJS's fulfillment of the principles and practices of a federal statistical agency.
Our panel's review found that the perceived independence of the agency was severely shaken by recent events. We found that the trust of data providers is threatened by BJS directly assisting regulatory activities. We also found that a renewed emphasis on increasing the technical and research skills of BJS's staff is needed.