Suggested Citation:"4 State and Local Partnerships." National Research Council. 2009. Ensuring the Quality, Credibility, and Relevance of U.S. Justice Statistics. Washington, DC: The National Academies Press. doi: 10.17226/12671.

–4–
State and Local Partnerships

THE UNITED STATES has a significant national justice system, composed of the federal court and penal systems and numerous federal law enforcement agencies. However, the vast majority of the activity related to crime and justice occurs at the subnational level. Most crime is pinpointed geographically, and much of the response to crime is handled by police, courts, and correctional facilities at the state, county, and municipal levels.

The Bureau of Justice Statistics (BJS) thus shares with many of its fellow federal statistical agencies the challenge that it is a national government agency tasked to measure phenomena that are inherently local in nature. To meet this challenge, it has been common for federal statistical agencies to forge partnerships with state and local governments. These partnerships vary in their level of formality, in their goals and objectives, and in the fiscal resources dedicated to them on both the federal and state sides. In some cases, they are as basic as establishing regional offices to make interactions with local authorities more convenient and to serve as a venue for dissemination of information; in others, the federal agency and individual state governments are essentially equally committed to the partnership, jointly funding and staffing data collection operations. Federal-state partnerships also include models in which the federal agency's role is principally one of coordination and compilation, directly accumulating data provided by local authorities and piecing together national files.

Under its current legal authority, BJS has at least two distinguishing characteristics in terms of its work with state and local governments, relative to its peers in the federal statistical system. One is the boldness with which state and local issues are written into BJS's legal mandate: in no uncertain terms,

BJS's authorizing language mandates that "[BJS] shall give primary emphasis to the problems of State and local justice systems" (42 USC § 3731), and its list of legal duties is replete with references to performing studies "at the Federal, State, and local levels" (see Box 1-2). The second is BJS's explicit charter—inherited from the functions of the former Law Enforcement Assistance Administration and consistent with the function of BJS's parent Office of Justice Programs (OJP)—to provide direct financial and technical assistance to local governments and agencies, rather than solely conduct data collection functions.

It follows that an assessment of BJS's programs and functions must pay particular attention to the agency's interactions with state and local governments, evaluating the effectiveness of these partnerships and contemplating the role of BJS's grant programs for local authorities. In this chapter, we discuss the centerpiece of BJS's State Justice Statistics (SJS) program—BJS's network of state Statistical Analysis Centers (SACs; Section 4–A)—and directly compare BJS's work with the states to the models of federal-state cooperation in other parts of the federal statistical system. We then turn to BJS's principal grant program, the National Criminal History Improvement Program (NCHIP; Section 4–B).

This chapter on state and local partnerships is also the most logical place to explore in depth federal and state roles in the compilation of one of the longest-standing statistical series in the criminal justice system—albeit not one administered by BJS. For decades, state and local police departments have supplied crime count data to the U.S. Federal Bureau of Investigation (FBI) as part of the Uniform Crime Reporting (UCR) program. As part of the panel's charge to examine BJS's relationship to other data-gathering entities in the U.S.
Department of Justice, we discuss the current and future state of the UCR and BJS's role relative to that of the FBI in this series; this discussion is in Section 4–C.

4–A STATE JUSTICE STATISTICS: THE STATISTICAL ANALYSIS CENTER NETWORK

BJS's network of state-based SACs actually predates the creation of BJS in its current form. The same section of the 1968 Omnibus Crime Control and Safe Streets Act that authorized the new Law Enforcement Assistance Administration (LEAA) to collect statistical information also directed the LEAA to (P.L. 90-351 § 515(c)):

cooperate with and render technical assistance to States, units of general local government, combinations of such States or units, or other public and private agencies, organizations, or institutions in matters relating to law enforcement.

The LEAA's National Criminal Justice Information and Statistics Service started the Comprehensive Data Systems (CDS) program in 1972, providing the earliest funds for establishment of SACs. As summarized by the Justice Research and Statistics Association (2008):

The CDS guidelines established six objectives for the SACs:

• provide objective analysis of criminal justice data, including data collected by operating agencies;
• generate statistical reports on crime and the processing of criminal offenders in support of planning agencies;
• coordinate technical assistance in support of the CDS program in the state;
• collect, analyze, and disseminate management and administrative statistics on the criminal justice resources expended in the state;
• promote the orderly development of criminal justice information and statistical systems in the state; and
• provide uniform data on criminal justice processes for the preparation of national statistical reports.

Ten states established SACs or designated existing agencies as SACs in 1972; by 1976, the SAC network had grown to 34 states, and a nonprofit association—the Criminal Justice Statistics Association—developed to support and coordinate SAC activities. In 1991, the association renamed itself the Justice Research and Statistics Association (JRSA). BJS assumed responsibility for the SAC program in 1979 as LEAA was phased out (to be replaced by OJP), and it has retained this role since, as reflected in the authorizing legislation.

The initial focus of SACs was to coordinate state-level data collection, act as a statistical clearinghouse, and assist the federal government with justice statistics series through contributions of state data or statistics.
SAC involvement in statistical collections has historically focused, and remains focused, on state-level needs, although either direct participation in BJS programs (e.g., contributing data) or indirect participation (facilitating participation by other state agencies) has remained central to the program. As of 2008, all states and several U.S. territories had a designated SAC, though their forms and functions vary; Box 4-1 describes the types of structures that exist in the current SAC network.

Box 4-1 State Statistical Analysis Center Network

The Justice Research and Statistics Association (JRSA) defines state Statistical Analysis Centers (SACs) as state-level units or agencies "that use operational, management, and research information from all components of the criminal justice system to conduct objective analyses of statewide and systemwide policy issues" (http://www.jrsa.org/sac/aboutsacs.html). Individual SACs receive financial support through the Bureau of Justice Statistics' (BJS's) State Justice Statistics Program, as does JRSA; some SACs receive additional funding from their state governments. JRSA is a nonprofit organization of SAC directors that provides a central staff and coordination effort for SAC activities; it hosts an annual research conference (with BJS funding), publishes the journal Justice Research and Policy, and maintains databases of SAC activities.

SACs vary in their organizational structure and standing with respect to the state government, falling into a few basic categories:

• Independent State Justice Information Center: For instance, the Illinois SAC (created in 1977) is the research and analysis unit of the Illinois Criminal Justice Information Authority (ICJIA). The ICJIA was created as an independent state agency by executive order in 1982, inheriting functions from a former Illinois Law Enforcement Commission. Other states following similar models as of 2008 are Alabama and Arkansas.

• Unit of State Crime Commission or Planning Commission: For example, the Kansas SAC is a unit of the state sentencing commission and Montana's is part of the state Board on Crime Control; both Georgia and the District of Columbia have SACs affiliated with the Criminal Justice Coordinating Council in those jurisdictions. Other states following this model include Arizona, Indiana, Louisiana, Maryland, Nebraska, Oregon, Pennsylvania, Rhode Island, and Utah.

• Unit of State Law Enforcement Agency or Justice Department: For this most common organizational structure, examples include California (branch of the state justice department), Missouri (part of the state highway patrol), and Tennessee (unit of the Tennessee Bureau of Investigation). Echoing BJS's administrative position in the U.S. Department of Justice, both the Minnesota and South Carolina SACs are part of an Office of Justice Programs in those states' public safety departments; similarly, Idaho's SAC is housed in the Office of Planning, Grants, and Research of the Idaho State Police. Other states following this basic model include Colorado, Florida, Hawaii, Kentucky, Massachusetts, New Hampshire, New Jersey, New York, North Carolina, North Dakota, Ohio, South Dakota, Virginia, and West Virginia.

• Unit of Other State Agency or Entity: Examples include the SACs in Delaware (part of the state Office of Management and Budget), Iowa (Department of Human Rights), Oklahoma (Legislative Service Bureau), Washington (Office of Financial Management), and Wisconsin (Office of Justice Assistance). The Texas SAC was reestablished, after some absence, by executive order in 2007, and is housed in the Office of the Governor.

• Affiliation with Academic Department or Institute: In some states, the SAC is directly affiliated with a university or university-affiliated research institute, and the SAC director or staff may be faculty members. The states organized in this manner (with their host institutions) are Alaska (University of Alaska Anchorage), Connecticut (Central Connecticut State University), Maine (University of Southern Maine, in partnership with the state Department of Corrections), Michigan (Michigan State University), Mississippi (University of Southern Mississippi), Nevada (University of Nevada, Las Vegas), New Mexico (University of New Mexico), Vermont (the nonprofit Norwich Studies and Analysis Institute, affiliated with Norwich University), and Wyoming (University of Wyoming).

SACs also operate in the Northern Mariana Islands and Puerto Rico.

The funding program for SACs was reformulated from a clearinghouse orientation to a research-oriented program in 1996 under the leadership of then-BJS Director Jan Chaiken. The SJS program has since provided BJS an avenue for fostering data collection and analysis in areas consistent with BJS and Justice Department priorities. This approach has been particularly useful in collecting data on emerging local or regional issues, which often are of greater concern in specific parts of the country and particular states before becoming a national issue. States continue to assist BJS as a liaison for data collection efforts, enhancing state and local analytical efforts and analyses that demonstrate the utility of various systems (e.g., the National Incident-Based Reporting System, criminal history records, etc.). Data and policy priorities of BJS are reflected in the substantive areas under which SACs may apply for SJS program support. The "themes" in the annual SJS solicitation include issues related to BJS initiatives in an array that also allows states to focus on problems of more immediate state, local, or regional concern. Themes from the 2008 SJS program solicitation for SACs are enumerated in Box 4-2.

4–A.1 State Partnerships in the Federal Statistical System

In examining BJS's partnerships, it is useful to consider some exemplars from other statistical agencies; we describe some of these arrangements in the following list.
Box 4-2 State Justice Statistics Program Themes, 2008

(1) Deaths in Police Custody Reporting—Obtaining statewide data on deaths occurring in the process of arrest or in pursuit of arrest. [The Bureau of Justice Statistics (BJS)] continues to request assistance from State [Statistical Analysis Centers (SACs)] to obtain specified data on these deaths and report them quarterly to BJS. [Applicants] wishing to address this theme may utilize SJS to establish a long-term reporting process, rather than a one-time study.

(2) Prison rape and victimization in confinement facilities—improving the quality of administrative data involving criminal acts within adult and juvenile facilities. [BJS] continues to encourage SACs to examine the quality of their State administrative records and, where feasible, provide recommendations for the improvement of the quality and accuracy of these data.

(3) Criminal justice system crisis planning. The SAC may wish to pursue research or data collection to support criminal justice system planning for dealing with major crises, disorders, or other catastrophic incidents. [Examples include] prisoner relocation and/or alternative housing needs [and] backup records systems in the courts or other entities.

(4) Increased Web access to data. SJS funds could be used by the SAC for Internet infrastructure development, enhancements, and linkages, including building a World Wide Web site, computer support, and preparing reports for dissemination via the Internet.

(5) Performance measurement. SJS funds could be used by the SAC to help States develop and improve performance measures and the tools available to agencies to assess progress in addressing public safety and administration of justice goals.

(6) Analyses utilizing a State's criminal history records. BJS encourages SACs to utilize the State's criminal history records for research purposes. In particular, the SAC may wish to seek SJS funds to support studies of:
a) Patterns of criminal behavior such as sex offending, stalking, or domestic violence;
b) Arrests, prosecutions, and convictions for firearms-related offenses;
c) Prisoner and/or probationer recidivism, including rates of rearrest, reconviction, and return to correctional custody;
d) The implementation and/or impact of programs such as drug courts, prisoner reentry initiatives, or specialized probation programs; or
e) The implementation and/or impact of a State's criminal history record improvement activities.
At most one topic may be proposed in this thematic area. Funds may be requested to establish the technical capacity to conduct criminal history records-based research. The application must either state that the applicant is also the State's administrator of [National Criminal History Improvement Program (NCHIP)] funds or include a letter or memorandum of endorsement from the State agency administering NCHIP funds.

(7) Statewide crime victimization surveys.

(8) Analysis of the uses of new or emerging biometric technologies to improve the administration of criminal justice. SJS funds may be used by the SAC to support research which describes and examines the uses of new or emerging biometric technologies (DNA evidence collection/analysis, facial recognition, etc.) to improve the administration of criminal justice in a State.

(9) Research using incident-based crime data that are compatible with the National Incident-Based Reporting System (NIBRS). [SJS] funds under this theme may be used to examine the utility of linking NIBRS incident reports to a State's criminal history records for research purposes.

(10) Data collection and/or research examining a special topical area:
(a) Minority overrepresentation in the criminal or juvenile justice systems. ...
(b) Civil justice. SJS funds may be used by the SAC in developing estimates of the number and characteristics of tort, contract, and real property cases and the dispositions of those cases for both adjudicated and settled civil matters. The longer-term objective might be to estimate change over time within the State in the nature of case issues, judgments, and awards and to evaluate the impact of civil justice reforms such as capping punitive awards or medical malpractice mediation boards.
(c) Cybercrime. SJS funds could be used by the SAC to examine the magnitude and consequences of computer crime and identity theft and fraud.
(d) Human trafficking.
(e) Justice issues in Indian Country.
(f) Criminal activity in U.S. border areas.
(g) Violent crime in schools.
(h) The impact of substance abuse on State and/or local criminal justice and public health systems.
(i) Family violence and/or stalking.

(11) Evaluation of prisoner reentry initiatives and programs.

(12) Other theme or topic identified by the SAC. SJS funds may be used by the SAC to support research examining another theme or topic provided the application is accompanied by persuasive documentation and justification that the subject is a top priority for the State's Governor or criminal justice policy officials.

SOURCE: Excerpts from Bureau of Justice Statistics (2008f); emphases in the original.

In doing so, we emphasize that this is a selective list meant to describe a range of approaches, rather than a complete canvass of federal-state partnerships, and that no assessment of the quality of the data produced by the systems is implied. Unless otherwise indicated, cost information in the following list is from the fiscal year 2008 edition of the U.S. Office of Management and Budget (OMB) publication Statistical Programs of the United States Government (U.S. Office of Management and Budget, 2007):

• The Behavioral Risk Factor Surveillance System (BRFSS) is a collaborative data collection system maintained by the Behavioral Surveillance Branch of the Centers for Disease Control and Prevention (CDC). We described the BRFSS in our first report (National Research Council, 2008b:Box 4-1 and Section 4–B) as the basis for one possible design alternative for the National Crime Victimization Survey (NCVS). As a federal-state partnership, the BRFSS follows a contracting model: the CDC executes contracts with state health agencies, paying them to conduct monthly telephone interviews and administer a core questionnaire. In return, the states get processed returns in the form of state-level estimates (by design, the BRFSS is an amalgam of state samples and is not meant to be a nationally representative sample); they also have the latitude to add their own topic supplements.
This form of partnership—in which the federal agency exercises strong control over content and resulting data but pays the states to provide data collection—is expensive. Of the $453.1 million estimated to be spent in fiscal year 2008 on purchasing statistical services from state and local governments, CDC's $162.2 million (not including activities of the National Center for Health Statistics [NCHS], which is a component of CDC) is the largest single share.

• The Vital Statistics program of NCHS compiles information from birth and death certificates that have been collected by state health (vital registration) departments. NCHS's fiscal year 2008 allocation for statistical purchases from state governments is about $18.6 million—lower spending relative to the BRFSS because of the different relationship between the federal agency and the localities. The law directs that (42 USC § 242k(h)(1)):

There shall be an annual collection of data from the records of births, deaths, marriages, and divorces in registration areas. The data shall be obtained only from and restricted to such records of the States and municipalities which [NCHS] determines possess records affording satisfactory data in necessary detail and form. . . . Each State or registration area shall be paid by [NCHS] the Federal share of its reasonable costs (as determined by [NCHS]) for collecting and transcribing (at the request of [NCHS] and by whatever method authorized by [NCHS]) its records for such data.

Hence, the collection costs of vital statistics data are largely assumed at the state level (with some reimbursement of "the Federal share of the reasonable cost"). NCHS's costs involve compilation and processing, as well as the promulgation of standards. This model has its difficulties, because disagreements over the costs of providing the data can put a damper on participation. In recent years, NCHS has struggled to achieve cooperation by state health departments (and the county health departments and local facilities coordinated by the states) in adopting new standard formats for birth and death certificates. Participation in supplying divorce records declined sufficiently that NCHS abandoned their collection in 1996, and the most recent comprehensive study by NCHS of marriage and divorce information from vital records is based on 1989–1990 data (see http://www.cdc.gov/nchs/mardiv.htm).

• The Bureau of Labor Statistics (BLS) allocation for purchases of statistical services from states and localities was the second largest among statistical agencies in fiscal year 2008, estimated at $96 million.
BLS operates federal-state cooperative arrangements through its regional offices; limited contracts for collection and sharing of employment data between BLS and individual states had been crafted as early as 1916, but the establishment of the regional offices in 1942 formalized those arrangements (Hines and Engen, 1992). Today, the BLS Federal-State Cooperative Programs (administered through six regional offices) encompass a number of labor market information programs, including surveys and records collections on occupational safety and health.1

1 In recent years, BLS has switched from operating eight regional offices to six: its Kansas City, Missouri, regional office was merged with the Dallas office and the operations of its Boston and New York offices were consolidated into a Boston (Northeast) office. However, New York City still retains a Regional Office for Economic Information and Analysis headed by a regional commissioner.

As an operational model, BLS's Federal-State Cooperative Program is closer to the contract-driven BRFSS model than the vital statistics example, involving contractual agreements to collect and transfer specific data series.

• The Census Bureau employs numerous mechanisms to work with state and local governments, including the coordination of field activities through 12 regional offices. Two of its major partnerships are of particular interest because they share a similar structure with the BLS model. Between 1967 and 1973, the Census Bureau formalized loose arrangements with state agencies to create the Federal-State Cooperative Program for Population Estimates, under which states supply some of the raw information (vital statistics data on births and deaths, estimates of prison population, and other records) needed to update decennial census information and generate intercensal population estimates. By 1979, a parallel Federal-State Cooperative Program for Population Projections was forged to build collaboration with the states on the production of population forecasts.

• As an agency, the National Agricultural Statistics Service (NASS) of the U.S. Department of Agriculture (USDA) dates back to the 1961 formation of a Statistical Reporting Service, but its roots—and partnership with states—are more extensive. Wisconsin was the first state to enter into a memorandum of understanding (MOU) for data gathering and sharing with USDA in 1917; over the years, similar MOUs were executed with state agriculture departments, land grant universities, and other agricultural entities, and all states currently have an MOU on file with NASS (Dantzler, 2008). The defining characteristic of NASS's state partnerships is the high degree to which labor and other resources are shared, by means of a third party.
In 1972, NASS established an agreement with the National Association of State Departments of Agriculture (NASDA) under which NASDA bears the principal costs of data collection, including salaries and travel expenses of about 3,700 field interviewers. Field work is coordinated through NASS's 46 field offices,2 which are staffed by a mix of federal/NASS employees (675 total) and state "cooperator" employees (151 total). Because of the intermediary role of NASDA, OMB tabulations indicate that NASS purchases no statistical services directly from state and local governments, but rather from a private-sector entity ($29 million, out of NASS's $167.7 million total estimated budget).

2 All states are covered by the NASS-state partnerships; however, the New England states are coordinated through a single regional office in Concord, New Hampshire.

BJS's state partnerships do not correspond neatly with any of these organizational models. The NCHIP and related grant programs provide relatively unfettered funding to the states, not the more formal data collection contracts executed by the other agencies. In its SAC network, BJS's system bears some similarities to the NASS arrangement, including the presence of a third-party coordinator (NASDA for NASS and JRSA for the SACs), but is neither as formal nor as intensive as the state agricultural arrangements. In large part, the more limited role of the SACs is dictated by basic logic and the breadth of BJS's scope: in the criminal justice arena, there is no ideal, single, state-level point of contact with which a strong data collection arrangement can be brokered, because of the differences in state justice organizations. That is, state police or public safety departments may be distinct from state corrections departments, which in turn are distinct from state court systems, which are distinct from state victims' offices and other related agencies, all of which are distinct from their local or large-city equivalents.

As noted in our first report (National Research Council, 2008b:60), BJS has provided technical support to state and local agencies as well as brokering partnerships to disseminate and collect data. This is particularly true of its development of software tools for local data collection. BJS developed, and made freely available to state and local agencies, software for conducting victimization surveys, using the NCVS (including the detailed Incident Report) as a template. The software has since been relabeled Justice Survey Software and is administered by SureCode Technologies, and remains available at http://www.bjsjss.org; templates have been added for victimization surveys conducted in individual states, as well as for BJS's National Survey of Prosecutors, Police-Public Contact Survey supplement, and State Court Processing Statistics inventory.
More recently, this survey-building software has been ported to the Web as "BJScvs" and made available to state and local agencies through the website http://www.bjscvs.org (Justice Research and Statistics Association, 2006b:5).

4–A.2 Assessment

Some of the basic benefits of a strong and active partnership between BJS and its state SACs can be listed in brief:

• SACs are familiar with state, local, and often regional justice issues and can provide context for federal initiatives.

• Federal-state cooperation promotes consistency in definitions and concepts across data collections, which in turn has the benefit of facilitating more effective comparisons between jurisdictions and agencies. With partnerships, BJS is also in a position to provide guidance to states and localities on common standards for data quality, measurement, and analysis.

• The response to crime is predominantly local and state in nature, hence

the need for familiarity with justice systems, data, and agencies at this level. The SACs are often able to facilitate access to key agencies, data systems, and collection mechanisms that benefit federal statistical system efforts.

• States benefit from a strong state-federal partnership in several key ways:

– BJS is able to provide technical assistance that would not otherwise be available to states, or that might be cost-prohibitive for any one jurisdiction to obtain.

– Although the financial benefit of the SAC program has not been large, states have been able to leverage BJS assistance (including NCHIP) for system, data quality, and analytical enhancements that might otherwise not be available from state resources alone.

– A major benefit for BJS and the nation is improvement in justice data systems and a national perspective on crime and justice. Policy development in the states often requires benchmarking and an understanding of crime on the national scene; BJS is uniquely situated to provide this perspective.

• BJS benefits from a strong relationship with the SACs through the inventiveness of research performed at the state level and through the states' direct contact with issues of local interest; feedback from SAC partners and successes with state-level activities can inform the development of national-level data collections.

The capacity of the state partnerships to assist BJS to more nimbly meet new data needs was clearly illustrated by the data collection efforts set up to comply with the Deaths in Custody Reporting Act of 2000 (P.L. 106-297). As we also discuss in Box 3-3 and Section 3–B.3, the act tasked BJS with quarterly data collection on incidents involving the death of persons while in criminal justice system physical custody. BJS was able to use its ongoing relationship with state Departments of Correction to collect data on deaths occurring while in correctional custody. However, a data collection system for deaths occurring in law enforcement or other justice system custody proved more problematic, given significant variation in the ability to identify and capture data on incidents even within states. SACs assisted in developing data collection systems initially and in some cases continue to assist BJS in preparing reports for the Deaths in Custody project mandated by Congress; it remains one of the suggested program themes in the 2008 program solicitation.

In the panel's assessment, the BJS-state SAC network stands as a relatively low-cost activity on BJS's part with great dividends in terms of outreach and feedback, as well as dissemination of data and products to state policy makers. Consistent with other recommendations we make in Chapter 5 on

BJS developing mechanisms for securing external advice, BJS's good work in establishing state-level ties through the SACs, coordinated through JRSA, should continue to be a high-priority activity for the agency.

Finding 4.1: BJS's state Statistical Analysis Center (SAC) program has cultivated a strong federal-state relationship, relative to other federal statistical agencies. Development of the SAC network—which provides points of contact across the justice system to facilitate research on individual data series, dissemination of BJS information, and coordination of activities—has involved forging unique relationships adapted to state environments (for instance, whether the SAC is part of a state law enforcement department or is housed at a university).

Implicit in this finding is the determination that BJS's SACs are appropriately positioned and that the heterogeneity of organizational structures across the SACs is a strength of the program. However, going forward, a challenge that would be useful for BJS and the SACs to consider (together with JRSA and BJS's data collection agents) is finding a stronger role for the SACs in facilitating data collection activities. The Deaths in Custody example is a good one, where the existence of and expertise in the SAC network made it possible to establish a new data series (and respond to a legal mandate) in a short time frame; it would be beneficial to find other avenues where such efficiency can be achieved and where the SACs can serve as an active point of contact or a collaborator in gathering information. We do not suggest by this language that the BJS-SAC relationship be revamped to look more like the vital statistics (dominant state, coordinating federal roles) or the NASS (dual federal-state staffing) models.
Such models are not viable in the justice case because the major state-level operations of interest—law enforcement, corrections, and judiciary—are generally not located within the same department.

Still, acknowledging that the capacity to use the SACs for data collection will always be limited due to the range of types of SACs and the lack of a central justice information authority in many states, good statistical systems are ones in which states are active partners in data systems. BJS's state partnerships—not only the work of the SACs but also its role in administering grants such as NCHIP—give it distinctive possibilities relative to other federal statistical agencies. Through this work, BJS has the ability to subtly but directly affect the quality of the data that it receives as input to its ongoing series and the technical systems used to generate those data.

Recommendation 4.1: Through its Statistical Analysis Center and State Justice Statistics programs, BJS should continue to develop its ties with the states, and more fully exploit the potential for using states as partners in data collections.

It is particularly essential that the state SAC perspective be brought to bear in addressing the points raised in Section 3–F.1 on emphasizing longitudinal structures within series. Tapping state expertise on available data and information systems would be highly beneficial in finding new ways to link existing data sets or to design panel surveys to follow cohorts of persons through the various steps of the justice system. Because states are the most likely immediate consumers and disseminators of small-area data, efforts by BJS to generate subnational measurements from the NCVS or other surveys should certainly be done with active input from the states.

Recommendation 4.2: Developments toward longitudinal and small-area measurement systems should involve state partners who are active in data collection and knowledgeable about state justice systems.

4–B NATIONAL CRIMINAL HISTORY IMPROVEMENT PROGRAM AND RELATED GRANT PROGRAMS

4–B.1 Background Checks and the Development of NCHIP

NCHIP makes grants to states for establishment or upgrading of information systems, in response to a provision in the Brady Handgun Violence Prevention Act of 1993 (P.L. 103-159):

The Attorney General, through the Bureau of Justice Statistics, shall, subject to appropriations and with preference to States that as of the date of enactment of this Act have the lowest percent currency of case dispositions in computerized criminal history files, make a grant to each State to be used—

(A) for the creation of a computerized criminal history record system or improvement of an existing system;

(B) to improve accessibility to the national instant criminal background system; and

(C) upon establishment of the national system, to assist the State in the transmittal of criminal records to the national system.

Shortly thereafter, the National Child Protection Act of 1993 (P.L. 103-209)
added similar grant-making authority with specific reference to improving state computerization and transmittal of criminal history records involving child abuse.

The specific background check system created by the Justice Department in response to this mandate is the National Instant Criminal Background Check System, better known as NICS, which began operating in November 1998. The Brady Act requires federal firearm licensees to check potential firearm purchasers against the NICS database to determine whether the applicant is disqualified from making the purchase. NICS was developed by

the FBI in consultation with federal, state, and local law enforcement agencies; like the UCR program, NICS is administratively housed in the FBI's Criminal Justice Information Services Division in Clarksburg, West Virginia. A query against NICS is an instant check against three separate databases:

• The Interstate Identification Index ("Triple I" or III) is an index of criminal history records, including persons arrested for felonies and some serious misdemeanors. It is an index (including identifying information such as name, gender, race and ethnicity, and date of birth) of criminal histories rather than a full-fledged compilation of records. The basic "instant" query against the index takes only a few seconds and indicates whether arrest records exist for the target person in any state. If the instant query suggests a match, separate record requests (using either the FBI-assigned or state-issued identification numbers coded on the record) retrieve the specific, detailed records corresponding to the individual from state record repositories. Ramker and Adams (2008:9) note that "forty-nine states (Vermont is working toward participation) and the District of Columbia currently participate in III and the system now includes over 66 million criminal records."

• The National Crime Information Center (NCIC) database is a compilation of a wide variety of personal and property records; it includes sex offender and protection order registries, arrest warrant records, and parole and conviction records, as well as records of vehicle or property theft and existing firearm records.

• The NICS index culls from federal and state records to cover information on characteristics that are identified by law (18 USC § 922) as disqualifying a potential firearm purchaser but that are not covered by either the Triple I or the NCIC databases. Notably, these characteristics include immigration (alien) status and mental health history.
A NICS query results in one of three responses: "Proceed," "Denied," or "Unresolved." If the instant check against these databases suggests that the potential purchaser falls into a prohibited category, then the sale or transfer is "denied"; the query itself does not tell the federal firearm licensee (or the potential purchaser) the category or categories that resulted in the disqualification, though the individual has the right to request such information and appeal any inaccurate information.

Exactly why BJS was designated as the administrator of the grant program to support implementation of NICS and related criminal history databases—as opposed to the FBI (which housed the existing record systems) or a purely grant-based agency such as the Bureau of Justice Assistance—is not clear. One possible reason is simply that BJS, as part of OJP, has grant-making authority that the FBI lacks; another is that references to criminal record systems remained among BJS's legally mandated duties (Box 1-2) following its creation from the predecessor LEAA. However the authority came about, BJS made its first grants related to computerized record improvement as early as 1995, and its grantmaking program came to be known as NCHIP.

NCHIP was expanded in scope by the Crime Identification Technology Act of 1998 (codified as 42 USC § 14601(a)), which directed that grants be made to "the State[s], in conjunction with units of local government, State and local courts, other States, or combinations thereof." The intended purpose of these grant monies was to:

establish or upgrade an integrated approach to develop information and identification technologies and systems to—

(1) upgrade criminal history and criminal justice record systems, including systems operated by law enforcement agencies and courts;

(2) improve criminal justice identification;

(3) promote compatibility and integration of national, State, and local systems for—

(A) criminal justice purposes;
(B) firearms eligibility determinations;
(C) identification of sexual offenders;
(D) identification of domestic violence offenders; and
(E) background checks for other authorized purposes unrelated to criminal justice; and

(4) capture information for statistical and research purposes to improve the administration of criminal justice.

Authority for the issuance of these grants was specifically vested in "the Office of Justice Programs relying principally on the expertise of the Bureau of Justice Statistics." As indicated in Box 1-2, the description of BJS's duties under the law was subsequently modified in 2006 to specifically reference NCHIP-related activities.

The 1998 Crime Identification Technology Act served to dramatically increase the scope of information and identification systems eligible for improvement grants.
As detailed in Box 4-3, the act specifically covered a wide array of information systems used by state and local law enforcement agencies, courts, and support agencies, ranging in content and data type from person-level attributes (e.g., sexual offender registration and criminal history records) to graphic images (e.g., scans of fingerprints and images of the toolmarks on ballistics evidence [spent bullets and cartridge casings]). The act also formally defined funds for improving systems for tracking domestic violence and stalking activity, including filed protective orders, as was authorized by amendments to the Violence Against Women Act.3

3 Funds for these purposes are sometimes called by a separate name and acronym—the Stalking and Domestic Violence Records Improvement Program, or SDVRIP—but are administered as part of NCHIP.

Box 4-3
Information Systems Covered by the Crime Identification Technology Act of 1998 and National Criminal History Improvement Program

Grants under this section may be used for programs to establish, develop, update, or upgrade—

(1) State centralized, automated, adult and juvenile criminal history record information systems, including arrest and disposition reporting;

(2) automated fingerprint identification systems that are compatible with standards established by the National Institute of Standards and Technology and interoperable with the Integrated Automated Fingerprint Identification System (IAFIS) of the Federal Bureau of Investigation;

(3) finger imaging, live scan, and other automated systems to digitize fingerprints and to communicate prints in a manner that is compatible with standards established by the National Institute of Standards and Technology and interoperable with systems operated by States and by the Federal Bureau of Investigation;

(4) programs and systems to facilitate full participation in the Interstate Identification Index of the National Crime Information Center;

(5) systems to facilitate full participation in any compact relating to the Interstate Identification Index of the National Crime Information Center;

(6) systems to facilitate full participation in the national instant criminal background check system established under [the] Brady Handgun Violence Prevention [Act] for firearms eligibility determinations;

(7) integrated criminal justice information systems to manage and communicate criminal justice information among law enforcement agencies, courts, prosecutors, and corrections agencies;

(8) noncriminal history record information systems relevant to firearms eligibility determinations for availability and accessibility to the national instant criminal background check system established under [the] Brady Handgun Violence Prevention [Act];

(9) court-based criminal justice information systems that
promote—

(A) reporting of dispositions to central State repositories and to the Federal Bureau of Investigation; and
(B) compatibility with, and integration of, court systems with other criminal justice information systems;

(10) ballistics identification and information programs that are compatible and integrated with the National Integrated Ballistics Network (NIBN);

(11) the capabilities of forensic science programs and medical examiner programs related to the administration of criminal justice, including programs leading to accreditation or certification of individuals or departments, agencies, or laboratories, and programs relating to the identification and analysis of deoxyribonucleic acid;

(12) sexual offender identification and registration systems;

(13) domestic violence offender identification and information systems;

(14) programs for fingerprint-supported background checks capability for noncriminal justice purposes, including youth service employees and volunteers and other individuals in positions of responsibility, if authorized by Federal or State law and administered by a government agency;

(15) criminal justice information systems with a capacity to provide statistical and research products including incident-based reporting systems that are compatible with the National Incident-Based Reporting System (NIBRS) and uniform crime reports;

(16) multiagency, multijurisdictional communications systems among the States to share routine and emergency information among Federal, State, and local law enforcement agencies;

(17) the capability of the criminal justice system to deliver timely, accurate, and complete criminal history record information to child welfare agencies, organizations, and programs that are engaged in the assessment of risk and other activities related to the protection of children, including protection against child sexual abuse, and placement of children in foster care; and

(18) notwithstanding subsection (c) of this section, antiterrorism purposes as they relate to any other uses under this section or for other antiterrorism programs.

NOTE: The network referred to in point 10 is mislabeled in this legislative text; it should be the National Integrated Ballistic Information Network (NIBIN), which is operated by the Bureau of Alcohol, Tobacco, Firearms, and Explosives. See National Research Council (2008a) for additional description of NIBIN.

SOURCE: Excerpted from 42 USC § 14601(b).

The 1998 act established some basic eligibility criteria for the funds, as well as conditions and limitations on their use. To be eligible to receive these funds, states must demonstrate "the capability to contribute pertinent information to the national instant criminal background check systems" and have documented plans for developing integrated information technology systems (42 USC § 14601(c)).
An important condition placed on the funds (42 USC § 14601(e)) is a 5 percent set-aside for BJS study and documentation purposes:

Not more than 5 percent may be used for technical assistance, training and evaluations, and studies commissioned by Bureau of Justice Statistics of the Department of Justice (through discretionary grants or otherwise) in furtherance of the purposes of this section.

Every 2 years since 1989, BJS has sponsored a Survey of Criminal History Information Systems that, in part, serves to measure progress made through NCHIP grants. The survey is conducted by SEARCH, the National Consortium for Justice Information and Statistics. BJS has issued periodic updates on progress in criminal history record improvement (e.g., Bureau of Justice Statistics, 2001) and, in 2005, published a self-review of accomplishments of the NCHIP program (Brien, 2005). However, the only current data series that actually measures uses of and results from NCHIP-covered databases is the Firearm Inquiry Statistics program mandated by the original Brady Act. This program provides summaries of the number of handgun-purchase background checks completed (and failed) each year; see, e.g., Bureau of Justice Statistics (2008a).

In 2004, the General Accounting Office (GAO; later the Government Accountability Office) issued a review of the NCHIP program conducted at the request of the House Committee on the Judiciary. The study concluded that, "using their own funds, as well as NCHIP and other federal grants, states have made much progress in automating their records and making them accessible nationally" (U.S. General Accounting Office, 2004:35). However, the report also warned of both increasing demands for background check services and the costs of upgrading and replacing computer systems infrastructure. Four years later, GAO conducted a second audit of NCHIP, with specific mandates from congressional requesters to describe Department of Justice oversight of the funds. The second review again reported significant progress in automating criminal history records; replying to a review version of the report, BJS indicated technology reporting and better case disposition reporting from court information systems as particular priorities for NCHIP work (Larence, 2008). In addition to the two GAO reviews, the NCHIP program was formally submitted to OMB's Performance Assessment Rating Tool (PART) process in 2003 and deemed to be "moderately effective," the second-highest ranking in the PART framework.4

4–B.2 Recent Law and Developments

In January 2008, the NICS Improvement Amendments Act of 2007 was signed into law (P.L. 110-180). The legislation was developed in the aftermath of the April 2007 mass homicide at Virginia Polytechnic Institute and State University; that shooting is specifically cited in the initial findings section of the act as the act's motivation, along with a 2002 shooting in a church in Lynbrook, New York.
The existing NICS index coverage of mental health history includes only formal determinations by a legal authority (such as a finding of insanity or incompetence to stand trial) or actual commitment to an institution. In the Virginia Tech incident, the perpetrator evidenced a history of mental illness but not the level of formal legal commitment that would be recorded in the NICS index; in the Lynbrook incident, the perpetrator did have a mental health commitment as well as a restraining order against him, but neither of those disqualifying factors was registered in the instant background check. The act seeks to improve the coverage of mental health adjudications and commitments in the NICS databases; it further

4 See http://www.whitehouse.gov/omb/expectmore/detail/10001094.2003.html for the PART summary. The PART evaluation was completed before BJS had developed a "Record Quality Index" to assess the quality of existing systems in individual states in order to better target resources—the sole substantive point on which the PART found fault in the NCHIP program.

requires states to provide records of convictions on misdemeanor domestic violence charges. To do so, the act authorizes the attorney general to make grants "in a manner consistent with the [NCHIP] program" to help states supply these records and generally improve submittal of records for NICS purposes. The act further explicitly directs the director of BJS to conduct ongoing evaluations of NICS5 and to submit two annual reports to Congress, one on the general operations of the background check system and the other on specific practices by the states in assembling and providing the relevant records (identifying and recommending best practices for all states).

Although the new act supported an increased role for NCHIP-type grants, the level of funds appropriated by Congress for NCHIP has declined dramatically in recent years. In fiscal year 2003, BJS had funds to allocate about $47.5 million to states and territories; award totals dropped to about $26 million in fiscal year 2005 and $8.5 million in both fiscal years 2007 and 2008.6

Although NCHIP funding levels may have decreased, recent legislation has also created the possibility for BJS to actually use for research purposes the criminal history record data that its NCHIP grants have helped to improve. The most recent reauthorization of the Department of Justice (P.L. 109-162, which became law in 2006) did three specific things to put BJS in a position where it can actually utilize criminal history record databases for research.
5 A separate title of the act obligates GAO to audit the funds allocated under the act and report to Congress on how they were spent.

6 See http://www.ojp.usdoj.gov/bjs/stfunds.htm, which summarizes the amount of NCHIP awards by fiscal year and by state.

First, it added specific detail to the 19th listed duty of BJS (see Box 1-2), in particular authorizing "statistical research for critical analysis of the improvement and utilization of criminal history records." Second, it vested the director of BJS with responsibility for maintaining the integrity and confidentiality of data in BJS hands: the director "shall be responsible for the integrity of data and statistics and shall protect against improper or illegal use or disclosure" (42 USC § 3732(b)). Third, it expanded existing authority for BJS to request information from federal, state, and local agencies by authorizing BJS to enter into data-sharing agreements: the BJS director shall "confer and cooperate with Federal statistical agencies as needed to carry out the purposes of this subchapter, including by entering into cooperative data sharing agreements in conformity with all laws and regulations applicable to the disclosure and use of data" (42 USC § 3732(d)(6)).

Armed with these new legal authorities, BJS began the process of negotiating access to criminal history records with the FBI. One important, and somewhat complex, step in this process involved arranging for the FBI to issue BJS an Originating Agency Identification number—codes that are normally issued to law enforcement agencies—for the purpose of issuing Triple

I requests for research purposes. In August 2008, BJS entered into a cooperative agreement with Nlets—the International Justice and Public Safety Information Sharing Network—to develop the information technology for BJS to work with Triple I records.[7] Specific tasks to be completed by Nlets—within approximately 1 year—include “provid[ing] BJS with the capability to request/obtain multiple electronic criminal history records at one time” and to “develop and implement a simplified uniform criminal history record format to facilitate BJS’s statistical analysis of the criminal history record information” (Ramker and Adams, 2008:9–10).

4–B.3 Assessment

On one hand, the NCHIP and related grant programs are the easiest targets for criticism in a review of the programs of BJS as a statistical agency, precisely because they are not statistical data collection programs. BJS’s role in the grantmaking programs is generally limited to award and administration of the grant funds, and it does not acquire significant series of data as a direct result of the funds (save for the firearm inquiry counts). Significantly, BJS lacked any access whatsoever to the data systems that the grant funds sought to improve, and the research on quality and content of the resulting record databases has not been commensurate with what one would expect given the 5 percent set-aside for evaluation purposes.

Finding 4.2: The National Criminal History Improvement Program (NCHIP) is a grantmaking program but not directly a statistical collection, even though it is administered by BJS. However, improved criminal history records are important for the prospects of longitudinal analysis of the criminal justice system. Analysis of the National Instant Background Check System serves as one approach to provide the data necessary to evaluate national policy on regulation of firearms purchases.
However, the caveat we noted above—that BJS can signal particular priorities and interests in its solicitation announcements, and so can subtly influence the quality of the information systems that will ultimately be used to generate data on justice system operations—is a real and significant one. BJS’s capacity to let systems improvement grants through NCHIP directly affects the level of entry of criminal history records into a central repository but, over time, also affects the input streams from court processing systems,

[7] As summarized by Ramker and Adams (2008:9), “Nlets is responsible for all interstate exchange of federal and state criminal history records, and operates a national telecommunications infrastructure for this purpose. Nlets is also a member of the FBI’s Joint Task Force for Rap Sheet Standardization and serves as a custodian of the standardized rap sheet layout and national standard format.”

law enforcement booking and case management files, and correctional supervision records. Although the effects may not be as immediate or massive as might be hoped, BJS’s grantmaking authority does put it in a distinctive position relative to other statistical agencies of being able to do something about the quality of source-level data rather than just bemoaning or adjusting for shortcomings in the data.

The developments in 2008 that will, apparently, put BJS in a position to harness criminal history records for research purposes are extremely promising. BJS’s new recidivism, reentry, and special projects unit should be encouraged to be wide ranging in considering the ways in which access to these data can inform studies of histories and “careers” in crime. The use of the records to support an ongoing measure of recidivism and recurrence of criminal behavior—as a relatively low-cost complement to formal panel studies of persons released from correctional supervision—is a solid first step. More generally, access to criminal history records is a linchpin to improving BJS’s collections on longitudinal flows within the justice system, as we discussed in Section 3–F.1.

Recommendation 4.3: BJS should actively utilize the NCHIP program to improve criminal history records necessary for longitudinal studies of crime.

It is appropriate, in this chapter on federal-state partnerships, to observe that BJS can learn from and build on work done in several of its SAC affiliates, some of which (being parts of law enforcement departments) have been able to utilize electronic records in their work. In particular, Burton et al. (2004) provide a good example of the type of analysis that could be done through actual analysis of computerized criminal history records, developing and assessing a measure of “seriousness” of criminal career trajectories.
4–C BJS AND THE UNIFORM CRIME REPORTING PROGRAM

A federal criminal investigative agency within the U.S. Department of Justice, the FBI was formally founded by executive order in 1933, expanding the authority vested in a substantially smaller Bureau of Investigation (founded in 1908) by incorporating key functions from the Bureau of Prohibition. Legislation in 1935 dubbed the agency the “Federal Bureau of Investigation,” the name it has retained since. Significantly for the purposes of this report, the FBI has been engaged in the collection of crime statistics since its earliest days, including through direct collection from state and local law enforcement officials, and so is a natural point of comparison for BJS’s programs.

The general functions of the director of the FBI (and, hence, of the agency) are articulated in Title 28, Section 0.85 of the Code of Federal Regulations. Chief among these is its basic criminal justice role: to “investigate violations of the laws, including the criminal drug laws, of the United States and collect evidence in cases in which the United States is or may be a party in interest.” However, the FBI’s duties also include a provision that creates overlaps with BJS’s responsibilities in several respects: “operate a central clearinghouse for police statistics under the Uniform Crime Reporting Program, and a computerized nationwide index of law enforcement information under the National Crime Information Center” (28 CFR § 0.85(f)). Specifically, the reasons why it is useful to consider the relationship between the FBI and BJS are:

• The summary records from the FBI’s UCR program are published annually as Crime in the United States and are frequently used as a national indicator of crime. In this function as national indicator of the incidence of crime, the UCR is a counterpart to BJS’s NCVS. The two measures differ conceptually and so provide the benefit of offering multiple vantages on the same underlying phenomenon of crime in the United States. However, since the existence of two measures may also commonly be seen as redundant or wasteful, it is important that the relative strengths and weaknesses of the two data sources be well documented and conveyed to the public.

• A component of the UCR summary reporting program also generates administrative information on law enforcement personnel, a point of overlap with BJS’s Law Enforcement Management and Administrative Statistics (LEMAS) series. Specific aspects of coverage of law enforcement in the UCR program are described in Box 4-4. Table 4-1 summarizes further similarities and differences in coverage and content between the UCR and the NCVS.
• As discussed in the preceding section, the FBI maintains and administers national-level criminal history record databases, such as the instant background check database used to screen potential firearm purchasers, through its NCIC. BJS supports state and local law enforcement departments and their capacity to populate these FBI databases through NCHIP and other grants, but it has no “ownership” of—or, until recently, access to, for statistical purposes—the resulting data compiled by the FBI.

4–C.1 Overview of the UCR Program

In 1930, the Justice Department was authorized to “acquire, collect, classify, and preserve identification, criminal identification, crime, and other records” (28 USC § 534(a)(1)). In turn, the attorney general delegated authority for collecting crime information via the UCR program to the FBI.

Box 4-4
Law Enforcement Coverage in the Uniform Crime Reporting Program

As part of the Summary Reporting System of the Uniform Crime Reporting (UCR) program, local law enforcement agencies report summary counts of the number of offenses reported to police, the number and basic characteristics (age, sex, race) of arrestees, and the number of “clearances” for each major (Type I) crime. A clearance is an offense-level attribute, not a person-level count; hence, “Several crimes may be cleared by the arrest of one person, or the arrest of many persons may clear only one crime” (Federal Bureau of Investigation, 2004b:79). Under FBI definitions, an offense can be cleared in only one of two ways:

– Clearance through arrest, or “solved for crime reporting purposes,” occurs when “at least one person is (1) arrested, (2) charged with the commission of the offense, and (3) turned over to the court for prosecution” (Federal Bureau of Investigation, 2004b:79); or

– Clearance through exceptional means, such as when the offender is killed or when a confession is obtained from a person already serving a sentence for another crime. Technically, a crime may be cleared exceptionally if an agency “can answer all of the following questions in the affirmative” (Federal Bureau of Investigation, 2004b:80–81):

1. Has the investigation definitely established the identity of the offender?
2. Is there enough information to support an arrest, charge, and turning over to the court for prosecution?
3. Is the exact location of the offender known so that the subject could be taken into custody now?
4. Is there some reason outside law enforcement control that precludes arresting, charging, and prosecuting the offender?

Even absent full compliance with National Incident-Based Reporting System reporting, law enforcement officials are also asked to supply detailed incident-level information on homicides on monthly Supplementary Homicide Reports.
These data, which are separately tabulated and made available for analysis, include detail on the circumstance of the incident, the type of weapon used in the murder, and what information is known about the relationship between the victim and the offender. In recent years, police departments have also been required to submit quarterly reports of hate crime incidents; these reports, too, include incident-level characteristics such as the type of bias motivation.

On an annual basis, agencies are asked to provide counts of the number of personnel in their employ (total and sworn officers, specifically) as of October 31; these are tabulated and published in the annual Crime in the United States report. In addition, a UCR subprogram asks agencies to submit information on incidents in which officers are killed (feloniously or accidentally) or assaulted in the line of duty. (A record is supposed to be made in cases in which an officer is off-duty but is “acting in an official capacity, that is, reacting to a situation that would ordinarily fall within the scope of his or her official duties as a law enforcement officer” [Federal Bureau of Investigation, 2004b:109].) These data are labeled as Law Enforcement Officers Killed or Assaulted, and are tabulated in a separate publication by the FBI.
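The two clearance paths described in Box 4-4 reduce to simple conjunctions of yes/no conditions: clearance through arrest requires all three of its conditions, and exceptional clearance requires affirmative answers to all four questions. A minimal sketch (argument names are illustrative, not an FBI record layout):

```python
# Sketch of the two UCR clearance paths as boolean logic.
# Argument names are illustrative, not an official FBI data format.

def cleared_by_arrest(arrested: bool, charged: bool,
                      turned_over_to_court: bool) -> bool:
    """Clearance through arrest requires all three conditions."""
    return arrested and charged and turned_over_to_court

def cleared_exceptionally(identity_established: bool,
                          evidence_supports_arrest: bool,
                          offender_location_known: bool,
                          barrier_outside_le_control: bool) -> bool:
    """Exceptional clearance requires all four questions answered 'yes'."""
    return (identity_established and evidence_supports_arrest
            and offender_location_known and barrier_outside_le_control)
```

Note that both tests apply at the offense level, consistent with the caution above that one arrest may clear several offenses and several arrests may clear only one.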

The FBI’s authority for operating the UCR is currently assigned by regulation (28 CFR § 0.85(f), tasking the director of the FBI to “operate a central clearinghouse for police statistics under the Uniform Crime Reporting Program”). Authority for that designation was further affirmed by the Uniform Federal Crime Reporting Act of 1988 (P.L. 100-690 § 7332), which authorized the attorney general to “designate the Federal Bureau of Investigation as the lead agency” for UCR purposes and to establish “such advisory and oversight boards as may be necessary.”[8]

The current UCR program is a cooperative program of law enforcement agencies that produces aggregate data on crimes reported to police. When data collection began in January 1930, about 400 cities contributed information; indeed, national-level estimates of crime rates from the UCR were not issued until 1958 because of incomplete coverage (Maltz, 1999:4). As of 2004, 17,000 law enforcement agencies were participating in UCR (Federal Bureau of Investigation, 2004b:Foreword). Like the NCIC that administers the FBI’s criminal history databases, the UCR program is administratively housed in the FBI’s Criminal Justice Information Services Division in Clarksburg, West Virginia.

For years, the FBI has been in the process of trying to transition from collection of UCR data as has been done for decades—what is now known as the Summary Reporting System (SRS)—to a newer and more detailed system. The National Incident-Based Reporting System (NIBRS) is poised to eventually supplant the SRS but has been slow to develop.

Summary Reporting System

The core content of the SRS inherits directly from the work of a Committee on Uniform Crime Records convened by the International Association of Chiefs of Police (IACP) in 1927.
“Recognizing a need for national crime statistics,” that committee “evaluated various crimes on the basis of their seriousness, frequency of occurrence, pervasiveness in all geographic areas, and likelihood of being reported to law enforcement” (Federal Bureau of Investigation, 2004b:2). Although the labels have changed slightly, the seven crimes identified by the 1927 IACP committee remain the focus of today’s Uniform Crime Reports and are known as “Part I offenses.” Four of these are crimes against persons—criminal homicide, forcible rape, robbery, and aggravated assault—and three are crimes against property: burglary, larceny-theft, and motor vehicle theft. The only substantive change to this list of Part I offenses was made in 1978, when legislation directed that arson be designated a Part I offense; however, arson continues to be reported on a separate form rather than the standard “Return A” used to report the

[8] Provisions of the 1988 act also required the UCR to collect data on federal criminal offenses and to “classify offenses involving illegal drugs and drug trafficking as a part I crime.”

other Part I offenses. The Part I offenses are also known as “index crimes” because they are used to derive a general, national indicator of criminality—the national Crime Index. The index, first computed and reported in 1958, consists of the sum of the seven original Part I offenses, except that larceny is restricted to thefts of over $50.

The FBI instructs agencies to follow a specified “hierarchy rule” in coding offenses for generation of the monthly summary counts. The FBI directs that multiple-offense situations—incidents in which more than one crime is committed simultaneously—are to be handled by “locat[ing] the offense that is highest on the hierarchy list and scor[ing] that offense involved and not the other offense(s)” (Federal Bureau of Investigation, 2004b:10). This rule is described in Box 4-5. BJS uses a similar “seriousness hierarchy” for classification of events in some of its work on the NCVS, with the important distinction that the hierarchy is applied after data collection for generation of some incident count tables. Thus, “the NCVS collects and preserves information for each crime occurring in the incident, which enables researchers to create their own classification scheme.” In comparison, the application of the UCR hierarchy rule at the point of data collection collapses incidents involving several crime types to record just one type, losing the full incident detail (Addington, 2007b:229).

UCR participants are asked to provide monthly reports under the SRS. The basic form tallying the monthly counts of Part I offenses known to law enforcement is known as Return A (arson incidents are reported on a separate monthly return). However, Return A is not the only data collection requested by the FBI and UCR. The SRS also asks participating agencies to complete additional forms.
Unless otherwise noted, all of these supplemental forms are also expected to be completed on a monthly basis by reporting agencies:

• Type and value of property stolen and recovered: A monthly supplement to Return A queries departments for estimates of the value of stolen property in their jurisdictions; in most instances, this is taken to be the reporting victim’s evaluation of the value of the property but may also include estimates researched by the police. Estimates are requested for each of 11 property types (e.g., jewelry, office equipment). A separate table in the supplement asks departments to further classify and total these stolen property incidents (and corresponding values) by the type of crime involved (e.g., if the theft occurred as part of a murder) or the location of the crime (e.g., convenience store, residence).

• Supplementary Homicide Reports (SHRs): Since 1962, reporting agencies have also been asked to complete SHRs for every incidence of murder and nonnegligent manslaughter. SHR data provide a wealth of detail about the particular crime of homicide, including what is

Box 4-5
Hierarchy Rule for Part I Offenses and Suboffenses, Uniform Crime Reporting Program

1. Criminal homicide
   a. Murder and nonnegligent manslaughter
   b. Manslaughter by negligence
2. Forcible rape
   a. Rape by force
   b. Attempts to commit forcible rape
3. Robbery
   a. Firearm
   b. Knife or cutting instrument
   c. Other dangerous weapon
   d. Strong-arm (hands, fists, feet, etc.)
4. Aggravated assault
   a. Firearm
   b. Knife or cutting instrument
   c. Other dangerous weapon
   d. Strong-arm (hands, fists, feet, etc.)
5. Burglary
   a. Forcible entry
   b. Unlawful entry (no force)
   c. Attempted forcible entry
6. Larceny-theft (except motor vehicle theft)
7. Motor vehicle theft
   a. Autos
   b. Trucks and buses
   c. Other vehicles
8. Arson
   a.–g. Structural
   h.–i. Mobile
   j. Other

The Uniform Crime Reporting Program Handbook (Federal Bureau of Investigation, 2004b) defines three major exceptions to use of this hierarchy rule for crime reporting. First, motor vehicle theft—as a special class of larceny, generally—can outrank larceny; hence, the theft of a car with valuables inside it would be coded as a motor vehicle theft (trumping the classification as larceny) even if the vehicle is subsequently recovered but the valuables are not. Arson is also a special case because it is reported on a separate form from the other Part I offenses: multiple-offense crimes involving arson can include two reported Part I offenses, the arson tally on the separate schedule and the highest-ranking Part I offense under the usual rule reported on Return A. The third exception to the hierarchy rule is justifiable homicide, “defined as and limited to the killing of a felon by a police officer in the line of duty [or] the killing of a felon, during the commission of a felony, by a private citizen.” By this definition, justifiable homicide necessarily “occurs in conjunction with other offenses”; those offenses are the ones to be considered in classifying the incident.
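In computational terms, the hierarchy rule is a minimum-over-rank selection among the Part I offenses present in an incident. A minimal sketch (the rank table abbreviates Box 4-5 to the seven Return A offenses; the motor-vehicle-theft and justifiable-homicide exceptions are omitted for brevity):

```python
# Part I offense ranks per the UCR hierarchy rule (1 = highest).
# Arson is excluded: it is tallied on a separate schedule and may be
# reported alongside the highest-ranking other offense.
HIERARCHY_RANK = {
    "criminal_homicide": 1,
    "forcible_rape": 2,
    "robbery": 3,
    "aggravated_assault": 4,
    "burglary": 5,
    "larceny_theft": 6,
    "motor_vehicle_theft": 7,
}

def score_incident(offenses):
    """Return the single offense scored for a multiple-offense incident,
    or None if the incident contains no Part I offense."""
    part_one = [o for o in offenses if o in HIERARCHY_RANK]
    if not part_one:
        return None
    # "Locate the offense that is highest on the hierarchy list and
    # score that offense ... and not the other offense(s)."
    return min(part_one, key=HIERARCHY_RANK.get)
```

For example, a burglary during which the intruder also commits aggravated assault is scored once, as aggravated assault. The contrast with the NCVS noted above is that this collapsing occurs at data collection, so the discarded offense types are unrecoverable from the summary counts.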

known about the relationship between the victim(s) and offender(s), the circumstances of the killing, and the use of weapons. Similar information is also requested for incidents of negligent manslaughter, though traffic fatalities and accidental deaths are not included in this accounting.

• Age, race, and sex arrest data: On a monthly basis, agencies are asked to provide counts of completed arrests by the age, race, and sex of the arrestee(s). These data are counts aggregated by type of crime, not individual records per crime incident; moreover, separate counts are requested by age group (16 groups; individual ages 18–24, 5-year groups from 25 to 64, and 65 and over) crossed with sex, but not by race. Totals by race group, four categories, are tallied separately. The age, race, and sex data are requested for Part II offenses as well as the Part I offenses considered in Return A, making these data the UCR’s only systematic source of information on these offenses as well as the only source of offender attributes.[9] A separate schedule is used to count crime totals for juveniles (under 18 years of age).

• Law Enforcement Officers Killed and Assaulted (LEOKA): Collected on monthly forms and published annually since 1972, the LEOKA data are intended to count line-of-duty deaths or assaults of law enforcement officers. “Line-of-duty” does not mean “on duty” but rather that the officer is acting in an official capacity, responding to a situation that would normally fall within official duties. An eight-page follow-up questionnaire is used to provide additional information on LEOKA incidents in which a firearm or a knife (or other cutting instrument) was used against the officer.
• Hate crime statistics: The Hate Crime Statistics Act of 1990 led to the collection of a variable on “bias motivation in incidents in which the offense resulted in whole or in part because of the offender’s prejudice against a race, religion, sexual orientation, or ethnicity/national origin” (Federal Bureau of Investigation, 2004b:3). The scope of hate crimes reported in this series was expanded in 1994 to include crimes motivated by victims’ physical or mental disability. Aggregate counts of hate crime incidents are reported on a quarterly basis; a one-page report is also requested for every specific incident, recording the offense type, location, type of bias motivation, victim type, and number and race of known offenders.

• Law enforcement employees report: On an annual basis, UCR participant agencies are sent a form to provide a count of full-time sworn and civilian personnel on the payroll as of October 31. Though the Uniform Crime Reporting Handbook provides considerable detail on who should and should not be included in the count (Federal Bureau of Investigation, 2004b:124), only the aggregate employee count is requested.

[9] The 21 offenses currently tallied as Part II offenses are other assaults; forgery and counterfeiting; fraud; embezzlement; stolen property (buying, receiving, possessing); vandalism; weapons (carrying, possessing, etc.); prostitution and commercialized vice; sex offenses; drug abuse violations; gambling; offenses against the family and children; driving under the influence; liquor laws; drunkenness; disorderly conduct; vagrancy; all other offenses; suspicion; curfew and loitering laws (persons under 18); and runaways (persons under 18) (Federal Bureau of Investigation, 2004b:8).

The UCR is, ultimately, a program in which participation is voluntary; consequently, UCR coverage by reporting law enforcement agency can be spotty. Complete nonresponse to the UCR program, for individual years or for long stretches of time, occurs and is sometimes pervasive, even for large states and localities. Gaps in UCR coverage have been described most thoroughly by Maltz (1999, 2007). The FBI uses imputation to bridge some of these gaps for deriving national-level estimates.

The status of UCR estimates as crimes officially reported to police imparts a veneer of legitimacy in some respects. For example, government grant programs that use crime information in scoring areas or allocating funds typically use UCR numbers; these include “renewal community” funds from the U.S. Department of Housing and Urban Development (24 CFR § 599.303) and even Justice Department grants to correctional facilities (28 CFR Part 91). In the latter example, the measure of “Part 1 violent crimes” required in grant submissions is defined by 28 CFR § 91.2(c) as those “reported to the Federal Bureau of Investigation for purposes of the Uniform Crime Reports. If such data [are] unavailable, Bureau of Justice Statistics (BJS) publications may be utilized.” It is also worth noting that the FBI’s duties under current regulation also include “carry[ing] out the Department’s responsibilities under the Hate Crime Statistics Act” (28 CFR § 0.85(m)).
That is, it is the UCR measure of hate crimes—and not any product of BJS—that is used as the official, legally mandated measure of hate crime prevalence.[10]

[10] A BJS report by Harlow (2005) acknowledges the legal distinction, noting that “the Attorney General delegated data collection of hate crimes principally to the FBI.” The report compares the NCVS and UCR measures of hate crimes, using the NCVS to provide detail on the circumstances and characteristics of such attacks; the analysis suggests that less than half (44 percent) of hate crimes are reported to police, with no significant difference in reporting of hate- and nonhate-related crimes of the same violent crime type (Harlow, 2005:4). Direct comparison of the NCVS and UCR measures suggests general similarity on some characteristics (categories of offenses, reported motivation) but some differences in demographics (age and race of victim) and some contextual factors (use of weaponry in the incident).
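For context on the imputation mentioned above, Maltz (1999) describes the FBI's longstanding practice of annualizing partial reporters by scaling their reported months up to a 12-month total, while agencies with very few reported months are instead imputed from similar agencies. The following is a simplified sketch of the scaling step only (the 3-month threshold and simple proportional scaling are assumptions drawn from that description, not the FBI's production procedure):

```python
def annualize(monthly_counts, min_months=3):
    """Scale an agency's partial-year offense counts to a 12-month estimate.

    monthly_counts: 12 entries, with None marking unreported months.
    Returns None when too few months are reported to scale credibly;
    such an agency would instead be imputed from similar agencies.
    Simplified sketch of the approach described by Maltz (1999), not
    the FBI's production procedure.
    """
    reported = [c for c in monthly_counts if c is not None]
    if len(reported) < min_months:
        return None
    # Proportional scaling: assume unreported months resemble reported ones.
    return sum(reported) * 12 / len(reported)
```

The key statistical assumption, criticized in the literature on UCR gaps, is that an agency's missing months resemble its reported ones.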

National Incident-Based Reporting System

Development of NIBRS dates to the publication of a joint BJS-FBI task force study (Poggio et al., 1985). Recommendations in this Blueprint for the Future of the Uniform Crime Reporting Program led to pilot work with several law enforcement agencies in South Carolina in 1987 and the presentation of the new NIBRS concepts at a national UCR conference in March 1988; the first NIBRS data were received by the FBI in January 1989, and the first NIBRS data for public use were released in 1998 (for 1996 data). The Uniform Crime Reporting Program Handbook (Federal Bureau of Investigation, 2004b:3) notes:

    The intent of NIBRS is to take advantage of available crime data maintained in modern law enforcement records systems. Providing considerably more detail, NIBRS yields richer and more meaningful data than those produced by the traditional summary UCR system. The conference attendees recommended that the implementation of national incident-based reporting proceed at a pace commensurate with the resources and limitations of contributing law enforcement agencies.

The handbook also summarizes the basic content of NIBRS as follows:

    NIBRS collects data on each incident and arrest within 22 offense categories made up of 46 specific crimes called Group A offenses. For each incident known to police within these categories, law enforcement collects administrative, offense, victim, property, offender, and arrestee information. In addition to the Group A offenses, there are 11 Group B offenses for which only arrest data are collected.

The NIBRS incident report is quite intricate and allows for great flexibility in the coding of individual events: each report can include up to 10 offenses, 3 weapons, 10 relationships to victim, and 2 circumstance codes.
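Those per-report caps suggest the general shape of an incident record. The container below is purely illustrative—it mirrors the cardinality limits quoted above but is not the actual NIBRS segment or flat-file layout:

```python
from dataclasses import dataclass, field

# Per-report caps quoted in the text; illustrative, not the NIBRS spec.
CAPS = {
    "offenses": 10,
    "weapons": 3,
    "victim_relationships": 10,
    "circumstance_codes": 2,
}

@dataclass
class IncidentReport:
    offenses: list = field(default_factory=list)
    weapons: list = field(default_factory=list)
    victim_relationships: list = field(default_factory=list)
    circumstance_codes: list = field(default_factory=list)

    def validate(self) -> None:
        """Raise ValueError if any segment exceeds its per-report cap."""
        for name, cap in CAPS.items():
            n = len(getattr(self, name))
            if n > cap:
                raise ValueError(f"too many {name}: {n} > {cap}")
```

Unlike the SRS hierarchy rule, all offenses in an incident (up to the cap) are retained, which is precisely the added analytic value of incident-based reporting.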
Work on NIBRS has generally concentrated on implementation and in getting additional agencies to use the new format, rather than refinement of the NIBRS instruments themselves. An explanatory webpage published by the FBI observes that (http://www.fbi.gov/ucr/cius2006/about/about_ucr.html):

    In the late 1980s, the FBI committed to hold all changes to the NIBRS in abeyance until a substantial amount of contributors implemented the system. [However,] three modifications have been necessary. To meet growing challenges in the fight against crime, the system’s flexibility has permitted the addition of a new data element to capture bias-motivated offenses (1990), the expansion of an existing data element to indicate the presence of gang activity (1997), and the addition of three new data elements to collect data for law enforcement officers killed and assaulted (2003).

Coverage continues to be a problem for NIBRS usage; “in the official 2005 NIBRS data released through [the Inter-university Consortium for Political and Social Research], only about 0.3% of all of the NIBRS reporting jurisdictions for 2005 fall into the 500,000 and 999,999 population category, reporting slightly over 12% of the NIBRS incidents. There are no cities with populations over 1,000,000 reporting data through NIBRS” (Faggiani, 2007:3). As of September 2007, JRSA estimated that only about 25 percent of the nation’s population is included in NIBRS-compliant jurisdictions; see http://www.jrsa.org/ibrrc/background-status/nibrs_states.shtml [12/1/07]. In all, about 26 percent of agencies that supply data to the UCR do so using the NIBRS format. Among the states that have not yet implemented NIBRS are California, New York, and Pennsylvania; in Illinois, the only NIBRS participant to date is the Rockford Police Department. Five states—Alaska, Florida, Georgia, Nevada, and Wyoming—have not yet specified any formal plan for participation in NIBRS.

4–C.2 BJS Role in UCR and NIBRS

BJS plays no role in the collection or dissemination of UCR or NIBRS data. However, it has issued grants over the past decade to promote the transition to NIBRS reporting by law enforcement agencies (particularly those in larger states and metropolitan areas). In part, it was BJS’s work in the area of administering grant monies by block grant—a duty assigned to it by a 1994 law—that focused some attention on the limitations of UCR data; this led to a period in which BJS issued grants specifically for NIBRS improvement. As Maltz (1999:7, 9) recounts:

    In 1994, in reauthorizing the Omnibus Crime Control and Safe Streets Act of 1968, the U.S. Congress appropriated anticrime funding for jurisdictions under the Local Law Enforcement Block Grant Program. The amount of funds received by a jurisdiction was to be based on the number of violent crimes they had experienced in the 3 most recent years (1992–94). According to the statute, the UCR was to be the source of the crime data.
    [This action was significant because it] marked the first time that funding decisions were to be made on the basis of the data in the UCR. BJS was called upon to develop the allocation formula, based on UCR data, to divide funds among localities. As intended by the law, BJS used the actual raw crime data as reported by each police agency to the FBI, rather than the imputed data, in the allocation formula. But in reviewing the raw UCR data, BJS immediately recognized their limitations: Of the 18,413 police agencies that reported to the FBI in 1992–94, 3,516 (19%) did not provide crime data for any month during the 36-month period used in the formula and another 3,197 (17%) reported between 1 and 35 months.
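The core of such an allocation formula is a proportional share of the appropriation based on each jurisdiction's 3-year violent crime total. A bare-bones sketch (the statutory formula included provisions not shown here; jurisdiction names are hypothetical):

```python
def allocate(total_funds, violent_crime_totals):
    """Divide grant funds among jurisdictions in proportion to their
    3-year violent crime counts (simplified proportional share;
    the actual statutory formula had additional provisions)."""
    grand_total = sum(violent_crime_totals.values())
    if grand_total == 0:
        return {j: 0.0 for j in violent_crime_totals}
    return {j: total_funds * n / grand_total
            for j, n in violent_crime_totals.items()}
```

The missing-data figures quoted above show why the choice of raw over imputed counts mattered: a jurisdiction reporting only part of the 36-month window would see its proportional share understated relative to complete reporters.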

Hence, BJS worked with the FBI on improving its imputation procedures, including convening an expert conference on the topic.[11] As states struggled with implementation of NIBRS reporting standards, BJS’s experience in information technology improvement grants and its basic authority to let local assistance grants led it to administer specific grants to states in the late 1990s to improve their crime reporting capabilities to conform to NIBRS standards.

The “Data Online” and “Data for Analysis” sections of BJS’s website provide users with the ability to generate custom tables from UCR data from 1985 through the most recent year of release, and a separate tool generates tables from the UCR SHRs. Curiously, the BJS site provides no such direct tabulations or estimates based on its own NCVS data.

4–C.3 The UCR Program and the FBI’s Strategic Priorities

Since September 11, 2001, the FBI has largely recast its mission as one of deterring and preventing acts of terrorism. The bureau’s current strategic plan (Federal Bureau of Investigation, 2004a:9) identifies eight priorities, in descending order:

1. Protect the United States from terrorist attack;
2. Protect the United States against foreign intelligence operations and espionage;
3. Protect the United States against cyber-based attacks and high-technology crimes;
4. Combat public corruption at all levels;
5. Protect civil rights;
6. Combat transnational and national criminal organizations and enterprises;
7. Combat major white-collar crime; [and]
8. Combat significant violent crime.
The plan further identifies two "key enabling functions that are of such importance they merit inclusion": "Support federal, state, local, and international partners" and "Upgrade technology to successfully perform the FBI's mission." In light of these evolving priorities, the question can be raised as to whether the UCR program receives due resources and attention or whether administration of UCR detracts from the FBI's overall strategic objectives. The UCR program is mentioned briefly in the FBI's most recent strategic

11 In 2004, BJS issued a technical report (Bauer, 2004) summarizing BJS's final allocation formula and the amounts of money disbursed to the states under the Local Law Enforcement Block Grant Program from 1996 to 2004. BJS was formally tasked with deriving the formula for the Edward Byrne Memorial Justice Assistance Grant program, the successor to the block grant program (Hickman, 2005a).

plan, under the strategic goal of the FBI's Criminal Justice Information Services (CJIS) Division. The discussion of strategic objective IVD.1, "Expand information sharing capabilities to support customer needs," notes (Federal Bureau of Investigation, 2004a:98):

    An array of state-of-the-art technology in the Uniform Crime Report (UCR) Program is needed to provide more efficient, optimum quality, and timely products and services to law enforcement and other consumers of UCR crime data. The new system will optimize the production capabilities of the existing CJIS information systems by leveraging the immense amount of data already regularly contained in each system repository.

This cursory mention of the program appears to concentrate on the production of estimates and products, rather than any ongoing assessment of the quality and timeliness of the actual data from local agencies. The strategic plan also identifies an expansion of NIBRS content as an objective (Federal Bureau of Investigation, 2004a:99):

    [The "enhanced NIBRS"] data set combines the current 53 NIBRS crime descriptors with the specific personal and event identifiers which form the core of most police department incident reports. Many states collect more information than the current 53 NIBRS data elements describing incidents for their own in-house purposes. To realize the full potential of the information sharing capabilities of NIBRS data, additional identifying data (e.g., victim, offender, and suspect) must be included, which will provide law enforcement additional investigative leads.

However, this topic is only identified as an area for which an implementation plan should be developed.

4–C.4 Assessment

Managing the BJS-UCR Relationship

As we argued in more detail in our first report (National Research Council, 2008b:Sec. 3–F), we do not see the relationship between the FBI's UCR program and BJS's NCVS (and related data series) as an either-or proposition.
Although the two programs overlap in that they cover a similar set of crimes and are both used to generate national-level estimates of violent crime, their major differences in scope and methodology make each a valuable source of information on crime and violence. The major features of the UCR compared with the NCVS are summarized in Table 4-1.

The intended successor to the UCR summary reporting program, NIBRS, has fallen short of its promise to date because of the slow adoption of the more detailed reporting scheme by local departments. In turn, the low coverage in NIBRS has also affected the extent of available research that, by

Table 4-1 National Data Sources Related to Crime Victimization in the United States

Characteristic                        | NCVS                       | UCR Summary                 | NIBRS
Target population                     | Noninstitutionalized       | Crime incidents occurring   | Crime incidents occurring
                                      | persons age 12 and older   | in the United States        | in the United States
                                      | in the United States       |                             |
Unit of observation                   | Individual                 | Law enforcement agency      | Crime incident
Estimated coverage                    | Nationally representative  | 94.2% of U.S. population    | Approximately 25% of U.S.
                                      | sample                     | covered by agencies active  | population covered by
                                      |                            | in UCR reporting            | agencies reporting in
                                      |                            |                             | NIBRS format
Types of victimization covered        |                            |                             |
  Criminal homicide                   | No                         | Yes                         | Yes
  Other index crimes                  | Yes                        | Yes                         | Yes
Geographic areas identified           |                            |                             |
  Region                              | Yes                        | Yes                         | Yes
  State                               | Yes*                       | Yes                         | Yes
  County                              | Yes*                       | Yes                         | Yes
  Census tract                        | Yes*                       | No                          | No
Demographic coverage                  |                            |                             |
  Age                                 | Yes                        | No                          | No
  Race                                | Yes                        | No                          | Yes
  Sex                                 | Yes                        | No                          | Yes
  Ethnicity                           | Yes                        | No                          | Yes
Vulnerable groups                     |                            |                             |
  Children 12 & older                 | No                         | Yes                         |
  Immigrants (native born)            | No                         | No                          | No
  Disabled (learning disability only) | No                         | No                          | No
  Elderly                             | Yes                        | No                          | No
Timeliness of data availability       |                            |                             |
  Pre-announced schedule              | Yes                        | Yes                         | Yes
  Fixed schedule                      | Yes                        | Yes                         | Yes
Accuracy and quality                  |                            |                             |
  Sampling error                      | Routinely estimated        | Unmeasured                  | Unmeasured
  Other errors (nonsampling)          | No ongoing evaluation      | Unknown                     | Unknown

* These areas are not identified in the public use files currently available from the National Archive of Criminal Justice Data, but are on area-identified files maintained by the Census Bureau. It should be noted that the NCVS sample is not drawn with the intent of producing estimates at these geographic levels.

generating particularly interesting or useful findings, could spur greater interest and participation in the series. As Faggiani (2007:2) summarizes:

    The implementation of NIBRS by local law enforcement agencies is an evolving but slow moving process and this has had an impact on its use for research. The early NIBRS data releases (1996 through 1998) were mostly limited to small and medium-sized law enforcement agencies representing primarily rural states. For example, the 1996 data covered only nine cities with populations in excess of 100,000 and no cities with populations over 250,000. Researchers examining the utility of NIBRS for scientific research began to raise serious questions about the overall representativeness of this supposed national data system.

For the purpose of studying the occurrence of crime in the United States, a healthy UCR program and, particularly, a full-fledged NIBRS are both critical data systems. A fully featured NIBRS with high participation has the potential to shed light on some dynamics of law enforcement operations that are not visible in current data (and not even envisioned within the tight management and administrative focus of the LEMAS series). One such example is the potential for NIBRS to provide detail on police clearance rates, which are currently reported as a gross indicator of departmental success. However, the aggregate rates, minus the type of contextual information on incidents and the extent of police contact with victims and offenders, mask a great deal of potentially useful information. On an exploratory basis, Addington (2007a) used the incident and police clearance dates recorded in NIBRS data to test (and confirm) the conventional wisdom that those murder cases that are cleared by police tend to be cleared early and that there is a major drop in the clearance rate after more than a week has passed since the homicide.
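Analyses of this kind rest on simple date arithmetic over incident-level records. As a minimal sketch, with invented records and field names (not actual NIBRS data element names), one can measure what share of cleared cases were cleared within the first week:

```python
from datetime import date

# Illustrative sketch with hypothetical data: given incident and clearance
# dates, compute the share of cleared cases that were cleared within 7 days.
incidents = [
    {"incident": date(2006, 1, 1), "cleared": date(2006, 1, 3)},
    {"incident": date(2006, 1, 5), "cleared": date(2006, 1, 9)},
    {"incident": date(2006, 2, 1), "cleared": date(2006, 3, 15)},
    {"incident": date(2006, 2, 10), "cleared": None},  # never cleared
]

# Restrict to cleared cases, then count those cleared within one week.
cleared = [r for r in incidents if r["cleared"] is not None]
within_week = sum((r["cleared"] - r["incident"]).days <= 7 for r in cleared)
share_early = within_week / len(cleared)  # 2 of 3 cleared cases
```

With richer incident detail, the same approach extends naturally to clearance curves by crime type, weapon, or victim-offender relationship.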
With fuller data resources, and study of different crime types, NIBRS could provide a useful platform to study the factors that influence the successful clearance of crimes by police.

Finding 4.3: A full-fledged NIBRS would be a source of basic information on police responses to public complaints (911 calls), including whether or not a case is "cleared" by police through an arrest.

Having concluded that the UCR and NIBRS programs have their merits—and that the nation benefits from having multiple data systems (UCR/NIBRS and the NCVS) to measure the incidence and circumstances of crime from different perspectives—the question that remains is whether they should be managed by separate parts of the Justice Department. Put more bluntly, the question is whether it makes sense for BJS, the principal statistical agency in the Justice Department, to lack authority for what is arguably the most prominent statistical data series produced by the department, and whether it would be preferable for BJS to "take over" UCR operations from the FBI.

Rosenfeld (2007:830) argues that "the FBI is no longer the appropriate institutional home for the UCR program, if it ever was." The principal argument for the transfer is that BJS is more likely to be able to provide the technical support and capability for ongoing improvement of the UCR than the FBI, given that "tracking conventional crime is not a high priority in the [FBI's] post 9/11 focus." The "appropriate focus and necessary human and technical resources" to best monitor locally recorded crimes reside in BJS rather than the FBI, and its administrative placement is more historical artifact than organizational efficiency: "had the BJS existed 75 years ago, the responsibility to compile local crime statistics would have been placed there," but by the time BJS was founded in 1979, the UCR was well entrenched as part of the FBI and little incentive existed to transfer authority for the UCR program (Rosenfeld, 2007:831). To be sure, Rosenfeld (2007:831) argues that transfer of the UCR program "is a necessary but not sufficient condition to upgrade the nation's crime monitoring capabilities," and that BJS would require additional resources to make substantial improvements in the timeliness and quality of UCR estimates.

Rosenfeld (2007) further cites a recent example of both UCR and NCVS being bested by another data source—in terms of timeliness of information—as motivation for improving both benchmark measures of crime through an organizational realignment. By August 2006, "local police chiefs had been complaining for months . . .
that violent crime was on the rise and that they lacked the resources to combat it." However, such a shift in crime rate (reversing several years of declining trends) could not be measured by UCR: "the FBI report [of UCR results for 2005] was not released until September 2006 and covered only the period through the end of 2005." The FBI released a preliminary report covering data from the first half of 2006 in December 2006—rapid dissemination by UCR standards, but in a sense the information was still too late. Any reading from the NCVS lagged behind the FBI's figures, meaning that "no single source of [publicly available] systematic data" existed to refute or corroborate the chiefs' claims. That August, the Police Executive Research Forum (PERF) convened a Violent Crime Forum of police chiefs, the result of which was a compilation of current crime data from several of the participating police departments. PERF and the chiefs used these data to describe apparent crime increases in its report A Gathering Storm: Violent Crime in America (Police Executive Research Forum, 2006)—a report that went public in October 2006. An update of that report, published in April 2007, pointedly reminded readers that PERF encourages police agencies "not to wait" for the FBI to release its UCR crime figures and to send their data directly to PERF for compilation and early release (Rosenfeld, 2007:826).

The basic arguments for BJS acquiring authority for the UCR and NIBRS programs include, in brief:

• The operational transfer would solidify BJS's position as the preeminent statistical agency within the Department of Justice and the preeminent governmental source for justice statistics, generally.

• The transfer would permit the FBI to sharpen its new organizational focus on antiterrorism efforts.

• Placing authority for the UCR and NIBRS within a true statistical agency would facilitate attention to methodological problems in the series, including adjustments for nonresponse and imputation routines.

The strongest argument against BJS acquiring control of the UCR is the potential disruption of the relationships that have built up over the decades of FBI administration of the UCR, brokered through outlets such as the IACP. A great strength of BJS's correctional data series, and a key to their quality, is the network of ties that BJS has built with state departments of corrections and individual facilities; likewise, BJS continues to develop ties to state court systems. Just as it is reasonable to expect that the quality of resulting data would be impaired by a sudden change in reporting structures, shifting lines of data reporting that have existed, in some cases, since the 1930s is not something that should be taken lightly. There is, moreover, a trust that may be implicit in UCR reporting relationships—law enforcement agencies providing data to a fellow law enforcement agency—that is nontrivial. It is certainly possible that, with strong endorsement and assistance from collaboratives such as the IACP and PERF, a transition from FBI to BJS could be successful in time, but the short-term impact on response rates could be significant. In our assessment, the prospect of BJS "taking over" the UCR picks unnecessary turf fights with both the police community and the FBI, both of which have historically been protective of the program.
Although the organizational transfer of UCR from one Justice Department agency to another would seem to be a fairly easy task, we think that this appearance is deceptive. The suggestion severely underestimates the level of energy and expense that would be necessary to get the UCR SRS (much less a full-fledged NIBRS) to function efficiently and effectively under a new administrative parent and as a part of the statistical system. In our interim report, we noted the inherent rigidity of the UCR—that, for instance, the core set of crime types covered by the UCR has remained the same since the UCR's creation in 1929 (save for the addition of arson as a top-tier "Part I" crime). We commended the value of the NCVS as an independent check on the UCR (and vice versa), but urged that "the utility of an UCR-independent measure of crime should not prevent consideration of [NCVS] design options that reduce lockstep similarity between the UCR and the NCVS" (National Research Council, 2008b:77). To make clear the

tacit criticism in that statement, the UCR is in some respects an antiquated and inefficient system for collecting and disseminating annual estimates of levels and changes in crime reported to the police at the national level. As a census-type measure of all jurisdictions, the UCR has had and will continue to have essential roles, including such purposes as allocating funding across all jurisdictions based on crime counts. However, our concern is that the products of a principal statistical agency are held to high standards (as we describe in great detail throughout Chapter 5). In particular, recast as a core statistical collection within a principal statistical agency, a BJS-led UCR would require much more intensive—and expensive—attention to issues of data quality, response, data collection instrumentation, and documentation than has previously been brought to bear on the UCR.

Clearly, as we have indicated, it is in the national interest to have a high-quality UCR program. It follows that if the FBI's strategic goals shift even more heavily toward its expanded portfolio in terrorism surveillance, and hence that attention to and resources for administration of the UCR become so scarce that the UCR and NIBRS programs will atrophy, then an administrative transfer of authority of UCR to BJS would be sensible (short-term effects on response notwithstanding). Barring these conditions, we find no compelling reason, other than the organizational neatness of consolidating statistical functions in one agency, for UCR to shift away from the FBI.

Timely Records-Based Collection from Local Law Enforcement Files

Rather than "take over" UCR, as the option might be bluntly described, our recommendation is that BJS explore the possibility of doing something that is different from either the UCR or NCVS and that we think is better than the UCR in some respects.
Specifically, we suggest that BJS work with local law enforcement agencies to develop a system under which BJS could regularly extract records from individual departments' own computer systems—data that many departments regularly compile on their own and some departments post on their websites—for a sample of jurisdictions. This system would shift much of the burden of response and data gathering from the local authorities (i.e., filling out UCR summary forms) to BJS (i.e., sample design, data editing, and inference). Such a system would be capable of providing more timely (if less detailed) glimpses of crime trends than is possible under UCR, NIBRS, or the NCVS.

The strategy we propose is consistent with one commonly used by statistical agencies, pairing sample- and census-based methods. Among U.S. federal statistical agencies, as well as in government statistical systems around the world, it is common to conduct sample-based measurements in parallel with more exhaustive, census-type measurements of the same basic phenomena. For example, BLS interviews a large sample of employers (covering

about 390,000 worksites) each month as part of its Current Employment Statistics (CES) program. Each month, the CES data are used to produce a count of changes in the number of jobs in the country; this monthly "Employment Situation" report is a familiar and highly publicized barometer of economic conditions. The CES also supports production of employment figures for states and metropolitan areas.12 This monthly measure of national, state, and local employment conditions is complemented by the Quarterly Census of Employment and Wages (QCEW), which taps into data from the state-based unemployment insurance systems. Thus, the QCEW provides a more comprehensive census-type count of the change in jobs at the national level (with QCEW coverage estimated to include 98 percent of U.S. jobs), while yielding detailed estimates down to the county level.13

Another example is the national Vital Statistics program administered by NCHS, which we described in Section 4–A.1. Drawing from reports from state health departments and reporting authorities, the vital statistics estimates represent a census-type measure of births in the United States. However, NCHS also fields the National Survey of Family Growth (NSFG; see http://www.cdc.gov/nchs/NSFG.htm), which measures self-reported births among probability samples of females, providing independent measures of births and fertility. The NSFG is also capable of generating more detail about the pregnancies and related births and yields national estimates of marriage and divorce, topics that are no longer covered by the Vital Statistics program. The UCR and the NCVS do not fit this paired census- and sample-based measurement approach because of the much more extensive scope of the NCVS, including crimes and general incidents of victimization that are not reported to the police.

There are three reasons that such paired census-based and sample-based measures are useful to policy makers and professionals.

12 Because of cuts in BLS's funding for fiscal year 2008, BLS has had to eliminate the production of some CES estimates for all metropolitan areas and completely eliminate the generation of estimates for the smallest metropolitan areas; see http://www.bls.gov/sae/msareductions.htm.

13 See http://www.bls.gov/ces/ and http://www.bls.gov/sae/ for additional information on the CES program and http://www.bls.gov/cew/ regarding the QCEW.
First, the sample-based measurements can be constructed to be more timely than the census-based measurements. Individual attention can be paid to each sample unit; efforts of interviewers and other agents can raise the level of quick response to the survey request. Second, by focusing survey research resources on a small number of sample units, the quality of reporting can be raised, over that expected from administrative systems. Often this higher quality is obtained through increased standardization in reporting across the various sample units. Because participation is assisted through the survey data collectors, estimates can be published more quickly, to the benefit of the country. However, and the third reason for parallel samples and censuses,

those benefits come at a price: the sample sizes are generally inadequate to offer stable estimates of the prevalence of rare events or estimates of differences of small subsets of the population. Only with full census-based measurement can analysis at such levels be accomplished.

In short, the sample-based methods are appealing because they can offer higher-quality responses, obtained in a timelier fashion. In contrast, census-based methods are necessary to understand small subpopulations or rare events related to the phenomenon of interest. Facilitating both of these systems makes sense, especially when the census-based system is already in place for administrative and management reasons.

The panel believes that the country would be well served by a similar parallel structure applied to crimes reported to police. The new system would rely on the UCR as the census-based vehicle that would permit fine-grained comparison of areal patterns of crime, while a sample-based measurement drawn from the crime information compiled and disseminated in real time by many police departments would provide a more timely indicator of general crime trends. The NCVS, of course, would be a third component of this new structure, adding benefit through its rich contextual information and coverage of incidents not reported to police.

Recommendation 4.4: To improve the timeliness of crime statistics, BJS should explore the development of a crime reporting system based on a probability sample of police administrative records. The goals of such a system would be national representativeness, high response, high data quality, timeliness and flexibility in terms of crime classification and analysis, and national statistics for the monitoring of crime trends.

Such a system has two notable precursors.
First, it hearkens back to the original purpose of the NIBRS program, "to take advantage of available crime data maintained in modern law enforcement records systems" (Federal Bureau of Investigation, 2004b:3). Where NIBRS has endured slow implementation because of the need to develop systems to provide rich incident detail, the hope of a timely records-based summary system would be to tap the more aggregate crime statistics used by departments to measure their own progress.

The second key precursor is the one that makes such a records-based system feasible at this time, when it has not been in the past. The New York City Police Department, under then-commissioner William Bratton, is credited with developing the COMPSTAT approach to measuring and planning responses to crime in 1994. Alternatively described as a contraction for "computational statistics" or "comparative statistics," COMPSTAT combined a managerial focus emphasizing internal accountability among district and regional commanders for crime activity in their areas with a technical basis in the use of timely crime incident data to identify problems and tailor responses. The COMPSTAT program was popularly credited, at least in part, for major crime rate decreases in New York, which in turn spurred the adoption of similar programs in other cities. COMPSTAT implementation and use in a variety of sites, including detailed case studies, have been reviewed by Willis et al. (2003), Weisburd et al. (2003), and Weisburd et al. (2004).

The system we suggest is akin to, but not as extensive as, the "national COMPSTAT" that has been suggested by PERF in the wake of PERF's initial work in combining local-agency records (and hence monitoring crime-rate shifts earlier than data provided by either the UCR or the NCVS). In his introduction to PERF's report on its second Violent Crime Summit, PERF Executive Director Chuck Wexler described the "national COMPSTAT" idea (Police Executive Research Forum, 2007:iv; emphases in original):

    We want to change the way that people view crime. In the past, criminologists waited several years to make conclusions about crime trends. They were cautious about drawing conclusions and waited until they could state with scientific certainty that there was a changing pattern. The problem with that approach is that by the time a crime trend has been identified, the information is so old as to make it useless, because new trends, new crime patterns, and new causes of crime have taken hold. Programs and policies that we undertake today, to respond to the crime problems of last year, are not likely to succeed.

    [A "national COMPSTAT" approach would use] accurate, timely information to track crime as it happens, to search for pockets of violence wherever and whenever they occur, and to react quickly. In a sense, we believe that police leaders should act more like public health epidemiologists, who don't wait for a pandemic to overtake the nation, with hundreds or thousands of people dead, before they sound an alarm and start implementing countermeasures.
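Statistically, the timely index contemplated here would ultimately reduce to design-weighted estimation from a probability sample of departments. The following is a minimal sketch under stated assumptions: a Horvitz-Thompson-style estimator in which each sampled department's count is weighted by the inverse of its selection probability; all department counts and probabilities are invented for illustration.

```python
# Hedged sketch of a design-weighted national total from a probability
# sample of police departments. Each department's reported crime count is
# divided by its probability of selection into the sample panel.
def estimated_total(sample):
    """sample: list of (reported_crimes, selection_probability) pairs."""
    return sum(crimes / prob for crimes, prob in sample)

# Invented design: the largest departments are sampled with certainty
# (probability 1.0); smaller departments are sampled at rate 0.25, so
# each one sampled stands in for four.
sample = [(5000, 1.0), (4200, 1.0), (300, 0.25), (250, 0.25)]
national_estimate = estimated_total(sample)
# -> 5000 + 4200 + 1200 + 1000 = 11400.0
```

A production design would of course also need variance estimation and nonresponse adjustment, but the sketch shows why a small, well-managed panel can yield a defensible national indicator quickly.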
We say that our proposal is akin to but not identical to a "national COMPSTAT" in that our proposal is to generate a timely summary or index value of crime, and not a direct analysis of the geographically coded incident data that can be used to inform COMPSTAT managers' decisions to allocate personnel and resources. Valuable though such data would be, there is an inherent tension between the statistical information that an agency such as BJS can provide and the fine-grained, "tactical" assessments—informing specific interventions and used to hold line officers and commanders accountable—that are of particular interest to law enforcement agencies. Rather, our intent is to provide policy makers with crime data that provide information that is both contemporaneous (immediately of interest and relevance to consumers in law enforcement) and timely (short time between collection and dissemination).

Our intent in proposing this system is not to be duplicative of effort in the existing UCR data collection. Rather, the idea is to maximize the use

STATE AND LOCAL PARTNERSHIPS 205 of those data systems that local police departments prepare and maintain on their own, as a course of doing business and, in several cases, to promote public transparency by providing up-to-date crime information for public view on websites. Rather than put the burden for processing, coding, and interpreting results on the local departments (through completion of sum- mary report questionnaires), the intent of this system would be to create means by which available electronic data could be transmitted to BJS with minimal effort on the part of local departments (save, perhaps, for stripping personal identifiers). The burden of sample design, data processing, and compilation would be on BJS. However, the availability of the data in elec- tronic form and the development of routines for handling specific varieties of local data types would, ideally, yield a system in which some summary measure or index could be generated and disseminated quickly. The value of BJS involvement in compiling crime data from local depart- ments (and, in some cases, publicly posted online as well as used in inter- nal meetings and assessments) would be rigor in design (documenting that index measures are representative of some larger whole, if not the entire nation then something like urban areas of a particular size), consistency in definition and coding, and attention to data quality. Some further comments along these lines may be useful: • Though the objective of such a system would be to be minimally invasive—making use of data that departments already have, and have in electronic format—success in implementation will depend on build- ing ties with and commitments from individual departments. Making use of consortia such as PERF (given its initial work in the area), the Major City Chiefs Association, and IACP would be instrumental. 
• A necessary first step in constructing such a system is to assess its basic feasibility: studying individual departments' technical capability and the availability of suitable data resources. Short of a complete inventory of technical systems status (such as BJS performed for state corrections departments and individual facilities; Bureau of Justice Statistics, 1998b) or its SEARCH-conducted Survey of Criminal History Information Systems (Section 4–B), a module of questions on the LEMAS survey would be a useful start.

• The sample panel of departments would likely emphasize the largest cities, given that they are most likely to have amenable record systems. The preliminary work we just described would be useful in determining whether the content must be limited to such large cities (say, above 250,000 population) or whether a more nationally representative sample including smaller agencies is feasible. In constructing the sample, BJS should take into account the distribution in crime with respect to population size and changes in that distribution. Crime is not as highly

concentrated in the most populous of cities as in past decades, with historically high-rate large cities having been some of the beneficiaries of major declines in the 1990s and 2000s.

• For jurisdictions that organize their collections by their relevant state criminal codes, and whose definitions of standard crimes are not consistent with the FBI's UCR definitions, BJS will need to learn and apply the techniques used by the localities in "converting" their data in order to report UCR summaries. A crime type such as aggravated assault, for instance, can vary substantially in scope across the states, and so a common metric would need to be defined.

• As a sample-based equivalent to the UCR, and for keeping collection tractable, an initial focus on UCR Part I crimes is sensible. Over time, an interesting question is whether to expand beyond that scope. Data on arrests with no "crimes known to police" equivalent in the UCR, such as drug, weapons, and DWI arrests, would likely have strong policy interest.

• Using a panel of reporting departments should make it possible to develop quarterly estimates. However, at least two technical challenges would need to be resolved in order to smooth the introduction of a new data collection effort. The first is the exact timing and coordination of collection from "live" incident files. Department incident files may be subject to revision after initial entry, "closing" for reporting purposes at some defined data period (such as a month or quarter). Protocols for timing access to and collection of data would need to be developed, as well as mechanisms for updates as necessary (e.g., when an assault is reclassified as a homicide because the victim dies some time after the shooting).
The second technical challenge is the resolution of the data that the local departments could, and could most feasibly, give to BJS: whether incident-level files as in existing NIBRS or aggregate summary reports like the agencies now send to the FBI. In some ways, aggregated summary figures would be the least demanding on both suppliers and BJS, but may require additional care (and timing) in coding to reflect federal definitions. Raw incident-level files would obviously be analytically useful, and BJS could request them geocoded, but this would be substantially harder to negotiate and would impose a greater burden on the data collector and on BJS as aggregator. BJS and its data collectors would have to anticipate a two-track plan, working with data of either type, until one became more universally preferable.

The implementation of such a sample-based analogue to the UCR would enable great strides in making BJS's law enforcement data more timely and relevant to data users, and would well complement a reengineered core-supplement design for LEMAS. In time, success in building two relatively

nimble systems could prove useful for law enforcement agencies and policy makers alike. For instance, a LEMAS supplement could provide a "quick response" capability for documenting local agencies' experience with some new or emerging crime problem; the records-based collection could begin to show continued growth of the problem (or hint at signs of resolution); and the UCR and, particularly, the NCVS would be poised to provide a richness of information on contexts and possible causes that is not possible with the interim indicators. All of this, well-designed data systems working at multiple resolutions, would contribute greatly to BJS's core mission of assisting state and local governments and agencies.
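The "conversion" problem raised earlier, mapping state offense codes onto common UCR categories, can be sketched as a simple crosswalk plus aggregation. The offense codes and category names below are illustrative inventions, not an actual crosswalk; real tables would have to be built and validated per state, precisely because offenses such as aggravated assault are scoped differently across state statutes.

```python
# Hypothetical crosswalk from one state's offense codes to UCR Part I
# categories. In practice each state would need its own validated table.
STATE_TO_UCR = {
    "PC-187": "homicide",
    "PC-211": "robbery",
    "PC-245": "aggravated_assault",  # statutory scope varies by state
    "PC-459": "burglary",
    "PC-487": "larceny_theft",
}

def to_ucr_summary(incidents):
    """Aggregate incident-level records into UCR-style summary counts.

    Offenses with no Part I equivalent (e.g., drug offenses) are tallied
    separately rather than silently dropped, so coverage can be audited.
    """
    counts = {}
    unmapped = 0
    for incident in incidents:
        category = STATE_TO_UCR.get(incident["offense_code"])
        if category is None:
            unmapped += 1
        else:
            counts[category] = counts.get(category, 0) + 1
    return counts, unmapped

incidents = [
    {"offense_code": "PC-211"},
    {"offense_code": "PC-245"},
    {"offense_code": "PC-245"},
    {"offense_code": "HS-11350"},  # drug offense: no Part I equivalent
]
summary, unmapped = to_ucr_summary(incidents)
# summary == {"robbery": 1, "aggravated_assault": 2}; unmapped == 1
```

Tracking the unmapped count also speaks to the expansion question above: the residual categories with no "crimes known to police" equivalent are exactly where policy interest in going beyond Part I would arise.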
