Surveying Victims: Options for Conducting the National Crime Victimization Survey

–5–

Decision-Making Process for a New Victimization Measurement System

IN THIS CHAPTER, WE FOCUS on two broader issues related to moving forward with refinements to the National Crime Victimization Survey (NCVS). The first is the need to consider ways to best develop the survey in order to shore up and expand constituencies for it (Section 5–A), and the second is the choice of the data collection agent for the survey (Section 5–B). Several of the topics and recommendations in this chapter differ from the rest of the report in that they are agency-level in focus, aimed at better equipping the Bureau of Justice Statistics (BJS) to understand its own products and to interact with its users. This is in keeping with the panel’s charge to focus on the complete portfolio of BJS programs. We make these recommendations here, in initial form, because they are pertinent to the NCVS; however, we emphasize that we expect to expand on them in our final report.

5–A BOLSTERING QUALITY AND BUILDING CONSTITUENCIES

NCVS data and estimates are routinely used by researchers and the public to understand the patterns and consequences of victimization. Researchers can access the raw data through the National Archive of Criminal Justice Data at the Inter-university Consortium for Political and Social Research and thus can analyze the data to fit the needs of their investigation. The vast majority of the public, in contrast, has access to the data primarily in the form of routine annual estimates available on the BJS website, or through
special topic reports developed and released periodically on the website. However, when the public has interest in specific topics for which no regular NCVS report exists (for example, trends in rural victimization[1]), it is often beyond people’s expertise to use the survey data or even to determine whether they can compile this information themselves. This problem can be addressed by using an advisory committee charged with providing BJS with information about public interest in specific kinds of NCVS reports; improving the organization of the victimization component of the BJS website so that it is clear what NCVS reports are available and what requires special analyses; and expanding the number of trend charts and spreadsheets to include compilations of interest to the public.

Any federal statistical agency must constantly strive to maintain clear communications with its users and with the best technical minds in the country relative to its data. While BJS some years ago took the initiative to stimulate the creation of the American Statistical Association’s (ASA) Committee on Law and Justice Statistics, the committee is not a formal advisory committee to BJS. This means that the meetings are not public, the recommendations of the committee have no real formal documentation, and the agency does not consistently turn to the committee for key problems facing it. Furthermore, the committee consists exclusively of ASA members, who may or may not have all the expertise needed to advise BJS. A formal advisory committee has both the benefits and costs of Federal Advisory Committee Act oversight, yet it would address many of the issues cited above. Most other federal statistical agencies actively use their advisory committees (e.g., the National Center for Health Statistics, the Census Bureau, the Bureau of Labor Statistics) to seek technical input into critical challenges.
This is especially true now because of the growing pressures on survey budgets arising from declining U.S. response rates. A formal advisory committee should have membership appointed for its expertise. It should include experts in criminology, law enforcement, judicial processes, and incarceration, as well as state and local area experts. This expertise in the substance of the statistics should be supplemented with expertise in the methods of designing, collecting, and analyzing statistical data.

Recommendation 5.1: BJS should establish a scientific advisory board for the agency’s programs; a particular focus should be on maintaining and enhancing the utility of the NCVS.

[1] A comparison of trends in urban, suburban, and rural victimization was the focus of a BJS report issued in 2000 (Duhart, 2000), but this specific analysis has not been replicated since that time.
The NCVS is largely designed and conducted for BJS by the Census Bureau. Complex survey contracts cannot be wisely administered without highly sophisticated statistical and methodological expertise. Federal statistical agencies that successfully contract out their data collection (either to the Census Bureau or a private contractor) generally have mathematical statisticians and survey methodologists who direct, coordinate, and oversee the activities of the contractor. While many of the BJS staff are labeled “statisticians,” the panel observed a lack of the statistical expertise that is crucial in dealing with the trade-offs of costs, sample size, numbers of primary sampling units, interviewer training, questionnaire length, use of bounding interviews, and the like. The expressions of displeasure about the Census Bureau’s management of the NCVS were not matched with BJS statistical analyses and simulations of design alternatives that might offer better outcomes for the agency. Furthermore, the panel thinks that the number of BJS full-time staff dedicated to the analysis of NCVS data and the generation of reports is insufficient to exploit the full value of the survey and to navigate its challenging future. Some of the issues that require analysis (e.g., the effects of declining response rates on estimates, trade-offs of waves and questionnaire length) need statistical and methodological expertise that goes beyond current in-house capabilities.

Following the lead of other federal statistical agencies, BJS could usefully enhance the statistical expertise on its staff with a program of outside research funds. When federal agencies form useful partnerships with academic researchers, they can reduce their overall costs of innovation. BJS has a track record of small research grants connected to the NCVS.
The panel applauds these and urges an expansion to tackle the real methodological issues facing the NCVS.

Recommendation 5.2: BJS should perform additional and advanced analysis of NCVS data. To do so, BJS should expand its capacity in the number and training of personnel and the ability to let contracts.

One reason that the panel thinks that technical staffing and external research are important is that many of the questions posed about the NCVS have not been evaluated sufficiently for us to provide recommendations to BJS on the final design of the survey. The panel thinks that this is the long-term result of “eating its seed corn,” of using the operating budget too much to release the traditional reports and too little to scope out the problems of the future. It was well known 15 years ago that household survey response rates were falling; the impact on survey costs of these falling rates was clear (de Leeuw and de Heer, 2002). Federal statistical agencies (see CNSTAT’s Principles and Practices of a Federal Statistical Agency) must consistently
probe and analyze their own data, beyond the level required for descriptive reports, in order to see their weaknesses and their strengths. Only with such detailed knowledge can wise decisions about cost and error trade-offs be made.

Recommendation 5.3: BJS should undertake research to continuously evaluate and improve the quality of NCVS estimates.

Another way that federal statistical agencies improve their data series is by nurturing a wide community of secondary analysts, using as much data as can be released within confidentiality constraints. Such analysts form a ready-made informed constituency for improving data products over time, and they act as a multiplier of the impact of federal data series. Using the Internet, some agencies have expanded their impact by making available various “predigested” forms of survey data in tables, spreadsheets, graphing capabilities, and the like. The panel thinks that BJS should consider such capabilities linked to the NCVS website. These might be time series of individual population rates and means in spreadsheet form, attractive to a very broad audience, as well as microdata files predesigned to include commonly desired analytic variables on popular observation units.

Recommendation 5.4: BJS should continue to improve the availability of NCVS data and estimates in ways that facilitate user access.

BJS and the Census Bureau must keep their pledges of confidentiality to NCVS respondents. They also have the obligation to maximize the good statistical uses of the data collected with taxpayer money. Geographically identified NCVS data were available to qualified researchers from approximately 1998–2002 at the Census Bureau’s research data centers (Wiersema, 1999); however, access was subsequently suspended because the data did not conform to technical conditions for research access and oversight.
A project to reestablish the availability of these data by documenting and formatting internal Census Bureau data files so that they conform to Census Bureau standards began in 2005 and should be completed by the time of this report. As soon as such work is completed, these data should be made available to qualified researchers. Access to geographically identified NCVS data would permit analyses of how local characteristics and policies are associated with victimization risk and its consequences.

Recommendation 5.5: The Census Bureau and BJS should ensure that geographically identified NCVS data are available to qualified researchers through the Census Bureau’s research data centers, in a manner that ensures proper privacy protection.
At this writing, the U.S. statistical budget has been relatively flat for some years (except for the advent of the American Community Survey budget). These flat-line budgets have occurred at the same time that the difficulty and costs of measuring U.S. society have increased. In a climate of tight budgets and increasing costs of demographic measurement, federal statistical agencies face real threats. Such times call for real statistical leadership and careful stewardship of the country’s statistical information infrastructure. We fear that many surveys, the NCVS among them, can easily die “deaths from a thousand cuts.” Attempts to live within the budgets lead to short-term cuts in survey features without certain knowledge of their effects on survey quality. Each such decision runs the risk that the country will be misled due to increased errors in data products. At some point, the basic goals of a survey cannot be met under restricted funding. The country deserves to know when this is occurring.

The panel thinks that one opportunity for such communication comes in the annual report on statistical program funding that the U.S. Office of Management and Budget is required to prepare by a provision of the Paperwork Reduction Act of 1995 (44 U.S.C. 3504(e)(2)). This annual report—Statistical Programs of the United States Government—has been published for each fiscal year since 1997. The report can serve as a vehicle for alerting the executive and legislative branches to how the budget has affected the quality of statistical programs, both to the good and to the bad.

With specific regard to BJS, the annual reports have generally documented the agency’s responses to declining budgets. For instance, the reports for fiscal years 2007 and 2008 bore a similar warning (U.S.
Office of Management and Budget, 2006c:8):

BJS did not receive the funding requested to restore its base funding necessary to meet the growing costs of data collection and the information demands of policymakers and the criminal justice community. To address base adjustments insufficient to carry out ongoing operations of its National Crime Victimization Survey (NCVS) and other national collection programs, BJS has utilized many strategies, such as cutting sample, to keep costs within available spending levels. However, changes to the NCVS have had significant effects on the precision of the estimates—year-to-year change estimates are no longer feasible and have been replaced with two-year rolling averages.

The guidance provided by these annual reports could be enhanced through fuller explication of the impact of budget reductions (or increases) on the precision of estimates, as well as articulation of constraints and effects on federal statistical surveys systemwide. An example of the latter is the Census Bureau’s sample redesign process; following the decennial census, the Census Bureau realigns the sample frames for the various demographic
surveys that it conducts (including the NCVS) so that the household samples are updated and coordinated across the various data collection programs. This work is done in collaboration with the agencies that sponsor Census Bureau–conducted surveys; “the portion of the sample redesign work that can be linked to a specific survey is funded by the sponsoring agency as part of the reimbursable cost of the survey,” while portions that are not directly identified with a specific survey are funded by the Census Bureau. “Thus, the approach combines central funding with user fees for survey specific redesign activities” (U.S. Office of Management and Budget, 2000:45–46).

Although the sample redesign process has been routinely mentioned as an ongoing, cross-cutting activity in Statistical Programs of the United States Government, little detail on the progress (and consequences) of the effort was provided in the annual reports from 2001 to 2007. Ultimately, conversion from a sample deriving from the 1990 census to one using the 2000 numbers was not fully achieved for the NCVS until 2007; the redesign work was originally planned to be complete in fiscal year 2004.[2] We recommend that the annual report provide additional discussion—and warning—of budget-related effects on basic survey maintenance when appropriate.

Recommendation 5.6: The Statistical Policy Office of the U.S. Office of Management and Budget is uniquely positioned to identify instances in which statistical agencies have been unable to perform basic sample or survey maintenance functions. For example, BJS was unable to update the NCVS household sample to reflect population and household shifts identified in the 2000 census until 2007. The Statistical Policy Office should note such breakdowns in basic survey maintenance functions in its annual report Statistical Programs of the United States Government.
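The precision consequences of sample cuts described in these annual reports can be made concrete with a back-of-the-envelope calculation. The Python sketch below is purely illustrative: the victimization rate and sample sizes are hypothetical, and it assumes simple random sampling, ignoring the design effects from clustering and weighting that inflate real NCVS standard errors.

```python
import math

def se_rate(p, n):
    """Standard error of a simple proportion under SRS; real NCVS
    design effects from clustering and weighting would inflate this."""
    return math.sqrt(p * (1 - p) / n)

p = 0.025          # hypothetical annual victimization rate (2.5%)
n_full = 80_000    # hypothetical full sample of persons interviewed
n_cut = 55_000     # hypothetical sample after budget-driven cuts

se_full = se_rate(p, n_full)
se_cut = se_rate(p, n_cut)

# Averaging two roughly independent annual estimates halves the
# variance, so the SE shrinks by a factor of sqrt(2).
se_rolling = se_cut / math.sqrt(2)

print(f"SE, full sample:       {se_full:.5f}")
print(f"SE, cut sample:        {se_cut:.5f}")
print(f"SE, 2-yr rolling avg:  {se_rolling:.5f}")
```

Under these toy assumptions, cutting the sample from 80,000 to 55,000 inflates the annual standard error by roughly 21 percent, while averaging two years halves the variance; that trade of timeliness for precision is the logic behind the two-year rolling averages noted in the OMB reports.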
5–B DATA COLLECTION AGENT FOR THE NCVS

A review of any survey, particularly one conducted with an eye toward reducing costs, must inevitably consider the question of who collects the data (in addition to exactly how the data are collected). In the case of the NCVS, the U.S. Census Bureau of the U.S. Department of Commerce has been engaged as data collection agent since the survey’s inception. In fact, as described in Box 1-1, the Census Bureau was heavily involved in the prehistory of the survey, entering into discussions with BJS’s predecessor in the late 1960s and convening planning conferences that would give shape to the NCVS and its pretests. Since “it was clear from the pilot studies that large samples would be required to obtain reliable estimates of victimization for crime classes of intense interest (e.g., rape),” “the Census Bureau was the only organization that could field such a large survey” and hence was the natural choice as the data collection agent for the new NCVS (Cantor and Lynch, 2000:105).

The choice of the Census Bureau as the data collector for the NCVS had implications for the survey’s design, as summarized by Cantor and Lynch (2000:107):

Other design features of NCS were occasioned by the need to fit into the organization of the Census Bureau and the Current Population Survey (CPS). CPS is the largest intercensal survey conducted in the world and, at the time, NCS was to be the second largest of these surveys. Sharing interviewers between the two surveys would mean great efficiencies for the [Census Bureau]. CPS employed a rotating panel design. This was viewed as an advantage to NCS for a number of reasons. One was the ability to use prior interviews to ‘bound’ subsequent interviews. … A second was that the rotating panel design substantially increased the precision of the year-to-year change estimates. The panel design feature produces a natural positive correlation across annual estimates. This, in turn, substantially reduces the standard error on change estimates.

As may be expected, the experience of decades of work has illustrated both advantages and disadvantages of the relationship between BJS as sponsor and funder of the NCVS and the Census Bureau as its data collector. Relatively few of the conceptual pros and cons are unique to the BJS-Census relationship; rather, they are generally applicable to any contractor and client. Others, however, in the panel’s view deserve comment.

[2] The new sample was phased in panel by panel. One panel of addresses based on the 2000 census was introduced in January 2005 for areas already included in the sample. “Beginning in January 2006, [the Census Bureau] introduced sample based on the 2000 decennial census in new areas. The phase-in of the 2000 sample and the phase-out of the 1990 sample will be complete in January 2008” (Demographic Surveys Division, U.S. Census Bureau, 2007b).
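The precision gain from the rotating panel design that Cantor and Lynch describe can be sketched numerically. The figures below are hypothetical; the substantive point is only the standard variance formula for a difference of correlated estimates.

```python
import math

def se_change(se1, se2, rho):
    """SE of a year-to-year change (Y2 - Y1) when the two annual
    estimates have correlation rho:
    Var(Y2 - Y1) = Var(Y1) + Var(Y2) - 2*rho*SE(Y1)*SE(Y2)."""
    return math.sqrt(se1**2 + se2**2 - 2 * rho * se1 * se2)

se1 = se2 = 0.0010   # hypothetical SEs of two annual victimization rates

independent = se_change(se1, se2, rho=0.0)   # fresh samples each year
panel       = se_change(se1, se2, rho=0.5)   # hypothetical panel overlap

print(f"SE of change, independent samples: {independent:.5f}")
print(f"SE of change, rotating panel:      {panel:.5f}")
```

With equal annual standard errors and an assumed correlation of 0.5, the standard error of the change falls from about 0.0014 to 0.0010, roughly a 29 percent reduction relative to independent samples; this is the mechanism by which the panel design "substantially reduces the standard error on change estimates."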
A basic concern that has arisen about the Census Bureau as the data collection agent for the NCVS is the lack of transparency in costs. Historically, the Census Bureau has not provided its federal agency survey sponsors with detailed breakdowns of survey costs (and rationales for changes in costs, over and above the known increasing costs of gaining compliance in survey research). It is the panel’s view that disaggregated costs are key to effective innovation in large-scale surveys. The data collector must know which survey design choices are associated with the largest portions of costs in order to effectively consider trade-offs of costs and errors. Recent attention to survey costs (e.g., at conferences hosted by the Federal Committee on Statistical Methodology and the National Institute of Statistical Sciences) has shown the value of detailed cost accounting.[3]

[3] See http://www.fcsm.gov/events/program/2006FCSMFinalprogram.pdf (see the session on “modeling survey costs”); Karr and Last (2006).
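To illustrate why disaggregated costs matter for weighing design trade-offs, the toy model below (all figures hypothetical, and far simpler than any real survey cost structure) splits a budget into fixed, per-PSU, and per-interview components.

```python
def survey_cost(fixed, n_psus, cost_per_psu, interviews_per_psu, cost_per_interview):
    """Toy disaggregated cost model: fixed overhead, plus PSU-level
    costs (listing, interviewer travel), plus per-interview costs."""
    return (fixed
            + n_psus * cost_per_psu
            + n_psus * interviews_per_psu * cost_per_interview)

# Hypothetical baseline design
base = survey_cost(fixed=2_000_000, n_psus=200, cost_per_psu=15_000,
                   interviews_per_psu=400, cost_per_interview=90)

# Two ways to cut the same 12,000 interviews:
fewer_psus = survey_cost(2_000_000, 170, 15_000, 400, 90)    # drop 30 PSUs
thinner_psus = survey_cost(2_000_000, 200, 15_000, 340, 90)  # thin each PSU

print(f"baseline:            ${base:,}")
print(f"drop 30 PSUs:        ${fewer_psus:,}")
print(f"fewer cases per PSU: ${thinner_psus:,}")
```

Both alternatives remove 12,000 interviews, yet dropping PSUs saves about $450,000 more because it also eliminates PSU-level costs; only with such a breakdown can a sponsor weigh that extra saving against the variance penalty of heavier clustering. A single aggregate price hides the trade-off entirely.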
Recommendation 5.7: Because BJS is currently receiving inadequate information about the costs of the NCVS, the Census Bureau should establish a data-based, data-driven survey cost and information system.

Some features of the NCVS are not shared by other survey designs and lack a strong evidentiary base for their choice, which leads the panel to wonder why the Census Bureau and BJS have chosen them. These include the recycling of cases from the field to centralized computer-assisted telephone interviewing (CATI), instead of using a dispersed field interviewing corps for the telephone interviews; the slowness of moving from paper questionnaires to computer-assisted personal interviewing (CAPI); the failure to study the use of audio computer-assisted self-interviewing for many of the sensitive topics in the survey, despite its widespread use in other federal surveys (e.g., the National Survey on Drug Use and Health and the National Survey of Family Growth, as well as BJS-sponsored data collections required by the Prison Rape Elimination Act); and the lack of study of how best to use the bounding interview in estimation.

Finally, the panel notes that there is very little substantive expertise in criminology and justice programs within the Census Bureau staff working on the NCVS. This means that the Census Bureau focuses on field and statistical issues without the advantage of formal educational background in the substance of the NCVS. Just as the BJS staff would be stronger with more technical and statistical expertise, the panel thinks that the Census Bureau could mount a better NCVS and partner more effectively with BJS with more substantive expertise.

That said, it must be noted with equal force that there are important advantages to the use of the Census Bureau as data collector.
Census Bureau household surveys, by and large, achieve higher response rates than comparable surveys conducted by a private contractor on behalf of the federal government. It is common throughout the world that central government statistical agencies achieve higher response rates than private-sector survey organizations (Groves and Couper, 1998). The Census Bureau has maintained a strong confidentiality pledge through the force of Title 13, although under the widened protection of the Confidential Information Protection and Statistical Efficiency Act of 2002, it is not clear that this advantage will be maintained. Furthermore, interagency agreements within the federal government appear to be simpler and less burdened by regulation than federal contracts. Finally—in the event that a radical option for collecting victimization data were necessary—continued partnership with the Census Bureau could offer the benefit of more readily piggybacking some victimization measures on one of the Census Bureau’s ongoing surveys (e.g.,
the American Community Survey or Current Population Survey; see Section 4–B.1).

BJS has sought input regarding contracting out the NCVS to the private sector. We urge careful consideration of survey cost structures prior to such a move. The panel notes that this review would be greatly facilitated if BJS could obtain disaggregated costs from the Census Bureau for the current NCVS. BJS should study other federal surveys contracted out to the private sector to determine the extent to which flexibility in dealing with changes and innovations was or was not realized. It should also study the implications of contracting out for the desired staff skills within BJS.

One way to increase understanding of the trade-offs of different NCVS designs and different contracting models is to seek formal design alternatives from the Census Bureau and others. A formal design competition could be mounted, perhaps through a set of commissioned designs, both from the Census Bureau and from other survey methodologists. The designs would be guided by the same goals, articulated by BJS, but would be left to the creativity of the designers. The design options should be costed out in as much detail as possible, and the designs should be critiqued through peer review.

Recommendation 5.8: BJS should consider a survey design competition in order to get a more accurate reading of the feasibility of alternative NCVS redesigns. The design competition should be administered with the assistance of external experts, and the competition should include private organizations under contract and the Census Bureau under an interagency agreement.