
Surveying Victims: Options for Conducting the National Crime Victimization Survey (2008)

Chapter: 5 Decision-Making Process for a New Victimization Measurement System

Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 117
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 118
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 119
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 120
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 121
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 122
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 123
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 124
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 125
Suggested Citation:"5 Decision-Making Process for a New Victimization Measurement System." National Research Council. 2008. Surveying Victims: Options for Conducting the National Crime Victimization Survey. Washington, DC: The National Academies Press. doi: 10.17226/12090.
×
Page 126


–5– Decision-Making Process for a New Victimization Measurement System

In this chapter, we focus on two broader issues related to moving forward with refinements to the National Crime Victimization Survey (NCVS). The first is the need to consider ways to best develop the survey in order to shore up and expand constituencies for it (Section 5–A), and the second is the choice of the data collection agent for the survey (Section 5–B). Several of the topics and recommendations in this chapter differ from the rest of the report in that they are agency-level in focus, aimed at better equipping the Bureau of Justice Statistics (BJS) to understand its own products and to interact with its users. This is in keeping with the panel's charge to focus on the complete portfolio of BJS programs. We make these recommendations here, in initial form, because they are pertinent to the NCVS; however, we emphasize that we expect to expand on them in our final report.

5–A BOLSTERING QUALITY AND BUILDING CONSTITUENCIES

NCVS data and estimates are routinely used by researchers and the public to understand the patterns and consequences of victimization. Researchers can access the raw data through the National Archive of Criminal Justice Data at the Inter-university Consortium for Political and Social Research and thus can analyze the data to fit the needs of their investigation. The vast majority of the public, in contrast, has access to the data primarily in the form of routine annual estimates available on the BJS website, or through special topic reports developed and released periodically on the website.

However, when the public has interest in specific topics for which no regular NCVS report exists (for example, trends in rural victimization[1]), it is often beyond people's expertise to use the survey data or even to determine whether they can compile this information themselves. This problem can be addressed by using an advisory committee charged with providing BJS with information about public interest in specific kinds of NCVS reports; improving the organization of the victimization component of the BJS website so that it is clear what NCVS reports are available and what requires special analyses; and expanding the number of trend charts and spreadsheets to include compilations of interest to the public.

Any federal statistical agency must constantly strive to maintain clear communications with its users and with the best technical minds in the country relative to its data. While BJS some years ago took the initiative to stimulate the creation of the American Statistical Association's (ASA) Committee on Law and Justice Statistics, the committee is not a formal advisory committee to BJS. This means that the meetings are not public, the recommendations of the committee have no real formal documentation, and the agency does not consistently turn to the committee for key problems facing it. Furthermore, the committee consists exclusively of ASA members, who may or may not have all the expertise needed to advise BJS. A formal advisory committee has both the benefits and costs of Federal Advisory Committee Act oversight, yet it would address many of the issues cited above. Most other federal statistical agencies (e.g., the National Center for Health Statistics, the Census Bureau, the Bureau of Labor Statistics) actively use their advisory committees to seek technical input into critical challenges. This is especially true now because of the growing pressures on survey budgets arising from declining U.S. response rates.

A formal advisory committee should have membership that is appointed for its expertise. It should have experts in criminology, law enforcement, judicial processes, and incarceration. It should include state and local area experts. This expertise in the substance of the statistics should be supplemented with expertise in the methods of designing, collecting, and analyzing statistical data.

Recommendation 5.1: BJS should establish a scientific advisory board for the agency's programs; a particular focus should be on maintaining and enhancing the utility of the NCVS.

[1] A comparison of trends in urban, suburban, and rural victimization was the focus of a BJS report issued in 2000 (Duhart, 2000), but this specific analysis has not been replicated since that time.

The NCVS is largely designed and conducted for BJS by the Census Bureau. Complex survey contracts cannot be wisely administered without highly sophisticated statistical and methodological expertise. Federal statistical agencies that successfully contract out their data collection (either to the Census Bureau or a private contractor) generally have mathematical statisticians and survey methodologists who direct, coordinate, and oversee the activities of the contractor. While many of the BJS staff are labeled "statisticians," the panel observed a lack of the statistical expertise that is crucial in dealing with the trade-offs of costs, sample size, numbers of primary sampling units, interviewer training, questionnaire length, use of bounding interviews, etc. The expressions of displeasure about the Census Bureau's management of the NCVS were not matched by BJS statistical analyses and simulations of design alternatives that might offer better outcomes for the agency. Furthermore, the panel thinks that the number of BJS full-time staff dedicated to the analysis of NCVS data and the generation of reports is insufficient to exploit the full value of the survey and to navigate its challenging future. Some of the issues that require analysis (e.g., the effects of declining response rates on estimates, trade-offs of waves and questionnaire length) need statistical and methodological expertise that goes beyond current in-house capabilities.

Following the lead of other federal statistical agencies, BJS could usefully enhance statistical expertise on its staff with a program of outside research funds. When federal agencies form useful partnerships with academic researchers, they can reduce their overall costs of innovation. BJS has a track record of small research grants connected to the NCVS. The panel applauds these and urges an expansion to tackle the real methodological issues facing the NCVS.

Recommendation 5.2: BJS should perform additional and advanced analysis of NCVS data. To do so, BJS should expand its capacity in the number and training of personnel and the ability to let contracts.

One reason that the panel thinks that technical staffing and external research are important is that many of the questions posed about the NCVS have not been evaluated sufficiently for us to provide recommendations to BJS on the final design of the survey. The panel thinks that this is the long-term result of "eating its seed corn," of using the operating budget too much to release the traditional reports and too little to scope out the problems of the future. It was well known 15 years ago that household survey response rates were falling; the impact on survey costs of these falling rates was clear (de Leeuw and de Heer, 2002).
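To make the flavor of such design trade-off analysis concrete, consider a purely illustrative, textbook-style approximation of the balance between the number of primary sampling units (PSUs) and the number of interviews taken within each. This sketch is ours, not the report's, and the numbers are hypothetical rather than actual NCVS parameters:

\[
\mathrm{deff} \approx 1 + (\bar{m} - 1)\,\rho,
\qquad
n_{\mathrm{eff}} = \frac{n}{\mathrm{deff}},
\]

where $n$ is the total number of completed interviews, $\bar{m}$ is the average number of interviews per PSU, and $\rho$ is the intracluster correlation of the victimization measure. With illustrative values $n = 76{,}000$, $\bar{m} = 50$, and $\rho = 0.02$, the design effect is about 1.98 and the effective sample size about 38,400; cutting $\bar{m}$ to 25 while doubling the number of PSUs raises the effective sample size to about 51,400 for the same $n$, but at a higher per-interview field cost. Weighing this kind of precision gain against travel and training costs is the sort of analysis and simulation of design alternatives the panel has in mind.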

Federal statistical agencies (see CNSTAT's Principles and Practices for a Federal Statistical Agency) must consistently probe and analyze their own data, beyond the level required for descriptive reports, in order to see their weaknesses and their strengths. Only with such detailed knowledge can wise decisions about cost and error trade-offs be made.

Recommendation 5.3: BJS should undertake research to continuously evaluate and improve the quality of NCVS estimates.

Another way that federal statistical agencies improve their data series is by nurturing a wide community of secondary analysts, using as much data as can be released within confidentiality constraints. Such analysts form a ready-made, informed constituency for improving data products over time and act as a multiplier of the impact of federal data series. Using the Internet, some agencies have expanded their impact by making available various "predigested" forms of survey data: tables, spreadsheets, graphing capabilities, and the like. The panel thinks that BJS should consider such capabilities linked to the NCVS website. These might be time series of individual population rates and means in spreadsheet form, attractive to a very broad audience, as well as microdata files predesigned to include commonly desired analytic variables for popular observation units.

Recommendation 5.4: BJS should continue to improve the availability of NCVS data and estimates in ways that facilitate user access.

BJS and the Census Bureau must keep their pledges of confidentiality to NCVS respondents. They also have the obligation to maximize the good statistical uses of the data collected with taxpayer money. Geographically identified NCVS data were available to qualified researchers from approximately 1998 to 2002 at the Census Bureau's research data centers (Wiersema, 1999); however, access was subsequently suspended because the data did not conform to technical conditions for research access and oversight. A project to reestablish the availability of these data, by documenting and formatting internal Census Bureau data files so that they conform to Census Bureau standards, began in 2005 and should be completed by the time of this report. As soon as such work is completed, these data should be made available to qualified researchers. Access to geographically identified NCVS data would permit analyses of how local characteristics and policies are associated with victimization risk and its consequences.

Recommendation 5.5: The Census Bureau and BJS should ensure that geographically identified NCVS data are available to qualified researchers through the Census Bureau's research data centers, in a manner that ensures proper privacy protection.
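Returning to the data products discussed under Recommendation 5.4, the short sketch below is a concrete, entirely hypothetical illustration of a "predigested" time-series extract; the column names, weights, and values are invented for illustration and are not actual NCVS variables.

import pandas as pd

# Toy microdata extract: one row per person-interview.
# Column names and values are hypothetical, not actual NCVS variables or weights.
microdata = pd.DataFrame({
    "year":       [2005, 2005, 2005, 2006, 2006, 2006],
    "weight":     [1450.0, 1620.0, 1380.0, 1500.0, 1410.0, 1575.0],
    "victimized": [0, 1, 0, 0, 0, 1],   # 1 = reported a violent victimization
})

# Weighted victimization rate per 1,000 persons, by year:
# sum of weights of victims divided by sum of all weights.
microdata["weighted_victim"] = microdata["weight"] * microdata["victimized"]
by_year = microdata.groupby("year")[["weighted_victim", "weight"]].sum()
trend = (1000.0 * by_year["weighted_victim"] / by_year["weight"]).rename(
    "violent_victimization_rate_per_1000"
).reset_index()

# A spreadsheet-ready file of the sort a general audience could download.
trend.to_csv("victimization_trend.csv", index=False)
print(trend)

A file of this kind, posted alongside the regular NCVS reports, is the sort of spreadsheet-ready product that a broad audience could use without ever touching the microdata.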

At this writing, the U.S. statistical budget has been relatively flat for some years (except for the advent of the American Community Survey budget). These flat-line budgets have occurred at the same time that the difficulty and costs of measuring U.S. society have increased. In a climate of tight budgets and increasing costs of demographic measurement, federal statistical agencies face real threats. Such are the times that need real statistical leadership and careful stewardship of the statistical information infrastructure of the country. We fear that many surveys, the NCVS among them, can easily die "deaths from a thousand cuts." Attempts to live within the budgets lead to short-term cuts in features of surveys without certain knowledge of their effects on survey quality. Each such decision runs the risk that the country will be misled due to increased errors in data products. At some point, the basic goals of a survey cannot be met under restricted funding. The country deserves to know when this is occurring.

The panel thinks that one opportunity for such communication comes in the annual report on statistical program funding that the U.S. Office of Management and Budget is required to prepare by a provision of the Paperwork Reduction Act of 1995 (44 U.S.C. 3504(e)(2)). This annual report—Statistical Programs of the United States Government—has been published for each fiscal year since 1997. The report can serve as a vehicle for alerting the executive and legislative branches to how the budget has affected the quality of statistical programs, both to the good and to the bad. With specific regard to BJS, the annual reports have generally documented the agency's responses to declining budgets. For instance, the reports for fiscal years 2007 and 2008 bore a similar warning (U.S. Office of Management and Budget, 2006c:8):

    BJS did not receive the funding requested to restore its base funding necessary to meet the growing costs of data collection and the information demands of policymakers and the criminal justice community. To address base adjustments insufficient to carry out ongoing operations of its National Crime Victimization Survey (NCVS) and other national collection programs, BJS has utilized many strategies, such as cutting sample, to keep costs within available spending levels. However, changes to the NCVS have had significant effects on the precision of the estimates—year-to-year change estimates are no longer feasible and have been replaced with two-year rolling averages.
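The precision arithmetic behind that warning can be sketched with a simple calculation of our own (not the report's). Suppose each annual rate estimate $y_t$ has sampling variance $V$ and, for simplicity, ignore the positive correlation between adjacent years induced by the rotating panel design:

\[
\operatorname{Var}\!\left(\tfrac{1}{2}(y_t + y_{t-1})\right) \approx \tfrac{1}{2}V,
\qquad
\operatorname{Var}(y_t - y_{t-1}) \approx 2V.
\]

Averaging two adjacent years roughly halves the variance of the published level, while differencing them roughly doubles it. When sample cuts inflate $V$, year-to-year change estimates are the first casualty, and two-year rolling averages recover precision only at the price of smoothing over real annual change.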

The guidance provided by these annual reports could be enhanced through fuller explication of the impact of budget reductions (or increases) on the precision of estimates, as well as articulation of constraints and effects on federal statistical surveys systemwide. An example of the latter is the Census Bureau's sample redesign process: following the decennial census, the Census Bureau realigns the sample frames for the various demographic surveys that it conducts (including the NCVS) so that the household samples are updated and coordinated across the various data collection programs. This work is done in collaboration with the agencies that sponsor Census Bureau–conducted surveys; "the portion of the sample redesign work that can be linked to a specific survey is funded by the sponsoring agency as part of the reimbursable cost of the survey," while portions that are not directly identified with a specific survey are funded by the Census Bureau. "Thus, the approach combines central funding with user fees for survey specific redesign activities" (U.S. Office of Management and Budget, 2000:45–46). Although the sample redesign process has been routinely mentioned as an ongoing, cross-cutting activity in Statistical Programs of the United States Government, little detail on the progress (and consequences) of the effort was provided in the annual reports from 2001 to 2007. Ultimately, conversion from a sample derived from the 1990 census to one using the 2000 numbers was not fully achieved for the NCVS until 2007; the redesign work was originally planned to be complete in fiscal year 2004.[2] We recommend that the annual report provide additional discussion—and warning—of budget-related effects on basic survey maintenance when appropriate.

[2] The new sample was phased in panel by panel. One panel of addresses based on the 2000 census was introduced in January 2005 for areas already included in the sample. "Beginning in January 2006, [the Census Bureau] introduced sample based on the 2000 decennial census in new areas. The phase-in of the 2000 sample and the phase-out of the 1990 sample will be complete in January 2008" (Demographic Surveys Division, U.S. Census Bureau, 2007b).

Recommendation 5.6: The Statistical Policy Office of the U.S. Office of Management and Budget is uniquely positioned to identify instances in which statistical agencies have been unable to perform basic sample or survey maintenance functions. For example, BJS was unable to update the NCVS household sample to reflect population and household shifts identified in the 2000 census until 2007. The Statistical Policy Office should note such breakdowns in basic survey maintenance functions in its annual report Statistical Programs of the United States Government.

5–B DATA COLLECTION AGENT FOR THE NCVS

A review of any survey, particularly one conducted with an eye toward reducing costs, must inevitably consider the question of who collects the data (in addition to exactly how the data are collected). In the case of the NCVS, the U.S. Census Bureau of the U.S. Department of Commerce has been engaged as data collection agent since the survey's inception. In fact, as described in Box 1-1, the Census Bureau was heavily involved in the prehistory of the survey, entering into discussions with BJS's predecessor in the late 1960s and convening planning conferences that would give shape to the NCVS and its pretests.

Since "it was clear from the pilot studies that large samples would be required to obtain reliable estimates of victimization for crime classes of intense interest (e.g., rape)," "the Census Bureau was the only organization that could field such a large survey" and hence was the natural choice as the data collection agent for the new NCVS (Cantor and Lynch, 2000:105).

The choice of the Census Bureau as the data collector for the NCVS had implications for the survey's design, as summarized by Cantor and Lynch (2000:107):

    Other design features of NCS were occasioned by the need to fit into the organization of the Census Bureau and the Current Population Survey (CPS). CPS is the largest intercensal survey conducted in the world and, at the time, NCS was to be the second largest of these surveys. Sharing interviewers between the two surveys would mean great efficiencies for the [Census Bureau]. CPS employed a rotating panel design. This was viewed as an advantage to NCS for a number of reasons. One was the ability to use prior interviews to 'bound' subsequent interviews. . . . A second was that the rotating panel design substantially increased the precision of the year-to-year change estimates. The panel design feature produces a natural positive correlation across annual estimates. This, in turn, substantially reduces the standard error on change estimates.

As may be expected, the experience of decades of work has illustrated both advantages and disadvantages of the relationship between BJS as sponsor and funder of the NCVS and the Census Bureau as its data collector. Relatively few of the conceptual pros and cons are unique to the BJS-Census relationship; rather, they are generally applicable to any contractor and client.

Others, however, in the panel's view deserve comment. A basic concern that has arisen about the Census Bureau as the data collection agent for the NCVS is the lack of transparency in costs. Historically, the Census Bureau has not provided its federal agency survey sponsors with detailed breakdowns of survey costs (and rationales for changes in costs, over and above the known increasing costs of gaining compliance in survey research). It is the panel's view that disaggregated costs are key to effective innovation in large-scale surveys. The data collector must know what survey design choices are associated with the largest portions of costs in order to effectively consider trade-offs of costs and errors. Recent attention to survey costs (e.g., at conferences hosted by the Federal Committee on Statistical Methodology and the National Institute of Statistical Sciences) has shown the value of detailed cost accounting.[3]

[3] See http://www.fcsm.gov/events/program/2006FCSMFinalprogram.pdf (see the session on "modeling survey costs"); Karr and Last (2006).
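As a rough sketch of what such detailed cost accounting might look like, the snippet below rolls up cost records keyed to design components. This is our illustration, with hypothetical categories and figures; it does not describe any existing Census Bureau system.

# Hypothetical, illustrative cost records for one NCVS collection month.
# Component names and dollar amounts are invented for this sketch; a real cost
# information system would define its own categories and feed from actual records.
cost_records = [
    {"component": "in-person first interviews",  "cases": 4200, "cost": 630000.0},
    {"component": "centralized CATI interviews", "cases": 9800, "cost": 392000.0},
    {"component": "field travel",                "cases": 4200, "cost": 189000.0},
    {"component": "interviewer training",        "cases": 0,    "cost": 85000.0},
    {"component": "nonresponse follow-up",       "cases": 1300, "cost": 117000.0},
]

grand_total = sum(rec["cost"] for rec in cost_records)

# Share of total cost and unit cost per case, by design component.
for rec in cost_records:
    share = 100.0 * rec["cost"] / grand_total
    unit = rec["cost"] / rec["cases"] if rec["cases"] else None
    unit_text = f"${unit:,.2f} per case" if unit is not None else "fixed cost"
    print(f'{rec["component"]:<28} {share:5.1f}% of total  {unit_text}')

With figures like these broken out by design feature, a sponsor could see directly how much of the budget rides on, say, the in-person bounding interview versus centralized telephone work, which is exactly the kind of cost-error trade-off discussion the next recommendation is meant to enable.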

Recommendation 5.7: Because BJS is currently receiving inadequate information about the costs of the NCVS, the Census Bureau should establish a data-based, data-driven survey cost and information system.

Some features of the NCVS are not shared by other designs and lack a strong evidentiary base for their choice, which leads the panel to wonder why the Census Bureau and BJS have chosen them. These include the recycling of cases from the field to centralized computer-assisted telephone interviewing (CATI), instead of using a dispersed field interviewing corps for the telephone interviews. They include the slowness of moving from paper questionnaires to computer-assisted personal interviewing (CAPI). They include the failure to study the use of audio computer-assisted self-interviewing for many of the sensitive topics in the survey, despite its widespread use in other federal surveys (e.g., the National Survey on Drug Use and Health and the National Survey of Family Growth, as well as BJS-sponsored data collections required by the Prison Rape Elimination Act). They include the lack of study of how best to use the bounding interview in estimation. Finally, the panel notes that there is very little substantive expertise in criminology and justice programs among the Census Bureau staff working on the NCVS. That means that the Census Bureau focuses on field and statistical issues without the advantage of formal educational background in the substance of the NCVS. Just as the BJS staff would be stronger with more technical and statistical expertise, the panel thinks that the Census Bureau could mount a better NCVS, and partner more effectively with BJS, with more substantive expertise.

That said, it must be noted with equal force that there are important advantages to the use of the Census Bureau as data collector. Census Bureau household surveys, by and large, achieve higher response rates than comparable surveys conducted by a private contractor on behalf of the federal government. It is common throughout the world for central government statistical agencies to achieve higher response rates than private-sector survey organizations (Groves and Couper, 1998). The Census Bureau has maintained a strong confidentiality pledge through the force of the Title 13 law, although under the widened protection of the Confidential Information Protection and Statistical Efficiency Act of 2002, it is not clear that that advantage will be maintained. Furthermore, interagency agreements within the federal government appear to be simpler and less burdened by regulation than federal contracts. Finally—in the event that a radical option for collecting victimization data were necessary—continued partnership with the Census Bureau could offer the benefit of more readily piggybacking some victimization measures on one of the Census Bureau's ongoing surveys (e.g., the American Community Survey or the Current Population Survey; see Section 4–B.1).

BJS has sought input regarding contracting out the NCVS to the private sector. We urge careful consideration of survey cost structures prior to such a move. The panel notes that this review would be greatly facilitated if BJS could obtain disaggregated costs from the Census Bureau for the current NCVS. BJS should study other federal surveys contracted out to the private sector to determine the extent to which flexibility in dealing with changes and innovations was or was not realized. It should also study the implications of contracting out for the desired staff skills within BJS.

One way to increase understanding of the trade-offs of different NCVS designs and different contracting models is to seek formal design alternatives from the Census Bureau and others. A formal design competition could be mounted, perhaps through a set of commissioned designs from both the Census Bureau and other survey methodologists. The designs would be guided by the same goals, articulated by BJS, but would be left to the creativity of the designers. The design options should be costed out in as much detail as possible, and the designs should be critiqued through peer review.

Recommendation 5.8: BJS should consider a survey design competition in order to get a more accurate reading of the feasibility of alternative NCVS redesigns. The design competition should be administered with the assistance of external experts, and the competition should include private organizations under contract and the Census Bureau under an interagency agreement.
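A competition of this kind ultimately requires putting submissions on a common yardstick. The sketch below is ours, not the report's: it uses invented figures and a deliberately crude metric (annual cost per effective completed interview) simply to show how costed-out design options might be tabulated side by side.

# Hypothetical design submissions; every figure below is invented for illustration.
# "deff" is an anticipated design effect; effective interviews = completes / deff.
designs = [
    {"name": "Design A: current structure",     "annual_cost": 30_000_000, "completes": 76_000, "deff": 2.0},
    {"name": "Design B: fewer PSUs, more CATI",  "annual_cost": 24_000_000, "completes": 70_000, "deff": 2.4},
    {"name": "Design C: contractor, mixed mode", "annual_cost": 22_000_000, "completes": 60_000, "deff": 1.8},
]

for d in designs:
    effective_interviews = d["completes"] / d["deff"]
    cost_per_effective = d["annual_cost"] / effective_interviews
    print(f'{d["name"]:<34} ${cost_per_effective:,.0f} per effective interview')

A real evaluation would also weigh nonresponse and measurement error, continuity of the existing time series, and the contracting considerations discussed above; the point is only that commissioned designs can be compared on an explicit footing once disaggregated costs are in hand.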


It is easy to underestimate how little was known about crimes and victims before the findings of the National Crime Victimization Survey (NCVS) became common wisdom. In the late 1960s, knowledge of crimes and their victims came largely from reports filed by local police agencies as part of the Federal Bureau of Investigation's (FBI) Uniform Crime Reporting (UCR) system, as well as from studies of the files held by individual police departments. Criminologists understood that there existed a "dark figure" of crime consisting of events not reported to the police. However, over the course of the last decade, the effectiveness of the NCVS has been undermined by the demands of conducting an increasingly expensive survey in an effectively flat-line budgetary environment.

Surveying Victims: Options for Conducting the National Crime Victimization Survey reviews the programs of the Bureau of Justice Statistics (BJS). Specifically, it explores alternative options for conducting the NCVS, which is the largest BJS program. This book describes various design possibilities and their implications relative to three basic goals: flexibility, in terms of both content and analysis; utility for gathering information on crimes that are not well reported to police; and small-domain estimation, including providing information on states or localities.

This book finds that, as currently configured and funded, the NCVS is not achieving and cannot achieve BJS's mandated goal to "collect and analyze data that will serve as a continuous indication of the incidence and attributes of crime." Accordingly, Surveying Victims recommends that BJS be afforded the budgetary resources necessary to generate accurate measures of victimization.
