reasons to suspect there is bias due to nonresponse and attrition: factors that make it difficult to interview and reinterview respondents (in a longitudinal setting) are likely to be correlated with their outcomes. For example, a lack of stable residence may indicate financial trouble for a family and may also make it more difficult to locate a survey sample member. Homeless persons are rarely included in survey sample frames, as we noted earlier, and longitudinal survey respondents who become homeless are generally lost to future reinterview attempts.

Weighting and imputation procedures can potentially reduce nonresponse biases, although they can rarely eliminate them (see Kalton and Kasprzyk, 1986; Little and Rubin, 1987). With specific attention to surveys of low-income populations, Groves and Couper (2001) discuss survey design considerations for reducing nonresponse and for making nonresponse adjustments, and Mohadjer and Choudhry (2001) provide more detail on weighting adjustment procedures. Incentive payments to encourage sample members to respond have also been effective in increasing response rates. Initial evidence from a small number of experiments further suggests that incentive payments may be particularly effective with low-income populations (Singer and Kulka, 2001). There has been some movement toward using incentive payments for the SIPP and SPD.
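The logic of the weighting adjustments mentioned above can be illustrated with a minimal weighting-class sketch. All class labels and weights below are hypothetical, not drawn from any of the surveys discussed: respondents' base weights are inflated by the inverse of the weighted response rate within their class, so that respondents stand in for nonrespondents with similar characteristics.

```python
from collections import defaultdict

# Each sample member: (weighting class, base weight, responded?).
# Classes and weights are illustrative only.
sample = [
    ("low_income", 100.0, True),
    ("low_income", 100.0, False),
    ("low_income", 120.0, True),
    ("other", 80.0, True),
    ("other", 80.0, True),
    ("other", 90.0, False),
]

totals = defaultdict(float)       # total base weight per class
resp_totals = defaultdict(float)  # responding base weight per class
for cls, w, responded in sample:
    totals[cls] += w
    if responded:
        resp_totals[cls] += w

# Adjustment factor = total weight / responding weight within each class
factors = {cls: totals[cls] / resp_totals[cls] for cls in totals}

# Inflate respondents' weights; nonrespondents are dropped
adjusted = [(cls, w * factors[cls])
            for cls, w, responded in sample if responded]

# Adjusted respondent weights reproduce the full-sample weight by class
for cls in totals:
    class_sum = sum(w for c, w in adjusted if c == cls)
    print(cls, round(class_sum, 1), round(totals[cls], 1))
```

Note that this removes bias only to the extent that respondents and nonrespondents within a class resemble each other, which is why it reduces rather than eliminates nonresponse bias.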

Response rates in the key national-level surveys vary considerably. CPS response rates are around 94–95 percent each month, although the response rates for the March CPS Supplement are a little lower.4 The ACS is still undergoing field tests and response rates are not available. However, in a 1996 test in four sites, the weighted response rates for the ACS were about 95 percent. The NSAF, which oversamples low-income households, had an overall response rate of 70 percent for the 1997 round and about 64 percent for the 1999 round (Safir, Scheuren, and Wang, 2001).

For the longitudinal surveys, nonresponse and attrition over multiple waves are a significant threat to data quality. For the SIPP and SPD, response rates in the initial waves were high (between 91 and 95 percent for the first waves of the 1984–1996 SIPP panels, and 91 percent for the first wave of the SPD, which drew on the 1992 and 1993 SIPP panels), but many first-wave respondents in both surveys have not been reinterviewed. By the eighth wave, cumulative nonresponse rates were 21 to 22 percent for the 1984–1991 panels, 25 percent for the 1992 and 1993 panels, and 31 percent for the 1996 panel. This attrition appears to result from refusals rather than from an inability to track sample members (U.S. Census Bureau, 1998). The SPD sample comprises the 1992 and 1993 SIPP panels. The first SPD survey, the “bridge survey” in 1997, attempted
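A rough sketch of how per-wave losses compound into the cumulative figures cited above (the retention rate here is hypothetical, chosen only so that the wave-8 result lands near the low-20s percent range reported for the earlier SIPP panels):

```python
# Cumulative response after w waves, assuming a constant fraction of
# prior-wave respondents is reinterviewed each wave. Both rates below
# are illustrative assumptions, not published SIPP figures.
initial_response = 0.95    # wave-1 response rate (assumed)
per_wave_retention = 0.97  # share of prior-wave respondents retained (assumed)

cumulative = initial_response
for wave in range(2, 9):
    cumulative *= per_wave_retention
    print(f"wave {wave}: cumulative response {cumulative:.3f}, "
          f"cumulative nonresponse {1 - cumulative:.3f}")
```

Even a modest 3 percent loss per wave compounds to roughly 23 percent cumulative nonresponse by wave 8 under these assumptions, which is why small per-wave attrition rates matter so much in long panels.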


For example, in the 2000 CPS March Supplement, the response rate for the basic monthly labor survey was just over 93 percent, but 8 percent of the basic sample did not respond to the supplement and so the total response rate was 86 percent (U.S. Census Bureau and Bureau of Labor Statistics, 2000).
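The footnote's figures are roughly consistent if the supplement loss is treated as applying on top of response to the basic survey; a small sketch under that reading:

```python
# Total response to the March Supplement under the reading that the
# 8 percent supplement loss applies among basic-survey respondents.
basic_rate = 0.93                   # "just over 93 percent"
supplement_given_basic = 1 - 0.08   # 8 percent did not respond to the supplement
total_rate = basic_rate * supplement_given_basic
print(f"{total_rate:.2%}")          # about 86 percent
```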

The National Academies of Sciences, Engineering, and Medicine
500 Fifth St. N.W. | Washington, D.C. 20001

Copyright © National Academy of Sciences. All rights reserved.