Reengineering the Survey of Income and Program Participation

1
Introduction

The Survey of Income and Program Participation (SIPP) is a continuing program of the U.S. Census Bureau, which began interviewing households for the survey in late 1983 and is planning to introduce a major redesign and reengineering of the survey beginning in 2013. Under its current design, in which members of sampled households (panels) are interviewed every 4 months for 3 or 4 years, SIPP provides vital information for the planning, evaluation, and improvement of government programs intended to address the social and economic needs of the U.S. population. Uniquely among surveys, SIPP not only provides detailed information on incomes by source for a representative sample of U.S. households, but also tracks changes in program eligibility and participation for the members of those households as their incomes and other circumstances change. Understanding these changes is essential for government social welfare program planning and evaluation. To make the survey more cost-effective while improving, to the extent possible, the quality and timeliness of the data, the Census Bureau began a research and development program in 2006 to assess new ways to collect, process, and disseminate the data. As one component of its program to reengineer SIPP, the Census Bureau requested that the Committee on National Statistics of the National Academies convene a panel to study technical issues of using administrative records as part of SIPP. The panel was charged to consider the advantages and disadvantages of strategies for linking administrative records and survey data, taking account of the accessibility of relevant administrative records, the operational feasibility of linking, the quality and usefulness of the linked data, and the ability to provide access to
the linked data while protecting the confidentiality of individual respondents. The panel was also charged to consider alternative uses of administrative records for a reengineered SIPP that do not require actual data linking (for example, to evaluate SIPP data quality). In addition, the panel could consider aspects of the reengineered SIPP survey with regard to interview periodicity, mode of data collection, and sample source and size.

SIPP IN BRIEF

Before SIPP was initiated in the early 1980s, government experts and scholars agreed that better data on incomes and program participation were needed in order to assess and redesign social programs (see National Research Council, 1993:26-28). The major source of such data, the Current Population Survey (CPS), provided only limited information on family incomes and participation in government programs. This information was inadequate, not only because the CPS income reporting period (the previous calendar year) did not match the income reporting period for programs (the previous month in many instances), but also because the data did not allow researchers to track individuals and families over time. Experience in administering such programs as unemployment insurance and food stamps indicated that at least some program participants faced frequent changes in employment, earnings, and income, and that these changes were often associated with changes in program eligibility and participation that were important to understand. A new survey that followed the same individuals over time, recording as many of these changes in income and program participation as possible, was therefore needed. To fill this gap, the Office of the Assistant Secretary for Planning and Evaluation and the Social Security Administration in what was then the U.S.
Department of Health, Education, and Welfare worked with the Census Bureau and outside researchers over a period of years to conceptualize, design, and field test survey questions and methods for a new survey. SIPP was the result. Interviews for the first SIPP panel of households began in fall 1983, and, with a few exceptions, a SIPP panel has been in the field every year since then. Each panel consists of the members of a representative sample of households (ranging in size from 12,000 to 51,000 households at the start of a panel), who are interviewed every 4 months about their income, employment, family relationships, and program participation for each of the 4 months preceding the interview. Most panels have continued for 2-4 years. In the early years of the survey, SIPP interviewers conducted in-person interviews of sample members using paper-and-pencil questionnaires. At present, SIPP interviewers use computer-assisted personal interviewing (CAPI) for the first two interview waves and computer-assisted telephone interviewing (CATI) for all subsequent waves.
Originally, a new SIPP panel began every year; under a redesign introduced in 1996, a new panel begins every 3 or 4 years, following the conclusion of the previous panel. The selection of the sample for each panel is a complex procedure that results in a probability sample of the U.S. population (excluding only inmates of institutions and armed forces members living on base without their families), with oversampling of low-income households based on their census characteristics. The sample includes cases in every state and the District of Columbia, although SIPP currently can support reliable state-level estimates for only 14 states. In addition to the core data asked in every interview wave, SIPP includes topical modules, which are sets of questions asked one or more times of each panel on a wide range of subjects. Responses to topical module questions on child care arrangements; child well-being; marital, fertility, and employment history; pension rights; asset holdings; and other subjects broaden and deepen the analyses that can be conducted with SIPP data on important social and economic welfare issues of public policy concern.

SIPP’S UNIQUE CONTRIBUTION

The unique feature of SIPP is its capacity to measure dynamics in the short run. Monthly data on the incomes and demographic characteristics of households allow analysts to study intrayear transitions in marital status, poverty, employment, health insurance coverage, and eligibility for and participation in a wide range of government programs. These kinds of analyses are not possible with other nationally representative data sets, which require respondents to recall income amounts, program participation, and other characteristics for an entire year, not just 4 months as in SIPP.
The monthly time frame is critical given that eligibility for many public programs is assessed on a monthly basis and that people may have short spells of both program eligibility and participation. While administrative data can be used to look at dynamic patterns of participation in a single program, only SIPP, which includes both participants and nonparticipants in a wide range of programs, can be used to examine dynamic patterns of eligibility—and of participation contingent on eligibility—in single and multiple programs at the same or different time periods. For example, in one of the first such analyses, considering Social Security, Supplemental Security Income, public assistance, and food stamps, Doyle and Long (1988) estimated that 17 percent of people in the first month of the 1984 SIPP panel participated in a single program, and another 6 percent participated in more than one of these programs. Moreover, during the next 11 months, about 6 percent of the initial program recipients experienced at least one transition to a different program combination or ended their participation.
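The kind of analysis Doyle and Long performed can be sketched in code: classify each person's program combination in the first month, then flag anyone whose combination changes in later months. This is a minimal illustration, not actual SIPP or Census Bureau code; the person records and program names below are made up for the example.

```python
# Each person's record: one frozenset of program names per month.
# Three toy people over three months (illustrative data, not SIPP).
panel = {
    "p1": [frozenset({"SSI"}), frozenset({"SSI"}), frozenset()],            # exits SSI
    "p2": [frozenset({"OASDI", "SNAP"})] * 3,                               # stable, multiple programs
    "p3": [frozenset(), frozenset({"SNAP"}), frozenset({"SNAP", "AFDC"})],  # enters, then adds a program
}

def first_month_counts(panel):
    """Count people in exactly one vs. more than one program in month 1."""
    single = sum(1 for months in panel.values() if len(months[0]) == 1)
    multiple = sum(1 for months in panel.values() if len(months[0]) > 1)
    return single, multiple

def had_transition(months):
    """True if the person's program combination changed in any later month."""
    return any(months[i] != months[i - 1] for i in range(1, len(months)))

single, multiple = first_month_counts(panel)
movers = sum(had_transition(months) for months in panel.values())
print(single, multiple, movers)  # 1 1 2
```

Scaled up to real monthly panel data, the same logic yields the percentages reported above: the share of people in one or several programs at a point in time, and the share of initial recipients who later change program combinations.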
Choosing the right time interval for a specific policy analysis of program participation and eligibility can have important consequences. The free and reduced-price school lunch program offers an example. A report of the Food and Nutrition Service of the U.S. Department of Agriculture (USDA) suggested that the number of children certified for free meals in 1999 was 27 percent greater than the number who appeared to be eligible, indicating extensive “overcertification” in the school meals program (Food and Nutrition Service, 1999). Results like this contributed to the Improper Payments Information Act of 2002, which requires various federal agencies to identify and reduce erroneous payments in their programs. The USDA is one of the agencies the act targeted. The CPS Annual Social and Economic Supplement, the source of the data used for the Food and Nutrition Service’s overcertification results, collects only annual data on income. Annual income necessarily smooths month-to-month variation in income, yet it is monthly income that statutorily determines eligibility. Parents or guardians self-report household income for the calendar month prior to the application for free or reduced-price school meals. For the children to be eligible for free school lunches, household income must be equal to or less than 130 percent of the poverty line, or the household must receive food stamps or Temporary Assistance for Needy Families benefits. Using data from SIPP, which allowed them to calculate eligibility based on information that mirrors the statutes governing program eligibility, Dahl and Scholz (2005) report that participation in the school lunch program (as a fraction of eligible children) is 77 percent, far lower than USDA’s CPS-based estimate of 127 percent. The Dahl and Scholz estimate covers free meals for the period from 1993 through 2003.
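The eligibility rule described above, and the way annual averaging can hide months of eligibility, can be made concrete with a small sketch. This is illustrative code only, not USDA or Census Bureau methodology; the poverty line and income figures are invented for the example.

```python
# Hypothetical monthly poverty line for a household of this size.
POVERTY_LINE = 2000.0

def eligible_free_lunch(monthly_income, on_snap=False, on_tanf=False):
    """Free-lunch eligibility in a month: income at or below 130 percent of
    the poverty line, or receipt of food stamps (SNAP) or TANF benefits."""
    return on_snap or on_tanf or monthly_income <= 1.30 * POVERTY_LINE

# A household whose income fluctuates month to month (made-up figures):
incomes = [1800, 3500, 3500, 1900, 3500, 3500, 3500, 3500, 3500, 3500, 3500, 3500]

monthly_eligible = [eligible_free_lunch(m) for m in incomes]
annual_eligible = eligible_free_lunch(sum(incomes) / 12)

print(sum(monthly_eligible))  # 2 months of eligibility during the year
print(annual_eligible)        # False: the annual average exceeds the threshold
```

Here the household qualifies in two low-income months, but its annual average income is above 130 percent of the poverty line, so an annual-income measure like the CPS supplement's would classify it as never eligible. Monthly data like SIPP's mirror the statute; annual data do not.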
The differences between the two studies’ results are large, and estimates of program cost, take-up (i.e., the percentage of eligible people who apply for and receive program benefits), and the consequences of altering program rules simply cannot be made without accurate monthly data on the eligible population. The fact that the SIPP monthly data are obtained from a panel survey, in which sample members are followed over time, rather than from a cross-sectional survey collecting retrospective monthly information, is important not only for longitudinal uses of the data, but also for cross-sectional, point-in-time analyses. Households and families are dynamic, experiencing such events as the birth of a child or the loss of a parent through death or divorce, and these changes affect the income and other resources available to household members during a year or other time period. SIPP can capture these kinds of changes, which cross-sectional surveys cannot. Research on poverty has shown the importance of the panel feature of SIPP for cross-sectional analyses. Annual poverty rates estimated from SIPP are consistently lower than those from the CPS (see, for example, Lamas, Tin, and Eargle, 1994). These differences are due to several factors, including that
income reporting for the low-income population is more complete in SIPP, with its 4-month interviews, than in the annual CPS retrospective interviews. Another factor is that the CPS poverty rates are based on the characteristics of a family at a point in time and do not capture the income that may have been available to family members during the year from people who were in the family for only part of the year.

REENGINEERING

Despite a record of providing invaluable data for important research and policy studies, SIPP has experienced many ups and downs over its 25 years of existence (see Chapter 2). Periodically, budget cuts have necessitated reductions in sample size and the length of panels. Some problems have plagued the survey from its inception, such as late delivery of data files to users, complex file structures, and inadequate documentation, all of which make it difficult for users to work with the data. In addition, growing rates of attrition of sample members over the life of a panel (both at the first interview wave and in subsequent waves), underreporting of program receipt and benefit amounts when compared with administrative records, and other factors have led to concerns about the quality of the data. The Census Bureau recognized the need to reengineer the outmoded SIPP data processing system, which contributed to delays in data release, and to address other problems, but the budget climate was not favorable to making the needed investment in the survey. In January 2006, when required by the Office of Management and Budget to absorb a significant budget cut, the bureau decided to discontinue SIPP. Congress, however, encouraged by an outpouring of support for the survey from data users (see Chapter 2), appropriated funds not only to continue SIPP in its current form, but also to reengineer the survey to be more timely and cost-effective in the future.
The Census Bureau is well along on its reengineering agenda, which includes testing an event history calendar approach to collecting the core data. This approach may permit interviews to be scaled back from three per year to just one. Another aspect of the reengineering agenda is the search for cost-effective uses of administrative records, such as federal and state tax and transfer program records, to assess the quality of responses to questions on the SIPP interview or to supplement the survey with additional information.

ORGANIZATION OF THE REPORT

This report, with its conclusions and recommendations, is organized into four chapters and two appendixes. Chapter 2 fleshes out the history of SIPP from the early days of its conceptualization through the present period of
redesign. The chapter describes the strengths of SIPP and the challenges it presents to users in terms of data quality, timeliness, and complexity of data files. Chapter 3 discusses the possible roles for federal and state administrative records in a reengineered SIPP, which include their use to evaluate the quality of survey reports, improve imputations used to provide values for missing responses, correct survey responses for misreporting, and replace survey questions. The chapter considers the costs and benefits of each major use of administrative records and both short-term and longer-term goals for making the best use of records for a reengineered SIPP. An important consideration in expanding the role of administrative records in SIPP is the consequences for access to microdata for research and policy analysis while protecting the confidentiality of individual responses. The extent to which confidentiality protection becomes more difficult than under the current design depends heavily on the specific roles that are identified for administrative records in a new SIPP design. Chapter 4 discusses proposed innovations in design and data collection for SIPP that may interact with proposed uses of administrative records. The chapter focuses on the planned use of an event history calendar to collect intrayear data of high quality at less frequent intervals than under the current design. It identifies potential strengths and weaknesses of that approach in comparison with the current SIPP design and outlines a comprehensive set of evaluations for understanding the consequences of adopting an event history calendar approach. It also addresses related considerations of the length and number of interviews and the length and overlap of panels, along with issues of data content, timeliness, and budget for a reengineered SIPP. It briefly addresses the SIPP sample size and design.
The appendixes provide additional information on SIPP data quality and the backgrounds of panel members and staff.