
2
Methods for Obtaining High Response Rates in Telephone Surveys

David Cantor and Patricia Cunningham

The purpose of this paper is to review methods used to conduct telephone surveys of low-income populations. The motivation for this review is to provide information on “best practices” applicable to studies currently being conducted to evaluate the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 (PRWORA—hereafter referred to as “Welfare Reform”). The National Academy of Sciences panel observed that many of the states are conducting telephone surveys for this purpose and that it would be useful to provide them with information on the best methods for maximizing response rates. The information provided in this paper is intended to assist these individuals, as well as others, to either conduct these studies themselves or to evaluate and monitor contractors conducting the studies.

We have divided the telephone surveys into two types. The first, and primary, method is to sample welfare recipients or welfare leavers from agency lists. This can take the form of a randomized experiment, where recipients are randomly assigned to different groups at intake, with a longitudinal survey following these individuals over an extended period of time. More commonly, it takes the form of a survey of those leaving welfare during a particular period (e.g., the first quarter of the year). These individuals are then followed up after “X” months to assess how they are coping with being off welfare.

The second type of telephone survey is one completed using a sample generated by random digit dialing (RDD) methods. In this type of study, telephone numbers are generated randomly. The numbers then are called, and interviews are completed at those numbers that reach residential households whose members agree to participate. To effectively evaluate welfare reform, this type of survey would attempt to oversample persons who are eligible for and/or participating in welfare programs.

The issues related to these two types of telephone surveys, one from a list of welfare clients and one using RDD, overlap to a large degree. The following discussion reviews the common issues as well as the unique aspects related to each type of survey. In the next section, we discuss methods to increase response rates on telephone surveys, placing somewhat more emphasis on issues related to conducting surveys from lists of welfare clients. We chose this emphasis because this is the predominant method being used by states to evaluate welfare reform. The third section reviews a number of welfare studies that have been implemented recently. In this section we discuss how the methods that are being used match up with the “best practices” and how this may relate to response rates. The fourth section provides an overview of issues that are unique to RDD surveys when conducting a survey of low-income populations. To summarize the discussion, the final section highlights practices that can be implemented for a relatively low cost but that could have relatively large impacts.

METHODS TO INCREASE RESPONSE RATES

In this section we discuss the methods needed to obtain high response rates in a telephone survey. These methods include locating, contacting, and obtaining the cooperation of survey subjects. The review applies to all types of telephone surveys, but we have highlighted those methods that seem particularly important for conducting surveys from lists of welfare clients. A later section discusses issues unique to RDD surveys.

The Importance of Language

The methods discussed in the following sections should be considered in terms of the language and cultural diversity of the state being studied. The percentage of non-English speakers ranges from as high as a third in California to a quarter in New York and Texas, down to a low of 4 to 5 percent in South Carolina, Missouri, and Georgia (1990 Census). Spanish is the most common language spoken by non-English speakers. Again these percentages vary a great deal by state, with 20 percent of the population in California and Texas speaking Spanish at home and only 2 percent in South Carolina. These variations imply that surveys may have to be prepared to locate and interview respondents in languages other than English and Spanish. Moreover, language barriers are greater among low-income households, which are more likely to be linguistically isolated, that is, households in which no one speaks English.

The need for bilingual staff, as well as Spanish (and perhaps other language) versions of all questionnaires and materials, is crucial, particularly in some states.

It is important to keep in mind that many people who do not speak English also may not be literate in their native language, so they may not be able to read materials or an advance letter even if it is translated into a language they speak. In some situations it might be useful to partner with social service agencies and community groups that serve and have special ties with different language and culture communities. Such groups may be able to vouch for the legitimacy of the survey, provide interviewers or translators with special language skills, and assist in other ways. Representatives of respected and trusted organizations can be invaluable in communicating the purpose of the study and explaining to prospective respondents that it is in the community’s best interest to cooperate.

Locating Respondents

Locating survey subjects begins with having sufficient information to find those who have moved from their latest residence. Low-income households move at higher rates than the general population, and it seems reasonable to assume that within this group, “welfare leavers” will be the most mobile. Therefore, if surveys are going to become a routine part of the evaluation process, agencies should consider future evaluation needs in all their procedures. This includes changing intake procedures to obtain additional information to help locate subjects in the future and designing systems to allow access to other state administrative records. In the sections that follow, these presurvey procedures are discussed in more detail. This is followed by a description of the initial mail contact, which provides the first indication of whether a subject has moved. The section ends with a discussion of some tracing procedures that might be implemented if the subject is lost to the study.

Presurvey Preparations

As part of the intake process, or shortly thereafter (but perhaps separately from the eligibility process), detailed contact information should be obtained for at least two other people who are likely to know the subject’s whereabouts and who do not live in the same household as the subject. In addition to name, address, and telephone number, the relationship of the contact to the subject should be determined along with his or her place of employment. We believe this step is crucial to obtaining acceptable response rates. This step also may be difficult to achieve because, in some situations, it may require a change in the intake system.

It is also useful to consider obtaining, at the same time the contact information is collected, the subject’s informed consent to access databases that require such consent. It is hard to state when and how consent might be used given the differences in state laws, but we assume that, at a minimum, state income tax records fall into this category (if they are accessible at all, even with consent). This is a common procedure on studies that track and interview drug users and criminal populations (Anglin et al., 1996).

Data to Be Provided to the Contractor with the Sample

In addition to the subject’s name, address, telephone number, Social Security number, and all contact information, consideration should be given to running the subject through other state administrative databases (Medicaid, food stamps, etc.) prior to the survey. This may be particularly useful if the information in the file from which the sample is drawn is old or if the information differs across files. Initial contacts should always start with the most recent address and telephone number; starting with the most recent information helps avoid unnecessary calls and tracing. The older information remains useful if a subject needs to be traced.

If the field period extends for a long period of time, it might be necessary to update this information for some subjects during the course of the survey.

Contacting Survey Subjects by Mail

Sending letters to prenotify the subject is accepted practice when conducting surveys (Dillman, 1978). It serves the dual purpose of preparing the subject for the telephone interview and identifying those subjects whose address is no longer valid. It is always iterative in a survey of this type. That is, each time a new address is located for a subject (through tracing as discussed later), an advance letter is sent prior to telephone contact.

If an envelope is stamped “return service requested,” for a small fee, the U.S. Postal Service will not forward the letter, but instead will affix the new address to the envelope and return it to the sender. This only works if (1) the subject has left a forwarding address and (2) the forwarding order is still active, which is usually just 6 months. If the letter is returned marked “undeliverable,” “unknown,” “insufficient address,” etc., additional tracing steps must be initiated.

Because the post office updating procedure is only active for 6 months, it is important to continue mail contacts with survey subjects if they are to be interviewed at different points in time. These mail contacts can be simple and include thoughtful touches such as a birthday card or perhaps a newsletter with interesting survey results.

Mailings should include multiple ways for the subject to contact the survey organization, such as an 800 number and a business reply post card with space to update name, address, and telephone numbers. Some small percentage will call and/or return the post card, negating the need for further tracing.

One of the problems with first-class letters is that the letters often do not reach the subject. The address may be out of date so the letter is not delivered to the correct household (e.g., Traugott et al., 1997), the letter may be thrown out before anyone actually looks at it, or the subject may open but not read the letter. To increase the chances that the subject does read the letter, consideration should be given to using express delivery rather than first-class mail. This idea is based on the logic that express delivery will increase the likelihood that the package will be opened by potential respondents and the contents perceived to be important. Express delivery may also provide more assurance that the letter has actually reached the household and the subject, particularly if a signature is required. However, requiring a signature may not produce the desired result if it becomes burdensome for the subject, for example, if the subject is not home during the initial delivery and needs to make special arrangements to pick it up. The annoyance may be lessened if, in addition to the letter, an incentive is enclosed.

Because express delivery is costly (but less costly than in-person contacts), it should be saved for those prenotification situations in which other means of contact have not been fruitful. For example, if first-class letters appear to be delivered, yet telephone contact has not been established, and tracing seems to indicate the address is correct, an express letter might be sent. It also might be used if the telephone number is unlisted or if the subject does not have a telephone. In these situations an express letter with a prepaid incentive might induce the subject to call an 800 number to complete the interview by telephone.

Tracing

Tracing is costly. Tracing costs also vary quite a bit by method. As a general rule, it is best to use the least costly methods first when the number of missing subjects is greatest, saving the costlier methods for later when fewer subjects are missing. Database searches are generally the least costly at a few pennies a “hit,” while telephone and in-person tracing can cost hundreds of dollars a hit.

Two key components of a tracing operation are: (1) a comprehensive plan that summarizes the steps to be taken in advance, and (2) a case management system to track progress. The case management system should maintain the date and result of each contact or attempt to contact each subject (and each lead). The system should provide reports by subject and by tracing source. The subject reports provide “tracers” with a history and allow the tracer to look for leads in the steps taken to date. The reports should provide cost and hit data for each method to help manage the data collection effort. In the end it helps to determine those methods that were the most and least cost effective for searching for the population of interest, and this knowledge can be used for planning future surveys. Each of the tracing sources is discussed briefly in the following paragraphs.
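As a concrete illustration of the kind of case management log described above, the following is a minimal sketch in Python. All class, field, and source names are hypothetical, and the costs in the example are illustrative only; the point is simply to record each attempt by subject and tracing source and to report cost and "hit" counts per source.

```python
# Minimal sketch of a tracing case-management log (hypothetical names and values).
from dataclasses import dataclass, field
from datetime import date
from collections import defaultdict

@dataclass
class Attempt:
    subject_id: str
    source: str          # e.g., "directory_assistance", "credit_bureau", "field"
    attempt_date: date
    cost: float          # vendor charge or labor cost for this attempt
    located: bool        # True if the attempt produced a usable address or number
    notes: str = ""      # leads for the tracer to follow later

@dataclass
class TracingLog:
    attempts: list = field(default_factory=list)

    def record(self, attempt: Attempt) -> None:
        self.attempts.append(attempt)

    def history(self, subject_id: str) -> list:
        """Report by subject: full attempt history, oldest first."""
        return sorted((a for a in self.attempts if a.subject_id == subject_id),
                      key=lambda a: a.attempt_date)

    def source_report(self) -> dict:
        """Report by source: attempts, hits, total cost, and cost per hit."""
        stats = defaultdict(lambda: {"attempts": 0, "hits": 0, "cost": 0.0})
        for a in self.attempts:
            s = stats[a.source]
            s["attempts"] += 1
            s["hits"] += a.located
            s["cost"] += a.cost
        for s in stats.values():
            s["cost_per_hit"] = s["cost"] / s["hits"] if s["hits"] else None
        return dict(stats)

# Example: log two attempts and summarize cost effectiveness by source.
log = TracingLog()
log.record(Attempt("A-101", "directory_assistance", date(2001, 3, 5), 0.50, False))
log.record(Attempt("A-101", "credit_bureau", date(2001, 3, 8), 2.00, True,
                   "new address in county"))
print(log.source_report())
```

In practice, reports of this kind would feed the supervisor reviews and the end-of-study assessment of which methods were most and least cost effective.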

Directory assistance (DA). Several DA services are now available. Accuracy of information from these services is inconsistent. DA is useful and quick, however, when just one or two numbers are needed. If the first DA attempt is not successful it may be appropriate to try again a few minutes later (with a different operator) or to try a different service. These calls are not free, and the rates vary widely. Costs also include the labor charges of the interviewer/tracer making the calls.

Telephone lookup databases. There are several large telephone lookup services that maintain telephone directory listings and information from “other” sources in a database. These data are available by name, by address, or by telephone number. The search is based on a parameter determined by the submitter, such as match on full name and address, match on last name and address, match on address only, or match on last name in zip code. Early in the tracing process the criteria should be strict, with matches on address only and/or address with the last name preferred. Later in the process broader definitions may be incorporated. Charges for database lookups are generally based on the number of matches found, rather than the total number of submissions. The cost is usually a few cents per match. These lookups are quick, generally requiring less than 48 hours, with many claiming 24-hour turnaround. However, the match rate is likely to be low. In a general population survey the match rate might be as high as 60 percent, and of those, some proportion will not be accurate. For a highly mobile, low-income population, where only those whose numbers are known to have changed are submitted, the hit rate is likely to be quite low. However, given the relatively low cost, even a very low match rate makes this method attractive.

Several companies provide this information, so one might succeed where another might fail. There also may be regional differences, with data in one area being more complete than in others. In California, for example, telephone numbers are often listed with a name and city, but no address. This limits the data’s usefulness, especially for persons with common last names.
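As a rough sketch of how the match-criteria escalation described above could be organized, the following Python fragment starts with strict, address-based criteria and adds broader criteria (such as last name within a zip code) only in later rounds of tracing. The helper names and criterion labels are hypothetical, and the vendor lookup is a placeholder rather than a real service call.

```python
# A minimal sketch of match-criteria escalation for lookup-database submissions.
from typing import Optional

# Strict, address-based criteria used early; broader criteria added later.
STRICT_CRITERIA = ["full_name_and_address", "last_name_and_address", "address_only"]
BROAD_CRITERIA = ["last_name_in_zip"]

def criteria_for_round(round_number: int) -> list:
    """Early rounds (0 and 1) stay strict; later rounds add the broad criteria."""
    return STRICT_CRITERIA if round_number < 2 else STRICT_CRITERIA + BROAD_CRITERIA

def lookup(subject: dict, criterion: str) -> Optional[dict]:
    """Placeholder for a vendor lookup; a real service charges per match returned."""
    # ... submit `subject` to the lookup service under `criterion` ...
    return None  # no match in this sketch

def trace_subject(subject: dict, round_number: int) -> Optional[dict]:
    for criterion in criteria_for_round(round_number):
        match = lookup(subject, criterion)
        if match is not None:
            return match
    return None

# Example: the first round submits only the three strict criteria.
subject = {"first": "Jane", "last": "Doe", "address": "12 Main St", "zip": "50309"}
print(trace_subject(subject, round_number=0))
```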

Specialized databases. These include credit bureaus and department of motor vehicles (DMV) checks where permitted. Checks with one or more of the credit bureaus require the subject’s Social Security number, and they are more costly than other database searches. Charges are based on each request, not the outcome of the request. More up-to-date information will be returned if the subject has applied for credit recently, which is less likely with a low-income population than the general population. DMV checks in many states, such as California, require advance planning to obtain the necessary clearances to search records.

Other databases. Proprietary databases available on the Internet and elsewhere contain detailed information on large numbers of people. Access to the databases is often restricted. However, these restrictions are often negotiable for limited searches for legitimate research purposes. Like credit bureaus, these files often are compiled for marketing purposes and low-income populations may not make the purchases necessary to create a record. Records on people often are initiated by simple acts such as ordering a pizza or a taxi.

Telephone tracers. For the purpose of this discussion, it is assumed that each telephone number and address that has been obtained for the subject has led to a dead end. This includes all original contact information and results from all database searches. At this point tracing becomes expensive. Tracers receive special training on how to mine the files for leads. People who have done similar work in the past, such as “skip tracing” for collection agencies, tend to be adept at this task. Tracers need investigative instincts, curiosity, and bullheadedness that not all interviewers possess. Tracers usually are paid more than regular interviewers.

The tracers’ task is to review the subject’s tracing record, looking for leads, and to begin making telephone calls in an attempt to locate the subject. For example, online criss-cross directories and mapping programs might be used to locate and contact former neighbors; if children were in the household, neighborhood schools might be called; and employers, if known, might be contacted. Of course, all contact must be carried out discreetly. Some of these techniques are more productive in areas where community members have some familiarity with one another, generally places other than the inner cities of New York, Chicago, and Los Angeles. Nonetheless, even in urban areas, these techniques sometimes work.

Cost control is crucial in this process because much of the work is limited only by the imagination of the tracer (and tracers sometimes follow the wrong trail). Perhaps a time limit of 15 or 20 minutes might be imposed. At that time limit, the tracer’s work could be reviewed by a supervisor to determine whether further effort seems fruitful, whether another approach might be tried, or whether the case seems to have hit a dead end.

In-person tracing. This is the most expensive method of tracing, and it is most cost effective if it is carried out in conjunction with interviewing. Like telephone tracing, in-person tracing requires special skills that an interviewer may not possess and vice versa. For this reason it might be prudent to equip tracers with cellular telephones so that the subject, when located, can be interviewed by telephone interviewers. The tracer can thus concentrate on tracing.

Tracing in the field is similar to telephone tracing except that the tracer actually visits the former residence(s) of the subject and interviews neighbors, neighborhood businesses, and other sources. Cost control is more of a problem because supervisory review and consultation are more difficult but just as important.

Contacting Subjects

When a telephone number is available for either a subject or a lead, the process of establishing contact becomes important. An ill-defined calling protocol can lead to significant nonresponse. In this section we discuss some of the issues related to contact procedures.

Documenting Call Histories and Call Scheduling

Telephone calls need to be spread over different days of the week and different times of the day in order to establish contact with the household (not necessarily the subject). If contact with the household is established, it is possible to learn if the subject can be contacted through that telephone number, and if so, the best time to attempt to call. If the subject is no longer at the number, questions can be asked to determine if anyone in the household knows the subject’s location.

If the telephone is not answered on repeated attempts, an assessment must be made of the utility of further attempts against the possibility that the number is no longer appropriate for the subject. In other words, how many times should a nonanswered telephone be dialed before checking to make sure it is the correct number for the respondent? It is important to remember that this is an iterative process applicable to the initial number on the subject’s record as well as to each number discovered through tracing, some of which will be “better” than others. The issue is assessing the tradeoffs between time and cost.

Many survey firms suggest making seven calls over a period of 2 weeks—on different days (two), evenings (three), and weekends (two)—before doing any further checking (e.g., checking with DA; calling one or more of the contacts; or searching one of the databases). Other firms suggest doubling the number of calls, theorizing that the cost of the additional calls is less than the cost of the searches. Unfortunately, there is no definitive answer because much depends on the original source of the number being dialed, the time of year, the age of the number, and other factors. Very “old” numbers are less likely to be good, and perhaps fewer calls (perhaps seven) should be made before moving to a tracing mode. If contact information is available, checking with the contact may be cost effective earlier in the process. In the summer or around holidays, more calls (perhaps 10 to 12) might be prudent.
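A minimal sketch of this calling rule, using the counts suggested above (two weekday-day, three weekday-evening, and two weekend attempts over a 2-week window), is shown below. The function name and scheduling details are hypothetical and would need to be adapted to a real case management system; the day-part mix, not the specific code, is the point.

```python
# Minimal sketch of spreading roughly seven call attempts over two weeks,
# across weekday days, weekday evenings, and weekends (hypothetical helper).
import random
from datetime import date, timedelta

# (day-part label, number of attempts) per the guideline described above.
CALL_MIX = [("weekday_day", 2), ("weekday_evening", 3), ("weekend", 2)]

def schedule_calls(start: date, max_attempts: int = 7) -> list:
    """Spread attempts over a 2-week window; returns (date, day_part) pairs."""
    window = [start + timedelta(days=d) for d in range(14)]
    weekdays = [d for d in window if d.weekday() < 5]
    weekends = [d for d in window if d.weekday() >= 5]
    plan = []
    for day_part, n in CALL_MIX:
        pool = weekends if day_part == "weekend" else weekdays
        plan += [(d, day_part) for d in random.sample(pool, n)]
    plan.sort()
    return plan[:max_attempts]

# Example: a calling plan for a number released to interviewers on a Monday.
for when, day_part in schedule_calls(date(2001, 4, 2)):
    print(when, day_part)
```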

Call histories, by telephone number, for the subject (and lead) should be documented thoroughly. This includes the date, time, and outcome, as well as any comments that might prove useful as a lead should tracing be necessary.

Message Machines

Message machines are now present in an estimated 60 to 70 percent of U.S. households (STORES, 1995; Baumgartner et al., 1997). As more households obtain machines, there has been a growing concern that subjects will use them to screen calls and thereby become more difficult to contact. However, empirical evidence to date has not shown message machines to be a major impediment to contacting respondents. Oldendick and Link (1994) estimate that a maximum of 2 to 3 percent of respondents may be using the machine in this way.1

1. A related concern is whether respondents are using caller ID in a similar way.

A related issue has been the proper procedure to use when an interviewer reaches an answering machine. Should a message be left? If so, when should it be left? Survey organizations differ on how they handle this situation. Some organizations leave a message only after repeated contacts fail to reach a respondent on the phone (as reported by one of the experts interviewed). Other organizations leave a message at the first contact and do not leave one thereafter. The latter procedure has been found to be effective in RDD studies relative to not leaving any message at all (Tuckel et al., 1996; Xu et al., 1993). The authors favor leaving messages more often than either of these approaches (perhaps with every other call, up to a maximum of four or five). We believe, but cannot substantiate empirically, that if the goal is to locate and interview a particular person, then the number of messages left might signal the importance of the call to the person hearing the message and might induce that person to call the 800 number. Even if the person who calls back says the subject does not live there, that is useful information. However, leaving too many messages may have a negative effect.

Obtaining Cooperation

In this section we highlight some of the standard survey procedures for obtaining high cooperation rates once contact with the subject has been established. These can be divided into issues of interviewer training, the questionnaire, and the treatment of refusals.

Interviewer Materials and Training

Interviewer experience has been found to be related to obtaining high respondent cooperation (Groves and Fultz, 1985; Dillman et al., 1976). The theory is that experience makes interviewers familiar with many questions reluctant respondents may have about cooperating (Collins et al., 1988), and allows them to respond in a quick and confident manner. Showing any type of hesitation or lack of confidence is correlated with high refusal rates.

This finding suggests that intense training of interviewers on how to handle reluctant respondents may provide them with increased confidence, as well as the necessary skills, to handle difficult situations. Groves and Couper (1998) present results from an experiment on an establishment survey that shows significant improvement in cooperation rates once interviewers are provided with detailed training on how to handle reluctant respondents. This training consisted of drilling interviewers, through a series of role plays, on providing quick responses to respondent concerns about participating in the study. Because this study was done in an establishment survey, the applicability to a survey of low-income respondents is not clear. Respondents to establishment surveys are more willing to converse with the interviewer, which allows for more time to present arguments on why the respondent should participate in the study.

Nevertheless, this suggests that interviewers must have the skills to answer the subject’s questions, to overcome objections, and to establish the necessary rapport to conduct the interview. Training in these areas is crucial if refusals are to be avoided. Answers to Frequently Asked Questions (FAQs) must be prepared and practiced so that the “answers” sound like the interviewer’s own words rather than a script that is being read. Interviewers also must be trained to know when to accept a refusal, leaving the door open for future conversion by a different interviewer who might have more success. This type of training is more difficult than training that is centered on the content of questions, but it is also vital if refusals are to be avoided.

Questionnaire Design

Several areas related to the design of the questionnaire could impact response rates. These include: (1) the length of the questionnaire, (2) the introduction used, and (3) the type and placement of the questions. Each of these has been hypothesized to affect the ability of the interviewer to obtain a high response rate. Interestingly, for each of these characteristics, there is some belief that the effects are primarily on the interviewer’s perception of the task, rather than concerns the respondent may have with the procedure. If interviewers perceive the task to be particularly difficult to complete, their confidence levels may go down and their performance might be affected.

Pretests of the questionnaire should be conducted as part of any research design. Pretests, and accompanying debriefings of the interviewers, often uncover problems that are easily corrected prior to interviewing the sample subjects. More elaborate pretesting methods also should be considered. These include, for example, “cognitive interviews,” as well as review of the questionnaire by a survey research professional who has experience in conducting structured interviews.

Questionnaire length. Although it is commonly believed that the length of the questionnaire is related to response rates, very little empirical evidence shows that this, in fact, is true. Much of the evidence that does show a relationship between length and response rates concerns mail surveys, where respondents get visual cues on how long the interview may be (Bogen, 1996). The length of a telephone interview may not be mentioned unless the respondent asks, so the respondent may not know how long it will take. This fact further confuses the relationship between interview length and response rates.

Two exceptions to this are studies by Collins et al. (1988) and Sobal (1982). Both found a relationship between how long the interviewer told the respondent the interview would take and the response rate. Collins et al. (1988) found a modest effect of approximately 2 percent, while Sobal (1982) found a much larger reduction of 16 percent when comparing a 5-minute interview to a 20-minute interview. These studies, however, are difficult to generalize to other studies because they do not compare the effects of different descriptions of the length of the interview to one that does not state the length at all. This makes it unclear what the overall effect of interview length might be in the context of another survey, which does not state the length of the interview (unless asked).

This research does suggest, however, that significantly shortening the interview to 5 minutes may increase response rates to some degree. If the interview were shortened to this length, then it might be advantageous to state the length of the interview in the introduction to the survey. One would assume that cutting the interview to just 5 minutes is not an efficient way to increase the response rate. The loss of information needed for analyses will be much larger than anticipated gains in the response rate. For this reason, it might be useful to consider shortening the interview only for a special study of refusers. If this strategy significantly increases the number of persons who are converted after an initial refusal, more information might be obtained on how respondents differ from nonrespondents.

Survey introduction. A natural place to start redesigning the questionnaire to improve response rates is the introduction. Many respondents refuse at this point in the interview. This is especially the case for an RDD survey, where interviewers do not have the name of the respondent and the respondent does not recognize the voice on the other end of the call. For this reason, it is important to mention anything that is seen as an advantage to keeping the respondent on the line. Advantages generally are believed to be: (1) the sponsor of the study, (2) the organization conducting the interviews, (3) the topic of the survey, and (4) why the study is important.

Research in an RDD survey context has not found any general design parameters for the introduction that are particularly effective in increasing response rates. Dillman et al. (1976), for example, find no effects of offering respondents results from the survey or statements about the social utility of the survey. Similarly, Groves et al. (1979) find variations in the introduction do not change response rates. Exceptions to this are a few selected findings that: (1) government sponsorship seems to increase response rates (Goyder, 1987), (2) university sponsorship may be better than private sponsorship, and (3) making a “nonsolicitation statement” (e.g., “I am not asking for money”) can help if the survey is not sponsored by the government (Everett and Everett, 1989).

The most widely agreed-on rule about introductions is that they need to be as short as possible. Evidence that shorter is better is found in Dillman et al. (1976), as well as our own experience. Because the interviewer may not have the full attention of the respondent at the outset of the call, it is better to simply state the best points of the survey and get the respondent to react to the first question. Interviewers also generally prefer short introductions, because they provide a greater opportunity to involve the respondent in the conversation (less opportunity to hang up). Increasing interviewer confidence in this way should, in turn, affect the response rate positively. It is important to balance the informational requirements with the need to be brief and simple. Long explanations, going into great detail about the survey, may turn respondents off more than motivate them to participate. The best approach is to provide the respondent with a broad set of statements to capture their attention at this point in the interview. Once rapport and trust have built up a bit, more details about the study can be presented.

Type/placement of questions. Sensitive questions have higher rates of nonresponse and should be placed later in the questionnaire but still positioned logically so that the flow from one topic to the next is smooth. Sensitive information includes topics such as income, detailed household composition (e.g., naming everyone in the household), participation in social programs, and child care. Careful placement allows these questions to be asked after rapport has been established. This is especially true with initial contacts into the household. Asking sensitive questions within the first few minutes of the initial contact may turn respondents off unnecessarily.

Refusal Conversion

If a respondent refuses to participate, it is important for the interviewer to indicate the level of hostility, if any. It may not be desirable (or cost effective) to try to convert subjects who are extremely hostile (e.g., cases in which the respondent is abusive). Other subjects might be recontacted in an attempt to have them reconsider their decision. This recontact should take place several days (7 to 21) after the initial contact to allow the respondent time to reconsider.

Prior to refusal conversion, a letter should be sent to try to convince the respondent to participate. This letter has been shown to be particularly effective if: (1) an incentive is enclosed, and (2) express delivery is used for mailing (Cantor et al., 1999). Comparisons between express delivery and first-class mail for the refusal conversion letter show a difference of 10 percentage points in conversion rates on an RDD study and a difference of 15 to 20 percentage points if an incentive is enclosed. These results are not likely to be as dramatic for a survey of welfare leavers. However, this strategy has been applied in this context and is believed to be effective.

Based on work related to personal interviews (Couper et al., 1992), it is possible to create specialized letters for refusal conversion based on what the respondent said at the time of the refusal. A procedure adopted for the National Survey of America’s Families (NSAF) was to have the interviewer provide a recommendation on the type of letter that would be sent to the respondent after the refusal occurred. Because most refusals fall into two or three categories (e.g., “no time,” “not interested”), special letters could be developed that emphasized particular arguments why the respondent should cooperate (e.g., for “no time,” emphasize the length of the interview and the option to complete it over several calls). The problem with this procedure is that for most refusals, the interviewer has little information on which to base a good decision on the reason for refusal. A large number of respondents hang up before providing detailed feedback to the interviewer. As a result, a large majority of the mailouts for the refusal conversion are done using the “general” letter, which does not emphasize anything in particular.

However, in a survey of welfare leavers, where the interviewer may have more information about the reason for refusal, tailoring the letters to the respondent’s concerns may be useful. This would depend on the amount of information the interviewer is able to collect on the reason for the nonresponse.

Refusal conversion calls are best handled by a select group of handpicked interviewers who are trained to carry out this type of work. They must be trained to analyze the reason for the refusal and be able to prepare answers for different situations.

STUDIES OF WELFARE LEAVERS

Table 2–1 summarizes the procedures discussed previously. It is organized around the three primary activities required to conduct a study: (1) locating the subject, (2) contacting the subject, and (3) obtaining cooperation.

In this section we discuss how these “best practices” have been applied in a number of surveys that have been conducted to evaluate welfare reform in different states. The purpose of this review is to provide a picture of the range of practices that have been used and how these practices relate to results.

Description of Methods Used in Recent Studies

To better understand the methods that have been implemented in recent studies of welfare reform, we collected information on a small sample of state surveys. The largest portion of our sample of studies is from the group of Assistant Secretary for Planning and Evaluation (ASPE) grantees funded in FY99 (9 of the 13 studies). The remaining studies were chosen by networking or referral by colleagues. Information was collected through interviews with the director of the research team and any reports that were available. These studies are meant to represent what the current practice is for welfare-leaver studies.

A summary of key characteristics for these 13 surveys is shown in Table 2–2.

TABLE 2–1 Summary of Best Practices for Conducting Telephone Surveys of Welfare Leavers

| Task | Method | Comment |
|---|---|---|
| Locate Respondent | Accurate address and telephone number | Collect at intake and update regularly |
| | Contact for persons not living with subject | Try to collect consent to search other databases |
| | Use other sources to locate subject | Use available administrative databases (e.g., food stamps, Medicaid, driver’s licenses); use commercially available sources (reverse directories, credit bureaus); start with the least expensive methods |
| | Telephone tracing | Review tracing record and follow leads |
| | In-person tracing | Very expensive and requires specialized skills |
| Contacting Subjects | Prenotification | Send letter prior to making contact; use express delivery if possible |
| | Incentives and continued contact | Repeated mailings to subjects |
| | Call scheduling | Spread out calls over day/night and weekdays/weekends |
| Obtaining Cooperation | Interviewer training and experience | Provide interviewers with answers to common questions; try to use experienced interviewers with good records |
| | Questionnaire design | Minimize redundant questions; keep length as short as possible; pretest questions and allow time to revise after the pretest |
| | Survey introduction | Keep initial introduction as short as possible |
| | Refusal conversion | Prenotify with express mail and incentives |

In 12 of the 13 surveys for which we collected information, a mixed-mode, two-step approach was used. First, as many telephone interviews as possible were conducted using information accessible to home office staff. Respondents were contacted initially using information available from the administrative records in the sample frame. Advance letters were sent out. For those persons who did not have a phone number, the letter asked the subject to call an 800 number to do the interview or to set up an appointment.

If the telephone number did not lead to the subject, tracing was done from the home office. This typically included using directory assistance, reverse directories to find other addresses, and free services on the Internet. Other methods implemented by most of the studies included:

  • Searches of credit databases: These include databases such as Transunion, CBI/Equifax and TRW. Stapulonis et al. (1999) report the use of an unnamed database that seemed to add information above and beyond these.

  • Searches of other databases across agencies: These included food stamps, unemployment insurance, child support enforcement, motor vehicles, Medicaid, employment training, Social Security, vital records, and state ID cards.

The ability to search the “other databases” was possible because in all cases the research organizations had the Social Security number of the respondent.

In discussions with different organizations, we got a clear sense that the original contact information was not of high quality. One study reported, for example, that 78 percent of the original phone numbers did not lead directly to subjects. This may be, in part, because there is very little need for agency representatives to maintain contact with recipients over the telephone. In one state, for example, recipients are paid using a debit card that is continually re-valued at the beginning of a payment period. Thus, the address and telephone number information is not used on a frequent basis. In a study conducted by Westat several years ago, a similar result was found when trying to locate convicted felons (Cantor, 1995). Contact information provided by probation officers was found to be accurate about 50 percent of the time.2

If the subject cannot be located with available contact information, the case is sent out into the field. In some instances, the field interviewer is expected to both locate and interview the subject. In other instances, the interviewer asks the subject to call a central interviewing facility. If the subject does not have a telephone, the interviewer provides a cellular telephone to call the facility. Several organizations reported that having the respondent call into the central office allowed for more specialization in the field tracing task. Interviewers would not be required to administer the interview.

2. This rate is surprisingly low, given that probation officers should be in regular contact with probationers.

TABLE 2–2 Results of Interviews with 14 Selected Telephone Surveys of Welfare or Ex-Welfare Recipients

| Study Number | Advance Letter/Incentives | Telephone Tracking Sources | Field Tracking | Refusal Conversion | Response Rate % | Field Period |
|---|---|---|---|---|---|---|
| 1 | Yes/Yes | Directory assistance; reverse directory | No | No | 30 | 3 months |
| 2 | Yes/Yes | Directory assistance; reverse directory; specialized tracking firm; motor vehicle/ID records | Yes (exp. staff) | Yes | 51 | 9 months+ |
| 3 | Yes/Yes | Directory assistance; credit databases; other welfare offices | Yes (exp. staff) | Yes | Total: >70; Tel: 50 | 5 months |
| 4 | Yes/Yes | Directory assistance; credit databases; other welfare offices | Yes (exp. staff) | Yes | Total: >70; Tel: 50 | 5 months |
| 5 | Yes/Yes | DK | Yes (unknown exp.) | DK | 52 | DK |
| 6 | No/No | Directory assistance; other agency database | Yes (exp. staff) | No | Total: 78; Tel: 66 | 4 months |
| 7 | Yes/Yes | Directory assistance; credit databases; motor vehicle; other agency databases | Yes (exp. staff; no exp.) | Yes | Total: 72; Tel: 25–30 | 60 months |
| 8 | Yes/Yes | Reverse directory; other agencies | Yes (no exp.) | No | Total: 46; Tel: 40 | DK |
| 9 | DK | Directory assistance; tracing contact; credit databases; other agencies | Yes (exp. staff) | DK | Total: 80; Tel: 50 | DK |
| 10 | Yes/Yes | Directory assistance; credit databases; other agencies | Yes (exp. staff) | DK | Total: 75 | DK |
| 11 | Yes/Yes | Directory assistance; credit databases; other agencies | Yes (exp. staff) | Yes | Total: 81* | DK |
| 12 | Yes/Yes | Directory assistance; credit databases; other agencies | No | DK | Total: 40 | 4 months |
| 13 | Yes/Yes | Directory assistance; other agencies | Yes (exp. staff) | No | Total: 72 | DK |
| 14++ | Yes/Yes** | Directory assistance; other agencies | Yes (no exp.) | No | Total: 72; Tel: >65 | 2 months |

DK = Don’t Know

* Response rate after 5 years.

** Used nonmonetary incentive.

+ Interviewing period was 3 months. Used 6 months before interviewing period to establish contact information and find respondents.

++ Did extensive tracing over the telephone with highly experienced personnel.

When hiring field personnel, therefore, the agency should be able to recruit individuals who are especially adept at tracking and tracing.

Empirical Results and Relation to Best Practice

These 14 studies provide some data on the possibilities and limitations related to conducting welfare-leaver studies. Many of these studies are implementing the “best practices” summarized in Table 2–1. These include, for example, advance letters, incentives, tracking/tracing, and refusal conversion. Resulting response rates ranged from a low of 30 percent to a high of 80 percent. Many studies are in the 40 to 50 percent range.

It is clear from these data, as well as from the authors’ collective experience, that no single design feature guarantees a high response rate. The effectiveness of particular methods varies by situation, and a number of methods are needed to maximize response rates. A useful illustration is a survey of current and former Temporary Assistance for Needy Families recipients that was completed in Iowa (Stapulonis et al., 1999). This was a mixed-mode survey that implemented all of the methods discussed earlier, including: (1) repeated mailings to respondents, (2) use of telephone interviewers experienced in tracking respondents over the phone, (3) incentive payments, (4) specialized database searches, and (5) use of field staff to trace and interview respondents. As reported by Stapulonis et al. (1999), no single method produced a high response rate. A response rate of approximately 25 to 30 percent was achieved through the use of the telephone. At the end of 16 weeks, a 48-percent response rate was achieved by offering an incentive of $10 and sending cases into the field. The remainder of the 60-week field period was used to increase the rate to 72 percent. During this interim period, numerous methods were instituted, such as increasing incentive payments, remailings (using express mail) to households, field tracing, and using more specialized tracing sources and methods. The latter included using highly experienced trackers in the telephone center and the field.

The data in Table 2–2 seem to indicate that the mixed-mode approach, at least as currently implemented by most of these states, is necessary to achieve response rates of at least 50 percent. The data also indicate that for many studies, use of only the telephone yields a response rate of approximately 30 to 40 percent. The clearest example of this is the comparison of studies #1 and #2, which were completed in the same state by the same organizations. In study #1, where a 30-percent response rate was obtained, only telephone interviewing and limited tracking were done from a central office. Study #2 instituted a number of additional tracing steps, but also added a field component. Similarly, study #7 reported a 25- to 30-percent response rate before going into the field, and study #8 reported a 40-percent response rate before releasing cases to the field. The major exceptions to these patterns are the few studies that report final response rates of at least 70 percent.

In these instances, the response rate obtained over the telephone is at least 50 percent and, in one case, 66 percent. Study #14 had a response rate of 72 percent but reported very poor experiences with its field tracers; effectively, most of its cases were completed using the telephone. A few of these higher rates were achieved as part of planned experiments, where contact was initiated while the recipient was in the program. Other successes may be attributed more to the quality of the information available at the start of the study. Overall, we believe that if response rates of at least 50 percent are desired, it is important to use both telephone and field personnel to trace and locate respondents.

This pattern is consistent with our general experience in working with low-income populations. Although it is possible, using proper procedures and preparation, to complete a significant number of interviews via mail and telephone, a proportion of this group simply does not respond to anything but in-person contacts. This may be related to this group’s mobility rate, the intermittent availability of the telephone, or simply busy work schedules. Whatever the case, it is unlikely that extremely high response rates (e.g., 70 percent or above) for welfare leavers can be achieved simply through the use of mail or telephone interviews.

Tracking Respondents

As one might expect, the primary source of nonresponse in these studies is noncontact, rather than refusals. For example, of the 25-percent nonresponse in study #6, approximately 17 percent is from not being able to locate respondents and 8 percent is from refusals. For surveys that have lower response rates (e.g., around 50 percent), the percentage of nonlocatables is even higher. This suggests that improving response rates has most to do with improving tracking.

Given this, an important component to pushing response rates above 50 percent is to improve the ability to find subjects. This relates to both the type of staff and the type of information available for finding the subjects. Study #6, with a 78-percent response rate, is a good illustration of the importance of experienced staff. This study did not implement many of the procedures discussed previously, including prenotification letters, refusal conversion, or incentives. The staff doing the interviewing and tracing, however, were program quality assurance personnel. Because part of their job is to find and interview welfare recipients to conduct audits, they were highly experienced in finding this population. In addition, the supervisor seemed to have very strong oversight of the interviewers’ progress. Similarly, study #14 completed all interviews over the telephone and achieved a 72-percent response rate. The study did not offer a monetary incentive and did not conduct refusal conversion. The success of this survey was attributed to the interviewers, who were also part of the quality assurance program.

Alternatively, a number of survey directors reported that the major barriers they encountered were related to inexperienced staff, either in the phone center or in the field, in tracking and tracing subjects.

Stapulonis et al. (1999) report the failure of a field effort because of inexperienced field trackers, as did the survey director for study #8. In the latter case, the telephone interviewers were asked to conduct field interviewing.

Our experience has been very similar to this profile, and it applies to in-person interviewing as well as tracking from a central telephone facility. The ability to look over case records, find leads, and follow up on those leads requires the ingenuity of a detective, as well as a personality that gains trust when calling neighbors or other community members.

Having solid information from which to trace subjects is also essential to finding them eventually. As noted previously, most survey directors commented on the poor quality of the information contained in the original sample records. In many cases, the information is quite old (e.g., up to 6–9 months) and of questionable accuracy. Because this is a highly mobile population, the age of the records quickly limits the utility of the information. Study #2 attempted to minimize this problem by beginning the tracking process as soon as subjects came off the welfare records. Although this may lead to tracking too many people,3 it provided a way to maintain contact with subjects until the field interviewing started 6 months later. The success of this process is yet to be evaluated, but this method may provide a way to keep the information about sampled persons as up to date as possible.

3. Most studies had, as an eligibility criterion, that leavers had to stay off the welfare program for at least 2 months. Sampling within a month of leaving the program, therefore, eventually results in having to drop subjects because they return to the program within 2 months.

All the studies had Social Security numbers for subjects, as well as access to databases in other parts of the government (e.g., motor vehicle registrations, food stamps, child support enforcement, Medicaid, unemployment insurance). These provide a powerful set of tools to find respondents. However, only two of the studies have tracing contact information, containing the names and phone numbers of at least one person, preferably someone who the subject does not live with, who is likely to know where the person is at any point in the future. These two studies both achieved response rates above 75 percent. Both studies were experiments, set up in advance to sample clients at intake and collect this information at that time.

The availability of tracing contacts would not only improve the tracking rates for these studies, but it would also reduce the amount of time devoted to tracing. In fact, our experience has shown that with good tracing contacts, as well as occasional interim contacts with subjects (e.g., every 6 months), little in-person tracking has to be done. Respondents can be located by interviewers making update phone calls. This is what many longitudinal surveys do as part of their routine activities for staying in touch with respondents. As a point of illustration, Westat recently completed a feasibility study that located 85 percent of subjects 3 years after their last contact with the study. These subjects were high-risk youth who had been diverted into a family counseling program in 1993 and were last contacted in 1996. At that time, tracing contact information had been collected. This population lived in highly urbanized, poor neighborhoods and could be considered comparable to those being traced in the welfare-leaver studies discussed previously. Approximately 67 percent of the population was found through the use of mail and telephone contacts. The remaining 18 percent were found by field tracing.

Increasing Cooperation

Pushing response rates higher also can be done through selective adoption of other methods that make the response task easier. Some percentage of the persons classified as nonlocatable are really tacit refusers. That is, some of those individuals who “can’t be located” are actually avoiding contact with the interviewer or field tracer because of a reluctance to participate in the survey. This reluctance may arise because the person does not want to take the time to do the survey, or it may be more deep-seated and related to a general fear of being found by anyone who happens to be looking for them.

Several of the studies found that repeated mailings to the same addresses over time did result in completed interviews, and this approach seemed to be especially effective when the mailings were tied to increased incentives. This tends to support the idea that at least some portion of the “noncontacts” are actually persons who are tacitly refusing to do the interview, at least the first few times around. Express mail was also used for selected followup mailings, although it is unclear whether this method of delivery was particularly effective.

As noted in Table 2–2, a number of the studies do not implement any type of refusal conversion. The reluctance stems from a fear that this would be viewed as coercive, because the agency conducting the research is the same agency responsible for providing benefits on a number of support programs. Other survey groups, however, reported confidence in conducting refusal conversion activities, as long as they were convinced the interviewers were well trained and understood the line between directly addressing respondents’ concerns and coercion. Indeed, many initial refusals are highly situational. The interviewer may call when the kids are giving the parent an especially difficult time, or when the subject has just come home from an exhausting day at work. In another situation, the respondent may not have understood the nature of the survey request. In all of these cases, calling back at another time, with an elaborated explanation of the survey, is useful. In fact, one study director reported that about 50 percent of the initial refusers in the study were eventually converted to completed interviews. This is not out of line with refusal conversion rates found on other studies of either general or low-income populations.
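
To see what a conversion rate of this size can mean for the bottom line, a minimal back-of-the-envelope sketch follows. The 50-percent conversion figure is the one reported by the study director quoted above; the sample size and initial refusal rate are assumed for illustration only.

```python
# Back-of-the-envelope effect of refusal conversion on the final response rate.
# Assumed inputs: 1,000 sampled cases and a 20% initial refusal rate; the
# 50% conversion rate is the figure reported by the study director above.
sample_size = 1000
initial_refusal_rate = 0.20
conversion_rate = 0.50

initial_refusals = sample_size * initial_refusal_rate
converted = initial_refusals * conversion_rate
print(f"Interviews gained through conversion: {converted:.0f} "
      f"(+{converted / sample_size:.0%} on the response rate)")
```

Under these assumptions, converting half of the initial refusers recovers about 10 percentage points of response rate, which is why even a modest conversion effort can be worthwhile.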


SPECIAL ISSUES FOR RDD SURVEYS OF LOW-INCOME POPULATIONS

In many ways, RDD surveys pose a much different set of challenges than list-based samples do, especially with respect to nonresponse. For surveys of welfare clients, the target population is clearly identified, and quality issues have to do with finding sample members to conduct the interview. For RDD surveys, the primary issues have to do with efficiently identifying low-income subjects and, once they are identified, convincing them to participate in the survey.

Response Rates on RDD Surveys

To provide some perspective on the level of response achieved on RDD surveys, Massey et al. (1998) presented results of a study that reviewed the response rates of a large number of RDD surveys conducted for government agencies or as part of a large survey effort. They found a median response rate of 60 to 64 percent, with about 20 percent of the surveys exceeding 70 percent. The overall perception among survey analysts is that this rate is decreasing over time; that is, achieving high response rates for RDD surveys is becoming more difficult.

An RDD survey of low-income populations faces several hurdles in achieving a high response rate. The first is the need to screen all households on the basis of income. This leads to two types of problems. The first is that it adds an opportunity for someone to refuse to do the survey. A screener written to find low-income households has to include a number of questions that respondents are sensitive to, including information on who lives in the household and some type of income measure. Much of the nonresponse on RDD surveys occurs at this point in the process. For example, on the NSAF, a national RDD survey that oversamples low-income groups, the screener response rate was in the high 70s. Once a respondent within the household was selected, the response rate for the extended interview was in the 80s. Nonetheless, the combination of the two rates, which form the final response rate, resulted in a rate in the mid-60s (Brick et al., 1999).
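
Because the final rate on a two-stage RDD survey is the product of the screener and extended-interview rates, two individually respectable rates can still combine into a much lower overall figure. A minimal sketch, using illustrative values standing in for the “high 70s” and “80s” figures cited above:

```python
# The overall response rate on a two-stage RDD survey is the product of the
# screener rate and the extended-interview rate. Values below are illustrative
# stand-ins for the approximate NSAF figures cited in the text.
screener_rate = 0.78   # share of sampled households completing the screener
extended_rate = 0.85   # share of selected respondents completing the interview

overall_rate = screener_rate * extended_rate
print(f"Overall response rate: {overall_rate:.0%}")   # lands in the mid-60s
```

Any additional stage added to the design multiplies in the same way, which is why each added step in the response task has to earn its place.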

Low response rates on RDD surveys are partly an issue of credibility. Relative to a survey of welfare leavers, the issue of credibility places more emphasis on design features that motivate respondents to participate in the survey (vis-à-vis trying to locate respondents). For example, research on methods to increase RDD response rates has shown that prenotification prior to the call, the method of delivery of prenotification letters, and the use of incentives can provide important boosts above those normally achieved when implementing many of the other important design features reviewed earlier. All three of these increase response rates in the context of an RDD survey (Camburn et al., 1995; Brick et al., 1997; Cantor et al., 1999).


In addition, refusal conversion is particularly important for an RDD survey because such a large proportion of the nonresponse is from refusals. A refusal at the screener could come from almost any member of the household, because most screeners accept responses from any adult who answers the phone. Calling the household a second time provides an opportunity to reach another person in the household (who may be more willing to participate) or to reach the same respondent at a moment when he or she may be easier to convince to complete a short screening instrument. A refusal at the extended interview may be more difficult to turn around. Refusal conversion strategies at this level are amenable to more traditional “tailoring” methods (e.g., Groves and Couper, this volume: Chapter 1), because respondents at this stage of the process may be more willing to listen to the interviewer.

Efficiently Sampling Low-Income Populations

A second issue related to conducting RDD surveys of low-income populations is the ability to actually find and oversample this group. Screening for low-income persons has been found to have considerable error. This has been assessed by comparing the poverty status reported on the initial screener with the income reported in response to the more extensive questions in the longer, extended interview. For example, on the NSAF, approximately 10 to 15 percent of those who report being below 200 percent of poverty on the longer interview initially tell the screener they are above this mark. Conversely, 20 to 30 percent of those reporting themselves as above 200 percent of poverty on the extended interview initially screen in as below this mark (Cantor and Wang, 1998). Similar patterns have been observed for in-person surveys, although the rates do not seem to be as extreme. This error reduces the overall efficiency of the sample design, which in turn requires increasing sample sizes to achieve the desired level of precision.
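
The loss of efficiency is easy to quantify with a small sketch. The two error rates below are taken from the middle of the ranges cited above; the share of households truly below 200 percent of poverty is an assumed illustrative value, not a figure from the NSAF.

```python
# How screener misclassification dilutes a low-income oversample.
# p_miss and p_false_in are mid-range values from the text; true_share is an
# assumed illustrative population share below 200% of poverty.
true_share = 0.25
p_miss     = 0.125   # P(screens as above 200% | truly below): ~10-15%
p_false_in = 0.25    # P(screens as below 200% | truly above): ~20-30%

screened_in_true  = true_share * (1 - p_miss)       # correctly screened in
screened_in_false = (1 - true_share) * p_false_in   # screened in by mistake
purity = screened_in_true / (screened_in_true + screened_in_false)

print(f"Share of screened-in cases truly low income: {purity:.0%}")
print(f"Extended interviews needed per low-income complete: {1 / purity:.2f}")
```

Under these assumptions, only about half of the extended interviews taken in the oversample would turn out to be with households below the cutoff, which is why sample sizes must grow to hold precision constant.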

To date, the problem has not had a clear solution. In-person surveys have developed more extensive screening interviews that allow income status to be predicted at the point of the screener (Moeller and Mathiowetz, 1994). This approach also might be taken for RDD screeners, although there is less opportunity to ask the types of questions needed to predict income. For example, asking for a detailed household roster, or collecting information on jobs or material possessions, likely would reduce the screener response rate.

A second issue related to sample design on an RDD survey is the coverage of low-income households. Although only 6 percent of the national population is estimated to be without a telephone (Thornberry and Massey, 1988), about 30 percent of those below poverty are estimated to lack telephone service. For an RDD survey of a low-income population, therefore, it is important to decide how coverage issues will be approached. One very expensive approach would be to introduce an area frame into the design. This would include screening for nontelephone households in person and then conducting the extended interviews either in person or over the telephone.4

4 Telephone interviews would be conducted by having the respondent call into a central facility using a cellular telephone.

Over the past few years, a new method has been tested that does not require doing in-person interviews (Keeter, 1995). The premise of the method is that, for a certain segment of the population, having telephone service is a dynamic rather than a stable characteristic. Consequently, many of the people who do not have service at one point in time may have service shortly thereafter. This implies that persons who have a telephone but report interrupted service might serve as proxies for those who do not have telephones at the time the survey is being conducted. Based on this idea, telephone surveys increasingly include a question that asks respondents whether they have had any interruptions in their telephone service over an extended period of time (e.g., the past 12 months). If there was an interruption, they are asked how long they were without service. This information is used in the development of the survey weights: those reporting significant interruptions of service are used as proxies for persons without a telephone.
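
The sketch below illustrates, in the simplest possible terms, how the interruption question might feed into a weighting adjustment. The two-cell design, the benchmark shares, and the toy respondent records are all assumptions made for illustration; they are not the procedure used in the NSAF or in any of the evaluations cited here.

```python
# Toy example: ratio-adjust weights so respondents reporting an interruption in
# telephone service also represent households that currently have no telephone.
# Cell definitions, benchmark shares, and records are illustrative assumptions.

respondents = [
    {"weight": 1.0, "interrupted": True},
    {"weight": 1.0, "interrupted": False},
    {"weight": 1.0, "interrupted": False},
    {"weight": 1.0, "interrupted": False},
    {"weight": 1.0, "interrupted": False},
]

# Assumed external benchmark: the "transient telephone" group (no phone now,
# or recently interrupted service) is taken to be 30% of the target population.
benchmark = {"transient": 0.30, "stable": 0.70}

total = sum(r["weight"] for r in respondents)
for group, share in benchmark.items():
    members = [r for r in respondents
               if r["interrupted"] == (group == "transient")]
    factor = share * total / sum(r["weight"] for r in members)
    for r in members:
        r["weight"] *= factor   # interrupted-service cases are weighted up

print([round(r["weight"], 2) for r in respondents])   # [1.5, 0.88, 0.88, 0.88, 0.88]
```

In practice, the adjustment would be made within the usual weighting cells (e.g., by income group and geography) rather than across the whole sample, and the benchmark share would come from an external data source.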

Recent evaluations of this method as a complete substitute for conducting in-person interviews have shown some promise (Flores-Cervantes et al., 1999). Initial analysis has shown that the use of these questions significantly reduces the bias for key income and other well-being measures when compared to estimates that use in-person interviewing. This is not always the case, however. For certain statistics and certain low-income subgroups, the properties of the estimator are unstable. This may be addressed, in part, by developing better weighting strategies than those currently employed. Nonetheless, given the huge expense involved in doing in-person interviews, the use of these questions seems to offer an approach with significant potential advantages.

This method also may be of interest to those conducting telephone surveys with persons sampled from a list of welfare clients. Rather than being used to reduce coverage error, however, the interruption questions could be used when trying to impute missing data arising from high nonresponse rates.

HIGHLIGHTING LOW-COST ACTIONS

This paper has attempted to provide information on methods to achieve high response rates on telephone surveys of low-income populations. We have concentrated much of the review on studies that start with a list of welfare recipients, but we also have provided information for persons conducting RDD interviews. The second section of this paper provided a list of best practices that should be considered when conducting telephone surveys. The third section provided examples of what is currently being practiced in recently completed welfare-leaver studies and how these practices relate to results. The fourth section provided special issues related to RDD surveys. In this section we concentrate on highlighting suggestions that seem practical and relatively low cost.

Improve Tracking and Tracing

Clearly, one primary theme taken from our review is the need to improve the ability of studies to find subjects. Most agencies face similar situations: the information used to track respondents is both relatively old (6–9 months) and limited. The age of the information could be addressed partly through methods such as those mentioned earlier, namely, contacting respondents immediately after they leave the program and maintaining this contact until the time of the interview (e.g., 6 months after leaving). This approach, however, is relatively expensive to implement, because following subjects over extended periods of time can be labor intensive. Furthermore, the information provided when exiting the program may not have been updated for quite some time. This constraint is difficult to get around.

A more effective and cost-efficient method to improve contact information is to collect tracing contacts when subjects initially enter the program. This type of information does not go out of date nearly as fast as a single name and address, and even when contact entries do go out of date, the names and addresses can provide additional leads for trackers to follow up. When collecting this information, it is important that the names be of persons who do not live with the subject; this decreases the possibility that, if the subject moves, the contact person will have moved as well.

Another potentially rich source of information is case history documentation. Many of the studies reviewed above reported using information from other government databases, such as motor vehicle records or other recipiency programs, to collect updated addresses and phone numbers. Examination of hardcopy case folders, if they exist, is one way to supplement this information. One study reported doing this and found the folders to be a good source of tracing contact information. Subjects, at some point, could have provided information on references, employers, and friends as part of the application process. If confidentiality issues can be addressed, this information can be examined for further leads to find and track those people who cannot otherwise be located.

To provide some perspective on the impact that tracing might have on the cost of a survey, we developed cost estimates under two scenarios, one in which contact information is available and another in which it is not. Costs for surveys of this type are difficult to estimate because so much depends on the ability of the data collector to monitor the process; the availability of skilled staff to carry out the tracing; and the nature and quality of information that is available at the start. The first two factors rest with the data collector, while the last depends on the information obtained about each subject (and his or her accessibility) by the agency. If the data are not current or not complete, tracing is difficult and costly regardless of the controls the data collector has in place.

Under the assumptions described in Table 2–3, we estimate that approximately 600 fewer hours are required to trace 1,000 subjects if tracing contact information is available. Contact information, for this example, would have been obtained during the intake process and delivered to the data collector with the sample. The table may be somewhat deceptive because, for purposes of illustration, we have forced the two samples to have approximately the same location rate in order to compare the level of effort. In reality, the location rate (and consequently the response rate) for those with contact data would be higher than for those without.

In creating Table 2–3, we assumed the following times for each level of tracing (in practice, most of the welfare-leaver studies have used both telephone and in-person surveys):

  • 20 minutes for calling through the contacts

  • 20 minutes for calls to the hits of database searches

  • 1 hour for intense telephone tracing

  • 7 hours for in-person tracing

Although these estimated times are reasonable, they also can be misleading. For example, if several databases are used (e.g., agency, credit bureau, DMV, commercial), each can produce a hit and require followup, so it is likely that more than one followup call might be carried out for some sample members and none for others. In processing a hit, care must be taken to make sure it is genuine and not a duplicate of an earlier hit that already has been invalidated. This adds time, though the process can be aided by a case management system.

TABLE 2–3 Comparison of Tracing Hours, by Availability of Contact Information

                                        (a)         (b)         (c)         (d)                 (e)          (f)
                                        No          Calling     Database    Intense Telephone   In-Person
                                        Tracing     Contacts    Search      Follow-up           Tracing      Total
Tracing time per sample unit (minutes)  0           20          20          60                  420

With Contact Information
  Sample size                           1,000       700         490         343                 240          1,000
  Percent of cases located              0.30        0.30        0.30        0.30                0.15         0.80
  Number located                        300         210         147         103                 36           796
  Estimated number of hours             0           233         163         343                 1,681        2,420

Without Contact Information
  Sample size                           1,000       N/A         700         469                 314.23       1,000
  Found rate                            0.30                    0.33        0.33                0.33         0.79
  Number found                          300                     231         154.77              104          789
  Estimated number of hours             0                       233         469                 2,200        2,902
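
The arithmetic behind Table 2–3 is straightforward to reproduce. The sketch below recomputes the “with contact information” panel from the per-stage times and location rates listed above; small differences from the table are rounding.

```python
# Recompute the "with contact information" panel of Table 2-3. Each stage
# locates a fixed share of the cases still outstanding; the remainder flow to
# the next, more expensive, stage. Times and rates are those listed above.
stages = [
    ("Calling contacts",             20 / 60, 0.30),   # (hours per case, share located)
    ("Database search",              20 / 60, 0.30),
    ("Intense telephone follow-up",  60 / 60, 0.30),
    ("In-person tracing",           420 / 60, 0.15),
]

sample = 1000
located = 0.30 * sample          # located with no tracing at all (column a)
remaining = sample - located
total_hours = 0.0

for name, hours_per_case, rate in stages:
    stage_hours = remaining * hours_per_case
    stage_located = remaining * rate
    print(f"{name:30s} worked={remaining:6.0f}  located={stage_located:5.0f}  hours={stage_hours:6.0f}")
    total_hours += stage_hours
    located += stage_located
    remaining -= stage_located

print(f"Located {located:.0f} of {sample} ({located / sample:.0%}) in {total_hours:,.0f} tracing hours")
```

Dropping the “calling contacts” stage and raising the later-stage location rates to 0.33 reproduces the “without contact information” panel and its roughly 2,900 hours.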

The level of interviewer/tracer effort is only one dimension of cost. Supervisory hours will range from 20 to 40 percent of interviewer hours, depending on the activity, with the highest percentage needed for the intense tracing. Other variable labor costs include all clerical functions related to mailing and to maintaining the case management system, as well as all direct nonlabor costs. These include, but are not limited to, charges from database management companies to run files, directory assistance charges, telephone line charges, field travel expenses, and postage/express delivery charges.

Fixed costs include the management costs to coordinate the activities and the programming functions to develop a case management system, prepare files for data searches, update files with the results of those searches, and prepare labels for mailing.

A second important point for agencies operating on a limited budget is to hire supervisory and interviewing staff who are experienced at locating subjects. Prudent screening of personnel, whether internal employees or external contractors, potentially has a big payoff with respect to maximizing the response rate. Strong supervisors are especially important because they can teach new interviewers methods of finding particular cases. They also can provide guidance and new ideas for experienced interviewers. The supervision has to be done on an interviewer-by-interviewer basis: supervisors should review each case with interviewers on a frequent basis (e.g., every week) and provide feedback and advice on how to proceed with each one. This includes making sure the interviewer is following up on the leads in hand, as well as discussing ideas on how to generate more leads.

Effective locating and management of a survey cannot be learned on the job. Therefore, sponsoring agencies should gather evidence that the personnel involved have the appropriate experience and a successful track record for completing the work. This advice applies whether the work is done by personnel within the sponsoring agency or through a contractor. When considering a contractor, the sponsoring agency should ask for hard evidence that a study like this has been conducted, check references, and evaluate success rates. Questions should be asked about the availability of experienced staff to complete the work. If the work is to be done by telephone, then some information on the track record of the telephone tracers should be requested. For in-person contacts, information should be collected on the experience of personnel who reside in the local area where the study is to be conducted.

Improving Methods to Contact and Obtain Cooperation

First and foremost in any survey operation is the need to develop an infrastructure that maintains control over cases as they move from the initial prenotification letter to call scheduling and case documentation. Understanding what stage each case is in and what has been tried already is critical to making sure each case goes through all possibilities. These basics are not particularly expensive to implement and can yield a large payoff in terms of completed interviews. For example, supervisory staff should be reviewing telephone cases as they move to different dispositions, such as “ring, no answer,” “initial refusal,” or “subject not at this number.” As with tracing, supervisors should review cases and make case-by-case determinations on the most logical next step.

Monitoring of the call scheduling also should ensure that different times of day and different days of the week are used when trying to contact respondents. This is one big advantage of a centralized computer-assisted telephone interviewing (CATI) system: the computer “deals” cases at the appropriate times and largely ensures that the desired calling algorithms are followed. However, if the study is being done with paper and pencil, then a system to document and monitor call histories should be in place to ensure that this occurs.
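
For a paper-and-pencil study, even a very small routine layered on the case management system can approximate this behavior. The sketch below is a hypothetical illustration; the day-part categories and the “least-tried slot next” rule are assumptions, not a documented CATI algorithm.

```python
# Rotate call attempts across day-parts so that no case keeps being tried at
# the same time of day. The categories and tie-breaking rule are illustrative.
from collections import Counter

DAY_PARTS = ["weekday morning", "weekday afternoon", "weekday evening", "weekend"]

def next_day_part(call_history):
    """Return the least-tried day-part for the next attempt on a case."""
    counts = Counter(call_history)
    return min(DAY_PARTS, key=lambda part: (counts[part], DAY_PARTS.index(part)))

# A case already attempted twice on weekday mornings and once in the evening:
print(next_day_part(["weekday morning", "weekday morning", "weekday evening"]))
# -> "weekday afternoon"
```

The same record of attempted day-parts doubles as the call-history documentation recommended above, so supervisors can see at a glance whether a case has been tried across the full calling window.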

Prenotification is being used extensively in the studies reviewed earlier. Low-income populations are particularly difficult to reach by mail. For this reason, some attention to the form and content of this correspondence is likely worth a small investment of professional time. This includes, for example, the way the letters are addressed (e.g., labels, computer generated, handwritten), the method of delivery (express delivery versus first-class mail), and the clarity of the message. The contents of the letter should be structured to be as clear and as simple as possible. One study reviewed earlier noted an improvement (although not experimentally tested) when formatting letters with large subheadings and minimal text; the details surrounding the study were relegated to a question-and-answer sheet. We also have found this to be an improvement over the standard letter format. Similarly, use of express delivery, at least when there is some confidence in the validity of the respondent’s address, may be a cost-effective way to provide respondents with information about the survey that would increase their motivation to participate.

Incentives also are being used in the studies mentioned previously. We have not elaborated on this much, partly because another paper will be presented on just this topic. One pattern we did notice for the studies reviewed earlier was that all incentives are “promised” upon completion of the interview. Amounts generally ranged from $15 to $50, with the largest amounts being paid at the end of field periods to motivate the most reluctant respondents. Research has found that prepaid incentives are more effective than promised incentives. Research we have done in an RDD context has shown, in fact, that not only is prepayment more effective, but the amount of money needed to convince people to participate is much smaller. It may be worth experimenting with prepayments that are considerably smaller than the current promised incentives (e.g., $5) to see if there is a significant improvement in the ability to locate and interview respondents.
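
As a rough comparison of the budget implications, the sketch below contrasts a small prepaid incentive with a mid-range promised incentive. The $5 and $25 amounts come from the figures mentioned above; the sample size and number of completes are assumed for illustration, and the sketch says nothing about the relative response-rate effects of the two approaches.

```python
# Rough cost comparison: small prepaid incentive to every sampled case versus
# a larger promised incentive paid only on completion. Sample size and the
# number of completes are assumed illustrative values.
sample_size = 1000
completes = 700

prepaid_cost = 5 * sample_size       # $5 mailed up front to every sampled case
promised_cost = 25 * completes       # $25 paid only to those who complete

print(f"Prepaid $5 to all sampled cases:  ${prepaid_cost:,}")
print(f"Promised $25 to completers only:  ${promised_cost:,}")
```

Even when every sampled case receives the prepayment, the total outlay can be well below a promised incentive of the sizes reported above, which is one reason the experiment suggested here carries little budget risk.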

In conclusion, conducting a telephone survey of low-income populations is a task that requires careful preparation and monitoring. The states that have implemented surveys to this point have been discovering this as they attempt to locate and interview respondents. Improving response rates will require attention to enriching the information used to locate respondents, as well as to making it as easy as possible for respondents to participate. This paper has provided a thumbnail sketch of some important procedures to consider in achieving this goal. It will be interesting to see how future surveys adapt or innovate on these procedures to overcome the barriers they are currently encountering.

REFERENCES

Anglin, D.A., B. Danila, T. Ryan, and K. Mantius 1996 Staying in Touch: A Fieldwork Manual of Tracking Procedures for Locating Substance Abusers for Follow-Up Studies. Washington, DC: National Evaluation Data and Technical Assistance Center.

Bogen, K. 1996 The effect of questionnaire length on response rates—A review of the literature. In Proceedings of the Section on Survey Research Methods. Alexandria, VA: American Statistical Association.

Brick, J.M., M. Collins, and K. Chandler 1997 An Experiment in Random Digit Dial Screening. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Brick, J.M., I. Flores-Cervantes, and D. Cantor 1999 1997 NSAF Response Rates and Methods Evaluation. Report No. 8, NSAF Methodology Reports. Washington, DC: Urban Institute.

Camburn, D.P., P.J. Lavrakas, M.P. Battaglia, J.T. Massey, and R.A. Wright 1995 Using Advance Respondent Letters in Random-Digit-Dialing Telephone Surveys. Unpublished paper presented at the American Association for Public Opinion Research Conference, Fort Lauderdale, FL, May 18–21.

Cantor, D. 1995 Prevalence of Drug Use in the DC Metropolitan Area Juvenile and Adult Offender Populations: 1991. Washington, DC: National Institute on Drug Abuse.

Cantor, D., P. Cunningham, and P. Giambo 1999 The Use of Pre-Paid Incentives and Express Delivery on a Random Digit Dial Survey. Unpublished paper presented at the International Conference on Survey Nonresponse, Portland, October.

Cantor, D., and K. Wang 1998 Correlates of Measurement Error When Screening on Poverty Status for a Random Digit Dial Survey. Unpublished paper presented at the meeting of the American Statistical Association, Dallas, August.

Collins, M., W. Sykes, P. Wilson, and N. Blackshaw 1988 Non-response: The UK experience. Pp. 213–232 in Telephone Survey Methodology, R.M. Groves, P.P. Biemer, L.E. Lyberg, J.T. Massey, W.L. Nicholls, and J. Waksberg, eds. New York: John Wiley and Sons.

Couper, M., R.M. Groves, and R.B. Cialdini 1992 Understanding the decision to participate in a survey. Public Opinion Quarterly 56:475–495.

Dillman, D. 1978 Mail and Telephone Surveys: The Total Design Method. New York: John Wiley and Sons.

Dillman, D., J.G. Gallegos, and J.H. Frey 1976 Reducing refusal rates for telephone interviews. Public Opinion Quarterly 40:66–78.

Everett, S.E., and S.C. Everett 1989 Effects of Interviewer Affiliation and Sex Upon Telephone Survey Refusal Rates. Paper presented at the Annual Conference of the Midwest Association for Public Opinion Research, Chicago.

Flores-Cervantes, I., J.M. Brick, T. Hankins, and K. Wang 1999 Evaluation of the Use of Data on Interruption in Telephone Service. Unpublished paper presented at the meeting of the American Statistical Association, Baltimore, August 5–12.

Frankel, M.R., K.P. Srinath, M.P. Battaglia, D.C. Hoaglin, R.A. Wright, and P.J. Smith 1999 Reducing nontelephone bias in RDD surveys. Pp. 934–939 in Proceedings of the Section on Survey Research Methods. Alexandria, VA: American Statistical Association.

Goyder, J. 1987 The Silent Minority: Nonrespondents on Sample Surveys. Cambridge, Eng.: Polity Press.

Groves, R., D. Cantor, K. McGonagle, and J. Van Hoewyk 1997 Research Investigations in Gaining Participation from Sample Firms in the Current Employment Statistics Program. Unpublished paper presented at the Annual Meeting of the American Statistical Association, Anaheim, CA, August 10–14.

Groves, R.M., and M.P. Couper 1998 Nonresponse in Household Interview Surveys. New York: John Wiley & Sons.

Groves, R., and N. Fultz 1985 Gender effects among telephone interviewers in a survey of economic attributes. Sociological Methods and Research 14:31–52.

Groves, R.M., C. Cannell, and M. O’Neil 1979 Telephone interview introductions and refusal rates: Experiments in increasing respondent cooperation. Pp. 252–255 in Proceedings of the Section on Survey Research Methods. Alexandria, VA: American Statistical Association.

Keeter, S. 1995 Estimating telephone noncoverage bias with a telephone survey. Public Opinion Quarterly 59:196–217.

Massey, J., D. O’Connor, K. Krotki, and K. Chandler 1998 An investigation of response rates in random digit dialing (RDD) telephone surveys. In Proceedings of the Section on Survey Research Methods. Alexandria, VA: American Statistical Association.

Moeller, J.F., and N.A. Mathiowetz 1994 Problems of screening for poverty status. Journal of Official Statistics 10:327–337.

Oldendick, R.W., and M.W. Link 1994 The answering machine generation: Who are they and what problem do they pose for survey research? Public Opinion Quarterly 58:264–273.

Sobal, J. 1982 Disclosing information in interview introductions: Methodological consequences of informed consent. Sociology and Social Research 66:349–361.

Stapulonis, R.A., M. Kovac, and T.M. Fraker 1999 Surveying Current and Former TANF Recipients in Iowa. Unpublished paper presented at the 21st Annual Research Conference of the Association for Public Policy Analysis and Management, Washington, DC. Mathematica Policy Research, Inc.

STORES 1995 Answering machines hold off voice mail challenge. STORES 77(11):38–39.

Thornberry, O.T., and J.T. Massey 1988 Trends in United States telephone coverage across time and subgroups. Pp. 25–50 in Telephone Survey Methodology, R.M. Groves, P.P. Biemer, L.E. Lyberg, J.T. Massey, W.L. Nicholls, and J. Waksberg, eds. New York: John Wiley and Sons.

Traugott, M., J. Lepkowski, and P. Weiss 1997 An Investigation of Methods for Matching RDD Respondents with Contact Information for Validation Studies. Unpublished paper presented at the Annual Meeting of the American Association for Public Opinion Research, Norfolk, VA, May 15–17.

Tuckel, P.S., and B.M. Feinberg 1991 The answering machine poses many questions for telephone survey researchers. Public Opinion Quarterly 55:200–217.

Tuckel, P., and H. O’Neil 1996 New technology and nonresponse bias in RDD surveys. Pp. 889–894 in Proceedings of the Section on Survey Research Methods. Alexandria, VA: American Statistical Association.

Xu, M., B. Bates, and J.C. Schweitzer 1993 The impact of messages on survey participation in answering machine households. Public Opinion Quarterly 57:232–237.
