Web surveys are increasingly being incorporated into national survey data collection programs in the United States because of their cost and time efficiencies. Yet response rates and data quality remain important challenges for web surveys. As a basic study designed to better understand data quality in a mixed-mode national survey, this article investigates the degree to which web versus mail survey modes affect unit and item responses. Findings indicate that the web survey mode produces a lower unit response rate than the mail mode. However, the web mode elicits higher data quality in terms of item responses to both closed- and open-ended questions. These mode effects on data quality remain after sociodemographic variables are held constant. Given the increasing integration of web questionnaires into mixed-mode studies, additional research is needed to understand and document the processes that underlie mode differences in responding to self-administered surveys.
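As an illustration of the kind of adjusted comparison described above, a minimal sketch follows, assuming a logistic regression of unit response on survey mode with sociodemographic controls; all variable names and data are hypothetical and not drawn from the study.

```python
# Hypothetical sketch: mode effect on unit response with sociodemographics held constant.
# Data and variable names are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "web_mode": rng.integers(0, 2, n),   # 1 = web, 0 = mail
    "age": rng.integers(18, 85, n),
    "female": rng.integers(0, 2, n),
    "college": rng.integers(0, 2, n),
})
# Simulate a lower unit response rate in the web mode, as the abstract reports
linpred = -0.2 - 0.4 * df["web_mode"] + 0.01 * (df["age"] - 50)
df["responded"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

# Mode coefficient after adjusting for sociodemographic covariates
model = smf.logit("responded ~ web_mode + age + female + college", data=df).fit()
print(model.summary())
```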
Decreasing survey response rates are a growing concern because survey estimates may be biased by selective non-response. One method of assessing non-response bias is to examine the timing of survey response, specifically by comparing those who respond late to a survey with those who respond early. This paper draws upon data from multiple panel surveys conducted over a six-month period and examines whether early, intermediate, and late respondents differ significantly in demographics or in their responses to survey questions. Treating response timing as a repeated behaviour, or habit, spanning multiple surveys, we develop a longitudinal measure of response timing to identify the predictors of responding early to multiple surveys conducted over a period of time. Results indicate some directional differences in demographics and better data quality among early respondents compared to their intermediate and late counterparts. We discuss the findings from the study and conclude with recommendations for future research.
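A longitudinal timing measure of this kind could, for example, be operationalized along the following lines; this is a minimal sketch assuming tercile-based early/intermediate/late classes within each survey and a simple count of early responses as the habit measure, with made-up respondent data.

```python
# Hypothetical sketch of a longitudinal response-timing measure:
# classify each respondent as early/intermediate/late within each survey,
# then count how often they fall in the "early" group across surveys.
import pandas as pd

# Illustrative data: days from invitation to response, per respondent per survey
long = pd.DataFrame({
    "respondent": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "survey":     ["s1", "s2", "s3"] * 3,
    "days_to_respond": [1, 2, 1, 7, 9, 6, 15, 20, 18],
})

# Tercile-based timing class within each survey
long["timing"] = (
    long.groupby("survey")["days_to_respond"]
        .transform(lambda d: pd.qcut(d, 3,
                   labels=["early", "intermediate", "late"]).astype(str))
)

# Habit measure: number of surveys answered early, out of surveys answered
habit = (
    long.assign(early=long["timing"].eq("early"))
        .groupby("respondent")["early"]
        .agg(n_early="sum", n_surveys="count")
)
print(habit)
```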
Survey response rates have been declining over the past several decades, particularly for random-digit-dialing (RDD) telephone surveys (see de Leeuw and de Heer 2002; Steeh 1981). This trend affects research panels such as the Gallup Panel, which uses RDD methodology to recruit its members. If significant improvements in panel recruitment response rates are to be achieved, new approaches must be considered. This paper presents the findings of a mail and telephone mode experiment conducted by the Gallup Panel to analyze the individual and combined effects of incentives, advance letters, and follow-up telephone calls on the panel recruitment response rate. Study results indicate that the mail recruitment approach nets a higher panel response rate and is significantly more cost-effective than the telephone recruitment approach. Results also suggest that the advance letter, incentive, and telephone follow-up conditions each have an independent, positive influence on the response rate, and that groups receiving an advance letter, an incentive, or a follow-up telephone call have higher panel recruitment response rates than the control group.
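A minimal sketch of how the marginal effect of each experimental factor on the recruitment rate might be tabulated is shown below; the cell counts are invented for illustration and do not reflect the study's actual results.

```python
# Hypothetical sketch: recruitment rates in a letter x incentive x follow-up design.
# All counts are made up for illustration.
import pandas as pd

cells = pd.DataFrame({
    "advance_letter": [0, 0, 0, 0, 1, 1, 1, 1],
    "incentive":      [0, 0, 1, 1, 0, 0, 1, 1],
    "phone_followup": [0, 1, 0, 1, 0, 1, 0, 1],
    "invited":        [500] * 8,
    "recruited":      [90, 120, 130, 160, 110, 140, 150, 190],
})
cells["response_rate"] = cells["recruited"] / cells["invited"]

# Marginal difference in recruitment rate for each factor (treated minus control)
for factor in ["advance_letter", "incentive", "phone_followup"]:
    grp = cells.groupby(factor)[["recruited", "invited"]].sum()
    rates = grp["recruited"] / grp["invited"]
    print(factor, round(rates[1] - rates[0], 3))
```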
Despite higher hit rates for cell phone samples, inefficiencies in processing calls to these numbers relative to landline numbers continue to be documented in the U.S. literature. In this study, we propose one method of using cell phone provider information and Internet resources to validate number status. Specifically, we describe how we used "in network" options available on three major providers' websites to determine the validity of cell phone numbers. We tested differences in working number rates (WNRs) between valid and nonvalid numbers against a normal-processing control group and found that the WNR among valid numbers was approximately 14 percentage points higher than the WNR of the comparison group. This process also shows promise for reducing the effort required to determine working status and may provide a basis for developing screening tools for cell phones that capitalize on resources unique to this technology.
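The working-number-rate comparison amounts to a simple difference in proportions; the sketch below uses invented counts chosen only to reproduce a gap of roughly 14 percentage points, not the study's actual figures.

```python
# Hypothetical sketch of the working-number-rate (WNR) comparison.
# Counts are illustrative; the study reports roughly a 14-point gap.
def wnr(working, dialed):
    """Working number rate: share of dialed numbers confirmed as working."""
    return working / dialed

validated_wnr = wnr(working=370, dialed=500)   # numbers flagged valid via provider sites
control_wnr   = wnr(working=360, dialed=600)   # normal-processing control group

gap_points = 100 * (validated_wnr - control_wnr)
print(f"Validated: {validated_wnr:.1%}, control: {control_wnr:.1%}, "
      f"difference: {gap_points:.0f} percentage points")
```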