Breakoffs, in which respondents quit a web survey before completing it, are a prevalent problem in data collection. To prevent breakoff bias, it is crucial to keep as many diverse respondents in a web survey as possible. As a first step toward preventing breakoffs, this study aims to understand breakoff and the associated response behavior. We analyze data from an annual online survey using dynamic survival models and ROC analyses. We find that breakoff risk does not differ between respondents using mobile devices and those using PCs at the beginning of the questionnaire, but the risk for mobile device users increases as the survey progresses. Both very fast respondents and respondents with variable response times have a higher risk of quitting the questionnaire than respondents with slower, steadier response times. We conclude with a discussion of the implications of these findings for future practice and research in web survey methodology.
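The abstract does not report the model specification, but a dynamic survival analysis of breakoff is commonly set up as a discrete-time hazard model: each questionnaire page a respondent reaches becomes one record, and a logistic regression with a page-by-device interaction lets the mobile effect grow over the questionnaire, as the abstract describes. The following is a minimal sketch on synthetic data; the covariate names (page, mobile, fast) and all numeric values are hypothetical illustrations, not taken from the study.

```python
# Illustrative sketch only: a discrete-time ("dynamic") survival model for
# breakoff, fit as a person-period logistic regression on synthetic data.
# Covariate names and effect sizes are hypothetical, not from the paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_resp, n_pages = 500, 20

rows = []
for i in range(n_resp):
    mobile = rng.integers(0, 2)          # 1 = mobile device, 0 = PC
    fast = rng.integers(0, 2)            # 1 = very fast response times
    for page in range(1, n_pages + 1):
        # Hazard grows with page number for mobile users (interaction),
        # mirroring the pattern the abstract describes.
        logit = -5.0 + 0.08 * page * mobile + 0.8 * fast
        quit_now = rng.binomial(1, 1 / (1 + np.exp(-logit)))
        rows.append((i, page, mobile, fast, quit_now))
        if quit_now:
            break                        # respondent leaves the risk set

pp = pd.DataFrame(rows, columns=["id", "page", "mobile", "fast", "breakoff"])

# page * mobile expands to page + mobile + page:mobile, so the device
# effect on breakoff risk is allowed to change as the survey progresses.
model = smf.logit("breakoff ~ page * mobile + fast", data=pp).fit(disp=0)
print(model.summary())

# ROC analysis: how well the fitted hazards discriminate breakoff records.
auc = roc_auc_score(pp["breakoff"], model.predict(pp))
print(f"AUC = {auc:.3f}")
```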
Several studies have shown that conversational interviewing (CI) reduces response bias for complex survey questions relative to standardized interviewing. However, no studies have addressed concerns about whether CI increases intra-interviewer correlations (IICs) in the responses collected, which could negatively impact the overall quality of survey estimates. This paper reports the results of an experimental investigation addressing this question in a national face-to-face survey. We find that CI improves response quality, as in previous studies, without substantially or frequently increasing IICs. Furthermore, any slight increases in the IICs do not offset the reduced bias in survey estimates engendered by CI.
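The abstract does not spell out how the IICs were computed. In survey methodology they are typically estimated as intraclass correlations from an interviewer-level random-intercept model: the share of response variance attributable to interviewers, rho = var_between / (var_between + var_within). A minimal sketch on synthetic data follows; the column names (y, interviewer) and variance components are hypothetical, not drawn from the paper.

```python
# Illustrative sketch only: estimating an intra-interviewer correlation (IIC)
# as the intraclass correlation from a random-intercept model, on synthetic
# data. Column names and parameter values are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_interviewers, n_per = 50, 20

interviewer = np.repeat(np.arange(n_interviewers), n_per)
u = rng.normal(0, 0.3, n_interviewers)      # interviewer random effects
y = 2.0 + u[interviewer] + rng.normal(0, 1.0, interviewer.size)
df = pd.DataFrame({"y": y, "interviewer": interviewer})

fit = smf.mixedlm("y ~ 1", df, groups=df["interviewer"]).fit()
var_between = float(fit.cov_re.iloc[0, 0])  # variance between interviewers
var_within = fit.scale                      # residual (within) variance
iic = var_between / (var_between + var_within)
print(f"IIC (rho) = {iic:.3f}")             # true value here: 0.09/1.09 ~ 0.083
```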
Standardized interviewing (SI) and conversational interviewing (CI) are two approaches to collecting survey data that differ in how interviewers address respondent confusion. This article examines interviewer–respondent interactions under these two techniques, focusing on requests for and provisions of clarification. The data derive from an experimental study in Germany in which face-to-face interviews were audio-recorded; a sample of 111 interviews was coded in detail. We find that conversational interviewers do make use of their ability to clarify respondent confusion. Although CI improved response accuracy in the main study relative to SI, conversational interviewers seem to provide clarification even when there is no evidence of respondent confusion, which may lengthen administration time and increase data collection costs. Conversational interviewers also employ neutral probes, which are generally associated with standardized interviews, at an unexpectedly high rate. We conclude with suggestions for practice and directions for future research.