Interviewer (mis)behavior has been shown to change with interviewers’ professional experience (general experience) and experience gained during the field period (survey experience). We extend this line of research by using both types of experience to analyze interviewer effects on a core quality indicator: interview duration. To understand whether the effect of interviewer experience on duration is driven by increased efficiency or by deviant behavior—both mechanisms that shorten interview durations—we additionally examine, over the field period, the rate at which filter questions are triggered to avoid burdensome follow-up questions, as well as response differentiation. Using multilevel models and data from a large-scale survey of a special and difficult-to-interview population of refugees in Germany, we find that interview duration decreases with increasing survey experience, particularly among generally inexperienced interviewers. However, this effect is not found for the triggering rate or response differentiation. The results are robust to different sample and model specifications. We conclude that the underlying mechanism driving interview duration is related to increasing efficiency, not deviant behavior.
Motivation: When collecting evidence from the field, the quality of the data determines the reliability of the analysis. When data are collected in the field by enumerators, the latter's performance needs to be monitored to avoid errant behaviour that could compromise data quality.

Purpose: We show how paradata on the process of data collection itself can improve enumerator performance, using a household survey in India as a case study.

Approach and methods: We conducted action research to improve data quality in the India Working Study conducted in early 2020 in Karnataka and Rajasthan. We designed indicators (flags) from the paradata to mark potential deviant enumerator behaviour in the early stages of the survey. Flagged enumerators were contacted by supervisors who provided constructive feedback. We then measured the performance of the flagged enumerators over the remainder of the survey. We were able to benchmark specific groups of enumerators facing similar field conditions, namely location and gender of respondents. This allowed us to compare enumerators to a subset of their peers, rather than the entire set of enumerators.

Findings: Our feedback improved enumerator behaviour in the field: flagged enumerators subsequently spent more time on a core module of the questionnaire.

Policy implications: In any survey, two objectives compete: completing a fixed number of interviews per day to reduce costs, versus enumerators spending enough time with each respondent to collect meaningful data. To strike a balance between these competing demands, we recommend tracking three paradata indicators: the count of completed interviews; the average time per completed interview; and the ratio of completed to initiated interviews. We recommend using paradata to improve the quality of data when surveying, thereby reducing standard errors for estimates based on the data and leading to more reliable analysis.
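The three paradata indicators recommended above can be computed directly from interview-level records. Below is a minimal sketch in Python; the record structure, field names, and function name are illustrative assumptions, not taken from the study's actual paradata system:

```python
from dataclasses import dataclass
from typing import List, Optional, Dict

@dataclass
class InterviewRecord:
    """One initiated interview. Field names are hypothetical, for illustration only."""
    enumerator_id: str
    completed: bool
    duration_minutes: Optional[float] = None  # None if the interview was not completed

def paradata_indicators(records: List[InterviewRecord]) -> Dict[str, dict]:
    """Per enumerator, compute the three recommended indicators:
    completed count, average time per completed interview,
    and the ratio of completed to initiated interviews."""
    totals: Dict[str, dict] = {}
    for r in records:
        s = totals.setdefault(r.enumerator_id,
                              {"initiated": 0, "completed": 0, "time": 0.0})
        s["initiated"] += 1
        if r.completed:
            s["completed"] += 1
            s["time"] += r.duration_minutes or 0.0
    return {
        eid: {
            "completed_count": s["completed"],
            "avg_time_per_completed": (s["time"] / s["completed"]
                                       if s["completed"] else None),
            "completion_ratio": s["completed"] / s["initiated"],
        }
        for eid, s in totals.items()
    }
```

In practice, supervisors would flag enumerators whose indicators deviate from those of peers facing similar field conditions (same location, same respondent gender), rather than from the pooled average.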
Over its 50 years of existence, the National Crime Victimization Survey (NCVS) has changed to address the informational needs of its users, as well as adapt to an evolving survey climate. Perhaps more than any other victim survey in the world, data quality has been an emphasis. In this article, the evolution of the methodology of the survey is described up to the point of the 2024 redesign. The first section provides an abbreviated description of the innovations and methods instituted up through the redesign in 1992. This section highlights key milestones and refers the reader to more detailed descriptions of the origins of the NCVS and the 1992 redesign. The second section provides more detail on methodological issues and changes since the 1992 redesign. This article closes with the challenges the survey faces moving forward into the mid-21st century.