2022
DOI: 10.1017/psrm.2022.30

A randomized experiment evaluating survey mode effects for video interviewing

Abstract: Rising costs and challenges of in-person interviewing have prompted major surveys to consider moving online and conducting live web-based video interviews. In this paper, we evaluate video mode effects using a two-wave experimental design in which respondents were randomized to either an interviewer-administered video or interviewer-administered in-person survey wave after completing a self-administered online survey wave. This design permits testing of both within- and between-subject differences across surve…

Cited by 7 publications (8 citation statements)
References 36 publications
“…As with all data collection methods, a challenge with mixed-mode and innovative methods is ensuring the quality of the survey data. Mixing modes results in survey data being collected under very different conditions (e.g., interviewer presence or absence, verbal or visual presentation of question stimuli, differing question formats); thus, mode effects can affect data quality and survey estimates (Conrad et al., 2022; de Leeuw & Hox, 2015; Endres et al., 2022; Lugtig et al., 2011; West et al., 2022). Moreover, when relatively new data collection methods are used that are unfamiliar to both interviewers and respondents, such as CAVI, little is known about the problems that may occur during the interview, such as interrupted speech and frozen or distorted video (Conrad et al., 2022), and about the impact of the new interview situation and the problems encountered on response behavior and data quality.…”
Section: Challenges in Data Quality
Citation type: mentioning (confidence: 99%)
“…Even if the technical requirements are met, not all respondents are ready for and comfortable with a video interview. Like KtN, CAVI involves comprehensive scheduling (Endres et al., 2022; Schober et al., 2020). Respondents’ varying ability and willingness to participate and the more complex call scheduling, particularly for CAVI and KtN, underscore the importance of tailored fieldwork management, propensity modeling, and nonresponse bias analysis.…”
Section: Interviewer-observed Paradata in Mixed-mode and Innovative D...
Citation type: mentioning (confidence: 99%)
“…Research has tested mode effects extensively in the context of Europe and North America (e.g., Heerwegh 2009; Atkeson et al. 2014; Aquilino 1994; Pasek 2016; Endres et al. 2023) and increasingly in developing country contexts (e.g., Boas et al. 2020; Greenleaf et al. 2020).…”
Citation type: mentioning (confidence: 99%)
“…The focus is on willingness to participate rather than actual participation, as part of exploring the viability of live video surveys, based on evidence from other technologies that the behavioral intention to use a technology can directly affect actual usage behavior (Davis & Wiedenbeck, 2001). The point of the study is not to generalize to a full demographic analysis of U.S. preferences at the moment of data collection, nor to test the quality of survey responses in live video versus other modes in actual interviews (e.g., Conrad et al., 2022; Endres et al., 2023), which are currently rare, but rather to test targeted hypotheses about what may affect people’s behavioral intention, their willingness to participate, as new norms of video usage emerge.…”
Citation type: mentioning (confidence: 99%)