2019
DOI: 10.1177/0894439319834289
How Effective Are Eye-Tracking Data in Identifying Problematic Questions?

Abstract: To collect high-quality data, survey designers aim to develop questions that each respondent can understand as intended. A critical step to this end is designing questions that minimize the respondents' burden by reducing the cognitive effort required to comprehend and answer them. One promising technique for identifying problematic survey questions is eye tracking. This article investigates the potential of eye movements and pupil dilations as indicators for evaluating survey questions. Respondents were rando…

Cited by 6 publications (3 citation statements)
References 33 publications
“…Eye-tracking has also been used to collect data on participants' pupil dilation, item fixation and duration alongside think-aloud interviews to evaluate challenges during the response process. [72][73][74][75][76]…”
Section: Alternatives to Cognitive Interviews (mentioning)
confidence: 99%
“…In the related field of sociological methodology, the specific request for a multimodal assessment of respondents' cognitive load and perceived mental effort and their impact on the quality of survey data has also been growing over recent years [Deviatko, Lebedev, 2017; Höhne, Schlosser, Krebs, 2017; Höhne, Lenzner, 2018; Kaminska, Foulsham, 2014; Neuert, 2021; Stodel, 2015]. At the same time, the possibilities of relatively new approaches to measuring survey-related cognitive load using unobtrusive and noninvasive neurophysiological methods, such as modern portable and wearable devices for eye tracking and pupillometry, remain rather underestimated, despite the fact that these devices have proved instrumental in conducting accurate comparisons of the oculographic indicators of cognitive effort related to processing specific question formats and response categories [Höhne, 2019] and different survey modes [Deviatko, Bogdanov, Lebedev, 2021], as well as in identifying problematic survey questions that lead to excessive respondent burden [Neuert, 2020]. The latter strain of research demonstrated, in particular, that the long-debated possible advantage of the item-specific question format over the agree/disagree (A/D) format in susceptibility to response bias is counterbalanced by deeper cognitive processing, as measured by markedly longer fixations on response categories for the A/D format [Höhne, 2019], while fixation times seemingly turned out to be more sensitive than pupil data in revealing problematic, poorly worded questions [Neuert, 2020].…”
Section: Introduction (mentioning)
confidence: 99%
“…However, the possible differences in task-related cognitive load associated with the normative or factual judgments made by survey respondents, which are the focus of this article, still remain relatively unexplored with both more traditional and relatively newer methods.…”
Section: Introduction (mentioning)
confidence: 99%