Survey data quality suffers when respondents have difficulty completing complex tasks in questionnaires. Cognitive load theory informed the development of strategies for educators to reduce the cognitive load of learning tasks. We investigate whether these cognitive load reduction strategies can be used in questionnaire design to reduce task difficulty and, in so doing, improve survey data quality. We find that this is not the case and conclude that some traditional survey answer formats, such as grid questions, which have been criticized in the past, lead to equally good data and do not frustrate respondents more than alternative formats.
Tracking respondents’ eyes while they complete a survey reveals that (a) they do not read instructions, survey questions, and answer options carefully enough, investing as little as 32% of the required time; (b) their attention diminishes over the course of the survey; and (c) their self-reports of the survey experience do not reflect actual survey completion behavior. As much as 15% of survey data may be negatively affected by systematic respondent inattention. From these findings, we derive practical recommendations on how to improve pre-testing of surveys and how to reduce the likelihood that respondents ignore instructions and fail to read survey questions and answer options.
Low survey participation from online panel members is a key challenge for market and social researchers. We identify 10 key drivers of panel members’ online survey participation from a qualitative study and then empirically determine, using a stated choice experiment, the relative importance of each of those drivers at the aggregate and segment levels. We contribute to knowledge on survey participation by (a) eliciting key drivers of survey participation by online panel members, (b) determining the relative importance of each driver, and (c) accounting for heterogeneity across panel members in the importance assigned to drivers. Findings offer immediate practical guidance to market and social researchers on how to increase participation in surveys using online panels.