2022
DOI: 10.1037/pha0000564

Evaluating and improving the quality of survey data from panel and crowd-sourced samples: A practical guide for psychological research.

Abstract: The use of crowd-sourced and panel survey data in addiction research has become widespread. However, the validity of data obtained from newer panels such as Qualtrics has not been extensively evaluated. Furthermore, few addiction researchers appear to employ previously recommended guidelines for maximizing the quality of data obtained from panel samples. The goals of the present study were as follows: (a) to evaluate the quality of survey data obtained from Qualtrics including an evaluation of the company's in…

Cited by 32 publications (10 citation statements)
References 35 publications
“…On average, the survey took approximately 25 min for participants to complete. Women were compensated approximately $20 to participate, but the exact rate was designated by Qualtrics (see Belliveau & Yakovenko, 2022, for additional information on Qualtrics panels).…”
Section: Economic Resilience and Polystrengths
mentioning
confidence: 99%
“…They also provide practical recommendations to address these issues to include the use of both overt and covert fidelity measures. Belliveau and Yakovenko (2022) complement these findings by providing a practical implementation guide with step-by-step instructions for screening for speeding, straight-lining (i.e., tendency to make the same response in a group of questions), inconsistent responding, nonsensical responding, and missing data. Open-source code for conducting these procedures is provided for those looking to adopt these methods in their own work.…”
mentioning
confidence: 93%
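Belliveau and Yakovenko (2022) distribute their own open-source code for these screening steps. As a rough illustration only, and not their published code, the sketch below shows how speeding, straight-lining, and missing-data flags might be computed with pandas. The file name, the `duration_sec` column, the 10-item Likert block `q1`–`q10`, and the cutoffs (completion time under one-third of the median; more than 10% missing items) are all hypothetical choices.

```python
import pandas as pd

# Hypothetical survey data: completion time in seconds plus a
# 10-item Likert block q1..q10 (values 1-5, NaN = missing).
df = pd.read_csv("survey_responses.csv")
likert_items = [f"q{i}" for i in range(1, 11)]

# Speeding: flag respondents who finished in less than one-third
# of the median completion time (an illustrative cutoff).
speed_cutoff = df["duration_sec"].median() / 3
df["flag_speeding"] = df["duration_sec"] < speed_cutoff

# Straight-lining: flag respondents who gave the identical answer
# to every item in the Likert block.
df["flag_straightlining"] = df[likert_items].nunique(axis=1) == 1

# Missing data: flag respondents who skipped more than 10% of items.
df["flag_missing"] = df[likert_items].isna().mean(axis=1) > 0.10

# Count how many quality flags each respondent triggered; researchers
# typically review or exclude multi-flag cases rather than dropping
# anyone on a single indicator.
flag_cols = ["flag_speeding", "flag_straightlining", "flag_missing"]
df["n_flags"] = df[flag_cols].sum(axis=1)
print(df["n_flags"].value_counts().sort_index())
```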
“…Compared with other crowdsourcing platforms, Prolific has been found to have fewer careless errors and better data quality than MTurk (Jones et al., 2022; Peer et al., 2022). Several strategies for evaluating and confirming the quality of data obtained from crowdsource platforms (Belliveau & Yakovenko, 2022) were utilized in this study, including attention to speeding (completing a questionnaire overly quickly), straight lining (responding the same across all questions), inconsistent responding, and missing responses.…”
mentioning
confidence: 99%
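Both statements also list inconsistent responding among the screening targets. One common way to operationalize that check, shown in the hypothetical sketch below rather than as the procedure used in any of the cited papers, is to compare answers on pairs of reverse-coded items: after recoding the reversed item, the two answers should roughly agree, so a large average gap suggests careless responding. The item pairs, scale maximum, and gap threshold are assumptions for illustration.

```python
import pandas as pd

# Hypothetical reverse-coded item pairs on a 1-5 Likert scale:
# q2 and q7 are reversed versions of q1 and q5, respectively.
ITEM_PAIRS = [("q1", "q2"), ("q5", "q7")]
SCALE_MAX = 5  # a reversed item is recoded as (SCALE_MAX + 1 - x)

def flag_inconsistent(df: pd.DataFrame, max_gap: float = 2.0) -> pd.Series:
    """Flag respondents whose recoded reverse-item answers diverge
    from the paired item by more than `max_gap` points on average."""
    gaps = []
    for item, reversed_item in ITEM_PAIRS:
        recoded = (SCALE_MAX + 1) - df[reversed_item]
        gaps.append((df[item] - recoded).abs())
    mean_gap = pd.concat(gaps, axis=1).mean(axis=1)
    return mean_gap > max_gap

df = pd.read_csv("survey_responses.csv")
df["flag_inconsistent"] = flag_inconsistent(df)
print(df["flag_inconsistent"].mean())  # proportion of respondents flagged
```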