2017
DOI: 10.2308/bria-51977
A Technical Guide to Using Amazon's Mechanical Turk in Behavioral Accounting Research

Abstract: Multiple social science researchers claim that online data collection, mainly via Amazon's Mechanical Turk (MTurk), has revolutionized the behavioral sciences (Gureckis et al. 2016; Litman, Robinson, and Abberbock 2017). While MTurk-based research has grown exponentially in recent years (Chandler and Shapiro 2016), reasonable concerns have been raised about online research participants' ability to proxy for traditional research participants (Chandler, Mueller, and Paolacci 2014). This paper reviews recent MTur…

Cited by 89 publications (41 citation statements). References 46 publications.
“…If it is not compatible, the auditor will have to deepen the evaluation. If the auditor finds that his preliminary assessment of population characteristics needs to be revised, he has the option of requesting help from management to investigate potential errors, making any adjustments he deems necessary [21].…”
Section: Evaluation of Sample Results
Citation type: mentioning
Confidence: 99%
“…However, while accounting research has found that AMT users are appropriate proxies for investors (Farrell et al, 2016;Krische, 2018), other research has identified potential limitations which researchers must plan around when using these platforms. To mitigate concerns around response quality, I followed current research recommendations by including distractor questions during the screening process to minimize demand effects (Buchheit et al, 2018), as well as asking verification questions at the end of the study to establish consistency in responses. Full details of these controls are given later.…”
Section: Methods
Citation type: mentioning
Confidence: 99%
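The quality controls described in the statement above (screening-stage distractor items plus end-of-study verification questions checked for consistency) can be sketched as a simple consistency filter. This is a hypothetical illustration, not the cited study's actual procedure; the field names (`screen`, `verify`, `age`, `invests`) are invented for the example:

```python
# Hypothetical sketch: flag MTurk respondents whose end-of-study
# verification answers do not match their screening answers.

def consistent(screening: dict, verification: dict) -> bool:
    """True when every verification item repeats the screening answer."""
    return all(screening.get(k) == v for k, v in verification.items())

responses = [
    {"id": "w1", "screen": {"age": "35", "invests": "yes"},
     "verify": {"age": "35", "invests": "yes"}},
    {"id": "w2", "screen": {"age": "42", "invests": "no"},
     "verify": {"age": "29", "invests": "no"}},  # age answer changed
]

# Keep only respondents who answered consistently at both stages.
kept = [r["id"] for r in responses if consistent(r["screen"], r["verify"])]
print(kept)  # → ['w1']
```

In practice a researcher would also screen on the distractor items themselves and may tolerate near-matches on free-text fields; exact equality is used here only to keep the sketch minimal.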
“…Subjects for the experiment were recruited with Amazon MTurk, a crowdsourcing online platform increasingly used by researchers for many purposes (Buchheit et al 2018). MTurk assembles individuals willing to do online tasks for relatively small monetary payments.…”
Section: Methods
Citation type: mentioning
Confidence: 99%