2021
DOI: 10.1080/15366367.2020.1750934
Effects of Motivation on the Accuracy and Speed of Responding in Tests: The Speed-Accuracy Tradeoff Revisited

Cited by 13 publications (9 citation statements)
References 70 publications
“…Interestingly, we also found an interaction between participant ability and item difficulty: whereas the worst-performing participants maintained the same work rate irrespective of item difficulty, the best-performing participants spent more time deliberating as items became more challenging. These results are consistent with a persistence interpretation of ability, wherein good task performance in part reflects a willingness to invest time in the solution process and poor task performance reflects a tendency to give up sooner (Ranger, Kuhn, & Pohl, 2021). As such, the MaRs-IB may be suitable not only for measuring matrix reasoning ability but also mental effort costs (Kool & Botvinick, 2018), opportunity costs (Payne, Bettman, & Luce, 1996), or other motivational factors (Duckworth et al., 2011) related to people's tendency to exert effort or give up.…”
Section: Discussion (supporting)
confidence: 81%
“…We aim to study, in a variety of data, whether the general intuition behind the SAT holds. Conceptually, this study builds on work suggesting that additional time spent on a response does not always increase its accuracy (Bolsinova & Molenaar, 2018; Goldhammer et al., 2014; Ranger et al., 2021). For example, Chen et al. (2018) discuss a curvilinear relationship between RT and accuracy: Increases in time spent on an item were associated with increases in accuracy, but only up to a certain point.…”
Section: Introduction (mentioning)
confidence: 98%
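The curvilinear RT–accuracy relationship described in the excerpt above can be illustrated with a toy logistic model whose log-odds of a correct response are quadratic in response time, so accuracy first rises with time spent and then falls. This is a minimal sketch for intuition only; the function name and all parameter values are hypothetical and are not estimates from any cited study.

```python
import math

def p_correct(rt, b0=-2.0, b1=1.2, b2=-0.15):
    """Hypothetical model: log-odds of a correct response are
    quadratic in response time `rt` (seconds). With b2 < 0 the
    curve is an inverted U, i.e., accuracy rises with deliberation
    time up to a turning point and declines thereafter."""
    logit = b0 + b1 * rt + b2 * rt ** 2
    return 1.0 / (1.0 + math.exp(-logit))

# The turning point of the quadratic log-odds is rt = -b1 / (2 * b2);
# with these illustrative parameters that is 4.0 seconds.
peak_rt = -1.2 / (2 * -0.15)
```

With these parameters, `p_correct(1.0)` is below `p_correct(4.0)`, and `p_correct(7.0)` is again lower, matching the "up to a certain point" pattern attributed to Chen et al. (2018).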
“…The use of response time data is one of the most established applications of response process data in assessment, with a degree of formalization about its methods (Lee & Jia, 2014; Li et al., 2017; Wise, 2019) and related discussion of validation (Goldhammer et al., 2021; Li et al., 2017; Reis Costa et al., 2021; Wise, 2017, 2019). The digital transition has witnessed extensive application, such as measures to support inferences about rapid guessing behaviors and disengagement in large-scale assessments (Ercikan et al., 2020; Goldhammer et al., 2017; Kroehne et al., 2020; Lundgren & Eklöf, 2020; Soland et al., 2018; Wise, 2017, 2019, 2020a, 2020b; Wise & Gao, 2017; Wise & Kong, 2005; Wise et al., 2019), and to explore the relationship between response times, accuracy, and ability (e.g., Ivanova et al., 2020; Michaelides et al., 2020; Ranger et al., 2021; Reis Costa et al., 2021).…”
Section: Item Response Times and Disability (mentioning)
confidence: 99%