2020
DOI: 10.1080/13803611.2021.1963939

How does the number of actions on constructed-response items relate to test-taking effort and performance?

Cited by 11 publications (8 citation statements: 2 supporting, 6 mentioning, 0 contrasting). References 39 publications.

“…In line with Hypothesis 1, the extended DLC-TL IRT model that included the new indicators showed a better fit than a model with response times alone, supporting the finding of previous studies that process data beyond response time might indeed provide valuable information on response engagement (Ivanova et al., 2020; Lundgren & Eklöf, 2020; Patel et al., 2021; Sahin & Colvin, 2020). However, a closer inspection of the results revealed that only item response times and text reread predicted response engagement, while answer change and item revisit did not.…”
Section: Discussion (supporting)
confidence: 85%
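For illustration, the four indicators named in this statement (item response time, text reread, answer change, item revisit) could be derived from a per-item event log roughly as in the Python sketch below. The event names and the log format are hypothetical assumptions, not the instrumentation of the cited studies.

# A minimal sketch, assuming a hypothetical per-response event log, of how
# the engagement indicators discussed above might be computed.
from collections import Counter

def engagement_indicators(events):
    """events: chronologically ordered (timestamp_seconds, event_name) pairs
    for one examinee on one item; all event names are hypothetical."""
    counts = Counter(name for _, name in events)
    first, last = events[0][0], events[-1][0]
    return {
        "response_time": last - first,             # total time on the item
        "text_reread": counts["open_stimulus"],    # returns to the reading text
        "answer_change": counts["change_answer"],  # answer revisions
        "item_revisit": counts["enter_item"] - 1,  # re-entries after first visit
    }

log = [(0.0, "enter_item"), (4.2, "open_stimulus"), (19.8, "select_answer"),
       (25.1, "change_answer"), (31.6, "leave_item")]
print(engagement_indicators(log))
# {'response_time': 31.6, 'text_reread': 1, 'answer_change': 1, 'item_revisit': 0}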
“…One potential source of information on test-taking behavior that has become increasingly available with the advent of computer-based assessments is process data, which provides a rich array of easily accessible information on the test-taking process. Few studies have yet investigated the potential of process data beyond item response times for the identification of disengaged responding (Lundgren & Eklöf, 2020; Ivanova et al., 2020; Patel et al., 2021; Sahin & Colvin, 2020). So far, available studies either concentrated on response engagement in interactive items or were rather exploratory.…”
Section: Identifying Disengaged Responding In Multiple-choice Items: ... (mentioning)
confidence: 99%
“…The use of response time data is one of the most established applications of response process data in assessment, with a degree of formalization about its methods (Lee & Jia, 2014; Li et al., 2017; Wise, 2019) and related discussion of validation (Goldhammer et al., 2021; Li et al., 2017; Reis Costa et al., 2021; Wise, 2017, 2019). The digital transition has witnessed extensive application, such as measures to support inferences about rapid guessing behaviors and disengagement in large-scale assessments (Ercikan et al., 2020; Goldhammer et al., 2017; Kroehne et al., 2020; Lundgren & Eklöf, 2020; Soland et al., 2018; Wise, 2017, 2019, 2020a, 2020b; Wise & Gao, 2017; Wise & Kong, 2005; Wise et al., 2019), and to explore the relationship between response times, accuracy, and ability (e.g., Ivanova et al., 2020; Michaelides et al., 2020; Ranger et al., 2021; Reis Costa et al., 2021).…”
Section: Item Response Times and Disability (mentioning)
confidence: 99%
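The threshold-based use of response times described in this statement can be illustrated with a short sketch. The Python snippet below computes a response-time-effort (RTE) index in the spirit of Wise & Kong (2005); the specific threshold rule (10% of an item's mean response time, floored at one second) and the toy data are illustrative assumptions, not the procedure of any cited study.

from statistics import mean

# rt[i][j]: response time in seconds of examinee i on item j (toy data)
rt = [
    [54.0,  3.1, 71.2],
    [48.7, 39.5, 66.0],
    [ 2.2,  2.8,  1.9],
]

n_items = len(rt[0])
# Per-item threshold: 10% of the item's mean response time, at least 1 second.
thresholds = [max(1.0, 0.10 * mean(row[j] for row in rt)) for j in range(n_items)]

for i, row in enumerate(rt):
    # A response counts as solution behavior if it exceeds the threshold;
    # RTE is the examinee's proportion of solution-behavior responses.
    rte = sum(t > thresholds[j] for j, t in enumerate(row)) / n_items
    print(f"examinee {i}: RTE = {rte:.2f}")

An examinee with a very low RTE (here, examinee 2 at 0.33) would be flagged as likely disengaged rather than low in ability.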
“…Sahin and Colvin (2020) showed that the number and type of actions could in some cases slightly improve the response time thresholds in interactive tasks. Ivanova and colleagues (2020) found that the number of actions in constructed response items was related to test performance, self-reported test-taking effort, and item position. Finally, Patel and colleagues (2021) used a machine learning approach to predict test-taking efficiency in a test of mixed item types.…”
Section: Enhancing the Identification Of Disengaged Responses With Pr... (mentioning)
confidence: 99%
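As a rough illustration of the idea that action counts can refine a pure response-time rule, the sketch below flags a response as disengaged only when both the response time and the number of logged actions fall below cutoffs. The conjunctive rule and the cutoff values are assumptions for illustration, not the specific procedure of Sahin and Colvin (2020).

def is_disengaged(response_time, n_actions,
                  rt_threshold=5.0, action_threshold=2):
    """response_time in seconds; n_actions: interactions logged on the item.
    Both cutoffs are arbitrary illustrative values."""
    return response_time < rt_threshold and n_actions < action_threshold

print(is_disengaged(3.0, 7))  # False: fast, but rich interaction suggests engagement
print(is_disengaged(3.0, 0))  # True: fast and essentially action-free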