2021
DOI: 10.5281/zenodo.5275312

Modeling NAEP Test-Taking Behavior Using Educational Process Analysis

Cited by 2 publications (6 citation statements)
References 25 publications
“…In line with Hypothesis 1, the extended DLC-TL IRT model that included the new indicators showed a better fit as compared with a model with response times alone, supporting the finding of previous studies that process data beyond response time might indeed provide valuable information on response engagement (Ivanova et al., 2020; Lundgren & Eklöf, 2020; Patel et al., 2021; Sahin & Colvin, 2020). However, a closer inspection of the results revealed that only item response times and text reread predicted response engagement, while answer change and item revisit did not.…”
Section: Discussion (supporting)
confidence: 84%
“…Ivanova and colleagues (2020) found that the number of actions in constructed response items was related to test performance, self-reported test-taking effort, and item position. Finally, Patel and colleagues (2021) used a machine learning approach to predict test-taking efficiency in a test of mixed item types. Besides measures related to test or item response time, the numbers of certain actions (e.g., answer changes and navigation button use) were identified as important features of test efficiency.…”
Section: Enhancing the Identification Of Disengaged Responses With Pr... (mentioning)
confidence: 99%
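The machine-learning approach attributed to Patel et al. (2021) is only summarized at a high level in this citation statement, so the following is a minimal sketch of the general idea rather than their actual pipeline: hypothetical item-level process features (response time, answer changes, navigation-button clicks) are fed to an off-the-shelf classifier, and feature importances give a rough indication of which process indicators carry signal. All column names, values, and the choice of classifier are assumptions for illustration.

```python
# Minimal sketch (not the pipeline from Patel et al., 2021): hypothetical
# item-level process-data features used to predict response engagement.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical features aggregated per student-item pair from raw event logs:
# total response time (seconds), answer changes, navigation-button clicks.
features = pd.DataFrame({
    "response_time":  [52.0, 3.1, 47.5, 2.4, 60.2, 4.0],
    "answer_changes": [2, 0, 1, 0, 3, 0],
    "nav_clicks":     [4, 1, 3, 1, 5, 1],
})
# Hypothetical engagement labels (1 = engaged, 0 = disengaged).
engaged = [1, 0, 1, 0, 1, 0]

clf = GradientBoostingClassifier(random_state=0).fit(features, engaged)

# Feature importances indicate which process indicators drive the prediction,
# analogous to the "important features" of test efficiency described above.
for name, importance in zip(features.columns, clf.feature_importances_):
    print(f"{name}: {importance:.2f}")
```

In practice such a model would be trained on far more responses and validated out of sample; the toy data above only illustrate the feature-to-classifier workflow.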
“…One potential source of information on test-taking behavior that has been increasingly available with the advent of computer-based assessments is process data, which provides a rich array of easily accessible information on the test-taking process. Few studies have yet investigated the potential of process data beyond item response times for the identification of disengaged responding (Lundgren & Eklöf, 2020; Ivanova et al., 2020; Patel et al., 2021; Sahin & Colvin, 2020). So far, available studies either concentrated on response engagement in interactive items or were rather exploratory.…”
Section: Identifying Disengaged Responding In Multiple-choice Items: ... (mentioning)
confidence: 99%
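For readers unfamiliar with how item response times are typically combined with additional process indicators when screening for disengaged responding, here is a minimal illustrative sketch; it is not the procedure of any of the cited studies, which use model-based or empirically derived approaches. The threshold value, column names, and combination rule below are assumptions.

```python
# Minimal sketch, assuming a fixed rapid-guessing threshold; cited studies
# derive thresholds empirically or model engagement directly.
import pandas as pd

THRESHOLD_SEC = 5.0  # hypothetical rapid-guessing threshold per item

responses = pd.DataFrame({
    "student":       [1, 1, 2, 2],
    "item":          ["A", "B", "A", "B"],
    "response_time": [42.0, 3.2, 4.1, 55.0],
    "text_rereads":  [1, 0, 0, 2],  # hypothetical count of stimulus rereads
})

# Flag a response as potentially disengaged when it is both very fast and
# shows no engagement with the stimulus text.
responses["flag_disengaged"] = (
    (responses["response_time"] < THRESHOLD_SEC)
    & (responses["text_rereads"] == 0)
)
print(responses)
```

Combining a response-time cut with a second process indicator, as above, is one simple way to reduce false flags relative to a time threshold alone; the cited work explores richer, model-based uses of such indicators.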