Log-file data from computer-based assessments can provide useful collateral information for estimating student abilities and can thereby improve on traditional approaches that consider only response accuracy. Based on the time students spent on 10 mathematics items from PISA 2012, this study evaluated the overall changes in ability estimates and in their measurement precision, and explored country-level heterogeneity, when item responses and time-on-task measurements are combined in a joint framework. Our findings suggest a notable gain in precision from incorporating response times and indicate differences between countries both in how respondents approached the items and in their response processes. Results also showed that additional information can be captured through differences in the modeling structure when response times are included; however, such information may not reflect the testing objective.
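The abstract does not spell out the joint framework; a common formulation for combining item responses and response times is a hierarchical model in the spirit of van der Linden (2007), sketched here under that assumption (item and person parameters below are generic, not taken from the paper):

P(X_{ij} = 1 \mid \theta_i) = \frac{\exp\{a_j(\theta_i - b_j)\}}{1 + \exp\{a_j(\theta_i - b_j)\}}, \qquad \log T_{ij} \mid \tau_i \sim N(\beta_j - \tau_i, \; \alpha_j^{-2}),

where the person parameters (\theta_i, \tau_i), ability and speed, are assumed bivariate normal, so response times contribute collateral information to the ability estimate through the correlation between \theta and \tau.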
Given the ongoing development of computer-based tasks, there has been increasing interest in modelling students' behaviour indicators extracted from log-file data together with contextual variables collected via questionnaires. In this work, we apply a latent regression model to analyse the relationship between latent constructs (i.e., performance, speed, and exploration behaviour) and contextual variables among Scandinavian students (from Norway, Sweden, and Denmark) during their completion of three interactive mathematics tasks in the 2012 Programme for International Student Assessment (PISA). The purpose of this study is threefold: (1) to investigate whether a factor model is adequate for explaining the relationship between process indicators extracted from the log files and students' performance on the three items; (2) to evaluate whether measurement invariance holds across the three analysed countries; and (3) to determine how well contextual variables [i.e., a student's background, the availability and use of information and communication technology (ICT) resources, and learning conditions for mathematical literacy] correlate with the model's latent constructs. Our findings reveal that a three-factor confirmatory factor analysis (CFA) model fits the analysed data well and that a weak measurement invariance model performs best. By including contextual variables in the modelling, we also highlight differences in Scandinavian students' profiles. Specifically, higher economic, social, and cultural status (ESCS) was consistently associated with higher mathematics proficiency across all three countries. Norway showed no gender differences in mathematics proficiency, response time, or exploration behaviour. Experience with mathematics tasks at school was associated with more exploration behaviour across all three countries. Swedish and Norwegian students with more experience of pure mathematics tasks at school were likely to obtain higher mathematics achievement scores and tended to interact more frequently with the testing platform when answering the three items. Danish students with higher ICT scores (i.e., more ICT resources available at school and at home) showed lower response accuracy, whereas ICT had no effect on the three factors among Swedish and Norwegian students. Finally, we also discuss the implications and limitations of this study.
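The abstract describes the model only at a high level; a generic latent regression (MIMIC-type) formulation consistent with this description, with symbols and loading patterns assumed rather than taken from the paper, is:

y_{ip} = \nu_p + \boldsymbol{\lambda}_p' \boldsymbol{\eta}_i + \varepsilon_{ip}, \qquad \boldsymbol{\eta}_i = \boldsymbol{\Gamma} \mathbf{x}_i + \boldsymbol{\zeta}_i,

where y_{ip} are the performance and process indicators extracted from the log files, \boldsymbol{\eta}_i = (\text{performance}_i, \text{speed}_i, \text{exploration}_i)' are the three latent factors, and \mathbf{x}_i collects the contextual variables (e.g., ESCS, gender, ICT availability and use, experience with mathematics tasks). Under this formulation, weak (metric) measurement invariance corresponds to constraining the loadings \boldsymbol{\lambda}_p, but not the intercepts \nu_p, to be equal across Norway, Sweden, and Denmark.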