Research-based assessment instruments (RBAIs) are ubiquitous throughout both physics instruction and physics education research. The vast majority of analyses involving student responses to RBAI questions have focused on whether or not a student selects correct answers, using correctness to measure growth. This approach often undervalues the rich information that may be obtained by examining students' particular choices of incorrect answers. In the present study, we aim to reveal some of this valuable information by quantitatively determining the relative correctness of various incorrect responses. To accomplish this, we propose an assumption that allows us to define relative correctness: students who have a high understanding of Newtonian physics are likely to answer more questions correctly, and are also more likely to choose better incorrect responses, than students who have a low understanding. Analyses using item response theory align with this assumption, and Bock's nominal response model allows us to uniquely rank each incorrect response. We present results from over 7,000 students' responses to the Force and Motion Conceptual Evaluation.
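Bock's nominal response model treats each response option of an item as a category whose selection probability is a softmax over linear functions of student ability θ; options with larger slopes are favored by higher-ability students, which is what permits ranking incorrect responses. A minimal sketch of the category probabilities (the slope and intercept values below are hypothetical illustration, not fitted FMCE parameters):

```python
import math

def nrm_probabilities(theta, slopes, intercepts):
    """Bock's nominal response model: the probability of each response
    category is a softmax over linear functions of ability theta."""
    logits = [a * theta + c for a, c in zip(slopes, intercepts)]
    m = max(logits)  # subtract the max logit for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical parameters for one four-option item: the option with the
# largest slope dominates as theta grows (the correct answer), and the
# incorrect options can be ranked by their slopes.
slopes = [1.2, 0.4, -0.1, -1.5]
intercepts = [0.0, 0.5, 0.3, 0.2]

for theta in (-2.0, 0.0, 2.0):
    probs = nrm_probabilities(theta, slopes, intercepts)
    print(theta, [round(p, 3) for p in probs])
```

Plotting these probabilities against θ gives the item response curves described below: the ordering of the incorrect-option slopes is the ranking of incorrect responses.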
Using data from over 14,000 student responses, we rank incorrect responses on the Force and Motion Conceptual Evaluation (FMCE). We develop a hierarchy of responses using item response theory and the McNemar-Bowker chi-square test for asymmetry. We use item response theory (IRT) under the assumption that students who score well have a greater understanding of physics than those who do not; therefore, responses that have a greater likelihood of being selected by those who score well are considered better responses. We use the McNemar-Bowker chi-square test (MB) under the assumption that student understanding is more likely to increase than decrease after an introductory mechanics course; therefore, dominant transitions from one answer to another from pretest to posttest indicate that the destination answer is better than the origin. We present the results from the IRT and MB analyses, highlighting both agreement and disagreement between the hierarchies of responses generated by each.
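The McNemar-Bowker test compares a square pretest-by-posttest transition table against the hypothesis of symmetry: for each pair of answers (i, j), it asks whether the i→j and j→i flows are equally likely. A minimal sketch of the Bowker statistic on a small hypothetical three-option table (the counts are invented for illustration; a real analysis would also convert the statistic to a p-value via the chi-square distribution):

```python
from itertools import combinations

def bowker_statistic(table):
    """Bowker's chi-square test of symmetry for a square k x k
    pretest-by-posttest transition table. Under H0 (symmetry),
    n_ij -> n_ji transitions are equally likely for every pair (i, j)."""
    k = len(table)
    stat, df = 0.0, 0
    for i, j in combinations(range(k), 2):
        nij, nji = table[i][j], table[j][i]
        if nij + nji > 0:  # pairs with no off-diagonal counts add no df
            stat += (nij - nji) ** 2 / (nij + nji)
            df += 1
    return stat, df

# Hypothetical pre/post counts for three answer options A, B, C
# (rows: pretest choice, columns: posttest choice).
table = [
    [30, 10, 2],
    [4, 25, 8],
    [2, 3, 40],
]
stat, df = bowker_statistic(table)
print(round(stat, 3), df)
```

In this toy table the asymmetric B→C and A→B flows drive the statistic; under the assumption that understanding tends to increase, such dominant flows are what rank one answer above another.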
Using data from over 14,000 student responses, we create item response curves, fitted to the polytomous item response theory model for nominal responses, to evaluate the relative "correctness" of various incorrect responses to questions on the Force and Motion Conceptual Evaluation (FMCE). Based on this ranking of incorrect responses, we examine individual students' pairs of responses to FMCE questions, using transition matrices and consistency plots, to show how student ideas develop over the span of an introductory mechanics course. Using data from two different schools (N ≈ 200 each), we explore how these representations can show student learning even when individuals do not choose the correct answer. Comparing response pairs provides a rich picture of student learning that is unavailable in many traditional analyses.
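A transition matrix for one item simply counts each student's (pretest choice, posttest choice) pair; with options ordered from best to worst by the IRT ranking, movement toward better answers is visible even when students never reach the correct one. A minimal sketch with invented responses (the students and option labels below are hypothetical):

```python
from collections import Counter

def transition_matrix(pre, post, options):
    """Count pretest -> posttest response pairs for a single item.
    Rows index the pretest choice, columns the posttest choice."""
    counts = Counter(zip(pre, post))
    return [[counts[(r, c)] for c in options] for r in options]

# Hypothetical responses from five students to one item, with options
# ordered from best ("A", the correct answer) to worst ("C").
options = ["A", "B", "C"]
pre = ["C", "B", "B", "A", "C"]
post = ["B", "A", "A", "A", "C"]
print(transition_matrix(pre, post, options))
```

Entries above the diagonal of such a matrix (worse posttest than pretest choice) versus below it (improvement) give a quick visual summary of learning; here the C→B student improved without ever selecting the correct answer.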