2020
DOI: 10.1103/PhysRevPhysEducRes.16.010107

Quantitatively ranking incorrect responses to multiple-choice questions using item response theory

Abstract: Research-based assessment instruments (RBAIs) are ubiquitous throughout both physics instruction and physics education research. The vast majority of analyses involving student responses to RBAI questions have focused on whether or not a student selects correct answers and using correctness to measure growth. This approach often undervalues the rich information that may be obtained by examining students' particular choices of incorrect answers. In the present study, we aim to reveal some of this valuable infor…


Cited by 17 publications (12 citation statements)
References 30 publications
“…This item appears to be considerably less challenging than Flag of Bhutan, as 58% of all students answer correctly on PIQL 20W; however, it is not an MCMR question, so it cannot be compared directly [27]. While the majority of students in this sample answered correctly, incorrect answer choices provide insight into what productive resources students who do not answer correctly are using.…”
Section: B. Ferris Wheel
confidence: 90%
“…One of the challenges of the Flag of Bhutan item as written on the PIQL is that it is a multiple-choice/multiple-response (MCMR) question. As a result, its score is low compared to other items on the PIQL, because these items are scored dichotomously for comparison with other multiple-choice/single-response items [27]. However, the nature of the item does not fully account for the significantly low number of completely correct responses (26% of all students), and this rate does not change significantly throughout the introductory sequence (25% in 121, 25% in 122, and 31% in 123), suggesting that students do not improve with instruction.…”
Section: A. Flag of Bhutan
confidence: 99%
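The dichotomous scoring described above (full credit only when a student's selected set of options exactly matches the answer key, no partial credit) can be sketched in a few lines. This is a minimal illustration of that scoring rule; the function name and the option labels are hypothetical, not taken from the PIQL:

```python
def score_mcmr_dichotomous(selected, key):
    """Dichotomous MCMR scoring: 1 only if the selected options
    exactly match the answer key (no partial credit), else 0."""
    return int(set(selected) == set(key))

# Illustrative responses to a hypothetical item with key {A, C}:
score_mcmr_dichotomous(["A", "C"], ["A", "C"])  # exact match -> 1
score_mcmr_dichotomous(["A"], ["A", "C"])       # missing an option -> 0
score_mcmr_dichotomous(["A", "B", "C"], ["A", "C"])  # extra option -> 0
```

Under this rule a student who selects one correct option out of two scores the same as a student who selects none, which is one reason completely-correct rates on MCMR items run low relative to single-response items.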
“…Morris et al introduced IRCs as a simplified form of item response theory (IRT) that uses the total test score as an independent variable, rather than the IRT latent trait of ability level, which, by definition, is unmeasurable [11,12]. IRC analyses are similar to the IRT nominal response model but require far less computational power [16][17][18][19].…”
Section: Item Response Curves
confidence: 99%
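The IRC approach described above, which plots the fraction of students choosing each response option against total test score rather than an IRT latent ability, can be sketched as follows. The synthetic data (500 students, 20 four-option items, option 0 treated as correct) and all variable names are illustrative assumptions, not values from the papers cited:

```python
import numpy as np

rng = np.random.default_rng(0)
n_students, n_items, n_options = 500, 20, 4

# Hypothetical response matrix: chosen option index per student per item;
# option 0 plays the role of the correct answer.
responses = rng.integers(0, n_options, size=(n_students, n_items))

# IRC independent variable: the observed total test score,
# not an estimated IRT ability parameter.
scores = (responses == 0).sum(axis=1)

def item_response_curves(item):
    """For one item, return the proportion of students choosing each
    option at every possible total score (NaN for empty score bins)."""
    curves = {}
    for opt in range(n_options):
        curves[opt] = [
            float(np.mean(responses[scores == s, item] == opt))
            if np.any(scores == s) else float("nan")
            for s in range(n_items + 1)
        ]
    return curves

curves = item_response_curves(0)
```

Plotting each option's curve against score would reproduce the standard IRC picture: the correct option trends upward with total score, while informative distractors peak at intermediate scores. Because this uses only bin counts, it avoids the iterative model fitting that the nominal response model requires.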
“…An example of a more rigorous treatment of finding the ordering of response options can be found in Ref. [46] for the Force and Motion Conceptual Evaluation. Something similar to this study could be performed for the FCI and other multiple-choice conceptual assessments.…”
Section: Limitations and Future Work
confidence: 99%