2022
DOI: 10.1111/jedm.12310
Exploring the Impact of Random Guessing in Distractor Analysis

Abstract: Multiple-choice (MC) items are widely used in educational tests. Distractor analysis, an important procedure for checking the utility of response options within an MC item, can be readily implemented in the framework of item response theory (IRT). Although random guessing is a common behavior of test-takers when answering MC items, none of the existing IRT models for distractor analysis have considered the influence of random guessing in this process. In this article, we propose a new IRT model to distinguish…
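The abstract is truncated, so the authors' actual model specification is not shown here. As a minimal sketch of the general idea — separating random guessing from solution behavior in option-level (distractor) probabilities — the following assumes a standard mixture of uniform guessing over the K options and Bock's nominal response model; the parameter names (`a`, `c`, `g`) are illustrative, not taken from the paper.

```python
import numpy as np

def nominal_response_probs(theta, a, c):
    """Bock's nominal response model for one item: option probabilities
    given ability theta, with per-option slopes a and intercepts c."""
    z = a * theta + c
    ez = np.exp(z - z.max())          # numerically stabilized softmax
    return ez / ez.sum()

def guessing_mixture_probs(theta, a, c, g):
    """Mixture: with probability g the test-taker guesses uniformly over
    the K options; otherwise responses follow the nominal response model."""
    p_solution = nominal_response_probs(theta, a, c)
    K = len(p_solution)
    return g / K + (1.0 - g) * p_solution

# Hypothetical 4-option item (option 0 is the key)
a = np.array([1.2, -0.4, -0.3, -0.5])   # option slopes
c = np.array([0.5, 0.1, -0.2, -0.4])    # option intercepts
probs = guessing_mixture_probs(theta=1.0, a=a, c=c, g=0.15)
print(probs.round(3))                    # option probabilities; sum to 1
```

Under this mixture, every option's probability is bounded below by g/K, which is one way distractor curves change once guessing is modeled explicitly.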

Cited by 9 publications (5 citation statements). References 68 publications.
“…However, even though disengaged responding might be related to the ability of the test-taker or the item difficulty, disengaged responses do not reflect these properties in the same way as responses stemming from solution behavior. As a consequence, simply ignoring disengaged responding can lead to biased item and person parameter estimates (Jin et al., 2022; Rios et al., 2017). Furthermore, it can also influence conclusions and decisions on other aspects such as differential item functioning (DeMars & Wise, 2010) or the speed-ability relation (Deribo et al., 2022).…”
Section: Types of Test-Taking Behavior (mentioning)
confidence: 99%
“…In line with this, research by Maulana showed that data analysis indicated the need to simplify the range of answer options, and that item difficulty levels varied so widely that most respondents could not be measured well (Maulana, Rangkuti, & Wahyuni, 2020). A different study noted that distractor analysis is an important procedure for checking the usefulness of response options in multiple-choice items and can readily be implemented within the framework of item response theory (modern test theory), and that randomly guessing among answer options is a common behavior of test-takers when answering multiple-choice items, so alternative analyses are needed to keep multiple-choice options from being easily guessed and to keep distractors functioning optimally (Jin & Siu, 2022). Regarding distractors and discrimination, it has been shown that distractor efficiency does not correlate in a consistent pattern with the discrimination index; only 50% of distractors measured efficiently (Puthiaparampil & Rahman, 2021).…”
Section: Pendahuluan (Introduction) (unclassified)