Background: Multiple choice questions, used in medical school assessments for decades, have many drawbacks: they are hard to construct, allow guessing, encourage test-wiseness, promote rote learning, give examinees no opportunity to express ideas, and provide no information about candidates' strengths and weaknesses. Directly asked, directly answered questions such as Very Short Answer Questions (VSAQ) are considered a better alternative with several advantages. Objectives: This study aims to compare student performance in MCQ and VSAQ tests and to obtain feedback from the stakeholders. Methods: Multiple true-false, one best answer and VSAQ tests were conducted in two batches of medical students; their scores and the psychometric indices of the tests were compared, and the opinions of students and academics on these assessment methods were sought. Results: Multiple true-false and one best answer test scores were skewed and showed low psychometric performance, whereas the VSAQ tests showed better psychometrics and more balanced student performance. The stakeholders' opinions were significantly in favour of VSAQ. Conclusion and recommendation: This study concludes that VSAQ is a viable alternative to multiple-choice question tests and that it is widely accepted by medical students and academics in the medical faculty.
Background Distractor efficiency and the optimum number of functional distractors per item in One Best Answer Questions have been debated. The prevalence of non-functional distractors has led to a reduction in the number of distractors per item, with the advantage of allowing more items in a test. The existing literature offers no definite answer to the question of what distractor efficiency best matches excellent psychometric indices. We examined the relationship between distractor efficiency and the psychometric indices of One Best Answer Questions in search of an answer. Methods We analysed 350 items used in 7 professional examinations and determined their distractor efficiency and the number of functional distractors per item. The items were sorted into five groups (excellent, good, fair, remediable and discarded) based on their discrimination index. We studied how distractor efficiency and the number of functional distractors per item correlated with these five groups. Results The correlation of distractor efficiency with psychometric indices was significant but far from perfect. The excellent group topped the distractor efficiency ranking in 3 tests, the good group in one test, the remediable group equalled the excellent group in one test, and the discarded group topped the ranking in 2 tests. Conclusions Distractor efficiency did not correlate in a consistent pattern with the discrimination index. A distractor efficiency of fifty per cent or higher, not one hundred per cent, was found to be the optimum.
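For readers unfamiliar with the metric, the sketch below illustrates one common way distractor efficiency is computed from option-selection counts. It is a minimal illustration, not the study's code: the 5% selection threshold for a functional distractor and the function name distractor_efficiency are assumptions, since the abstract does not state the exact criteria used.

```python
# Minimal sketch: distractor efficiency (DE) and functional distractors (FD) per item.
# Assumes the common convention that a distractor is "functional" when chosen by at
# least 5% of examinees; the study's exact threshold is not given in the abstract.

def distractor_efficiency(option_counts, key, threshold=0.05):
    """option_counts: dict of option label -> number of examinees choosing it.
    key: label of the correct (best) answer.
    Returns (number of functional distractors, DE as a percentage)."""
    total = sum(option_counts.values())
    distractors = {opt: n for opt, n in option_counts.items() if opt != key}
    functional = [opt for opt, n in distractors.items() if n / total >= threshold]
    de = 100 * len(functional) / len(distractors) if distractors else 0.0
    return len(functional), de

# Example: a five-option item answered by 100 students, keyed answer "C"
counts = {"A": 12, "B": 3, "C": 60, "D": 20, "E": 5}
print(distractor_efficiency(counts, key="C"))  # (3, 75.0): 3 of 4 distractors functional
```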
Background: Multiple true-false tests (MTF), a component of our assessment system, have consistently generated low scores and many failures. This was attributed to the negative marking scheme, but no study had been conducted to explore the issue further. Item analysis revealed that students omitted false options more frequently, and answered them wrongly more frequently, than true options. The aim of this study was to determine the performance discrepancy between the true and false options of MTF tests, the reasons for that discrepancy, and the reasons for the poor performance of MTF in general. Methods: Student performance in the past 7 years of year-3 medicine end-of-posting examinations was analysed. The item analysis reports of 23 MTF tests were used to determine the significance of the differences in omission rates, correct-answer rates and discrimination index values between true and false options. Results: There were statistically significant differences in the omission rates, correct-answer rates and discrimination index values of true and false options. The false options consistently let down student performance. Although negative marking could be partly blamed for the situation, no justification could be found for the use of false options to test knowledge. Conclusions: Some publications endorse MTF, but many highlight its drawbacks. The use of false options in MTF was seen as an inherent defect of this instrument. As viable alternatives such as VSAQ and Constructed Response Tests are on the horizon, we conclude that MTF ought to be discarded as an assessment instrument.
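As a point of reference, the sketch below shows how the per-option statistics named above (omission rate and correct-answer rate) can be derived from raw responses. It is a hedged illustration only: the study worked from existing item-analysis reports, and the response coding ("T", "F", or None for an omission) and the function option_rates are assumptions introduced here.

```python
# Minimal sketch: omission rate and correct-answer rate for a single MTF option.
# Each response is "T", "F", or None (omitted); the keyed answer marks whether the
# option is a true or a false statement. This coding is an assumption for illustration.

def option_rates(responses, keyed_answer):
    """Returns (omission_rate_percent, correct_answer_rate_percent) for one option."""
    n = len(responses)
    omitted = sum(1 for r in responses if r is None)
    correct = sum(1 for r in responses if r == keyed_answer)
    return 100 * omitted / n, 100 * correct / n

# Example: 10 students answering one false option
false_option = ["F", None, "T", "F", None, "T", "F", None, "T", "F"]
print(option_rates(false_option, keyed_answer="F"))  # (30.0, 40.0)
```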
This article elaborates on assessment analysis, a topic recently introduced in another article published by the same author in AMEE MedEdPublish in January 2017. The formulae used to generate the difficulty index and discrimination index of examiner-scored assessments, as well as of whole papers such as MCQ papers, are described in simple terms. The formulae used for assessment analysis are aligned with those established decades ago for item analysis. A step-by-step guideline is included to enable those interested to practise assessment analysis. The formulae, an Excel worksheet showing the index calculations and the interpretation of the index values are also included.
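Since the article's own formulae are not reproduced in this summary, the sketch below shows the classical item-analysis formulae that assessment analysis is described as being aligned with. It is a minimal sketch under stated assumptions: the upper and lower groups are taken as the top and bottom 27% of examinees by total score (a widespread convention), and the function names are introduced here for illustration, not taken from the article or its Excel worksheet.

```python
# Minimal sketch: classical difficulty and discrimination indices from item analysis.
# Difficulty (facility) = proportion of examinees answering correctly.
# Discrimination = (correct in upper group - correct in lower group) / group size,
# with the upper and lower groups commonly the top and bottom 27% by total score.

def difficulty_index(correct, total):
    """Proportion of examinees answering the item correctly (0 to 1)."""
    return correct / total

def discrimination_index(upper_correct, lower_correct, group_size):
    """Ranges from -1 to +1; higher values indicate better discrimination."""
    return (upper_correct - lower_correct) / group_size

# Example: 80 of 120 examinees correct overall; 30 of the top 32 and 14 of the
# bottom 32 (roughly 27% of 120) answered correctly.
print(round(difficulty_index(80, 120), 2))   # 0.67
print(discrimination_index(30, 14, 32))      # 0.5
```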