The COVID-19 pandemic has accelerated the digitalization of assessment, creating new challenges for measurement professionals, including managing big data, maintaining test security, and analyzing new sources of validity evidence. In response to these challenges, Machine Learning (ML) has emerged as an increasingly important skill in the toolbox of measurement professionals. However, most ML tutorials are technical and concept-focused. This tutorial therefore provides a practical introduction to ML in the context of educational measurement. We supplement the tutorial with several examples of supervised and unsupervised ML techniques applied to marking a short-answer question; the Python code is available on GitHub. Finally, common misconceptions about ML are discussed.
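For readers who want a concrete starting point, the sketch below illustrates the kind of supervised and unsupervised short-answer scoring the tutorial describes. It is not the code from the authors' GitHub repository; the responses, marks, and modeling choices (TF-IDF features, logistic regression, k-means) are illustrative assumptions only.

```python
# Minimal sketch (not the tutorial's GitHub code): supervised and unsupervised
# approaches to marking short-answer responses with scikit-learn.
# The responses, labels, and model choices are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline

# Hypothetical short-answer responses and human marks (1 = correct, 0 = incorrect)
responses = [
    "Evaporation turns liquid water into vapor",
    "Water boils and becomes gas",
    "The sun is made of rock",
    "Clouds are solid objects",
]
marks = [1, 1, 0, 0]

# Supervised: learn to reproduce human marks from the response text
scorer = make_pipeline(TfidfVectorizer(), LogisticRegression())
scorer.fit(responses, marks)
print(scorer.predict(["Liquid water changes into vapor"]))  # expected: [1]

# Unsupervised: cluster unmarked responses so similar answers can be reviewed together
features = TfidfVectorizer().fit_transform(responses)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(clusters)
```

The design mirrors the two paradigms named in the abstract: the supervised pipeline imitates human marks it has seen, while the unsupervised step groups responses without any marks at all.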
Introduction
Newer electronic differential diagnosis supports (EDSs) are efficient and effective at improving diagnostic skill. Although these supports are encouraged in practice, they are prohibited in medical licensing examinations. The purpose of this study is to determine how using an EDS affects examinees' results when answering clinical diagnosis questions.
Method
The authors recruited 100 medical students from McMaster University (Hamilton, Ontario) to answer 40 clinical diagnosis questions in a simulated examination in 2021. Of these, 50 were first-year students and 50 were final-year students. Participants from each year of study were randomised into one of two groups: during the survey, half of the students had access to Isabel (an EDS) and half did not. Differences were explored using analysis of variance (ANOVA), and reliability estimates were compared for each group.
Results
Test scores were higher for final-year versus first-year students (53 ± 13% versus 29 ± 10%, p < 0.001) and higher with the use of the EDS (44 ± 28% versus 36 ± 26%, p < 0.001). Students using the EDS took longer to complete the test (p < 0.001). Internal consistency reliability (Cronbach's alpha) increased with EDS use among final-year students but was reduced among first-year students, although the effect was not significant. A similar pattern was noted in item discrimination, where the effect was significant.
Conclusion
EDS use during diagnostic licensing-style questions was associated with modest improvements in performance, increased discrimination among senior students, and increased testing time. Given that clinicians have access to EDSs in routine clinical practice, allowing EDS use for diagnostic questions would maintain the ecological validity of testing while preserving important psychometric test characteristics.
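The abstract does not state which software was used for the analysis. The sketch below shows one way the two main pieces, a score comparison by ANOVA and a Cronbach's alpha estimate, could be approximated in Python; the data are simulated and the cell means and sample sizes are rough assumptions based only on the figures reported above.

```python
# Illustrative sketch only (the study does not report its analysis software):
# an ANOVA across the four study cells and a Cronbach's alpha estimate,
# using simulated data in place of the real scores and item responses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated percent scores for the four cells (year of study x EDS access), 25 students each
first_no_eds = rng.normal(27, 10, 25)
first_eds    = rng.normal(31, 10, 25)
final_no_eds = rng.normal(49, 13, 25)
final_eds    = rng.normal(57, 13, 25)

# One-way ANOVA across the four cells (a full two-way factorial model would use statsmodels)
f_stat, p_value = stats.f_oneway(first_no_eds, first_eds, final_no_eds, final_eds)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated 0/1 responses to 40 items from 50 examinees, driven by a latent ability
ability = rng.normal(0, 1, 50)
noise = rng.normal(0, 1, (50, 40))
item_scores = (ability[:, None] + noise > 0).astype(float)
print(f"alpha = {cronbach_alpha(item_scores):.2f}")
```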