Background
The integration of Health System Science (HSS) into medical education emphasizes mastery of competencies beyond mere knowledge acquisition. With the shift to online platforms during the COVID-19 pandemic, there has been an increased emphasis on Technology Enhanced Assessment (TEA) methods, such as video assessments, to evaluate these competencies. This study investigates the efficacy of online video assessments in evaluating medical students’ competency in HSS.
Methods
A comprehensive assessment was conducted on first-year medical students (n = 10) enrolled in a newly developed curriculum integrating HSS into the Bachelor of Medicine program in 2021. Students completed three exams focusing on HSS competency, and their video responses were evaluated by a panel of seven expert assessors using a detailed rubric. Spearman rank correlation and the Intraclass Correlation Coefficient (ICC) were used to determine correlations and reliability among assessor scores, while a mixed-effects model was employed to assess the relationship between foundational HSS competencies (C) and presentation skills (P).
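As an illustration of how this analysis pipeline might be implemented, a minimal Python sketch is given below. The abstract does not specify the software used; the file name, column names (student, assessor, C, P), and choice of libraries are assumptions introduced here for illustration only.

```python
# Illustrative sketch, not the authors' actual analysis code.
# Assumes a long-format table with one row per student-assessor rating.
import pandas as pd
from scipy.stats import spearmanr          # Spearman rank correlation
import pingouin as pg                      # intraclass correlation (ICC)
import statsmodels.formula.api as smf      # mixed-effects model

scores = pd.read_csv("video_assessment_scores.csv")  # hypothetical file

# Spearman rank correlation between competency (C) and presentation (P) scores
rho, p_value = spearmanr(scores["C"], scores["P"])

# ICC across assessors: targets are students, raters are the seven assessors
icc = pg.intraclass_corr(data=scores, targets="student",
                         raters="assessor", ratings="C")

# Mixed-effects model: P predicting C, with a random intercept per student
model = smf.mixedlm("C ~ P", data=scores, groups=scores["student"]).fit()

print(rho, p_value)
print(icc)
print(model.summary())
```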
Results
Inter-rater correlations were positive, with ICC values indicating poor to moderate reliability. The mixed-effects model identified a positive correlation between C and P scores. Reliability and correlation varied across exams, which may be attributed to differences in content, grading criteria, and the nature of the individual exams.
Conclusion
Our findings indicate that effective presentation enhances the perceived competency of medical students, underscoring the need for standardized assessment criteria and consistent assessor training in online environments. This study highlights the critical roles of comprehensive competency assessment and refined presentation skills in online medical education in ensuring accurate and reliable evaluations.