Objective. Multi-patient care is an essential skill for medical trainees in the emergency department (ED). Although resident efficiency is the metric typically measured, multi-patient care requires both efficiency and diagnostic/treatment accuracy. Multi-patient care ability is difficult to assess in the clinical setting; simulation offers a potential alternative. Our objective was to generate validity evidence for a serious game that assesses multi-patient care skills across a range of learners.

Methods. This was a cross-sectional validation study using VitalSigns™, a digital serious game simulating multi-patient care in a pediatric ED. Subjects completed 5 virtual “shifts,” triaging, stabilizing, and discharging or admitting patients within a fixed time period; patients arrived at cascading intervals and deteriorated along a pre-programmed course if neglected. Predictor variables included generic multi-tasking ability, video game experience, medical knowledge, and clinical efficiency with real patients. Outcome metrics spanned 3 domains: diagnostic accuracy (e.g., critical orders, correct diagnoses), efficiency (e.g., number of patients seen, time-to-order), and critical thinking (number of differential diagnoses). MANOVA tested for differences between novice learners and expected experts; Spearman rank correlation assessed associations between game metrics and level of training.

Results. Ninety-five subjects’ gameplays were analyzed. Diagnostic accuracy and efficiency metrics distinguished residency-trained subjects (residents, fellows, and attendings) from pre-residency subjects (medical students and undergraduates), particularly for critical orders, patients seen, and correct diagnoses (p < 0.003). The game’s diagnostic accuracy and efficiency metrics showed moderate to strong correlations with level of training: patients seen (rho = 0.47, p < 0.001); critical orders (rho = 0.80, p < 0.001); time-to-order (rho = −0.24, p = 0.025); and correct diagnoses (rho = 0.69, p < 0.001). Video game experience also correlated with patients seen (rho = 0.24, p = 0.003).

Conclusion. A digital serious game depicting a busy virtual ED can distinguish multi-patient care skill between pre- and post-residency trained subjects. Further study should examine whether the game can appropriately assess skill acquisition during residency.