Purpose: Non-Medical Expert or Intrinsic CanMEDS Roles, as outlined by the Royal College of Physicians and Surgeons of Canada, can be difficult to evaluate in an objective and specialty-specific manner. This study investigates an Objective Structured Clinical Examination (OSCE) as a tool to assess these competencies in Diagnostic Radiology Residents.
Methods: A five-station CanMEDS OSCE was developed for postgraduate year 3 and 4 Residents to evaluate the Communicator, Collaborator, Manager, Health Advocate, Scholar, and Professional Roles. Performance was assessed by postgraduate year 5 Residents using standardized scoring rubrics. CanMEDS OSCE scores were correlated with American College of Radiology (ACR) examination scores and Medical Expert OSCE scores.
Results: Seventy Residents in three separate cohorts (n = 21, 26, 23) participated in the CanMEDS OSCE. Mean station scores were consistent across cohorts and, in general, one-way ANOVAs showed no effect of postgraduate year on station scores. CanMEDS OSCE scores did not correlate significantly with either ACR examination scores or Medical Expert OSCE scores, demonstrating divergent construct validity and indicating that the CanMEDS OSCE measures unique competencies. In contrast, ACR examination scores correlated with Medical Expert OSCE scores, confirming that these two assessments measure the same construct.
Conclusions: An OSCE can be a useful tool for assessing Intrinsic CanMEDS Roles in a specialty-specific manner. The correlational analyses indicate that it evaluates competencies not captured by other traditional assessment methods.