Rationale, aims, and objectives
Programmatic assessment has been identified as a system-oriented approach to achieving the multiple purposes of assessment within competency-based medical education (CBME): formative, summative, and program improvement. While there are well-established principles for designing and evaluating programs of assessment, few studies illustrate and critically interpret what a system of programmatic assessment looks like in practice. This study uses systems thinking and the 'two communities' metaphor to interpret a model of programmatic assessment and to identify challenges and opportunities in its operationalization.

Method
An interpretive case study was used to investigate how programmatic assessment is being operationalized within one competency-based residency program at a Canadian university. Qualitative data were collected from residents, faculty, and program leadership via semi-structured group and individual interviews conducted nine months after CBME implementation. Data were analyzed using a combination of data-based inductive analysis and theory-derived deductive analysis.

Results
In this model, Academic Advisors had a central role in brokering assessment data between the communities responsible for producing and for using residents' performance information in decision making (i.e., formative, summative/evaluative, and program improvement). As system intermediaries, Academic Advisors were in a privileged position to see how the parts of the assessment system contributed to the functioning of the whole and could identify which system components were not functioning as intended. Challenges were identified with the documentation of residents' performance information (i.e., system inputs); with the use of low-stakes formative assessments to inform high-stakes evaluative judgments about the achievement of competence standards; and with gaps in the feedback mechanisms for closing learning loops.

Conclusions
The findings of this research suggest that program stakeholders can benefit from a systems perspective on how their assessment practices contribute to the efficacy of the system as a whole. Academic Advisors are well positioned to support educational development efforts focused on overcoming challenges with operationalizing programmatic assessment.
Purpose
Despite the broad endorsement of competency-based medical education (CBME), myriad difficulties have arisen in program implementation. The authors sought to evaluate the fidelity of implementation and to identify early outcomes of CBME implementation, using Rapid Evaluation to facilitate transformative change.

Method
Case-study methodology was used to explore the lived experience of implementing CBME in the emergency medicine postgraduate program at Queen's University, Canada, using iterative cycles of Rapid Evaluation in 2017–2018. After the intended implementation was explicitly described, stakeholder focus groups and interviews were conducted at 3 and 9 months post-implementation to evaluate the fidelity of implementation and early outcomes. Analyses were abductive, using the CBME core components framework and data-driven approaches to understand stakeholders' experiences.

Results
In comparing planned with enacted implementation, important themes emerged, with resultant opportunities for adaptation. For example, the lack of a shared mental model resulted in frontline difficulty with assessment and feedback, and in a concern that the granularity of competency-focused assessment might result in "missing the forest for the trees," prompting the return of global assessment. Resident engagement in personal learning plans was not uniformly adopted, and learning experiences tailored to residents' needs were slow to follow.

Conclusions
Rapid Evaluation provided critical insights into the successes and challenges of operationalizing CBME. Implementing the practical components of CBME was perceived as a sprint, while realizing the principles of CBME and changing the culture of postgraduate training was a marathon requiring sustained effort in the form of frequent evaluation and continuous faculty and resident development.