for the ESO-KSU session participants*

Abstract

About the meeting: The purpose of the European Stroke Organisation (ESO)-Karolinska Stroke Update Conference is to provide updates on recent stroke therapy research and to give participants an opportunity to discuss how these results may be implemented into clinical routine. Several scientific sessions were held at the meeting, and each session produced consensus statements. The meeting started 20 years ago as the Karolinska Stroke Update; since 2014 it has been a joint conference with ESO. Importantly, it provides a platform for discussion of the ESO guidelines process and for recommendations to the ESO guidelines committee on specific topics. In this way, it gives stroke professionals who are otherwise not involved in committees and working groups a direct influence on the guidelines procedure. The discussions at the conference may also inspire new guidelines when motivated. The topics raised at the meeting are selected by the scientific programme committee, mainly on the basis of recent important scientific publications. The ESO-Karolinska Stroke Update consensus statement and recommendations will be published every two years and will serve to support the implementation of ESO guidelines.

Background: This year's ESO-Karolinska Stroke Update Meeting was held in Stockholm on 13-15 November 2016. Ten scientific sessions were discussed at the meeting, and each session produced a consensus statement (the full version, with background, issues, conclusions and references, is published as web material and at http://www.eso-karolinska.org/2016 and http://eso-stroke.org) and recommendations, which were prepared by a writing committee consisting of the session chair(s), secretary and speakers and presented to the 312 participants of the meeting. In the open meeting, general participants commented on the consensus statements and recommendations, and the final document was adjusted based on that discussion. Recommendations (grade of evidence) were graded according to the 1998 Karolinska Stroke Update meeting criteria with regard to the strength of evidence:

Grade A Evidence: Strong support from randomised controlled trials and statistical reviews (at least one randomised controlled trial plus one statistical review).

Grade B Evidence: Support from randomised controlled trials and statistical reviews (one randomised controlled trial or one statistical review).

Grade C Evidence: No reasonable support from randomised controlled trials; recommendations based on evidence from small randomised and/or non-randomised controlled trials.
Purpose: Competency-based medical education (CBME) has prompted widespread implementation of workplace-based assessment (WBA) tools using entrustment anchors. This study aimed to identify the factors that influence faculty rating choices immediately following assessment and to explore faculty experiences using WBAs with entrustment anchors, specifically the Ottawa Surgical Competency Operating Room Evaluation scale.

Method: Fifty semi-structured interviews were conducted with a convenience sample of Emergency Medicine (EM) physicians from a single Canadian hospital between July and August 2019. All interviews occurred within two hours of faculty completing a WBA of a trainee. Faculty were asked what they considered when rating the trainee's performance and whether they had considered an alternate rating. Two team members independently analysed interview transcripts using conventional content analysis with line-by-line coding to identify themes.

Results: Interviews captured interactions between 70% (26/37) of full-time EM faculty and 86% (19/22) of EM trainees. Faculty most commonly identified the amount of guidance the trainee required as influencing their rating. Other variables, such as clinical context, trainee experience, past experiences with the trainee, perceived competence and confidence, were also identified. While most faculty did not struggle to assign ratings, some had difficulty interpreting the language of entrustment anchors, were unsure whether their assessment should be retrospective or prospective in nature, and questioned if or how the assessment should change depending on whether they were 'in the room'.

Conclusions: By going to the frontline during WBA encounters, this study captured authentic and honest reflections from physicians immediately engaged in assessment using entrustment anchors. While many of the factors identified are consistent with previous retrospective work, we highlight how some faculty consider factors outside the prescribed approach and struggle with the language of entrustment anchors. These results further our understanding of 'in-the-moment' assessments using entrustment anchors and may facilitate effective faculty development regarding WBA in CBME.
Introduction: Competency-based medical education (CBME) has triggered widespread use of workplace-based assessment (WBA) tools in postgraduate training programs. These WBAs predominantly use rating scales with entrustment anchors, such as the Ottawa Surgical Competency Operating Room Evaluation (O-SCORE). However, little is known about the factors that influence a supervising physician's decision to assign a particular rating on scales using entrustment anchors. This study aimed to identify the factors that influence supervisors' ratings of trainees using WBA tools with entrustment anchors at the time of assessment, and to explore the experiences with and challenges of using entrustment anchors in the emergency department (ED).

Methods: A convenience sample of full-time emergency medicine (EM) faculty was recruited from two sites within a single academic Canadian EM hospital system. Fifty semi-structured interviews were conducted with EM physicians within two hours of their completing a WBA for an EM trainee. Interviews were audio-recorded, transcribed verbatim, and independently analyzed by two members of the research team. Themes were stratified by trainee level, rating and task.

Results: Interviews involved 73% (27/37) of all EM staff and captured assessments completed on 83% (37/50) of EM trainees. The mean WBA rating in the studied sample was 4.34 ± 0.77 (range 2 to 5), which was similar to the mean rating of all WBAs completed during the study period. Overall, six major factors were identified that influenced staff WBA ratings: the amount of guidance required, perceived competence through discussion and questioning, trainee experience, clinical context, past experience working with the trainee, and perceived confidence. The majority of staff denied struggling to assign ratings; when they did struggle, it involved interpreting the WBA anchors and applying them to the clinical context in the ED.

Conclusion: Clinical supervisors appear to take several factors into account when deciding which rating to assign a trainee on a WBA that uses entrustment anchors, and not all of these factors are specific to the clinical encounter being assessed. The results of this study further our understanding of the use of entrustment anchors within the ED and may facilitate faculty development regarding WBA completion as we move forward with CBME.