Objective: Residents must become proficient in a variety of procedures. The practice of learning procedural skills on patients has come under ethical scrutiny, giving rise to the concept of simulation-based medical education. Resident training in a simulated environment allows skill acquisition without compromising patient safety. We assessed the impact of a simulation-based procedural skills training course on residents' competence in the performance of critical resuscitation procedures. Methods: We solicited self-assessments of the knowledge and clinical skills required to perform resuscitation procedures from a cross-sectional multidisciplinary sample of 28 resident study participants. Participants were then exposed to an intensive 8-hour simulation-based training program, and asked to repeat the self-assessment questionnaires on completion of the course, and again 3 months later. We assessed the validity of the self-assessment questionnaire by evaluating participants' skills acquisition through an Objective Structured Clinical Examination station. Results: We found statistically significant improvements in participants' ratings of both knowledge and clinical skills during the 3 self-assessment periods (p < 0.001). The participants' year of postgraduate training influenced their self-assessment of knowledge (F(2,25) = 4.91, p < 0.01) and clinical skills (F(2,25) = 10.89, p < 0.001). At the 3-month follow-up, junior-level residents showed consistent improvement from their baseline scores, but had regressed from their posttraining measures. Senior-level residents continued to show further increases in their assessments of both clinical skills and knowledge beyond the simulation-based training course. Conclusion: Significant improvement in self-assessed theoretical knowledge and procedural skill competence for residents can be achieved through participation in a simulation-based resuscitation course. Gains in perceived competence appear to be stable over time, with senior learners gaining further confidence at the 3-month follow-up. Our findings support the benefits of simulation-based training for residents.
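The F statistics above are consistent with an analysis that treats assessment period as a within-subject factor and postgraduate year as a between-subject factor. As a minimal sketch of how such a mixed (split-plot) ANOVA could be run, assuming hypothetical data, group labels, and column names (this is not the authors' analysis code), a Python example using pandas and pingouin follows:

import numpy as np
import pandas as pd
import pingouin as pg

# Hypothetical self-assessment scores for 28 residents at three assessment
# periods (pre-course, post-course, 3-month follow-up), grouped by PGY level.
rng = np.random.default_rng(0)
rows = []
for resident in range(28):
    pgy = ["junior", "intermediate", "senior"][resident % 3]  # assumed grouping
    baseline = rng.normal(50, 8)
    for period, gain in [("pre", 0), ("post", 15), ("followup", 12)]:
        rows.append({"resident": resident, "pgy_level": pgy, "period": period,
                     "knowledge_score": baseline + gain + rng.normal(0, 4)})
df = pd.DataFrame(rows)

# Mixed ANOVA: assessment period within subjects, PGY level between subjects.
aov = pg.mixed_anova(data=df, dv="knowledge_score", within="period",
                     subject="resident", between="pgy_level")
print(aov.round(3))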
Background: There is a question of whether a single assessment tool can assess the key competencies of residents as mandated by the Royal College of Physicians and Surgeons of Canada CanMEDS roles framework. Objective: The objective of the present study was to investigate the reliability and validity of an emergency medicine (EM) in-training evaluation report (ITER). Methods: ITER data from 2009 to 2011 were combined for residents across the 5 years of the EM residency training program. An exploratory factor analysis with varimax rotation was used to explore the construct validity of the ITER. A total of 172 ITERs were completed on residents across their first to fifth year of training. Results: A combined, 24-item ITER yielded a five-factor solution measuring the CanMEDS Medical Expert/Scholar, Communicator/Collaborator, Professional, Health Advocate, and Manager subscales. The factor solution accounted for 79% of the variance, and reliability coefficients (Cronbach's alpha) ranged from α = 0.90 to 0.95 for each subscale, with α = 0.97 overall. The combined, 24-item ITER used to assess residents' competencies in the EM residency program showed strong reliability and evidence of construct validity for assessment of the CanMEDS roles. Conclusion: Further research is needed to develop and test ITER items that will differentiate each CanMEDS role exclusively.
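As an illustration of the psychometric steps named above (exploratory factor analysis with varimax rotation and Cronbach's alpha), the following Python sketch uses simulated ITER ratings; the item names, rating scale, and data are hypothetical, and the factor_analyzer package is one possible tool rather than the one used in the study:

import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical matrix of 172 completed ITERs x 24 items rated on a 1-5 scale.
rng = np.random.default_rng(1)
ratings = pd.DataFrame(rng.integers(1, 6, size=(172, 24)),
                       columns=[f"item_{i + 1}" for i in range(24)])

# Exploratory factor analysis with varimax rotation, extracting five factors
# to mirror the five CanMEDS-role subscales described in the results.
fa = FactorAnalyzer(n_factors=5, rotation="varimax")
fa.fit(ratings)
loadings = pd.DataFrame(fa.loadings_, index=ratings.columns)
print(loadings.round(2))
print("overall alpha:", round(cronbach_alpha(ratings), 2))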
Introduction: Emergency department handover is a high-risk period for patient safety. A recent study showed a decreased rate of preventable adverse events and errors after implementation of a resident hand-off bundle on pediatric inpatient wards. In a 2013 survey by the Canadian Association of Internes and Residents, only 11% of residents in any discipline stated that they had received a formal teaching session on handover. Recently, the CanMEDS 2015 Physician Competency Framework added safe and skillful transfer of patient care as a new proficiency within the Collaborator role. We hypothesized that significant variation exists in the current delivery and evaluation of handover education in Canadian EM residencies. Methods: We conducted a descriptive, cross-sectional survey of Canadian residents enrolled in the three main Emergency Medicine training streams (FRCPC, CCFP-EM, and PEM). The primary outcome was to determine which educational modalities are used to teach and assess handover proficiency. Secondarily, we described current handover practices and perceived competency at patient handover. Results: 130 residents completed the survey (73% FRCPC, 19% CCFP-EM, 8% PEM). 6% of residents were aware of handover proficiency objectives within their curriculum, while 15% acknowledged formal evaluation in this area. 98% of respondents were taught handover by observing staff or residents on shift, while 55% had direct teaching on the job. Fewer than 10% of respondents received formal sessions in didactic lecture, small group, or simulation formats. Evaluation of handover skills occurred primarily by on-shift observation (100% of respondents), while 3% of residents had been assessed through simulation. Local centre handover practices were variable; fewer than half of residents used mnemonic tools or written or electronic adjuncts. Conclusion: Canadian EM residents receive variable and sparse formal training and assessment on emergency department handover. The majority of training occurs by on-shift observation, and few trainees receive instruction on objective tools or explicit patient care standards. There is potential for further development of standardized objectives, utilization of other educational modalities, and formal assessments to better prepare residents to conduct safer patient handoffs.
We sought to assess the impact of procedural skills simulation training on residents' competence in performing critical resuscitation skills. Our study was a prospective, cross-sectional study of residents from three residency training programs (Family Medicine, Emergency Medicine, and Internal Medicine) at the University of Calgary. Participants completed a survey measuring self-assessed competence in the performance of the procedural skills required to manage hemodynamic instability. The study intervention was an 8-hour simulation-based training program focused on psychomotor skill acquisition for resuscitation procedures. Competence was criterion validated at the Right Internal Jugular Central Venous Catheter Insertion station by an expert observer using a standardized checklist (Objective Structured Clinical Examination (OSCE) format). At the completion of the simulation course, participants repeated the self-assessment survey. Descriptive statistics, Cronbach's alpha, Pearson's correlation coefficient, and paired-sample t tests were used to analyze the data. Thirty-five of 37 residents (9 FRCPC Emergency Medicine, 4 CCFP-Emergency Medicine, 17 CCFP, and 5 Internal Medicine) completed both survey instruments and the 8-hour course. Seventy-two percent of participants were PGY-1 or PGY-2. Mean age was 30.7 years. Cronbach's alpha for the survey instrument was 0.944. Pearson's correlation coefficient for the relationship between expert assessment and self-assessment was 0.69 (p < 0.001). The mean improvement in competence score pre- to post-intervention was 6.77 (p < 0.01, 95% CI 5.23-8.32). Residents from a variety of training programs (Internal Medicine, Emergency Medicine, and Family Medicine) demonstrated a statistically significant improvement in competence with critical resuscitation procedural skills following an intensive simulation-based training program. Self-assessment of competence was validated using correlation with expert assessments.
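For readers who want to see the shape of the inferential steps reported above, here is a minimal Python sketch using hypothetical scores: Pearson's correlation between expert OSCE checklist scores and self-assessed competence, and a paired-sample t test with a 95% confidence interval for the pre- to post-course change. The generated numbers are illustrative and are not the study data:

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 35                                               # residents completing both surveys
pre_self = rng.normal(40, 6, n)                      # hypothetical pre-course self-assessment
post_self = pre_self + rng.normal(7, 3, n)           # hypothetical post-course self-assessment
expert_osce = 0.7 * post_self + rng.normal(0, 4, n)  # hypothetical expert checklist scores

# Criterion validity: correlation between expert assessment and self-assessment.
r, p_corr = stats.pearsonr(expert_osce, post_self)

# Pre- to post-intervention change in self-assessed competence.
t_stat, p_paired = stats.ttest_rel(post_self, pre_self)
change = post_self - pre_self
ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=change.mean(),
                                   scale=stats.sem(change))

print(f"Pearson r = {r:.2f} (p = {p_corr:.3g})")
print(f"mean improvement = {change.mean():.2f}, "
      f"95% CI {ci_low:.2f} to {ci_high:.2f}, p = {p_paired:.3g}")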