The skill of delivering bad news is difficult to teach and evaluate. Residents may practice in simulated settings; however, that practice may not translate to confidence or competence during real encounters. We investigated the acceptability and feasibility of using social workers as evaluators of residents' delivery of bad news during patient encounters and assessed the attitudes of both groups toward this process. From August 2013 to June 2014, emergency medicine residents completed self-assessments after delivering bad news, and social workers completed evaluations after observing these conversations. The assessment tools were designed by modifying the global Breaking Bad News Assessment Scale. Residents and social workers completed post-study surveys. A total of 37 evaluations were received: 20 completed by social workers and 17 resident self-evaluations. Social workers reported discussing the plan with residents prior to the conversation 90 % of the time (18/20; 95 % CI 64.5–97.8). Social workers who had previously observed a resident delivering bad news reported that the resident was more skilled on subsequent encounters 90 % of the time (95 % CI 42.2–99). Both social workers and residents felt that prior training or experience was important. First-year residents valued advice from social workers less than advice from attending physicians, whereas more experienced residents perceived social workers' advice as equivalent to that of attending physicians (40 versus 2.9 %, p = 0.002). Social worker assessment of residents' ability to deliver bad news is feasible and acceptable to both groups. This formalized self-assessment and evaluation process highlights the importance of social workers' involvement in the delivery of bad news and in the teaching of this skill. This method may also be used as a form of direct observation for resident milestone assessment.
Background: The transition from medical student to first-year intern can be challenging. The stress of increased responsibilities, the gap between performance expectations and varying levels of clinical skills, and the need to adapt to a new institutional space and culture can make this transition overwhelming. Orientation programs intend to help new residents prepare for their new training environment. Objective: To ease our interns' transition, we piloted a novel clinical primer course. We believe this course will provide an introduction to basic clinical knowledge and procedures, without affecting time allotted for mandatory orientation activities, and will help the interns feel better prepared for their clinical duties. Methods: First-year Emergency Medicine residents were invited to participate in this primer course, called the Introductory Clinician Development Series (or "intern boot camp"), which provided optional lecture and procedural skills instruction prior to their participation in the mandatory orientation curriculum and assumption of clinical responsibilities. Participating residents completed post-course surveys asking for feedback on the experience. Results: Survey responses indicated that the intern boot camp helped first-year residents feel more prepared for their clinical shifts in the Emergency Department. Conclusion: An optional clinical introductory series can allow for maintenance of mandatory orientation activities and clinical shifts while easing the transition from medical student to clinician.
Background: Clinical reasoning is an essential skill to be learned during medical education. A developmental framework for the assessment and measurement of this skill has not yet been described in the literature. Objective: The authors describe the creation and pilot implementation of a rubric designed to assess the development of clinical reasoning skills in pre-clinical medical education. Design: The multidisciplinary course team used Backwards Design to develop course goals, objectives, and assessments for a new Clinical Reasoning Course. The team focused on behaviors that students were expected to demonstrate, identifying each as a 'desired result' element and aligning these with three levels of performance: emerging, acquiring, and mastering. Results: The first draft of the rubric was reviewed and piloted by faculty using sample student entries; this provided feedback on ease of use and appropriateness. After the first semester, the course team evaluated whether the rubric distinguished between different levels of student performance in each competency. A systematic, descriptive analysis of mid- and end-of-semester assessments of student performance revealed that over half the students received higher competency scores at the end of the semester than at mid-semester. Conclusion: The assessment rubric allowed students in the early stages of clinical reasoning development to understand their trajectory and provided faculty a framework from which to give meaningful feedback. The multidisciplinary background of the course team supported a systematic and robust course and assessment design process. The authors strongly encourage other colleges to support the use of collaborative and multidisciplinary course teams.