Background Not thinking of a diagnosis is a leading cause of diagnostic error in the emergency department, resulting in delayed treatment, morbidity, and excess mortality. Electronic differential diagnostic support (EDS) yields small but significant reductions in diagnostic error, yet uptake of EDS by clinicians remains limited. Objective We sought to understand physician perceptions of, and barriers to, the uptake of EDS within the emergency department triage process. Methods We conducted a qualitative study in which a research associate was used to rapidly prototype an EDS embedded in the emergency department triage process. Physicians involved in triage assessment at a busy emergency department were provided, by the embedded researcher, the output of an EDS generated from the triage complaint, simulating an automated system that would draw from the electronic medical record. Physicians were interviewed immediately after their experience. Verbatim transcripts were analyzed by a team using open and axial coding, informed by directed content analysis. Results In all, 4 themes emerged from 14 interviews: (1) the quality of the EDS was inferred from the scope and prioritization of the diagnoses in the EDS differential; (2) trust in the EDS was linked to varied beliefs about the diagnostic process and the potential for bias; (3) clinicians foresaw more benefit of EDS use for colleagues and trainees than for themselves; and (4) clinicians felt strongly that EDS output should not be included in the patient record. Conclusions Adoption of an EDS into an emergency department triage process will require a system that provides diagnostic suggestions appropriate to the scope and context of triage, offers transparency of system design, accommodates clinician beliefs about the diagnostic process, and addresses clinician concerns about including EDS output in the patient record.
Background Problem-based learning (PBL) is a common instructional method in undergraduate health professions training. Group interactions within PBL curricula may influence learning outcomes, yet few studies have synthesized the existing evidence. This scoping review summarized the literature examining the influence of group function on individual student PBL outcomes. Following Kirkpatrick's framework, experiential, academic, and behavioral outcomes were considered. The impacts of three aspects of group function were explored: (1) Group Composition (identities and diversity), (2) Group Processes (conduct and climate, motivation and confidence, and facilitation), and (3) PBL Processes (tutorial activities). Methods A literature search was conducted in Medline, CINAHL, and APA PsycInfo from 1980 to 2021, with the help of a librarian. English-language empirical studies and reviews that related group function to learning outcomes, as defined above, in undergraduate health professions PBL curricula were included. Relevant references from included articles were also added if they met the eligibility criteria. The methods, results, discussions, and limitations of the sample were summarized narratively. Results The final sample (n = 48) varied greatly in context, design, and results. Most studies examined junior medical students (n = 32), used questionnaires for data collection (n = 29), and reported immediate cross-sectional outcomes (n = 34). Group Processes was the most frequently examined aspect of group function (n = 29), followed by Group Composition (n = 26) and PBL Processes (n = 12). The relationships between group function and outcomes were not consistent across studies. PBL experiences were generally highly rated, but favorable student experiences were not reliable indicators of better academic or behavioral outcomes. Conversely, problematic group behaviors did not predict poorer grades. Common confounders of outcome measurement included exam pressure and self-study. Conclusions The main findings of the review suggest that (1) group function is more predictive of experiential than of academic or behavioral PBL outcomes, and (2) different Kirkpatrick levels of outcome are not highly correlated with each other. More research is needed to understand the complexity of group function in PBL tutorials across variable study contexts and to better inform curricular training and design. Standardized tools for measuring PBL group function may be required for more conclusive findings.
Purpose Postgraduate medical education in Canada has quickly transformed to a competency-based model featuring new entrustable professional activities (EPAs) and associated milestones. It remains unclear, however, how these milestones are distributed between the central medical expert role and the 6 intrinsic roles of the larger CanMEDS competency framework. A document review was therefore conducted to measure how many EPA milestones are classified under each CanMEDS role, focusing on the overall balance between representation of the intrinsic roles and that of medical expert. Method Data were extracted from the EPA guides of 40 Canadian specialties in 2021 to measure the percentage of milestones formally linked to each role. Subsequent analyses explored whether this distribution differed when milestones were separated by stage of postgraduate training, weighted by an EPA's minimum number of observations, or sorted by surgical versus medical specialty. Results Approximately half of all EPA milestones (mean = 48.6%; 95% confidence interval [CI] = 45.9, 51.3) were classified under intrinsic roles overall. However, representation of the health advocate role was consistently low (mean = 2.95%; 95% CI = 2.49, 3.41), and some intrinsic roles—mainly leader, scholar, and professional—were more heavily concentrated in the final stage of postgraduate training. These findings held true under all conditions examined. Conclusions The observed distribution of roles in EPA milestones fits with high-level descriptions of CanMEDS, in which intrinsic roles are viewed as inextricably linked to medical expertise, implying that both are equally important to cultivate through curricula. Yet a fine-grained analysis suggests that the low prevalence or late emphasis of some intrinsic roles may hinder how they are taught or assessed. Future work should explore whether the quantity or timing of milestones shapes the perceived value of each role, as well as other factors that determine the optimal distribution of roles throughout training.
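As a rough illustration of the tallying described in the abstract above, the sketch below computes the percentage of EPA milestones mapped to intrinsic CanMEDS roles for each specialty and a normal-approximation 95% confidence interval for the mean percentage. The specialty names and milestone counts are fabricated for demonstration, and the study's exact CI method is not specified here.

```python
# Minimal sketch, not the authors' analysis: per-specialty percentage of EPA
# milestones linked to intrinsic CanMEDS roles, with a normal-approximation
# 95% CI for the mean percentage. All counts below are invented.
import math
import statistics

# (intrinsic-role milestones, total milestones) per specialty -- hypothetical
specialties = {
    "Specialty A": (120, 250),
    "Specialty B": (98, 210),
    "Specialty C": (143, 280),
    "Specialty D": (105, 230),
}

pct_intrinsic = [100 * k / n for k, n in specialties.values()]
mean_pct = statistics.mean(pct_intrinsic)
sem = statistics.stdev(pct_intrinsic) / math.sqrt(len(pct_intrinsic))
low, high = mean_pct - 1.96 * sem, mean_pct + 1.96 * sem

print(f"Mean % intrinsic: {mean_pct:.1f} (95% CI {low:.1f}, {high:.1f})")
```

The same per-specialty percentages could then be re-aggregated by training stage or by surgical versus medical specialty, as the abstract's subgroup analyses describe.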
Background: Interprofessional feedback is increasingly emphasized within health professions training programs. The objective of this scoping review was to determine what is known about how learners perceive and interact with feedback in an interprofessional learning context. Methods: A search strategy was developed and conducted in Ovid MEDLINE. Title and abstract screening was performed by two reviewers independently. Next, the full texts of selected articles were reviewed by one reviewer to determine the articles included in the review. Data extraction was performed to identify each article's study population, methodology, and outcomes relevant to the research objective. Results: Our analysis of the relevant outcomes yielded four key concepts: (1) issues with the feedback process and the need for training; (2) learners' perceptions of feedback providers, which affect how the feedback is used; (3) the professions of the feedback providers, which affect the feedback process; and (4) learners' own attitudes toward feedback, which affect the feedback process. Conclusions: The learner's perception of interprofessional feedback can be an obstacle in the feedback process. Training around interprofessional feedback should be included as part of interprofessional programs. Research is needed to explore how to address barriers in feedback interaction that stem from misguided perceptions of feedback providers' professions.
Introduction Newer electronic differential diagnosis supports (EDSs) are efficient and effective at improving diagnostic skill. Although these supports are encouraged in practice, they are prohibited in medical licensing examinations. The purpose of this study was to determine how using an EDS affects examinees' results when answering clinical diagnosis questions. Method The authors recruited 100 medical students from McMaster University (Hamilton, Ontario) to answer 40 clinical diagnosis questions in a simulated examination in 2021. Of these, 50 were first-year students and 50 were final-year students. Participants from each year of study were randomised into one of two groups. During the survey, half of the students had access to Isabel (an EDS) and half did not. Differences were explored using analysis of variance (ANOVA), and reliability estimates were compared for each group. Results Test scores were higher for final-year versus first-year students (53 ± 13% versus 29 ± 10%, p < 0.001) and higher with the use of the EDS (44 ± 28% versus 36 ± 26%, p < 0.001). Students using the EDS took longer to complete the test (p < 0.001). Internal consistency reliability (Cronbach's alpha) increased with EDS use among final-year students but decreased among first-year students, although this effect was not significant. A similar pattern was noted in item discrimination, where the effect was significant. Conclusion EDS use during diagnostic licensing-style questions was associated with modest improvements in performance, increased discrimination among senior students, and increased testing time. Given that clinicians have access to EDSs in routine clinical practice, allowing EDS use for diagnostic questions would maintain the ecological validity of testing while preserving important psychometric test characteristics.
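For readers unfamiliar with the reliability statistic compared between groups above, the sketch below shows one conventional way to compute Cronbach's alpha from an examinee-by-item score matrix. The data are randomly generated stand-ins, not the study's results, and the group labels are illustrative only.

```python
# Illustrative sketch only (not the study's data or code): Cronbach's alpha
# computed from a binary examinee-by-item score matrix.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: examinees x items matrix of item scores (e.g., 0/1)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(2021)
with_eds = rng.integers(0, 2, size=(50, 40))     # 50 students x 40 items, EDS group
without_eds = rng.integers(0, 2, size=(50, 40))  # 50 students x 40 items, control

print("alpha with EDS:   ", round(cronbach_alpha(with_eds), 2))
print("alpha without EDS:", round(cronbach_alpha(without_eds), 2))
```

In the study's design, such alphas would be computed separately within each year of study and compared between the EDS and non-EDS groups, alongside the ANOVA on total scores.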