Context: The principles of evidence-based medicine (EBM) equip clinicians to identify, source, appraise and integrate research evidence into medical decision making. Despite the mantra of EBM encouraging the use of evidence to inform practice, there appears to be little evidence on how best to teach EBM to medical trainees. A systematic review was performed to identify which type of educational method is most effective at increasing medical trainees' competency in EBM.

Methods: A systematic review of randomised controlled trials (RCTs) was performed. Electronic searches were conducted across three databases. Two reviewers independently searched, extracted and reviewed the articles. The quality of each study was assessed using the Cochrane Collaboration's risk of bias assessment tool.

Results: In total, 177 citations were returned, of which 14 were RCTs and were examined in full text. Nine of these studies met the inclusion criteria and were included in this review. Learner competency in EBM increased post-intervention across all studies. However, no difference in learner outcomes was identified across a variety of educational modes, including lecture versus online delivery, directed versus self-directed learning, multidisciplinary versus discipline-specific groups, and lecture versus active small-group facilitated learning.

Conclusions: The body of evidence available to guide educators on how to teach EBM to medical trainees is small, albeit of good quality. The major limitations in assessing risk of bias were the inability to blind participants to an educational intervention and a lack of clarity regarding certain aspects within studies. Further evidence, and transparency in design, is required to guide the development and implementation of educational strategies in EBM, including modes of teaching and the timing of delivering EBM content within the broader medical curriculum. Further research is required to determine the effects of the timing, content and length of EBM courses and teaching methods.
SP-based education is widely accepted as a valuable and effective means of teaching communication skills, but there is limited evidence of how this translates into patient outcomes and no indication of an economic benefit of this type of training over another method.
Context: High-quality research into education costs can inform better decision making. Improvements to cost research can be guided by information about the research questions, methods and reporting of studies evaluating costs in health professions education (HPE). Our objective was to appraise the overall state of the field and evaluate temporal trends in the methods and reporting quality of cost evaluations in HPE research.

Methods: We searched the MEDLINE, CINAHL (Cumulative Index to Nursing and Allied Health Literature), EMBASE, Business Source Complete and ERIC (Education Resources Information Centre) databases on 31 July 2017. To evaluate trends over time, we sampled research reports at 5-year intervals (2001, 2006, 2011 and 2016). All original research studies in HPE that reported a cost outcome were included. The Medical Education Research Study Quality Instrument (MERSQI) and the BMJ economic checklist were used to appraise methodological and reporting quality, respectively. Trends in quality over time were analysed.

Results: A total of 78 studies were included, of which 16 were published in 2001, 15 in 2006, 20 in 2011 and 27 in 2016. The region most commonly represented was the USA (n = 43). The profession most commonly referred to was that of the physician (n = 46). The mean ± standard deviation (SD) MERSQI score was 10.9 ± 2.6 out of 18, with no significant change over time (p = 0.55). The mean ± SD BMJ score was 13.5 ± 7.1 out of 35, with no significant change over time (p = 0.39). A total of 49 (63%) studies stated a cost-related research question, 23 (29%) stated the type of cost evaluation used, and 31 (40%) described the method of estimating resource quantities and unit costs. A total of 16 studies compared two or more interventions and reported both cost and learning outcomes.

Conclusions: The absolute number of cost evaluations in HPE is increasing. However, there are shortcomings in the quality of methodology and reporting, and these are not improving over time.
Effective education in practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compared the efficacy of two innovative practical skill teaching methods against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a randomised controlled trial with concealed allocation and blinded participants and outcome assessment. Each of the three randomly allocated groups was exposed to a different practical skills teaching method (traditional live tutoring, pre-recorded video tutorial or student self-video) for two specific practical skills during the semester. Clinical performance was assessed using an objective structured clinical examination (OSCE). Participants were also administered a questionnaire to gauge their level of satisfaction with the teaching method and their perceptions of its educational value. There were no significant differences between the three practical skill teaching methods in clinical performance as measured by the OSCE, or in student ratings of satisfaction. A significant difference existed between the methods in student ratings of perceived educational value, with the pre-recorded video tutorial and student self-video approaches rated higher than traditional live tutoring. Alternative teaching methods to traditional live tutoring can produce equivalent learning outcomes when applied to the practical skill development of undergraduate health professional students. The use of alternative practical skill teaching methods may allow greater flexibility in the allocation of both staff and infrastructure resources.
Background: Simulation education can be costly. However, costs need to be considered against what is obtained in return to determine whether they are justified. Unfortunately, in simulation education, evaluations that yield information about the return on investment are scarce. An economic evaluation provides a comparison of value. In short: what is being obtained, what must be given up to get it, and how does that compare with what the next best alternative offers? When educators are equipped with this knowledge, they will be better informed about the place that simulation-based learning approaches should take in optimal course structures.

Main body: This article provides an overview of the costs and consequences associated with simulation in healthcare education. It outlines the benefits of using economic evaluations to inform decision making by educators and clinicians concerning the most appropriate educational approaches. It also provides guidance for educational researchers interested in investigating the cost and value of their innovations.

Conclusion: Measures of cost and value in simulation are required to provide information about the viability and sustainability of simulation education, enabling simulation education in health care to demonstrate its worth.
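The comparison of value described above is commonly summarised as an incremental cost-effectiveness ratio (ICER): the extra cost of one educational approach over the next best alternative, divided by the extra effect obtained. The following is a minimal sketch of that arithmetic in Python; the `icer` function and all figures (per-learner costs, OSCE score gains) are hypothetical illustrations, not data from the article.

```python
# Hypothetical economic comparison of two course designs, in the spirit of
# the economic evaluations described above. All numbers are illustrative.

def icer(cost_new, effect_new, cost_alt, effect_alt):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect when choosing the new approach over the next best alternative."""
    delta_cost = cost_new - cost_alt
    delta_effect = effect_new - effect_alt
    if delta_effect == 0:
        raise ValueError("No difference in effect; compare costs directly.")
    return delta_cost / delta_effect

# Simulation-based course versus a lecture-based alternative (made-up data):
# costs in dollars per learner, effects as mean OSCE score gain.
simulation = {"cost": 950.0, "effect": 14.0}
lecture = {"cost": 300.0, "effect": 11.5}

ratio = icer(simulation["cost"], simulation["effect"],
             lecture["cost"], lecture["effect"])
print(f"ICER: ${ratio:.2f} per additional OSCE point")
# -> ICER: $260.00 per additional OSCE point
```

Whether an extra $260 per OSCE point is worth paying is a judgement for the educator, and that judgement is exactly what the article argues economic evaluations should inform.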