Background: Spaced education is a novel method that improves medical education through online repetition of core principles, often paired with multiple-choice questions. This model is a proven teaching tool for medical students, but its effect on resident learning is less established. We hypothesized that repetition of key clinical concepts in a "Clinical Pearls" format would improve knowledge retention in medical residents.
Methods: This study investigated spaced education using a novel, email-based reinforcement program and a randomized, self-matched design in which residents were quizzed on medical knowledge that either was or was not reinforced with electronically administered spaced education. Both reinforced and non-reinforced knowledge was later tested with four quizzes.
Results: Overall, respondents incorrectly answered 395 of 1008 questions (0.39; 95% CI, 0.36–0.42). Incorrect response rates varied by quiz (range 0.34–0.49; p = 0.02) but not significantly by post-graduate year (PGY1 0.44, PGY2 0.33, PGY3 0.38; p = 0.08). Although there was no evidence of benefit among residents overall (RR = 1.01; 95% CI, 0.83–1.22; p = 0.95), we observed a significantly lower risk of incorrect responses to reinforced material among interns (RR = 0.83; 95% CI, 0.70–0.99; p = 0.04).
Conclusions: Overall, repetition of Clinical Pearls did not statistically improve test scores among junior and senior residents. Among interns, however, repetition of the Clinical Pearls was associated with significantly higher test scores, perhaps reflecting their greater attendance at didactic sessions and engagement with Clinical Pearls. Although the study was limited by a low response rate, we employed test and control questions within the same quiz, limiting the potential for selection bias. Further work is needed to determine the optimal spacing and content load of Clinical Pearls to maximize retention among medical residents. This particular protocol of spaced education, however, was unique and readily reproducible, suggesting its potential efficacy for intern education within a large residency program.
Background: It is unclear whether the 30-day unplanned hospital readmission rate is a plausible accountability metric.
Objective: To compare the preventability of hospital readmissions between an early period (0–7 days post-discharge) and a late period (8–30 days post-discharge), and to compare causes of readmission and the frequency of markers of clinical instability within 24 h prior to discharge between early and late readmissions.
Design, Setting, Patients: 120 patient readmissions in an academic medical center between January 1, 2009 and December 31, 2010.
Measures: A sum score, based on a standard algorithm, assessing the preventability of each readmission from blinded hospitalist review; average causation score for seven types of adverse events; and rates of markers of clinical instability within 24 h prior to discharge.
Results: Readmissions were significantly more preventable in the early period than in the late period (median preventability sum score 8.5 vs. 8.0; p = 0.03). There were significantly more management errors as causative events for readmission in the early period than in the late period (mean causation score [scale 1–6, 6 most causal] 2.0 vs. 1.5; p = 0.04), and these errors were significantly more preventable in the early period (mean preventability score 1.9 vs. 1.5; p = 0.03). Patients readmitted in the early period were significantly more likely to have mental status changes documented within 24 h prior to hospital discharge than patients readmitted in the late period (12% vs. 0%; p = 0.01).
Conclusions: Readmissions occurring in the early period were significantly more preventable. Early readmissions were associated with more management errors and with mental status changes within 24 h prior to discharge. Seven-day readmissions may be a better accountability measure.
Overall, residents' self-assessed ratings of their attitudes toward teaching improved after participation in a RasT workshop. A further subanalysis showed that residents in primary care specialties had a significantly greater increase in their ratings than residents in non-primary-care specialties.