Objective: The aim of this study was to assess adherence to the Consolidated Standards of Reporting Trials (CONSORT) for Abstracts by five high-impact general medical journals and to assess whether the quality of reporting was homogeneous across these journals.
Design: This is a descriptive, cross-sectional study.
Setting: Randomised controlled trial (RCT) abstracts in five high-impact general medical journals.
Participants: We used up to 100 RCT abstracts published between 2011 and 2014 from each of the following journals: The New England Journal of Medicine (NEJM), the Annals of Internal Medicine (Annals IM), The Lancet, the British Medical Journal (The BMJ) and the Journal of the American Medical Association (JAMA).
Main outcome: The primary outcome was per cent overall adherence to the 19-item CONSORT for Abstracts checklist. Secondary outcomes included per cent adherence in checklist subcategories and assessing homogeneity of reporting quality across the individual journals.
Results: Search results yielded 466 abstracts, 3 of which were later excluded as they were not RCTs. Analysis was performed on 463 abstracts (97 from NEJM, 66 from Annals IM, 100 from The Lancet, 100 from The BMJ, 100 from JAMA). Analysis of all scored items showed an overall adherence of 67% (95% CI 66% to 68%) to the CONSORT for Abstracts checklist. The Lancet had the highest overall adherence rate (78%; 95% CI 76% to 80%), whereas NEJM had the lowest (55%; 95% CI 53% to 57%). Adherence rates to 8 of the checklist items differed by >25% between journals.
Conclusions: Among the five highest impact general medical journals, there is variable and incomplete adherence to the CONSORT for Abstracts reporting checklist of randomised trials, with substantial differences between individual journals. Lack of adherence to the CONSORT for Abstracts reporting checklist by high-impact medical journals impedes critical appraisal of important studies. We recommend diligent assessment of adherence to reporting guidelines by authors, reviewers and editors to promote transparency and unbiased reporting of abstracts.
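The reported overall adherence figure can be sanity-checked with a short calculation. The sketch below is not the authors' analysis code; it assumes the 95% CI is a simple normal-approximation interval over all item-level observations (463 abstracts x 19 checklist items, assuming every item was scored), and under that assumption it reproduces the reported 66% to 68% interval.

```python
# Hedged sketch: reproduce the reported overall adherence CI under the
# assumption of a normal-approximation (Wald) interval over all item-level
# observations. Not the published analysis code.
import math

n_abstracts = 463                  # abstracts analysed
n_items = 19                       # CONSORT for Abstracts checklist items
n_obs = n_abstracts * n_items      # item-level observations (assumes all items scored)

p_hat = 0.67                       # reported overall adherence
se = math.sqrt(p_hat * (1 - p_hat) / n_obs)
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se

print(f"Overall adherence: {p_hat:.0%} (95% CI {lo:.0%} to {hi:.0%})")
# Prints roughly 66% to 68%, consistent with the reported interval.
```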
Background: This study explored the use of data generated by students during virtual patient encounters, investigating the association between unprofessional patient summary statements entered during an online virtual patient case and later detection of unprofessional behavior.
Method: At the USUHS, students complete a number of virtual patient encounters, including a patient summary, to meet the clerkship requirements of Internal Medicine, Family Medicine, and Pediatrics. We reviewed the summary statements of 343 students who graduated in 2012 and 2013. Each statement was rated on four features: Unprofessional, Professional, Equivocal (could be construed as unprofessional), and Unanswered (the student did not enter a statement). We also combined Unprofessional and Equivocal into a new category indicating a statement that received either rating. We then examined the associations between students' scores on these categories (i.e. whether or not a statement received a particular rating) and the Expertise and Professionalism scores from a post-graduate year one (PGY-1) program director (PD) evaluation form. The PD forms contained 58 Likert-scale items designed to measure the two constructs (Expertise and Professionalism).
Results: The inter-rater reliability of statement coding was high (Cohen's kappa = .97). Receiving an Unprofessional or Equivocal rating was significantly correlated with a lower Expertise score (r = −.19, P < .05) and a lower Professionalism score (r = −.17, P < .05) during PGY-1.
Conclusion: Most schools rely on incident reports and review of routine student evaluations to identify professionalism lapses. Unprofessionalism reflected in student entries may provide additional markers foreshadowing subsequent unprofessional behavior.
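For readers less familiar with the two statistics reported above, the sketch below illustrates, with made-up data rather than the study's ratings or PD scores, how Cohen's kappa for inter-rater agreement on statement coding and a point-biserial correlation between a binary rating and a continuous PGY-1 score can be computed. `cohen_kappa_score` and `pointbiserialr` are standard library functions; every count and score in the example is hypothetical.

```python
# Illustrative only: hypothetical data, not the study's ratings or PD scores.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from scipy.stats import pointbiserialr

rng = np.random.default_rng(0)
n_students = 343

# Two raters coding the same summary statements into four categories
rater_a = rng.integers(0, 4, size=n_students)
rater_b = rater_a.copy()
flip = rng.random(n_students) < 0.02               # small, arbitrary disagreement rate
rater_b[flip] = rng.integers(0, 4, size=flip.sum())
kappa = cohen_kappa_score(rater_a, rater_b)        # near 1 when raters rarely disagree

# Binary indicator (Unprofessional or Equivocal rating received) vs. a
# continuous PGY-1 Expertise score with a small built-in negative association
unprof = rng.integers(0, 2, size=n_students)
expertise = rng.normal(4.0, 0.5, size=n_students) - 0.25 * unprof
r, p = pointbiserialr(unprof, expertise)

print(f"kappa = {kappa:.2f}, r = {r:.2f}, p = {p:.3f}")
```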
Background: In articles reporting randomized controlled trials, professional medical writing support is associated with increased adherence to the Consolidated Standards of Reporting Trials (CONSORT). We set out to determine whether professional medical writing support was also associated with improved adherence to CONSORT for Abstracts.
Methods: Using data from a previously published cross-sectional study of 463 articles reporting randomized controlled trials published between 2011 and 2014 in five top medical journals, we determined the association between professional medical writing support and CONSORT for Abstracts items using a Wilcoxon rank-sum test.
Results: The mean proportion of CONSORT for Abstracts items reported was similar with and without professional medical writing support (64.3% vs 66.5%, respectively; p=0.30). Professional medical writing support was associated with lower adherence to reporting study setting (relative risk [RR], 0.40; 95% confidence interval [CI], 0.23–0.70) and higher adherence to disclosing harms/side effects (RR, 2.04; 95% CI, 1.37–3.03) and funding source (RR, 1.75; 95% CI, 1.18–2.60).
Conclusions: Although professional medical writing support was not associated with increased overall adherence to CONSORT for Abstracts, important aspects were improved with professional medical writing support, including reporting of adverse events and funding source. This study identifies areas to consider for improvement.
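As a rough illustration of the two analyses named in the Methods, the sketch below runs a Wilcoxon rank-sum test on per-article adherence proportions and computes a relative risk with a log-normal 95% CI for a single checklist item. All group sizes, adherence values and counts are hypothetical, not the study data, and the code is a sketch rather than the published analysis.

```python
# Hedged sketch with hypothetical inputs; not the published analysis.
import math
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)

# Per-article proportions of CONSORT for Abstracts items reported, split by
# professional medical writing support (group sizes and means are made up)
with_support = rng.normal(0.643, 0.15, size=120).clip(0, 1)
without_support = rng.normal(0.665, 0.15, size=343).clip(0, 1)
stat, p = ranksums(with_support, without_support)   # Wilcoxon rank-sum test

def relative_risk(a, n1, c, n2):
    """RR of item adherence in group 1 vs group 2, with a log-normal 95% CI."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    return rr, rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)

# e.g. hypothetical counts of articles reporting the funding source
rr, lo, hi = relative_risk(60, 120, 100, 343)
print(f"Wilcoxon p = {p:.2f}; RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```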
Significant opportunities exist to increase use of EHR functionalities and preserve physician-patient interactions and productivity in a resource-limited environment.