Objective Clinical practice guidelines (CPGs) underpin patient care, and ideally their authors would be free from outside influence. However, studies have repeatedly shown that authors of professional society CPGs receive large sums of money from pharmaceutical companies, creating financial conflicts of interest. This study investigated industry payments catalogued in the Open Payments Database (OPD) that were received by authors of American College of Rheumatology (ACR) CPGs. Methods Guidelines on the ACR web site published during or after August 2014 were used to retrieve the list of authors. All general, research, associated research, and ownership payments reported in the OPD during the 12 months preceding publication of each CPG were extracted in parallel by 2 blinded investigators. Results Of the 89 US‐based physician‐authors of the 5 ACR CPGs identified within the study timeframe, 56 (62.9%) had received at least 1 payment according to OPD records. These 56 authors had received a median of $522 (interquartile range $119–2,500), for a combined total of $9,728,751. Nineteen authors had received at least 1 industry payment relevant to the CPG recommendations, for a median of $748 and a total of $1,961,362 in relevant payments. A substantial proportion of these relevant payments ($699,561, or 35.7% of the total) was undisclosed. Conclusion Fewer than one‐quarter of the US‐based physician‐authors of ACR CPGs published during or after August 2014 had received guideline‐relevant industry payments; nonetheless, a substantial proportion of the money received was not disclosed. Conflict of interest disclosure is a bare minimum requirement, and more durable solutions may include divestiture or inclusion of more nonconflicted authors.
Objectives As much as 50%–90% of research is estimated to be irreproducible, costing upwards of $28 billion in the USA alone. Reproducible research practices are essential to improving the reproducibility and transparency of biomedical research, including preregistering studies, publishing a protocol, making research data and metadata publicly available, and publishing in open access journals. Here we report an investigation of key reproducible and transparent research practices in the published oncology literature. Design We performed a cross-sectional analysis of a random sample of 300 oncology publications published from 2014 to 2018. Blinded investigators extracted key reproducibility and transparency characteristics in duplicate using a pilot-tested Google Form. Primary outcome measures The primary outcome of this investigation is the frequency of key reproducible or transparent research practices in the published biomedical and clinical oncology literature. Results Of the 300 publications randomly sampled, 296 were analysed for reproducibility characteristics. Of these 296 publications, 194 contained empirical data that could be analysed for reproducible and transparent research practices. Raw data were available for nine studies (4.6%). Five publications (2.6%) provided a protocol. Although our sample included 15 clinical trials and 7 systematic reviews/meta-analyses, only 7 included a preregistration statement. Approximately one-third (65/194) of publications provided an author conflict of interest statement. Conclusion We found that key reproducibility and transparency characteristics were largely absent from a random sample of published oncology publications. We recommend required preregistration for all eligible trials and systematic reviews, published protocols for all manuscripts, and deposition of raw data and metadata in public repositories.
Objective It has been estimated that much health research is wasted, resulting in billions of dollars of wasteful research spending worldwide each year. Given the increased use of randomized trials and their influence on medicine, one way to combat research waste is to conduct randomized clinical trials (RCTs) only when a systematic review (SR) suggests more data are needed or when no previous SRs are identified. Here, we analyzed RCTs to determine whether SRs were cited as justification for conducting a trial. Methods We analyzed phase III RCTs published between 2016 and 2018 in the New England Journal of Medicine, Lancet, and JAMA. We performed duplicate and independent data extraction to ensure the accuracy and validity of our data. For each trial, we extracted whether SRs were cited as justification for conducting the clinical trial. Results We examined 637 RCTs, which cited 728 SRs. Overall, 38.1% (243/637) of RCTs cited an SR as justification for the trial, either verbatim (6.9%; 44/637) or inferred (31.2%; 199/637). The 79 remaining RCTs cited SRs in other ways. Approximately 49.5% (315/637) of RCTs did not cite an SR. Conclusions Fewer than half of the analyzed clinical trials cited an SR as the basis for undertaking the trial. We believe trialists should be required to present relevant SRs to an ethics or peer review committee demonstrating an unmet need before initiating a trial. Eliminating research waste is both a scientific and an ethical responsibility.
Context Traditionally, the Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to incorporate research and other scholarly activities into their training curricula. Encouraging residents to publish during residency is believed to promote research throughout their careers; however, no study has attempted to quantify research productivity among orthopedic surgery residents before, during, and after residency. Objectives To determine whether publishing in peer-reviewed journals during orthopedic residency was an indicator of continued academic achievement after graduation. Methods This observational study employed a cross-sectional design. We examined whether research output during orthopedic residency was associated with academic advancement or continued research involvement after residency. We identified 201 orthopedic residency programs on the Doximity website and randomly selected 50 for our sample; graduate rosters were located for 31 of these programs, which were subsequently included. For the 341 graduates identified, we recorded the number of peer-reviewed publications, h-indices, fellowships, and whether the graduate pursued a career in private practice or academia. Results Orthopedic residency graduates from the 31 programs published a total of 1923 peer-reviewed manuscripts. On average, residents had 5.6 publications and an h-index of 3.2. Residents entering academia and pursuing fellowships had significantly more total publications, more first-author publications, and higher h-indices than those who did not enter academia or pursue a fellowship. Conclusions Increased research productivity was associated with continued academic pursuits and an increased likelihood of pursuing fellowship training after residency.