Background: Adherence to study registration and reporting best practices is vital to foster evidence-based medicine. Poor adherence to these standards in clinical trials conducted in Canada would be detrimental to patients, researchers, and the public alike.

Methods: All registered clinical trials on ClinicalTrials.gov conducted in Canada as of 2009 and completed by 2019 were identified. A cross-sectional analysis of those trials assessed prospective registration, subsequent result reporting in the registry, and subsequent publication of study findings. The lead sponsor, phase of study, clinical trial site location, total patient enrollment, number of arms, type of masking, type of allocation, year of completion, and patient demographics were examined as potential effect modifiers of these best practices.

Results: A total of 6,720 trials met the inclusion criteria. From 2009 to 2019, 59% (n = 3,967) were registered prospectively and 39% (n = 2,642) reported their results in the registry. Of the trials registered between 2009 and 2014, 55% (n = 1,482) were subsequently published in an academic journal. Of the 3,763 trials conducted exclusively in Canada, 3% (n = 123) met all three criteria: prospective registration, results reporting in the registry, and publication of findings. In contrast, of the remaining 2,957 trials with both Canadian and international sites, 41% (n = 1,238) met all three criteria. Overall, the odds of adhering to all three practices concurrently were 95% lower for trials conducted exclusively in Canada than for trials with international sites (OR = 0.05; 95% CI: 0.04–0.06).

Conclusion: Canadian clinical trials substantially lacked adherence to study registration and reporting best practices. Knowledge of this widespread non-compliance should motivate stakeholders in the Canadian clinical trials ecosystem to address and continue to monitor this problem. The data presented provide a baseline against which to compare any improvement in the registration and reporting of clinical trials in Canada.
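The odds ratio above can be checked against the counts reported in the abstract. The Python sketch below assumes the reported OR is a crude (unadjusted) odds ratio with a Wald-type 95% confidence interval computed on the log scale; the abstract does not state how the estimate was obtained, so this is an illustration rather than the authors' actual analysis.

from math import exp, log, sqrt

# 2x2 table reconstructed from the abstract's counts (assumption: the
# reported OR is a crude odds ratio comparing exclusively Canadian
# trials with trials that also had international sites).
a = 123            # exclusively Canadian, adherent to all three practices
b = 3763 - 123     # exclusively Canadian, non-adherent
c = 1238           # Canadian + international sites, adherent
d = 2957 - 1238    # Canadian + international sites, non-adherent

odds_ratio = (a / b) / (c / d)

# Wald 95% CI: exponentiate log(OR) +/- 1.96 standard errors.
se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lower = exp(log(odds_ratio) - 1.96 * se)
upper = exp(log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI: {lower:.2f}-{upper:.2f}")
# Prints: OR = 0.05, 95% CI: 0.04-0.06, matching the reported values.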
Background: Peer review is an integral part of maintaining the current standard of scientific publishing. Despite this, there is no training standard for peer reviewers, and review guidelines tend to vary between journals. The purpose of this study was to conduct a systematic review of all openly available online training in scholarly peer review and to analyze its characteristics.

Methods: MEDLINE, PsycINFO, Embase, ERIC, and Web of Science were systematically searched. Additional grey literature searches were conducted on Google, YouTube, university library websites, publisher websites, and peer review related events and groups. All English or French training materials in scholarly peer review of biomedical manuscripts that were openly accessible online on the search date (September 12, 2021) were included. Sources created prior to 2012 were excluded. Screening was conducted in duplicate in two separate phases: title and abstract, followed by full text. Data extraction was conducted by one reviewer and verified by a second. Conflicts were resolved by a third party at both stages. Characteristics were reported using frequencies and percentages. A directed content analysis was performed using predefined topics of interest based on existing checklists for peer reviewers. A risk of bias tool was purpose-built for this study to evaluate whether the included training material was evidence-based. The tool was applied in duplicate, with conflicts resolved through discussion between the two reviewers.

Results: After screening 1,244 records, 45 sources met the inclusion criteria; however, 23 of the 45 (51%) could not be fully accessed for data extraction. The most common barriers to access were membership requirements (n = 11 of 23, 48%), availability for a limited time only (n = 8, 35%), and paywalls, with an average cost of $99 USD (n = 7, 30%). The remaining 22 documents were included in the data analysis. All documents were published in English. Most documents either did not report a publication date (n = 10, 45%) or were created in the last five years (n = 10, 45%). The most common training format was an online module (n = 12, 57%), with an estimated completion time of less than one hour (n = 15, 68%). The most frequently covered topics were how to write a peer review report (n = 20, 91%), critical appraisal of data and results (n = 18, 82%), and a definition of peer review (n = 18, 82%). Critical appraisal of reporting guidelines (n = 9, 41%), clinical trials (n = 4, 18%), and statistical analysis (n = 4, 18%) were covered less often.

Conclusion: Our comprehensive search of the literature identified 22 openly accessible online training materials in manuscript peer review. For such a crucial step in the dissemination of literature, this lack of training could help explain disparities in the quality of scholarly publishing. Future efforts should focus on creating a unified, openly accessible online training program in manuscript peer review.
Background: Despite having a crucial role in scholarly publishing, peer reviewers are typically not required to have any training. The purpose of this study was to conduct an international survey on the current perceptions and motivations of researchers regarding peer review training.

Methods: A cross-sectional online survey of biomedical researchers was conducted. A total of 2,000 corresponding authors from 100 randomly selected medical journals were invited via email. Quantitative items were reported using frequencies and percentages or means and standard errors, as appropriate. A thematic content analysis was conducted for qualitative items, in which two researchers independently assigned codes to the responses for each written-text question and subsequently grouped the codes into themes. A descriptive definition of each category was then created, and unique themes, as well as the number and frequency of codes within each theme, were reported.

Results: A total of 186 participants completed the survey, of whom 14 were excluded. The majority of participants indicated they were men (n = 97 of 170, 57.1%), independent researchers (n = 108 of 172, 62.8%), and primarily affiliated with an academic organization (n = 103 of 170, 62.8%). A total of 144 of 171 participants (84.2%) indicated they had never received formal training in peer review. Most participants (n = 128, 75.7%) agreed, 41 (32.0%) of them strongly, that peer reviewers should receive formal training in peer review before acting as a peer reviewer. The most preferred training formats were online courses, online lectures, and online modules. Most respondents (n = 111 of 147, 75.5%) stated that difficulty finding and/or accessing training was a barrier to completing training in peer review.

Conclusion: Although such training is desired, most biomedical researchers have not received formal training in peer review and indicated that training was difficult to access or not available.