2022
DOI: 10.3389/frma.2021.751734
RipetaScore: Measuring the Quality, Transparency, and Trustworthiness of a Scientific Work

Abstract: A wide array of existing metrics quantifies a scientific paper's prominence or the author's prestige. Many who use these metrics assume that higher citation counts or more public attention must indicate more reliable, higher-quality science. While current metrics offer valuable insight into scientific publications, they are an inadequate proxy for measuring the quality, transparency, and trustworthiness of published research. Three essential elements to establishing trust in a work include: trust in …

Cited by 7 publications (2 citation statements)
References 9 publications
“…This means that the task of spreading and checking best practices is difficult. Checklists can help guide best practices, and enforcing these checklists should lead to improved reporting standards [ 41 ], but given the scale of publishing, the use of automatic checklist tools such as SciScore and others, more focused tools such as Barzooka (continuous data in bar graphs), JetFighter (color-blind accessibility in visualizations), ODDPub (data and code availability), and RipetaScore (authorship, ethics, and data or code availability) [ 42 - 45 ], should help authors and reviewers improve manuscripts and address common checklist items and omissions consistently across many journals. In addition, automatic checklist completion can only help speed up the review process, which is a notoriously slow endeavor [ 46 ].…”
Section: Discussion
confidence: 99%
“…Bibliometrics should encompass indicators of best research practices (e.g., frequency of data sharing, code sharing, protocol registration, and replications) as a free, publicly available resource covering all the open-access literature [ 12 ]. Examples include PLOS’s Open Science Indicators , which are currently capturing data sharing in repositories, code sharing, and preprint posting, and the Dimensions Research Integrity product and its proposed Ripeta Score [ 13 ]. Centralized open-access assessments can also scrutinize for elements of poor research practices (e.g., signs of image manipulation) at a massive scale.…”
confidence: 99%