2020
DOI: 10.31222/osf.io/7gct9
Preprint

Towards wide-scale adoption of open science practices: The role of open science communities

Abstract: Open Science (OS) increases the quality, efficiency, and impact of science. This has been widely recognised by scholars, funders, and policy makers. However, despite the increasing availability of infrastructure supporting OS and the rise in policies and incentives to change behavior, OS practices are not yet the norm. While pioneering researchers are developing and embracing OS practices, the majority sticks to the status quo. To transition from pioneering to common practice, we need to engage a critical prop…


Cited by 11 publications (10 citation statements)
References 36 publications
“…Behaviors that may directly or indirectly improve replicability (or the ability to assess replicability) include increasing sample size, preregistration, improving rigor and transparency, sharing materials and primary data, conducting replications, and enhancement of error detection and correction. A variety of interventions and solutions have emerged in the last decade, including: tools supporting preregistration and sharing such as the Open Science Framework (OSF; Soderberg, 2018) and AsPredicted; error detection and correction such as statcheck (Epskamp & Nuijten, 2018) and GRIM (Granularity Related Inconsistent Means; Brown & Heathers, 2017); grassroots communities promoting new norms such as the Society for Improving Psychological Science, Open Science Communities (Armeni et al., 2020), and national reproducibility networks (Munafò et al., 2020); large-scale collaboration to increase sample size and replication efforts such as the Psychological Science Accelerator (Moshontz et al., 2018) and ManyBabies (Byers-Heinlein et al., 2020); increasing visibility of behaviors to shift norms such as badges for open practices (Kidwell et al., 2016); altering incentives for publishing away from positive, novel, tidy results with Registered Reports (Chambers, 2019; Scheel et al., 2020); and policy changes by publishers, funders, and institutions to encourage or require more rigor, transparency, and sharing such as the TOP Guidelines (Nosek et al., 2015).…”
Section: Evidence of Change
confidence: 99%
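The GRIM test named in the statement above rests on a simple arithmetic observation: a mean computed from n integer-valued observations must equal an integer sum divided by n, so many reported means are impossible for a given sample size. The sketch below is a minimal illustration of that consistency check, not the published GRIM tool; the function name `grim_consistent` is ours.

```python
import math

def grim_consistent(mean: float, n: int, decimals: int = 2) -> bool:
    """Can a mean reported to `decimals` places arise from n integer scores?"""
    implied_sum = mean * n
    # The true sum of integer observations must itself be an integer,
    # so test the two integer candidates nearest the implied sum.
    for total in (math.floor(implied_sum), math.ceil(implied_sum)):
        if round(total / n, decimals) == round(mean, decimals):
            return True
    return False

# With n = 25, achievable two-decimal means are multiples of 0.04:
print(grim_consistent(3.48, 25))  # True  (87 / 25 = 3.48)
print(grim_consistent(3.49, 25))  # False (no integer sum yields 3.49)
```

Checking the nearest integers on both sides of `mean * n`, rather than a single rounded value, guards against floating-point drift in the multiplication.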
“…Hiring, promotion, and tenure assessments of faculty at universities could reward transparently publishing all research results and openly sharing data, code, protocols, and other research materials (Moher et al., 2018). Universities also can provide training on open science practices through formal coursework on transparency, openness, and reproducibility for graduate students and postdoctoral fellows (Krishna & Peter, 2018), as well as support through fostering Open Science Communities at their institutions (Armeni et al., 2021). Given the costs involved in learning new knowledge and skills, universities and research institutions also can seek mechanisms to provide their students, faculty, and researchers with protected funding and time to develop proficiency in open science practices, such as resources to support data archiving (Gilmore et al., 2020).…”
Section: Universities and Research Institutions
confidence: 99%