This crowdsourced project introduces a collaborative approach to improving the reproducibility of scientific research, in which findings are replicated in qualified independent laboratories before (rather than after) they are published. Our goal is to establish a non-adversarial replication process with highly informative final results. To illustrate the Pre-Publication Independent Replication (PPIR) approach, 25 research groups conducted replications of all ten moral judgment effects that the last author and his collaborators had "in the pipeline" as of August 2014. Six findings replicated according to all replication criteria, one finding replicated but with a significantly smaller effect size than the original, one finding replicated consistently in the original culture but not outside of it, and two findings failed to replicate. In total, 40% of the original findings failed at least one major replication criterion. Potential ways to implement and incentivize pre-publication independent replication on a large scale are discussed.
We present the data from a crowdsourced project seeking to replicate findings in independent laboratories before (rather than after) they are published. In this Pre-Publication Independent Replication (PPIR) initiative, 25 research groups attempted to replicate 10 moral judgment effects from a single laboratory’s research pipeline of unpublished findings. The 10 effects were investigated using online/lab surveys containing psychological manipulations (vignettes) followed by questionnaires. Results revealed a mix of reliable, unreliable, and culturally moderated findings. Unlike any previous replication project, this dataset includes the data from not only the replications but also from the original studies, creating a unique corpus that researchers can use to better understand reproducibility and irreproducibility in science.
The theory of mental workload suggests that job aids should be particularly useful if they provide resources for individuals without creating excessive additional cognitive burden. We tested this proposition by examining the individual and interactive effects of task-based training and checklist design on training performance. Undergraduate students (N = 229) were randomly assigned to task-based or non-task-based training, and to one of three checklists or no checklist to aid training performance. The three checklists provided (1) low levels of detail with high structure, (2) high levels of detail with high structure, or (3) high levels of detail with low structure. Results suggest that checklists improve accuracy and minimize psychological strain, but at the cost of reduced speed. Industries in which accuracy is critical to performance outcomes, such as those relying on safety checklists, should therefore consider how checklists are designed. Implications for checklist design and provision are discussed.
Career planning is the process of setting and attaining goals with regard to one's career course. Career planning offers multiple benefits, including increased organizational commitment, career commitment, and career effectiveness. Individual factors such as locus of control, need for achievement, gender, education, and age may influence the process, as can organizational factors such as the availability of career planning programs and personnel practices.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which an article is cited and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.