Sharing of code supports reproducible research, but fewer journals have policies on code sharing than on data sharing, and there is little evidence on researchers' attitudes towards and experiences with code sharing. Before introducing a stronger policy on sharing of code, the Editors and publisher of the journal PLOS Computational Biology wished to test, via an online survey, the suitability of a proposed mandatory code sharing policy with its community of authors. Previous research established that, in 2019, 41% of papers in the journal linked to shared code. We also wanted to understand the potential impact of the proposed policy on authors' submissions to the journal, and their concerns about code sharing.

We received 214 completed survey responses; all respondents had previously generated code in their research, 80% had published in PLOS Computational Biology, and 88% were based in Europe or North America. Overall, respondents reported they were more likely to submit to the journal if it had a mandatory code sharing policy, and US researchers were more positive than the average for all respondents. Researchers whose main discipline is Medicine and Health sciences viewed the proposed policy less favourably, as did the most senior researchers (those with more than 100 publications) compared to early- and mid-career researchers.

The authors surveyed report that, on average, 71% of their research articles have associated code and that, for the average author, code has not been shared for 32% of these papers. The most common reasons for not sharing code previously were practical issues, which are unlikely to prevent compliance with the policy; a lack of time to share code was the most common of these. 22% of respondents who had not shared their code in the past cited intellectual property (IP) concerns, which might prevent public sharing of code under a mandatory code sharing policy. The results also imply that for 18% of respondents' previous publications, the associated code was not shared and no IP concerns were cited, suggesting that more papers in the journal could share code (see the worked calculation below).

To remain inclusive of all researchers in the community, the policy was designed to allow researchers who can demonstrate they are legally restricted from sharing their code to be granted an exemption to public sharing of code.

As a secondary goal of the survey, we wanted to determine whether researchers have unmet needs in their ability to share their own code and to access other researchers' code. Consistent with our previous research on data sharing, we found potential opportunities for new products or features that support code accessibility or reuse. However, researchers were on average satisfied with their ability to share their own code, suggesting that offering new products or features to support sharing, in the absence of a stronger policy, would not increase the availability of code alongside the journal's publications.
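One plausible reconstruction of the implied 18% figure, assuming the reported percentages compose multiplicatively; the variable names are illustrative, not taken from the survey instrument:

    # Hedged sketch: the share of publications whose code was not shared
    # and where IP concerns were not cited, assuming the reported survey
    # percentages compose multiplicatively.
    frac_with_code = 0.71   # articles with associated code (reported)
    frac_unshared = 0.32    # of those, code not shared (reported)
    frac_ip_cited = 0.22    # of non-sharers citing IP concerns (reported)

    implied = frac_with_code * frac_unshared * (1 - frac_ip_cited)
    print(f"{implied:.0%}")  # ~18% of publications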
PLOS has long supported Open Science. One of the ways in which we do so is via our stringent data availability policy, established in 2014. Despite this policy, and despite more data sharing policies being introduced by other organizations, best practices for data sharing are adopted by only a minority of researchers in their publications. Problems with effective research data sharing persist, and previous research has attributed these problems to a lack of time, resources, incentives, and/or skills to share data.

In this study we built on this research by investigating the importance of tasks associated with data sharing, and researchers' satisfaction with their ability to complete these tasks. By investigating these factors we aimed to better understand opportunities for new or improved solutions for sharing data.

In May-June 2020 we surveyed researchers from Europe and North America, asking them to rate tasks associated with data sharing on (i) their importance and (ii) their satisfaction with their ability to complete them. We received 728 completed and 667 partial responses. We calculated mean importance and satisfaction scores to highlight potential opportunities for new solutions and to compare different cohorts (a sketch of this scoring follows below).

Tasks relating to research impact, funder compliance, and credit had the highest importance scores. 52% of respondents reuse research data, but the average satisfaction score for obtaining data for reuse was relatively low. Tasks associated with sharing data were rated somewhat important, and respondents were reasonably well satisfied with their ability to accomplish them. Notably, this included tasks associated with best data sharing practice, such as use of data repositories. However, the most common method for sharing data was in fact via supplemental files with articles, which is not considered best practice.

We presume that researchers are unlikely to seek new solutions to a problem or task that they are satisfied with their ability to accomplish, even if many do not attempt the task. This implies there are few opportunities for new solutions or tools to meet these researcher needs. Publishers can likely meet these needs for data sharing by working to seamlessly integrate existing solutions that reduce the effort or behaviour change involved in some tasks, and by focusing on advocacy and education around the benefits of sharing data. There may, however, be opportunities (unmet researcher needs) in relation to better supporting data reuse, which could be met in part by strengthening the data sharing policies of journals and publishers and by improving the discoverability of data associated with published articles.
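A minimal sketch of the kind of importance/satisfaction scoring described above, assuming each respondent rated each task on a numeric scale; the column names, example ratings, and the importance-minus-satisfaction "gap" metric are illustrative assumptions, not the study's published method:

    # Hedged sketch: mean importance and satisfaction scores per task,
    # with a simple gap metric to flag potential unmet needs.
    import pandas as pd

    responses = pd.DataFrame({
        "task":         ["share via repository", "obtain data for reuse",
                         "comply with funder policy", "obtain data for reuse"],
        "importance":   [6, 7, 9, 8],    # e.g. rated 1 (low) to 10 (high)
        "satisfaction": [7, 3, 6, 4],
    })

    scores = responses.groupby("task")[["importance", "satisfaction"]].mean()
    # Tasks rated important but with low satisfaction suggest unmet needs.
    scores["gap"] = scores["importance"] - scores["satisfaction"]
    print(scores.sort_values("gap", ascending=False))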
In a series of 52 semi-structured interviews with researchers in cell biology, we sought to characterize researchers' goals when evaluating the credibility (or trustworthiness) and impact of research outputs in two contexts: during researchers' own work (the Discovery context) and when researchers participate in research assessment committees for grant review and for hiring and promotion (the Committee context). We have compiled a list of researchers' goals in these contexts, expressed as desired outcome statements and standardized across the two contexts, which will inform a quantitative survey to validate and prioritize these goals and to identify opportunities for new or improved solutions for research assessment. On the basis of the qualitative data, we examined how these needs intersect in the two contexts. We find that the goals of researchers in the Discovery and Committee contexts overlap substantially. Both impact and credibility matter in each context; in particular, credibility is the dominant factor in the Discovery context and somewhat less represented, but still strongly relevant, in the Committee context. Researchers use proxy methods, in particular journal-based proxies, to evaluate all attributes of research outputs, and these proxies were reported with similar frequency in both contexts. We also find that researchers seek to understand the reproducibility, quality, and novelty of research outputs in both contexts, in addition to credibility and impact. While publications remain the dominant unit of research assessment, researchers in our sample also evaluate research data, code, and preprints in both contexts. Our preliminary findings suggest a number of potential opportunities to reduce time, reduce error, or improve the quality of assessment practices in a manner that avoids journal-based proxies. Among these are potential opportunities to (i) provide more reliable signals of credibility, quality, and impact, (ii) apply these signals to publications and preprints, and (iii) improve research assessment guidelines.