Robin Chin Roemer and Rachel Borchardt

"No one can read everything." So begins the Altmetrics Manifesto, first published online in late 2010 by the pioneering quartet of Priem, Taraborelli, Groth, and Neylon.
Chemistry researchers are frequently evaluated on the perceived significance of their work, with citation count the most commonly used metric for gauging this property. Recent studies have called for a broader evaluation of significance that includes more nuanced bibliometrics as well as altmetrics to more completely evaluate scientific research. To better understand the relationship between metrics and peer judgements of significance in chemistry, we conducted a survey of chemists to investigate their perceptions of previously published research. Focusing on a specific issue of the Journal of the American Chemical Society published in 2003, respondents were asked to select the articles they thought best matched importance and significance in several contexts: highest number of citations, most significant (subjectively defined), most likely to be shared among chemists, and most likely to be shared with a broader audience. The survey responses support several observations. Respondents predict the citation counts of established research markedly less well than does the h-index of each article's corresponding author. This observation holds even when considering only responses from chemists whose expertise falls within the subdiscipline that best describes the work in an article. Respondents view both highly cited papers and significant papers differently from papers that should be shared with chemists. We conclude that peer judgements of importance and significance differ from metrics-based measurements, and that chemists should work with bibliometricians to develop metrics that better capture the nuance of opinions on the importance of a given piece of research.
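Because the abstract leans on the h-index as a benchmark predictor, a minimal sketch of how that metric is computed may be useful; the function and citation counts below are illustrative assumptions, not data from the study.

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that at least h papers
    have h or more citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break  # counts are sorted, so no later paper can qualify
    return h

# Hypothetical record: five papers with these citation counts.
print(h_index([25, 8, 5, 3, 3]))  # -> 3: three papers with >= 3 citations
```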
Abstract
Objective – This study analyzes scholarly publications supported by library open access (OA) funds, including author demographics, journal trends, and article impact. It also identifies and summarizes OA fund criteria and assesses fund viability. The goal is to better understand the sustainability of OA funds and to identify potential best practices for institutions that maintain them.
Methods – Publication data were solicited from universities with OA funds and supplemented with publication and author metrics, including Journal Impact Factor, Altmetric Attention Score, and author h-index. Additionally, data were collected from OA fund websites, including fund criteria and guidelines.
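Of the metrics named in the Methods, the Journal Impact Factor has the simplest definition: citations received in a given year to a journal's items from the two prior years, divided by the citable items published in those two years. A minimal sketch with hypothetical numbers:

```python
def journal_impact_factor(citations_in_year, citable_items):
    """JIF for year Y: citations received in Y to items published in
    Y-1 and Y-2, divided by citable items published in Y-1 and Y-2."""
    return citations_in_year / citable_items

# Hypothetical journal: 1200 citations in 2020 to 400 citable items from 2018-2019.
print(journal_impact_factor(1200, 400))  # -> 3.0
```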
Results – Library OA funds tend to support faculty in science and medical fields. Impact varied widely, especially between disciplines, but a limited measure indicated that publications supported by library OA funds had a smaller relative impact overall. Many OA funds operate under similar criteria for author and publication eligibility, which appear largely successful at preventing the funding of articles published in predatory journals.
Conclusions – Libraries have successfully funded many publications using criteria that could constitute best practices in this area. However, institutions with OA funds may need to identify opportunities to increase support for high-impact publications and to consider the financial stability of these funds. Alternative models for OA support are discussed in the context of an ever-changing open access landscape.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which a citation appears and describe whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.