Until recently, comprehensive scientometric data was available only in siloed, subscription-based tools inaccessible to researchers without institutional support and resources. As a result of this limited data access, research evaluation practices have focused on basic indicators that take only publications and their citation rates into account, which has stifled innovation on many fronts. Dimensions is a database that links and contextualizes different research information objects. It brings together data describing and linking awarded grants, clinical trials, patents, and policy documents, as well as altmetric information, alongside traditional publication and citation data. This article describes the approach that Digital Science is taking to support the scientometric community, together with the various Dimensions tools available at no cost to researchers who wish to use Dimensions data in their research.
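As a concrete illustration of the programmatic access described above, the sketch below queries the Dimensions Analytics API via the open-source dimcli Python client. This is a minimal sketch, not Digital Science's prescribed workflow: the API key placeholder and the query string are assumptions for illustration, and no-cost access for scientometric research must be requested separately.

```python
# Minimal sketch of querying the Dimensions Analytics API with the
# open-source dimcli client. The key and query below are illustrative
# placeholders; free research access must be granted by Digital Science.
import dimcli

# Authenticate against the Dimensions endpoint (key is a placeholder).
dimcli.login(key="MY_KEY", endpoint="https://app.dimensions.ai")
dsl = dimcli.Dsl()

# Dimensions Search Language (DSL) query: publications mentioning
# "scientometrics", returning a few fields including citation counts.
results = dsl.query(
    'search publications for "scientometrics" '
    'return publications[id+title+year+times_cited] limit 10'
)

# dimcli exposes returned records as a list of dicts named after the source.
for pub in results.publications:
    print(pub["year"], pub["times_cited"], pub["title"])
```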
The increase in the availability of data about how research is discussed, used, rated, recommended, saved, and read online has allowed researchers to reconsider the mechanisms by which scholarship is evaluated. It is now possible to better track the influence of research beyond academia, though the measures by which we can do so are not yet mature enough to stand on their own. In this article, we examine a new class of data (commonly called "altmetrics"), describe its benefits and limitations, and offer recommendations for its use and interpretation in the context of research assessment. This article is published as part of a collection on the future of research assessment.
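To make this class of data concrete, the sketch below retrieves per-article attention counts from the public Altmetric details endpoint by DOI. The DOI is an arbitrary example, and the field names reflect the documented v1 response; treat this as a sketch under those assumptions rather than a definitive client.

```python
# Sketch: fetch altmetrics for one article from the public Altmetric API.
# The DOI is an arbitrary example; the basic details endpoint needs no
# API key, but rate limits apply.
import requests

doi = "10.1038/480426a"  # example DOI (assumption; any DOI can be used)
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

if resp.status_code == 200:
    data = resp.json()
    # A few of the attention counts the API reports per article.
    print("Altmetric score:", data.get("score"))
    print("Tweets:", data.get("cited_by_tweeters_count"))
    print("News stories:", data.get("cited_by_msm_count"))
    print("Mendeley readers:", data.get("readers", {}).get("mendeley"))
else:
    print("No altmetric record found (HTTP status", resp.status_code, ")")
```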
A new class of social web-based metrics for scholarly publications (altmetrics) has surfaced as a complement to traditional citation-based metrics. Our aim was to study and characterize the recent papers in the field of Parkinson's disease (PD) that had received the highest Altmetric Attention Scores and to compare this attention measure to traditional metrics. The top 20 papers in our analysis covered a variety of topics, mainly new disease mechanisms, treatment options, and risk factors for the development of PD. The main media sources for these high-attention papers were news items and Twitter. The papers were published predominantly in high-impact journals, suggesting a correlation between altmetrics and conventional metrics. One paper published in a relatively modest journal received a significant amount of attention, showing that public attention does not always parallel traditional metrics. None of the most influential papers in PD, as reviewed by Ponce and Lozano (2011), made it to our list, suggesting that recent publications receive higher attention scores and that altmetrics may overlook older, seminal work in the field.
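The correlation between altmetrics and conventional metrics suggested above can be tested directly with a rank correlation. The sketch below computes a Spearman correlation between Altmetric Attention Scores and citation counts; the numbers are invented placeholders, not data from the Parkinson's disease study.

```python
# Sketch: Spearman rank correlation between Altmetric Attention Scores
# and citation counts for a set of papers. All values are invented
# placeholders, not the study's PD data.
from scipy.stats import spearmanr

attention_scores = [512, 344, 290, 201, 188, 95, 60, 42]
citation_counts = [130, 210, 88, 150, 40, 75, 12, 9]

rho, p_value = spearmanr(attention_scores, citation_counts)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```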
Editor's Summary: For institutional repositories, alternative metrics reflecting online activity present valuable indicators of interest in their holdings that can supplement traditional usage statistics. A variable mix of built-in metrics is available through popular repository platforms: Digital Commons, DSpace, and EPrints. These may include download counts at the collection and/or item level, search terms, total and unique visitors, page views, and social media and bookmarking metrics; additional data may be available with special plug-ins. The data provide different types of information valuable to repository managers, university administrators, and authors. They can reflect both scholarly and popular impact, show readership, reflect an institution's output, justify tenure and promotion, and indicate direction for collection management. Practical considerations for implementing altmetrics include service costs, technical support, platform integration, and user interest. Altmetrics should not be used for author ranking or comparison, and altmetrics sources should be regularly reevaluated for relevance.
Editor's Summary: Methods for determining research quality have long been debated but with little lasting agreement on standards, leading to the emergence of alternative metrics. Altmetrics are a useful supplement to traditional citation metrics, reflecting a variety of measurement points that give different perspectives on how a dataset is used and by whom. A positive development is the integration of a number of research datasets into the ISI Data Citation Index, making datasets searchable and linking them to published articles. Yet access to data resources and tracking the resulting altmetrics depend on specific qualities of the datasets and the systems where they are archived. Though research on altmetrics use is growing, the lack of standardization across datasets and system architecture undermines its generalizability. Without some standards, stakeholders' adoption of altmetrics will be limited.