In this paper we review the so-called altmetrics, or alternative metrics. The concept arises from the development of new indicators based on Web 2.0 for the evaluation of research and academic activity. The basic assumption is that variables such as mentions in blogs, number of tweets, or the number of researchers bookmarking a research paper may be legitimate indicators for measuring the use and impact of scientific publications. These indicators are currently a focus of discussion and debate within the bibliometric community. We describe the main platforms and indicators and, as a sample, analyze the Spanish research output in Communication Studies, comparing traditional indicators such as citations with these new indicators. The results show that the most cited papers are also the ones with the highest impact according to altmetrics. We conclude by pointing out the main shortcomings these metrics present and the role they may play in measuring research impact through 2.0 platforms.
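As a hedged illustration of the kind of comparison described above (not the authors' actual procedure), the sketch below computes a Spearman rank correlation between citation counts and one altmetric indicator for a small sample of papers; all values are invented for demonstration.

```python
# Minimal sketch: rank-correlating citations with an altmetric indicator.
# The counts below are hypothetical; real data would come from citation
# databases and altmetric platforms.
from scipy.stats import spearmanr

citations = [120, 85, 40, 12, 7, 3, 0]    # citations per paper (invented)
tweets = [300, 150, 60, 20, 10, 5, 1]     # tweets mentioning each paper (invented)

rho, p_value = spearmanr(citations, tweets)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```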
The launch of Google Scholar Metrics as a tool for assessing scientific journals may pose serious competition for Thomson Reuters' Journal Citation Reports and for the Scopus-powered SCImago Journal Rank. We review these bibliometric journal evaluation products and compare their main characteristics from different approaches: coverage, indexing policies, search and visualization, bibliometric indicators, results analysis options, economic cost, and differences in their rankings of journals. Despite its shortcomings, Google Scholar Metrics is a helpful tool for authors and editors in identifying core journals. As an increasingly useful tool for ranking scientific journals, it may also challenge established journal products.
This paper presents a first approach to analyzing the factors that determine the citation characteristics of books. For this we use Thomson Reuters' Book Citation Index, a novel multidisciplinary database launched in 2011 that offers bibliometric data on books. We analyze three factors considered to affect the citation impact of books: the presence of editors, inclusion in a series, and the type of publisher. We also focus on highly cited books, defined as those in the top 5% most cited in the database, to see whether these factors affect them as well. We define these three aspects and present results for four major scientific areas (Science, Engineering & Technology, Social Sciences, and Arts & Humanities) in order to identify differences by area. Finally, we report differences for edited books and publisher type; books included in series, however, showed higher impact in two areas.
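As a rough illustration of how a top-5% threshold like the one described above could be computed (a hypothetical sketch, not the authors' procedure), the snippet below flags books whose citation counts reach the 95th percentile of an invented sample.

```python
# Minimal sketch: flagging "highly cited" books as the top 5% by citations.
# The citation counts are invented; a real analysis would use Book Citation
# Index records grouped by scientific area.
import numpy as np

citations = np.array([0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89])  # invented counts
threshold = np.percentile(citations, 95)                          # 95th percentile cutoff
highly_cited = citations >= threshold

print(f"Top-5% threshold: {threshold:.1f} citations")
print(f"Highly cited books: {citations[highly_cited].tolist()}")
```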
Background: The peer review system has traditionally been challenged due to its many limitations, especially for allocating funding. Bibliometric indicators may well present themselves as a complement. Objective: We analyze the relationship between peers' ratings and bibliometric indicators for Spanish researchers in the 2007 National R&D Plan across 23 research fields. Methods and Materials: We analyze peers' ratings for 2,333 applications. We also gathered the principal investigators' research output and impact and studied the differences between accepted and rejected applications. We used the Web of Science database and focused on the 2002-2006 period. First, we analyzed the distribution of granted and rejected proposals over a given set of bibliometric indicators to test whether there are significant differences. Then, we applied a multiple logistic regression analysis to determine whether bibliometric indicators can by themselves explain the awarding of grant proposals. Results: 63.4% of the applications were funded. Bibliometric indicators for accepted proposals showed better previous performance than for rejected ones; however, the correlation between peer review and bibliometric indicators is very heterogeneous across most areas. The logistic regression analysis showed that the main bibliometric indicators explaining the awarding of research proposals are, in most cases, output (number of published articles) and the number of papers published in journals belonging to the first quartile of the Journal Citation Reports ranking. Discussion: Bibliometric indicators predict the awarding of grant proposals at least as well as peer ratings. Social Sciences and Education are the only areas where no relation was found, although this may be due to the limitations of the Web of Science's coverage. These findings encourage the use of bibliometric indicators as a complement to peer review in most of the analyzed areas.
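As a hedged sketch of the kind of logistic regression described above (not the study's actual model or data), the example below fits funded/rejected outcomes against two invented predictors: the number of published articles and the number of first-quartile papers.

```python
# Minimal sketch: logistic regression of grant outcome on bibliometric
# indicators. All numbers are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [number of published articles, papers in Q1 journals] per applicant
X = np.array([[25, 10], [12, 4], [3, 0], [40, 22], [8, 1], [18, 9], [1, 0], [30, 15]])
y = np.array([1, 1, 0, 1, 0, 1, 0, 1])  # 1 = funded, 0 = rejected (invented)

model = LogisticRegression().fit(X, y)
print("Coefficients (articles, Q1 papers):", model.coef_[0])
print("Predicted funding probability for 20 articles / 8 in Q1:",
      model.predict_proba([[20, 8]])[0, 1])
```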