In this study, we explore the citedness of research data, its distribution over time, and its relation to the availability of a digital object identifier (DOI) in the Thomson Reuters database Data Citation Index (DCI). We investigate whether cited research data "impacts" the (social) web, as reflected by altmetrics scores, and whether there is any relationship between the number of citations and the sum of altmetrics scores from various social media platforms. Three tools are used to collect altmetrics scores, namely PlumX, ImpactStory, and Altmetric.com, and the corresponding results are compared. We found that of the three altmetrics tools, PlumX has the best coverage. Our experiments revealed that research data remain mostly uncited (about 85%), although there has been an increase in the citing of data sets published since 2008. The percentage of cited research data with a DOI in the DCI has decreased in recent years. Only nine repositories account for research data with DOIs and two or more citations. The number of cited research data with altmetrics "footprints" is even lower (4–9%) but shows higher coverage of research data from the last decade. In our study, we also found no correlation between the number of citations and the total number of altmetrics scores. Yet certain data types (i.e., survey, aggregate data, and sequence data) are cited more often and also receive higher altmetrics scores. Additionally, we performed citation and altmetric analyses of all research data published between 2011 and 2013 in four different disciplines covered by the DCI. In general, these results correspond very well with the ones obtained for research data cited at least twice and likewise show low numbers of citations and altmetrics. Finally, we observed disciplinary differences in the availability and extent of altmetrics scores.
This article offers important background information about a new product, the Book Citation Index (BKCI), launched in 2011 by Thomson Reuters. The information is illustrated by new facts concerning the BKCI's use in bibliometrics, a coverage analysis, and a series of idiosyncrasies worthy of further discussion. The BKCI was launched primarily to help researchers identify useful and relevant research that was previously invisible to them, owing to the lack of significant book content in citation indexes such as the Web of Science. So far, the content of 33,000 books has been added to the desktops of the global research community, the majority in the arts, humanities, and social sciences. Initial analyses of data from the BKCI have indicated that the BKCI, in its current version, should not be used for bibliometric or evaluative purposes. The most significant limitations to this potential application are the high share of publications without address information, the inflation of publication counts, the lack of cumulative citation counts across different hierarchical levels, and inconsistency in citation counts between the cited reference search and the Book Citation Index. However, the BKCI is a first step toward creating a reliable and necessary citation data source for monographs, a very challenging task because, unlike journals and conference proceedings, books have specific requirements, and several problems emerge not only in subject classification but also in books' roles as both cited and citing publications.
Our study examines the suitability of Scopus for bibliometric analyses in comparison with the Web of Science (WOS). In particular, we want to explore whether the outcome of bibliometric analyses differs between Scopus and WOS and, if so, in which respects. In doing so we focus on the following questions: To what extent are high-impact JCR (Journal Citation Reports) journals covered by Scopus? Are the impact factor and the immediacy index usually lower for a JCR journal than the corresponding indicators computed in Scopus? Are there high-impact journals not covered by the JCR? And, finally, how reliable are the data in these two databases? Since journal indicators like the impact factor and the immediacy index differ among disciplines, we analysed only journals from the subject category pharmacy and pharmaceutical sciences. Focussing on one subject category furthermore makes it possible to go into more detail when comparing the databases. The findings of our study can be summarized as follows:
• Every top-100 JCR pharmacy journal was covered by Scopus.
• In 2005, the impact factor was higher in Scopus for 82 journals and the immediacy index greater for 78. Pharmacy journals with a high impact factor in the JCR usually also have a high impact factor in Scopus.
• Several high-impact (though no top-impact) journals were identified in Scopus that were not reported in the JCR.
• For most journals, the two databases differed in the number of articles only within a tolerable margin of deviation.
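For reference, the two journal indicators compared in this abstract follow the standard JCR definitions (a summary sketch, not taken from the study itself). Writing C_y(x) for the citations received in year y by items a journal published in year x, and N_x for the number of citable items it published in year x, the impact factor and immediacy index for year y are:

```latex
\[
\mathrm{IF}_{y} \;=\; \frac{C_{y}(y-1) + C_{y}(y-2)}{N_{y-1} + N_{y-2}},
\qquad
\mathrm{II}_{y} \;=\; \frac{C_{y}(y)}{N_{y}} .
\]
```

Because both indicators depend on the citation and item counts recorded in a given database, computing them from Scopus data instead of WOS data can yield different values for the same journal, which is the comparison the study makes for 2005.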
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.