Our study examines the suitability of Scopus for bibliometric analyses in comparison with the Web of Science (WOS). In particular, we want to explore whether the outcome of bibliometric analyses differs between Scopus and WOS and, if so, in which aspects. In doing so we focus on the following questions: To what extent are high-impact JCR (Journal Citation Reports) journals covered by Scopus? Are the impact factor and the immediacy index usually lower for a JCR journal than the corresponding indicators computed in Scopus? Are there high-impact journals not covered by the JCR? And, finally, how reliable are the data in these two databases?

Since journal indicators such as the impact factor and the immediacy index differ among disciplines, we analysed only journals from the subject category pharmacy and pharmaceutical sciences. Focussing on one subject category furthermore makes it possible to go into more detail when comparing the databases.

The findings of our study can be summarized as follows:
• Every top-100 JCR pharmacy journal was covered by Scopus.
• In 2005 the impact factor was higher for 82 journals and the immediacy index greater for 78 journals in Scopus. Pharmacy journals with a high impact factor in the JCR usually also have a high impact factor in Scopus.
• Several high-impact, but no top-impact, journals not reported in the JCR could be identified in Scopus.
• For most journals, the two databases differed in the number of articles only within a tolerable margin of deviation.
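The two indicators compared across the databases have simple definitions: the impact factor for year Y divides citations received in Y to items from the two preceding years by the number of citable items from those years, while the immediacy index uses citations in Y to items published in Y itself. A minimal sketch with made-up illustrative numbers (not data from the study):

```python
def impact_factor(cites_to_prev_two_years, items_prev_two_years):
    """Citations received in year Y to items published in Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return cites_to_prev_two_years / items_prev_two_years

def immediacy_index(cites_to_current_year, items_current_year):
    """Citations received in year Y to items published in Y itself,
    divided by the number of items published in Y."""
    return cites_to_current_year / items_current_year

# Hypothetical journal: 480 citations in 2005 to its 200 papers from
# 2003-2004, and 30 citations in 2005 to its 60 papers from 2005.
jif = impact_factor(480, 200)    # 2.4
imm = immediacy_index(30, 60)    # 0.5
```

Because both databases count citations over their own source-journal coverage, the same journal can receive different indicator values in Scopus and in the JCR, which is exactly the effect the study quantifies.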
The goal of the scientometric analysis presented in this article was to investigate international and regional (i.e., German-language) periodicals in the field of library and information science (LIS). This was done by means of a citation analysis and a reader survey. For the citation analysis, the impact factor, citing half-life, number of references per article, and rate of self-references of a periodical were used as indicators. In addition, the leading LIS periodicals were mapped. For the 40 international periodicals, data were collected from ISI's Social Sciences Citation Index Journal Citation Reports (JCR); the citations of the 10 German-language journals were counted manually (overall 1,494 source articles with 10,520 citations). Altogether, the empirical base of the citation analysis consisted of nearly 90,000 citations in 6,203 source articles that were published between 1997 and 2000. The expert survey investigated reading frequency, applicability of the journals to the reader's job, publication frequency, and publication preference, both for all respondents and for different groups among them (practitioners vs. scientists; librarians vs. documentalists vs. LIS scholars; public-sector vs. information-industry vs. other private-company employees). The study was conducted in spring 2002. A total of 257 questionnaires were returned by information specialists from Germany, Austria, and Switzerland. Having both citation and readership data, we performed a comparative analysis of the two data sets. This enabled us to answer questions such as: Does reading behavior correlate with the journal impact factor? Do readers prefer journals with a short or a long half-life, or with a low or a high number of references? Is there any difference in this respect among librarians, documentalists, and LIS scholars?
The objective of this article is to examine in which respects journal usage data differ from citation data. This comparison is conducted both at the journal level and on a paper-by-paper basis. At the journal level, we define a so-called usage impact factor and a usage half-life in analogy to the corresponding Thomson citation indicators. The usage data were provided by ScienceDirect for the subject category "oncology". Citation indicators were obtained from the JCR; article citations were retrieved from the SCI and Scopus. Our study shows that downloads and citations have different obsolescence patterns. While the average cited half-life was 5.6 years, we computed a mean usage half-life of 1.7 years for the year 2006. We identified a strong correlation between citation frequencies and the number of downloads for our journal sample. The relationship was weaker when the analysis was performed on a paper-by-paper basis because of existing variances in the citation-download ratio among articles. The correlation between the usage impact factor and Thomson's journal impact factor was also "only" moderate because of the different obsolescence patterns of downloads and citations.
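The usage analogues described above substitute full-text downloads for citations: the usage impact factor divides downloads in year Y of items from the two preceding years by the number of those items, and the usage half-life is the article age by which half of a year's downloads have accrued. A sketch with invented download figures (not the study's data), interpolating linearly within the crossing year:

```python
def usage_impact_factor(downloads_prev_two_years, items_prev_two_years):
    """Downloads in year Y of items published in Y-1 and Y-2,
    divided by the number of items published in Y-1 and Y-2."""
    return downloads_prev_two_years / items_prev_two_years

def usage_half_life(downloads_by_age):
    """downloads_by_age[k] = downloads of articles that are k years old.
    Returns the age by which 50% of all downloads have accumulated,
    linearly interpolated within the year where the median falls."""
    total = sum(downloads_by_age)
    half = total / 2
    cum = 0.0
    for age, d in enumerate(downloads_by_age):
        if cum + d >= half:
            return age + (half - cum) / d
        cum += d
    return float(len(downloads_by_age))

# Hypothetical distribution: most downloads go to very recent articles,
# so the usage half-life comes out far shorter than a cited half-life.
downloads = [4000, 3000, 1500, 900, 600]       # ages 0..4 years
uhl = usage_half_life(downloads)               # ~1.33 years
uif = usage_impact_factor(9000, 400)           # 22.5 downloads per item
```

The steeply front-loaded download curve in this toy example mirrors the study's finding of a mean usage half-life of 1.7 years against a cited half-life of 5.6 years.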
Following the transition from print journals to electronic (hybrid) journals in the past decade, usage metrics have become an interesting complement to citation metrics. In this article we investigate the similarities of and differences between usage and citation indicators for pharmacy and pharmacology journals and relate the results to a previous study on oncology journals. For the comparison at the journal level we use the classical citation indicators as defined in the Journal Citation Reports and compute the corresponding usage indicators. At the article level we not only relate download and citation counts to each other but also try to identify a possible effect of citations upon subsequent downloads. Usage data were provided by ScienceDirect both at the journal level and, for a few selected journals, on a paper-by-paper basis. The corresponding citation data were retrieved from the Web of Science and the Journal Citation Reports. Our analyses show that electronic journals have become generally accepted over the last decade. While the supply of ScienceDirect pharma journals rose by 50% between 2001 and 2006, the total number of article downloads (full-text articles [FTAs]) multiplied more than 5-fold in the same period. This has also affected the pattern of scholarly communication (a strong increase in the immediacy index) in the past few years. Our results further reveal a close relation between citation and download frequencies. We computed a high correlation at the journal level when using absolute values and a moderate to high correlation when relating usage and citation impact factors. At the article level the rank correlation between downloads and citations was only medium-sized. Differences between downloads and citations exist in terms of obsolescence characteristics: while more than half of the articles are downloaded in the publication year or one year later, the median cited half-life was nearly 6 years for our journal sample. Our attempt to reveal a direct influence of citations upon subsequent downloads did not prove feasible.
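The article-level comparison reported above rests on a rank correlation between per-paper download and citation counts. A minimal Spearman computation (Pearson correlation on average ranks, so ties are handled) over invented per-article counts, not the study's data:

```python
def ranks(values):
    """Average 1-based ranks; tied values share their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                     # extend over a run of tied values
        avg = (i + j) / 2 + 1          # mean of the 1-based ranks i+1..j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-article downloads and citations for eight papers.
dl = [850, 420, 300, 1200, 90, 60, 500, 210]
ct = [12,  3,   8,   20,   1,  4,  9,   2]
rho = spearman(dl, ct)
```

Large variance in the per-article citation-download ratio, as noted in the abstract, pulls such a coefficient down even when journal-level totals correlate strongly.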