Background: Articles whose authors have supplemented subscription-based access to the publisher's version by self-archiving their own final draft to make it accessible free for all on the web (“Open Access”, OA) are cited significantly more than articles in the same journal and year that have not been made OA. Some have suggested that this “OA Advantage” may not be causal but just a self-selection bias, because authors preferentially make higher-quality articles OA. To test this we compared self-selective self-archiving with mandatory self-archiving for a sample of 27,197 articles published 2002–2006 in 1,984 journals. Methodology/Principal Findings: The OA Advantage proved just as high for both. Logistic regression analysis showed that the advantage is independent of other correlates of citations (article age; journal impact factor; number of co-authors, references or pages; field; article type; or country) and highest for the most highly cited articles. The OA Advantage is real, independent and causal, but skewed. Its size is indeed correlated with quality, just as citations themselves are (the top 20% of articles receive about 80% of all citations). Conclusions/Significance: The OA Advantage is greater for the more citable articles, not because of a quality bias from authors self-selecting what to make OA, but because of a quality advantage, from users self-selecting what to use and cite, freed by OA from the constraints of selective accessibility to subscribers only. It is hoped that these findings will help motivate the adoption of OA self-archiving mandates by universities, research institutions and research funders.
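The logistic-regression step described above can be sketched as follows. This is an illustrative sketch only, assuming a simplified setup in which OA status plus two invented covariates predict whether an article is highly cited; all variable names, coefficients, and simulated data are hypothetical stand-ins, not taken from the study.

```python
import math
import random

random.seed(0)

def simulate(n=500):
    """Simulate articles with an OA flag and two covariates (invented data)."""
    rows, labels = [], []
    for _ in range(n):
        oa = 1.0 if random.random() < 0.3 else 0.0
        jif = random.uniform(0.5, 10.0)        # stand-in for journal impact factor
        n_auth = float(random.randint(1, 10))  # stand-in for number of co-authors
        # Assumed generative model: OA and both covariates raise citedness odds.
        logit = -3.0 + 1.0 * oa + 0.3 * jif + 0.1 * n_auth
        p = 1 / (1 + math.exp(-logit))
        rows.append([1.0, oa, jif, n_auth])    # leading 1.0 is the intercept term
        labels.append(1.0 if random.random() < p else 0.0)
    return rows, labels

def fit_logistic(rows, labels, lr=0.05, epochs=1500):
    """Fit logistic regression by plain gradient ascent on the log-likelihood."""
    w = [0.0] * len(rows[0])
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for x, y in zip(rows, labels):
            p = 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
            for j, xj in enumerate(x):
                grad[j] += (y - p) * xj
        w = [wi + lr * g / len(rows) for wi, g in zip(w, grad)]
    return w

rows, labels = simulate()
weights = fit_logistic(rows, labels)
# weights[1] is the OA coefficient after controlling for the other covariates;
# a positive value means OA is associated with higher odds of being highly cited.
print(weights[1])
```

The point of including covariates in one model, rather than comparing raw citation counts, is that it lets the OA coefficient be read net of the other correlates of citations.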
For more than 40 years, the Institute for Scientific Information (ISI, now part of Thomson Reuters) produced the only available bibliographic databases from which bibliometricians could compile large-scale bibliometric indicators. ISI's citation indexes, now regrouped under the Web of Science (WoS), were the major sources of bibliometric data until 2004, when Scopus was launched by the publisher Reed Elsevier. For those who perform bibliometric analyses and comparisons of countries or institutions, the existence of these two major databases raises the important question of the comparability and stability of statistics obtained from different data sources. This paper uses macrolevel bibliometric indicators to compare results obtained from the WoS and Scopus. It shows that the correlations between the measures obtained with both databases for the number of papers and the number of citations received by countries, as well as for their ranks, are extremely high (R² ≈ .99). There is also a very high correlation when countries' papers are broken down by field. The paper thus provides evidence that indicators of scientific production and citations at the country level are stable and largely independent of the database.
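A minimal sketch of the kind of country-level comparison reported above: computing R² between per-country paper counts obtained from two databases. The counts here are invented placeholders, not actual WoS or Scopus figures.

```python
# Hypothetical per-country paper counts from two bibliographic databases.
# These numbers are made up for illustration only.
wos_counts = [300000, 80000, 75000, 60000, 45000]     # database A
scopus_counts = [310000, 85000, 72000, 63000, 44000]  # database B

def r_squared(x, y):
    """Coefficient of determination of a simple linear fit of y on x
    (equivalently, the square of Pearson's r)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return (cov * cov) / (var_x * var_y)

# For near-identical country counts, R² will be close to 1.
print(round(r_squared(wos_counts, scopus_counts), 4))
```

An R² near 1 at the country level means that, despite differing journal coverage, the two databases rank and size countries' output almost identically.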
This article provides the first historical analysis of the relationship between collaboration and scientific impact using three indicators of collaboration (number of authors, number of addresses, and number of countries) derived from articles published between 1900 and 2011. The results demonstrate that an increase in the number of authors leads to an increase in impact, from the beginning of the last century onward, and that this is not due simply to self‐citations. A similar trend is also observed for the number of addresses and number of countries represented in the byline of an article. However, the constant inflation of collaboration since 1900 has resulted in diminishing citation returns: Larger and more diverse (in terms of institutional and country affiliation) teams are necessary to realize higher impact. The article concludes with a discussion of the potential causes of the impact gain in citations of collaborative papers.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.