Abstract: In this study, we compare the impact of open access (OA) and non-open access (non-OA) articles. We select 1,761 Nature Communications articles published from 1 January 2012 to 31 August 2013 as our research objects, including 587 OA articles and 1,174 non-OA articles. Citation data and daily updated article-level metrics are harvested directly from the nature.com platform. The data are analyzed from both static and temporal-dynamic perspectives. The OA citation advantage is confirmed, and the OA advantage also holds when the comparison is extended from citations to article views and social media attention. More importantly, we find that OA papers not only hold a large advantage in total downloads but also sustain steady download activity over a long period. In terms of article downloads, non-OA papers attract attention only for a short period, whereas the advantage of OA papers persists for much longer.
Using bibliometric methods, we investigate China's international scientific collaboration at three levels: collaborating countries, institutions, and individuals. We design a database in SQL Server and analyze Chinese SCI papers based on the corresponding-author field. We find that China's international scientific collaboration is concentrated in a handful of countries: nearly 95% of internationally co-authored papers involve only 20 countries, among which the USA accounts for more than 40% of the total. Results also show that a Chinese lineage in international co-authorship is evident, meaning that Chinese immigrant scientists play an important role in China's international scientific collaboration, especially in English-speaking countries.
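The country-level concentration described above can be sketched as a simple aggregation over co-authorship records. The snippet below is a minimal illustration, assuming toy records and field names (`partners`, `id`) that are not the study's actual SQL Server schema:

```python
from collections import Counter

# Toy records for internationally co-authored papers with a China-based
# corresponding author; the data and field names are illustrative assumptions.
papers = [
    {"id": 1, "partners": ["USA"]},
    {"id": 2, "partners": ["USA", "UK"]},
    {"id": 3, "partners": ["Japan"]},
    {"id": 4, "partners": ["USA"]},
    {"id": 5, "partners": ["Germany", "USA"]},
]

# Count co-authored papers per partner country (one count per partner).
counts = Counter(c for p in papers for c in set(p["partners"]))

def top_k_share(counts, papers, k):
    """Fraction of co-authored papers involving the top-k partner countries."""
    top = {c for c, _ in counts.most_common(k)}
    return sum(1 for p in papers if set(p["partners"]) & top) / len(papers)

print(counts.most_common(2))  # USA dominates this toy sample
print(f"top-1 partner covers {top_k_share(counts, papers, 1):.0%} of papers")
```

Applied to the real dataset, the same aggregation would yield the "top 20 countries cover nearly 95%" style of concentration statistic reported in the abstract.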
Sufficient data presence is one of the key preconditions for applying metrics in practice. Based on Altmetric.com and Mendeley data collected up to 2019, this paper presents a state-of-the-art analysis of the presence of 12 kinds of altmetric events for nearly 12.3 million Web of Science publications published between 2012 and 2018. Results show that although an upward trend in data presence can be observed over time, the overall presence of most altmetric data remains low, with the exceptions of Mendeley readers and Twitter mentions. The majority of altmetric events go to publications in the fields of Biomedical and Health Sciences, Social Sciences and Humanities, and Life and Earth Sciences. As to research topics, the level of attention received varies across altmetric data, and specific altmetric data show different preferences for research topics. On this basis, a framework for identifying hot research topics is proposed and applied to detect topics with higher levels of attention on certain altmetric data sources. Twitter mentions and policy document citations are selected as two examples to identify hot research topics of interest to Twitter users and policy-makers, respectively, shedding light on the potential of altmetric data for monitoring research trends attracting specific forms of social attention.
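The idea of source-specific hot topics can be illustrated with a small aggregation: count attention per (source, topic) pair and rank topics within each source. This is a toy sketch, assuming made-up topics, sources, and a simple top-k ranking rule rather than the paper's actual framework:

```python
from collections import Counter, defaultdict

# Toy altmetric events as (research_topic, data_source) pairs;
# the data and the ranking rule are illustrative assumptions.
events = [
    ("vaccination", "twitter"), ("vaccination", "twitter"),
    ("graphene", "twitter"),    ("vaccination", "policy"),
    ("climate", "policy"),      ("climate", "policy"),
]

# Aggregate attention per data source and topic.
by_source = defaultdict(Counter)
for topic, source in events:
    by_source[source][topic] += 1

def hot_topics(by_source, source, k=1):
    """Topics receiving the most attention on a given altmetric source."""
    return [t for t, _ in by_source[source].most_common(k)]

print(hot_topics(by_source, "twitter"))  # → ['vaccination']
print(hot_topics(by_source, "policy"))   # → ['climate']
```

The two print calls mirror the paper's two worked examples: the same corpus surfaces different hot topics depending on whether attention is measured by Twitter mentions or by policy document citations.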
Editors play a critical role in the peer review system. How do editorial behaviors affect the performance of peer review? No quantitative model to date allows us to measure the influence of editorial behaviors on different peer review stages, such as manuscript distribution and final decision making. Here, we propose an agent-based model in which the process of peer review is guided mainly by social interactions among three kinds of agents representing authors, editors, and reviewers, respectively. We apply this model to analyze the effects on peer review of a number of editorial behaviors, such as decision strategy, number of reviewers, and editorial bias. We find that peer review outcomes are significantly sensitive to editorial behaviors. With a small fraction (10%) of biased editors, the quality of accepted papers declines by 11%, indicating that the effect of biased editorial behavior is worse than that of biased reviewers (7%). While several peer review models exist, this is the first study of editorial behaviors validated on the basis of simulation analysis.
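The mechanism behind the biased-editor result can be sketched with a minimal simulation: fair editors aggregate noisy reviewer scores against a threshold, while biased editors decide independently of the reviews, dragging down the mean quality of accepted papers. All agent attributes, thresholds, and the bias rule below are illustrative assumptions, not the authors' model:

```python
import random

def simulate(n_papers=2000, n_reviewers=2, biased_editor_frac=0.0,
             accept_threshold=0.6, noise=0.1, seed=42):
    """Toy peer-review round (a sketch, not the paper's agent-based model).

    Each paper has a latent quality in [0, 1]. A fair editor accepts when
    the mean of noisy reviewer scores clears the threshold; a biased editor
    ignores the reviews and accepts at random.
    Returns the mean latent quality of accepted papers.
    """
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_papers):
        quality = rng.random()                 # author's latent paper quality
        if rng.random() < biased_editor_frac:
            decision = rng.random() < 0.5      # biased: decision ignores reviews
        else:
            scores = [min(1.0, max(0.0, quality + rng.gauss(0, noise)))
                      for _ in range(n_reviewers)]
            decision = sum(scores) / len(scores) >= accept_threshold
        if decision:
            accepted.append(quality)
    return sum(accepted) / len(accepted)

fair = simulate(biased_editor_frac=0.0)
biased = simulate(biased_editor_frac=0.1)
print(f"mean accepted quality, all fair editors:   {fair:.3f}")
print(f"mean accepted quality, 10% biased editors: {biased:.3f}")
```

Even in this stripped-down version, a 10% fraction of biased editors measurably lowers the average quality of the accepted pool, qualitatively matching the direction of the effect reported in the abstract.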
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.