The purpose of this paper is to update the review by Bornmann and Daniel (2008) by presenting a narrative review of studies on citations in scientific documents. The current review covers 41 studies published between 2006 and 2018, whereas Bornmann and Daniel (2008) focused on earlier years. The current review describes the (new) studies on citation content and context analyses as well as the studies that explore the citation motivations of scholars through surveys or interviews. One focus of this paper is on the technical developments of the last decade, such as the richer metadata now available and the machine-readable formats of scientific papers. These developments have enabled citation context analyses of large datasets in comprehensive studies, which was not possible previously. Many studies in recent years have used computational and machine learning techniques to determine citation functions and polarities, some of which have attempted to overcome the methodological weaknesses of previous studies. The automated recognition of citation functions seems to have the potential to greatly enhance citation indices and information retrieval capabilities. Our review of the empirical studies demonstrates that a paper may be cited for very different scientific and non-scientific reasons, which accords with the findings of Bornmann and Daniel (2008). The current review also shows that, to better understand the relationship between citing and cited documents, a variety of features should be analyzed: primarily the citation context, the semantics and linguistic patterns in citations, citation locations within the citing document, and citation polarity (negative, neutral, positive).
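As a purely illustrative sketch of the kind of automated citation polarity classification mentioned above (not the pipeline of any study covered in the review), the following Python snippet trains a standard bag-of-words baseline on a few hypothetical, hand-labeled citation contexts; the example sentences, labels, and model choice are assumptions made for demonstration only.

```python
# Illustrative sketch only: a minimal citation-polarity classifier trained on
# hypothetical, hand-labeled citation contexts (not data from any reviewed study).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training examples: citation context sentences with polarity labels.
contexts = [
    "Smith et al. (2010) convincingly demonstrated this effect.",   # positive
    "We follow the procedure described by Lee (2015).",             # neutral
    "The results of Jones (2012) could not be replicated here.",    # negative
]
labels = ["positive", "neutral", "negative"]

# TF-IDF bag-of-words features plus a linear classifier, a common baseline
# setup for citation function and polarity classification.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(contexts, labels)

# Classify a new, unseen citation context.
print(model.predict(["This contradicts the findings reported by Brown (2018)."]))
```

In practice, the studies covered in the review work with much larger annotated corpora and richer features (such as citation location, section information, and linguistic cues) than this toy setup.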
This study provides a conceptual overview of the literature dealing with the process of citing documents, focusing on the literature of the recent decade. It presents theories that have been proposed to explain the citation process, as well as studies that have empirically analyzed this process. The overview is referred to as conceptual because it is structured around the core elements of the citation process: the context of the cited document, the processes from selection to citation of documents, and the context of the citing document. These core elements are presented in a schematic representation. The overview can be used to find answers to basic questions about the practice of citing documents. Besides fostering an understanding of the citing process, it provides basic information for the proper application of citations in research evaluation.
Can alternative metrics (altmetrics) data be used to measure societal impact? We wrote this overview of the empirical literature in order to answer this question. The overview consists of two parts. The first part, "societal impact measurements", covers possible methods of and problems in measuring the societal impact of research, case studies for societal impact measurement, societal impact considerations at funding organizations, and the societal problems that should be solved by science. The second part, "altmetrics", addresses a major question in research evaluation: whether altmetrics are proper indicators for measuring the societal impact of research. In the second part we explain the data sources used in altmetrics studies and the importance of field-normalized indicators for impact measurements. This review indicates that impact measurements should be oriented towards pressing societal problems. Case studies in which the societal impact of certain pieces of research is explained seem to provide a legitimate method for measuring societal impact. In the use of altmetrics, field-specific differences should be accounted for by applying field normalization (in cross-field comparisons). Altmetrics data such as social media counts might mainly reflect public interest in and discussion of scholarly works rather than their societal impact. Altmetrics (Twitter data) might be employed especially fruitfully for research evaluation purposes if they are used in the context of network approaches. Conclusions based on altmetrics data in research evaluation should be drawn with caution.
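To make the idea of field normalization concrete, here is a minimal sketch that divides a paper's altmetric count by the mean count of a field- and year-specific reference set; the reference sets, fields, and counts below are invented for illustration and are not taken from the reviewed studies.

```python
# Illustrative sketch of field normalization for altmetric (or citation) counts.
# All numbers are invented for demonstration; real studies use large reference sets.
from statistics import mean

# Hypothetical reference sets: counts of all papers in a field and publication year.
reference_sets = {
    ("sociology", 2016): [0, 1, 1, 2, 4, 10],
    ("genetics", 2016): [5, 8, 12, 20, 35, 40],
}

def field_normalized_score(count, field, year):
    """Observed count divided by the expected (mean) count of the reference set."""
    expected = mean(reference_sets[(field, year)])
    return count / expected

# A sociology paper with 6 mentions scores higher than a genetics paper with 25,
# once field-specific baselines are taken into account.
print(field_normalized_score(6, "sociology", 2016))   # 6 / 3.0  = 2.0
print(field_normalized_score(25, "genetics", 2016))   # 25 / 20.0 = 1.25
```

The point of such normalization is that raw counts are only comparable within a field, whereas the resulting ratios can be compared across fields.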
Several authors have proposed that a large number of unusual combinations of cited references in a paper points to its high creative potential (or novelty). However, it is still not clear whether the number of unusual combinations can really measure the creative potential of papers. The current study addresses this question on the basis of several case studies from the field of scientometrics. We identified a number of landmark papers in this field; the study subjects were the corresponding authors of these papers. We asked them where the ideas for the papers came from and what role the cited publications played. The results revealed that the creative ideas were not necessarily inspired by past publications. Instead, we found that creative ideas result from finding solutions to practical problems, arise from discussions with colleagues, and profit from interdisciplinary exchange; the literature seems to be important mainly for contextualizing the idea within the field of scientometrics. The roots of the studied landmark papers are discussed in detail.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.