Nicolás Robinson-García has a master's degree in scientific information and a PhD in social sciences from the University of Granada. He is a member of the EC3 Research Group (Evaluación de la Ciencia y de la Comunicación Científica). His research interests are research evaluation at the institutional level and the study of new data sources for bibliometric analysis. He is involved in the development of the
Google Scholar has been well received by the research community. Its promise of free, universal and easy access to scientific literature, together with the perception that it covers the Social Sciences and the Humanities better than traditional multidisciplinary databases, has contributed to the quick expansion of Google Scholar Citations and Google Scholar Metrics: two new bibliometric products that offer citation data at the individual level and at the journal level. In this paper we present the results of an experiment undertaken to analyze Google Scholar's capacity to detect citation-count manipulation. For this purpose, six documents authored by a fictitious researcher and referencing all the publications of the members of the EC3 research group at the University of Granada were uploaded to an institutional web domain. Once Google Scholar indexed these documents, the citation counts in the Google Scholar Citations profiles of the authors surged. We discuss the effects of this inflation and how it could affect the future development of these products, not only at the individual level but also at the journal level, especially if Google Scholar persists in its lack of transparency.
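A minimal sketch of the inflation mechanism described above, assuming each fake document cites every publication in a researcher's profile; the publication count used here is a hypothetical value, not a figure from the experiment:

```python
# Sketch (hypothetical numbers): if each of the six fake documents references
# every publication of a researcher, the researcher's citation count in
# Google Scholar Citations grows by fake_documents * publications_in_profile.
fake_documents = 6
publications_in_profile = 50  # hypothetical example value

added_citations = fake_documents * publications_in_profile
print(f"Citations added to the profile: {added_citations}")  # 300
```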
We present an analysis of data citation practices based on the Data Citation Index (DCI) (Thomson Reuters). This database, launched in 2012, links data sets and data studies with citations received from the other citation indexes. The DCI harvests citations to research data from papers indexed in the Web of Science and relies on the information provided by the data repository. The findings of this study show that data citation practices are far from common in most research fields. There are differences in the way researchers cite data: in the areas of science and engineering & technology, data sets are the most cited, whereas in the social sciences and arts & humanities data studies play a greater role. A total of 88.1% of the records have received no citations, although some repositories show very low uncitedness rates. While data citation practices remain rare in most fields, they have expanded in disciplines such as crystallography and genomics. We conclude by emphasizing the role that the DCI could play in encouraging the consistent, standardized citation of research data, a role that would enhance their value as a means of following the research process from data collection to publication.
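A minimal sketch of how an uncitedness rate per repository, such as the 88.1% overall figure reported above, could be computed from DCI-style records; the record structure, field names and repository labels are illustrative assumptions, not the actual DCI export format:

```python
# Sketch: per-repository uncitedness rates from a list of data records,
# each carrying a repository name and a citation count (hypothetical fields).
from collections import defaultdict

records = [
    {"repository": "Repository A", "times_cited": 0},
    {"repository": "Repository A", "times_cited": 3},
    {"repository": "Repository B", "times_cited": 0},
]

totals = defaultdict(int)
uncited = defaultdict(int)
for rec in records:
    totals[rec["repository"]] += 1
    if rec["times_cited"] == 0:
        uncited[rec["repository"]] += 1

for repo, n in totals.items():
    print(f"{repo}: {100 * uncited[repo] / n:.1f}% uncited of {n} records")
```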
This paper explores the use of Library Catalog Analysis (LCA), defined as the application of bibliometric or informetric techniques to a set of library online catalogs, to quantitatively describe a scientific-scholarly field on the basis of published book titles. It focuses on its value as a tool in studies of the Social Sciences and Humanities, especially of a field's cognitive structures, main book publishers and the performance of its actors. The paper proposes an analogy model between traditional citation analysis of journal articles and library catalog analysis of book titles. It presents the outcomes of an exploratory study of book titles in Economics included in 42 academic library catalogs from 7 countries. It describes the process of data collection and cleaning, and applies a series of indicators and thematic mapping techniques. It illustrates how LCA can be fruitfully used to assess book production and research performance at the level of an individual researcher, a research department, an entire country and a book publisher. It discusses a number of issues that should be addressed in follow-up studies and concludes that LCA of published book titles can be developed into a powerful and useful tool in studies of the Social Sciences and Humanities.
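A minimal sketch of the analogy model described above, assuming the number of catalogs holding a book title is treated as an inclusion count analogous to a citation count for a journal article; the catalog and book titles are hypothetical examples, not data from the study:

```python
# Sketch: count in how many library catalogs each book title appears,
# and rank titles by that inclusion count (the LCA analogue of citations).
from collections import Counter

# Each catalog is represented by the set of book titles it holds (hypothetical).
catalogs = {
    "Catalog A": {"Principles of Economics", "Econometric Analysis"},
    "Catalog B": {"Principles of Economics"},
    "Catalog C": {"Principles of Economics", "Game Theory"},
}

inclusion_counts = Counter(
    title for holdings in catalogs.values() for title in holdings
)

for title, count in inclusion_counts.most_common():
    print(f"{title}: held by {count} of {len(catalogs)} catalogs")
```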
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.