Due to developments recently described as the 'audit,' 'evaluation,' or 'metric society,' universities have become subject to ratings and rankings, and researchers are evaluated according to standardized quantitative indicators such as their publication output and personal citation scores. Yet this development is not only based on the rise of new public management and ideas about 'the return on public or private investment.' It has also profited from ongoing technological developments. Owing to a massive increase in digital publishing and the corresponding growth in the availability of related data, bibliometric infrastructures for evaluating science are becoming ever more differentiated and elaborate. They allow for new ways of using bibliometric data through a variety of easily applicable tools. Moreover, they produce new quantities of data by opening up new possibilities for following the digital traces of scientific publications. In this article, I discuss this development as quantification 2.0. The rise of digital infrastructures for publishing, indexing, and managing scientific publications has not only made bibliometric data a valuable source for performance assessment. It has also triggered an unprecedented growth in bibliometric data production, turning freely accessible data about scientific work into edited databases and producing competition for users. The production of bibliometric data has thus become decoupled from its application. Bibliometric data have turned into an end in themselves, while their providers constantly seek new tools to make use of them.
One might think that bibliometric measurement of academic performance has always been digital, given the computer-assisted invention of the Science Citation Index. Yet since the 2000s, the digitization of bibliometric infrastructure has accelerated at a rapid pace. Citation databases are indexing an ever greater variety of publication types; altmetric data aggregators are producing data on the reception of research outcomes; machine-readable persistent identifiers are being created to unambiguously identify researchers, research organizations, and research objects; and evaluative software tools and current research information systems are constantly enlarging their functionality to make use of these data and extract meaning from them. In this article, we analyse how these developments in evaluative bibliometrics have contributed to an extension of indicator-based research evaluation towards data-driven research analytics. Drawing on empirical material from blogs and websites as well as from research and policy papers, we discuss how interoperability, scalability, and flexibility, as material specificities of digital infrastructures, generate new forms of data production and assessment, which in turn shape how academic performance can be understood and (e)valuated.
In the sociology of valuation and evaluation, visibility, that is, who can see whom, when, and in what way, is discussed as an essential aspect of evaluation procedures. By bringing theoretical perspectives on visibility together with current research on evaluation procedures, this article examines the calibration of visibility in and through such procedures. It identifies different constellations of visibility that can unfold specific effects (recognition, control, singularization) and shows how the deliberate production of (in)visibility, first, determines what or who is evaluated and how, and, second, thereby itself becomes an essential outcome of evaluation procedures.