The Leiden Ranking 2011/2012 is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration. The ranking includes 500 major universities from 41 different countries. This paper provides an extensive discussion of the Leiden Ranking 2011/2012. The ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings. The comparison focuses on the methodological choices underlying the different rankings. Also, a detailed description is offered of the data collection methodology of the Leiden Ranking 2011/2012 and of the indicators used in the ranking. Various innovations in the Leiden Ranking 2011/2012 are presented. These innovations include (1) an indicator based on counting a university's highly cited publications, (2) indicators based on fractional rather than full counting of collaborative publications, (3) the possibility of excluding non‐English language publications, and (4) the use of stability intervals. Finally, some comments are made on the interpretation of the ranking and a number of limitations of the ranking are pointed out.
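To make the contrast between full and fractional counting of collaborative publications concrete, the following minimal Python sketch compares the two counting methods on a handful of hypothetical publications; the university names and publication data are invented for illustration and do not come from the ranking itself.

```python
# Small sketch of full versus fractional counting of collaborative publications,
# using hypothetical data: under fractional counting, a publication co-authored
# by several universities contributes only a fraction of a publication to each.

# Each publication is represented by the list of universities in its author addresses.
publications = [
    ["Univ A"],                      # single-institution paper
    ["Univ A", "Univ B"],            # two-university collaboration
    ["Univ A", "Univ B", "Univ C"],  # three-university collaboration
]

def full_count(pubs, university):
    # Every publication with at least one address from the university counts fully.
    return sum(1.0 for unis in pubs if university in unis)

def fractional_count(pubs, university):
    # Each publication is divided equally over the collaborating universities.
    return sum(1.0 / len(unis) for unis in pubs if university in unis)

print(full_count(publications, "Univ A"))        # 3.0
print(fractional_count(publications, "Univ A"))  # 1.0 + 0.5 + 0.333... ~= 1.83
```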
The crown indicator is a well-known bibliometric indicator of research performance developed by our institute. The indicator aims to normalize citation counts for differences among fields. We critically examine the theoretical basis of the normalization mechanism applied in the crown indicator. We also make a comparison with an alternative normalization mechanism. The alternative mechanism turns out to have more satisfactory properties than the mechanism applied in the crown indicator. In particular, the alternative mechanism has a so-called consistency property. The mechanism applied in the crown indicator lacks this important property. As a consequence of our findings, we are currently moving towards a new crown indicator, which relies on the alternative normalization mechanism.
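As an illustration of the two normalization mechanisms compared above, the sketch below contrasts a ratio-of-sums calculation (the mechanism traditionally used in the crown indicator) with an average of per-publication ratios (the alternative mechanism with the consistency property), as these mechanisms are commonly described; the citation counts and field baselines are hypothetical.

```python
# Minimal sketch contrasting the two normalization mechanisms.
# Each publication: (citations received, expected citations for its field and year).
publications = [
    (10, 4.0),   # well above its field baseline
    (1, 2.5),    # below its field baseline
    (0, 3.0),    # uncited
    (6, 6.0),    # exactly at its field baseline
]

def crown_indicator(pubs):
    """Traditional mechanism: ratio of sums.

    Total citations divided by total expected citations, so publications in
    high-baseline fields weigh more heavily in the overall score.
    """
    total_citations = sum(c for c, _ in pubs)
    total_expected = sum(e for _, e in pubs)
    return total_citations / total_expected

def mean_normalized_citation_score(pubs):
    """Alternative mechanism: average of per-publication ratios.

    Each publication is first normalized by its own field baseline, and the
    normalized scores are then averaged over all publications.
    """
    return sum(c / e for c, e in pubs) / len(pubs)

print(round(crown_indicator(publications), 3))                 # 1.097
print(round(mean_normalized_citation_score(publications), 3))  # 0.975
```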
We review empirical research on the (social) psychology of morality to identify which issues and relations are well documented by existing data and which areas of inquiry are in need of further empirical evidence. An electronic literature search yielded a total of 1,278 relevant research articles published from 1940 through 2017. These were subjected to expert content analysis and standardized bibliometric analysis to classify research questions and relate these to (trends in) empirical approaches that characterize research on morality. We categorize the research questions addressed in this literature into five different themes and consider how empirical approaches within each of these themes have addressed psychological antecedents and implications of moral behavior. We conclude that some key features of theoretical questions relating to human morality are not systematically captured in empirical research and are in need of further investigation.
This paper sets forth a general methodology for conducting bibliometric analyses at the micro level. It combines several indicators grouped into three factors or dimensions which characterize different aspects of scientific performance. Different profiles or "classes" of scientists are described according to their research performance in each dimension. Results from the application of this methodology to CSIC scientists in Spain in three thematic areas are presented. Special emphasis is placed on the identification and description of top scientists from structural and bibliometric perspectives. The effects of age on the productivity and impact of the different classes of scientists are analyzed. The classificatory approach proposed herein may prove a useful tool in support of research assessment at the individual level and for exploring potential determinants of research success.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.