Does past performance influence success in grant applications? In this study we test whether the grant allocation decisions of the Netherlands Research Council for the Economic and Social Sciences correlate with the applicants' past performance in terms of publications and citations, and with the results of the peer review process organized by the Council. We show that the Council successfully distinguishes grant applicants with above-average performance from those with below-average performance, but within the former group no correlation could be found between past performance and receiving a grant. When comparing the best-performing researchers who were denied funding with the group of researchers who received it, the rejected researchers significantly outperformed the funded ones. Furthermore, the best rejected proposals score on average as high on the outcomes of the peer review process as the accepted proposals. Finally, we found that the Council under study successfully corrected for gender effects during the selection process. We explain why these findings may generalize beyond this case. However, if research councils are not able to select the 'best' researchers, perhaps they should reconsider their mission. In a final section on policy implications, we discuss the role of research councils at the level of the science system in terms of variation, innovation, and quality control.
The theory of citations should not consider cited and/or citing agents as its sole subject of study. One can also study the dynamics in the networks of communications. While communicating agents (e.g., authors, laboratories, journals) can be made comparable in terms of their publication and citation counts, one would not expect the communication networks to be homogeneous. The latent structures of the network indicate different codifications that span a space of possible "translations". The various subdynamics can be hypothesized from an evolutionary perspective. Using the network of aggregated journal-journal citations in Science & Technology Studies as an empirical case, the operation of such subdynamics can be demonstrated. Policy implications and the consequences for a theory-driven type of scientometrics are elaborated.
Bibliometric studies often measure and compare scholarly performance, but they rarely investigate why universities, departments, and research groups differ in performance. In this paper we try to explain differences in the scholarly performance of research groups in terms of organizational variables. To do this, we extensively review the relevant literature and develop a model using two theoretical approaches. A multivariate analysis shows which of the independent variables play a role in the various dimensions of scholarly performance. The study shows which organizational strategies may help to optimize performance in these dimensions. Implications are discussed.
This paper answers five questions about the societal impact of research. Firstly, we examine the opinions of research group leaders about the increased emphasis on societal impact, i.e. does it influence their research agenda, their communication with stakeholders, and their knowledge dissemination to stakeholders? Furthermore, we investigate the quality of their societal output. We also study whether the societal and scholarly productivity of academic groups are positively or negatively related. In addition, we investigate which managerial and organisational factors (e.g. experience of the principal investigator, group size, and funding) influence societal output. Finally, we show for one case (virology) that societal impact is also visible through indirect links. Our study shows that research group leaders have a slightly positive attitude towards the increased emphasis on the societal impact of research. The study also indicates a wide variety of societally oriented output. Furthermore, the societal and scientific productivity of academic groups are unrelated, suggesting that stimulating societal relevance requires specific organisational and contextual interventions.