2006
DOI: 10.1087/095315106778690751
The publishing imperative: the pervasive influence of publication metrics

Abstract: This article summarizes the effects of the increasing global trend towards measuring research quality and effectiveness through, in particular, publication‐based metrics, and its effects on scholarly communication. Such metrics are increasingly influencing the behaviour patterns of administrators, publishers, librarians, and researchers. Impact and citation measures, which often rely solely on Thomson Scientific data, are examined in the context of university league tables and research assessment exercises. …

Cited by: 70 publications (58 citation statements)
References: 16 publications
“…Rankings such as the Shanghai Jiao Tong Academic Ranking of World Universities (Centre for World-Class Universities 2014), the Times Higher Education World University Rankings (Thomson Reuters 2014) and the QS World University Rankings (Quacquarelli Symonds 2014) have captured wide attention not only from academia but also from government departments, funding bodies and the general public (Burns and McCarthy 2010; Macdonald and Kam 2009; Steele, Butler, and Kingsley 2006). These rankings, in many cases, have direct or indirect implications for a university's reputation and research funding.…”

Section: Introduction (mentioning; confidence: 99%)
“…Furthermore, given that science is important for areas as diverse as culture, innovation, and major societal challenges, any single performance measure — however well-designed it may be — cannot capture the overall 'importance' of a scientist's work. A strong performance measure based on a single criterion may actually skew the focus of the staff in an undesirable way, so that they fulfil the criterion at the cost of scientific relevance (Butler 2003; Steele et al. 2006; Lawrence 2007). This shows that, although the use of a performance measure can alter the behaviour of researchers, a simple measure may do harm as well as good.…”

Section: Can We Measure Science? (mentioning; confidence: 99%)
“…Preliminary evidence suggests that changes in the academic research system may involve conflicting forces: shifts in funding stimulate scientists to make direct contributions to economic growth or other societal goals, but the rise of systematic performance evaluation increases the pressure to achieve scientific excellence as measured in bibliometric terms (Steele et al., 2006; Hessels and van Lente, 2011; Hessels et al., forthcoming).…”

(mentioning; confidence: 99%)