1996
DOI: 10.1002/(sici)1097-4571(199602)47:2<173::aid-asi10>3.0.co;2-6

Postscript on program rankings

Citing publications: 1996–2015

Cited by 15 publications (6 citation statements)
References 3 publications
“…The total productivity of each department is presumably related to the number of faculty, and respondents to the U.S. News survey may or may not have considered department size when judging academic quality. Previous research suggests, however, that perceived quality is more closely associated with total departmental productivity than with productivity per faculty member (Cronin & Overfelt, 1996).…”
Section: Results (mentioning)
confidence: 97%
“…Here, for the first time, we apply the h-index to information science (IS) and compare rankings based on raw citation counts with those based on h-counts. In IS, as in many other fields, there is a robust, ongoing debate on the pros and cons of evaluative bibliometrics and the associated techniques (e.g., Budd, 2000; Cronin & Overfelt, 1996; Meho & Spurgin, 2005). We identified 31 influential information science faculty from the United States.…”
Section: Approach and Methods (mentioning)
confidence: 99%
“…Although the proponents argue that this method is an indispensable support tool for traditional evaluative measures (Cronin & Overfelt, 1994; Garfield, 1983a, 1983b; Glanzel, 1996; Koenig, 1982, 1983; Kostoff, 1996; Lawani & Bayer, 1983; Narin, 1976; Narin & Hamilton, 1996; van Raan, 1996, 1997), critics claim that it has some serious problems or limitations that impact its validity, including the following: (1) Citation counts give no clue why a work is being cited; (2) citations are field‐dependent and may be influenced by time, number of publications, access to or knowledge of the existence of needed information, as well as the visibility and/or professional rank of the authors; and (3) citation databases provide credit only to the first author, primarily cover English journal articles published in the United States, are not comprehensive in coverage, and have many technical problems such as synonyms, homonyms, clerical errors, and limited coverage of the literature (MacRoberts & MacRoberts, 1986, 1989, 1996; Seglen, 1992, 1998). Studies that report both the validity of citation counts in research assessments and the positive correlation between them and both peer evaluations and publication counts have been discussed and reviewed by many, including Baird and Oppenheim (1994), Biggs and Bookstein (1988), Cronin and Overfelt (1996), Holmes and Oppenheim (2001), Kostoff (1996), Narin (1976), Narin and Hamilton (1996), Oppenheim (1995), Seng and Willett (1995), and Smith (1981).…”
Section: Ranking Studies in Library and Information Science (mentioning)
confidence: 99%