Purpose - To analyse the publications of, and the citations to, the current staff of 19 departments of computer science in Malaysian universities, and to compare these bibliometric data with expert peer reviews of Malaysian research performance.
Method - Author and citation searches of the Scopus and Web of Science databases.
Findings - Both publication and citation rates are low, although this is at least in part due to some Malaysian universities having only a teaching function. More of the departments' publications were identified in Scopus than in Web of Science, but both databases were needed for comprehensive coverage. Statistically significant relationships were observed between the departments' publication and citation counts and the rankings of the departments' parent universities in two evaluations of the research performance of Malaysian universities.
Originality - This is the first comparison of bibliometric and peer-review data for Malaysia, and, more generally, for a country with a newly developed higher education system.
Keywords -
Introduction

Governments worldwide are looking for ways to evaluate the quality of the research carried out in their countries' universities. Informal evaluations have been conducted for many years, but the increasing costs of higher education provision have resulted in the development of more formal evaluation mechanisms. These mechanisms are designed to ensure that government funding is channelled to those institutions and research groups that have demonstrated their ability to carry out high-quality research in a cost-effective manner, often by means of assessments that focus on specific disciplines or groups of disciplines.

Research quality has traditionally been assessed by means of expert review (in a manner analogous to the procedures used for refereeing journal articles and grant applications), not least because this approach is well established and generally enjoys the support of the academic community. It is, however, very costly in terms of the subject experts' time, and this has spurred interest in the use of bibliometric indicators as a surrogate for peer review. These indicators can capture both the quantity of research (as reflected in the number of research publications produced by a university, department or other unit) and the quality of research (as reflected in the number of citations to those publications), and they are typically far cheaper to use, since they exploit existing bibliometric databases and do not require costly human judgments. Moreover, there is an increasing body of evidence to support the view that bibliometric approaches can yield results closely mirroring the judgments of subject experts. Specifically, significant correlations have been observed between expert judgments and bibliometric data in comparisons carried out in Italy (Abramo et al