Recently there has been increasing interest in university rankings. Annual rankings of world universities are published by QS for the Times Higher Education Supplement, by Shanghai Jiao Tong University, by the Higher Education Evaluation and Accreditation Council of Taiwan, and, based on Web visibility, by the Cybermetrics Lab at CSIC. In this paper we compare the rankings using a set of similarity measures. For the rankings that have been published for a number of years, we also examine longitudinal patterns. The rankings limited to European universities are compared to the ranking of the Centre for Science and Technology Studies at Leiden University. The findings show that there are reasonable similarities between the rankings, even though each applies a different methodology. The biggest differences are between the rankings provided by the QS-Times Higher Education Supplement and the Ranking Web of the CSIC Cybermetrics Lab. The highest similarities were observed between the Taiwanese and the Leiden rankings of European universities. Overall, the similarities increase when the comparison is limited to the European universities.
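The abstract does not specify which similarity measures are used; a common choice for comparing two rankings is Spearman's rank correlation computed over the universities they share. The sketch below illustrates that idea only; the university names and rank positions are hypothetical and not taken from the paper.

    # Minimal sketch: comparing two university rankings with Spearman's rank
    # correlation, one common similarity measure. All data are hypothetical.
    from scipy.stats import spearmanr

    # Hypothetical rank positions of the same five universities in two systems.
    ranking_a = {"U1": 1, "U2": 2, "U3": 3, "U4": 4, "U5": 5}
    ranking_b = {"U1": 2, "U2": 1, "U3": 3, "U4": 5, "U5": 4}

    # Restrict the comparison to universities present in both rankings.
    common = sorted(set(ranking_a) & set(ranking_b))
    rho, p_value = spearmanr([ranking_a[u] for u in common],
                             [ranking_b[u] for u in common])
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")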
Academics can now use the web and social websites to disseminate scholarly information in a variety of different ways. Although some scholars have taken advantage of these new online opportunities, it is not clear how widespread their uptake is or how much impact they can have. This study assesses the extent to which successful scientists have social web presences, focusing on one influential group: highly cited researchers working at European institutions. It also assesses the impact of these presences. We manually and systematically identified whether the European highly cited researchers had profiles in Google Scholar, Microsoft Academic Search, Mendeley, Academia and LinkedIn, or any content in SlideShare. We then used URL mentions and altmetric indicators to assess the impact of the web presences found. Although most of the scientists had an institutional website of some kind, few had created a profile in any of the social websites investigated, and LinkedIn, the only non-academic site in the list, was the most popular. Scientists with one kind of social web profile were in many cases more likely to have another, especially in the life sciences and engineering. In most cases it was possible to estimate the relative impact of the profiles using a readily available statistic, and there were disciplinary differences in the impact of the different kinds of profiles. Most social web profiles showed some evidence of uptake, if not impact; nevertheless, the value of the indicators used is unclear.
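The claim that scientists with one kind of social web profile were often more likely to have another can be checked with a simple association test on a two-by-two table of profile counts. The sketch below uses Fisher's exact test as one such check; the counts and the choice of the two sites are hypothetical and are not the study's data or its actual analysis.

    # Minimal sketch: testing whether having one profile is associated with
    # having another, via an odds ratio on a 2x2 contingency table.
    # All counts below are hypothetical.
    from scipy.stats import fisher_exact

    # Rows: has a LinkedIn profile (yes/no); columns: has a Mendeley profile (yes/no).
    table = [[30, 20],   # LinkedIn yes: 30 also on Mendeley, 20 not
             [10, 40]]   # LinkedIn no:  10 on Mendeley, 40 not

    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")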
To test the feasibility of cybermetric indicators for describing and ranking university activities as shown on their Web sites, a large set of 9,330 institutions worldwide was compiled and analyzed. Using search engines' advanced features, size (number of pages), visibility (number of external inlinks), and the number of rich files (pdf, ps, doc, ppt, and xls formats) were obtained for each of the universities' institutional domains. We found a statistically significant correlation between a Web ranking built on a combination of webometric data and other university rankings based on bibliometric and other indicators. The results show that cybermetric measures could be useful for reflecting the contribution of technologically oriented institutions, increasing the visibility of developing countries, and improving rankings based on Science Citation Index (SCI) data, which have known biases.
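As a rough illustration of how such indicators can be combined and then compared with other rankings, the sketch below ranks a few hypothetical institutions by a weighted combination of size, visibility, and rich-file counts and correlates the result with an external ranking. The weights, data, and aggregation procedure are assumptions for illustration only; they are not the Cybermetrics Lab's actual methodology.

    # Minimal sketch: composite webometric rank from size, visibility, and rich
    # files, compared with an external ranking. All data and weights are hypothetical.
    from scipy.stats import rankdata, spearmanr

    universities = ["U1", "U2", "U3", "U4"]
    size       = [12000, 6000, 15000, 3000]   # number of pages indexed
    visibility = [5000, 9000, 4000, 1000]     # number of external inlinks
    rich_files = [800, 1200, 400, 100]        # pdf/ps/doc/ppt/xls files

    def to_rank(values):
        # Higher raw value -> better (lower) rank position.
        return rankdata([-v for v in values])

    # Hypothetical weighting: visibility 50%, size 30%, rich files 20%.
    composite = [0.5 * v + 0.3 * s + 0.2 * r
                 for v, s, r in zip(to_rank(visibility), to_rank(size), to_rank(rich_files))]
    web_rank = rankdata(composite)

    external_rank = [1, 2, 3, 4]              # hypothetical bibliometric ranking
    rho, p = spearmanr(web_rank, external_rank)
    print(f"Spearman rho between Web rank and external rank = {rho:.2f}")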