I don't want to do this. I really would like to start blogging about whether rankings should measure third missions, or about developing metrics for teaching and learning. But I find it difficult to stay away from the THE rankings, especially the citations indicator.
I have a couple of questions. If someone can help please post a comment here.
Do the presidents, vice-chancellors, directors, generalissimos, or whatever of universities actually look at, or get somebody to look at, the indicator scores of the THE world rankings and their spin-offs?
Does anyone ever wonder how a ranking that produces such imaginative and strange results for research influence, measured by citations, can command the respect and trust of those hard-headed engineers, MBAs and statisticians running the world's elite universities?
These questions are especially relevant as THE are releasing subject rankings. Here are the top universities in the world for research impact (citations) in various subjects. For computer science and for engineering, the results refer to last year's rankings.
Clinical, pre-clinical and health: Tokyo Metropolitan University
Life Sciences: MIT
Physical sciences: Babol Noshirvani University of Technology
Psychology: Princeton University
Arts and humanities: Université de Versailles Saint-Quentin-en-Yvelines
Education: Kazan Federal University
Law: Iowa State University
Social sciences: Stanford University
Business and economics: Dartmouth College
Computer Science: Princeton University
Engineering and technology: Victoria University, Australia.
https://www.timeshighereducation.com/world-university-rankings/by-subject
1 comment:
Yes, many universities employ rankings experts to compile their institution's data to submit to the rankers, to analyse the ranks, to question the rankers about apparent anomalies, and to submit proposals to change methods. The rankers often ignore these submissions, presumably because they consider them self-interested.