Thursday, November 15, 2018

THE uncovers more pockets of research excellence

I don't want to do this. I really would like to start blogging about whether rankings should measure third missions, or about how to develop metrics for teaching and learning. But I find it difficult to stay away from the THE rankings, especially the citations indicator.

I have a couple of questions. If someone can help, please post a comment here.

Do the presidents, vice-chancellors, directors, generalissimos, or whatever of universities actually look at, or get somebody to look at, the indicator scores of the THE world rankings and their spin-offs?

Does anyone ever wonder how a ranking that produces such imaginative and strange results for research influence, measured by citations, can command the respect and trust of the hard-headed engineers, MBAs and statisticians running the world's elite universities?

These questions are especially relevant as THE are releasing their subject rankings. Here are the top universities in the world for research impact (citations) in various subjects; the computer science and engineering results refer to last year's rankings.

Clinical, pre-clinical and health: Tokyo Metropolitan University

Life Sciences: MIT

Physical sciences: Babol Noshirvani University of Technology

Psychology: Princeton University

Arts and humanities: Universite de Versailles Saint Quentin-en-Yvelines

Education: Kazan Federal University

Law: Iowa State University

Social sciences: Stanford University

Business and economics: Dartmouth College

Computer Science: Princeton University

Engineering and technology: Victoria University, Australia.

https://www.timeshighereducation.com/world-university-rankings/by-subject

Saturday, November 10, 2018

A modest suggestion for THE

A few years ago the Shanghai rankings made an interesting tweak to their global rankings. They deleted the two indicators that count Nobel and Fields awards and produced an Alternative Ranking.

There were some changes: the University of California San Diego and the University of Toronto did better, while Princeton and Vanderbilt did worse.

Perhaps it is time for Times Higher Education (THE) to consider doing something similar for their citations indicator. Take a look at their latest subject ranking, Clinical, Pre-clinical and Health. Here are the top ten for citations, supposedly a measure of research impact or influence.

1.   Tokyo Metropolitan University
2.   Auckland University of Technology
3.   Metropolitan Autonomous University, Mexico
4.   Jordan University of Science and Technology
5.   University of Canberra 
6.   Anglia Ruskin University
7.   University of the Philippines
8.   Brighton and Sussex Medical School
9.   Pontifical Javeriana University, Colombia
10. University of Lorraine.

If THE started producing alternative subject rankings without the citations indicator, they would be a bit less interesting but a lot more credible.

Friday, November 02, 2018

Ranking Rankings: Measuring Stability

I have noticed that some rankings are prone to a large amount of churning. Universities may rise or fall dozens of places in a year, sometimes as a result of methodological changes, changes in the number or type of universities ranked, errors and their corrections (fortunately rare these days), or changes in data collection and reporting procedures, and sometimes simply because the indicators rest on a small number of data points.

Some ranking organisations like to throw headlines around about who's up or down, the rise of Asia, the fall of America, and so on. This is a trivialisation of any serious attempt at the comparative evaluation of universities, which do not behave like volatile financial markets. Universities are generally fairly stable institutions: most of the leading universities of the early twentieth century are still here while the Ottoman, Hohenzollern, Hapsburg and Romanov empires are long gone.

Reliable rankings should not be expected to show dramatic changes from year to year, unless there has been radical restructuring like the recent wave of mergers in France. The validity of a ranking system is questionable if universities bounce up or down dozens, scores, even hundreds of ranks every year.

The following table shows the volatility of the global rankings listed in the IREG Inventory of international rankings. U-Multirank is not included because it does not provide overall ranks, and UniRank and Webometrics do not give access to previous editions.

Average rank change is the average number of places that each of a ranking's top thirty universities has risen or fallen between the two most recent editions of that ranking.
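
To make the calculation explicit, here is a minimal sketch in Python of how such a figure can be computed. The function and the toy ranks are invented for illustration, not the data behind the table below, and taking the top universities of the more recent edition is just one reasonable reading of the definition above.

# Minimal sketch: mean absolute rank change for a ranking's top universities
# between two editions. All ranks below are invented for illustration only.

def average_rank_change(previous, current, top_n=30):
    """Mean absolute change in rank for the top_n universities of the
    more recent edition, counting only those present in both editions."""
    top = sorted(current, key=current.get)[:top_n]  # top_n names by current rank
    changes = [abs(current[u] - previous[u]) for u in top if u in previous]
    return sum(changes) / len(changes)

# Hypothetical two editions of a ranking with three universities
edition_2017 = {"University A": 1, "University B": 2, "University C": 3}
edition_2018 = {"University A": 1, "University B": 4, "University C": 2}

print(average_rank_change(edition_2017, edition_2018, top_n=3))  # (0 + 1 + 2) / 3 = 1.0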

The most stable ranking is the Shanghai ARWU, followed by the US News Best Global Universities and the National Taiwan University Rankings. The UI GreenMetric ranking, Reuters Innovative Universities and the Leiden Ranking indicator for the percentage of publications in the top 1% (its high-quality research measure) show the highest levels of volatility.

This is a very limited exercise. We might get different results if we examined all of the universities in the rankings or analysed changes over several years.



Rank. Ranking (country): average rank change in places, most stable first

1.   Shanghai ARWU (China): 0.73
2.   US News Best Global Universities (USA): 0.83
3.   National Taiwan University Rankings (Taiwan): 1.43
4.   THE World University Rankings (UK): 1.60
5.   Round University Rankings (Russia): 2.28
6.   University Ranking by Academic Performance (Turkey): 2.23
7.   QS World University Rankings (UK): 2.33
8.   Nature Index (UK): 2.60
9.   Leiden Ranking Publications (Netherlands): 2.77
10. Scimago (Spain): 3.43
11. Emerging/Trendence (France): 3.53
12. Center for World University Ranking (UAE): 4.60
13. Leiden Ranking % Publications in top 1% (Netherlands): 4.77
14. Reuters Innovative Universities (USA): 6.17
15. UI GreenMetric (Indonesia): 13.14