Tuesday, February 23, 2016

Should the UK stay in the EU?

There are around 130 universities in the UK, and the vice-chancellors of 103 of them have signed a letter praising the role of the European Union in supporting the UK's world-class universities.

There are some notable names missing (University College London, Manchester, Warwick and York), but such a high degree of consensus among the higher bureaucracy is rather suspicious.




The Ranking Effect

The observer effect in science refers to the changes that a phenomenon undergoes as a result of being observed.

Similarly, in the world of university ranking, indicators that may once have been useful measures of quality sometimes become less so once they are included in the global rankings.

Counting highly cited researchers might have been a good measure of research quality but its value was greatly reduced once someone found out how easy it was to recruit adjunct researchers and to get them to list their new "employer" as an affiliation.

International students and international faculty could also be markers of quality but not so much if universities are deliberately recruiting under-qualified staff and students just to boost their rankings score.

Or, as Irish writer Eoin O'Malley aptly puts it in the online magazine Village: "As Goodhart's Law warns us: when a measure becomes a target, it ceases to be a good measure".

O'Malley argues that reputation surveys are little more than an exercise in name reputation, that Nobel awards do not measure anything important (although I should point out that Trinity College Dublin would not get any credit for Samuel Beckett since the Shanghai Rankings do not count literature awards), and that the major criteria used by rankers do not measure anything of interest to students.

Irish universities were disproportionately affected by the methodological changes introduced by QS and Times Higher Education towards the end of last year, so I suspect that Dr O'Malley's criticism will find a receptive audience.






Friday, January 15, 2016

Aussies not impressed with THE any more


Back in 2012 The Australian published a list of the most influential figures in Australian higher education. In 14th place was Phil Baty, the editor of the Times Higher Education (THE) World University Rankings.

Recently, the newspaper came out with another list of influential figures, full of the usual bureaucrats and boosters, plus the Australian dollar at number five. Then at number 10 was not a person, not even Times Higher Education, but "rankings". A step up for rankings but a demotion for THE.

To make things worse for THE, the Leiden Ranking and the Shanghai Academic Ranking of World Universities were designated the leaders.

Then we have a reference to "new and increasingly obscure league tables peddled by unreliable metrics merchants, with volatile methodologies triggering inexplicably spectacular rises and falls from grace."

But what are those new and increasingly obscure league tables? They can't be URAP, the National Taiwan University Rankings, the QS World University Rankings, or Scimago, because those are not new. The US News Best Global Universities and the Russian Round University Ranking are new, but so far their methodologies have not been volatile. Webometrics can be a bit volatile at times, but it is also not new. Maybe the newspaper is referring to the QS subject rankings.

Or could it be that The Australian is thinking of the THE World University Rankings? What happened last autumn to universities in France, Korea and Turkey was certainly a case of volatile methodology. But new? Maybe The Australian has decided that the methodology was changed so much that it constituted a new league table.