Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Wednesday, February 24, 2016
Britain leads in sniffing-edge research
There must be some formula that can predict which scientific research will go viral on social media, reach the pages of the popular press, or even make it into Times Higher Education (THE).
The number of papers in the natural and social sciences is getting close to uncountable. So why, out of all of them, has THE showcased a study of the disgust reported by students sniffing sweaty T-shirts from other universities?
Anyway, here is a suggestion to the authors for a follow-up study: have the students read the latest QS, THE or Shanghai world rankings before having a sniff, and see whether that makes any difference to the disgust experienced.
Tuesday, February 23, 2016
Should the UK stay in the EU?
There are 130 universities in the UK, and the vice-chancellors of 103 of them have signed a letter praising the role of the European Union in supporting the UK's world-class universities.
There are some notable names missing, among them University College London, Manchester, Warwick and York, but such a high degree of consensus among the higher bureaucracy is rather suspicious.
The Ranking Effect
The observer effect in science refers to the changes that a phenomenon undergoes as a result of being observed.
Similarly, in the world of university ranking, indicators that may once have been useful measures of quality sometimes become less so once they are included in the global rankings.
Counting highly cited researchers might have been a good measure of research quality but its value was greatly reduced once someone found out how easy it was to recruit adjunct researchers and to get them to list their new "employer" as an affiliation.
International students and international faculty could also be markers of quality but not so much if universities are deliberately recruiting under-qualified staff and students just to boost their rankings score.
Or, as Irish writer Eoin O'Malley aptly puts it in the online magazine Village, "As Goodhart’s Law warns us: when a measure becomes a target, it ceases to be a good measure".
O'Malley argues that reputation surveys are little more than an exercise in name recognition, that Nobel awards do not measure anything important (although I should point out that Trinity College Dublin would not get any credit for Samuel Beckett, since the Shanghai Rankings do not count literature awards), and that the major criteria used by rankers do not measure anything of interest to students.
Irish universities have been disproportionately affected by the methodological changes that QS and Times Higher Education introduced towards the end of last year. I suspect that Dr O'Malley's criticism will find a receptive audience.