Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Sunday, May 08, 2022
Has China Really Stopped Rising? Looking at the QS Subject Rankings
Monday, March 28, 2022
Where does reputation come from?
THE announced the latest edition of its reputation rankings last October. The amount of information is quite limited: scores are given for only the top fifty universities. But even that provides a few interesting insights.
First, there is really no point in providing separate data for teaching and research reputation. The correlation between the two for the top fifty is .99. This is unsurprising. THE surveys researchers who have published in Scopus-indexed journals, so there is a very obvious halo effect: respondents have no choice but to fall back on their knowledge of research competence when trying to assess teaching performance. If THE are going to improve their current methodology, they need to recognise that their two reputation surveys are measuring the same thing. Maybe they could try to find another source of respondents for the teaching survey, such as school advisors, students, or faculty at predominantly teaching institutions.
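For anyone who wants to check this sort of claim, a minimal sketch in Python is below. The scores are illustrative placeholders rather than THE's published figures; run on the actual top-fifty reputation scores, the coefficient comes out at .99.

```python
# A minimal sketch of the correlation check described above, using
# illustrative placeholder scores rather than THE's published figures.
from scipy.stats import pearsonr

# Hypothetical teaching and research reputation scores for a handful
# of universities (THE publishes scores only for the top fifty).
teaching_reputation = [100.0, 88.2, 85.1, 74.9, 71.3, 63.0]
research_reputation = [100.0, 89.5, 84.7, 75.6, 70.8, 64.2]

r, p_value = pearsonr(teaching_reputation, research_reputation)
print(f"Pearson r = {r:.2f} (p = {p_value:.4f})")
# On the real top-fifty data the correlation is .99, i.e. the two
# surveys are effectively measuring one thing.
```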
Next, after plugging in a few indicators from other rankings, it is clear that the metrics most closely associated with teaching and research reputation are publications in Nature and Science (Shanghai), highly cited researchers (Shanghai), and papers in highly reputed journals (Leiden).
The correlations with scores in the RUR and QS reputation rankings, with citations (THE and QS), and with international faculty were modest.
There was no correlation at all with the proportion of papers with female or male authors (Leiden).
So it seems that the best way to acquire a reputation for good teaching and research is to publish papers in the top journals and get lots of citations. That, of course, applies only to this very limited group of institutions.
Sunday, March 20, 2022
What Should Rankers Do About the Ukraine Crisis?
Over the last few days there have been calls for the global rankers to boycott or delist Russian universities to protest the Russian invasion of Ukraine. There have also been demands that journals reject submissions from Russian authors and that universities and research bodies stop collaborating with Russian researchers.
So far, four European ranking agencies have announced some sort of sanctions.
U-Multirank has announced that Russian universities will be suspended "until they again share in the core values of the European higher education area."
QS will not promote Russia as a study destination and will pause business engagement. It will also redact Russian universities from new rankings.
Webometrics will "limit the value added information" for Russian and Belarusian universities.
Times Higher Education (THE) will stop business activities with Russia but will not remove Russian universities from its rankings.
The crisis has highlighted a fundamental ambiguity in the nature of global rankings. Are they devices for promoting the business interests of institutions or do they provide relevant and useful information for researchers, students and the public?
Refraining from doing business with Russia until it withdraws from Ukraine is a welcome rebuke to the current government. If, however, rankings contain useful information about Russian scientific and research capabilities then that information should continue to be made available.
Sunday, September 26, 2021
What is a University Really For?
Louise Richardson, Vice-Chancellor of the University of Oxford, has seen fit to enlighten us about the true purpose of a university. It is, it seems, to inculcate appropriate deference to the class of certified experts.
Professor Richardson remarked at the latest Times Higher Education (THE) academic summit that she was embarrassed that "we" had educated the Conservative politician Michael Gove who said, while talking about Brexit, that people had had enough of experts.
So now we know what universities are really about. Not about critical discussion, cutting-edge research, skepticism, or the disinterested pursuit of truth, but about teaching respect for experts.
A few years ago I wrote a post suggesting we were now in a world where the expertise of the accredited experts was declining along with public deference. I referred to the failure of political scientists to predict the nomination of Trump, the election of Trump, the rise of Leicester City, the Brexit vote. It looks like respect for experts has continued to decline, not entirely without reason.
Professor Richardson thinks that Gove's disdain for the Brexit experts is cause for embarrassment. While it is still early days for the real effects of Brexit to become clear, it is far from obvious that it has been an unmitigated disaster. It is, moreover, a little ironic that the remark was made at the latest THE academic summit, where the annual world rankings were announced. Richardson remarked that she was delighted that her university was once again ranked number one.
The irony is that the THE world rankings are probably the least expert of the global rankings, although they are apparently the most prestigious, at least among those institutions that are known for being prestigious.
Let's have another look at THE's Citations indicator, which is supposed to measure research quality or impact and accounts for nearly a third of the total weighting. (Regular readers of this blog can skim or skip the next few lines.) Here are the top five from this year's rankings.
1. University of Cape Coast
2. Duy Tan University
3. An Najah National University
4. Aswan University
5. Brighton and Sussex Medical School
This is not an academic version of the imaginary football league tables that nine-year-old children used to construct. Nor is it the result of massive cheating by the universities concerned. It is quite simply the outcome of a hopelessly flawed system. THE, or rather its data analysts, appear to be aware of the inadequacies of this indicator, but somehow meaningful reform keeps getting postponed. One day historians will search the THE archives to find the causes of this inability to take very simple and obvious measures to produce a sensible and credible ranking. I suspect that the people in control of THE policy are averse to anything that might distract from the priority of monetising as much data as possible. Nor is there any compelling reason to rush reform when universities like Oxford are unconcerned about the inadequacies of the current system.
Here are the top five for income from industry, which is supposed to have something to do with innovation.
1. Asia University Taiwan
2. Istanbul Technical University
3. Khalifa University
4. Korea Advanced Institute of Science and Technology (KAIST)
5. LMU Munich
This is a bit better. It is not implausible that KAIST or LMU Munich is a world leader for innovation. But in general this indicator, too, is inadequate for any purpose other than providing fodder for publicity. See the scathing review by Alex Usher.
Would any tutor or examiner at Oxford give any credit to a student who claimed that Ghana, Vietnam, and Palestine were centers of international research impact? The universities concerned are doing a remarkable job of teaching in many respects, but that is not what THE is ostensibly giving them credit for.
In addition, the THE world rankings fail to meet satisfactory standards of basic validity. Looking at the indicator scores for the top 200 universities in the most recent world rankings, we can see that the correlation between research and teaching is 0.92. In effect, these are not two distinct metrics; they are measuring essentially the same thing. A quick look at the methodology suggests that what they are really comparing is income (total institutional income for teaching, research income for research), reputation (the opinion surveys for research and teaching), and investment in doctoral programmes.
On the other hand, the citations indicator does not correlate significantly with research or teaching, and it correlates negatively with industry income.
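For those who want to run the same check, here is a sketch of how it might be done, assuming the indicator scores for the top 200 have been collected into a spreadsheet first; the file name and column names below are my own invention, not THE's.

```python
# A sketch of the validity check described above. The file and column
# names are hypothetical; THE's indicator scores would need to be
# collected into this shape first.
import pandas as pd

df = pd.read_csv("the_wur_top200.csv")  # hypothetical file of indicator scores
indicators = ["teaching", "research", "citations", "industry_income"]

# Pairwise Pearson correlations between the indicator scores.
print(df[indicators].corr(method="pearson").round(2))
# On the real data, teaching and research correlate at about 0.92,
# while citations shows no significant correlation with either and
# a negative correlation with industry income.
```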
One can hardly blame THE for wanting to make as much money as possible. But surely we can expect something better from supposedly elite institutions that claim to value intellectual and scientific excellence. If Oxford and its peers wish to restore public confidence in the experts, there is no better way than saying to THE: we will not submit data until you produce something a little less embarrassing.