Sunday, July 28, 2024

Are British Universities Really Underfunded?

I noticed this on LinkedIn recently. An item by Phil Baty, Chief Global Affairs Officer at Times Higher Education (THE), claims that British universities are seriously underfunded and that their world-class achievements are endangered.

He reports that a brilliant data analyst has revealed that inflation has eroded the value of the tuition fees that UK universities are allowed to charge and that costs have dramatically increased. 

Then we have a graph from a THE data guru that compares British university performance in the 18 metrics used in the current THE world rankings to that of their international peers. Compared with the world average of the top 500 universities in the most recent world university rankings, the UK is well ahead for the international outlook indicators and significantly ahead for research strength, field-weighted citations, and publications. It is slightly ahead for research excellence, research reputation, research influence, and patents.

However, when it comes to institutional income, research income, and industry income, British universities are apparently way behind the rest of the world. So, it seems that THE has conclusively demonstrated that UK universities are seriously short of money.

But there are a few things that need to be considered.

First, the THE income indicators are all divided by the number of academic staff. To do well in these measures, a university could have substantial income, or at least report that it did, or it could reduce the number of faculty reported.

In other words, a university that decided to spend its money recruiting teaching and/or research staff would fall in the THE rankings, while one that sacked a lot of teachers and researchers would be rewarded with a significant improvement. You might think that is a bit bonkers, but that is the unintended consequence of the THE methodology. I do not know whether either applies to British universities in general or to specific institutions, but it would be interesting to see a breakdown of the data.
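A toy calculation, with entirely invented figures, shows how a per-staff ratio rewards shrinking the denominator:

```python
# Hypothetical illustration only: THE's income indicators are ratios of income to
# academic staff, so a cut in reported staff raises the score even when total
# income is unchanged. The numbers below are made up.
def income_per_staff(total_income: float, staff: int) -> float:
    """Income indicator as a per-academic-staff ratio."""
    return total_income / staff

before = income_per_staff(200_000_000, 2_000)  # 200m income, 2,000 staff
after = income_per_staff(200_000_000, 1_600)   # same income, 20% fewer staff
print(before, after)  # 100000.0 125000.0 -- the 'underfunding' apparently shrinks
```

Same money, fewer staff, a better score: the indicator cannot distinguish genuine funding growth from staff cuts.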

Also, remember that the income indicators are based on data submitted by institutions. It would be unwise to assume that these are completely valid and accurate. A few years ago Alex Usher of HESA published an article showing that there were some serious problems with THE's industry income indicator. I am not sure whether it has improved since then.

Also, we should note that 55 UK universities are in the current THE world top 500. According to Webometrics, there are 31,657 universities worldwide and 355 in the UK. THE is, in effect, claiming that the top 15.49% of British universities, according to THE's criteria, are underfunded compared to the top 1.58% of world universities in general. 
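The percentages above can be checked with a couple of lines of arithmetic, using the figures quoted from Webometrics and THE:

```python
# Reproducing the shares quoted in the text (counts from Webometrics and THE).
uk_universities = 355        # UK universities counted by Webometrics
uk_in_top_500 = 55           # UK universities in the current THE top 500
world_universities = 31_657  # universities worldwide, per Webometrics
world_in_top_500 = 500       # the THE top 500 itself

uk_share = uk_in_top_500 / uk_universities * 100
world_share = world_in_top_500 / world_universities * 100
print(f"{uk_share:.2f}% of UK vs {world_share:.2f}% of world")  # 15.49% vs 1.58%
```

The comparison is thus between a fairly broad slice of one national system and a very thin slice of the world's.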

Before signing off, I should point out that the graph is instructive in another way: it shows that the rankings are massively biased toward British universities. Consider the weightings for the various metrics.

The International Outlook pillar has a 7.5% weighting; research quality, that is citations, 30%; teaching and research reputation, 33%; and publications per staff, 5.5%. These are all criteria where British higher education does better than the world average.

In contrast, the three income metrics, where UK universities do badly, are given weightings of 2.5%, 5.5%, and 2% respectively. 
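Tallying the weightings quoted above makes the imbalance stark (this is an illustrative tally of the figures in the text, not THE's full published scheme):

```python
# Weightings quoted above, grouped by whether the UK beats the world average.
favours_uk = {
    "international outlook": 7.5,
    "research quality (citations)": 30.0,
    "teaching and research reputation": 33.0,
    "publications per staff": 5.5,
}
against_uk = {
    "institutional income": 2.5,
    "research income": 5.5,
    "industry income": 2.0,
}

print(sum(favours_uk.values()), sum(against_uk.values()))  # 76.0 10.0
```

Roughly 76% of the quoted weighting sits on metrics where the UK shines, against 10% on the income metrics where it struggles.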

If THE decided to shift some of its weighting from reputation to income, or to doctoral education, in which the UK sector also does badly, British universities' THE ranks would fall very noticeably.

Sunday, July 07, 2024

Problems with the THE Reputation Rankings

THE has spent a lot of time and words proclaiming that it is trusted by administrators, students, sponsors, and the like. Perhaps it is, but whether it deserves to be is another matter. A recent article in THE suggests that THE has made a mess of its reputation rankings and is scrambling to put things right.

Until 2021, THE used Elsevier to conduct its teaching and research reputation survey. The 2020-21 survey received 10,963 responses and was calibrated to ensure proper representation of regions and subjects.

The survey was brought in-house in 2022, and since then, the number of responses has increased substantially to 29,606 in 2022, 38,796 in 2023, and 55,689 in 2024.
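The scale of that growth is easier to see as year-over-year percentages, computed from the response counts just quoted:

```python
# Year-over-year growth in reputation survey responses, from the figures above.
responses = {2021: 10_963, 2022: 29_606, 2023: 38_796, 2024: 55_689}

years = sorted(responses)
for prev, curr in zip(years, years[1:]):
    growth = (responses[curr] / responses[prev] - 1) * 100
    print(f"{prev} -> {curr}: +{growth:.0f}%")
# 2021 -> 2022: +170%
# 2022 -> 2023: +31%
# 2023 -> 2024: +44%
```

A near-tripling in a single year, followed by continued double-digit growth, is exactly the sort of jump that ought to prompt questions about where the new respondents came from.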

When the number of responses increases so dramatically, one should wonder exactly how this was achieved. Was it by sending out more surveys, by improving the response rate, or through institutional efforts to encourage participation?

When the results were announced in February, THE declared that a number of Arab universities had achieved remarkable results in the reputation survey. THE conceded that this stellar performance was largely a regional affair that did not extend to the rest of the world. 

But that was not all. Several Arab universities have been making big strides and improving citation, publication, and patent scores: Cairo University, King Abdullah University of Science and Technology, UAE University, and Qatar University. 

The universities getting high scores in the THE rankings were less well-known in the Arab region and had received much lower scores for reputation in the US News and QS rankings. However, they are likely to do well in the forthcoming THE world and Arab university rankings.

THE has now admitted that some universities were encouraging researchers to vote for their own institutions and that there may have been "agreed relationships" between universities. THE is now talking about rewarding respondent diversity, that is getting support from more than just a few institutions.

It is regrettable that THE did not notice this earlier. If it does encourage such diversity, then quite a few universities will suffer dramatic falls in the rankings this year and next.

Anyway, THE could do a few things to improve the validity of its reputation survey: eliminate self-voting altogether, give a higher weighting to votes from other countries, as QS does, add a separate ranking for regional reputation, and combine scores over a number of years.

The problems with the reputation metrics seem to have begun with THE starting its own survey. It would be a good idea to go back to letting Elsevier do the survey. THE is undeniably brilliant at event management and public relations, although perhaps not jaw-droppingly so. However, it is not so good at rankings or data processing.