Friday, January 24, 2025

THE is shocked, shocked ...

We are repeatedly told that the Times Higher Education (THE) university rankings are trusted by students, governments, and other stakeholders. Perhaps they are. Whether they should be is another matter.

Last October, THE announced the results of its World University Rankings, and there was a chorus of lamentation from leading Australian universities, among others, who apparently trusted THE. It seems that the debate over restricting the admission of international students has damaged the country's reputation, and that has been reflected in the THE reputation survey, which contributes disproportionately to THE's teaching and research "pillars." That has led to declining overall scores, which will supposedly be the start of a vicious downward spiral. British and American universities have also bemoaned declining ranking scores, supposedly due to a lack of funding from hard-hearted governments.

For many academics and administrators, THE has become the arbiter of excellence and a credible advisor to the agencies that dominate the Western economy and society. It has even become a preferred analyst for the World Economic Forum (WEF), which is supposed to represent the finest minds of the corporate global world. This is quite remarkable, since there is a big mismatch between THE's pretensions to excellence and its actual practice.

A recent example was the publication of a story about how THE's data analysts had detected collusive activity among some universities in order to boost their scores in the reputation surveys that make up a substantial part of the THE World University Rankings and their various derivatives.

On October 24, David Watkins of THE announced that a "syndicate" had been detected in which universities supported each other in the THE Arab reputation survey to the exclusion of non-members. Exactly who those members were was not announced, but they probably included the nine universities that made it into the top 200 of the THE World Reputation Survey announced in February 2024, the data for which was included in the THE world rankings announced in October 2024. They might also include some universities that had made sudden and surprising gains in the Arab University Rankings announced in November 2023 and the World University Rankings announced last October.

There is a whiff of hypocrisy here. THE is apparently getting upset because universities have probably been doing something that the rankers have condoned or at least ignored. There were signs that something was a bit off as far back as the Arab University Rankings of November 2023, which showed surprisingly good performances from several universities that had performed poorly or not at all in other rankings. In particular, universities in the Emirates were rising while those in Egypt were falling. This was interesting because the results were announced at a summit held in Abu Dhabi that featured several speakers from the Emirates, a development reminiscent of the 2014 summit in Qatar, when Texas A&M Qatar was proclaimed the top MENA university on the strength of precisely half a highly cited researcher, followed by a similar summit in the UAE in 2015, when that university -- actually a program that has since been wound up -- disappeared and United Arab Emirates University advanced to fifth place.

Meanwhile, between October 2023 and January 2024, THE was conducting its survey of academic opinion for the World University Rankings. Before 2021, it had relied on survey data supplied by Elsevier, but the survey has since been brought in-house. That, it now appears, was not a good idea. The number of survey respondents soared, and there was a disproportionate number of respondents from the UAE. In February 2024, THE published the results of its reputation survey, which would later become part of the world rankings.

THE listed only the top 200 universities and gave exact scores for the top fifty. The interesting thing was that nine Arab universities were included whose standing here was well above their scores for academic reputation in the QS World University Rankings, for global research reputation in the US News Best Global Universities, or in the Round University Rankings, if they were ranked there at all, and above their own previous scores. They were also placed above leading universities of the region in Egypt, Saudi Arabia, Qatar, and Lebanon, and their results appeared unrelated to other indicators.

It was probably not only Arab universities. Egor Yablokov of E-Quadrat Science and Education identified several universities whose reputation scores appear disproportionate to their overall scores in the THE world rankings.

When the 2025 WUR appeared in October of last year, there were more signs that something was amiss. Universities in the UAE, including Abu Dhabi University and Khalifa University, also in Abu Dhabi, did much better than in previous editions or in other rankings. There were other apparent anomalies. Al Ahliyya Amman University was ahead of the University of Jordan, the Lebanese American University higher than the American University of Beirut, the American University of the Middle East higher than Kuwait University, and Future University in Egypt and the Egypt-Japan University of Science and Technology higher than Cairo University and Al Azhar.

Then came the Arab University Rankings. It appears that THE had by then taken action against the "syndicate," whose members dropped significantly.

In addition to this, there are some trends that require explanation. Many universities in Saudi Arabia and the UAE have fallen significantly, while some in Jordan, Egypt, and Iraq have risen. Applied Science Private University, Jordan, has risen from 91-100 to 25, Al Ahliyya Amman University, also in Jordan, from 91-100 to 28, Ahlia University in Bahrain from unranked to 17th, Cairo University from 28 to 8, the University of Baghdad from 40 to 20, Mustansiriyah University, Baghdad, from 71-80 to 37, An Najah National University, Palestine, from 81-90 to 23, and Dhofar University, Oman, from 101-120 to 49.

So, THE has allocated a whopping 41% weighting to reputation, of which 23% is for research reputation, in its Arab University Rankings, compared with 25% for its Asian rankings and 33% for the Latin American rankings. It has introduced a new metric, collaboration within the Arab world, taken over the research and teaching survey from Elsevier, increased the number of respondents, organized prestigious summits, and offered a variety of consultancy arrangements. All of this would create an environment in which exclusive agreements were likely to flourish.

The extreme fluctuations resulting from THE's changes to the reputation indicators have seriously undermined THE's credibility, or at least they ought to have. It would be better for everybody if THE simply returned the administration of the reputation survey to Elsevier and stuck to event management, where it is unsurpassed.


 




Wednesday, October 23, 2024

Are Australian universities really on a precipice?

  


Times Higher Education (THE) recently published the latest edition of their World University Rankings (WUR), which contained bad news for Australian higher education. The country’s leading universities have fallen down the rankings, apparently because of a decline in their scores for research and teaching reputation and for international outlook, that is, international students, staff, and collaboration.

THE reported that Angel Calderon of RMIT said that the “downturn had mainly been driven by declining scores in THE’s reputation surveys” and warned that there was worse to come.

Australian universities have responded by demanding that the cap on international students be lifted to avoid financial disaster. Nobody seems to consider how the universities got to the point where they could not survive without recruiting researchers and students from abroad.

It is, however, a mistake to predict catastrophe from a single year’s ranking. Universities have thousands of faculty, employees, and students and produce thousands of patents, articles, books, and other outputs. If a ranking produces large-scale fluctuations over the course of a year, that might well be due to deficiencies in the methodology rather than any sudden change in institutional quality.

There are now several global university rankings that attempt to assess universities' performance in one way or another. THE's is not the only one, nor is it the best; in some ways it is the worst or nearly the worst. For universities to link their public image to a single ranking, or even a single indicator, especially one as flawed as THE's, is quite risky.

To start with, THE is very opaque. Unlike QS, US News, National Taiwan University, Shanghai Ranking, Webometrics, and other rankings, THE does not provide ranks or scores for each of the metrics that it uses to construct the composite or overall score. Instead, they are bundled together in five “pillars”. It is consequently difficult to determine exactly what causes a university to rise or fall in any of these pillars. For example, an improvement in the teaching pillar might be due to increased institutional income, fewer students, fewer faculty, an improved reputation for teaching, more doctorates, fewer bachelor degrees awarded, or some combination of these factors.
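To see why the bundling matters, here is a toy sketch (weights and scores invented for illustration, not THE's actual methodology) of how a weighted pillar score conceals which underlying metric moved:

```python
# Invented weights for a hypothetical "teaching" pillar; not THE's real ones.
WEIGHTS = {"teaching_reputation": 0.5, "income_per_staff": 0.25,
           "staff_student_ratio": 0.125, "doctorates_awarded": 0.125}

def pillar_score(metrics):
    """Weighted composite of the underlying metric scores."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

# Two quite different metric profiles yield the same pillar score, so the
# published pillar number alone cannot tell us which component changed.
university_a = {"teaching_reputation": 60, "income_per_staff": 80,
                "staff_student_ratio": 40, "doctorates_awarded": 40}
university_b = {"teaching_reputation": 68, "income_per_staff": 56,
                "staff_student_ratio": 56, "doctorates_awarded": 40}

print(pillar_score(university_a), pillar_score(university_b))  # both 60.0
```

Anyone reading only the pillar score has no way of telling these two profiles apart, which is precisely the complaint.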

Added to this are some very dubious results from the THE world and regional rankings over the years. Alexandria University, Aswan University, Brighton and Sussex Medical School, Anglia Ruskin University, Panjab University, Federico Santa Maria Technical University, Kurdistan University of Medical Sciences, and the University of Peradeniya have at one time or another been among the supposed world leaders for research quality as measured by citations. Leaders for industry income, which is claimed to reflect knowledge transfer, have included Anadolu University, Asia University, Taiwan, the Federal University of Itajubá, and Makerere University.

The citations indicator has been reformed and is now the research quality indicator, but there are still some oddities at its upper level, such as Humanitas University, Vita-Salute San Raffaele University, Australian Catholic University, and St George’s, University of London, probably because they participated in a few highly cited multi-author medical or physics projects.

It now seems that the reputation indicators in the THE WUR are producing results that are similarly lacking in validity. Altogether, reputation counts for 33%, divided between the research and teaching pillars. A truncated version of the survey results, listing the top 200 universities with exact scores for only the top fifty, was published earlier this year, and the full results were incorporated in the recent world rankings.

Until 2021, THE used the results of a survey conducted by Elsevier among researchers who had published in journals in the Scopus database. After that, THE brought the survey in-house and ran it itself. That may have been a mistake. THE is brilliant at convincing journalists and administrators that it is a trustworthy judge of university quality, but it is not so good at actually assessing such quality, as the above examples demonstrate.

After bringing the survey in-house, THE increased the number of respondents from 10,963 in 2021 to 29,606 in 2022, 38,796 in 2023, and 55,689 in 2024. It seems that this is a different kind of survey, since the new influx of respondents is likely to contain fewer researchers from countries like Australia. One might also ask how such a significant increase was achieved.

Another issue is the distribution of survey responses by subject. In 2021, a THE post on the reputation ranking methodology indicated the distribution of responses among academic fields and the proportions by which the responses were rebalanced. So, while 9.8% of responses came from computer science, this was scaled down to reflect the 4.2% share of computer science among international researchers. It seems that this information has not been provided for the 2022 or 2023 reputation surveys.

In 2017 I noted that Oxford’s reputation score tracked the percentage of THE survey responses from the arts and humanities, rising when there were more respondents from those fields and falling when there were fewer. So, the withholding of information about the distribution of responses by subject is also significant, since this could affect the ranking of Australian universities.

Then we have the issue of the geographical distribution of responses. THE has a long-standing policy of recalibrating its results to align with the number of researchers in each country, according to data submitted to and published by UNESCO.

There are good reasons to be suspicious of data emanating from UNESCO, some of which have been presented by Sasha Alyson.                               

But even if the data were totally accurate, there is still a problem that a university’s rise or fall in reputation might simply be due to a change in the relative number of researchers reported by government departments to the data crunching machines at THE.

According to UNESCO, the number of researchers per million inhabitants in Australia and New Zealand fell somewhat between 2016 and 2021. On the other hand, the number rose for Western Asia, Southern Asia, Eastern Asia, Latin America and the Caribbean, and Northern Africa.

If these changes are accurate, it means that some of Australia's declining research reputation is due to the increase in researchers in other parts of the world and not necessarily to any decline in the quality or quantity of its research.
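The mechanism can be illustrated with a toy sketch (all numbers invented): if each region's survey responses are reweighted so that the region's collective weight matches its share of the world's researchers, a university's score can fall even when its votes are completely unchanged.

```python
def reputation_score(votes, responses, researchers):
    """Weighted vote total: each region's responses are scaled so that the
    region's collective weight equals its share of world researchers."""
    total_researchers = sum(researchers.values())
    score = 0.0
    for region, v in votes.items():
        region_share = researchers[region] / total_researchers
        weight_per_response = region_share / responses[region]
        score += v * weight_per_response
    return score

# Invented numbers: votes for one Australian university, by region.
votes = {"Oceania": 80, "Asia": 20}
responses = {"Oceania": 100, "Asia": 200}      # total survey responses
r2016 = {"Oceania": 100_000, "Asia": 300_000}  # reported researcher counts
r2021 = {"Oceania": 95_000, "Asia": 400_000}   # Asia rises, Oceania dips

# Same votes, same responses: the score falls purely because of the
# change in reported researcher numbers.
print(reputation_score(votes, responses, r2016))
print(reputation_score(votes, responses, r2021))
```

The second score is lower than the first, even though nobody's opinion of the university changed.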

Concerns about THE's reputation indicators are further raised by looking at some of the universities that did well in the recent reputation survey.

Earlier this year, THE announced that nine Arab universities had achieved the distinction of reaching the top 200 of the reputation rankings, although none was able to reach the top 50, where exact scores and ranks were given. THE admitted that the reputation of these universities was regional rather than global. In fact, as some observers noted at the time, it was probably less than regional and primarily national.

It was not Arab universities' rising in the reputation rankings as such that was disconcerting. Quite a few leading universities from that region have begun to produce significant numbers of papers, citations, and patents and to attract the attention of international researchers, but those were not the universities doing so well in THE’s reputation rankings.

Then, last May, THE announced that it had detected signs of “possible relationships being agreed between universities”  and that steps would be taken, although not, it would seem, in time for the recent WUR.

More recently, a LinkedIn post by Egor Yablonsky, CEO of E-Quadratic Science & Education, reported that a few European universities had reputation scores significantly higher than their overall world ranking scores would suggest.

Another reason Australia should be cautious of the THE rankings and their reputation metrics is that Australian universities' ranks in the THE reputation rankings are much lower than they are for Global Research Reputation in the US News (USN) Best Global Universities or Academic Reputation in the QS World rankings.

In contrast, some French, Chinese and Emirati universities do noticeably better in the THE reputation ranking than they do in QS or USN.

 

Table: Ranks of leading Australian universities

University        THE reputation 2023    USN global research reputation 2024-2025    QS academic reputation 2025
Melbourne         51-60                  43                                          21
Sydney            61-70                  53                                          30
ANU               81-90                  77                                          36
Monash            81-90                  75                                          78
Queensland        91-100                 81                                          50
UNSW Sydney       126-150                88                                          43

It would be unwise to put too much trust in the THE reputation survey or in the world rankings where it has nearly a one-third weighting. There are some implausible results this year, and it stretches credibility that the American University of the Middle East has a better reputation among researchers than the University of Bologna, National Taiwan University, the Technical University of Berlin, or even UNSW Sydney. THE has admitted that some of these results may be anomalous, and it is likely that some universities will fall after THE takes appropriate measures.

Moreover, the reputation scores and ranks for the leading Australian universities are significantly lower than those published by US News and QS. It seems very odd that Australian universities are embracing a narrative that comes from such a dubious source and is at odds with other rankings. It is undeniable that universities in Australia are facing problems. But it is no help to anyone to let dubious data guide public policy.

So, please, will all the Aussie academics and journalists having nervous breakdowns relax a bit and read some of the other rankings, or just wait until next year, when THE will probably revamp its reputation metrics?

 

Thursday, October 10, 2024

Is something happening in China?

The National Taiwan University rankings have been overlooked by the Western media, which is a shame since they can provide useful and interesting insights. 

For example, there are indicators for articles in the SCIE and the SSCI of the Web of Science database over 11 years and over the current year, which for this year's edition is 2023. For both metrics, the top scorer, which in these cases is Harvard, is assigned a score of 100, and the others are calibrated accordingly.

If a university has a score for the one-year indicator that is significantly higher than the score for eleven years, it is likely that they have made significant progress during 2023 compared to the previous decade. Conversely, if a university does much better for the eleven-year indicator than for the current year, it could mean that it has entered a period of low productivity.

Looking at the current ranking, we notice that most leading US, British, and Australian universities are doing well for the current year, with the notable exceptions of the Los Angeles, Berkeley, San Diego, and Davis campuses of the University of California. Saudi universities also do well, but French universities are down for the year.

The big story here is that Chinese universities do much worse for the current year than the 11-year period. Here are the Article scores for five leading institutions:

Tsinghua University: 57.9 for eleven years and 47.2 for the current year
Zhejiang University: 64.7 and 55.4
Shanghai Jiao Tong University: 65.0 and 52.8
Peking University: 57.1 and 48.0
Sun Yat-Sen University: 54.1 and 47.1

And so on and so on.
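The comparison being made here is simple enough to sketch in a few lines, using the scores quoted above (all normalized so that Harvard = 100):

```python
# NTU Ranking article scores quoted above: (11-year score, current-year score).
scores = {
    "Tsinghua University": (57.9, 47.2),
    "Zhejiang University": (64.7, 55.4),
    "Shanghai Jiao Tong University": (65.0, 52.8),
    "Peking University": (57.1, 48.0),
    "Sun Yat-Sen University": (54.1, 47.1),
}

for name, (eleven_year, current_year) in scores.items():
    # A ratio below 1.0 means 2023 output lags the pace of the 11-year record.
    ratio = current_year / eleven_year
    trend = "slowing" if ratio < 1.0 else "accelerating"
    print(f"{name}: {ratio:.2f} ({trend})")
```

All five ratios come out below 1.0, which is the pattern the post is pointing at.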

So what is going on? I can think of several possible explanations. Firstly, we are seeing the temporary effect of the Covid restrictions, and soon we shall see a rebound.

Secondly, this is the beginning of a new period of decline for Chinese sciences, and we shall see a further decline in the next few years.

Thirdly, and I think most plausibly, China has lost interest in engagement with the West, whether this means partnerships with elite institutions, publications in scientific journals, or participation in surveys and rankings. This aligns with the abstention from the THE Impact Rankings, the lack of data submission to the TOP500 international ranking of supercomputers, and low scores in the QS sustainability rankings, which suggest a lack of interest in those metrics.

Whatever the reason, we should have a better idea over the next year or two.






 



Thursday, August 29, 2024

China vs the West: Snow’s ‘two culture’ theory goes global

 


Published today in University World News

In 1959, C P Snow, a British scientist, civil servant and novelist, created a stir with a lecture, “The Two Cultures and the Scientific Revolution”. The two cultures were led by natural scientists and literary intellectuals.

There was no doubt about where Snow stood with regard to the cultures. Scientists, he said, had “the future in their bones”, and he was disdainful of those who were ignorant of the basic laws of physics.

He believed that Britain’s stagnation after the Second World War was the result of the domination of public life by humanities graduates and the marginalisation of natural scientists.

Snow’s lecture was met with an equally famous ad hominem blast from the Cambridge literary critic, F R Leavis, which probably did Snow more good than harm. Leavis may, however, have had a prescient point when he talked about how science had destroyed the organic communities of the pre-industrial world.

At the time, his nostalgia was largely misplaced. Those who lived in the villages and farms of England had little reluctance about moving, as did my forebears, to the cotton mills of Derbyshire and the coal mines of South Wales, but, looking at a world where every human instinct has become digital media fodder, Leavis might have been onto something.

It now looks like we have something like Snow’s two cultures emerging at the global level with their centres in China, and in North America and Western Europe.


















Sunday, August 25, 2024

India and the THE Impact Rankings


The World Economic Forum (WEF), supposedly the voice of the global economic and political elites, recently published an article by Phil Baty, Chief Global Affairs Officer of Times Higher Education (THE), about Indian universities and their apparent progress towards world-class status, shown by their participation and performance in the THE Impact Rankings, which measure universities’ contributions to the UN’s Sustainable Development Goals (SDGs).

This is misleading and irresponsible. Participation, or even a high score, in the Impact Rankings, whether overall or for specific indicators, has little, if anything, to do with the ability of universities to provide instruction in academic and professional subjects or to pursue research, scholarship, and innovation. Indeed, it is difficult to see how many of the criteria used in the Impact Rankings are relevant to attaining the SDGs.

The article begins by quoting Philip Altbach, who said in 2012 that India was a world-class country without world-class universities. That in itself is an interesting comment. If a country can be world-class without world-class universities, then one wonders if such universities are really essential.

There is a bit of bait and switch here. Whatever Altbach meant by world-class in 2012, I doubt that he was referring to performance in meeting the UN’s SDGs.

Baty goes on to claim that Indian universities are improving, as shown by the number submitting data for the THE Impact Rankings, which assess universities' contribution to the SDGs: 125, compared with 100 from Türkiye and 96 from Pakistan, out of a total of 2,152 universities around the world.

That sounds impressive. However, submissions to the impact rankings and other THE products are voluntary, as THE often points out. There is no real merit involved in filling out the forms except perhaps showing a need to be ranked for something.

In any case, according to the uniRank site, there are 890 higher education institutions in India, 174 in TĂĽrkiye, and 176 in Pakistan. That means that the participation rate is about 14% for India, 57% for TĂĽrkiye, and 55% for Pakistan. India's participation in THE Impact Rankings is less than that of Pakistan and TĂĽrkiye, and in previous years, it has been much less than that of countries like Algeria, Iran, and Iraq.
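The participation rates above follow from simple division, using the submission counts from the article and uniRank's institution counts:

```python
# Impact Rankings submissions (from the article) vs. total higher education
# institutions per country (uniRank figures quoted above).
submissions = {"India": 125, "Türkiye": 100, "Pakistan": 96}
institutions = {"India": 890, "Türkiye": 174, "Pakistan": 176}

for country in submissions:
    rate = submissions[country] / institutions[country]
    print(f"{country}: {rate:.0%}")
```

India's rate is roughly a quarter of Türkiye's or Pakistan's, which is the opposite of the impression the raw counts give.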

Nor does gaining a high score in the Impact Rankings tell us very much. Universities are ranked on their four best scores. Many universities simply submit data for five or six goals and just ignore the others, for which their actual contribution might well be zero or negative.
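A minimal sketch of the scoring rule as described here (a simplification of THE's actual weighting, with hypothetical SDG scores): only the four best scores count, so skipped goals cost nothing.

```python
def impact_score(sdg_scores):
    """Mean of a university's four best SDG scores, per the simplified
    rule described in the post (assumes at least four scores submitted)."""
    best_four = sorted(sdg_scores.values(), reverse=True)[:4]
    return sum(best_four) / len(best_four)

# A university that submits data for only five goals is judged entirely
# on its strongest four; unsubmitted goals simply do not register.
print(impact_score({"SDG3": 92.0, "SDG6": 88.5, "SDG7": 85.0,
                    "SDG4": 80.0, "SDG17": 76.0}))
```

Under this rule, a zero or negative contribution to the twelve goals a university never reports is invisible in its final score.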

These rankings rely heavily on data submitted by universities. Even if everybody concerned with the collection, transfer, and processing of information is totally honest and competent, there are often immense obstacles to data curation confronting universities in the Middle East, Africa, and Latin America. These rankings may be, in effect, little more than a measure of the ambitions of university leaders and the efficiency of their data analysts.

Moreover, much of the progress toward these goals is measured not by hard, verifiable data but by targets, programs, initiatives, partnerships, facilities, policies, measures, and projects that are subject to an opaque and, one suspects, sometimes arbitrary validation process.

Also, do the criteria measure progress toward the goals? Does producing graduates in law, civil enforcement, and related fields really contribute to peace, justice, and strong institutions? Does a large number of graduates qualified to teach say much about the quality of education?

It might be commendable that a minority of Indian universities, albeit proportionately less than many other countries, have signed up for these rankings and that a few have done well for one or two of the SDGs. It is helpful to know that JSS Academy of Higher Education and Research is apparently a world beater for good health and well-being, Shoolini University of Biotechnology and Management for clean water and sanitation, and Saveetha Institute of Medical and Technical Sciences for affordable and clean energy, but does this really compensate for the pervasive perceived mediocrity of Indian higher education?

The validity of the Impact Rankings can be checked by comparing them with the UI GreenMetric Rankings, which have measured universities' commitment to environmental sustainability since 2010. Some of the indicators here, such as Energy and Climate Change and Water, are similar, although not identical, to those in the Impact Rankings, but there is almost no overlap between the best-performing universities in the two rankings. No doubt THE would say their rankings are more sophisticated but still, even the least cynical observer might wonder a bit.

The reality is that Indian universities have consistently underperformed in the various global rankings, and this is, on balance, a fairly accurate picture. It is probable that current reforms will bring widespread change, but that is still something on the horizon.

Here, THE has not been helpful. Over the last few years, it has repeatedly exaggerated the achievements of a few Indian institutions that have risen in their world or regional rankings, often due to the dysfunctional citations indicator. These include Panjab University, Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, JSS Academy of Higher Education and Research, and the Ropar and Indore Institutes of Technology. This has caused resentment among leading Indian institutions, who are perplexed by such relatively marginal places zooming ahead of the highly reputable Indian Institutes of Technology of Bombay, Madras, and Delhi.

The article also ignores the boycott of the THE World University Rankings by the leading Indian Institutes of Technology (IITs), partly because of the rankings' opacity: all the metrics are now bundled into pillars, so it is next to impossible to figure out what is causing movement in the rankings without paying THE for consultancy and benchmarking.

Indian universities have not performed well in global rankings. In the Shanghai Rankings, the best performer is the Indian Institute of Science in the 401-500 band, down from 301-400 in 2023. In the CWTS Leiden Ranking, the leading university is the Indian Institute of Technology Kharagpur in 284th place. Compared to China, Japan, and South Korea, India’s performance is rather tepid. The occasional show of excellence with regard to one or two of the SDGs is hardly sufficient compensation.

The current reforms may put Indian research and higher education on track, but India’s problems go deeper than that. There is widespread evidence that the country is lagging far behind in primary and secondary education, and ultimately, that will matter much more than the exploits of universities on the way to meeting sustainability goals.

 

Wednesday, August 21, 2024

It seems that self-affirmation isn't such a big deal

 

A few years ago, I wrote about a massively cited study in Science, supposedly a leading scientific journal, that claimed to significantly reduce the racial high school achievement gap. The idea was that having low-achieving students write about values important to themselves would start a recursive process leading to an improvement in their relative academic performance. The positive effect of this self-affirmation intervention was conveniently confined to African-American students, which, I suspect, contributed to the paper's acceptance.

Having once taught English in classrooms, I was sceptical that 15 minutes of writing could have such a remarkable impact, and I wondered whether the abundance of resources, support, and skills in the school under study might have compromised the anonymity of the subjects.

Now it seems that the study was "seriously underpowered" and "always obviously wrong".

How many more politically convenient studies will turn out to be wrong or perhaps even worse than wrong? 



Friday, August 02, 2024

Forget about the Euros, this is really serious

We are told that the failure at the UEFA final was a tragedy for England. Perhaps, but something else happened early in July that should have caused some debate but passed almost unnoticed, namely the publication of the latest edition of the CWTS Leiden Ranking.

The release of the Times Higher Education (THE) World University rankings and, to a lesser extent, the global rankings from Shanghai, QS, and US News (USN) are often met with fulsome praise from the media and government officials when national favourites rise in the rankings and lamentations when they fall, but other rankings, often much more reliable and rigorous, are largely ignored.

This is partly because the THE and QS rankings are dominated by American and British universities. Oxford, Cambridge, and Imperial College London are in the top ten in the overall tables of both rankings. This year there was a lot of media talk about Imperial moving ahead of Cambridge and Oxford into second place in the QS rankings, just behind MIT. According to these rankings, British universities are on top of the world, and criticism from journalists or politicians would surely be churlish in the extreme.

It would, however, be a mistake to assume that the big-brand rankings are objective judges of academic or any other sort of merit. They are biased towards UK universities in a variety of obvious and subtle ways. QS, THE, and USN all include surveys of established academics, and the Shanghai Rankings include Nobel and Fields award winners, some of whom are long gone or retired. THE has three metrics based on income. THE, USN, and QS give more weight to citations than to publications, loading the dice for older and better-funded researchers.

It seems that British universities have complacently accepted the verdict of these rankings and appear unwilling to consider that they are doing anything less than perfect. When the Sunak government proposed some vague and bland changes, the Chief Executive of the London Higher Group of Institutions complained that it was "beyond belief" that the government should have the King speak negatively of the "world-leading higher education and research sector."

It is perhaps time to look at another ranking, one produced by the Centre for Science and Technology Studies (CWTS) at Leiden University. This provides data on publications with various optional filters for subject group, country, period, and fractional counting. There are also rankings for international and industrial collaboration, open-access publications, and gender equity in research.

CWTS does not, however, publish overall rankings, sponsor spectacular events in prestigious settings, or offer consultations and benchmarking services for non-trivial sums. Consequently, it is usually neglected by the media, university heads, or the grandees of the world economy gathered at WEF forums and the like.

Turning to the latest edition, and starting with the default metric, publications in the Web of Science over the period 2019-2022, we see that Zhejiang University has now overtaken Harvard and moved into first place. In the next few years, it is likely that other Chinese universities such as Fudan, Peking, and Tsinghua will join Zhejiang at the peak.

But the most interesting part of the Leiden Ranking is the steady decline of British universities. Oxford is now 25th in the publications table, down from 14th in 2009-2012. That is not too bad, but it is rather different from the latest QS world ranking, where Oxford is third, US News Best Global Universities, where it is fourth, or THE, where it is first. Oxford is well behind several Chinese universities and also behind, among others, the University of Sao Paulo, Seoul National University, and the University of Pennsylvania.

Of course, you could say that this is a crude measure of research activity and that if we look at other metrics, such as publications in the top 10% and the top 1% of journals, then, yes, Oxford does better. The problem is that the high-quality metrics are usually lagging indicators so we can expect Oxford to start declining there also before too long.

When we look at the broad subject tables for publications, there is further evidence of gradual decline. For Mathematics and Computer Science, Oxford is 63rd, behind Purdue University, Beijing University of Technology, and the University of New South Wales. In 2009-2012 it was 50th.

For Physical Sciences and Engineering, it is 72nd, behind the University of Tehran, Texas A&M, and Lomonosov Moscow State University. In 2009-2012 it was 29th.

It is 64th in Life and Earth Sciences, behind Lanzhou University, the Swedish University of Agricultural Sciences, and Colorado State University. In 2009-2012 it was 42nd.

For Biomedical and Health Sciences, it is 39th, behind Duke, the University of British Columbia, and Karolinska Institutet; in 2009-2012 it was 27th.

Finally, when it comes to the Humanities and Social Sciences, Oxford remains at the top. It is fourth in the world, just as it was in 2009-2012.

A glance at some middling British institutions shows the same picture of steady relative decline. Between 2009-2012 and 2019-2022 Reading went from 489th to 719th, Liverpool from 233rd to 302nd, and Cardiff from 190th to 328th. 

It is perhaps unfair to judge complex institutions by a single metric. Unfortunately, much of science, scholarship, and everyday life rests on assigning numbers that inevitably ignore some of the fine detail of complex phenomena.

Also, such data does not tell us the full story about teaching and learning, but there is plenty of anecdotal evidence that British universities are not doing so great there either. 

It seems that the big rankings are exaggerating the merits of British higher education. It is time to take a look at some of the global rankings produced in places like the Netherlands (Leiden Ranking), Spain (SCImago), Turkiye (URAP), Georgia (RUR), and Taiwan (NTU Rankings).









Sunday, July 28, 2024

Are British Universities Really Underfunded?

I noticed this on LinkedIn recently. An item by Phil Baty, Chief Global Affairs Officer at Times Higher Education (THE), claims that British universities are seriously underfunded and that their world-class achievements are endangered.

He reports that a brilliant data analyst has revealed that inflation has eroded the value of the tuition fees that UK universities are allowed to charge and that costs have dramatically increased. 

Then we have a graph from a THE data guru that compares British university performance on the 18 metrics used in the current THE world rankings to that of their international peers. The UK is well ahead of the average of the top 500 universities in the most recent world university rankings for the international outlook indicators, and significantly ahead for research strength, field-weighted citations, and publications. It is slightly ahead for research excellence, research reputation, research influence, and patents.

However, when it comes to institutional income, research income, and industry income, British universities are apparently way behind the rest of the world. So, it seems that THE has conclusively demonstrated that UK universities are seriously short of money.

But there are a few things that need to be considered.

First, the THE income indicators are all divided by the number of academic staff. To do well in these measures, a university could have substantial income, or at least report that it did, or it could reduce the number of faculty reported.

In other words, a university that decided to spend its money recruiting teaching and/or research staff would fall in the THE rankings. If it sacked a lot of teachers and researchers, it would be rewarded with a significant improvement. You might think that is a bit bonkers, but that is the unintended consequence of the THE methodology. I do not know which of these applies to British universities, in general or in particular, but it would be interesting to see a breakdown of the data.
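The distortion described above is easy to demonstrate with a toy calculation. The figures here are invented for illustration; THE's actual normalisation procedure is not public:

```python
# Illustration with made-up numbers: a per-staff income metric
# rewards a university for reporting fewer academic staff.
def income_per_staff(income: float, staff: int) -> float:
    return income / staff

before = income_per_staff(200_000_000, 2_000)  # 100,000 per head
after = income_per_staff(200_000_000, 1_600)   # same income, 20% fewer staff
print(before, after)
assert after > before  # the metric improves after sacking staff
```

The same income spread over fewer reported staff produces a higher score, which is exactly the perverse incentive at issue.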

Also, remember that the income indicators are based on data submitted by institutions. It would be unwise to assume that these are completely valid and accurate. A few years ago Alex Usher of HESA published an article showing that there were some serious problems with THE's industry income indicator. I am not sure whether it has improved since then.

Also, we should note that 55 UK universities are in the current THE world top 500. According to Webometrics, there are 31,657 universities worldwide and 355 in the UK. THE is, in effect, claiming that the top 15.49% of British universities, according to THE's criteria, are underfunded compared to the top 1.58% of world universities in general. 
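The percentages above follow directly from the Webometrics counts:

```python
# Back-of-the-envelope check of the shares quoted above.
uk_in_top_500, uk_total = 55, 355            # UK universities in THE's top 500 / counted by Webometrics
world_in_top_500, world_total = 500, 31_657  # the same calculation worldwide
print(round(uk_in_top_500 / uk_total * 100, 2))        # 15.49
print(round(world_in_top_500 / world_total * 100, 2))  # 1.58
```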

Before signing off, it is worth noting that the graph is also instructive in showing that the rankings are massively biased toward British universities. Consider the weightings of the various metrics.

The International Outlook pillar has a 7.5% weighting; research quality, that is, citations, 30%; teaching and research reputation, 33%; and publications per staff, 5.5%. These are all criteria on which British higher education does better than the world average.

In contrast, the three income metrics, where UK universities do badly, are given weightings of 2.5%, 5.5%, and 2% respectively. 

If THE decided to shift some of its weighting from reputation to income, or to doctoral education, where the UK sector also does badly, the sector's THE ranks would fall very noticeably.
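The asymmetry in the weightings quoted above can be tallied directly:

```python
# Weightings as quoted in this post (percent of the overall THE WUR score).
uk_strong = {"International Outlook": 7.5,
             "Research Quality (citations)": 30,
             "Teaching and Research Reputation": 33,
             "Publications per Staff": 5.5}
uk_weak = {"Institutional Income": 2.5,
           "Research Income": 5.5,
           "Industry Income": 2}
print(sum(uk_strong.values()))  # 76.0 points where the UK beats the average
print(sum(uk_weak.values()))    # 10.0 points where it lags
```

Roughly three-quarters of the total score sits on criteria where the UK is ahead, and only a tenth on the income metrics where it is behind.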







Sunday, July 07, 2024

Problems with the THE Reputation Rankings

THE has spent a lot of time and words proclaiming that it is trusted by administrators, students, sponsors, and the like. Perhaps it is, but whether it deserves to be is another matter. A recent article in THE suggests that THE has made a mess of its reputation rankings and is scrambling to put things right.

Until 2021, THE used Elsevier to conduct its teaching and research reputation survey. The 2020-21 survey received 10,963 responses and was calibrated to ensure proper representation of regions and subjects.

The survey was brought in-house in 2022, and since then, the number of responses has increased substantially: to 29,606 in 2022, 38,796 in 2023, and 55,689 in 2024.

When the number of responses increases so dramatically, one should wonder exactly how this was achieved. Was it by sending out more surveys, improving the response rate, or institutional efforts to encourage participation? 

When the results were announced in February, THE declared that a number of Arab universities had achieved remarkable results in the reputation survey. THE conceded that this stellar performance was largely a regional affair that did not extend to the rest of the world. 

But that was not all. Several Arab universities have been making big strides and improving citation, publication, and patent scores: Cairo University, King Abdullah University of Science and Technology, UAE University, and Qatar University. 

The universities getting high scores in the THE survey were among the less well-known institutions in the Arab region and had received much lower reputation scores in the US News and QS rankings. However, they are likely to do well in the forthcoming THE world and Arab university rankings.

THE has now admitted that some universities were encouraging researchers to vote for their own institutions and that there may have been "agreed relationships" between universities. THE is now talking about rewarding respondent diversity, that is getting support from more than just a few institutions.

It is regrettable that THE did not notice this earlier. If it does encourage such diversity, then quite a few universities will suffer dramatic falls in the rankings this year and next.

Anyway, THE could do a few things to improve the validity of its reputation survey: eliminate self-voting altogether, give a higher weighting to votes from other countries, as QS does, add a separate ranking for regional reputation, and combine scores across several years.

The problems with the reputation metrics seem to have begun with THE starting its own survey. It would be a good idea to go back to letting Elsevier do the survey. THE is undeniably brilliant at event management and public relations, although perhaps not jaw-droppingly so. However, it is not so good at rankings or data processing.

  


Thursday, June 13, 2024

Imperial Ascendancy


The 2025 QS World University Rankings have just been announced. As usual, when there are big fluctuations in scores and ranks, the media are full of words like soaring, rising, plummeting, and collapsing. This year, British universities have been more plummeting than soaring, and this has generally been ascribed to serious underfunding of higher education by governments who have been throwing money at frivolities like childcare, hospitals, schools, roads, and housing.

There has been a lot of talk about Imperial College London rising to second in the world and first in the UK, ahead of Harvard, Oxford, and Cambridge. Imperial's president, quoted in Imperial News, spoke about quality, commitment, and "interrogating the forces that shape our world."

The article also referred to the university's achievements in the THE world rankings, the Guardian University Guide, and the UK's Research and Teaching Excellence Frameworks. It does not mention that Round University Ranking has had Imperial first in the UK since 2014.

So what exactly happened to propel Imperial ahead of Harvard, Oxford, and Cambridge? Perhaps commitment and the interrogation of forces were there in the background, but the more proximate causes were the methodological changes introduced by QS last year. There have been no further changes this year, but the QS rankings do seem to have become more volatile.

In 2023, QS introduced three new indicators. The first is the International Research Network, which measures the breadth rather than the quantity of international research collaborations. This favored universities in English-speaking countries and led to a reported boycott by South Korean universities. 

That boycott does not seem to have done Korean universities any harm since many of them have risen quite significantly this year.

QS has also added an Employment Outcomes metric that combines graduate employment rates and an alumni index of graduate achievements scaled against student numbers. 

Then there is a sustainability indicator based on over fifty pieces of data submitted by institutions. Some reputable Asian universities get low scores here, suggesting that they have not submitted data or that the data has been judged inadequate by the QS validators.

Imperial rose by exactly 0.7 points between the 2024 and the 2025 world rankings, while Harvard, Oxford, and Cambridge all fell. Its score declined for three indicators, Faculty Student Ratio, Citations per Faculty, and International Students, and remained unchanged for International Faculty.

The improvement in the weighted score of the five indicators is listed below:

Employment Outcomes: 0.52
Sustainability: 0.265
Academic Reputation: 0.15
International Research Network: 0.035
Employer Reputation: 0.015

Imperial has improved on all of the new indicators, very substantially for Employment Outcomes and Sustainability, and also on the reputation indicators. I suspect that the Imperial ascendancy may not last long as its peers, especially in Asia, pay more attention to the presentation of employability and sustainability data.
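As a quick arithmetic check, the five weighted gains listed above sum to rather more than Imperial's overall rise of 0.7 points; the difference is what the four declining indicators cost it:

```python
# Summing the weighted gains listed above.
gains = {"Employment Outcomes": 0.52,
         "Sustainability": 0.265,
         "Academic Reputation": 0.15,
         "International Research Network": 0.035,
         "Employer Reputation": 0.015}
total_gain = round(sum(gains.values()), 3)
print(total_gain)                  # 0.985
print(round(total_gain - 0.7, 3))  # 0.285 lost on the declining indicators
```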





Saturday, May 11, 2024

Hungarian universities, this is probably not a good idea

Times Higher Education (THE) has informed us that it has reached a "groundbreaking" agreement with the Hungarian Ministry of Culture and Innovation.

It seems that THE will analyse Hungary's higher education system and benchmark with successful higher education hubs according to the "gold standard" world rankings and provide advice and "unparalleled data insights" to Hungarian universities. The cost of this exercise is not mentioned, but it is unlikely to be trivial.

The Hungarian State Secretary for Innovation and Higher Education referred to the presence of Hungarian universities in the THE rankings. Eleven are now in the THE world rankings whereas five years ago seven were listed. 

That sounds very impressive, but wait a minute.

THE tells us that in the 2018-19 rankings there were 1,258 universities, of which 1,250 were ranked, and that in 2023-24 there were 2,671, of which 1,906 were ranked. It would be remarkable if the number of Hungarian universities listed did not increase, and it is no big deal that it did.
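Indeed, if Hungary had simply held on to its 2018-19 share of the ranked pool, the growth of the rankings alone would predict almost exactly the current count:

```python
# If Hungary kept its 2018-19 share of ranked universities,
# how many would we expect in 2023-24?
ranked_2019, ranked_2024 = 1250, 1906
hungarian_2019 = 7
expected = hungarian_2019 * ranked_2024 / ranked_2019
print(round(expected, 1))  # 10.7 -- close to the 11 actually listed
```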

What is relevant is the number of universities in the top thousand in each edition. For Hungary, it was six in the 2019 rankings and three in 2024. If the THE rankings mean anything, then the quality of Hungarian universities has apparently declined over the last five years.

Hungarian universities, however, have generally been drifting downwards in most rankings, not because they are getting worse in absolute terms but because of the steady rise of Asian, especially Chinese, research-based universities. 

Moreover, the THE world rankings rate Hungarian universities worse than any other global ranking. The latest edition of the THE World University Rankings (WUR) shows three in the world's top 1000. There are five in the top 1000 in the latest QS rankings, four in the Shanghai Rankings, five in the Leiden Ranking, four in US News Best Global Universities, four in URAP, five in CWUR, six in Webometrics, and eight in RUR.

The pattern is clear: THE now consistently underestimates the performance of Hungarian universities compared to other rankers. Not only that, but some Hungarian universities have dropped significantly in the THE rankings. Eotvos Lorand University has gone from 601-800 to 801-1000, Pecs University from 601-800 to 1001-1200, and Budapest University of Technology and Economics from 801-1000 to 1201-1500.

On the other hand, a couple of Hungarian universities, Semmelweis and Debrecen, have risen through participation in multi-author multi-citation projects.

It is difficult to see what benefit Hungary will get from paying THE for insights, reports, and targets from an organization that has limited competence in the assessment and analysis of academic performance. Seriously, what insights could you get from an organization that in recent years has declared Anglia Ruskin University to be the world leader for research impact, Anadolu University for knowledge transfer, and Macau University of Science and Technology for International Outlook?

It is true that THE is outstanding in public relations and event management, and the universities will no doubt benefit from high praise at prestigious events and receive favourable headlines and awards. It is hard, though, to see that THE is able to provide the knowledgeable and informed advice that universities need to make difficult decisions in the coming years.



Sunday, April 07, 2024

What happens to those who leave THE?

Times Higher Education (THE) appears to be getting rather worried about leading universities such as Rhodes University, University of Zurich, Utrecht University, and some of the Indian Institutes of Technology boycotting its World University Rankings (WUR) and not submitting data.

Thriving Rankings?

We have seen articles about how the THE rankings are thriving, indeed growing explosively. Now, THE has published a piece about the sad fate that awaits the universities that drop out of the WUR or their Impact Rankings. 

Declining Universities?

An article by two THE data specialists reports that the 611 universities that remained in the THE world rankings from 2018 to 2023 retained, on average, a stable rank in the THE reputation ranking. The 16 that dropped out saw a significant decline in their reputation ranks, as did 75 that are described as never having been in the WUR.

The last category is a bit perplexing. According to Webometrics, there are over 30,000 higher education institutions in the world and nearly 90,000, according to Alex Usher of HESA. So, I assume that THE is counting only those that got votes or a minimum number of votes in their reputation ranking. 

We are not told who the 75 never-inners or the 16 defectors are, although some, such as six Indian Institutes of Technology, are well known, so it is difficult to check THE's claims. However, it is likely that an institution that boycotted the THE WUR would also discourage its faculty from participating in the THE academic survey, which would automatically tend to reduce its reputation scores since THE allows self-voting.

Also, we do not know if there have been changes in the weighting for country and subject and how that might modify the raw survey responses. A few years ago, I noticed that Oxford's academic reputation fluctuated with the percentage of survey responses from the humanities. It is possible that adjustments like that might affect the reputation scores of the leavers. 

The opacity of THE's methodology and the intricacies of its data processing system mean that we cannot be sure about THE's claim that departure from the world rankings would have a negative impact. In addition, there is always the possibility that universities on a downward trend might be more likely to pull out because their leaders are concerned about their rankings, so the withdrawal is a result, not the cause of the decline. 

We should also remember that reputation scores are not everything. If a decline in reputation was accompanied by an improvement in other metrics, it could be a worthwhile trade.

What happened to the IITs in the THE WUR?

Fortunately, we can check THE's claims by looking at a group of institutions from the same country and with the same subject orientation. In the 2019-20 world rankings, twelve Indian Institutes of Technology were ranked. Then six -- Bombay, Madras, Delhi, Kanpur, Kharagpur, Roorkee -- withdrew from the WUR, and six -- Ropar, Indore, Gandhinagar, Guwahati, Hyderabad, Bhubaneswar -- remained, although two of these withdrew later.

So, let's see what happened to them. First, look at the overall ranks in the WUR itself and then in Leiden Ranking, the Shanghai Rankings (ARWU), and Webometrics.

Looking at WUR, it seems that if there are penalties for leaving THE, the penalties for remaining could be more serious. 

Among the IITs in the 2020 rankings, Ropar led in the 301-350 band, followed by Indore in the 351-400 band. Neither is as reputable in India as senior IITs such as Bombay and Madras; they owed those ranks to remarkable citation scores, although they did much less well on the other pillars. This anomaly was part of the reason for the six leavers' departure.

Fast-forward to the 2024 WUR. IIT Ropar has fallen dramatically to 1001-1200; Indore, which had fallen from 351-400 to 601-800 in 2023, has opted out; and Gandhinagar has fallen from 501-600 to 801-1000. Bhubaneswar, which was in the 601-800 band in the 2020 WUR, fell to 1001-1200 in 2022 and 2023 and was absent in 2024. Guwahati and Hyderabad remained in the 601-800 band.

Frankly, it looks like staying in the THE WUR is not always a good idea. Maybe their THE reputation improved but four of the original remaining IITs suffered serious declines.

IITs in Other Rankings

Now, let's examine the IITs' performance in other rankings. First, the total publications metric in Leiden Ranking. Between 2019 and 2023, four of the six early leavers rose, and two fell. The late leavers, Hyderabad and Indore, were absent in 2019 and were ranked in the 900s in 2023. Remainer Guwahati rose from 536th in 2019 to 439th in 2023.

For Webometrics, between 2019 and 2024, all 12 IITs went up except for Bombay.

Finally, let's check the overall scores in the QS WUR. Between 2021 and 2024, four of the six leavers went up, and two went down. Of the others, Guwahati went up, and Hyderabad went down.

So, looking at overall ranking scores, it seems unlikely that boycotting THE causes any great harm, if any. On the other hand, if THE is tweaking its methodology or something happens to a productive researcher, staying could lead to an embarrassing decline.

IITs' Academic Reputation Scores

Next, here are some academic reputation surveys. The US News Best Global Universities ranking is not as helpful as it could be since it does not provide data from previous editions, and the Wayback Machine does not seem to work very well. However, the Global Research Reputation metric in the most recent edition is instructive.

The six escapees had an average rank of 272, ranging from 163 for Bombay to 477 for Roorkee.

The remainers' ranks ranged from 702 for Guwahati to 1710 for Bhubaneswar. Ropar was not ranked at all. So, leaving THE does not appear to have done the IITs any harm in this metric.

Turning to the QS WUR academic reputation metric, the leavers' ranks in the academic survey range from 141 for Bombay to 500 for Roorkee. They have all improved since 2022. The best-performing remainer is Guwahati in 523rd place. Ropar and Gandhinagar are not ranked at all. Bhubaneswar, Indore, and Hyderabad are all at 601+.

Now for Round University Ranking's reputation ranking. Four of the six original leavers were there in 2019. Three fell by 2023 and Delhi rose. Two, Bombay and Roorkee, were absent in 2019 and present in 2023.

This might be considered evidence that leaving THE leads to a loss of reputation. But five of the original remainers are not ranked in these rankings, and Guwahati is there in 2023 with a rank of 417, well below that of the six leavers. 

There is then scant evidence that leaving WUR damaged the academic reputations of those IITs that joined the initial boycott, and their overall rankings scores have generally improved.

On the other hand, for IITs Ropar and Bhubaneswar remaining proved disastrous.  

IITs and Employer Reputation

In the latest GEURS employer rankings, published by Emerging, the French consulting firm, there are four exiting IITs in the top 250 -- Delhi, Bombay, Kharagpur, and Madras -- and no remainers.

In the QS WUR Employer Reputation indicator, the boycotters all perform well. Bombay is 69th and Delhi is 80th. Of the six original remainers two, Ropar and Gandhinagar, were not ranked by QS in their 2024 WUR. Three were ranked 601 or below, and Guwahati was 381st, ahead of Roorkee in 421st place.

Conclusion

Looking at the IITs, there seems to be little downside to boycotting the THE WUR, and there could be some risk in staying, especially for institutions that have over-invested in specific metrics. It is possible that the IITs are atypical, but so far there seems little reason to fear leaving the THE WUR. A study of the consequences of boycotting the THE Impact Rankings is in preparation.






Saturday, March 16, 2024

THE's Big Bang Ranking

 


Another day, another ranking. 

Times Higher Education (THE) has published a "bang for the bucks" ranking.

THE is taking the scores for institutional income, research income, and income from industry and comparing them with the scores "for research, teaching, and working with industry." This, presumably, is supposed to reveal those universities that are able to process their funding efficiently and turn it into publications, citations, patents, doctorates, and survey responses.

There are some methodological issues here. It is not clear exactly how the income scores are calculated. Is it from the raw monetary data that THE collects from universities, or has it been through the THE standardization and normalization machine? Is there some sort of weighting or just an average of the three income categories? 

Also, there is a chart that suggests that all the scores are counted except for the financial metrics, but the text implies that the international pillar is not counted as part of the bang that THE purports to measure.

Another issue is that the financial data in the THE rankings refer to the year two years before the date of publication, whereas citation and publication data come from a five- or six-year period before the ranking is published. In effect, THE is claiming that its favored schools have a remarkable ability to send money back in time to the years when research proposals were written, papers published, and citations recorded.

THE lists ten countries as good bang producers, starting with the UK and including Pakistan and Egypt. It does not list China, South Korea, Canada, or Australia, which should make us a little suspicious.

Then, looking at the list of twenty universities with the biggest bangs, we see a few familiar names, including Brighton and Sussex Medical School, Babol Noshirvani University of Technology, and Vita-Salute San Raffaele University, which have appeared in this blog before because they received remarkably high scores for citations and consequently did well in the overall rankings. Some, including Quaid-i-Azam University, COMSATS University, Auckland University of Technology, Government College University Faisalabad, and University College London, have contributed to citation-rich multi-contributor papers from the Global Burden of Disease studies or the Large Hadron Collider project. Others, such as Shoolini University of Biotechnology and Management Sciences and Malaviya National Institute of Technology, have scores for research quality that are disproportionate to those for research environment or teaching. It looks as though a lot of THE's Big Bang simply consists of getting masses of citations.

It is also possible that universities might obtain a good bang for the buck score by underreporting their income, perhaps accidentally, which would help here, although not in conventional rankings. This has happened to Trinity College Dublin and probably to Harvard, although the latter case went unnoticed by almost everyone. Probably, the very high scores for Sorbonne University and Universite Paris Cite result from the special features of the French funding system.

I suspect quite a few institutions will take this ranking seriously, or pretend to, and use it as a pretext to try to obtain more largesse from increasingly impoverished states.

It would seem that THE is engaged in a public relations exercise for upmarket British, and perhaps US and continental, universities. These are doing all sorts of amazing, brilliant, and exciting things for which they receive insufficient funds from cheapskate governments. Just imagine what they could do if they got as much money as Chinese universities do.