
Saturday, June 14, 2025

Substack Posts

 


Faculty Student Ratio: Who are you Going to Believe?



Citations and Ranking


What Happened in the 2024-2025 QS World Rankings?

Thinking the unthinkable


Wednesday, April 09, 2025

The Decline of Harvard

 

Published in Substack 08 April 2025

Quite a few stories have come out of the Ivy League about how standards are collapsing. I used to think this was just the perennial lament of teachers everywhere that today’s students are inferior to those of their day. But the stories are coming faster these days, and they seem consonant with declining cognitive skills throughout the West, a general disengagement by students, rising rates of plagiarism, the rejection of science and liberal values, and the ardent embrace of extremist ideologies.

Perhaps the most striking story was Harvard’s introduction of remedial math courses for some of its students, a consequence of suspending the requirement to submit SAT and ACT scores after the COVID-19 outbreak.

I suspect that the problem may go deeper than that, and remedial courses at Harvard and other elite schools may become permanent, although probably presented as enrichment programs or something like that.

But this is all anecdotal. Global rankings can provide more systematic evidence, and it shows that Harvard is steadily declining relative to international universities and even to its peers in the USA.

Here is a prediction. This year, next year, or maybe the year after, Harvard will cede its position as the top university in the world in the publications metric in the Shanghai Rankings to Zhejiang University in Hangzhou.

The Shanghai Rankings, officially known as the Academic Ranking of World Universities (ARWU), have six indicators: Nobel Prizes and Fields Medals won by alumni, the same awards won by faculty, papers in Nature and Science, Highly Cited Researchers, publications in the Science Citation Index Expanded and the Social Science Citation Index, and Productivity per Capita, which is the sum of those five scores divided by the number of faculty.
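A minimal sketch of the per-capita calculation may be useful here. The weights follow ARWU's published methodology; the function and the example scores are invented for illustration only.

```python
# Sketch of ARWU's "Productivity per Capita" (PCP) indicator as described
# above: the weighted sum of the other five indicator scores divided by the
# number of full-time-equivalent academic staff. Weights follow ARWU's
# published methodology; everything else here is illustrative.

ARWU_WEIGHTS = {
    "alumni_awards": 0.10,   # alumni winning Nobel Prizes or Fields Medals
    "faculty_awards": 0.20,  # staff winning Nobel Prizes or Fields Medals
    "highly_cited": 0.20,    # Highly Cited Researchers
    "nature_science": 0.20,  # papers published in Nature and Science
    "publications": 0.20,    # papers indexed in SCIE and SSCI
}

def productivity_per_capita(scores: dict, fte_staff: float) -> float:
    """Weighted sum of the five indicator scores per unit of academic staff.

    ARWU then rescales the result so that the best-performing institution
    receives a score of 100.
    """
    weighted = sum(ARWU_WEIGHTS[k] * scores[k] for k in ARWU_WEIGHTS)
    return weighted / fte_staff
```

Because the numerator is divided by staff numbers, a university with perfect scores on all five indicators but a very large faculty can trail a smaller institution with lower raw scores, which is why Caltech, not Harvard, has always led this indicator.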

When they began, the Shanghai Rankings placed Harvard in first place overall and for every indicator except productivity, where Caltech has always held the lead. In 2022, however, Harvard lost its lead to Princeton for faculty winning Nobel and Fields awards. The coming loss of supremacy in publications would leave Harvard leading in just three of the six indicators.

This is only one sign of Harvard’s decline; other rankings tell a similar story. Back in 2010, when QS started producing independent rankings, Harvard was displaced at the top by Cambridge, which was in turn superseded by MIT, which has held first place ever since. In the THE rankings, Caltech deposed Harvard in 2013 and was overtaken by Oxford in 2017.

I have no great faith in THE or QS, but this is suggestive. Then, we have the more rigorous research-based rankings. In the 2024 Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University, Zhejiang University took over first place from Harvard for publications, although not – not yet anyway – for publications in the top 10% or 1% of journals. In the SCImago Institutions Ranking, published in Spain, Harvard is now fourth overall, although still the leading university.

But when we look at computer science and engineering rankings, it is clear that Harvard has fallen dramatically in areas crucial to economic growth and scientific research over the last few decades.

Shanghai has Harvard in 10th place for computer science; the National Taiwan University Rankings put it 11th, behind Wisconsin, the Georgia Institute of Technology, Texas at Austin, and Carnegie Mellon; the SCImago Institutions Rankings have it 61st; and University Ranking by Academic Performance, published by the Middle East Technical University in Ankara, places it 35th.

For engineering, the prospect is just as grim. The Taiwan rankings have Harvard 31st, Scimago 42nd, and URAP 71st.

Fine, you might say, but the bottom line is jobs and salaries. Let’s look at the latest Financial Times MBA rankings, where Harvard has plunged to 13th place. A major reason for that was that nearly a quarter of the class of 2024 could not find jobs after graduating. According to Poets & Quants, Harvard’s “placement numbers are below every M7 peer, including Stanford, Wharton, Columbia, Kellogg, and Booth, with only one exception: MIT Sloan which is equal to HBS.”

It seems that Harvard’s problems are entrenched and pervasive. They may have been exacerbated by the pandemic, but their roots go further back and run deeper. So what is the cause of this decline? I doubt that the usual villain, underfunding by vicious governments or offended donors, has anything to do with it. However, the announced Trumpian cuts may have an effect in the future.

A plausible hypothesis is that Harvard has drifted away from meritocracy in student admissions and assessment and, more significantly, faculty appointments and promotion.

Perhaps the concept of Harvard’s meritocracy has always been overblown. A few years ago, I was researching early American history and came across a reference to a prominent Massachusetts landowner who had graduated first in his class at Harvard. I was baffled because I thought I should have heard of somebody that brilliant. But it turned out that Harvard before the Revolution ranked students according to their perceived social status, a practice that ended with Independence, after which they were ranked alphabetically. The idea of sorting students academically seems to have become widespread only in the twentieth century.

Even after Harvard supposedly embraced meritocracy by introducing the SAT, the GRE, and other tests and linking tenure to publications and citations, it still included large numbers of legacies, athletes, persons of interest to the dean, and beneficiaries of affirmative action.

It seems that Harvard is returning to its earlier model of subordinating academic performance to character, athletic ability, conformism, and membership of favored groups. It has appointed a president who is almost certainly the only Harvard professor in the humanities and social sciences not to have written a book. It has admitted students who are incapable or unwilling to do the academic work that elite universities used to require. And its global reputation is slowly eroding.


Thursday, April 03, 2025

The Decline of American Universities: The View From Leiden, Ankara and Madrid


There has been a lot of talk recently about the crisis or crises of American universities. Certainly, if we look at the deteriorating financial situation, the thuggish behavior of demonstrators at Ivy League schools or big state universities, scandals about admissions, or fraudulent research, then, yes, American universities do seem to be in a very bad way.

However, financial problems, violent extremism, corruption, and research fraud can be found almost everywhere. Is there a way to compare large numbers of institutions across international frontiers? There is no perfect mode of assessment, but global rankings can tell us quite a bit about the health or sickness of higher education and research.

When Americans think about university rankings, it is usually America’s Best Colleges, published for more than four decades by US News (USN), that comes to mind. In the rest of the world, global rankings are more significant. The leader in public approval, if we mean governments, university leaders, and the media, is clearly the Times Higher Education (THE) World University Rankings. These rankings are characterised by bizarrely implausible results, sometimes dismissed as outliers or quirky statistics. In the last few years -- sorry to keep repeating this -- we have seen Anglia Ruskin University and Babol Noshirvani University of Technology leading the world for research impact, Macau University of Science and Technology and the University of Macau superstars for internationalisation, and Anadolu University and Makerere University in the global top ten for knowledge transfer. No matter, as long as the composite top fifty scores look reasonable from a traditional perspective and the usual heroes, Harvard, MIT, Oxford, are at the top or not too far away.

QS, another British company, was once THE’s data supplier but has pursued an independent path since 2010. Its rankings are more sensible than THE's, but it too seems to have an undue regard for the old Western elite. In its recent world subject rankings, Harvard was first in four of the five broad subject areas, with the crown in Engineering and Technology going to MIT, and Oxford was second in all but one.

These two, along with the Shanghai Rankings by virtue of their age, and occasionally the US News Best Global Universities, because of the fame of their national rankings, constitute the NBA of the ranking world. They are cited endlessly by the global media and provide lists for the appointment of external examiners and editorial boards and for recruitment, promotion, and admissions and even data for the immigration policies of the UK, Hong Kong, and the Netherlands.

However, there are other rankings based on publicly accessible data, transparent methodologies, and consistent procedures. They are largely ignored by those with power and influence, but they tell a coherent and factual story. They are published by universities or research centers with limited budgets and small but well-qualified research teams.

I will take three: Leiden Ranking, produced by the Centre for Science and Technology Studies (CWTS) at Leiden University, the Netherlands, University Ranking by Academic Performance (URAP) by the Informatics Institute at the Middle East Technical University in Ankara, and the SCImago Institutions Rankings (SIR) published by the SCImago Lab in Spain, which has links with the Spanish National Research Council and Spanish universities.

Leiden Ranking

Let’s start by taking a look at Leiden Ranking. The publishers decline to construct any composite or combined ranking, which limits its popular appeal. The default metric, which appears when you land on the list page, is just the number of articles and reviews in core journals in the Web of Science database. Back in 2006-2009, Harvard was in first place here, and other US universities filled up the upper levels of the ranking. The University of Michigan was third, and the University of California Los Angeles (UCLA) was fifth. Chinese universities were lagging behind. Zhejiang University in Hangzhou was 16th, and Tsinghua University in Beijing 32nd.

Fast forward to publications between 2019 and 2022, and Zhejiang has overtaken Harvard and pushed it into second place. The top twenty now includes several Chinese universities, some now world-famous, but others, such as Central South University or Jilin University, scarcely known in the West.

Much of this decline is due to China's advance at the expense of US schools, but that is not the whole story. UCLA has now fallen behind Toronto, São Paulo, Seoul National University, Oxford, University College London, Melbourne, Tokyo, and Copenhagen.

You could say that is just quantity, not quality, and that we should be looking at high-impact publications. But even for publications in the top 10% of journals, Zhejiang is ahead of Harvard. It is only in the top 1% of journals that Harvard still has a lead, and one wonders how long that will last.

That is just the number of publications. Academics tend to judge scientific quality by the number of citations that a work receives. Leiden Ranking no longer ranks universities by citations, perhaps with good reason, but does provide data in the individual profiles. Here we see Harvard’s citations per paper score rising from 13.31 in 2006-2009 to 15.71 in 2019-2022, while Zhejiang’s rises from 3.38 to 11.43. So, Harvard is still ahead for citations, but the gap is closing rapidly and will probably be gone in three or four years.

 

URAP

Turning to the URAP, which is based on a bundle of research metrics, Harvard was first in the combined rankings back in 2013-2014, and the best-performing Chinese institution was Peking University, in 51st place. Now, in the recently published 2024-2025 rankings, Harvard is still first, but Peking is now tenth, and Zhejiang and Tsinghua have also entered the top ten.

Other elite American universities have fallen: Berkeley from 5th to 54th, Yale from 18th to 38th, Boston University from 58th to 151st, Dartmouth from 333rd to 481st.

The relative and absolute decline of the American elite is even clearer if we look at certain key areas. In the ranking for Information and Computing Sciences, the top ten are all located in Mainland China and Singapore, with Tsinghua at the top. Harvard is 35th.

Some American universities are doing much better here than Harvard. MIT, which I suppose will soon be known as the Tsinghua of the West, is 12th, and Carnegie Mellon is 15th.

In Engineering the top 25 universities are all located in Mainland China, Hong Kong, or Singapore. The best American school is again MIT in 37th place, while Harvard languishes in 71st.

 

SCImago

These rankings are quite distinctive in that they have a section for Innovation, which comprises metrics related to patents, and for Societal Factors, which is a mixed bag containing data about altmetrics, gender, impact on policy, web presence, and the UN Sustainable Development Goals. It also includes non-university organisations such as hospitals, companies, non-profits, and government agencies.

When these rankings started in 2009, and before societal factors were included, Harvard was in second place after France's National Scientific Research Center (CNRS). MIT and UCLA were both in the top ten, and the best-performing Chinese university was Tsinghua, in 80th place, while Zhejiang and Peking lagged way behind at 124th and 176th, respectively.

In the latest 2025 rankings, Harvard has slipped to fourth place behind the Chinese Academy of Sciences, the Chinese Ministry of Education, and CNRS. Tsinghua, Zhejiang, and Peking are all in the top twenty, and MIT, UCLA, and the North Carolina schools have all fallen.

Looking at Computer Science, the world leader is the Chinese Academy of Sciences. The best university is Tsinghua, in fourth place. Then come some multinational and American companies and more Chinese universities before we arrive at Stanford in the 24th slot. Harvard is 64th.

In the next post, we will look at the causes of all this.




Saturday, March 01, 2025

China, AI, and Rankings


Recently we have seen the crumbling of many illusions. It now seems hard to believe but only a few weeks ago we were assured that President Biden was as sharp as a fiddle or as fit as a tack or something. Also, the Russian economy was collapsing under the weight of Western sanctions. Or again, the presidential race was running neck and neck, and probably heading for a decisive Democrat vote, foretold by that state-of-the-art poll from Iowa.

An equally significant illusion was the supremacy of Western, especially Anglophone, science and scholarship. The remarkable growth of Asian research has often been dismissed as imitative and uncreative and anyway much less important than the amazing things Western universities are doing for sustainability and diversity.

The two big UK rankings, THE and QS, highly regarded by governments and media, have been instrumental in the underestimation of Chinese science and the overestimation of that of the West. Oxford is in first place in the THE world rankings and in no other, while MIT leads the QS world rankings and no other. Indeed, Leiden Ranking, probably the most respected ranking among actual researchers, has them in 25th and 91st place for publications.

The myopia of the Western rankers has been revealed by recent events in the world of AI. The release of the large language model (LLM) DeepSeek has caused much soul-searching among Western academics and scientists. It looks as good as ChatGPT and the others, probably better, and, it seems, very much cheaper. There will likely be more to come in the near future. According to DeepSeek itself, the team consisted mainly of “researchers and developers from China’s elite universities, with minimal overseas education,” including Peking University, Tsinghua University, Zhejiang University, Beihang University, Shanghai Jiao Tong University, and Nanjing University. There are some overseas links, Monash, Stanford, Texas, but these are less significant.

Some of the anguish or the excitement may be premature. DeepSeek may inspire another Sputnik moment, although that does seem rather unlikely at the moment, and Western companies and institutions may surge ahead again. Also, I suspect, the cheapness may have been exaggerated. Like its western counterparts, DeepSeek has places that it would prefer not to go to – Tiananmen Square and the Uighurs among others – and that could undermine its validity in the long run.

But it is a remarkable achievement nonetheless and it is yet another example of the emerging technological prowess of the Chinese economy. We have seen China build a network of high-speed railways. Compare that with the infamous Los Angeles to San Francisco railroad. Compare China’s military modernization with the state of European navies and armies.

We might add, compare the steady advance of Chinese universities in the output and quality of research and innovation compared to the stagnation and decline of western academia. The main western rankers, THE and QS, have consistently rated  American and British universities more favourably than those in Asia, especially China. Recently it seems that the two dominant rankers have been doing their best to lend a hand to western universities while holding back those in Asia. THE started their Impact rankings with the intention of allowing universities to show the wonderful things they are doing to promote sustainability, an opportunity that has been seized by some Canadian, Australian, and British universities but totally ignored by China. QS has introduced a new sustainability indicator into its world rankings, in which Chinese universities do not do well.

 

AI Rankings

QS and THE have been especially unobservant about the rise of China in computer science, and more specifically in the field of AI. This is in contrast to those rankings based largely on research and derived from publicly verifiable data.

There are currently four rankings that focus on AI. QS has a ranking for Data Science and Artificial Intelligence, and it is very much dominated by Western universities. The top 20 includes 10 US institutions and none from Mainland China, although it does include the Hong Kong University of Science and Technology and the Chinese University of Hong Kong. The Massachusetts Institute of Technology is in first place, and the best Mainland university is Tongji, in equal 36th place.

Now let’s look at EduRank, a rather obscure firm, probably located in California, whose methodology appears to be based on publications, citations, and other metrics. Here the top 20 for AI has 15 US universities, with Stanford University in first place. The best-performing Chinese university is Tsinghua, in 9th place.

University Ranking by Academic Performance (URAP) is published by the Middle East Technical University in Ankara. Their most recent AI ranking has Tsinghua in first place, with Carnegie Mellon in 10th. The top 20 has 12 Mainland universities and only three American.

The US News Best Global Universities ranking for AI is even more emphatic in its assertion of Chinese superiority. Twelve of the top 20 universities for AI are Mainland Chinese, with Tsinghua at number one. The best US university is Carnegie Mellon, in 29th place, well behind several universities from Australia, Hong Kong, and Singapore.

 

Computer Science Rankings

Turning to the broader field of Computer Science, the THE Computer Science rankings have Oxford in first place, MIT in third, and Peking University in twelfth. Similarly, QS has a Computer Science and Information Systems subject ranking, the most recent edition of which shows MIT first, Oxford fourth, and Tsinghua eleventh.

In contrast, the National Taiwan University Computer Science Rankings have Tsinghua first, Stanford seventh, and Oxford 171st (!). The SCImago Institutions Rankings for universities have Tsinghua first for Computer Science, MIT 9th, and Oxford 22nd. The Iran-based ISC World University Rankings for Computer and Information Sciences place Tsinghua first, MIT 11th, and Oxford 18th. In the US News Best Global Universities Computer Science and Engineering ranking, Tsinghua is first, MIT fifth, and Oxford 18th.

In the Shanghai subject rankings MIT is still just ahead of Tsinghua, mainly because of the World Class Output metric which includes international academic awards since 1991.

It seems then that QS, THE, and EduRank have significantly exaggerated the capabilities of elite Western universities in AI and Computer Science generally and underestimated those of Chinese and other Asian schools. It seems ironic that THE and, to a lesser extent, QS are regarded as arbiters of excellence while URAP, Scimago, the National Taiwan University rankings, and even US News are largely ignored.

 

 

Friday, January 24, 2025

THE is shocked, shocked ...

We are repeatedly told that the Times Higher Education (THE) university rankings are trusted by students, governments, and other stakeholders. Perhaps they are. Whether they should be is another matter.

Last October, THE announced the results of its World University Rankings, and there was a chorus of lamentation from leading Australian universities, among others, who apparently trusted THE. It seems that the debate over restricting the admission of international students has damaged the country's reputation, and that has been reflected in the THE reputation survey, which contributes disproportionately to THE's teaching and research "pillars." That has led to declining overall scores, which, it is feared, will be the start of a vicious downward spiral. British and American universities also bemoaned the decline in ranking scores, supposedly due to the lack of funding from hard-hearted governments.

For many academics and administrators, THE has become the arbiter of excellence and a credible advisor to the agencies that dominate the Western economy and society. It has even become a preferred analyst for the WEF, which is supposed to represent the finest minds of the corporate global world. This is quite remarkable, since there is a big mismatch between THE's pretensions to excellence and its actual practice.

A recent example was the publication of a story about how THE's data analysts had detected collusive activity among some universities in order to boost their scores in the reputation surveys that make up a substantial part of the THE World University Rankings and their various derivatives.

On October 24, David Watkins of THE announced that a "syndicate" had been detected in which universities supported each other in the THE Arab reputation survey to the exclusion of non-members. Exactly who those members were was not announced, but they probably included the nine universities that made it into the top 200 of the THE World Reputation Survey announced in February 2024, the data for which was included in the THE world rankings announced in October 2024. They might also include some universities that had made sudden and surprising gains in the Arab University Rankings announced in November 2023 and the World University Rankings announced last October.

There is a whiff of hypocrisy here. THE is apparently getting upset because universities have probably been doing something that the rankers have condoned or at least ignored. There were signs that something was a bit off as far back as the Arab University Rankings in November 2023. These showed surprisingly good performances from several universities that had performed poorly or not at all in other rankings. In particular, universities in the Emirates were rising while those in Egypt were falling. This was interesting because the results were announced at a summit held in Abu Dhabi that featured several speakers from the Emirates, a development reminiscent of the 2014 summit in Qatar, when Texas A and M Qatar was proclaimed the top MENA university on the strength of precisely half a Highly Cited Researcher. At a similar summit in the UAE in 2015, that university -- actually a program that has since been wound up -- disappeared, and United Arab Emirates University advanced to fifth place.

Meanwhile, between October 2023 and January 2024, THE was conducting its survey of academic opinion for the World University Rankings. Before 2021, it had relied on survey data supplied by Elsevier, but now the survey has been brought in-house. That, it now appears, was not a good idea. The number of survey respondents soared, and there was a disproportionate number of respondents from the UAE. In February 2024, THE published the results of its reputation survey, which would later become part of the world rankings.

THE listed only the top 200 universities and gave exact scores for the top fifty. The interesting thing was that nine Arab universities were included whose reputation scores were below their scores for academic reputation in the QS World University Rankings, their scores for global research reputation in the US News Best Global Universities, or their scores in the Round University Rankings, if they were ranked there at all, and below their own previous scores. They were also above the scores achieved by leading universities in the region in Egypt, Saudi Arabia, Qatar, and Lebanon, and they appeared unrelated to other indicators.

It was probably not only Arab universities. Egor Yablokov of E-Quadrat Science and Education identified several universities whose reputation scores appear disproportionate to their overall scores in the THE world rankings.

When the 2025 WUR rankings appeared in October of last year, there were more signs that something was amiss. Universities in the UAE, including Abu Dhabi University and Khalifa University, also in Abu Dhabi, did much better than in previous editions or in other rankings. There were other apparent anomalies. Al Ahliyya Amman University was ahead of the University of Jordan, the Lebanese American University was higher than the American University of Beirut, the American University of the Middle East was higher than Kuwait University, and Future University in Egypt and the Egypt-Japan University of Science and Technology were higher than Cairo University and Al Azhar.

Then came the Arab University Rankings. It appears that THE had by now taken action against the "syndicate", whose members dropped significantly.

In addition to this, there are some trends that require explanation. Many universities in Saudi Arabia and the UAE have fallen significantly, while some in Jordan, Egypt, and Iraq have risen. Applied Science Private University, Jordan, has risen from 91-100 to 25; Al Ahliyya Amman University, also in Jordan, from 91-100 to 28; Ahlia University in Bahrain from unranked to 17th; Cairo University from 28 to 8; the University of Baghdad from 40 to 20; Mustansiriyah University, Baghdad, from 71-80 to 37; An Najah National University, Palestine, from 81-90 to 23; and Dhofar University, Oman, from 101-120 to 49.

So, THE have allocated a whopping 41% weighting to reputation in their Arab University Rankings, of which 23% is for research reputation, compared with 25% in their Asian rankings and 33% in the Latin American rankings. They have introduced a new metric, collaboration within the Arab world, taken over the research and teaching survey from Elsevier, increased the number of respondents, organized prestigious summits, and offered a variety of consultancy arrangements. All of this would create an environment in which exclusive agreements were likely to flourish.

The extreme fluctuations resulting from THE's changes to the reputation indicators have seriously undermined THE's credibility, or at least they ought to have. It would be better for everybody if THE simply returned the administration of the reputation survey to Elsevier and stuck to event management, where it is unsurpassed.


 




Wednesday, October 23, 2024

Are Australian universities really on a precipice?

  


Times Higher Education (THE) recently published the latest edition of their World University Rankings (WUR), which contained bad news for Australian higher education. The country’s leading universities have fallen down the rankings, apparently because of a decline in their scores for research and teaching reputation and for international outlook, that is, international students, staff, and collaboration.

THE reported that Angel Calderon of RMIT had said that the “downturn had mainly been driven by declining scores in THE’s reputation surveys” and that he was warning of worse to come.

Australian universities have responded by demanding that the cap on international students be lifted to avoid financial disaster. Nobody seems to consider how the universities got to the point where they could not survive without recruiting researchers and students from abroad.

It is, however, a mistake to predict catastrophe from a single year’s ranking. Universities have thousands of faculty, employees, and students and produce thousands of patents, articles, books, and other outputs. If a ranking produces large-scale fluctuations over the course of a year, that might well be due to deficiencies in the methodology rather than any sudden change in institutional quality.

There are now several global university rankings that attempt to assess universities' performance in one way or another. THE is not the only one, nor is it the best, and in some ways, it is the worst or nearly the worst.  For universities to link their public image to a single ranking, or even a single indicator, especially one that is as flawed as THE, is quite risky.

To start with, THE is very opaque. Unlike QS, US News, National Taiwan University, Shanghai Ranking, Webometrics, and other rankings, THE does not provide ranks or scores for each of the metrics used to construct the composite or overall score. Instead, they are bundled together into five “pillars”. It is consequently difficult to determine exactly what causes a university to rise or fall in any of these pillars. For example, an improvement in the teaching pillar might be due to increased institutional income, fewer students, fewer faculty, an improved reputation for teaching, more doctorates awarded, fewer bachelor degrees awarded, or some combination of these.
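A small sketch illustrates why this bundling frustrates diagnosis. The weights below are loosely based on THE's published teaching-pillar weighting, and the metric scores are invented: two quite different profiles can yield an identical pillar score, so an outside observer cannot tell which underlying metric actually moved.

```python
# Illustration of how bundling metrics into a composite "pillar" hides
# which metric changed. Weights loosely follow THE's published
# teaching-pillar weighting; the metric scores are invented.

TEACHING_WEIGHTS = {
    "teaching_reputation": 0.150,
    "staff_to_student": 0.045,
    "doctorate_to_bachelor": 0.020,
    "doctorates_to_staff": 0.055,
    "institutional_income": 0.025,
}

def pillar_score(metrics: dict) -> float:
    """Weighted average of the metric scores, normalised to the pillar weight."""
    total = sum(TEACHING_WEIGHTS.values())
    return sum(TEACHING_WEIGHTS[k] * metrics[k] for k in TEACHING_WEIGHTS) / total

# Two quite different metric profiles...
profile_a = {"teaching_reputation": 60, "staff_to_student": 80,
             "doctorate_to_bachelor": 50, "doctorates_to_staff": 70,
             "institutional_income": 90}
profile_b = {"teaching_reputation": 66, "staff_to_student": 60,
             "doctorate_to_bachelor": 80, "doctorates_to_staff": 60,
             "institutional_income": 88}
# ...produce the same published pillar score (about 66.8), so a reader
# cannot tell whether, say, reputation fell or income rose.
```

Since only the composite pillar score is published, a year-on-year change of a point or two is compatible with many different, even contradictory, underlying stories.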

Added to this are some very dubious results from the THE world and regional rankings over the years. Alexandria University, Aswan University, Brighton and Sussex Medical School, Anglia Ruskin University, Panjab University, Federico Santa Maria Technical University, Kurdistan University of Medical Sciences, and the University of Peradeniya have all, at one time or another, been among the supposed world leaders for research quality as measured by citations. Leaders for industry income, which is claimed to reflect knowledge transfer, have included Anadolu University, Asia University in Taiwan, the Federal University of Itajubá, and Makerere University.

The citations indicator has been reformed and is now the research quality indicator, but there are still some oddities at its upper level, such as Humanitas University, Vita-Salute San Raffaele University, Australian Catholic University, and St George’s, University of London, probably because they participated in a few highly cited multi-author medical or physics projects.

It now seems that the reputation indicators in the THE WUR are producing results that are similarly lacking in validity. Altogether, reputation counts for 33%, divided between the research and teaching pillars. A truncated version of the survey results, covering the top 200 universities but providing exact scores for only fifty of them, was published earlier this year, and the full results were incorporated in the recent world rankings.

Until 2021 THE used the results of a survey conducted by Elsevier among researchers who had published in journals in the Scopus database. After that THE brought the survey in-house and ran it themselves. That may have been a mistake. THE is brilliant at convincing journalists and administrators that it is a trustworthy judge of university quality, but it is not so good at actually assessing such quality, as the above examples demonstrate.

After bringing the survey in-house, THE increased the number of respondents from 10,963 in 2021 to 29,606 in 2022, 38,796 in 2023, and 55,689 in 2024. It seems that this is a different kind of survey, since the new influx of respondents is likely to contain fewer researchers from countries like Australia. One might also ask how such a significant increase was achieved.

Another issue is the distribution of survey responses by subject. In 2021, a THE post on the reputation ranking methodology indicated the distribution of responses among academic fields and the proportions by which the responses were rebalanced. So, while 9.8% of responses came from computer science, their weight was reduced to match the 4.2% share of computer scientists among international researchers. It seems that this information has not been provided for the 2022 or 2023 reputation surveys.
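The rebalancing described above can be sketched as a simple reweighting: each vote from an over-represented subject is scaled down so that the subject's effective share matches its target share. This is an assumption about the mechanism, since THE has not published the exact formula; only the 9.8% and 4.2% figures come from the 2021 methodology post.

```python
# Sketch of subject rebalancing in a reputation survey: scale each vote
# so a subject's effective share matches its target share. The exact THE
# procedure is unpublished; this reweighting is an illustrative assumption.
def rebalance_weight(observed_share: float, target_share: float) -> float:
    """Multiplier applied to every vote from the subject."""
    return target_share / observed_share

# Computer science: 9.8% of responses, rebalanced to a 4.2% target share.
w = rebalance_weight(0.098, 0.042)
print(round(w, 3))  # each computer-science vote counts for about 0.429 of a vote
```

On this reading, shifting the subject mix of respondents between surveys would move universities' scores even if no individual opinion changed, which is why the withheld distribution matters.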

In 2017, I noted that Oxford's reputation score tracked the percentage of THE survey responses from the arts and humanities, rising when there were more respondents from those fields and falling when there were fewer. The withholding of information about the distribution of responses by subject is therefore significant, since such shifts could affect the ranking of Australian universities.

Then we have the issue of the geographical distribution of responses. THE has a long-standing policy of recalibrating its survey results to match the number of researchers in each country, according to data submitted to and published by UNESCO.

There are good reasons to be suspicious of data emanating from UNESCO, some of which have been presented by Sasha Alyson.

But even if the data were totally accurate, there is still a problem that a university’s rise or fall in reputation might simply be due to a change in the relative number of researchers reported by government departments to the data crunching machines at THE.

According to UNESCO, the number of researchers per million inhabitants in Australia and New Zealand fell somewhat between 2016 and 2021. On the other hand, the number rose for Western Asia, Southern Asia, Eastern Asia, Latin America and the Caribbean, and Northern Africa.

If these changes are accurate, it means that some of Australia's declining research reputation is due to the increase in researchers in other parts of the world and not necessarily to any decline in the quality or quantity of its research.

Concerns about THE's reputation indicators are further raised by looking at some of the universities that did well in the recent reputation survey.

Earlier this year, THE announced that nine Arab universities had achieved the distinction of reaching the top 200 of the reputation rankings, although none were able to reach the top 50, where exact scores and ranks were given. THE admitted that the reputation of these universities was regional rather than global. In fact, as some observers noted at the time, it was probably less than regional and primarily national.

It was not the rise of Arab universities in the reputation rankings as such that was disconcerting. Quite a few leading universities from that region have begun to produce significant numbers of papers, citations, and patents and to attract the attention of international researchers, but these were not the ones doing so well in THE's reputation rankings.

Then, last May, THE announced that it had detected signs of “possible relationships being agreed between universities” and that steps would be taken, although not, it would seem, in time for the recent WUR.

More recently, a LinkedIn post by Egor Yablonsky, CEO of E-Quadratic Science & Education, reported that a few European universities had significantly higher ranks in the reputation rankings than in the overall world rankings.

Another reason Australia should be cautious of the THE rankings and their reputation metrics is that Australian universities' ranks in the THE reputation rankings are much lower than they are for Global Research Reputation in the US News (USN) Best Global Universities or Academic Reputation in the QS World rankings.

In contrast, some French, Chinese and Emirati universities do noticeably better in the THE reputation ranking than they do in QS or USN.

 

Table: Ranks of leading Australian universities

University | THE reputation 2023 | USN global research reputation 2024-2025 | QS academic reputation 2025
Melbourne | 51-60 | 43 | 21
Sydney | 61-70 | 53 | 30
ANU | 81-90 | 77 | 36
Monash | 81-90 | 75 | 78
Queensland | 91-100 | 81 | 50
UNSW Sydney | 126-150 | 88 | 43

It would be unwise to put too much trust in the THE reputation survey or in the world rankings where it has nearly a one-third weighting. There are some implausible results this year, and it stretches credibility that the American University of the Middle East has a better reputation among researchers than the University of Bologna, National Taiwan University, the Technical University of Berlin, or even UNSW Sydney. THE has admitted that some of these results may be anomalous, and it is likely that some universities will fall after THE takes appropriate measures.

Moreover, the reputation scores and ranks for the leading Australian universities are significantly lower than those published by US News and QS. It seems very odd that Australian universities are embracing a narrative that comes from such a dubious source and is at odds with other rankings. It is undeniable that universities in Australia are facing problems. But it is no help to anyone to let dubious data guide public policy.

So, please, will all the Aussie academics and journalists having nervous breakdowns relax a bit and read some of the other rankings, or just wait until next year, when THE will probably revamp its reputation metrics?

 

Thursday, October 10, 2024

Is something happening in China?

The National Taiwan University rankings have been overlooked by the Western media, which is a shame since they can provide useful and interesting insights. 

For example, there are indicators for articles in the SCIE and the SSCI of the Web of Science database over 11 years and over the current year, which for this year's edition is 2023. For both metrics, the top scorer, which in these cases is Harvard, is assigned a score of 100, and the others are calibrated accordingly.

If a university has a score for the one-year indicator that is significantly higher than the score for eleven years, it is likely that they have made significant progress during 2023 compared to the previous decade. Conversely, if a university does much better for the eleven-year indicator than for the current year, it could mean that it has entered a period of low productivity.
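The comparison described above can be illustrated with a toy calculation. The article counts below are invented for the example, not NTU's actual data; the scaling, giving the top scorer 100 and calibrating the rest proportionally, follows the description of the indicator.

```python
# Toy illustration of the NTU-style scaling: top scorer = 100, others
# scaled proportionally. The article counts are invented for the example.
def normalize(counts: dict) -> dict:
    top = max(counts.values())
    return {u: round(100 * c / top, 1) for u, c in counts.items()}

eleven_year = normalize({"Harvard": 50000, "Univ A": 32000, "Univ B": 27000})
one_year = normalize({"Harvard": 5000, "Univ A": 2600, "Univ B": 3100})

for u in eleven_year:
    gap = one_year[u] - eleven_year[u]
    # A strongly negative gap suggests the current year lags the decade.
    print(u, eleven_year[u], one_year[u], round(gap, 1))
```

Here "Univ A" scores 64.0 over eleven years but only 52.0 for the current year, the pattern the post identifies for the Chinese universities below.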

Looking at the current ranking, we notice that most leading US, British, and Australian universities are doing well for the current year, with the notable exceptions of the Los Angeles, Berkeley, San Diego, and Davis campuses of the University of California. Saudi universities also do well, but French universities are down for the year.

The big story here is that Chinese universities do much worse for the current year than the 11-year period. Here are the Article scores for five leading institutions:

Tsinghua University: 57.9 for eleven years and 47.2 for the current year

Zhejiang University: 64.7 and 55.4

Shanghai Jiao Tong University: 65 and 52.8

Peking University: 57.1 and 48

Sun Yat-Sen University: 54.1 and 47.1

And so on and so on.

So what is going on? I can think of several possible explanations. Firstly, we are seeing the temporary effect of the Covid restrictions, and soon we shall see a rebound.

Secondly, this is the beginning of a new period of decline for Chinese sciences, and we shall see a further decline in the next few years.

Thirdly, and I think most plausibly, China has lost interest in engagement with the West, whether this means partnerships with elite institutions, publications in scientific journals, or participation in surveys and rankings. This aligns with the abstention from the THE Impact rankings, the lack of data submission to the TOP500 international ranking of supercomputers, and low scores in the QS sustainability rankings, which suggest a lack of interest in those metrics.

Whatever the reason, we should have a better idea over the next year or two.






 



Friday, August 02, 2024

Forget about the Euros, this is really serious

We are told that the failure at the UEFA final was a tragedy for England. Perhaps, but something else happened early in July that should have caused some debate but passed almost unnoticed, namely the publication of the latest edition of the CWTS Leiden Ranking.

The release of the Times Higher Education (THE) World University rankings and, to a lesser extent, the global rankings from Shanghai, QS, and US News (USN) are often met with fulsome praise from the media and government officials when national favourites rise in the rankings and lamentations when they fall, but other rankings, often much more reliable and rigorous, are largely ignored.

This is partly because the THE and QS rankings are dominated by American and British universities. Oxford, Cambridge, and Imperial College London are in the top ten of the overall tables in these rankings. This year there was a lot of media talk about Imperial moving ahead of Cambridge and Oxford into second place in the QS rankings, just behind MIT. According to these rankings, British universities are on top of the world, and criticism from journalists or politicians would surely be churlish in the extreme.

It would, however, be a mistake to assume that the big-brand rankings are objective judges of academic or any other kind of merit. They are biased towards UK universities in a variety of obvious and subtle ways. QS, THE, and USN all include surveys of established academics, and the Shanghai Rankings include Nobel and Fields award winners, some of whom are long gone or retired. THE has three metrics based on income. THE, USN, and QS give more weight to citations than to publications, loading the dice in favour of older and better-funded researchers.

It seems that British universities have complacently accepted the verdict of these rankings and appear unwilling to consider that they are doing anything less than perfectly. When the Sunak government proposed some vague and bland changes, the Chief Executive of the London Higher group of institutions complained that it was "beyond belief" that the government should have the King speak negatively of the "world-leading higher education and research sector."

It is perhaps time to look at another ranking, one produced by the Centre for Science and Technology Studies (CWTS) at Leiden University. This provides data on publications with various optional filters for subject group, country, period, and fractional counting. There are also rankings for international and industrial collaboration, open-access publications, and gender equity in research.

CWTS does not, however, publish overall rankings, sponsor spectacular events in prestigious settings, or offer consultations and benchmarking services for non-trivial sums. Consequently, it is usually neglected by the media, university heads, or the grandees of the world economy gathered at WEF forums and the like.

Turning to the latest edition,  starting with the default metric, publications in the Web of Science over the period 2019-2022, we see that Zhejiang University has now overtaken Harvard and moved into first place. In the next few years, it is likely that other Chinese universities like Fudan, Peking, and Tsinghua will join Zhejiang at the peak. 

But the most interesting part of Leiden Ranking is the steady decline of British universities. Oxford is now 25th  in the publications table, down from 14th in 2009-12. That's not too bad, but rather different from the latest QS world ranking, where it is third, US News Best Global Universities, where it is fourth, or THE, where it is first. Oxford is well behind several Chinese universities and also behind, among others, the University of Sao Paulo, Seoul National University, and the University of Pennsylvania.

Of course, you could say that this is a crude measure of research activity and that if we look at other metrics, such as publications in the top 10% and the top 1% of journals, then, yes, Oxford does better. The problem is that the high-quality metrics are usually lagging indicators so we can expect Oxford to start declining there also before too long.

When we look at the broad subject tables for publications, there is further evidence of gradual decline.  For Mathematics and  Computer Science, Oxford is 63rd, behind Purdue University, Beijing University of Technology, and the University of New South Wales. In 2009-12 it was 50th. 

For Physical Sciences and Engineering, it is 72nd behind the  University of Tehran, Texas A & M, and Lomonosov Moscow State University. In 2009-12 it was 29th.

It is 64th in Life and Earth Sciences, behind Lanzhou University, the Swedish University of Agricultural Science, and Colorado State University. In 2009-2012 it was 42nd. 

For Biomedical and Health Sciences, it is 39th, behind Duke, University of British Columbia, and Karolinska Institutet; in 2009-2012, it was 27th.

Finally, when it comes to the Humanities and Social Sciences, Oxford remains at the top. It is fourth in the world, just as it was in 2009-2012.

A glance at some middling British institutions shows the same picture of steady relative decline. Between 2009-2012 and 2019-2022 Reading went from 489th to 719th, Liverpool from 233rd to 302nd, and Cardiff from 190th to 328th. 

It is perhaps unfair to judge complex institutions based on a single metric. Unfortunately, much of science, scholarship, and everyday life depends on assigning numbers that inevitably ignore the fine detail of complex phenomena.

Also, such data does not tell us the full story about teaching and learning, but there is plenty of anecdotal evidence that British universities are not doing so great there either. 

It seems that the big rankings are exaggerating the merits of British higher education. It is time to take a look at some of the global rankings produced in places like the Netherlands (Leiden Ranking), Spain (SCImago), Turkiye (URAP), Georgia (RUR), and Taiwan (NTU rankings).









Sunday, July 07, 2024

Problems with the THE Reputation Rankings

THE has spent a lot of time and words proclaiming that it is trusted by administrators, students, sponsors, and the like. Perhaps it is, but whether it deserves to be is another matter. A recent article in THE  suggests that THE has made a mess of its reputation rankings and is scrambling to put things right.

Until 2021, THE used Elsevier to conduct its teaching and research reputation survey. The 2020-21 survey received 10,963  responses and was calibrated to ensure proper representation of regions and subjects. 

The survey was brought in-house in 2022, and since then, the number of responses has increased substantially to 29,606 in 2022, 38,796 in 2023, and 55,689 in 2024.

When the number of responses increases so dramatically, one should wonder exactly how this was achieved. Was it by sending out more surveys, improving the response rate, or institutional efforts to encourage participation? 

When the results were announced in February, THE declared that a number of Arab universities had achieved remarkable results in the reputation survey. THE conceded that this stellar performance was largely a regional affair that did not extend to the rest of the world. 

But that was not all. Several Arab universities have been making big strides and improving citation, publication, and patent scores: Cairo University, King Abdullah University of Science and Technology, UAE University, and Qatar University. 

The universities getting high scores in the THE rankings were less well-known in the Arab region and had received much lower scores for reputation in the US News and QS rankings. However, they are likely to do well in the forthcoming THE world and Arab university rankings.

THE has now admitted that some universities were encouraging researchers to vote for their own institutions and that there may have been "agreed relationships" between universities. THE is now talking about rewarding respondent diversity, that is getting support from more than just a few institutions.

It is regrettable that THE did not notice this earlier. If it does encourage such diversity, then quite a few universities will suffer dramatic falls in the rankings this year and next.

Anyway, THE could do a few things to improve the validity of its reputation survey. It could eliminate self-voting altogether, give a higher weighting to votes from other countries, as QS does, add a separate ranking for regional reputation, and combine scores over a number of years.

The problems with the reputation metrics seem to have begun with THE starting its own survey. It would be a good idea to go back to letting Elsevier do the survey. THE is undeniably brilliant at event management and public relations, although perhaps not jaw-droppingly so. However, it is not so good at rankings or data processing.

  


Thursday, June 13, 2024

Imperial Ascendancy


The 2025 QS World University Rankings have just been announced. As usual, when there are big fluctuations in scores and ranks, the media are full of words like soaring, rising, plummeting, and collapsing. This year, British universities have been more plummeting than soaring, and this has generally been ascribed to serious underfunding of higher education by governments who have been throwing money at frivolities like childcare, hospitals, schools, roads, and housing.

There has been a lot of talk about Imperial College London rising to second in the world and first in the UK, ahead of Harvard, Oxford, and Cambridge. Imperial's president, quoted in Imperial News, spoke about quality, commitment, and "interrogating the forces that shape our world."

The article also referred to the university's achievements in the THE world rankings, the Guardian University Guide, and the UK's Research and Teaching Excellence Frameworks. It does not mention that Round University Ranking has had Imperial first in the UK since 2014.

So what exactly happened to propel Imperial ahead of Harvard, Oxford, and Cambridge? Perhaps commitment and the interrogation of forces were there in the background, but the more proximate causes were the methodological changes introduced by QS last year. There have been no further changes this year, but the QS rankings do seem to have become more volatile.

In 2023, QS introduced three new indicators. The first is the International Research Network, which measures the breadth rather than the quantity of international research collaborations. This favored universities in English-speaking countries and led to a reported boycott by South Korean universities. 

That boycott does not seem to have done Korean universities any harm since many of them have risen quite significantly this year.

QS has also added an Employment Outcomes metric that combines graduate employment rates and an alumni index of graduate achievements scaled against student numbers. 

Then there is a sustainability indicator based on over fifty pieces of data submitted by institutions. Some reputable Asian universities get low scores here, suggesting that they have not submitted data or that the data has been judged inadequate by the QS validators.

Imperial rose by exactly 0.7 points between the 2024 and the 2025 world rankings, while Harvard, Oxford, and Cambridge all fell. Its score declined for three indicators, Faculty Student Ratio, Citations per Faculty, and International Students, and remained unchanged for International Faculty.

The improvements in the weighted scores of five indicators are listed below:

Employment Outcomes: 0.52

Sustainability: 0.265

Academic Reputation: 0.15

International Research Network: 0.035

Employer Reputation: 0.015

Imperial has improved for all of the new indicators, very substantially for Employment Outcomes and Sustainability, and also for the reputation indicators. I suspect that the Imperial ascendancy may not last long as its peers, especially in Asia, pay more attention to the presentation of employability and sustainability data.
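As a back-of-envelope check, the weighted gains listed above sum to roughly 0.985 points, so with a net rise of 0.7 points, the three declining indicators must together have cost Imperial about 0.285 weighted points:

```python
# Weighted indicator gains listed above for Imperial (2024 -> 2025 QS WUR).
gains = {
    "Employment Outcomes": 0.52,
    "Sustainability": 0.265,
    "Academic Reputation": 0.15,
    "International Research Network": 0.035,
    "Employer Reputation": 0.015,
}
total_gain = sum(gains.values())        # 0.985
net_rise = 0.7                          # overall score change reported above
implied_losses = total_gain - net_rise  # spread across the three declining indicators
print(round(total_gain, 3), round(implied_losses, 3))  # 0.985 0.285
```

In other words, the gains on the new QS indicators were large enough to absorb losses elsewhere and still lift the overall score.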





Saturday, May 11, 2024

Hungarian universities, this is probably not a good idea

Times Higher Education (THE) has informed us that it has reached a "groundbreaking" agreement with the Hungarian Ministry of Culture and Innovation.

It seems that THE will analyse Hungary's higher education system and benchmark with successful higher education hubs according to the "gold standard" world rankings and provide advice and "unparalleled data insights" to Hungarian universities. The cost of this exercise is not mentioned, but it is unlikely to be trivial.

The Hungarian State Secretary for Innovation and Higher Education referred to the presence of Hungarian universities in the THE rankings. Eleven are now in the THE world rankings whereas five years ago seven were listed. 

That sounds very impressive, but wait a minute.

THE tells us that in the 2018-19 rankings there were 1,258 universities, of which 1,250 were ranked, and in 2023-24 there were 2,671, of which 1,906 were ranked. It would be remarkable if the number of Hungarian universities did not increase, and it is no big deal that they did.

What is relevant is the number of universities in the top thousand in each edition. For Hungary, it was six in the 2019 rankings and three in 2024. If the THE rankings mean anything, then the quality of Hungarian universities has apparently declined over the last five years.

Hungarian universities, however, have generally been drifting downwards in most rankings, not because they are getting worse in absolute terms but because of the steady rise of Asian, especially Chinese, research-based universities. 

Moreover, the THE world rankings rate Hungarian universities worse than any other global ranking. The latest edition of the THE World University Rankings  (WUR) shows three in the world's top 1000. There are five in the top 1000 in the latest QS rankings, four in the Shanghai rankings, five in Leiden Ranking, four in the US News Best Global Universities, four in URAP, five in CWUR, six in Webometrics, and eight in RUR.

The pattern is clear. THE now consistently underestimates the performance of Hungarian universities compared to other rankers. Not only that, but some Hungarian universities have dropped significantly in the THE rankings. Eotvos Lorand University has gone from 601-800 to 801-1000, Pecs University from 601-800 to 1001-1200, and Budapest University of Technology and Economics from 801-1000 to 1201-1500.

On the other hand, a couple of Hungarian universities, Semmelweis and Debrecen, have risen through participation in multi-author multi-citation projects.

It is difficult to see what benefit Hungary will get from paying THE for insights, reports, and targets from an organization that has limited competence in the assessment and analysis of academic performance. Seriously, what insights could you get from an organization that in recent years has declared Anglia Ruskin University to be the world leader for research impact, Anadolu University for knowledge transfer, and Macau University of Science and Technology for International Outlook?

It is true that THE is outstanding in public relations and event management, and the universities will no doubt benefit from high praise at prestigious events and receive favourable headlines and awards. It is hard, though, to see that THE are able to provide the knowledgeable and informed advice that universities need to make difficult decisions in the coming years. 



Sunday, April 07, 2024

What happens to those who leave THE?

Times Higher Education (THE) appears to be getting rather worried about leading universities such as Rhodes University, University of Zurich, Utrecht University, and some of the Indian Institutes of Technology boycotting its World University Rankings (WUR) and not submitting data.

Thriving Rankings?

We have seen articles about how the THE rankings are thriving, indeed growing explosively. Now, THE has published a piece about the sad fate that awaits the universities that drop out of the WUR or their Impact Rankings. 

Declining Universities?

An article by two THE data specialists reports that 611 universities that remained in the THE world rankings from 2018 to 2023 retained, on average, a stable rank in the THE reputation ranking. The 16 who dropped out saw a significant decline in their reputation ranks, as did 75 who are described as never being in the WUR.

The last category is a bit perplexing. According to Webometrics, there are over 30,000 higher education institutions in the world and nearly 90,000, according to Alex Usher of HESA. So, I assume that THE is counting only those that got votes or a minimum number of votes in their reputation ranking. 

We are not told who the 75 never-inners or the 16 defectors are, although some, such as six Indian Institutes of Technology, are well known, so it is difficult to check THE's claims. However, it is likely that an institution that boycotted the THE WUR would also discourage its faculty from participating in the THE academic survey, which would automatically tend to reduce its reputation score, since THE allows self-voting.

Also, we do not know if there have been changes in the weighting for country and subject and how that might modify the raw survey responses. A few years ago, I noticed that Oxford's academic reputation fluctuated with the percentage of survey responses from the humanities. It is possible that adjustments like that might affect the reputation scores of the leavers. 

The opacity of THE's methodology and the intricacies of its data processing system mean that we cannot be sure about THE's claim that departure from the world rankings would have a negative impact. In addition, there is always the possibility that universities on a downward trend might be more likely to pull out because their leaders are concerned about their rankings, so the withdrawal is a result, not the cause of the decline. 

We should also remember that reputation scores are not everything. If a decline in reputation was accompanied by an improvement in other metrics, it could be a worthwhile trade.

What happened to the IITs in the THE WUR?

Fortunately, we can check THE's claims by looking at a group of institutions from the same country and with the same subject orientation. In the 2019-20 world rankings, twelve Indian Institutes of Technology were ranked. Then, six -- Bombay, Madras, Delhi, Kanpur, Kharagpur, Roorkee --  withdrew from the WUR, and six -- Ropar, Indore, Gandhinagar, Guwahati, Hyderabad, Bhubaneswar --  remained, although two of these withdrew later. 

So, let's see what happened to them. First, look at the overall ranks in the WUR itself and then in Leiden Ranking, the Shanghai Rankings (ARWU), and Webometrics.

Looking at WUR, it seems that if there are penalties for leaving THE, the penalties for remaining could be more serious. 

Among the IITs in the 2020 rankings, Ropar led in the 301-350 band, followed by Indore in the 351-400 band. Neither of them is as reputable in India as senior IITs such as Bombay and Madras, and they had those ranks because of remarkable citation scores, although they did much less well in the other pillars. This anomaly was part of the reason for the six leavers to depart.

Fast-forward to the 2024 WUR. IIT Ropar has fallen dramatically to 1001-1200; Indore, which had fallen from 351-400 to 601-800 in 2023, has opted out; and Gandhinagar has fallen from 501-600 to 801-1000. Bhubaneswar, which was in the 601-800 band in the 2020 WUR, fell to 1001-1200 in 2022 and 2023 and was absent in 2024. Guwahati and Hyderabad remained in the 601-800 band.

Frankly, it looks like staying in the THE WUR is not always a good idea. Maybe their THE reputation improved, but four of the original remaining IITs suffered serious declines.

IITs in Other Rankings

Now, let's examine the IITs' performance in other rankings. First, the total publications metric in Leiden Ranking. Between 2019 and 2023, four of the six early leavers rose, and two fell. The late leavers, Hyderabad and Indore, were absent in 2019 and were ranked in the 900s in 2023. Remainer Guwahati rose from 536th in 2019 to 439th in 2023.

For Webometrics, between 2019 and 2024, all 12 IITs went up except for Bombay.

Finally, let's check the overall scores in the QS WUR. Between 2021 and 2024, four of the six leavers went up, and two went down. Of the others, Guwahati went up, and Hyderabad went down.

So, looking at overall ranking scores, it seems unlikely that boycotting THE causes any great harm, if any. On the other hand, if THE is tweaking its methodology or something happens to a productive researcher, staying could lead to an embarrassing decline.

IITs' Academic Reputation Scores

Next, here are some academic reputation surveys. The  US News Best Global Universities is not as helpful as it could be since it does not provide data from previous editions, and the Wayback Machine doesn't seem to work very well. However, the Global Research Reputation metric in the most recent edition is instructive. 

The six escapees had an average rank of 272, ranging from 163 for Bombay to 477 for Roorkee.

The remainers' ranks ranged from 702 for Guwahati to 1710 for Bhubaneswar. Ropar was not ranked at all. So, leaving THE does not appear to have done the IITs any harm in this metric.

Turning to the QS WUR academic reputation metric, the ranks in the academic survey for the leavers range from 141 for Bombay to 500 for Roorkee. They have all improved since 2022. The best-performing remainer is Guwahati in 523rd place. Ropar and Gandhinagar are not ranked at all. Bhubaneswar, Indore, and Hyderabad are all at 601+.

Now for Round University Ranking's reputation ranking. Four of the six original leavers were there in 2019. Three fell by 2023 and Delhi rose. Two, Bombay and Roorkee, were absent in 2019 and present in 2023.

This might be considered evidence that leaving THE leads to a loss of reputation. But five of the original remainers are not ranked in these rankings, and Guwahati is there in 2023 with a rank of 417, well below that of the six leavers. 

There is then scant evidence that leaving WUR damaged the academic reputations of those IITs that joined the initial boycott, and their overall rankings scores have generally improved.

On the other hand, for IITs Ropar and Bhubaneswar, remaining proved disastrous.

IITs and Employer Reputation

In the latest GEURS employer rankings, published by Emerging, the French consulting firm, there are four exiting IITs in the top 250, Delhi, Bombay, Kharagpur, and Madras, and no remainers.

In the QS WUR Employer Reputation indicator, the boycotters all perform well. Bombay is 69th and Delhi is 80th. Of the six original remainers two, Ropar and Gandhinagar, were not ranked by QS in their 2024 WUR. Three were ranked 601 or below, and Guwahati was 381st, ahead of Roorkee in 421st place.

Conclusion

Looking at the IITs, there seems to be little downside to boycotting the THE WUR, and there could be some risk in staying, especially for institutions that have over-invested in specific metrics. It is possible that the IITs are atypical, but so far there seems little reason to fear leaving the THE WUR. A study of the consequences of boycotting the THE Impact Rankings is being prepared.