Sunday, July 07, 2024

Problems with the THE Reputation Rankings

THE has spent a lot of time and words proclaiming that it is trusted by administrators, students, sponsors, and the like. Perhaps it is, but whether it deserves to be is another matter. A recent article in THE  suggests that THE has made a mess of its reputation rankings and is scrambling to put things right.

Until 2021, THE used Elsevier to conduct its teaching and research reputation survey. The 2020-21 survey received 10,963  responses and was calibrated to ensure proper representation of regions and subjects. 

The survey was brought in-house in 2022, and since then, the number of responses has increased substantially to 29,606 in 2022, 38,796 in 2023, and 55,689 in 2024.

When the number of responses increases so dramatically, one should wonder exactly how this was achieved. Was it by sending out more surveys, by improving the response rate, or through institutional efforts to encourage participation? 

When the results were announced in February, THE declared that a number of Arab universities had achieved remarkable results in the reputation survey. THE conceded that this stellar performance was largely a regional affair that did not extend to the rest of the world. 

But that was not all. Several Arab universities have been making big strides and improving citation, publication, and patent scores: Cairo University, King Abdullah University of Science and Technology, UAE University, and Qatar University. 

The universities getting high scores in the THE rankings were less well-known in the Arab region and had received much lower scores for reputation in the US News and QS rankings. However, they are likely to do well in the forthcoming THE world and Arab university rankings.

THE has now admitted that some universities were encouraging researchers to vote for their own institutions and that there may have been "agreed relationships" between universities. THE is now talking about rewarding respondent diversity, that is getting support from more than just a few institutions.

It is regrettable that THE did not notice this earlier. If it does encourage such diversity, then quite a few universities will suffer dramatic falls in the rankings this year and next.

Anyway, THE could do a few things to improve the validity of its reputation survey. It could eliminate self-voting altogether, give a higher weighting to votes from other countries, as QS does, add a separate ranking for regional reputation, and combine scores for a number of years.

The problems with the reputation metrics seem to have begun with THE starting its own survey. It would be a good idea to go back to letting Elsevier do the survey. THE is undeniably brilliant at event management and public relations, although perhaps not jaw-droppingly so. However, it is not so good at rankings or data processing.

  


Thursday, June 13, 2024

Imperial Ascendancy


The 2025 QS World University Rankings have just been announced. As usual, when there are big fluctuations in scores and ranks, the media are full of words like soaring, rising, plummeting, and collapsing. This year, British universities have been more plummeting than soaring, and this has generally been ascribed to serious underfunding of higher education by governments who have been throwing money at frivolities like childcare, hospitals, schools, roads, and housing.

There has been a lot of talk about Imperial College London rising to second in the world and first in the UK, ahead of Harvard, Oxford, and Cambridge. Imperial's president, quoted in Imperial News, spoke about quality, commitment, and "interrogating the forces that shape our world."

The article also referred to the university's achievements in the THE world rankings, the Guardian University Guide, and the UK's Research and Teaching Excellence Frameworks. It did not mention that Round University Ranking has had Imperial first in the UK since 2014.

So what exactly happened to propel Imperial ahead of Harvard, Oxford, and Cambridge? Perhaps commitment and the interrogation of forces were there in the background, but the more proximate causes were the methodological changes introduced by QS last year. There have been no further changes this year, but the QS rankings do seem to have become more volatile.

In 2023, QS introduced three new indicators. The first is the International Research Network, which measures the breadth rather than the quantity of international research collaborations. This favored universities in English-speaking countries and led to a reported boycott by South Korean universities. 

That boycott does not seem to have done Korean universities any harm since many of them have risen quite significantly this year.

QS has also added an Employment Outcomes metric that combines graduate employment rates and an alumni index of graduate achievements scaled against student numbers. 

Then there is a sustainability indicator based on over fifty pieces of data submitted by institutions. Some reputable Asian universities get low scores here, suggesting that they have not submitted data or that the data has been judged inadequate by the QS validators.

Imperial rose by exactly 0.7 points between the 2024 and the 2025 world rankings, while Harvard, Oxford, and Cambridge all fell. Its score declined for three indicators, Faculty Student Ratio, Citations per Faculty, and International Students, and remained unchanged for International Faculty.

The improvements in the weighted scores of five indicators are listed below:

Employment Outcomes              0.52
Sustainability                   0.265
Academic Reputation              0.15
International Research Network   0.035
Employer Reputation              0.015

Imperial has improved for all of the new indicators, very substantially for Employment Outcomes and Sustainability, and also for the reputation indicators. I suspect that the Imperial ascendancy may not last long as its peers, especially in Asia, pay more attention to the presentation of employability and sustainability data.
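As a quick arithmetic check, here is a sketch using only the weighted gains quoted above; QS does not itemise the declines on the other indicators, so the offsetting losses are inferred from the 0.7-point net rise rather than reported:

```python
# Weighted score gains for Imperial between the 2024 and 2025 QS WUR,
# as quoted above (points on QS's 100-point overall scale).
gains = {
    "Employment Outcomes": 0.52,
    "Sustainability": 0.265,
    "Academic Reputation": 0.15,
    "International Research Network": 0.035,
    "Employer Reputation": 0.015,
}

total_gain = sum(gains.values())          # sum of the quoted gains
net_rise = 0.7                            # Imperial's overall rise
implied_decline = total_gain - net_rise   # inferred losses elsewhere

print(f"sum of quoted gains: {total_gain:.3f}")            # 0.985
print(f"implied combined decline elsewhere: {implied_decline:.3f}")  # 0.285
```

The quoted gains sum to 0.985 points, so the declines in Faculty Student Ratio, Citations per Faculty, and International Students must together have cost Imperial about 0.285 weighted points.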





Saturday, May 11, 2024

Hungarian universities, this is probably not a good idea

Times Higher Education (THE) has informed us that it has reached a "groundbreaking" agreement with the Hungarian Ministry of Culture and Innovation.

It seems that THE will analyse Hungary's higher education system, benchmark it against successful higher education hubs according to the "gold standard" world rankings, and provide advice and "unparalleled data insights" to Hungarian universities. The cost of this exercise is not mentioned, but it is unlikely to be trivial.

The Hungarian State Secretary for Innovation and Higher Education referred to the presence of Hungarian universities in the THE rankings. Eleven are now in the THE world rankings whereas five years ago seven were listed. 

That sounds very impressive, but wait a minute.

THE tells us that in the 2018-19 rankings there were 1,258 universities, of which 1,250 were ranked, and in 2023-24 there were 2,671, of which 1,906 were ranked. It would be remarkable if the number of Hungarian universities did not increase, and it is no big deal that they did.

What is relevant is the number of universities in the top thousand in each edition. For Hungary, it was six in the 2019 rankings and three in 2024. If the THE rankings mean anything, then the quality of  Hungarian universities has apparently declined over the last five years. 

Hungarian universities, however, have generally been drifting downwards in most rankings, not because they are getting worse in absolute terms but because of the steady rise of Asian, especially Chinese, research-based universities. 

Moreover, the THE world rankings rate Hungarian universities worse than any other global ranking. The latest edition of the THE World University Rankings  (WUR) shows three in the world's top 1000. There are five in the top 1000 in the latest QS rankings, four in the Shanghai rankings, five in Leiden Ranking, four in the US News Best Global Universities, four in URAP, five in CWUR, six in Webometrics, and eight in RUR.

The pattern is clear. THE now consistently underestimates the performance of Hungarian universities compared to other rankers. Not only that, but some Hungarian universities have dropped significantly in the THE rankings. Eotvos Lorand University has gone from 601-800 to 801-1000, Pecs University from 601-800 to 1001-1200, and Budapest University of Technology and Economics from 801-1000 to 1201-1500.

On the other hand, a couple of Hungarian universities, Semmelweis and Debrecen, have risen through participation in multi-author multi-citation projects.

It is difficult to see what benefit Hungary will get from paying THE for insights, reports, and targets from an organization that has limited competence in the assessment and analysis of academic performance. Seriously, what insights could you get from an organization that in recent years has declared Anglia Ruskin University to be the world leader for research impact, Anadolu University for knowledge transfer, and Macau University of Science and Technology for International Outlook?

It is true that THE is outstanding in public relations and event management, and the universities will no doubt benefit from high praise at prestigious events and receive favourable headlines and awards. It is hard, though, to see that THE are able to provide the knowledgeable and informed advice that universities need to make difficult decisions in the coming years. 



Sunday, April 07, 2024

What happens to those who leave THE?

Times Higher Education (THE) appears to be getting rather worried about leading universities such as Rhodes University, University of Zurich, Utrecht University, and some of the Indian Institutes of Technology boycotting its World University Rankings (WUR) and not submitting data.

Thriving Rankings?

We have seen articles about how the THE rankings are thriving, indeed growing explosively. Now, THE has published a piece about the sad fate that awaits the universities that drop out of the WUR or their Impact Rankings. 

Declining Universities?

An article by two THE data specialists reports that 611 universities that remained in the THE world rankings from 2018 to 2023 retained, on average, a stable rank in the THE reputation ranking. The 16 who dropped out saw a significant decline in their reputation ranks, as did 75 who are described as never being in the WUR.

The last category is a bit perplexing. According to Webometrics, there are over 30,000 higher education institutions in the world, and nearly 90,000 according to Alex Usher of HESA. So, I assume that THE is counting only those that got votes or a minimum number of votes in their reputation ranking. 

We are not told who the 75 never-inners or the 16 defectors are (although some, such as six Indian Institutes of Technology, are well known), so it is difficult to check THE's claims. However, it is likely that an institution that boycotted the THE WUR would also discourage its faculty from participating in the THE academic survey, which would automatically tend to reduce its reputation score since THE allows self-voting.

Also, we do not know if there have been changes in the weighting for country and subject and how that might modify the raw survey responses. A few years ago, I noticed that Oxford's academic reputation fluctuated with the percentage of survey responses from the humanities. It is possible that adjustments like that might affect the reputation scores of the leavers. 

The opacity of THE's methodology and the intricacies of its data processing system mean that we cannot be sure about THE's claim that departure from the world rankings would have a negative impact. In addition, there is always the possibility that universities on a downward trend might be more likely to pull out because their leaders are concerned about their rankings, so the withdrawal is a result, not the cause of the decline. 

We should also remember that reputation scores are not everything. If a decline in reputation was accompanied by an improvement in other metrics, it could be a worthwhile trade.

What happened to the IITs in the THE WUR?

Fortunately, we can check THE's claims by looking at a group of institutions from the same country and with the same subject orientation. In the 2019-20 world rankings, twelve Indian Institutes of Technology were ranked. Then, six -- Bombay, Madras, Delhi, Kanpur, Kharagpur, Roorkee --  withdrew from the WUR, and six -- Ropar, Indore, Gandhinagar, Guwahati, Hyderabad, Bhubaneswar --  remained, although two of these withdrew later. 

So, let's see what happened to them. First, look at the overall ranks in the WUR itself and then in Leiden Ranking, the Shanghai Rankings (ARWU), and Webometrics.

Looking at WUR, it seems that if there are penalties for leaving THE, the penalties for remaining could be more serious. 

Among the IITs in the 2020 rankings, Ropar led in the 301-350 band, followed by Indore in the 351-400 band. Neither of them is as reputable in India as senior IITs such as Bombay and Madras, and they had those ranks because of remarkable citation scores, although they did much less well for the other pillars. This anomaly was part of the reason for the six leavers to depart.

Fast-forward to the 2024 WUR. IIT Ropar has fallen dramatically to 1001-1200,  Indore, which had fallen from 351-400 to 601-800 in 2023, has opted out, and Gandhinagar has fallen from 501-600 to 801-1000. Bhubaneswar, which was in the 601-800 band in the 2020 WUR,  fell to 1001-1200 in 2022 and 2023 and was absent in 2024. Guwahati and Hyderabad remained in the 601-800 band.

Frankly, it looks like staying in the THE WUR is not always a good idea. Maybe their THE reputation improved, but four of the original remaining IITs suffered serious declines.

IITs in Other Rankings

Now, let's examine the IITs' performance in other rankings. First, the total publications metric in Leiden Ranking. Between 2019 and 2023, four of the six early leavers rose, and two fell. The late leavers, Hyderabad and Indore, were absent in 2019 and were ranked in the 900s in 2023. Remainer Guwahati rose from 536th in 2019 to 439th in 2023.

For Webometrics, between 2019 and 2024, all 12 IITs went up except for Bombay.

Finally, let's check the overall scores in the QS WUR. Between 2021 and 2024, four of the six leavers went up, and two went down. Of the others, Guwahati went up, and Hyderabad went down.

So, looking at overall ranking scores, it seems unlikely that boycotting THE causes any great harm, if any. On the other hand, if THE is tweaking its methodology or something happens to a productive researcher, staying could lead to an embarrassing decline.

IITs' Academic Reputation Scores

Next, here are some academic reputation surveys. The  US News Best Global Universities is not as helpful as it could be since it does not provide data from previous editions, and the Wayback Machine doesn't seem to work very well. However, the Global Research Reputation metric in the most recent edition is instructive. 

The six escapees had an average rank of 272, ranging from 163 for Bombay to 477 for Roorkee.

The remainers' ranks ranged from 702 for Guwahati to 1710 for Bhubaneswar. Ropar was not ranked at all. So, leaving THE does not appear to have done the IITs any harm in this metric.

Turning to the QS WUR academic reputation metric, the rank in the academic survey for the leavers ranges from 141 for Bombay to 500 for Roorkee. They have all improved since 2022. The best performing remainer is Guwahati in 523rd place.  Ropar and Gandhinagar are not ranked at all. Bhubaneswar, Indore and Hyderabad are all at 601+.  

Now for Round University Ranking's reputation ranking. Four of the six original leavers were there in 2019. Three fell by 2023 and Delhi rose. Two, Bombay and Roorkee, were absent in 2019 and present in 2023.

This might be considered evidence that leaving THE leads to a loss of reputation. But five of the original remainers are not ranked in these rankings, and Guwahati is there in 2023 with a rank of 417, well below that of the six leavers. 

There is then scant evidence that leaving WUR damaged the academic reputations of those IITs that joined the initial boycott, and their overall rankings scores have generally improved.

On the other hand, for IITs Ropar and Bhubaneswar, remaining proved disastrous.

IITs and Employer Reputation

In the latest GEURS employer rankings, published by Emerging, the French consulting firm, there are four exiting IITs in the top 250, Delhi, Bombay, Kharagpur, and Madras, and no remainers.

In the QS WUR Employer Reputation indicator, the boycotters all perform well. Bombay is 69th and Delhi is 80th. Of the six original remainers two, Ropar and Gandhinagar, were not ranked by QS in their 2024 WUR. Three were ranked 601 or below, and Guwahati was 381st, ahead of Roorkee in 421st place.

Conclusion

Looking at the IITs, there seems to be little downside to boycotting THE WUR, and there could be some risk in staying, especially for institutions that have over-invested in specific metrics. It is possible that the IITs are atypical, but so far there seems little reason to fear leaving the THE WUR. A study of the consequences of boycotting the THE Impact Rankings is being prepared.






Saturday, March 16, 2024

THE's Big Bang Ranking

 


Another day, another ranking. 

Times Higher Education (THE) has published a "bang for the bucks" ranking.

THE is taking the scores for institutional income, research income, and income from industry and comparing them with the scores "for research, teaching, and working with industry." This, presumably, is supposed to reveal those universities that are able to process their funding efficiently and turn it into publications, citations, patents, doctorates, and survey responses.

There are some methodological issues here. It is not clear exactly how the income scores are calculated. Is it from the raw monetary data that THE collects from universities, or has it been through the THE standardization and normalization machine? Is there some sort of weighting or just an average of the three income categories? 
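One plausible reading of the calculation is sketched below with entirely hypothetical numbers; since THE has not published the formula, the simple averaging, the absence of weighting, and the use of raw rather than standardized scores are all assumptions:

```python
# A sketch of one plausible reading of THE's "bang for the bucks" calculation:
# divide the average of a university's output-pillar scores by the average of
# its income scores. All figures are hypothetical, and THE has not disclosed
# whether the scores pass through its standardization machine first.

def bang_for_buck(output_scores, income_scores):
    """Ratio of mean output score to mean income score."""
    return (sum(output_scores) / len(output_scores)) / (
        sum(income_scores) / len(income_scores)
    )

# Hypothetical university: modest income scores, strong citation-driven output.
outputs = [55.0, 40.0, 90.0]   # e.g. teaching, research, research-quality scores
incomes = [20.0, 25.0, 15.0]   # institutional, research, and industry income scores

print(f"bang: {bang_for_buck(outputs, incomes):.2f}")
```

On this reading, an understated income figure mechanically inflates the ratio, which is exactly the vulnerability raised below in connection with underreported income.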

Also, there is a chart that suggests that all the scores are counted except for the financial metrics, but the text implies that the international pillar is not counted as part of the bang that THE purports to measure.

Another issue is that the financial data in the THE rankings refers to the year two years before the date of publication. However, citation and publication data are from a five- or six-year period before the ranking is published. In effect, THE is claiming that their favored schools have a remarkable ability to send money back in time to the years when research proposals were written, papers published, and citations recorded.

THE lists ten countries as good bang producers, starting with the UK and including Pakistan and Egypt. It does not list China, South Korea, Canada, or Australia, which should make us a little suspicious. 

Then, looking at the list of twenty universities with the biggest bangs, we see a few familiar names, including Brighton and Sussex Medical School, Babol Noshirvani University of Technology, and Vita-Salute San Raffaele University, that have appeared in this blog before because they received remarkably high scores for citations and consequently did well in the overall rankings. Some, including Quaid-i-Azam University, COMSATS University, Auckland University of Technology, Government College University Faisalabad, and University College London, have contributed to citation-rich multi-contributor papers from the Global Burden of Disease Studies or the Large Hadron Collider project. Others, such as Shoolini University of Biotechnology and Management Sciences and Malaviya National Institute of Technology, have scores for research quality that are disproportionate to those for research environment or teaching. It looks as though a lot of THE's Big Bang simply consists of getting masses of citations. 

It is also possible that universities might obtain a good bang for the buck score by underreporting their income, perhaps accidentally, which would help here, although not in conventional rankings. This has happened to Trinity College Dublin and probably to Harvard, although the latter case went unnoticed by almost everyone. Probably, the very high scores for Sorbonne University and Universite Paris Cite result from the special features of the French funding system.

I suspect quite a few institutions will take this ranking seriously or pretend to and use it as a pretext to try to obtain more largesse from increasingly impoverished states.

It would seem that THE is engaged in a public relations exercise for upmarket British and perhaps for US and continental universities. These are doing all sorts of amazing, brilliant, and exciting things for which they receive insufficient funds from cheapskate governments.  Just imagine what they could do if they got as much money as Chinese universities do. 



Wednesday, February 28, 2024

Comments on the THE Reputation Rankings

Times Higher Education (THE) has announced the latest edition of its reputation ranking. The scores for this ranking will be included in the forthcoming World University Ranking and THE's other tables, where they will have a significant or very significant effect. In the Japan University Ranking, they will get an 8% weighting, and in the Arab University Ranking, 41%. Why THE gives such a large weight to reputation in the Arab rankings is a bit puzzling. 

The ranking is based on a survey of researchers "who have published in academic journals, have been cited by other researchers and who have been published within the last five years," presumably in journals indexed in  Scopus.

Until 2022 the survey was run by Elsevier but since then has been brought in-house. 

The top of the survey tells us little new. Harvard is first and is followed by the rest of the six big global brands: MIT, Stanford, Oxford, Cambridge, and Berkeley. Leading Chinese universities are edging closer to the top ten.

For most countries or regions, the rank order is uncontroversial: Melbourne is the most prestigious university in Australia, Toronto in Canada, Technical University of Munich in Germany, and a greyed-out Lomonosov Moscow State University in Russia. However, there is one region where the results are a little eyebrow-raising. 

As THE has been keen to point out, there has been a remarkable improvement in the scores for some universities in the Arab region. This in itself is not surprising. Arab nations in recent years have invested massive amounts of money in education and research, recruited international researchers, and begun to rise in the research-based rankings such as Shanghai and Leiden. It is to be expected that some of these universities should start to do well in reputation surveys.

What is surprising is which Arab universities have now appeared in the THE reputation ranking. Cairo University, the American University of Beirut, Qatar University, United Arab Emirates University, KAUST, and King Abdulaziz University have achieved some success in various rankings, but they do not make the top 200 here. 

Instead, we have nine universities: the American University in the Middle East, Prince Mohammed Bin Fahd University, Imam Mohammed Ibn Saud Islamic University, Qassim University, Abu Dhabi University,  Zayed University, Al Ain University, Lebanese University, and Beirut Arab University. These are all excellent and well-funded institutions by any standards, but it is hard to see why they should be considered to be among the world's top 200 research-orientated universities.

None of these universities makes it into the top 1,000 of the Webometrics ranking or the RUR reputation rankings. A few are found in the US News Best Global Universities, but none get anywhere near the top 200 for world or regional reputation. They do appear in the QS world rankings but always with a low score for the academic survey.

THE accepts that survey support for these universities comes disproportionately from within the region, in marked contrast to US institutions, and claims that Arab universities have established a regional reputation but have yet to sell themselves to the rest of the world.

That may be so, but again, there are several Arab universities that have established international reputations. Cairo University is in the top 200 in the QS academic survey and the RUR reputation ranking, and the American University of Beirut is ranked 42nd for regional research reputation by USN. They are, however, absent from the THE reputation ranking. 

When a ranking produces results that are at odds with other rankings and with accessible bibliometric data, then a bit of explanation is needed.


  




Wednesday, January 17, 2024

Rankings and the Threat from the East

Recently, we have heard a lot about global university rankings' responsibilities. Some have drawn attention to the increasing number of universities included in the rankings or the new rankings that allow universities to showcase the remarkable and interesting things they have been doing for society or the environment. There are claims that the well-known rankers are promoting global equity by including many more African and Asian universities.

Perhaps. But it seems that some rankings, particularly the two big-name ones, THE and QS, have another function, which is to downplay the rise of Chinese and maybe other Asian institutions and maintain the dominant position of the elite schools of the Global North. 

The table below shows the number of universities included in the top 100 universities in various global rankings. The table is arranged in ascending order according to the number of Mainland Chinese universities and refers to the most recent edition. 

Chinese universities are apparently uninterested in the rankings that supposedly assess contributions or commitment to the environment, sustainability, or equity. There are none in the top 100 of the new QS Sustainability Rankings or the GreenMetric Rankings and only one in the THE Impact Rankings. On the other hand, China does very well in the Nature Index and in Leiden Ranking's indicators for total publications and publications in the top 1% most cited, and fairly well in the URAP, Scimago, and National Taiwan University rankings. In short, China does best in those rankings that emphasize recent achievements in research in STEM subjects. 

The UK does best in rankings that include a substantial weighting for reputation, internationalization,  or activity related to sustainability and much less well in research-based rankings. 

The USA hasn't really bothered with the GreenMetric and THE Impact rankings. Its best performances are in UniRank, which is a measure of web activity; Webometrics, which is half web activity; CWUR, which includes faculty and alumni achievement; and the US News Best Global Universities, which has a strong reputation element. It is not so good in Nature Index, URAP, and NTU, which are research-based. 

It seems, to simplify a bit, that while British and American universities benefit from indicators that measure or try to measure resources, reputation, web presence, and international activity, Chinese and some other Asian institutions are rapidly moving ahead in research and innovation.


Table: Number of Universities in the Top 100

Ranking                   Country of publication    USA    UK    Mainland China

QS Sustainability         UK                         12    28         0
UniRank                   Australia                  75     7         0
GreenMetric               Indonesia                   3     3         1
THE Impact                UK                          6    25         1
USN Global                USA                        41    11         4
MosIUR                    Russia                     41    15         4
QS World                  UK                         27    17         5
CWUR                      UAE                        50     9         6
GEURS                     France                     18     8         6
Webometrics               Spain                      53     9         6
THE World                 UK                         36    11         7
RUR                       Georgia                    38    11         7
ARWU                      China                      38     8        10
NTU                       Taiwan                     36    10        14
Scimago - universities    Spain                      39     7        24
URAP                      Turkiye                    28     6        23
Leiden P 1%               Netherlands                39     8        28
Leiden P                  Netherlands                31     6        36
Nature Index - academic   UK                         37     5        35


Friday, December 15, 2023

Yet another example of the misuse of rankings

The proliferation of rankings has led to universities selectively quoting metrics in attempts to boost prestige, student applications, and state support. A recent example is Brunel University's claim that it is the joint most international university in the UK and fourth most international in the world.

This is based on the International Outlook pillar in the most recent edition of the Times Higher Education (THE) world rankings.

THE is not the only ranking with an internationalisation indicator. Let's take a look at the others.

In the QS world rankings Brunel is 9th in the UK for International Faculty, joint 12th for International Students, and 36th for International Research Network.

In the latest URAP (at the time of writing) it is 34th in England for International Collaboration.

In Round University Rankings, Brunel is 9th for International academic staff in the UK, 17th for international students, and 22nd for International Level.

In Leiden Ranking it is joint 6th in the UK for International Collaboration.

I don't want to denigrate Brunel in any way but the claim that it is the most international university in the UK is misleading and should be withdrawn or at least accompanied by a very big *.

Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year. The US News global rankings announced last year can be filtered by subject. The methods are different, and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings, Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents, but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.

These rankings continued to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each, and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings, China is now dominant in Engineering subjects. Chinese universities lead in fifteen subjects, although Harvard, MIT, and Northwestern University between them lead in seven. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and the University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University.

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley.

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University.

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.