
Wednesday, April 09, 2025

The Decline of Harvard

 

Published on Substack, 08 April 2025

Quite a few stories have come out of the Ivy League about how standards are collapsing. I used to think this was just the perennial lament of teachers everywhere that today's students are inferior to those of their day. But the stories are coming faster these days, and they seem to be consonant with declining cognitive skills throughout the West, a general disengagement by students, increasing rates of plagiarism, rejection of science and liberal values, and the ardent embrace of extremist ideologies.

Perhaps the most striking story was Harvard's introduction of remedial math courses for some of its students, a consequence of suspending the requirement to submit SAT and ACT scores after the COVID-19 outbreak.

I suspect that the problem may go deeper than that, and remedial courses at Harvard and other elite schools may become permanent, although probably presented as enrichment programs or something like that.

But this is all anecdotal. Global rankings provide more systematic evidence, and they show that Harvard is steadily declining relative to international universities and even to its peers in the USA.

Here is a prediction. This year, next year, or maybe the year after, Harvard will cede its position as the top university in the world in the publications metric in the Shanghai Rankings to Zhejiang University in Hangzhou.

The Shanghai Rankings, officially known as the Academic Ranking of World Universities (ARWU), have six indicators: Nobel Prizes and Fields Medals won by alumni, the same awards won by faculty, papers in Nature and Science, Highly Cited Researchers, publications in the Science Citation Index Expanded and the Social Sciences Citation Index, and Per Capita Productivity, which is the sum of the other five scores divided by the number of faculty.
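As a rough illustration, the per-capita calculation described above amounts to scaling an institution's summed indicator scores by its academic staff numbers. The indicator names and figures below are invented for the sketch, not ARWU data:

```python
# Sketch of ARWU's Per Capita Productivity indicator: the sum of the
# other five indicator scores divided by the number of full-time-
# equivalent academic staff. All scores here are hypothetical.
def per_capita_productivity(scores: dict[str, float], fte_faculty: float) -> float:
    """Sum the five institutional scores and scale by faculty size."""
    return sum(scores.values()) / fte_faculty

example_scores = {
    "alumni_awards": 90.0,
    "faculty_awards": 85.0,
    "nature_science": 95.0,
    "highly_cited": 88.0,
    "publications": 100.0,
}
print(per_capita_productivity(example_scores, fte_faculty=2.0))  # 229.0
```

This is why a small institution like Caltech can lead the per-capita indicator while trailing larger schools on the raw totals: the denominator does the work.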

When they began, the Shanghai Rankings placed Harvard in first place overall and for all the indicators except for productivity, where Caltech has always held the lead. However, in 2022, Harvard lost its lead to Princeton for faculty winning the Nobel and Fields awards. The coming loss of supremacy for publications will mean that Harvard will lead in just half of the six indicators.

This is only one sign of Harvard's decline. Other rankings tell a similar story. Back in 2010, when QS started producing independent rankings, Harvard was displaced from first place by Cambridge, which was in turn superseded by MIT, which has held first place ever since. In the THE rankings, Caltech deposed Harvard in 2013 and was overtaken by Oxford in 2017.

I have no great faith in THE or QS, but this is suggestive. Then we have the more rigorous research-based rankings. In the 2024 Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University, Zhejiang University took over first place from Harvard for publications, although not, or not yet anyway, for publications in the top 10% or 1% of journals. In the SCImago Institutions Rankings, published in Spain, Harvard is now fourth overall, although still the leading university.

But when we look at computer science and engineering rankings, it is clear that Harvard has fallen dramatically in areas crucial to economic growth and scientific research over the last few decades.

Shanghai has Harvard in 10th place for computer science; the National Taiwan University Rankings put it 11th, behind Wisconsin, the Georgia Institute of Technology, Texas at Austin, and Carnegie Mellon; the SCImago Institutions Rankings put it 61st; and the University Ranking by Academic Performance (URAP), published by the Middle East Technical University in Ankara, puts it 35th.

For engineering, the picture is just as grim. The Taiwan rankings have Harvard 31st, SCImago 42nd, and URAP 71st.

Fine, you might say, but the bottom line is jobs and salaries. Let’s look at the latest Financial Times MBA rankings, where Harvard has plunged to 13th place. A major reason for that was that nearly a quarter of the class of 2024 could not find jobs after graduating. According to Poets & Quants, Harvard’s “placement numbers are below every M7 peer, including Stanford, Wharton, Columbia, Kellogg, and Booth, with only one exception: MIT Sloan which is equal to HBS.”

It seems that Harvard's problems are entrenched and pervasive. They may have been exacerbated by the pandemic, but their roots go further back and run deeper. So what is the cause of this decline? I doubt that the usual villain, underfunding by vicious governments or offended donors, has anything to do with it, although the announced Trumpian cuts may have an effect in the future.

A plausible hypothesis is that Harvard has drifted away from meritocracy in student admissions and assessment and, more significantly, faculty appointments and promotion.

Perhaps the concept of Harvard’s meritocracy has always been overblown. A few years ago, I was researching early American history and came across a reference to a prominent Massachusetts landowner who had graduated first in his class at Harvard. I was baffled because I thought I should have heard of somebody that brilliant. But it turned out that Harvard before the Revolution ranked students according to their perceived social status, a practice that ended with Independence, after which they were ranked alphabetically. The idea of sorting students academically seems to have become widespread only in the twentieth century.

Even after Harvard supposedly embraced meritocracy by introducing the SAT, the GRE, and other tests and linking tenure to publications and citations, it still admitted large numbers of legacies, athletes, persons of interest to the dean, and beneficiaries of affirmative action.

It seems that Harvard is returning to its earlier model of subordinating academic performance to character, athletic ability, conformism, and membership of favored groups. It has appointed a president who is almost certainly the only Harvard professor in the humanities and social sciences not to have written a book. It has admitted students who are incapable or unwilling to do the academic work that elite universities used to require. And its global reputation is slowly eroding.


Tuesday, February 25, 2025

Comments on the New Edition of the THE Reputation Rankings

Times Higher Education (THE) has just published the latest edition of its World Reputation Rankings. At the top, it is business as usual. We have the big six super brands: Harvard, MIT, Oxford, Stanford, Cambridge, and Berkeley. After that, there are no real surprises. The top fifty includes Ivy League schools like Princeton and Yale, rising Asian giants like Tsinghua, Tokyo, and the National University of Singapore, established European institutions like LMU Munich and KU Leuven, and well-known London colleges, LSE and UCL.

But then things start to get interesting. THE has introduced some drastic methodological changes, and these have led to a significant amount of churning.   

A bit of context: last year, the reputation rankings recorded an apparently remarkable achievement by nine Arab universities that came into the top 200 from nowhere. Later, THE announced that it had discovered a "syndicate" that was trading votes in the reputation surveys and that measures would be taken to stop that and penalise the universities involved.

But THE was not satisfied with that and has revamped its methodology to include two new metrics in addition to the simple counting of votes for the best universities for teaching and research.

The first of these is pairwise comparison. This means, according to THE, that universities are preselected "informed from their publication history," and respondents then place them in order from 1 to 5, thus encouraging them to consider places other than the super brands. Exactly how that preselection works is not clear.

The second is voter diversity, which rewards universities if they have more countries and more subjects in their respondent base, which, THE claims, indicates a strong reputation. 

Whatever THE's intentions, the overall result of these changes is clear. The USA, UK, Netherlands, Canada, Australia, Germany, Sweden, and Switzerland have all increased their number of universities in the top 200.

The biggest gainers are the UK, which has increased its representation in the top 200 from 20 to 24; Switzerland, which has gone from 4 to 7; and the Netherlands, which has also added another three universities.

Of the twenty British universities in the top 200, eleven have risen, seven are in the same rank or band, and only two, Birmingham and Sheffield, have fallen. It is hard to believe that there has been such a widespread improvement in the international reputation of British universities. 

In contrast, the new methodology has been disastrous for universities in China, Russia, the Arab region, India, Israel, Japan, and South Korea. 

The number of Chinese universities in the top 200 has fallen from 15 to 8. While Tsinghua and Peking Universities have retained their places, others have fallen: Shanghai Jiao Tong University from 43rd place to 58th, the University of Science and Technology of China from 61-70 to 101-150, and the Harbin Institute of Technology from 101-125 to 201-300.

Russia has fared even worse. There were six greyed-out Russian universities in the 2023 rankings. Now, there are just two, Lomonosov Moscow State University, in 83rd place, down from 34th, and Bauman Moscow State Technical University, down from 60-70 to 201-300. The latter gets 1.7 points in the pairwise comparison. All of the others are gone.

In 2023, there were four Indian universities in the rankings: the Indian Institute of Science and the Indian Institutes of Technology Bombay, Delhi, and Madras. Now, IIT Bombay has been removed altogether, and the Indian Institute of Science and IITs Delhi and Madras have been demoted to the 201-300 band. They are joined by Siksha 'O' Anusandhan, which has a global research rank of 1900 in the US News Best Global Universities.

The worst-hit area is the Arab region. Universities in Kuwait, Lebanon, Saudi Arabia, and the UAE have dropped out. The only Arab university remaining is King Abdullah University of Science and Technology, which does not teach undergraduates.

A glimpse of the impact of the new metrics can be seen by looking at the universities that come at the bottom for each metric. 

For voter counts, the ten worst universities are all European, followed by a handful of Australian and Canadian universities and Siksha 'O' Anusandhan, suggesting that it is the new indicators that keep them in the ranking.

For pairwise comparison, the bottom is quite diverse: there we find Bauman Moscow State Technical University, the University of Liverpool, the University of Buenos Aires, Beijing Normal University, Université Paris Cité, and the University of Cape Town.

The universities that do worst for voter diversity are mainly South Korean, Indian, Turkish, and Japanese. 

It seems then that one function of the new methodology is to slow down the advance of Asian universities and maintain the status of the Western elite. 








Wednesday, October 23, 2024

Are Australian universities really on a precipice?

  


Times Higher Education (THE) recently published the latest edition of its World University Rankings (WUR), which contained bad news for Australian higher education. The country's leading universities have fallen down the rankings, apparently because of declines in their research and teaching reputation scores and in international outlook, that is, international students, staff, and collaboration.

THE reported that Angel Calderon of RMIT had said that the "downturn had mainly been driven by declining scores in THE's reputation surveys" and warned that there was worse to come.

Australian universities have responded by demanding that the cap on international students be lifted to avoid financial disaster. Nobody seems to consider how the universities got to the point where they could not survive without recruiting researchers and students from abroad.

It is, however, a mistake to predict catastrophe from a single year’s ranking. Universities have thousands of faculty, employees, and students and produce thousands of patents, articles, books, and other outputs. If a ranking produces large-scale fluctuations over the course of a year, that might well be due to deficiencies in the methodology rather than any sudden change in institutional quality.

There are now several global university rankings that attempt to assess universities' performance in one way or another. THE is not the only one, nor is it the best; in some ways, it is the worst or nearly the worst. For universities to link their public image to a single ranking, or even a single indicator, especially one as flawed as THE's, is quite risky.

To start with, THE is very opaque. Unlike QS, US News, National Taiwan University, Shanghai Ranking, Webometrics, and other rankings, THE does not provide ranks or scores for each of the metrics that it uses to construct the composite or overall score. Instead, they are bundled together in five "pillars". It is consequently difficult to determine exactly what causes a university to rise or fall in any of these pillars. For example, an improvement in the teaching pillar might be due to increased institutional income, fewer students, fewer faculty, an improved reputation for teaching, more doctorates, fewer bachelor degrees awarded, or some combination of these.

Added to this are some very dubious results from the THE world and regional rankings over the years. Alexandria University, Aswan University, Brighton and Sussex Medical School, Anglia Ruskin University, Panjab University, Federico Santa Maria Technical University, Kurdistan University of Medical Sciences, and the University of Peradeniya have been at one time or another among the supposed world leaders for research quality measured by citations. Leaders for industry income, which is claimed to reflect knowledge transfer, have included Anadolu University, Asia University, Taiwan, the Federal University of Itajubá, and Makerere University.

The citations indicator has been reformed and is now the research quality indicator, but there are still some oddities at its upper level, such as Humanitas University, Vita-Salute San Raffaele University, Australian Catholic University, and St George’s, University of London, probably because they participated in a few highly cited multi-author medical or physics projects.

It now seems that the reputation indicators in the THE WUR are producing results that are similarly lacking in validity. Altogether, reputation counts for 33%, divided between the research and teaching pillars. A truncated version of the survey results, covering the top 200 universities and giving exact scores for the top fifty, was published earlier this year, and the full results were incorporated into the recent world rankings.

Until 2021, THE used the results of a survey conducted by Elsevier among researchers who had published in journals in the Scopus database. After that, THE brought the survey in-house and ran it itself. That may have been a mistake. THE is brilliant at convincing journalists and administrators that it is a trustworthy judge of university quality, but it is not so good at actually assessing such quality, as the above examples demonstrate.

After bringing the survey in-house, THE increased the number of respondents from 10,963 in 2021 to 29,606 in 2022, 38,796 in 2023, and 55,689 in 2024. It seems that this is a different kind of survey, since the new influx of respondents is likely to contain fewer researchers from countries like Australia. One might also ask how such a significant increase was achieved.

Another issue is the distribution of survey responses by subject. In 2021, a THE post on the reputation ranking methodology indicated the distribution of responses among academic fields and the proportions by which the responses were rebalanced. So, while 9.8% of responses came from computer science, their weight was reduced to reflect that field's 4.2% share of international researchers. It seems that this information has not been provided for the 2022 or 2023 reputation surveys.
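The rebalancing described above can be sketched as a simple ratio: each vote from a field is scaled by that field's share of the world's researchers divided by its share of survey responses. The 9.8% and 4.2% computer science figures come from THE's 2021 post; everything else here is an illustrative guess at the mechanism, not THE's actual procedure:

```python
# Sketch of reweighting survey votes so each field's influence matches
# its share of the world's researchers rather than its share of responses.
def field_weight(target_share: float, observed_share: float) -> float:
    """Weight applied to each vote from a given field."""
    return target_share / observed_share

# Computer science: 9.8% of responses, but only 4.2% of researchers,
# so each CS vote is scaled down to roughly 0.43 of a vote.
cs_weight = field_weight(target_share=0.042, observed_share=0.098)
print(round(cs_weight, 3))  # 0.429
```

Under a scheme like this, a shift in the subject mix of respondents changes every university's score even if no individual opinion changes, which is why the withheld distributions matter.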

In 2017 I noted that Oxford’s reputation score tracked the percentage of THE survey responses from the arts and humanities, rising when there are more respondents from those fields and falling when there are fewer. So, the withholding of information about the distribution of responses by subjects is also significant since this could affect the ranking of Australian universities.

Then we have the issue of the geographical distribution of responses. THE has a long-standing policy of recalibrating its results to align with the number of researchers in each country, according to data submitted to and published by UNESCO.

There are good reasons to be suspicious of data emanating from UNESCO, some of which have been presented by Sasha Alyson.                               

But even if the data were totally accurate, there is still a problem that a university’s rise or fall in reputation might simply be due to a change in the relative number of researchers reported by government departments to the data crunching machines at THE.

According to UNESCO, the number of researchers per million inhabitants in Australia and New Zealand fell somewhat between 2016 and 2021. On the other hand, the number rose for Western Asia, Southern Asia, Eastern Asia, Latin America and the Caribbean, and Northern Africa.

If these changes are accurate, it means that some of Australia's declining research reputation is due to the increase in researchers in other parts of the world and not necessarily to any decline in the quality or quantity of its research.

Concerns about THE's reputation indicators are further raised by looking at some of the universities that did well in the recent reputation survey.

Earlier this year, THE announced that nine Arab universities had achieved the distinction of reaching the top 200 of the reputation rankings, although none were able to reach the top 50, where the exact score and rank were given. THE admitted that the reputation of these universities was regional rather than global. In fact, as some observers noted at the time, it was probably less than regional and primarily national.

It was not the rise of Arab universities in the reputation rankings that was disconcerting in itself. Quite a few leading universities from that region have begun to produce significant numbers of papers, citations, and patents and to attract the attention of international researchers, but they were not among those doing so well in THE's reputation rankings.

Then, last May, THE announced that it had detected signs of "possible relationships being agreed between universities" and that steps would be taken, although not, it would seem, in time for the recent WUR.

More recently, a LinkedIn post by Egor Yablonsky, CEO of E-Quadratic Science & Education, reported that a few European universities had significantly higher ranks for reputation than in the overall world rankings.

Another reason Australia should be cautious of the THE rankings and their reputation metrics is that Australian universities' ranks in the THE reputation rankings are much lower than they are for Global Research Reputation in the US News (USN) Best Global Universities or Academic Reputation in the QS World rankings.

In contrast, some French, Chinese and Emirati universities do noticeably better in the THE reputation ranking than they do in QS or USN.

 

Table: Ranks of leading Australian universities

University | THE reputation 2023 | USN global research reputation 2024-2025 | QS academic reputation 2025
Melbourne | 51-60 | 43 | 21
Sydney | 61-70 | 53 | 30
ANU | 81-90 | 77 | 36
Monash | 81-90 | 75 | 78
Queensland | 91-100 | 81 | 50
UNSW Sydney | 126-150 | 88 | 43

 

It would be unwise to put too much trust in the THE reputation survey or in the world rankings where it has nearly a one-third weighting. There are some implausible results this year, and it stretches credibility that the American University of the Middle East has a better reputation among researchers than the University of Bologna, National Taiwan University, the Technical University of Berlin, or even UNSW Sydney. THE has admitted that some of these results may be anomalous, and it is likely that some universities will fall after THE takes appropriate measures.

Moreover, the reputation scores and ranks for the leading Australian universities are significantly lower than those published by US News and QS. It seems very odd that Australian universities are embracing a narrative that comes from such a dubious source and is at odds with other rankings. It is undeniable that universities in Australia are facing problems. But it is no help to anyone to let dubious data guide public policy.

So, please, will all the Aussie academics and journalists having nervous breakdowns relax a bit and read some of the other rankings, or just wait until next year, when THE will probably revamp its reputation metrics?

 

Thursday, June 13, 2024

Imperial Ascendancy


The 2025 QS World University Rankings have just been announced. As usual, when there are big fluctuations in scores and ranks, the media are full of words like soaring, rising, plummeting, and collapsing. This year, British universities have been more plummeting than soaring, and this has generally been ascribed to serious underfunding of higher education by governments who have been throwing money at frivolities like childcare, hospitals, schools, roads, and housing.

There has been a lot of talk about Imperial College London rising to second in the world and first in the UK, ahead of Harvard, Oxford, and Cambridge. Imperial's president, quoted in Imperial News, spoke about quality, commitment, and "interrogating the forces that shape our world."

The article also referred to the university's achievements in the THE world rankings, the Guardian University Guide, and the UK's Research and Teaching Excellence Frameworks. It does not mention that Round University Ranking has had Imperial first in the UK since 2014.

So what exactly happened to propel Imperial ahead of Harvard, Oxford, and Cambridge? Perhaps commitment and the interrogation of forces were there in the background, but the more proximate causes were the methodological changes introduced by QS last year. There have been no further changes this year, but the QS rankings do seem to have become more volatile.

In 2023, QS introduced three new indicators. The first is the International Research Network, which measures the breadth rather than the quantity of international research collaborations. This favored universities in English-speaking countries and led to a reported boycott by South Korean universities. 

That boycott does not seem to have done Korean universities any harm since many of them have risen quite significantly this year.

QS has also added an Employment Outcomes metric that combines graduate employment rates and an alumni index of graduate achievements scaled against student numbers. 

Then there is a sustainability indicator based on over fifty pieces of data submitted by institutions. Some reputable Asian universities get low scores here, suggesting that they have not submitted data or that the data has been judged inadequate by the QS validators.

Imperial rose by exactly 0.7 points between the 2024 and the 2025 world rankings, while Harvard, Oxford, and Cambridge all fell. Its score declined for three indicators, Faculty Student Ratio, Citations per Faculty, and International Students, and remained unchanged for International Faculty.

The improvements in the weighted scores of five indicators are listed below:

Employment Outcomes: 0.52
Sustainability: 0.265
Academic Reputation: 0.15
International Research Network: 0.035
Employer Reputation: 0.015

Imperial has improved in all of the new indicators, very substantially for Employment Outcomes and Sustainability, and also in the reputation indicators. I suspect that the Imperial ascendancy may not last long as its peers, especially in Asia, pay more attention to the presentation of employability and sustainability data.
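A quick arithmetic check of the figures above, taking them at face value, shows that the listed gains sum to about 0.985 points; since the net rise was 0.7, the three declining indicators must have cost Imperial roughly 0.285 points between them:

```python
# Weighted-score gains listed in the text (QS 2024 -> 2025 world rankings).
gains = {
    "Employment Outcomes": 0.52,
    "Sustainability": 0.265,
    "Academic Reputation": 0.15,
    "International Research Network": 0.035,
    "Employer Reputation": 0.015,
}
total_gain = sum(gains.values())
net_change = 0.7  # Imperial's overall rise, as reported
implied_losses = total_gain - net_change  # spread across the three falling indicators
print(round(total_gain, 3), round(implied_losses, 3))  # 0.985 0.285
```

In other words, most of Imperial's rise came from the two newest indicators, which is consistent with the volatility argument made above.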





Sunday, April 07, 2024

What happens to those who leave THE?

Times Higher Education (THE) appears to be getting rather worried about leading universities such as Rhodes University, University of Zurich, Utrecht University, and some of the Indian Institutes of Technology boycotting its World University Rankings (WUR) and not submitting data.

Thriving Rankings?

We have seen articles about how the THE rankings are thriving, indeed growing explosively. Now, THE has published a piece about the sad fate that awaits the universities that drop out of the WUR or their Impact Rankings. 

Declining Universities?

An article by two THE data specialists reports that 611 universities that remained in the THE world rankings from 2018 to 2023 retained, on average, a stable rank in the THE reputation ranking. The 16 that dropped out saw a significant decline in their reputation ranks, as did the 75 described as never having been in the WUR.

The last category is a bit perplexing. According to Webometrics, there are over 30,000 higher education institutions in the world and nearly 90,000, according to Alex Usher of HESA. So, I assume that THE is counting only those that got votes or a minimum number of votes in their reputation ranking. 

We are not told who the 75 never-inners or the 16 defectors are, although some, such as the six Indian Institutes of Technology, are well known, so it is difficult to check THE's claims. However, it is likely that an institution that boycotted the THE WUR would also discourage its faculty from participating in the THE academic survey, which would automatically tend to reduce its reputation scores since THE allows self-voting.

Also, we do not know if there have been changes in the weighting for country and subject and how that might modify the raw survey responses. A few years ago, I noticed that Oxford's academic reputation fluctuated with the percentage of survey responses from the humanities. It is possible that adjustments like that might affect the reputation scores of the leavers. 

The opacity of THE's methodology and the intricacies of its data processing system mean that we cannot be sure about THE's claim that departure from the world rankings would have a negative impact. In addition, there is always the possibility that universities on a downward trend might be more likely to pull out because their leaders are concerned about their rankings, so the withdrawal is a result, not the cause of the decline. 

We should also remember that reputation scores are not everything. If a decline in reputation was accompanied by an improvement in other metrics, it could be a worthwhile trade.

What happened to the IITs in the THE WUR?

Fortunately, we can check THE's claims by looking at a group of institutions from the same country and with the same subject orientation. In the 2019-20 world rankings, twelve Indian Institutes of Technology were ranked. Then six (Bombay, Madras, Delhi, Kanpur, Kharagpur, Roorkee) withdrew from the WUR, and six (Ropar, Indore, Gandhinagar, Guwahati, Hyderabad, Bhubaneswar) remained, although two of these withdrew later.

So, let's see what happened to them. First, look at the overall ranks in the WUR itself and then in Leiden Ranking, the Shanghai Rankings (ARWU), and Webometrics.

Looking at WUR, it seems that if there are penalties for leaving THE, the penalties for remaining could be more serious. 

Among the IITs in the 2020 rankings, Ropar led in the 301-350 band, followed by Indore in the 351-400 band. Neither is as reputable in India as senior IITs such as Bombay and Madras; they owed those ranks to remarkable citation scores, although they did much less well in the other pillars. This anomaly was part of the reason the six leavers departed.

Fast-forward to the 2024 WUR. IIT Ropar has fallen dramatically to 1001-1200; Indore, which had fallen from 351-400 to 601-800 in 2023, has opted out; and Gandhinagar has fallen from 501-600 to 801-1000. Bhubaneswar, which was in the 601-800 band in the 2020 WUR, fell to 1001-1200 in 2022 and 2023 and was absent in 2024. Guwahati and Hyderabad remained in the 601-800 band.

Frankly, it looks like staying in the THE WUR is not always a good idea. Maybe their THE reputation improved, but four of the six original remaining IITs suffered serious declines.

IITs in Other Rankings

Now, let's examine the IITs' performance in other rankings. First, the total publications metric in Leiden Ranking. Between 2019 and 2023, four of the six early leavers rose, and two fell. The late leavers, Hyderabad and Indore, were absent in 2019 and were ranked in the 900s in 2023. Remainer Guwahati rose from 536th in 2019 to 439th in 2023.

For Webometrics, between 2019 and 2024, all 12 IITs went up except for Bombay.

Finally, let's check the overall scores in the QS WUR. Between 2021 and 2024, four of the six leavers went up, and two went down. Of the others, Guwahati went up, and Hyderabad went down.

So, looking at overall ranking scores, it seems unlikely that boycotting THE causes any great harm, if any. On the other hand, if THE is tweaking its methodology or something happens to a productive researcher, staying could lead to an embarrassing decline.

IITs' Academic Reputation Scores

Next, here are some academic reputation surveys. The US News Best Global Universities ranking is not as helpful as it could be, since it does not provide data from previous editions, and the Wayback Machine does not seem to work very well. However, the Global Research Reputation metric in the most recent edition is instructive.

The six escapees had an average rank of 272, ranging from 163 for Bombay to 477 for Roorkee.

The remainers' ranks ranged from 702 for Guwahati to 1710 for Bhubaneswar, while Ropar was not ranked at all. So leaving THE does not appear to have done the IITs any harm in this metric.

Turning to the QS WUR academic reputation metric, the ranks of the leavers range from 141 for Bombay to 500 for Roorkee, and all have improved since 2022. The best-performing remainer is Guwahati, in 523rd place. Ropar and Gandhinagar are not ranked at all; Bhubaneswar, Indore, and Hyderabad are all at 601+.

Now for Round University Ranking's reputation ranking. Four of the six original leavers were there in 2019; three fell by 2023, and Delhi rose. Two, Bombay and Roorkee, were absent in 2019 and present in 2023.

This might be considered evidence that leaving THE leads to a loss of reputation. But five of the original remainers are not ranked in these rankings, and Guwahati is there in 2023 with a rank of 417, well below that of the six leavers. 

There is, then, scant evidence that leaving the WUR damaged the academic reputations of those IITs that joined the initial boycott, and their overall ranking scores have generally improved.

On the other hand, for IITs Ropar and Bhubaneswar remaining proved disastrous.  

IITs and Employer Reputation

In the latest GEURS employer rankings, published by Emerging, the French consulting firm, there are four exiting IITs in the top 250, Delhi, Bombay, Kharagpur, and Madras, and no remainers.

In the QS WUR Employer Reputation indicator, the boycotters all perform well. Bombay is 69th and Delhi is 80th. Of the six original remainers two, Ropar and Gandhinagar, were not ranked by QS in their 2024 WUR. Three were ranked 601 or below, and Guwahati was 381st, ahead of Roorkee in 421st place.

Conclusion

Looking at the IITs, there seems to be little downside to boycotting the THE WUR, and there could be some risk in staying, especially for institutions that have over-invested in specific metrics. It is possible that the IITs are atypical, but so far there seems little reason to fear leaving the THE WUR. A study of the consequences of boycotting the THE Impact Rankings is being prepared.






Wednesday, February 28, 2024

Comments on the THE Reputation Rankings

Times Higher Education (THE) has announced the latest edition of its reputation ranking. The scores for this ranking will be included in the forthcoming World University Ranking and THE's other tables, where they will have a significant or very significant effect. In the Japan University Ranking, they will get an 8% weighting, and in the Arab University Ranking, 41%. Why THE gives such a large weight to reputation in the Arab rankings is a bit puzzling.

The ranking is based on a survey of researchers "who have published in academic journals, have been cited by other researchers and who have been published within the last five years," presumably in journals indexed in  Scopus.

Until 2022 the survey was run by Elsevier, but since then it has been brought in-house.

The top of the survey tells us little new. Harvard is first and is followed by the rest of the six big global brands: MIT, Stanford, Oxford, Cambridge, and Berkeley. Leading Chinese universities are edging closer to the top ten.

For most countries or regions, the rank order is uncontroversial: Melbourne is the most prestigious university in Australia, Toronto in Canada, Technical University of Munich in Germany, and a greyed-out Lomonosov Moscow State University in Russia. However, there is one region where the results are a little eyebrow-raising. 

As THE has been keen to point out, there has been a remarkable improvement in the scores for some universities in the Arab region. This in itself is not surprising. Arab nations in recent years have invested massive amounts of money in education and research, recruited international researchers, and begun to rise in the research-based rankings such as Shanghai and Leiden. It is to be expected that some of these universities should start to do well in reputation surveys.

What is surprising is which Arab universities have now appeared in the THE reputation ranking. Cairo University, the American University of Beirut, Qatar University, United Arab Emirates University, KAUST, and King Abdulaziz University have achieved some success in various rankings, but they do not make the top 200 here.

Instead, we have nine universities: the American University in the Middle East, Prince Mohammed Bin Fahd University, Imam Mohammed Ibn Saud Islamic University, Qassim University, Abu Dhabi University,  Zayed University, Al Ain University, Lebanese University, and Beirut Arab University. These are all excellent and well-funded institutions by any standards, but it is hard to see why they should be considered to be among the world's top 200 research-orientated universities.

None of these universities makes it into the top 1,000 of the Webometrics ranking or the RUR reputation rankings. A few are found in the US News Best Global Universities, but none get anywhere near the top 200 for world or regional reputation. They do appear in the QS world rankings but always with a low score for the academic survey.

THE accepts that survey support for these universities comes disproportionately from within the region, in marked contrast to US institutions, and claims that Arab universities have established a regional reputation but have yet to sell themselves to the rest of the world.

That may be so, but again, there are several Arab universities that have established international reputations. Cairo University is in the top 200 in the QS academic survey and the RUR reputation ranking, and the American University of Beirut is ranked 42nd for regional research reputation by USN. They are, however, absent from the THE reputation ranking.

When a ranking produces results that are at odds with other rankings and with accessible bibliometric data, then a bit of explanation is needed.


  




Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year. The US News global rankings announced last year can be filtered by subject. The methods are different, and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings, Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents, but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.

These rankings continued to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings, China is now dominant in Engineering subjects. Chinese universities lead in fifteen subjects, although Harvard, MIT, and Northwestern University lead for seven. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems are:

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.

 

 

 

 

 

Saturday, March 18, 2023

SCImago Innovation Rankings: The East-West Gap Gets Wider

The decline of western academic research becomes more apparent every time a ranking with a stable and moderately accurate methodology is published. This will not be obvious if one just looks at the top ten, or even the top fifty, of the better known rankings. Harvard, Stanford, and MIT are usually still there at the top and Oxford and Cambridge are cruising along in the top twenty or the top thirty.

But take away the metrics that measure inherited intellectual capital, such as the Nobel and Fields laureates in the Shanghai rankings or the reputation surveys in the QS, THE, and US News world rankings, and the dominance of the West appears ever more precarious. This is confirmed if we turn from overall rankings to subject and field tables.

Take a look at the most recent edition of the CWTS Leiden Ranking, which is highly reputed among researchers although much less so among the media. For sheer number of publications overall, Harvard still holds the lead although Zhejiang, Shanghai Jiao Tong and Tsinghua are closing in and there are more Chinese schools in the top 30.  Chinese dominance is reduced if we move to the top 10% of journals but it may be just a matter of time before China takes the lead there as well. 

But click to physical sciences and engineering. The top 19 places are held by Mainland Chinese universities with the University of Tokyo coming in at 20.  MIT is there at 33, Texas A & M at 55 and Purdue 62. Again the Chinese presence is diluted, probably just for the moment, if we switch to the top 10% or 1% of journals.  

Turning to developments in applied research, the shift to China and away from the West appears even greater.

The SCImago Institutions rankings are rather distinctive. In addition to the standard measures of research activity, there are also metrics for innovation and societal impact. Also, they include the performance of government agencies, hospitals, research centres and companies.

The innovation rankings combine three measures of patent activity. Patents are problematic for comparing universities but they can establish broad long-term trends. 

Here are the top 10 for Innovation in 2009:

1.   Centre National de la Recherche Scientifique

2.   Harvard University 

3.   National Institutes of Health, USA

4.   Stanford University 

5.   Massachusetts Institute of Technology

6.   Institut National de la Santé et de la Recherche Médicale

7.   Johns Hopkins University 

8.   University of California Los Angeles

9.   Howard Hughes Medical Institute 

10.  University of Tokyo.

And here they are for 2023:

1.   Chinese Academy of Sciences 

2.   State Grid Corporation of China  

3.   Ministry of Education PRC

4.   DeepMind

5.   Ionis Pharmaceuticals

6.   Google Inc, USA

7.   Alphabet Inc 

8.  Tsinghua University

9.   Huawei Technologies Co Ltd

10.  Google International LLC.

What happened to the high-flying universities of 2009? Harvard is in 57th place, MIT in 60th, Stanford 127th, Johns Hopkins 365th, and Tokyo in 485th.

It seems that the torch of innovation has left the hands of American, European, and Japanese universities and research centres and has been passed to multinational, Chinese, and American companies and research bodies, plus a few Chinese universities. I am not sure where the loyalties of the multinational institutions lie, if indeed they have any at all.




Sunday, September 26, 2021

What is a University Really For?

Louise Richardson, Vice-Chancellor of the University of Oxford, has seen fit to enlighten us about the true purpose of a university. It is, it seems, to inculcate appropriate deference to the class of certified experts.

Professor Richardson remarked at the latest Times Higher Education (THE) academic summit that she was embarrassed that "we" had educated the Conservative politician Michael Gove who said, while talking about Brexit, that people had had enough of experts.

So now we know what universities are really about: not critical discussion, cutting-edge research, skepticism, or the disinterested pursuit of truth, but teaching respect for experts.

A few years ago I wrote a post suggesting we were now in a world where the expertise of the accredited experts was declining along with public deference. I referred to the failure of political scientists to predict the nomination of Trump, the election of Trump, the rise of Leicester City, and the Brexit vote. It looks like respect for experts has continued to decline, not entirely without reason.

Professor Richardson thinks that Gove's disdain for the Brexit experts is cause for embarrassment. While it is early days for the real effects of Brexit to become clear, it is as yet far from obvious that it has been an unmitigated disaster. It is, moreover, a little ironic that the remark was made at the latest THE academic summit, where the annual world rankings were announced. Richardson remarked that she was delighted that her university was once again ranked number one.

The irony is that the THE world rankings are probably the least expert of the global rankings although they are apparently the most prestigious at least among those institutions that are known for being prestigious.

Let's have another look at THE's Citations Indicator, which is supposed to measure research quality or impact and accounts for nearly a third of the total weighting. (Regular readers of this blog can skim or skip the next few lines.) Here are the top five from this year's rankings.

1.   University of Cape Coast

2.   Duy Tan University

3.   An Najah National University

4.   Aswan University

5.   Brighton and Sussex Medical School.

This is not an academic version of the imaginary football league tables that nine-year-old children used to construct. Nor is it the result of massive cheating by the universities concerned. It is quite simply the outcome of a hopelessly flawed system. THE, or rather its data analysts, appear to be aware of the inadequacies of this indicator, but somehow meaningful reform keeps getting postponed. One day historians will search the THE archives to find the causes of this inability to take very simple and obvious measures to produce a sensible and credible ranking. I suspect that the people in control of THE policy are averse to anything that might distract from the priority of monetising as much data as possible. Nor is there any compelling reason to rush to reform when universities like Oxford are unconcerned about the inadequacies of the current system.

Here are the top five for income from industry which is supposed to have something to do with innovation.

1.   Asia University Taiwan

2.   Istanbul Technical University

3.   Khalifa University

4.   Korean Advanced Institute of Science and Technology (KAIST)

5.   LMU Munich.

This is a bit better. It is not implausible that KAIST or Munich is a world leader for innovation. But in general, this indicator is also inadequate for any purpose other than providing fodder for publicity. See a scathing review by Alex Usher.

Would any tutor or examiner at Oxford give any credit to a student who thought that Ghana, Vietnam, and Palestine were centres of international research impact? They are all doing a remarkable job of teaching in many respects, but that is not what THE is ostensibly giving them credit for.

In addition, the THE world rankings fail to meet satisfactory standards of basic validity. Looking at the indicator scores for the top 200 universities in the most recent world rankings, we can see that the correlation between research and teaching is 0.92. In effect, these are not two distinct metrics; they are measuring essentially the same thing. A quick look at the methodology suggests that what they are comparing is income (total institutional income for teaching, research income for research), reputation (the opinion surveys for research and teaching), and investment in doctoral programmes.

On the other hand, the citations indicator does not correlate significantly with research or teaching and correlates negatively with industry income.
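For readers who want to see what a correlation of that size means in practice, here is a minimal sketch of the Pearson computation. The teaching and research scores below are invented for illustration, not THE's actual data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented teaching and research indicator scores for five universities,
# constructed to move almost in lockstep
teaching = [90, 85, 70, 60, 50]
research = [91, 84, 71, 59, 51]

r = pearson(teaching, research)
print(round(r, 3))
```

With scores this collinear, r comes out very close to 1, which is exactly the sense in which two indicators stop being distinct measurements.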

One can hardly blame THE for wanting to make as much money as possible. But surely we can expect something better from supposedly elite institutions that claim to value intellectual and scientific excellence. If Oxford and its peers wish to restore public confidence in the experts, there is no better way than telling THE that they will not submit data until it produces something a little less embarrassing.




Sunday, June 13, 2021

The Remarkable Revival of Oxford and Cambridge


There is nearly always a theme for the publication of global rankings. Often it is the rise of Asia, or parts of it. For a while it was the malign grasp of Brexit which was crushing the life out of British research or the resilience of American science in the face of the frenzied hostility of the great orange beast. This year it seems that the latest QS world rankings are about the triumph of Oxford and other elite UK institutions and their leapfrogging their US rivals. Around the world, quite a few other places are also showcasing their splendid achievements.

In the recent QS rankings Oxford has moved up from overall fifth to second place and Cambridge from seventh to third while University College London, Imperial College London, and Edinburgh have also advanced. No doubt we will soon hear that this is because of transformative leadership, the strength that diversity brings, working together as a team or a family, although I doubt whether any actual teachers or researchers will get a bonus or a promotion for their contributions to these achievements.

But was it leadership or team spirit that pushed Oxford and Cambridge into the top five? That is very improbable. Whenever there is a big fuss about universities rising or falling significantly in the rankings in a single year it is a safe bet that it is the result of an error, the correction of an error, or a methodological flaw or tweak of some kind.

Anyway, this year's Oxbridge advances had as much to do with leadership,  internationalization, or reputation as goodness had with Mae West's diamonds. It was entirely due to a remarkable rise for both places in the score for citations per faculty, Oxford from 81.3 to 96, and Cambridge from 69.2 to 92.1. There was no such change for any of the other indicators.

Normally, there are three ways in which a university can rise in QS's citations indicator. One is to increase the number of publications while maintaining the citation rate. Another is to improve the citation rate while keeping output constant. The third is to reduce the number of faculty physically or statistically.
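The arithmetic behind those three routes is just a ratio, sketched below with invented citation and faculty counts (QS's actual counting rules are more elaborate than this):

```python
def citations_per_faculty(citations: float, faculty: float) -> float:
    """The raw quantity behind QS's citations indicator: a simple ratio."""
    return citations / faculty

# Invented baseline: 50,000 citations spread over 2,000 faculty
base = citations_per_faculty(50_000, 2_000)        # 25.0

# Routes 1 and 2: more publications, or a better citation rate on the
# same output, both raise the numerator
more_cited = citations_per_faculty(60_000, 2_000)  # 30.0

# Route 3: report fewer faculty, physically or statistically,
# which shrinks the denominator
fewer_staff = citations_per_faculty(50_000, 1_600)  # 31.25
```

The third route is the interesting one for rankings-watchers: the metric rises without any change in research performance at all.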

None of these seem to have happened at Oxford and Cambridge. The number of publications and citations has been increasing but not sufficiently to cause such a big jump. Nor does there appear to have been a drastic reduction of faculty in either place.

In any case, it seems that Oxbridge is not alone in its remarkable progress this year. For citations, ETH Zurich rose from 96.4 to 99.8, University of Melbourne from 75 to 89.7, National University of Singapore from 72.9 to 90.6, and Michigan from 58 to 70.5. At the top levels of these rankings nearly everybody is rising, except for MIT, which has the top score of 100, although it is noticeable that the increases get smaller as we approach the top.

It is theoretically possible that this might be the result of a collapse in the raw scores of citations front-runner MIT, which would raise everybody else's scores if it still remained at the top, but there is no evidence of either a massive collapse in citations or a massive expansion of research and teaching staff.

But then, as we go to the other end of the ranking, we find universities' citations scores falling: University College Cork from 23.4 to 21.8, Universitas Gadjah Mada from 1.7 to 1.5, UCSI University Malaysia from 4.4 to 3.6, and the American University in Cairo from 5.7 to 4.2.

It seems there is a bug in the QS methodology. The indicator scores published by QS are not raw data but standardized scores: each is derived from the standard deviation from the mean, with the mean score set at fifty and the top score at one hundred. Over the last few years the number of ranked universities has been increasing, and the newcomers tend to perform less well than the established ones, especially for citations. In consequence, the mean number of citations per faculty has declined, and universities scoring above the mean therefore see their standardized scores rise. If this interpretation is incorrect, I'm very willing to be corrected.
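The suspected bug can be demonstrated with a toy version of the standardization described above. Everything here is my assumption about the method, not QS's actual code: raw values are converted to z-scores and rescaled so the cohort mean maps to 50 and the top performer to 100.

```python
import statistics

def qs_style_scores(raw):
    """Rescale raw citations-per-faculty values so that the cohort mean
    maps to 50 and the top performer to 100 (an assumed reconstruction
    of the standardization described in the post)."""
    mean = statistics.mean(raw)
    sd = statistics.pstdev(raw)
    z = [(x - mean) / sd for x in raw]
    top = max(z)
    # z = 0 (the mean) -> 50; z = top -> 100
    return [50 + 50 * (v / top) for v in z]

# Invented raw citations-per-faculty values for an established cohort
cohort = [120, 90, 80, 60, 50, 40]
before = qs_style_scores(cohort)

# Now add newly ranked universities with much lower raw values
expanded = cohort + [5, 4, 3, 2]
after = qs_style_scores(expanded)

# The same above-average university (raw value 90) gets a higher
# standardized score purely because the cohort mean has fallen
print(round(before[1], 1), round(after[1], 1))  # 67.9 79.9
```

No university in the original cohort has changed its raw performance, yet every above-average score is inflated once the weaker newcomers drag the mean down, which is the pattern described in the two preceding paragraphs.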

This has an impact on the relative positions of Oxbridge and the leading American universities. Oxford and Cambridge rely on their scores in the academic and employer surveys and on international faculty and students to stay in the top ten. Compared to Harvard, Stanford, and MIT, they do not perform well for quantity or quality of research. So the general inflation of citations scores gives them more of a boost than the US leaders, and their total scores rise.

It is likely that Oxford and Cambridge's moment of glory will be brief, since QS in the next couple of years will have to do some recentering to prevent citation indicator scores bunching up in the high nineties. The two universities will fall again, although that will probably not be attributed to a sudden collapse of leadership or a failure to work as a team.

It will be interesting to see if any of this year's rising universities will make an announcement that they don't really deserve any praise for their illusory success in the rankings.