Sunday, November 19, 2023

How Dare They? HE Sector Reacts to the King's Speech

Occasionally there are moments when a few casual words suddenly illuminate things that have been obscured.

One such moment was when the Vice-Chancellor of the University of Oxford expressed her embarrassment that one of the university's alumni had failed to express appropriate deference towards experts. It would seem that a major function of higher education is not the encouragement of critical thought but the acceptance of anything that has been approved by academic experts. This was especially ironic since the claim was made at a summit held by Times Higher Education, whose expertise in the field of university ranking is somewhat questionable.

The recent King's Speech contained a bland announcement that the government would "ensure young people have the knowledge and skills to succeed" by combining technical and academic qualifications.  Also, the government will attempt to reduce enrollment in "poor quality university degrees and increase the number undertaking high quality apprenticeships." 

It is quite possible that any such initiatives will fail to get off the ground or will crash soon after take-off, and even if implemented they would probably be only modestly effective, if effective at all.

Research Professional News, however, reports that industry insiders are incensed that the government has dared to say anything that could be considered critical of British universities. Diana Beech, Chief Executive of the London Higher Group of Institutions, said "it is beyond belief that the UK government would even contemplate asking His Majesty the King to speak negatively of the national asset that is our world-leading higher education and research sector."

The King is not speaking negatively of the entire sector. He is talking about proposed efforts to improve the sector. And surely the "brightest and the best" of the world are more likely to come to Britain if they think there are efforts to bring about positive change.

It seems that the academic establishment wants everybody to pretend that there is nothing wrong with British universities. That, in the long run, is not going to do anyone any good.


Saturday, October 21, 2023

Crisis, conflict and global rankings

Just published in the Journal of Adult Learning, Knowledge and Innovation

Crisis, conflict and global rankings

Read here


Abstract

Global university rankings have always been associated with international political and economic conflicts. Even before the COVID-19 pandemic there were signs that scientific and academic globalism was breaking down. The pandemic, the various measures taken to combat it, and military and ideological conflicts have led to the breakdown of international academic cooperation, the formation of very different research complexes, and the development of new regional ranking systems.

Friday, September 01, 2023

Two Decades of Rankings: Rising and Falling in ARWU

Most rankings are of little value for identifying trends over more than a couple of years. Changes in methodology, and sometimes a lack of access to old editions, make year-on-year comparisons difficult or impossible. The Shanghai Rankings, aka ARWU, have maintained a generally consistent methodology over two decades and publish data going back to the founding year of 2003.

So it is possible to use ARWU to look for patterns in the world's research and higher education landscape. Here are some "winners" and "losers", based on the number of universities in the ARWU top 500 in 2004, when Shanghai changed the initial methodology to include the social sciences, and in 2023. This is far from a perfect measure; for a start, this ranking takes no account of the humanities and relies too much on old Nobel and Fields laureates. Even so, it does give us some idea of the shift in the academic world's centre of gravity.

Rising

Australia from 14 in 2004 to 24 in 2023 (and from 2 in the top 100 to 7)

Brazil from 4 to 5

China from 16 to 98 (and from zero in the top 100 to 11)

Malaysia from zero to one

New Zealand from 3 to 4

Saudi Arabia from zero to 6

Singapore 2 in the top 500 in 2004 and 2023 (but rising from zero in the top 100 in 2004 to 2 in 2023)

South Korea from 8 to 11

Falling

Canada from 23 to 18 

France from 22 to 18

Germany from 43 to 31

India from 3 to 1

Israel from 7 to 5 (but rising from 1 to 3 in the top 100)

Italy from 24 to 16

Japan from 36 to 12 (and from 5 in the top 100 to 2)

Switzerland from 8 to 7

United Kingdom from 42 to 38 (and from 11 in the top 100 to 8)

United States from 170 to 120 (and from 51 in the top 100 to 38)


The last two decades have seen a massive increase in the research capabilities of universities in Australia, China, South Korea, and Singapore. The rest of Asia, including Japan and India, has stagnated or even fallen relatively and perhaps absolutely.

The biggest losers are the USA, UK, and Germany although Canada, France, Italy and Switzerland have also not done so well.

More recently, Saudi Arabia has noticeably improved and may soon be followed by other Middle Eastern states.



Wednesday, July 05, 2023

The New QS Methodology: Academic Snakes and Ladders

The ranking season is under way. The latest edition of the QS world rankings has just been announced, and we have already seen the publication of the latest tables from Leiden Ranking, CWUR, RUR, uniRank, and the THE Impact Rankings, plus the THE Asian, Young Universities, and Sub-Saharan Africa rankings. Forgive me if I've missed anything.

Each of these tells us something about the current geopolitics of higher education and science and the way in which they are reflected in the global ranking business. 

The QS rankings have a new methodology, which makes them quite different from previous editions. Nonetheless, the media have been full of universities celebrating their remarkable achievements as they have soared, surged, risen, and ascended in the rankings. No doubt there will be a few promotions and bonuses.

It is in the nature of rankings that places are finite, and if some universities suddenly surge then others must fall. The general pattern of the new QS rankings seems to be that Canadian, Australian, and American universities are rising while Chinese, Korean, and Indian universities are falling. Russian and Ukrainian universities are also falling, although that might be for other reasons.

QS have reduced the weighting of their academic survey from 40% to 30% and faculty-student ratio from 20% to 10%. The weighting for the employer survey has increased from 10% to 15%, and there are three new indicators: Sustainability, Employment Outcomes, and International Research Network.
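The effect of such reweighting can be sketched with a little arithmetic. In the toy example below, only the academic survey, employer survey, and faculty-student weights follow the announced changes; the remaining weights, the two universities, and all the indicator scores are invented for illustration.

```python
# Toy illustration: reweighting alone can reorder universities even when
# the underlying indicator scores do not change at all. Only the academic,
# employer, and faculty-student weights come from the announced changes;
# the rest are simplifying assumptions.

OLD_WEIGHTS = {"academic": 0.40, "employer": 0.10, "faculty_student": 0.20,
               "citations": 0.30}                          # residual lumped together
NEW_WEIGHTS = {"academic": 0.30, "employer": 0.15, "faculty_student": 0.10,
               "citations": 0.30, "new_indicators": 0.15}  # sustainability etc.

# Two hypothetical universities with fixed indicator scores (0-100).
scores = {
    "Research-heavy U": {"academic": 95, "employer": 60, "faculty_student": 90,
                         "citations": 80, "new_indicators": 40},
    "Network-savvy U":  {"academic": 75, "employer": 95, "faculty_student": 60,
                         "citations": 80, "new_indicators": 90},
}

def composite(uni, weights):
    """Weighted sum of indicator scores under a given weighting scheme."""
    return sum(uni[k] * w for k, w in weights.items())

for name, s in scores.items():
    print(f"{name}: old {composite(s, OLD_WEIGHTS):.1f}, "
          f"new {composite(s, NEW_WEIGHTS):.2f}")
```

Under the old weights the research-heavy university comes out ahead; under the new ones the order flips, without a single underlying data point changing.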

QS claim that the new methodology "reflects the collective intelligence of the sector, and the changing priorities of students." If that is so, then the collective intelligence is very localised. The new methodology puts a heavy fist on the scales in favour of Western universities and against Asia.  

The revised methodology works against universities that have acquired a good reputation for research or recruited a large and permanent faculty. It favours those that have mobilised their alumni network for the employer survey and are enthusiastic participants in the sustainability movement. 

As a result the leading Chinese institutions have taken a tumble. Peking has fallen 5 places to 17th, Tsinghua 11 to 25th, and Fudan 16 to 50th. 

Other national and regional flagships have tumbled: Seoul National University from 29th to 41st, the Indian Institute of Science from 115th to 225th, the University of Tokyo from 23rd to 28th, and the University of Hong Kong from 21st to 26th.

In contrast, the University of British Columbia, McGill University, the University of Toronto, the University of Melbourne, the University of Sydney, the University of Cape Town, Witwatersrand University, Trinity College Dublin, University College Dublin, the University of California Berkeley, and UCLA are climbing the ladders. For the moment, at any rate.



Wednesday, May 17, 2023

World Economic Forum Declares that Africa is Rising

The World Economic Forum (WEF),  an organization of the global economic elite, has published a report by Phil Baty, currently Chief Global Affairs Officer of Times Higher Education (THE), that proclaims that African universities are "surging in the world rankings" and that this is a highly positive development. This is an irresponsible claim that has scant relationship to reality. 

The report refers to a claim in 2012 by Max Price, then Vice-Chancellor of the University of Cape Town, that African universities needed to rise to the challenge of global university rankings. According to Baty, African universities are now successfully competing in the rankings game and rising to the top.

And just what is the evidence for this? Well, in 2012 there were four African universities in the THE World University Rankings. In 2022 there were 97. An impressive and remarkable achievement indeed if we are talking about the same rankings. But they are not the same.

In 2012 the THE rankings consisted of 402 universities. By 2022 they had expanded to 2,345, including "reporters", of which 1,500+ were formally ranked. It would be truly amazing if any region had failed to improve its representation.

The real comparison, of course, is with the numbers in the top 400 in both years, and here there is no sign of any African surging. In 2012 there were three South African universities in the top 400. The University of the Witwatersrand and Stellenbosch University were in the 251-275 band, and in the 2022-2023 rankings they were in the 251-375 range. The University of Cape Town was 103rd in 2012 and 160th in 2022-2023, which might be cause for concern if this were a ranking with a rigorous and stable methodology, but that is not the case for THE.

The fourth African university in the top 400 in 2012 was the University of Alexandria in the 301-350 band. By 2022 it had dropped to the 801-1000 band. This was a university on a downward spiral from its magical moment of ranking glory in 2010 when it was ranked 147th in the world overall and 4th in the world for citations as a result of a spurious affiliation claim by a serial self-publisher and  self-citer, who was involved in a libel case with Nature.

By 2022-2023 another African university, the University of Cape Coast in Ghana, had entered the top 400. This was not a testimony to any kind of achievement. It was simply the result of the university taking part in the Gates-funded Global Burden of Disease Study (GBDS). Because of a flawed methodology it is possible for a university with a few papers in the study, which typically have hundreds of authors and citations, and a small number of total publications to rack up scores of 80, 90, or even 100  for this indicator. 
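The arithmetic behind this is easy to sketch. The numbers below are invented, but they show how a couple of consortium papers with thousands of citations can swamp a citations-per-paper style indicator for an institution with a small publication count.

```python
# Invented numbers: how a small publication count plus a few hyper-cited
# consortium papers can inflate a citations-per-paper indicator far above
# that of a much larger, solidly performing research university.

def citations_per_paper(citation_counts):
    """Average citations per publication."""
    return sum(citation_counts) / len(citation_counts)

small_u = [3000, 2500] + [2] * 48   # 2 GBD-style papers among 50 in total
big_u = [30] * 5000                 # 5,000 papers, each respectably cited

print(round(citations_per_paper(small_u), 1))   # far ahead of big_u
print(citations_per_paper(big_u))
```

On this kind of measure the small institution appears several times more "influential" than the large one, which is exactly the distortion described above.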

There is then no evidence of a surge of any kind, not even a bit of a trickle.

That brings us to the assertion that Nigeria, followed by Egypt, has posted the biggest gains in the citations indicator, which purports to measure research impact or research quality or something, and has therefore achieved excellent progress.

THE is being entirely too modest here. It could have used the indicator to celebrate the extraordinary accomplishment of a range of African institutions and countries that have surged in the rankings with 90+ scores for citations, an incredible accomplishment that contrasts with very low scores for Research, which includes publications, expenditure, and reputation. In fact, if this indicator is taken seriously, a number of African universities have now outpaced reputable research universities in North America and China.

These research influencers, according to THE, include Jimma University, Ethiopia, Damietta University, Egypt, Muhimbili University of Health and Allied Sciences, Tanzania, Aswan University, Egypt, University of Lagos, Nigeria, the University of Zambia, and Kafrelsheikh University, Egypt.

Again, this has nothing to do with excellence or teamwork or transformative practices or any other current managerial shibboleth. It is largely the result of contributing a single researcher or a few to the hundreds of "authors" of GBDS papers in The Lancet and other prestigious journals and collecting credit for thousands of citations.

The cruellest aspect of this is that THE have announced that they are finally getting around to a partial revamping of the world rankings methodology this year. If THE do go ahead, it is very likely -- not certain, because the whole process is so complex and opaque -- that these universities will go tumbling down the rankings, and we shall probably see leaders across the continent under fire for their gross incompetence.

It is strange that an organisation that supposedly represents the best minds of the corporate world should adopt THE as the sole arbiter of African excellence. It is not the only global ranking, and it is probably the worst for Africa: it emphasises income, assessed by three separate indicators; relies on self-submitted data, which diverts the unacknowledged labour of talented and motivated faculty; stresses reputation; and privileges postgraduate programmes.

Perhaps also, at the risk of committing heresy in the first degree, the quality of higher education should not be Africa's highest priority. The latest edition of the Progress in International Reading Literacy Study (PIRLS) shows that Morocco, Egypt, and South Africa do very poorly with regard to fourth-grade literacy. For South Africa and Morocco, the situation revealed by the 2019 Trends in International Mathematics and Science Study (TIMSS) was little better, although they did come out ahead of Pakistan and the Philippines. Surely this is as crucial for the future of Africa as the funding of doctoral programmes.


Sunday, April 23, 2023

Article in University World News

 



Go HERE for my recent article in University World News.


Is the switch in rankings’ focus masking the West’s decline?


A recent commentary in The Lancet by Richard Horton presented criticism of international university rankings, citing a briefing paper from the United Nations University (UNU) in Kuala Lumpur, Malaysia.

Horton makes some relevant comments on the rankings, although his survey is very limited and incomplete, and he argues that they need to be reformed to hold universities accountable for their social responsibilities. He notes that the UNU report suggests doing away with rankings altogether.

The status of The Lancet is such that this article provides insight into the collective thinking of the Western academic and scientific establishment and it therefore needs some attention.

To start with, getting rid of rankings, as posited in the article, sounds like a good idea, but it is not really feasible. Bureaucrats and faculty in the big brand universities are not suggesting that every university is as good as any other, that their salaries or tuition fees be reduced to the industry average, that research grants be allocated randomly or that their students are no more employable or intelligent than those at other places.

Monday, March 20, 2023

The Frontiers of the Ranking World: UNIRANKS

After predatory journals and predatory conferences, the next logical step would be predatory rankings. 

Recently, internet searches have shown up something called UNIRANKS, with a polished website containing several plausible world and country rankings and announcements about a forthcoming conference, along with a list of photogenic speakers and detailed instructions about registration and payment.

I will not give out the URL since a couple of clicks in I ran into a bright red screen with a warning about phishing.

There appear to be no adequate methodological details, no advisory committee, no criteria for inclusion, nor any of the other things provided by even the most technically careless rankings. Even the name is a near copy of uniRank, a reputable if limited search engine and ranking.

So be warned. If I receive any indication that this is a proper ranking and conference I will of course post it.

Saturday, March 18, 2023

SCImago Innovation Rankings: The East-West Gap Gets Wider

The decline of western academic research becomes more apparent every time a ranking with a stable and moderately accurate methodology is published. This will not be obvious if one just looks at the top ten, or even the top fifty, of the better known rankings. Harvard, Stanford, and MIT are usually still there at the top and Oxford and Cambridge are cruising along in the top twenty or the top thirty.

But take away the metrics that measure inherited intellectual capital such as the Nobel and Fields laureates in the Shanghai rankings or the reputation surveys in the QS, THE, and US world rankings, and the dominance of the West appears ever more precarious. This is confirmed if we turn from overall rankings to subject and field tables.

Take a look at the most recent edition of the CWTS Leiden Ranking, which is highly reputed among researchers although much less so among the media. For sheer number of publications overall, Harvard still holds the lead although Zhejiang, Shanghai Jiao Tong and Tsinghua are closing in and there are more Chinese schools in the top 30.  Chinese dominance is reduced if we move to the top 10% of journals but it may be just a matter of time before China takes the lead there as well. 

But click to physical sciences and engineering. The top 19 places are held by Mainland Chinese universities, with the University of Tokyo coming in at 20. MIT is there at 33, Texas A & M at 55, and Purdue at 62. Again, the Chinese presence is diluted, probably just for the moment, if we switch to the top 10% or 1% of journals.

Turning to developments in applied research, the shift to China and away from the West appears even greater.

The SCImago Institutions rankings are rather distinctive. In addition to the standard measures of research activity, there are also metrics for innovation and societal impact. Also, they include the performance of government agencies, hospitals, research centres and companies.

The innovation rankings combine three measures of patent activity. Patents are problematic for comparing universities but they can establish broad long-term trends. 

Here are the top 10 for Innovation in 2009:

1.   Centre National de la Recherche Scientifique

2.   Harvard University 

3.   National Institutes of Health, USA

4.   Stanford University 

5.   Massachusetts Institute of Technology

6.   Institut National de la Santé et de la Recherche Médicale

7.   Johns Hopkins University 

8.   University of California Los Angeles

9.   Howard Hughes Medical Institute 

10.  University of Tokyo.

And here they are for 2023:

1.   Chinese Academy of Sciences 

2.   State Grid Corporation of China  

3.   Ministry of Education PRC

4.   DeepMind

5.   Ionis Pharmaceuticals

6.   Google Inc, USA

7.   Alphabet Inc 

8.  Tsinghua University

9.   Huawei Technologies Co Ltd

10.  Google International LLC.

What happened to the high flying universities of 2009?  Harvard is in 57th place, MIT in 60th, Stanford 127th, Johns Hopkins 365th, and Tokyo in 485th. 

It seems that the torch of innovation has left the hands of American, European, and Japanese universities and research centres and has been passed to multinational, Chinese, and American companies and research bodies, plus a few Chinese universities. I am not sure where the loyalties of the multinational institutions lie, if indeed they have any at all.




Saturday, March 04, 2023

US News and the Law Schools

There has always been a tension between the claim by commercial rankers that they provide insights and data for students and other stakeholders and the need to keep on the good side of those institutions that can provide them with status, credibility, and perhaps even lucrative consultancies.

A recent example is Yale, Harvard, Berkeley and other leading law schools declaring that they will "boycott", "leave", "shun", or "withdraw from" the US News (USN) law school rankings. USN has announced that it will make some concessions to the schools although it seems that, for some of them at least, this will not be enough. It is possible that this revolt of the elites will spread to other US institutions and other rankings. Already Harvard Medical School has declared that it will follow suit and boycott the medical school rankings.

At first sight, it would seem that the law schools are performing an act of extraordinary generosity or self-denial. Yale has held first place since the rankings began, and the others who have joined the supposed boycott seem to be mainly from the upper levels of the ranking, while those who have not seem to be mostly from the middle and lower levels. To "abandon" a ranking that has served the law school elite so well for many years is a bit odd, to say the least.

But Yale and the others are not actually leaving or withdrawing from the rankings. That is something they cannot do. The data used by US News is mostly from public sources, and where it is supplied by the schools it can be replaced by public data. The point of the exercise seems to be to persuade US News to revise their methodology so that it conforms to the direction in which Yale Law School and other institutions want to go.

We can be sure that the schools have a good idea how they will fare if the current methodology continues and what is likely to happen if there are changes. It is now standard practice in the business to model how institutions will be affected by possible tweaks in ranking methodology.

So what was Yale demanding? It wanted fellowships to be given the same weighting as graduate legal jobs. This would appear reasonable on the surface, but the fellowships would be under the control of Yale, so this would add a metric dependent on the ability to fund such fellowships. Yale also wanted debt-forgiveness programmes to be counted in the rankings. Again, this is something dependent on the schools having enough money to spare.

For a long time the top law schools have been in the business of supplying bright and conscientious young graduates. Employers have been happy to pay substantial salaries to the graduates of the famous "top 14" schools since they appear more intelligent and productive than those from run of the mill institutions.

The top law schools have been able to do this by rigorous selection procedures, including standardised tests and college grades. Basically, they have selected for intelligence and conscientiousness, and perhaps for a certain amount of agreeability and conformity. There is some deception here, perhaps including self-deception. Yale and the rest of the elite claim that they are doing something remarkable by producing outstanding legal talent, but in fact they are just recruiting the new students with the greatest potential, or at least they did until quite recently.

If schools cannot select for such attributes then they will have problems convincing future employers that their graduates do in fact possess them. If that happens then the law school graduate premium will erode and if that happens future lawyers will be reluctant to go into never ending  debt to enter a career that is increasingly precarious and unrewarding.

The law schools, along with American universities in general, are also voicing their discontent with reliance on standardised tests for admission and their inclusion as ranking indicators. The rationale for this is that the tests supposedly discourage universities from offering aid according to need and favour those who can afford expensive test prep courses.

Sometimes this is expanded into the argument that, since there is a relationship between test scores and wealth, wealth is the only thing the tests measure, and so they cannot measure anything else that might be related to academic ability.

The problem here is that standardised tests do have a substantial relationship with intelligence, although not as much as they used to, which in turn has a strong relationship with academic and career success. Dropping the tests means that schools will have to rely on high school and college grades, which have been increasingly diluted over the last few decades, or on recommendations, interviews, and personal essays, which have little or no predictive validity and can be easily prepped or gamed.

It appears that American academia is retreating from its mission of producing highly intelligent and productive graduates and has embraced the goal of socialisation into the currently dominant ideology. Students will be admitted and graduated, and faculty recruited, according to their doctrinal conformity and their declared identity.

USN has gone some way to meeting the demands of the schools but that will probably not be enough. Already there are calls to have a completely new ranking system or to do away with rankings altogether.


 




Saturday, February 25, 2023

Global Trends in Innovation: Evidence from SCImago

We are now approaching the third decade of global university rankings. They have had a mixed impact. The original Shanghai rankings published in 2003 were a salutary shock for universities in continental Europe and contributed to a wave of restructuring and excellence initiatives. On the other hand, rankings with unstable and unreliable methodologies are of little use to anyone except for the public relations departments of wealthy Western universities. 

In contrast, the SCImago Institutions Rankings, published by a Spanish research organisation, with thousands of universities, hospitals, research institutes, companies and other organisations, can be quite informative, especially the Innovation and Societal Rankings.

The Innovation Rankings, which are based on three measures of patent citations and applications, included 4,019 organisations of various kinds in 2009. The top spot was held by the Centre National de la Recherche Scientifique in France, followed by Harvard, the National Institutes of Health in the USA, Stanford, and MIT.

Altogether the top 20 in 2009 consisted of  ten universities, nine American plus the University of Tokyo, and ten non-university organisations, three American, two German, two French, two multinational, and the Chinese Academy of Sciences (CAS) in 14th place. 

Fast forward to 2022 and we now have 8084 institutions. First place now goes to CAS, followed by the State Grid Corporation of China, Deep Mind Technologies, a British AI firm, the Chinese Ministry of Education, and Samsung Corp.

Now, the top twenty includes exactly two universities, Tsinghua in 14th place and Harvard in 20th. The rest are companies, health organisations, and government agencies. The nationality assigned by SCImago for these eighteen is Multinational eight, USA six, China four, and UK and South Korea one each.

What about those high flying US universities of 2009? Stanford has fallen from 4th place to 67th, the University of Michigan from 13th to 249th, the University of Washington from 16th to 234th.

The relative -- and probably now absolute as well -- decline of American academic research has been well documented. It seems that the situation is even more dire for the innovative capability of US universities. But the technological torch is passing not only to Chinese universities and research centres but also to US and Multinational corporations.



Saturday, February 04, 2023

Aggregate Ranking from BlueSky Thinking

 

In recent years there have been attempts to construct rankings that combine several global rankings. The University of New South Wales has produced an aggregate ranking based on the “Big Three” rankings, the Shanghai ARWU, Times Higher Education (THE), and QS. AppliedHE of Singapore has produced a ranking that combines these three plus Leiden Ranking and Webometrics.

The latest aggregate ranking is from BlueSky Thinking, a website devoted to research and insights in higher education. This aggregates the big three rankings plus the Best Global Universities published by US News.

There are some noticeable differences between the rankings. The University of California Berkeley is fourth in the US News rankings but 27th in QS. The National University of Singapore is 11th in QS but 71st in ARWU.
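How such divergent tables might be combined can be sketched very simply: average each university's position across the rankings that include it. This is an illustration, not BlueSky Thinking's actual formula; the US News and QS positions for Berkeley and the QS and ARWU positions for NUS come from the figures quoted above, while the other entries are invented placeholders.

```python
# Minimal aggregation sketch: mean position across several rankings.
# Berkeley's USN/QS and NUS's QS/ARWU positions are from the text above;
# the remaining ranks are invented placeholders for illustration.

ranks = {
    "UC Berkeley": {"USN": 4, "QS": 27, "ARWU": 5, "THE": 8},
    "NUS":         {"USN": 26, "QS": 11, "ARWU": 71, "THE": 19},
}

def aggregate(table):
    """Mean position across the rankings that include each university."""
    return {uni: sum(r.values()) / len(r) for uni, r in table.items()}

print(aggregate(ranks))
```

Averaging smooths out the outlying positions, which is why aggregate tables tend to look less volatile than any single ranking.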

The top of the aggregate ranking is unremarkable. Harvard leads followed by Stanford, MIT, Cambridge, and Oxford.

There have been some significant changes over the last five years, with universities in Mainland China, Hong Kong, France, Germany, Saudi Arabia, and Australia recording significant improvements, while a number of US institutions, including Ohio State University, Boston University and the University of Minnesota, have fallen.

 

Saturday, January 28, 2023

Another Sustainability Ranking

 

People and Planet is a British student network concerned with environmental and social justice. It has just published a league table that claims to measure the environmental and ethical performance of UK universities.

The top five universities are Cardiff Metropolitan, Bedfordshire, Manchester Metropolitan, Reading, and University of the Arts London. It is interesting that this league table shows almost no overlap with the other rankings that claim to assess commitment or contributions to sustainability.

Here are all six British universities in the latest UI GreenMetric ranking, in order: Nottingham, Nottingham Trent, Warwick, Glasgow Caledonian, Loughborough, Teesside.

The top five British universities in the THE Impact Rankings are Newcastle, Manchester, Glasgow, Leicester, King's College London. For the QS Sustainability rankings we have: Edinburgh, Glasgow, Oxford, Newcastle, Cambridge.

There is some similarity between the QS Sustainability and the THE Impact Rankings, because both give prominence to research on environmental topics. But even this is quite modest compared to the much greater overlap between conventional research-based rankings such as Leiden, Shanghai, or URAP (Middle East Technical University).

This surely raises serious questions about the trend towards rankings based on sustainability. If the rankers produce league tables that show such a modest degree of correlation, then we have to ask whether there is any point at all to the exercise.
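That degree of (dis)agreement can actually be quantified. A standard tool is Spearman's rank correlation; the sketch below applies it to two hypothetical five-university tables that, like the sustainability tables above, barely agree with each other.

```python
# Spearman's rank correlation between two league tables of the same
# universities (no ties). A value near 1 means close agreement; a value
# near 0 means the tables are essentially unrelated. All positions here
# are hypothetical.

def spearman(rank_a, rank_b):
    """Spearman correlation for two rankings of the same items, no ties."""
    common = set(rank_a) & set(rank_b)
    n = len(common)
    d2 = sum((rank_a[u] - rank_b[u]) ** 2 for u in common)
    return 1 - 6 * d2 / (n * (n * n - 1))

table_x = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}
table_y = {"A": 4, "B": 1, "C": 5, "D": 2, "E": 3}   # very different ordering

print(round(spearman(table_x, table_y), 2))   # near zero: little agreement
```

Two tables that genuinely measured the same thing would be expected to show a correlation close to 1; the research-based rankings mentioned above come much nearer to that.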



 



Friday, January 20, 2023

Implications of the new QS methodology

QS have announced that the world rankings due to appear in 2023 will have a new methodology. This is likely to produce significant changes in the scores and ranks of some universities even if there are no significant changes in the underlying data. 

There will no doubt be headlines galore about how dynamic leadership and teamwork have transformed institutions or how those miserable Scrooges in government have been crushing higher education by withholding needed funds.

The first change is that the weighting of the academic survey will be reduced from 40% to 30%. This is quite sensible: 40% is far too high for any one indicator. It remains, however, the largest single indicator, and one that tends to favour the old elite, or those universities that can afford expensive marketing consultants, at the expense of emerging institutions. The employer survey weight will go up from 10% to 15%.

Next, the weighting of faculty student ratio has been cut from 20% to 10%. Again this is not a bad idea. This metric is quite easy to manipulate and has only a modest relationship to teaching quality, for which it is sometimes supposed to be a proxy.

What has not changed is the citations per faculty indicator. This is unfortunate since rankers can get very different results by tweaking the methodology just a bit. It would have been a big improvement if QS had used several different metrics for citations and/or publications, something that Times Higher Education has just got round to doing.
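How much a "bit of tweaking" can matter is easy to demonstrate. The invented citation distributions below give opposite verdicts depending on whether one uses mean citations per paper or the share of papers above a high-citation threshold, both of which are common choices; the threshold itself is arbitrary.

```python
# Two plausible citation metrics, opposite verdicts, same invented data.
# University A has a few blockbuster papers and a long tail of barely
# cited ones; University B is uniformly solid.

uni_a = [500, 400, 300] + [1] * 97   # skewed distribution, 100 papers
uni_b = [12] * 100                   # uniform distribution, 100 papers

mean_a = sum(uni_a) / len(uni_a)     # A ahead on mean citations per paper
mean_b = sum(uni_b) / len(uni_b)

THRESHOLD = 10                       # "highly cited" cutoff (arbitrary)
share_a = sum(c >= THRESHOLD for c in uni_a) / len(uni_a)
share_b = sum(c >= THRESHOLD for c in uni_b) / len(uni_b)   # B ahead here

print(mean_a > mean_b)    # A wins on the mean
print(share_b > share_a)  # B wins on the highly-cited share
```

A ranker choosing between these two metrics is, in effect, choosing which of the two universities comes out on top.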

Then there are three new indicators: international research network, graduate employability, and sustainability.

This means that international indicators will now account for a 15% weighting, adding a further bias towards English-speaking universities, or those in small countries adjoining larger neighbours with similar languages and cultures, and working against China and India.

The introduction of a sustainability metric is questionable. It requires a considerable amount of institutional data collecting and this will tend to favour schools with the resources and ambitions to jump through the rankers' hoops.

On the surface, it seems that these changes will be a modest improvement. However, I suspect that one effect of the changes will be a spurious boost for the scores and ranks of the elite Western and English-speaking universities that can mobilise partners and alumni for the surveys, nurture their global networks, and collect the data required to compete in the rankings.


Monday, November 07, 2022

The QS Sustainability Rankings

 


Do we need to measure social and environmental impact?

Sunday, October 23, 2022

Australia and the THE World Rankings

 

The Latest Rankings

The latest Times Higher Education (THE) world rankings have just been announced at a summit in New York. Around the world political leaders, mass media, and academics have been proclaiming their delight about their universities rising in the rankings. Australian universities are especially fascinated by them, sometimes to the point of unhealthy obsession.

Study Australia reports that "Australia shines again." Insider Guides finds it "particularly exciting" that six Australian universities in the top 200 have climbed the charts. Monash University is celebrating how it has "skyrocketed" 13 places, further proof of its world-leading status.

It is unfortunate that Australian media and administrators are so concerned with these rankings. They are not the only global rankings and certainly not the most reliable, although they are apparently approved by universities in the traditional elite or their imitators. They are not totally without value, but they do need a lot of deconstruction to yield any sort of meaningful insight.

Transparency

One problem with the THE rankings, to be painfully repetitive, is that they are far from transparent. Three of their five current “pillars” consist of more than one indicator so we cannot be sure exactly what is contributing to a rise or fall. If, for example, a university suddenly improves for THE’s teaching pillar that might be because its income has increased, or the number of faculty has increased, or the number of students has decreased, or it has awarded more doctorates or fewer bachelor’s degrees, or it has got more votes in THE’s reputation survey, or a combination of two or more of these.

THE's citations indicator, which purportedly measures research impact or research quality, stands alone but it is also extremely opaque. To calculate a university’s score for citations you have to work out the number of citations in 8,000 “boxes” (300 plus fields multiplied by five years of publication multiplied by five types of documents) and compare them to the world average. Add them up and then apply the country bonus, the square root of the national impact score, to half of the university’s score. Then calculate Z scores. For practical purposes this indicator is a black box into which masses of data disappear, are chopped up, shuffled around, processed, reconstituted and then turned into numbers and ranks that are, to say the least, somewhat counter-intuitive.
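
As a rough illustration of what such a pipeline involves, here is a toy version in Python. The cell structure, the averaging, and the exact form of the country bonus are assumptions reconstructed from the description above, not THE's actual code, and the function names are invented for illustration.

```python
import math

def field_normalised_impact(uni_cells, world_avg):
    """Toy field normalisation: for each (field, year, doc-type) cell,
    compare the university's citations per paper to the world average
    for that cell, then average the ratios across cells."""
    ratios = []
    for cell, (citations, papers) in uni_cells.items():
        if papers and world_avg.get(cell):
            ratios.append((citations / papers) / world_avg[cell])
    return sum(ratios) / len(ratios) if ratios else 0.0

def apply_country_bonus(raw_score, national_impact):
    """Apply the 'country bonus' as described above: half the score is
    left alone, half is divided by the square root of the national
    impact, so universities in low-impact countries get a lift."""
    return 0.5 * raw_score + 0.5 * raw_score / math.sqrt(national_impact)

# A university slightly below world average (0.8) in a country with a
# national impact of 0.25 ends up above world average after the bonus.
print(apply_country_bonus(0.8, 0.25))
```

The scores would then still have to be converted to Z-scores against the whole ranked cohort, which is where the published 0-100 numbers come from, adding yet another layer of opacity.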

This indicator, which accounts for a 30% weighting, has produced some remarkable results over the last decade, with a succession of improbable institutions soaring into the upper reaches of this metric. This year’s tables are no exception. The world leader is Arak University of Medical Sciences, Iran, followed by Cankaya University, Turkey, Duy Tan University, Vietnam, Golestan University of Medical Sciences, Iran, and Jimma University, Ethiopia. Another two Iranian medical universities are in the top 25. They may not last long. Over the last few years quite a lot of universities have appeared briefly at the top and then in a few years slumped to a much lower position.

One of the more interesting things about the current success of the THE rankings is the apparent suspension of critical thought among the superlatively credentialed and accredited leaders of the academic world. One wonders how those professors, rectors and deans who gather at the various summits, seminars, webinars, and masterclasses would react to a graduate student who wrote a research paper that claimed that Arak University of Medical Sciences leads the world for “research quality”, Istanbul Technical University for “knowledge transfer”, or Macau University of Science and Technology for “international outlook”.

Volatility

Not only do the rankings lack transparency, they are also extremely volatile. The top fifty, or even the top one hundred, is reasonably stable, but below that THE has seen some quite remarkable and puzzling ascents and descents. There have been methodological changes, with a big one coming next year, but that alone does not explain such dramatic movements. One cause of instability in the rankings is the citations indicator, which is constructed so that one or a few researchers, often those working on the Gates-funded Global Burden of Disease Study (GBDS), can have a massively disproportionate impact.

Another possible cause of volatility is that the number of ranked institutions is not fixed. When the rankings expand, the new entrants will usually be at the lower end of the scale, which lowers the mean score for each indicator. That in turn affects the final score of every institution, since the standardised scores that appear in the published tables are based on means and standard deviations.
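
The arithmetic behind this is easy to demonstrate. Here is a minimal sketch, with invented raw scores, showing how the mere arrival of low-scoring newcomers changes the standardised score of an incumbent whose own data has not moved at all:

```python
import statistics

def z_scores(raw):
    """Standardise raw indicator scores against the cohort's mean and
    standard deviation, as ranking tables typically do."""
    mu = statistics.mean(raw.values())
    sigma = statistics.pstdev(raw.values())
    return {u: (x - mu) / sigma for u, x in raw.items()}

# An incumbent cohort with invented raw scores.
cohort = {"A": 80.0, "B": 60.0, "C": 40.0}
before = z_scores(cohort)

# The ranking expands: newcomers enter at the bottom of the scale.
cohort.update({"D": 20.0, "E": 10.0})
after = z_scores(cohort)

# University A's raw score never changed, but its standardised score rose.
print(before["A"], after["A"])
```

The same mechanism works in reverse: an institution can drift downwards simply because the cohort around it has changed.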

There may be other reasons for the volatility of this year’s rankings. Fluctuating exchange rates may have affected reported income data, international students’ numbers may have fallen or even recovered. Some universities might have performed better in the surveys of teaching or research.

 

Australian universities rising and falling

Some Australian universities appear to have been quite mobile this year. In some cases, this has a lot to do with the citation indicator. Two years ago, Bond University was ranked in the 501 – 600 band and 26th in Australia. Now it is tenth in Australia and in the world top 300, driven largely by a remarkable rise in the citations score from 56.4 to 99.7. A lot of that seems to have come from a small number of publications relating to the 2020 PRISMA statement which amassed a massive number of citations in 2021 and 2022.

Another example is Australian Catholic University. In 2018 it was in the world 501-600 band and this year it is in band 251-300. This is mainly due to an improvement in its citations score from 59.5 to 98.5, the result of a series of articles between 2017 and 2020 related to the multi-author and massively cited GBDS.

The problem with relying on citations to get ahead in the THE rankings is that if the researchers who have been racking up the citations move on or retire, the scores will eventually decline as their papers pass outside the period for counting publications. This might have happened with the University of Canberra, which has benefitted from GBDS papers published between 2015 and 2018. This year, however, the 2015 and 2016 papers no longer count, and the result is that Canberra's citation score has fallen from 98.6 to 92.6 and its world rank from 170th to the 251-300 band. A university might even start falling just because its peers have started racking up scores of 99 plus for citations.

This is similar to the trajectory of quite a few international universities that have risen and fallen in the wake of a few highly cited papers such as Babol Noshirvani University of Technology, Iran, the Indian Institute of Technology Ropar, the University of Occupational and Environmental Health, Japan, Durban University of Technology, South Africa, and Nova Southeastern University, USA.

Citations have a lot to do with Australia's success in the THE rankings. All the Australian universities in the world rankings have a higher score for citations than for research, which is measured by publications, reputation, and research income, and six have citation scores in the 90s. Compare that with Japan, where the highest citation score is 82.8 and leading universities do better for research than for citations. If THE had taken some of the weight from citations and given it to research, Australian universities might be in a different position.

Are the THE rankings any use?

Over the long term the THE rankings might have some value in charting the general health of an institution or a national system. Should a university fall steadily across several indicators despite changes in methodology and despite proclaimed excellence initiatives, then that might be a sign of systemic decline.

The success of Australian universities in the THE rankings might represent genuine progress but it is necessary to identify exactly why they are rising and how sustainable that progress is.

The rankings certainly should not be used to punish or reward researchers and teachers for “success” or “failure” in the rankers, to allocate funds, or to attract talented faculty or wealthy students.

Other rankings

The THE rankings are not the only game in town or in the world. In fact, for most purposes there are several rankings that are no worse and probably a lot better than THE. It would be a good idea for Australian universities, students and stakeholders to shop around a bit.

For a serious analysis of research quantity and quality there are straightforward rankings of research conducted by universities or research centres such as Shanghai Ranking, CWTS Leiden University, University Ranking by Academic Performance, or National Taiwan University. They can be a bit boring since they do not change very much from year to year, but they are at least reasonably competent technically and they rely on data that is fairly objective and transparent.

For prospective graduate and professional students, the research-based rankings might be helpful, since the quality of research is likely to have an effect, even if an unpredictable one, on the quality of postgraduate and professional instruction.

For undergraduate students there is not really much that is directly relevant to their needs. The QS employability rankings, the employer opinion survey in the QS world rankings, the Emerging/Trendence employability rankings, and the student quality section in the Center for World University Rankings tables, now based in the Emirates, can all provide some helpful insights.

Next year?

It seems that THE has finally steeled itself to introduce a set of changes. The precise effect is unclear except that the world rankings look to be getting even more complex and even more burdensome for the underpaid drones toiling away to collect, process and transmit the data THE requires of its “customers”. It is not clear exactly how this will affect Australian universities.

No doubt Australian deans and rectors will be wondering what lies ahead of them in next year's rankings. But not to worry. THE is offering "bespoke" shadow rankings that will tell them how they would have done if the new methodology had been applied this year.


Sunday, August 21, 2022

California in the Shanghai Rankings

Global rankings are often misleading and uninformative, especially those that have eccentric methodologies or are subject to systematic gaming. But if their indicators are objective and reliable over several years, they can tell us something about shifts in the international distribution of research excellence.

I would like to look at 20 years of the Shanghai Rankings from the first edition in 2003 to the most recent, published this week. The first thing that anyone notices is of course the remarkable rise of China -- not Asia in general -- and the relative decline of the USA. These rankings can also be used to find regional trends within nations. Take a look at California universities. In 2003 California was the research star of the US with six universities in the world top twenty. Two decades later that number has fallen to five with the University of California (UC) San Diego falling from 14th to 21st place.

That is symptomatic of a broader trend. UC Santa Barbara has fallen from 25th to 57th, the University of Southern California from 40th to 55th, and UC Riverside from 88th to the 201-300 band. 

American universities in nearly all the states have been falling and have, for the most part, been replaced by Chinese institutions. But even within the USA California has been drifting downwards. Caltech has gone from 3rd to 7th, UC San Francisco, a medical school, from 11th to 15th, and UC Davis from 27th to the 40-54 band.

This is not universal. Stanford is still second in the USA in 2022 while UC Los Angeles (UCLA) has risen from 13th to 11th.

But overall California is falling. Of the thirteen Californian universities in the top 500 in 2003, nine had fallen in the US table by 2022, two (UC Santa Cruz and UCLA) had risen, and two remained in the same rank. The decline is especially apparent in the Publications metric, which is based on recent articles in the Web of Science.

Recent events in California, including learning loss during the pandemic, the abandonment of standardised testing, and the imposition of political loyalty tests for faculty, suggest that the decline is not going to be halted or reversed any time soon.

 





Tuesday, July 19, 2022

What's the Matter with Harvard?

When the first global ranking was published by Shanghai Jiao Tong University back in 2003, the top place was taken by Harvard. It was the same for the rankings that followed in 2004, Webometrics and the THES-QS World University Rankings. Indeed, at that time any international ranking that did not put Harvard at the top would have been regarded as faulty.

Is Harvard Declining?

But since then Harvard has been dethroned by a few rankings. Now MIT leads in the QS world rankings, while Oxford is first in THE's  and the Chinese Academy of Sciences in Nature Index. Recently Caltech deposed Harvard at the top of the Round University Rankings, now published in Georgia.

It is difficult to get excited about Oxford leading Harvard in the THE rankings. A table that purports to show Macau University of Science and Technology as the world's most international university, Asia University Taiwan as the most innovative, and An Najah National University as the best for research impact, need not be taken too seriously.

Losing out to MIT in the QS world rankings probably does not mean very much either. Harvard is at a serious disadvantage here for international students and international faculty.

Harvard and Leiden Ranking

On the other hand, the performance of Harvard in the CWTS Leiden Ranking, which is generally respected in the global research community, might tell us that something is going on. Take a look at the total number of publications for the period 2017-20 (using the default settings and parameters). There we can see Harvard at the top with 35,050 publications, followed by Zhejiang and Shanghai Jiao Tong Universities.

But it is rather different for publications in the broad subject fields. Harvard is still in the lead for Biomedical Sciences and for Social Sciences and Humanities. For Mathematics and Computer Science, however, the top twenty consists entirely of Mainland Chinese universities. The best non-Mainland institution is Nanyang Technological University in Singapore. Harvard is 128th.

You could ask whether this is just a matter of quantity rather than quality. So let's turn to another Leiden indicator, the percentage of publications in the top 10% of journals for Mathematics and Computer Science. Even here China is in the lead, although somewhat precariously: Changsha University of Science and Technology tops the table, and Harvard is in fifth place.

The pattern for Physical Sciences and Engineering is similar. The top 19 for publications are Chinese, with the University of Tokyo in 20th place. However, for those in the top 10% Harvard still leads. It seems then that Harvard is still ahead for upmarket publications in physics and engineering, but a growing and substantial amount of research is done by China, a few other parts of Asia, and perhaps some American outposts of scientific excellence such as MIT and Caltech.

The Rise of China

The trend seems clear. China is heading towards industrial and scientific hegemony, and eventually Peking, Tsinghua, Fudan, Zhejiang, and a few others will, if nothing changes, surpass the Ivy League, the Group of Eight, and Oxbridge, although it will take longer in the more expensive and demanding fields of research. Perhaps the opportunity will be lost in the next few years if there is another proletarian cultural revolution in China or if Western universities change course.

What Happened to Harvard's Money?

It is standard to claim that the success or failure of universities depends on the amount of money they receive. The latest edition of the annual Nature Index tables was accompanied by headlines proclaiming that China's recent success in high-impact research was the result of a long-term investment program.

Money surely had a lot to do with it, but there needs to be a bit of caution here. The higher education establishment has a clear vested interest in getting as much money from the public purse as it can and is inclined to claim that any decline in the rankings is a result of hostility to higher education.

Tracing the causes of Harvard's decline, we should consult the latest edition of the Round University Rankings, which provides ranks for 20 indicators. In 2021 Harvard was first, but this year it was second, replaced by Caltech. So what happened? Looking more closely, we see that in 2021 Harvard was 2nd for financial sustainability and in 2022 it was 357th. That suggests a catastrophic financial collapse. So maybe there has been a financial disaster over at Harvard and the media simply have not noticed bankrupt professors jumping out of their offices, Nobel laureates hawking their medals, or mendicant students wandering the streets with tin cups.

Zooming in a bit, it seems that, if the data is accurate, there has been a terrible collapse in Harvard's financial fortunes. For institutional income per academic staff Harvard's rank has gone from 21st to 891st.

Exiting sarcasm mode for a moment, it is of course impossible that there has actually been such a catastrophic fall in income. I suspect that what we have here is something similar to what happened to Trinity College Dublin a few years ago, when someone forgot the last six zeros when filling out the form for the THE world rankings.

So let me borrow a flick knife from my good friend Occam and propose that what happened to Harvard in the Round University Rankings was simply that somebody left off the zeros at the end of the institutional income number when submitting data to Clarivate Analytics, who do the statistics for RUR. I expect next year the error will be corrected, perhaps without anybody admitting that anything was wrong.

So, there was no substantial reason why Harvard lost ground to Caltech in the Round Rankings this year. Still it does say something that such a mistake could occur and that nobody in the administration noticed or had the honesty to say anything. That is perhaps symptomatic of deeper problems within American academia. We can then expect the relative decline of Harvard and the rise of Chinese universities and a few others in Asia to continue.





Saturday, June 18, 2022

Is China really quitting the international rankings?

For some time, there have been signs that some of the leading higher education powers are disenchanted with global rankings, at least those based in the UK. Russia has wound up its 5 Top 100 project, aimed at getting five universities in the top 100 of selected rankings, and several of the highly regarded Indian Institutes of Technology have withdrawn from the THE world rankings. This seems to be part of a general withdrawal from global, or Western, standards and practices in higher education and research, the latest example of which is Russia leaving the Bologna process.

Recently University World News reported that three Chinese universities, Nanjing, Renmin University of China, and Lanzhou, would not participate in "all international rankings", which appears to mean the THE and QS rankings.

It is typical of the biases of the ranking world that it seems to be assumed that abandoning the QS and THE world rankings is equivalent to leaving international rankings altogether.  

In itself, the reported withdrawal by the three universities means little. None of them were in the world top 100. But it does seem that China is becoming more sceptical of the pretensions of the western rankers. Most Chinese universities, for example, have ignored the THE impact rankings, although Fudan University did make an appearance in the most recent edition, taking first place for clean and affordable energy.

China may also have noticed that proposed changes by QS and THE could work to its disadvantage. QS says that next year it will introduce a new indicator into the world rankings, International Research Network, where Chinese institutions do not do very well. THE is considering a variety of changes whose impact is still not clear, perhaps not even to THE's data team, and which may have an unsettling effect on Asian universities.

It seems that the world's universities are beginning to diverge in several important ways, not just with regard to rankings. China, for example, is deemphasising publications in international journals. US and European institutions are increasingly concerned with social and political matters that are of limited interest in other parts of the world.

It seems that some countries are adopting a pragmatic approach to rankings, making use of them when convenient and ignoring them when necessary. One sign of this approach recently came from Shanghai, where the city is opening the hukou, a document that regulates access to education, health insurance and housing, to graduates of universities at the top of one of four world rankings: Shanghai, QS, THE, and the US News Best Global Universities. The hukou will be available to those from universities in the top fifty if in full-time employment for six months, and after six months for those with degrees from universities ranked 51-100.

This is part of an effort to restart the city's economy after recent lockdowns. It would be unsurprising if other Chinese cities and other countries adopted similar policies.


Sunday, May 08, 2022

Has China Really Stopped Rising? Looking at the QS Subject Rankings

For the last few years the publication of global rankings has led to headlines about the rise of Asia. If these were to be believed we would expect a few Asian universities to be orbiting above the stratosphere by now.

The Asian ascent was always somewhat exaggerated. It was true largely for China and perhaps Southeast Asia and the Gulf States. Japan, however, has been relatively stable or even declining a bit, and India so far has made little progress as far as research or innovation is concerned. Now, it seems that the Chinese research wave may be slowing down. The latest edition of the QS subject rankings suggests that the quality of Chinese research is levelling off and perhaps even declining.

A bit of explanation here. QS publishes rankings for broad subject fields such as Arts and Humanities and for narrow subject areas such as Archaeology. All tables include indicators for H-index, citations per paper, academic review, and employer review, with varying weightings. This year, QS has added a new indicator, International Research Network (IRN), based on the number of international collaborators and their countries or territories, introduced with "broad" -- does that mean not unanimous? -- support from its Advisory Board. Chinese universities do much less well here than on the other indicators.

With QS, as with the other rankings, we should always be careful when there is any sort of methodological change. The first rule of ranking analysis is that any non-trivial change in rank is likely to be the result of methodological changes.

So let's take a look at the broad field tables. In Arts and Humanities the top Chinese university is Peking University, which fell seven places from 36th to 43rd between 2021 and 2022.

It was the same for the other broad areas. In Engineering and Technology, Tsinghua fell from 10th to 14th, and in Natural Sciences from 15th to 23rd (in this table Peking moved slightly ahead of Tsinghua into 21st place). In Social Sciences and Management Peking went from 21st to 26th.

There was one exception. In Life Sciences and Medicine Peking rose from 62nd to 53rd, although its overall score remained the same at 79.

However, before assuming that this is evidence of Chinese decline, we should note the possible impact of the new indicator, where Chinese institutions, including Peking and Tsinghua, do relatively poorly. In Life Sciences and Medicine every single one of the 22 Chinese universities listed does better for H-index and Citations than for IRN.

It looks as though the ostensible fall of Chinese universities is partly or largely due to QS's addition of the IRN metric.

Looking at Citations per paper, which is a fairly good proxy for research quality, we find that for most subject areas the best Chinese universities have improved since last year. In Engineering and Technology Tsinghua has risen from 89.1 to 89.6. In Life Sciences and Medicine Peking has gone from 79.2 to 80.6, and in Social Sciences and Management from 89.7 to 90.7.

For Natural Sciences, Tsinghua had a citations score of 88.6. That score fell this year, and Tsinghua was surpassed by Peking, which scored 90.1.

If Citations per Paper is considered the arbiter of research excellence, then Chinese universities have been improving over the last year, and the apparent decline in the broad subject areas is largely the result of the new indicator. One wonders if the QS management knew this was going to happen.

That is not the end of the discussion. There may well be areas where the Chinese advance is faltering or at least reaching a plateau and this might be revealed by a scrutiny of the narrow subject tables.



Monday, March 28, 2022

Where does reputation come from?

THE announced the latest edition of its reputation rankings last October. The amount of information is quite limited: scores are given for only the top fifty universities. But even that provides a few interesting insights.

First, there is really no point in providing separate data for teaching and research reputation. The correlation between the two for the top fifty is .99. This is unsurprising. THE surveys researchers who have published in Scopus indexed journals and so there is a very obvious halo effect. Respondents have no choice but to refer to their knowledge of research competence when trying to assess teaching performance. If THE are going to improve their current methodology they need to recognise that their reputation surveys are measuring the same thing. Maybe they could try to find another source of respondents for the teaching survey, such as school advisors, students or faculty at predominantly teaching institutions. 
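
For anyone who wants to check this sort of claim, the calculation is straightforward. A minimal sketch with invented reputation scores (the published THE scores for the top fifty would be substituted in practice):

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two score lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))

# Invented teaching and research reputation scores for five universities.
# Near-identical orderings, as with the halo effect described above,
# push the correlation towards 1.
teaching = [95.0, 88.0, 76.0, 64.0, 51.0]
research = [97.0, 90.0, 74.0, 66.0, 50.0]
print(pearson(teaching, research))
```

When two indicators track each other this closely, publishing both adds noise-free redundancy rather than information.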

Next, after plugging in a few indicators from other rankings, it is clear that the metrics most closely associated with teaching and research reputation are publications in Nature and Science (Shanghai), highly cited researchers (Shanghai), and papers in highly reputed journals (Leiden).

The correlation with scores in the RUR and QS reputation rankings, citations (THE and QS), and international faculty was modest.

There was no correlation at all with the proportion of papers with female or male authors (Leiden).

So it seems that the best way to acquire a reputation for good teaching and research is to publish papers in the top journals and get lots of citations. That, of course, applies only to this very limited group of institutions.



Sunday, March 20, 2022

What should Rankers Do About the Ukraine Crisis?

Over the last few days there have been calls for the global rankers to boycott or delist Russian universities to protest the Russian invasion of Ukraine. There have also been demands that journals should reject submissions from Russian authors and universities and research bodies stop collaborating with Russian authors.

So far, four European ranking agencies have announced some sort of sanctions.

U-Multirank has announced that Russian universities will be suspended "until they again share in the core values of the European higher education area."

QS will not promote Russia as a study area and will pause business engagement. It will also redact Russian universities from new rankings.

Webometrics will "limit the value added information" for Russian and Belarusian universities.

Times Higher Education (THE) will stop business activities with Russia but will not remove Russian universities from its rankings. 

The crisis has highlighted a fundamental ambiguity in the nature of global rankings. Are they devices for promoting the business interests of institutions or do they provide relevant and useful information for researchers, students and the public?

Refraining from doing business with Russia until it withdraws from Ukraine is a welcome rebuke to the current government. If, however, rankings contain useful information about Russian scientific and research capabilities then that information should continue to be made available.



Sunday, September 26, 2021

What is a University Really for?

Louise Richardson, Vice-Chancellor of the University of Oxford, has seen fit to enlighten us about the true purpose of a university. It is, it seems, to inculcate appropriate deference to the class of certified experts.

Professor Richardson remarked at the latest Times Higher Education (THE) academic summit that she was embarrassed that "we" had educated the Conservative politician Michael Gove who said, while talking about Brexit, that people had had enough of experts.

So now we know what universities are really about. Not critical discussion, cutting-edge research, skepticism, or the disinterested pursuit of truth, but teaching respect for experts.

A few years ago I wrote a post suggesting we were now in a world where the expertise of the accredited experts was declining along with public deference. I referred to the failure of political scientists and pundits to predict the nomination of Trump, the election of Trump, the rise of Leicester City, and the Brexit vote. It looks like respect for experts has continued to decline, not entirely without reason.

Professor Richardson thinks that Gove's disdain for the Brexit experts is cause for embarrassment. While it is early days for the real effects of Brexit to become clear, it is as yet far from obvious that it has been an unmitigated disaster. It is, moreover, a little ironic that the remark was made at the latest THE academic summit, where the annual world rankings were announced. Richardson remarked that she was delighted that her university was once again ranked number one.

The irony is that the THE world rankings are probably the least expert of the global rankings although they are apparently the most prestigious at least among those institutions that are known for being prestigious.

Let's have another look at THE's Citations Indicator, which is supposed to measure research quality or impact and accounts for nearly a third of the total weighting. (Regular readers of this blog can skim or skip the next few lines.) Here are the top five from this year's rankings.

1.   University of Cape Coast

2.   Duy Tan University

3.   An Najah National University

4.   Aswan University

5.   Brighton and Sussex Medical School

This is not an academic version of the imaginary football league tables that nine-year-old children used to construct. Nor is it the result of massive cheating by the universities concerned. It is quite simply the outcome of a hopelessly flawed system. THE, or rather its data analysts, appear to be aware of the inadequacies of this indicator but somehow meaningful reform keeps getting postponed. One day historians will search the THE archives to find the causes of this inability to take very simple and obvious measures to produce a sensible and credible ranking. I suspect that the people in control of THE policy are averse to anything that might involve any distraction from the priority of monetising as much data as possible. Nor is there any compelling reason for a rush to reform when universities like Oxford are unconcerned about the inadequacies of the current system.

Here are the top five for income from industry, which is supposed to have something to do with innovation.

1.   Asia University Taiwan

2.   Istanbul Technical University

3.   Khalifa University

4.   Korean Advanced Institute of Science and Technology (KAIST)

5.   LMU Munich

This is a bit better. It is not implausible that KAIST or LMU Munich is a world leader for innovation. But in general, this indicator too is inadequate for any purpose other than providing fodder for publicity. See a scathing review by Alex Usher.

Would any tutor or examiner at Oxford give any credit to a student who thought that Ghana, Vietnam, and Palestine were centres of international research impact? Universities there are doing a remarkable job of teaching in many respects, but that is not what THE is ostensibly giving them credit for.

In addition, the THE world rankings fail to meet satisfactory standards of basic validity. Looking at the indicator scores for the top 200 universities in the most recent world rankings, we can see that the correlation between research and teaching is 0.92. In effect these are not two distinct metrics: they are measuring essentially the same thing. A quick look at the methodology suggests that what they are actually comparing is income (total institutional income for teaching, research income for research), reputation (the opinion surveys for teaching and research), and investment in doctoral programmes.

On the other hand, the citations indicator does not correlate significantly with research or teaching and correlates negatively with industry income.
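For readers who want to check this sort of claim for themselves, the calculation is simple. Here is a minimal sketch using invented indicator scores for five hypothetical universities rather than the actual THE data (all figures below are illustrative, not real ranking scores):

```python
import numpy as np

# Invented indicator scores for five hypothetical universities.
# The real check would use the ~200 rows of published THE scores.
teaching  = np.array([90.1, 85.3, 78.2, 70.5, 65.0])
research  = np.array([92.4, 88.0, 76.9, 72.1, 63.8])
citations = np.array([60.2, 95.1, 70.4, 55.3, 88.7])

# np.corrcoef returns a correlation matrix; the off-diagonal
# entry [0, 1] is the Pearson correlation between the two series.
r_teach_research = np.corrcoef(teaching, research)[0, 1]
r_cite_research  = np.corrcoef(citations, research)[0, 1]

print(f"teaching vs research:  {r_teach_research:.2f}")
print(f"citations vs research: {r_cite_research:.2f}")
```

With these made-up numbers, teaching and research move almost in lockstep (a correlation near 1), while citations barely track research at all. A correlation of 0.92 between two indicators means they carry essentially the same ranking information, which is the validity problem described above.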

One can hardly blame THE for wanting to make as much money as possible. But surely we can expect something better from supposedly elite institutions that claim to value intellectual and scientific excellence. If Oxford and its peers wish to restore public confidence in the experts, there is no better way than telling THE that they will not submit data until it produces something a little less embarrassing.