Friday, December 15, 2023

Yet another example of the misuse of rankings

The proliferation of rankings has led to universities selectively quoting metrics in attempts to boost prestige, student applications, and state support. A recent example is Brunel University's claim that it is the joint most international university in the UK and fourth most international in the world.

This is based on the International Outlook pillar in the most recent edition of the Times Higher Education (THE) world rankings.

THE is not the only ranking with an internationalisation indicator. Let's take a look at the others.

In the QS world rankings Brunel is 9th in the UK for International Faculty, joint 12th for International Students, and 36th for International Research Network.

In the latest URAP (at the time of writing) it is 34th in England for International Collaboration.

In Round University Rankings, Brunel is 9th for International academic staff in the UK, 17th for international students, and 22nd for International Level.

In Leiden Ranking it is joint 6th in the UK for International Collaboration.

I don't want to denigrate Brunel in any way, but the claim that it is the most international university in the UK is misleading and should be withdrawn, or at least accompanied by a very large asterisk.

Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year, and the US News global rankings announced last year can be filtered by subject. The methods are different and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings, Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents, but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.

These rankings continued to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings, China is now dominant in Engineering subjects: Chinese universities lead in fifteen of them, although Harvard, MIT, and Northwestern University between them lead in seven. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and the University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems are:

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.

Friday, November 24, 2023

Observations on the THE Arab University Rankings

Times Higher Education (THE) has just announced the third edition of its Arab University Rankings. There has been a churning of universities, with many falling and many rising. Once again, this volatility seems largely the result of methodology changes and only in part of any genuine decline or progress.

The rankings are led by King Abdullah University of Science and Technology (KAUST) in Saudi Arabia, which makes sense from the point of view of high impact research, although it does no undergraduate teaching. After that we have Khalifa University, UAE, Qatar University, King Fahd University of Petroleum and Minerals, Saudi Arabia, and the University of Sharjah, UAE.

THE has introduced a raft of changes in its World University Rankings, including adding patents as a metric, tweaking the internationalisation pillar to help larger countries, and including three new measures of citations. 

They have added more changes to the Arab University Rankings. The weighting given to the teaching and research surveys has been trimmed. Field Normalised Citation Impact has been removed altogether, leaving the three new metrics for research impact: Research Strength, Research Excellence, and Research Influence. Within the International Outlook pillar there is now a 2% weighting for inter-Arab collaboration. The Society pillar, unlike that of the world rankings, does not include patents, and it gives a 4% weight to participation and performance in THE's Impact Rankings.

It is always advisable to look at the leaders for each specific metric in any ranking, especially THE's. For this year's ranking they are: Research Quality: KAUST; International Outlook: Gulf Medical University, UAE; Research Environment: KAUST; Teaching: Beirut Arab University; Society: KAUST.

There are some interesting things about this year's rankings. To start, there is a noticeable improvement in the ranks of universities in the United Arab Emirates. There are now six UAE universities in the top 25 compared with four last year and three in 2021.

Some Emirati universities have done particularly well, Khalifa University in Abu Dhabi has risen from fifth place to second and  Abu Dhabi University from 39th to 9th. 

The results were announced this year at the THE MENA summit which this year was held at the campus of New York University Abu Dhabi. 

That meeting also saw a number of awards going to UAE institutions, including Abu Dhabi University for International Strategy of the Year, Gulf Medical University in Ajman for outstanding support for students, New York University Abu Dhabi for Research Project (STEM), and American University in Dubai for Teaching and Learning Strategy.

A few years ago I noticed that THE was holding conferences where it would announce results that appeared to favour the host countries. Thus in February 2015 THE held a MENA summit in Qatar with a "snapshot" single-metric ranking that put Texas A & M Qatar in first place and UAE University 11th. The next MENA meeting was held in January 2016 in Al Ain, UAE, where, in a ranking that used the WUR metrics, Texas A & M Qatar disappeared and UAEU rose to fifth place.

Another example: in February 2016, at a conference held at the Hong Kong University of Science and Technology, THE introduced a new methodology for its Asian rankings that dethroned the University of Tokyo as the top Asian university and placed it below universities in Hong Kong, Singapore, and Mainland China.

In contrast, the number of Egyptian universities in the top 25 has fallen from six to two: Mansoura University and the American University in Cairo. Last year's front runner, King Abdulaziz University, Saudi Arabia, has fallen to 15th place.

So the holding of a summit in Abu Dhabi and a new methodology coincided with a significant improvement for the UAE in general and a very significant improvement for two Abu Dhabi universities. Plus NYU Abu Dhabi, currently unranked, received an award. Perhaps this is just a coincidence, or perhaps such a turnover in a single year reflects real changes, which the new methodology accurately detects. But cynics may wonder a little.

There has been a lot of discussion recently about conflict of interest in the ranking business. It is likely that questions will be asked about a new methodology so conveniently helping institutions in the summit host country.

Sunday, November 19, 2023

How Dare They? HE Sector Reacts to the King's Speech

 Occasionally there are moments when a few casual words suddenly illuminate things that have been obscured. 

One such moment was when the Vice-Chancellor of the University of Oxford expressed her embarrassment that one of the university's alumni had failed to express appropriate deference towards experts. It would seem that a major function of higher education is not the encouragement of critical thought but the acceptance of anything that has been approved by academic experts. This was especially ironic since the claim was made at a summit held by Times Higher Education, whose expertise in the field of university ranking is somewhat questionable.

The recent King's Speech contained a bland announcement that the government would "ensure young people have the knowledge and skills to succeed" by combining technical and academic qualifications.  Also, the government will attempt to reduce enrollment in "poor quality university degrees and increase the number undertaking high quality apprenticeships." 

It is quite possible that any such initiatives will fail to get off the ground or will crash soon after take-off, and even if implemented they would probably be only marginally effective, if effective at all.

Research Professional News, however, reports that industry insiders are incensed that the government has dared to say anything that could be considered critical of British universities. Diana Beech, Chief Executive of the London Higher Group of Institutions, said "it is beyond belief that the UK government would even contemplate asking His Majesty the King to speak negatively of the national asset that is our world-leading higher education and research sector."

The King is not speaking negatively of the entire sector. He is talking about proposed efforts to improve the sector. And surely the "brightest and the best" of the world are more likely to come to Britain if they think there are efforts to bring about positive change.

It seems that the academic establishment wants everybody to pretend that there is nothing wrong with British universities. That, in the long run, is not going to do anyone any good.


Saturday, October 21, 2023

Crisis, conflict and global rankings

Just published in the Journal of Adult Learning, Knowledge and Innovation

Crisis, conflict and global rankings

Read here


Abstract

Global university rankings have always been associated with international political and economic conflicts. Even before the COVID-19 pandemic there were signs that scientific and academic globalism was breaking down. The pandemic, the various measures taken to combat it, and military and ideological conflicts have led to the breakdown of international academic cooperation, the formation of very different research complexes, and the development of new regional ranking systems.

Friday, September 01, 2023

Two Decades of Rankings: Rising and Falling in ARWU

Most rankings are of little value for identifying trends over more than a couple of years. Changes in methodology, and sometimes a lack of access to old editions, make year-on-year comparisons difficult or impossible. The Shanghai Rankings, aka ARWU, have maintained a generally consistent methodology over two decades and publish data going back to the founding year of 2003.

So it is possible to use ARWU to look for patterns in the world's research and higher education landscape. Here are some "winners" and "losers", based on the number of universities in the ARWU top 500 in 2004, when Shanghai changed the initial methodology to include the social sciences, and in 2023. This is far from a perfect measure; for a start, this ranking takes no account of the humanities and relies too much on old Nobel and Fields laureates. Even so, it does give us some idea of the shift in the academic world's centre of gravity.

Rising

Australia from 14 in 2004 to 24 in 2023 (and from 2 in the top 100 to 7)

Brazil from 4 to 5

China from 16 to 98 (and from zero in the top 100 to 11)

Malaysia from zero to one

New Zealand from 3 to 4

Saudi Arabia from zero to 6

Singapore 2 in the top 500 in 2004 and 2023 (but rising from zero in the top 100 in 2004 to 2 in 2023)

South Korea from 8 to 11

Falling

Canada from 23 to 18 

France from 22 to 18

Germany from 43 to 31

India from 3 to 1

Israel from 7 to 5 (but rising from 1 to 3 in the top 100)

Italy from 24 to 16

Japan from 36 to 12 (and from 5 in the top 100 to 2)

Switzerland from 8 to 7

United Kingdom from 42 to 38 (and from 11 in the top 100 to 8)

United States from 170 to 120 (and from 51 in the top 100 to 38)


The last two decades have seen a massive increase in the research capabilities of universities in Australia, China, South Korea, and Singapore. The rest of Asia, including Japan and India, has stagnated or even fallen relatively and perhaps absolutely.

The biggest losers are the USA, UK, and Germany although Canada, France, Italy and Switzerland have also not done so well.

More recently, Saudi Arabia has noticeably improved and may soon be followed by other Middle Eastern states.

Wednesday, July 05, 2023

The New QS Methodology: Academic Snakes and Ladders

The ranking season is under way. The latest edition of the QS world rankings  has just been announced and we have already seen the publication of the latest tables from Leiden Ranking, CWUR, RUR, uniRank and the THE Impact Rankings plus the THE Asian and Young Universities and  Sub-Saharan Africa rankings. Forgive me if I've missed anything.

Each of these tells us something about the current geopolitics of higher education and science and the way in which they are reflected in the global ranking business. 

The QS rankings have a new methodology, which makes it quite different from previous editions. Nonetheless, the media have been full of universities celebrating their remarkable achievements as they have soared, surged, risen, ascended in the rankings. No doubt, there will be a few promotions and bonuses.

It is in the nature of rankings that places are finite, and if some universities suddenly surge then others will fall. It seems that the general pattern of the new QS rankings is that Canadian, Australian, and American universities are rising while Chinese, Korean, and Indian universities are falling. Russian and Ukrainian universities are also falling, although that might be for other reasons.

QS have reduced the weighting of their academic survey from 40% to 30% and faculty student ratio from 20% to 10%. The weighting for the employer survey has increased from 10% to 15% and there are three new indicators, Sustainability, Employment Outcomes, and International Research Network.
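Weighting changes of this kind can reshuffle an order of merit even when no university's underlying scores move at all. Here is a back-of-the-envelope sketch in Python: the two universities and all their indicator scores are invented, and the weight vectors only approximate the old and new QS schemes described above, so this is an illustration of the mechanism, not a reconstruction of the actual results.

```python
# Toy reweighting exercise: two hypothetical universities scored 0-100 on
# each QS-style indicator. Weights roughly follow the change described in
# the text (academic survey 40% -> 30%, faculty-student 20% -> 10%,
# employer 10% -> 15%, plus three new 5% indicators). All scores invented.

def composite(scores, weights):
    """Weighted sum of indicator scores; weights sum to 1."""
    return sum(scores.get(k, 0) * w for k, w in weights.items())

old_w = {"academic": .40, "employer": .10, "fsr": .20,
         "citations": .20, "int_faculty": .05, "int_students": .05}
new_w = {"academic": .30, "employer": .15, "fsr": .10,
         "citations": .20, "int_faculty": .05, "int_students": .05,
         "irn": .05, "sustainability": .05, "employment": .05}

# "Research Flagship U": strong reputation and staffing ratios.
flagship = {"academic": 95, "employer": 60, "fsr": 90, "citations": 70,
            "int_faculty": 80, "int_students": 80, "irn": 70,
            "sustainability": 50, "employment": 60}
# "Networked U": strong employer survey and sustainability scores.
networked = {"academic": 75, "employer": 95, "fsr": 60, "citations": 70,
             "int_faculty": 80, "int_students": 80, "irn": 85,
             "sustainability": 90, "employment": 90}

print(f"{composite(flagship, old_w):.1f} {composite(networked, old_w):.1f}")  # 84.0 73.5
print(f"{composite(flagship, new_w):.1f} {composite(networked, new_w):.1f}")  # 77.5 78.0
```

Under the old weights the reputation-and-staffing university is comfortably ahead; under the new weights the employer-and-sustainability university edges past it, with no change in either institution's performance.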

QS claim that the new methodology "reflects the collective intelligence of the sector, and the changing priorities of students." If that is so, then the collective intelligence is very localised. The new methodology puts a heavy fist on the scales in favour of Western universities and against Asia.  

The revised methodology works against universities that have acquired a good reputation for research or recruited a large and permanent faculty. It favours those that have mobilised their alumni network for the employer survey and are enthusiastic participants in the sustainability movement. 

As a result the leading Chinese institutions have taken a tumble. Peking has fallen 5 places to 17th, Tsinghua 11 to 25th, and Fudan 16 to 50th. 

Other national and regional flagships have tumbled: Seoul National University from 29th to 41st, the Indian Institute of Science from 115th to 225th, the University of Tokyo from 23rd to 28th, and the University of Hong Kong from 21st to 26th.

In contrast, the University of British Columbia, McGill University, the University of  Toronto, the University of Melbourne, the University of Sydney, the University of Cape Town, Witwatersrand University, Trinity College Dublin, University College Dublin, University of California Berkeley and UCLA are climbing the ladders. For the moment at any rate.


Wednesday, May 17, 2023

World Economic Forum Declares that Africa is Rising

The World Economic Forum (WEF),  an organization of the global economic elite, has published a report by Phil Baty, currently Chief Global Affairs Officer of Times Higher Education (THE), that proclaims that African universities are "surging in the world rankings" and that this is a highly positive development. This is an irresponsible claim that has scant relationship to reality. 

The report refers to a claim in 2012 by Max Price, then Vice-Chancellor of the University of Cape Town, that African universities needed to rise to the challenge of global university rankings. According to Baty, African universities are now successfully competing in the rankings game and rising to the top.

And just what is the evidence for this? Well, in 2012 there were four African universities in the THE World University Rankings; in 2022 there were 97. An impressive and remarkable achievement indeed, if we are talking about the same rankings. But they are not the same.

In 2012 the THE rankings consisted of 402 universities. By 2022 they had expanded to 2,345 including "reporters", of which 1500+ were formally ranked. It would be truly amazing if any region had failed to improve its representation.

The real comparison, of course, is with the numbers in the top 400 in both years, and here there is no sign of any African surging. In 2012 there were three South African universities in the top 400. The University of the Witwatersrand and Stellenbosch University were in the 251-275 band, and in the 2022-2023 rankings they were in the 251-375 range. The University of Cape Town was 103rd in 2012 and 160th in 2022-2023, which might be cause for concern if this were a ranking with a rigorous and stable methodology, but that is not the case for THE.

The fourth African university in the top 400 in 2012 was the University of Alexandria, in the 301-350 band. By 2022 it had dropped to the 801-1000 band. This was a university on a downward spiral from its magical moment of ranking glory in 2010, when it was ranked 147th in the world overall and 4th in the world for citations as a result of a spurious affiliation claim by a serial self-publisher and self-citer who was involved in a libel case with Nature.

By 2022-2023 another African university, the University of Cape Coast in Ghana, had entered the top 400. This was not a testimony to any kind of achievement. It was simply the result of the university taking part in the Gates-funded Global Burden of Disease Study (GBDS). Because of a flawed methodology, it is possible for a university with a few papers in the study, which typically have hundreds of authors and citations, and a small number of total publications to rack up scores of 80, 90, or even 100 for this indicator.
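The arithmetic behind this distortion is simple enough to sketch. The Python fragment below uses entirely invented numbers and a plain citations-per-paper average; THE's actual indicator is field-normalised, so this is only a simplified illustration of the mechanism, not its formula.

```python
# Hypothetical illustration of how a handful of mega-cited papers (such as
# Global Burden of Disease Study papers) can swamp a citations-per-paper
# average for a university with a small total output. Numbers are invented.

def citations_per_paper(batches):
    """Mean citations over a list of (paper_count, citations_each) batches."""
    total_papers = sum(n for n, _ in batches)
    total_cites = sum(n * c for n, c in batches)
    return total_cites / total_papers

# A small university: 200 ordinary papers at about 2 citations each...
baseline = citations_per_paper([(200, 2)])

# ...plus just 5 GBD-style papers at about 3000 citations each.
with_gbd = citations_per_paper([(200, 2), (5, 3000)])

print(f"without GBD papers: {baseline:.1f} citations/paper")  # 2.0
print(f"with 5 GBD papers:  {with_gbd:.1f} citations/paper")  # 75.1
```

Five papers out of 205 multiply the average by a factor of nearly forty, which is how a small institution can post a citation score that dwarfs those of established research universities.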

There is then no evidence of a surge of any kind, not even a bit of a trickle.

That brings us to the assertion that Nigeria, followed by Egypt, has posted the biggest gains in the citations indicator, which purports to measure research impact or research quality or something, and has therefore achieved excellent progress.

THE is being entirely too modest here. It could have used the indicator to celebrate the extraordinary accomplishment of a range of African institutions and countries that have surged in the rankings with 90+ scores for citations, an incredible feat that contrasts with very low scores for Research, which includes publications, expenditure, and reputation. In fact, if this indicator is taken seriously, a number of African universities have now outpaced reputable research universities in North America and China.

These research influencers, according to THE, include Jimma University, Ethiopia, Damietta University, Egypt, Muhimbili University of Health and Allied Sciences, Tanzania, Aswan University, Egypt, University of Lagos, Nigeria, the University of Zambia, and Kafrelsheikh University, Egypt.

Again, this has nothing to do with excellence or teamwork or transformative practices or any other current managerial shibboleth. It is largely the result of contributing a single researcher or a few to the hundreds of "authors" of GBDS papers in The Lancet and other prestigious journals and collecting credit for thousands of citations.

The cruellest aspect of this is that THE have announced that they are finally getting around to a partial revamping of the world rankings methodology this year. If THE do go ahead, it is very likely (not certain, because the whole process is so complex and opaque) that these universities will go tumbling down the rankings, and we shall probably see leaders across the continent under fire for their gross incompetence.

It is strange that an organisation that supposedly represents the best minds of the corporate world should adopt THE as the sole arbiter of African excellence. It is not the only global ranking, and it is probably the worst for Africa: it emphasises income, assessed by three separate indicators; relies on self-submitted data, which diverts the unacknowledged labour of talented and motivated faculty; leans heavily on reputation; and privileges postgraduate programmes.

Perhaps also, at the risk of committing heresy in the first degree, the quality of higher education should not be Africa's highest priority. The latest edition of the Progress in International Reading Literacy Study (PIRLS) shows that Morocco, Egypt, and South Africa do very poorly with regard to fourth grade literacy. For South Africa and Morocco, the situation revealed by the 2019 Trends in International Mathematics and Science Study (TIMSS) was little better, although they did come out ahead of Pakistan and the Philippines. Surely this is as crucial for the future of Africa as the funding of doctoral programmes.


Sunday, April 23, 2023

Article in University World News

 



Go HERE for my recent article in University World News.


Is the switch in rankings’ focus masking the West’s decline?


A recent commentary in The Lancet by Richard Horton presented criticism of international university rankings, drawing in part on a briefing paper from the United Nations University (UNU) in Kuala Lumpur, Malaysia.

Horton makes some relevant comments on the rankings, although his survey is very limited and incomplete, and he argues that they need to be reformed to hold universities accountable for their social responsibilities. He notes that the UNU report suggests doing away with rankings altogether.

The status of The Lancet is such that this article provides insight into the collective thinking of the Western academic and scientific establishment and it therefore needs some attention.

To start with, getting rid of rankings, as posited in the article, sounds like a good idea, but it is not really feasible. Bureaucrats and faculty in the big brand universities are not suggesting that every university is as good as any other, that their salaries or tuition fees be reduced to the industry average, that research grants be allocated randomly or that their students are no more employable or intelligent than those at other places.

Monday, March 20, 2023

The Frontiers of the Ranking World: UNIRANKS

After predatory journals and predatory conferences, the next logical step would be predatory rankings. 

Recently, internet searches have shown up something called UNIRANKS, with a polished website containing several plausible world and country rankings and announcements about a forthcoming conference, along with a list of photogenic speakers and detailed instructions about registration and payment.

I will not give out the URL since a couple of clicks in I ran into a bright red screen with a warning about phishing.

There appear to be no adequate methodological details, no advisory committee, no criteria for inclusion, nor any of the other things provided by even the most technically careless rankings. Even the name is a near copy of UniRank, a reputable if limited search engine and ranking.

So be warned. If I receive any indication that this is a proper ranking and conference I will of course post it.

Saturday, March 18, 2023

SCImago Innovation Rankings: The East-West Gap Gets Wider

The decline of western academic research becomes more apparent every time a ranking with a stable and moderately accurate methodology is published. This will not be obvious if one just looks at the top ten, or even the top fifty, of the better known rankings. Harvard, Stanford, and MIT are usually still there at the top and Oxford and Cambridge are cruising along in the top twenty or the top thirty.

But take away the metrics that measure inherited intellectual capital such as the Nobel and Fields laureates in the Shanghai rankings or the reputation surveys in the QS, THE, and US world rankings, and the dominance of the West appears ever more precarious. This is confirmed if we turn from overall rankings to subject and field tables.

Take a look at the most recent edition of the CWTS Leiden Ranking, which is highly reputed among researchers although much less so among the media. For sheer number of publications overall, Harvard still holds the lead although Zhejiang, Shanghai Jiao Tong and Tsinghua are closing in and there are more Chinese schools in the top 30.  Chinese dominance is reduced if we move to the top 10% of journals but it may be just a matter of time before China takes the lead there as well. 

But click to physical sciences and engineering. The top 19 places are held by Mainland Chinese universities with the University of Tokyo coming in at 20.  MIT is there at 33, Texas A & M at 55 and Purdue 62. Again the Chinese presence is diluted, probably just for the moment, if we switch to the top 10% or 1% of journals.  

Turning to developments in applied research, the shift to China and away from the West, appears even greater.

The SCImago Institutions rankings are rather distinctive. In addition to the standard measures of research activity, there are also metrics for innovation and societal impact. Also, they include the performance of government agencies, hospitals, research centres and companies.

The innovation rankings combine three measures of patent activity. Patents are problematic for comparing universities but they can establish broad long-term trends. 

Here are the top 10 for Innovation in 2009:

1.   Centre National de la Recherche Scientifique

2.   Harvard University 

3.   National Institutes of Health, USA

4.   Stanford University 

5.   Massachusetts Institute of Technology

6.   Institut National de la Santé et de la Recherche Médicale

7.   Johns Hopkins University 

8.   University of California Los Angeles

9.   Howard Hughes Medical Institute 

10.  University of Tokyo.

And here they are for 2023:

1.   Chinese Academy of Sciences 

2.   State Grid Corporation of China  

3.   Ministry of Education PRC

4.   DeepMind

5.   Ionis Pharmaceuticals

6.   Google Inc, USA

7.   Alphabet Inc 

8.  Tsinghua University

9.   Huawei Technologies Co Ltd

10.  Google International LLC.

What happened to the high flying universities of 2009?  Harvard is in 57th place, MIT in 60th, Stanford 127th, Johns Hopkins 365th, and Tokyo in 485th. 

It seems that the torch of innovation has left the hands of American, European, and Japanese universities and research centres and has been passed to multinational, Chinese, and American companies and research bodies, plus a few Chinese universities. I am not sure where the loyalties of the multinational institutions lie, if indeed they have any at all.




Saturday, March 04, 2023

US News and the Law Schools

There has always been a tension between the claim by commercial rankers that they provide insights and data for students and other stakeholders and the need to keep on the good side of those institutions that can provide them with status, credibility, and perhaps even lucrative consultancies.

A recent example is Yale, Harvard, Berkeley and other leading law schools declaring that they will "boycott", "leave", "shun", or "withdraw from" the US News (USN) law school rankings. USN has announced that it will make some concessions to the schools although it seems that, for some of them at least, this will not be enough. It is possible that this revolt of the elites will spread to other US institutions and other rankings. Already Harvard Medical School has declared that it will follow suit and boycott the medical school rankings.

At first sight, it would seem that the law schools are performing an act of extraordinary generosity or self-denial. Yale has held first place since the rankings began, and the others that have joined the supposed boycott come mainly from the upper levels of the ranking, while those that have not are mostly from the middle and lower. To "abandon" a ranking that has served the law school elite so well for many years is odd, to say the least.

But Yale and the others are not actually leaving or withdrawing from the rankings. That is something they cannot do. The data used by US News is mostly from public sources, and where it is supplied by the schools it can be replaced by public data. The point of the exercise seems to be to persuade US News to revise its methodology so that it conforms to the direction in which Yale law school and other institutions want to go.

We can be sure that the schools have a good idea how they will fare if the current methodology continues and what is likely to happen if there are changes. It is now standard practice in the business to model how institutions will be affected by possible tweaks in ranking methodology.

So what was Yale demanding? It wanted fellowships to be given the same weighting as graduate legal jobs. This appears reasonable on the surface, but the fellowships would be under the control of Yale, so the change would add a metric dependent on the ability to fund such fellowships. Yale also wanted debt-forgiveness programmes to be counted in the rankings. Again, this is something dependent on the schools having money to spare.

For a long time the top law schools have been in the business of supplying bright and conscientious young graduates. Employers have been happy to pay substantial salaries to the graduates of the famous "top 14" schools since they appear more intelligent and productive than those from run of the mill institutions.

The top law schools have been able to do this through rigorous selection procedures, including standardised tests and college grades. Basically, they have selected for intelligence and conscientiousness, and perhaps for a certain amount of agreeableness and conformity. There is some deception here, perhaps including self-deception. Yale and the rest of the elite claim that they are doing something remarkable by producing outstanding legal talent, but in fact they are just recruiting the new students with the greatest potential, or at least they did until quite recently.

If schools cannot select for such attributes then they will have problems convincing future employers that their graduates do in fact possess them. If that happens, the law school graduate premium will erode, and future lawyers will be reluctant to go into never-ending debt to enter a career that is increasingly precarious and unrewarding.

The law schools, along with American universities in general, are also voicing their discontent with reliance on standardised tests for admission and their inclusion as ranking indicators. The rationale is that the tests supposedly discourage universities from offering aid according to need and favour those who can afford expensive test prep courses.

Sometimes this is expanded into the argument that, since test scores are related to wealth, wealth is the only thing the tests measure, and so they cannot measure anything else that might be related to academic ability.

The problem here is that standardised tests have a substantial relationship with intelligence, although not as strong as it used to be, which in turn has a strong relationship with academic and career success. Dropping the tests means that schools will have to rely on high school and college grades, which have been increasingly diluted over the last few decades, or on recommendations, interviews, and personal essays, which have little or no predictive validity and can easily be prepped or gamed.

It appears that American academia is retreating from its mission of producing highly intelligent and productive graduates and has embraced the goal of socialisation into the currently dominant ideology. Students will be admitted and graduated, and faculty recruited, according to their doctrinal conformity and their declared identity.

USN has gone some way to meeting the demands of the schools but that will probably not be enough. Already there are calls to have a completely new ranking system or to do away with rankings altogether.


 




Saturday, February 25, 2023

Global Trends in Innovation: Evidence from SCImago

We are now approaching the third decade of global university rankings. They have had a mixed impact. The original Shanghai rankings published in 2003 were a salutary shock for universities in continental Europe and contributed to a wave of restructuring and excellence initiatives. On the other hand, rankings with unstable and unreliable methodologies are of little use to anyone except for the public relations departments of wealthy Western universities. 

In contrast, the SCImago Institutions Rankings, published by a Spanish research organisation, with thousands of universities, hospitals, research institutes, companies and other organisations, can be quite informative, especially the Innovation and Societal Rankings.

The Innovation Rankings, which are based on three measures of patent citations and applications, included 4019 organisations of various kinds in 2009. The top spot was held by the Centre National de la Recherche Scientifique in France, followed by Harvard, the National Institutes of Health in the USA, Stanford, and MIT.

Altogether, the top 20 in 2009 consisted of ten universities (nine American plus the University of Tokyo) and ten non-university organisations: three American, two German, two French, two multinational, and the Chinese Academy of Sciences (CAS) in 14th place.

Fast forward to 2022 and we now have 8084 institutions. First place now goes to CAS, followed by the State Grid Corporation of China, DeepMind Technologies (a British AI firm), the Chinese Ministry of Education, and Samsung Corp.

Now the top twenty includes exactly two universities, Tsinghua in 14th place and Harvard in 20th. The rest are companies, health organisations, and government agencies. The nationalities assigned by SCImago across the top twenty are: multinational eight, USA six, China four, and the UK and South Korea one each.

What about those high flying US universities of 2009? Stanford has fallen from 4th place to 67th, the University of Michigan from 13th to 249th, the University of Washington from 16th to 234th.

The relative, and probably now absolute as well, decline of American academic research has been well documented. It seems that the situation is even more dire for the innovative capability of US universities. But the technological torch is passing not only to Chinese universities and research centres but also to US and multinational corporations.



Saturday, February 04, 2023

Aggregate Ranking from BlueSky Thinking

 

In recent years there have been attempts to construct rankings that combine several global rankings. The University of New South Wales has produced an aggregate ranking based on the “Big Three” rankings, the Shanghai ARWU, Times Higher Education (THE), and QS. AppliedHE of Singapore has produced a ranking that combines these three plus Leiden Ranking and Webometrics.

The latest aggregate ranking is from BlueSky Thinking, a website devoted to research and insights in higher education. This aggregates the big three rankings plus the Best Global Universities published by US News.

There are some noticeable differences between the rankings. The University of California Berkeley is fourth in the US News rankings but 27th in QS. The National University of Singapore is 11th in QS but 71st in ARWU.
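BlueSky Thinking's exact aggregation formula is not described here, but the general idea behind such aggregate rankings can be sketched with a mean-rank combination. The rank lists below are hypothetical (only Berkeley's 4th/27th and NUS's 11th/71st figures come from the text), and real aggregators typically add normalisation and weighting on top of this:

```python
# Minimal sketch of rank aggregation by mean rank across several systems.
# Real aggregate rankings (UNSW, AppliedHE, BlueSky Thinking) may weight
# or normalise the component rankings differently.

def aggregate_by_mean_rank(ranks_by_system):
    """ranks_by_system maps a university to its rank in each component
    ranking. Returns universities ordered by average rank (best first)."""
    mean_rank = {u: sum(r) / len(r) for u, r in ranks_by_system.items()}
    return sorted(mean_rank, key=mean_rank.get)

# Hypothetical ranks in (ARWU, THE, QS, US News) order:
ranks = {
    "University A": [4, 8, 27, 4],     # strong in most systems, weak in QS
    "University B": [10, 6, 11, 12],   # consistently placed across systems
    "University C": [71, 19, 11, 30],  # an NUS-style divergent profile
}
print(aggregate_by_mean_rank(ranks))
```

The sketch shows why the top of an aggregate table tends to be unremarkable: a university with one outlying rank (like Berkeley's 27th in QS) is pulled down only modestly when the other systems agree.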

The top of the aggregate ranking is unremarkable. Harvard leads followed by Stanford, MIT, Cambridge, and Oxford.

There have been some significant changes over the last five years, with universities in Mainland China, Hong Kong, France, Germany, Saudi Arabia, and Australia recording significant improvements, while a number of US institutions, including Ohio State University, Boston University and the University of Minnesota, have fallen.

 

Saturday, January 28, 2023

Another Sustainability Ranking

 

People and Planet is a British student network concerned with environmental and social justice. It has just published a league table that claims to measure the environmental and ethical performance of UK universities.

The top five universities are Cardiff Metropolitan, Bedfordshire, Manchester Metropolitan, Reading, and University of the Arts London. It is interesting that this league table shows almost no overlap with the other rankings that claim to assess commitment or contributions to sustainability.

Here are all six British universities in the latest UI GreenMetric ranking, in order: Nottingham, Nottingham Trent, Warwick, Glasgow Caledonian, Loughborough, Teesside.

The top five British universities in the THE Impact Rankings are Newcastle, Manchester, Glasgow, Leicester, King's College London. For the QS Sustainability rankings we have: Edinburgh, Glasgow, Oxford, Newcastle, Cambridge.

There is some similarity between the QS Sustainability and the THE Impact Rankings, because both give prominence to research on environmental topics. But even this overlap is modest compared to the much greater overlap between conventional research-based rankings such as Leiden, Shanghai, or URAP (produced by Middle East Technical University).
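The degree of overlap between two league tables can be quantified with a rank correlation coefficient such as Spearman's rho, computable in a few lines. The rank lists below are invented for illustration, not taken from the actual tables:

```python
# Spearman rank correlation for two lists of ranks over the same
# universities (assumes no tied ranks). rho = 1 means identical
# orderings, 0 no association, -1 a reversed ordering.

def spearman_rho(ranks_x, ranks_y):
    n = len(ranks_x)
    d2 = sum((x - y) ** 2 for x, y in zip(ranks_x, ranks_y))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical positions of five universities in two sustainability rankings:
ranking_a = [1, 2, 3, 4, 5]
ranking_b = [4, 1, 5, 2, 3]
print(spearman_rho(ranking_a, ranking_b))  # close to zero: little agreement
```

A rho near zero, as in this invented example, is what "almost no overlap" amounts to; conventional research rankings of the same universities typically produce much higher values.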

This surely raises serious questions about the trend towards rankings based on sustainability. If the rankers produce league tables that show such a modest degree of correlation with one another, we have to ask whether there is any point to the exercise at all.



 



Friday, January 20, 2023

Implications of the new QS methodology

QS have announced that the world rankings due to appear in 2023 will have a new methodology. This is likely to produce significant changes in the scores and ranks of some universities even if there are no significant changes in the underlying data. 

There will no doubt be headlines galore about how dynamic leadership and teamwork have transformed institutions or how those miserable Scrooges in government have been crushing higher education by withholding needed funds.

The first change is that the weighting of the academic survey will be reduced from 40% to 30%. This is quite sensible: 40% is far too high for any one indicator. It remains, however, the largest single indicator, and one that tends to favour the old elite, or those universities that can afford expensive marketing consultants, at the expense of emerging institutions. The employer survey weight will go up from 10% to 15%.

Next, the weighting of faculty student ratio has been cut from 20% to 10%. Again this is not a bad idea. This metric is quite easy to manipulate and has only a modest relationship to teaching quality, for which it is sometimes supposed to be a proxy.

What has not changed is the citations per faculty indicator. This is unfortunate since rankers can get very different results by tweaking the methodology just a bit. It would have been a big improvement if QS had used several different metrics for citations and/or publications, something that Times Higher Education has just got round to doing.

Then there are three new indicators: international research network, graduate employability, and sustainability.

This means that international indicators will now account for a 15% weighting, adding a further bias towards English-speaking universities, or those in small countries adjoining larger neighbours with similar languages and cultures, and working against China and India.

The introduction of a sustainability metric is questionable. It requires a considerable amount of institutional data collecting and this will tend to favour schools with the resources and ambitions to jump through the rankers' hoops.
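The arithmetic of a reweighting like this can be sketched directly. The weights below follow the changes described above (survey 40% to 30%, employer 10% to 15%, faculty-student 20% to 10%, citations unchanged at 20%), with the remaining weight assumed to sit in international and new indicators; QS's exact published scheme may differ, and the indicator scores are invented for illustration:

```python
# Sketch: the same underlying data, scored under old and new weights.
# Weight values are partly from the text, partly assumed; scores are
# hypothetical (0-100 scale).

OLD = {"academic": 0.40, "employer": 0.10, "faculty_student": 0.20,
       "citations": 0.20, "international": 0.10}
NEW = {"academic": 0.30, "employer": 0.15, "faculty_student": 0.10,
       "citations": 0.20, "international": 0.15, "new_indicators": 0.10}

def composite(scores, weights):
    """Weighted sum over whichever indicators the scheme uses."""
    return sum(weights[k] * scores.get(k, 0) for k in weights)

# A reputation-rich old-elite university:
elite = {"academic": 95, "employer": 90, "faculty_student": 60,
         "citations": 70, "international": 85, "new_indicators": 80}
# An emerging university strong on staffing and citations:
emerging = {"academic": 60, "employer": 70, "faculty_student": 90,
            "citations": 85, "international": 40, "new_indicators": 50}

print(composite(elite, OLD), composite(elite, NEW))
print(composite(emerging, OLD), composite(emerging, NEW))
```

In this invented case the elite university's composite rises and the emerging university's falls, even though no underlying datum changed, which is exactly the kind of methodology-driven movement that press releases then attribute to performance.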

On the surface, these changes appear to be a modest improvement. However, I suspect that one effect will be a spurious boost for the scores and ranks of elite Western and English-speaking universities that can mobilise partners and alumni for the surveys, nurture their global networks, and collect the data required to compete in the rankings.