Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year. The US News global rankings announced last year can be filtered by subject. The methods differ and consequently so do the results. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings (Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents) but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.
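To see how the weighting mechanics play out, here is a minimal sketch in Python. Only the Teaching and Industry figures quoted above come from THE; the pillar scores and the remaining weights are invented placeholders that simply sum to 100%.

```python
# Minimal sketch: the same pillar scores give different overall scores
# under different subject weightings. Scores and most weights are invented.

pillars = ["Teaching", "Research Environment", "Research Quality",
           "International Outlook", "Industry"]

scores = {"Teaching": 70, "Research Environment": 80, "Research Quality": 90,
          "International Outlook": 60, "Industry": 50}   # invented, 0-100 scale

weights = {
    # Teaching 28% and Industry 8% are from the text; the rest are placeholders
    "Engineering": {"Teaching": 0.28, "Research Environment": 0.28,
                    "Research Quality": 0.28, "International Outlook": 0.08,
                    "Industry": 0.08},
    # Teaching 37.5% and Industry 3% are from the text; the rest are placeholders
    "Arts and Humanities": {"Teaching": 0.375, "Research Environment": 0.30,
                            "Research Quality": 0.25, "International Outlook": 0.045,
                            "Industry": 0.03},
}

for subject, w in weights.items():
    overall = sum(scores[p] * w[p] for p in pillars)
    print(f"{subject}: {overall:.1f}")   # same data, different overall score
```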

These rankings continue to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each, and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings, China is now dominant in Engineering subjects. Chinese universities lead in fifteen subjects, although Harvard, MIT, and Northwestern University lead in seven. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and the University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems are:

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.

 

 

 

 

 

Friday, November 24, 2023

Observations on the THE Arab University Rankings

Times Higher Education (THE) has just announced the third edition of its Arab University Rankings. There has been a churning of universities, with many falling and many rising. Once again, this volatility seems largely the result of methodology changes and only in part of any genuine decline or progress.

The rankings are led by King Abdullah University of Science and Technology (KAUST) in Saudi Arabia, which makes sense from the point of view of high impact research, although it does no undergraduate teaching. After that we have Khalifa University, UAE, Qatar University, King Fahd University of Petroleum and Minerals, Saudi Arabia, and the University of Sharjah, UAE.

THE has introduced a raft of changes in its World University Rankings, including adding patents as a metric, tweaking the internationalisation pillar to help larger countries, and including three new measures of citations. 

They have added more changes to the Arab University Rankings. The weighting given to the teaching and research surveys has been trimmed. Field Normalised Citation Impact has been removed altogether, leaving the three new metrics for research impact: Research Strength, Research Excellence, and Research Influence. Within the International Outlook pillar there is now a 2% weighting for inter-Arab collaboration. The Society pillar, unlike its counterpart in the world rankings, does not include patents, and it gives a 4% weight to participation and performance in THE's Impact Rankings.

It is always advisable to look at the specific metric ranks for any ranking, especially THE's. For this year's ranking the leaders are: Research Quality: KAUST; International Outlook: Gulf Medical University, UAE; Research Environment: KAUST; Teaching: Beirut Arab University; Society: KAUST.

There are some interesting things about this year's rankings. To start, there is a noticeable improvement in the ranks of universities in the United Arab Emirates. There are now six UAE universities in the top 25 compared with four last year and three in 2021.

Some Emirati universities have done particularly well: Khalifa University in Abu Dhabi has risen from fifth place to second and Abu Dhabi University from 39th to 9th.

The results were announced at this year's THE MENA summit, held at the campus of New York University Abu Dhabi.

That meeting also saw a number of awards going to UAE institutions, including Abu Dhabi University for International Strategy of the Year, Gulf Medical University in Ajman for outstanding support for students, New York University Abu Dhabi for Research Project STEM, and the American University in Dubai for Teaching and Learning Strategy.

A few years ago I noticed that THE was holding conferences where it would announce results that appeared to favour the host countries. Thus in February 2015 THE held a MENA summit in Qatar with a "snapshot" single-metric ranking that put Texas A & M Qatar in first place and UAE University 11th. The next MENA meeting was in January 2016 in Al Ain, UAE, where, in a ranking that used the WUR metrics, Texas A & M Qatar disappeared and UAEU rose to fifth place.

Another example: in February 2016, at a conference held at the Hong Kong University of Science and Technology, THE introduced a new methodology for its Asian rankings that dethroned the University of Tokyo as the top Asian university and placed it below universities in Hong Kong, Singapore, and Mainland China.

In contrast, the number of Egyptian universities in the top 25 has fallen from six to two: Mansoura University and the American University in Cairo. Last year's front runner, King Abdulaziz University, Saudi Arabia, has fallen to 15th place.

So the holding of a summit in Abu Dhabi and a new methodology coincided with a significant improvement for UAE in general and a very significant improvement for two Abu Dhabi universities. Plus NYU Abu Dhabi, currently unranked, received an award. Perhaps this is just a coincidence or perhaps such a turnover in a single year reflects  real changes, which the new methodology accurately detects. But cynics may wonder a little.

There has been a lot of discussion recently about conflict of interest in the ranking business. It is likely that questions will be asked about a new methodology so conveniently helping institutions in the summit host country.











Sunday, November 19, 2023

How Dare They? HE Sector Reacts to the King's Speech

 Occasionally there are moments when a few casual words suddenly illuminate things that have been obscured. 

One such moment was when the Vice-Chancellor of the University of Oxford expressed her embarrassment that one of the university's alumni had failed to show appropriate deference towards experts. It would seem that a major function of higher education is not the encouragement of critical thought but the acceptance of anything that has been approved by academic experts. This was especially ironic since the claim was made at a summit held by Times Higher Education, whose expertise in the field of university ranking is somewhat questionable.

The recent King's Speech contained a bland announcement that the government would "ensure young people have the knowledge and skills to succeed" by combining technical and academic qualifications.  Also, the government will attempt to reduce enrollment in "poor quality university degrees and increase the number undertaking high quality apprenticeships." 

It is not unlikely that any such initiatives will fail to get off the ground or will crash soon after take-off, and even if implemented they would probably not be very effective, if indeed effective at all.

Research Professional News, however, reports that industry insiders are incensed that the government has dared to say anything that could be considered critical of British universities. Diana Beech, Chief Executive of the London Higher Group of Institutions said "it is beyond belief that the UK government would even contemplate asking His Majesty the King to speak negatively of the national asset that is our world-leading higher education and research sector."

The King is not speaking negatively of the entire sector. He is talking about proposed efforts to improve the sector. And surely the "brightest and the best" of the world are more likely to come to Britain if they think there are efforts to bring about positive change.

It seems that the academic establishment wants everybody to pretend that there is nothing wrong with British universities. That, in the long run, is not going to do anyone any good.


Saturday, October 21, 2023

Crisis, conflict and global rankings

Just published in the Journal of Adult Learning, Knowledge and Innovation

Crisis, conflict and global rankings

Read here


Abstract

Global university rankings have always been associated with international political and economic conflicts. Even before the COVID-19 pandemic there were signs that scientific and academic globalism was breaking down. The pandemic, the various measures taken to combat it, and military and ideological conflicts have led to the breakdown of international academic cooperation, the formation of very different research complexes, and the development of new regional ranking systems.

Friday, September 01, 2023

Two Decades of Rankings: Rising and Falling in ARWU

Most rankings are of little value for identifying trends over more than a couple of years. Changes in methodology and, sometimes, a lack of access to old editions make year-on-year comparisons difficult or impossible. The Shanghai Rankings, aka ARWU, have maintained a generally consistent methodology over two decades and publish data going back to the founding year of 2003.

So it is possible to use ARWU to look for patterns in the world's research and higher education landscape. Here are some "winners" and "losers", based on the number of universities in the ARWU top 500 in 2004, when Shanghai changed the initial methodology to include the social sciences, and in 2023. This is far from a perfect measure; for a start, this ranking takes no account of the humanities and relies too much on old Nobel and Fields laureates. Even so, it does give us some idea of the shift in the academic world's centre of gravity.

Rising

Australia from 14 in 2004 to 24 in 2023 (and from 2 in the top 100 to 7)

Brazil from 4 to 5

China from 16 to 98 (and from zero in the top 100 to 11)

Malaysia from zero to one

New Zealand from 3 to 4

Saudi Arabia from zero to 6

Singapore unchanged at 2 in the top 500 in 2004 and 2023 (but rising from zero in the top 100 in 2004 to 2 in 2023)

South Korea from 8 to 11

Falling

Canada from 23 to 18 

France from 22 to 18

Germany from 43 to 31

India from 3 to 1

Israel from 7 to 5 (but rising from 1 to 3 in the top 100)

Italy from 24 to 16

Japan from 36 to 12 (and from 5 in the top 100 to 2)

Switzerland from 8 to 7

United Kingdom from 42 to 38 (and from 11 in the top 100 to 8)

United States from 170 to 120 (and from 51 in the top 100 to 38)


The last two decades have seen a massive increase in the research capabilities of universities in Australia, China, South Korea, and Singapore. The rest of Asia, including Japan and India, has stagnated or even fallen relatively and perhaps absolutely.

The biggest losers are the USA, UK, and Germany although Canada, France, Italy and Switzerland have also not done so well.

More recently, Saudi Arabia has noticeably improved and may soon be followed by other Middle Eastern states.




                                







Wednesday, July 05, 2023

The New QS Methodology: Academic Snakes and Ladders

The ranking season is under way. The latest edition of the QS world rankings  has just been announced and we have already seen the publication of the latest tables from Leiden Ranking, CWUR, RUR, uniRank and the THE Impact Rankings plus the THE Asian and Young Universities and  Sub-Saharan Africa rankings. Forgive me if I've missed anything.

Each of these tells us something about the current geopolitics of higher education and science and the way in which they are reflected in the global ranking business. 

The QS rankings have a new methodology, which makes them quite different from previous editions. Nonetheless, the media have been full of universities celebrating their remarkable achievements as they have soared, surged, risen, ascended in the rankings. No doubt, there will be a few promotions and bonuses.

It is in the nature of rankings that places are finite, and if some universities suddenly surge then others will fall. It seems that the general pattern of the QS rankings is that Canadian, Australian, and American universities are rising and Chinese, Korean, and Indian universities are falling. Russian and Ukrainian universities are also falling, although that might be for other reasons.

QS have reduced the weighting of their academic survey from 40% to 30% and faculty student ratio from 20% to 10%. The weighting for the employer survey has increased from 10% to 15% and there are three new indicators, Sustainability, Employment Outcomes, and International Research Network.

QS claim that the new methodology "reflects the collective intelligence of the sector, and the changing priorities of students." If that is so, then the collective intelligence is very localised. The new methodology puts a heavy fist on the scales in favour of Western universities and against Asia.  

The revised methodology works against universities that have acquired a good reputation for research or recruited a large and permanent faculty. It favours those that have mobilised their alumni network for the employer survey and are enthusiastic participants in the sustainability movement. 

As a result the leading Chinese institutions have taken a tumble. Peking has fallen 5 places to 17th, Tsinghua 11 to 25th, and Fudan 16 to 50th. 

Other national and regional flagships have tumbled: Seoul National University from 29th to 41st, the Indian Institute of Science from 115th to 225th, the University of Tokyo from 23rd to 28th, and the University of Hong Kong from 21st to 26th.

In contrast, the University of British Columbia, McGill University, the University of  Toronto, the University of Melbourne, the University of Sydney, the University of Cape Town, Witwatersrand University, Trinity College Dublin, University College Dublin, University of California Berkeley and UCLA are climbing the ladders. For the moment at any rate.



 










Wednesday, May 17, 2023

World Economic Forum Declares that Africa is Rising

The World Economic Forum (WEF),  an organization of the global economic elite, has published a report by Phil Baty, currently Chief Global Affairs Officer of Times Higher Education (THE), that proclaims that African universities are "surging in the world rankings" and that this is a highly positive development. This is an irresponsible claim that has scant relationship to reality. 

The report refers to a claim in 2012 by Max Price, then Vice-Chancellor of the University of Cape Town, that African universities needed to rise to the challenge of global university rankings. According to Baty, African universities are now successfully competing in the rankings game and rising to the top.

And just what is the evidence for this? Well, in 2012 there were four African universities in the THE World University Rankings. In 2022 there were 97. An impressive and remarkable achievement indeed, if we are talking about the same rankings. But they are not the same.

In 2012 the THE rankings consisted of 402 universities. By 2022 they had expanded to 2,345 including "reporters", of which 1500+ were formally ranked. It would be truly amazing if any region had failed to improve its representation.

The real comparison, of course, is with the numbers in the top 400 in both years, and here there is no sign of any African surging. In 2012 there were three South African universities in the top 400. The University of the Witwatersrand and Stellenbosch University were in the 251-275 band, and in the 2022-2023 rankings they were in the 251-375 range. The University of Cape Town was 103rd in 2012 and 160th in 2022-2023, which might be cause for concern if this were a ranking with a rigorous and stable methodology, but that is not the case for THE.

The fourth African university in the top 400 in 2012 was the University of Alexandria in the 301-350 band. By 2022 it had dropped to the 801-1000 band. This was a university on a downward spiral from its magical moment of ranking glory in 2010 when it was ranked 147th in the world overall and 4th in the world for citations as a result of a spurious affiliation claim by a serial self-publisher and  self-citer, who was involved in a libel case with Nature.

By 2022-2023 another African university, the University of Cape Coast in Ghana, had entered the top 400. This was not a testimony to any kind of achievement. It was simply the result of the university taking part in the Gates-funded Global Burden of Disease Study (GBDS). Because of a flawed methodology, a university with a small total output and a few papers in the study, which typically have hundreds of authors and citations, can rack up scores of 80, 90, or even 100 for this indicator.
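A back-of-the-envelope sketch shows how this can happen. All the numbers below are invented; the arithmetic is the point: when the denominator is small, a handful of hyper-cited GBDS-style papers drags the average citation impact far above the world norm.

```python
# Invented numbers: why a few hyper-cited papers dominate a
# citations-per-paper indicator when total output is small.

def normalised_impact(citation_counts, world_average=10.0):
    """Mean citations per paper, relative to an assumed world average."""
    return (sum(citation_counts) / len(citation_counts)) / world_average

ordinary = [5] * 50                 # 50 routine papers, ~5 citations each
gbds = [3000, 2500, 2000]           # 3 GBDS-style papers, massively cited

print(normalised_impact(ordinary))          # 0.5  (half the world average)
print(normalised_impact(ordinary + gbds))   # ~14.6 (wildly above it)
```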

There is then no evidence of a surge of any kind, not even a bit of a trickle.

That brings us to the assertion that Nigeria, followed by Egypt, has posted the biggest gains in the citations indicator, which purports to measure research impact or research quality or something, and has therefore achieved excellent progress.

THE is being entirely too modest here. It could have used the indicator to celebrate the extraordinary accomplishment of a range of African institutions and countries that have surged in the rankings with 90+ scores for citations, an incredible achievement that contrasts with very low scores for Research, which includes publications, expenditure, and reputation. In fact, if this indicator is taken seriously, a number of African universities have now outpaced reputable research universities in North America and China.

These research influencers, according to THE, include Jimma University, Ethiopia, Damietta University, Egypt, Muhimbili University of Health and Allied Sciences, Tanzania, Aswan University, Egypt, University of Lagos, Nigeria, the University of Zambia, and Kafrelsheikh University, Egypt.

Again, this has nothing to do with excellence or teamwork or transformative practices or any other current managerial shibboleth. It is largely the result of contributing a single researcher or a few to the hundreds of "authors" of GBDS papers in The Lancet and other prestigious journals and collecting credit for thousands of citations.

The cruellest aspect of this is that THE have announced that they are finally getting around to a partial revamping of the world rankings methodology this year. If THE do go ahead it is very likely -- not certain, because the whole process is so complex and opaque -- that these universities will go tumbling down the rankings and we shall probably see leaders across the continent under fire for their gross incompetence.

It is strange that an organisation that supposedly represents the best minds of the corporate world should adopt THE as the sole arbiter of African excellence. It is not the only global ranking and in fact it is probably the worst for Africa. It emphasises income, assessed by three separate indicators; self-submitted data, which diverts the unacknowledged labour of talented and motivated faculty; and reputation; and it privileges postgraduate programmes.

Perhaps also, at the risk of committing heresy in the first degree, the quality of higher education should not be Africa's highest priority. The latest edition of the Progress in International Reading Literacy Study (PIRLS) shows that Morocco, Egypt, and South Africa do very poorly with regard to fourth-grade literacy. For South Africa and Morocco, the situation revealed by the 2019 Trends in International Mathematics and Science Study (TIMSS) was little better, although they did come out ahead of Pakistan and the Philippines. Surely this is as crucial for the future of Africa as the funding of doctoral programmes.









Sunday, April 23, 2023

Article in University World News

 



Go HERE for my recent article in University World News.


Is the switch in rankings’ focus masking the West’s decline?


A recent commentary in The Lancet by Richard Horton presented criticism of international university rankings, including a briefing paper from the United Nations University (UNU) in Kuala Lumpur, Malaysia.

Horton makes some relevant comments on the rankings, although his survey is very limited and incomplete, and he argues that they need to be reformed to hold universities accountable for their social responsibilities. He notes that the UNU report suggests doing away with rankings altogether.

The status of The Lancet is such that this article provides insight into the collective thinking of the Western academic and scientific establishment and it therefore needs some attention.

To start with, getting rid of rankings, as posited in the article, sounds like a good idea, but it is not really feasible. Bureaucrats and faculty in the big brand universities are not suggesting that every university is as good as any other, that their salaries or tuition fees be reduced to the industry average, that research grants be allocated randomly or that their students are no more employable or intelligent than those at other places.

Monday, March 20, 2023

The Frontiers of the Ranking World: UNIRANKS

After predatory journals and predatory conferences, the next logical step would be predatory rankings. 

Recently, internet searches have shown up something called UNIRANKS, with a polished website containing several plausible world and country rankings and announcements about a forthcoming conference, along with a list of photogenic speakers and detailed instructions about registration and payment.

I will not give out the URL since a couple of clicks in I ran into a bright red screen with a warning about phishing.

There appear to be no adequate methodological details, no advisory committee, no criteria for inclusion, or any of the other things provided by even the most technically careless rankings. Even the name is a near copy of UniRank, a reputable if limited search engine and ranking.

So be warned. If I receive any indication that this is a proper ranking and conference I will of course post it.

Saturday, March 18, 2023

SCImago Innovation Rankings: The East-West Gap Gets Wider

The decline of western academic research becomes more apparent every time a ranking with a stable and moderately accurate methodology is published. This will not be obvious if one just looks at the top ten, or even the top fifty, of the better known rankings. Harvard, Stanford, and MIT are usually still there at the top and Oxford and Cambridge are cruising along in the top twenty or the top thirty.

But take away the metrics that measure inherited intellectual capital such as the Nobel and Fields laureates in the Shanghai rankings or the reputation surveys in the QS, THE, and US world rankings, and the dominance of the West appears ever more precarious. This is confirmed if we turn from overall rankings to subject and field tables.

Take a look at the most recent edition of the CWTS Leiden Ranking, which is highly reputed among researchers although much less so among the media. For sheer number of publications overall, Harvard still holds the lead although Zhejiang, Shanghai Jiao Tong and Tsinghua are closing in and there are more Chinese schools in the top 30.  Chinese dominance is reduced if we move to the top 10% of journals but it may be just a matter of time before China takes the lead there as well. 

But click to physical sciences and engineering. The top 19 places are held by Mainland Chinese universities with the University of Tokyo coming in at 20.  MIT is there at 33, Texas A & M at 55 and Purdue 62. Again the Chinese presence is diluted, probably just for the moment, if we switch to the top 10% or 1% of journals.  

Turning to developments in applied research, the shift to China and away from the West appears even greater.

The SCImago Institutions rankings are rather distinctive. In addition to the standard measures of research activity, there are also metrics for innovation and societal impact. Also, they include the performance of government agencies, hospitals, research centres and companies.

The innovation rankings combine three measures of patent activity. Patents are problematic for comparing universities but they can establish broad long-term trends. 

Here are the top 10 for Innovation in 2009:

1.   Centre National de la Recherche Scientifique

2.   Harvard University 

3.   National Institutes of Health, USA

4.   Stanford University 

5.   Massachusetts Institute of Technology

6.   Institut National de la Santé et de la Recherche Médicale

7.   Johns Hopkins University 

8.   University of California Los Angeles

9.   Howard Hughes Medical Institute 

10.  University of Tokyo.

And here they are for 2023:

1.   Chinese Academy of Sciences 

2.   State Grid Corporation of China  

3.   Ministry of Education PRC

4.   DeepMind

5.   Ionis Pharmaceuticals

6.   Google Inc, USA

7.   Alphabet Inc 

8.  Tsinghua University

9.   Huawei Technologies Co Ltd

10.  Google International LLC.

What happened to the high flying universities of 2009?  Harvard is in 57th place, MIT in 60th, Stanford 127th, Johns Hopkins 365th, and Tokyo in 485th. 

It seems that the torch of innovation has left the hands of American, European, and Japanese universities and research centres and has been passed to multinational, Chinese, and American companies and research bodies, plus a few Chinese universities. I am not sure where the loyalties of the multinational institutions lie, if indeed they have any at all.




Saturday, March 04, 2023

US News and the Law Schools

There has always been a tension between the claim by commercial rankers that they provide insights and data for students and other stakeholders and the need to keep on the good side of those institutions that can provide them with status, credibility, and perhaps even lucrative consultancies.

A recent example is Yale, Harvard, Berkeley and other leading law schools declaring that they will "boycott", "leave", "shun", or "withdraw from" the US News (USN) law school rankings. USN has announced that it will make some concessions to the schools although it seems that, for some of them at least, this will not be enough. It is possible that this revolt of the elites will spread to other US institutions and other rankings. Already Harvard Medical School has declared that it will follow suit and boycott the medical school rankings.

At first sight, it would seem that the law schools are performing an act of extraordinary generosity or self-denial. Yale has held first place since the rankings began, and the others who have joined the supposed boycott seem to be mainly from the upper levels of the ranking, while those who have not seem to be mostly from the middle and lower levels. To "abandon" a ranking that has served the law school elite so well for many years is a bit odd, to say the least.

But Yale and the others are not actually leaving or withdrawing from the rankings. That is something they cannot do. The data used by US News is mostly from public sources, or, if it is supplied by the schools, it can be replaced by public data. The point of the exercise seems to be to persuade US News to revise their methodology so that it conforms to the direction in which Yale Law School and other institutions want to go.

We can be sure that the schools have a good idea how they will fare if the current methodology continues and what is likely to happen if there are changes. It is now standard practice in the business to model how institutions will be affected by possible tweaks in ranking methodology.

So what was Yale demanding? It wanted  fellowships to be given the same weighting as graduate legal jobs. This would appear reasonable on the surface but it seems that the fellowships will be under the control of Yale and therefore this would add a metric dependent on the ability to fund such fellowships. Yale also wanted debt-forgiveness programmes to be counted in the rankings. Again this is something dependent on the schools having enough money to spare.

For a long time the top law schools have been in the business of supplying bright and conscientious young graduates. Employers have been happy to pay substantial salaries to the graduates of the famous "top 14" schools since they appear more intelligent and productive than those from run of the mill institutions.

The top law schools have been able to do this by rigorous selection procedures including standardised tests and college grades. Basically, they have selected for intelligence and conscientiousness and perhaps for a certain amount of agreeability and conformity. There is some deception here, perhaps including self-deception. Yale and the rest of the elite claim that they are doing something remarkable by producing outstanding legal talent but in fact they are just recruiting the new students with the greatest potential, or at least they were until quite recently.

If schools cannot select for such attributes then they will have problems convincing future employers that their graduates do in fact possess them. If that happens then the law school graduate premium will erode and if that happens future lawyers will be reluctant to go into never ending  debt to enter a career that is increasingly precarious and unrewarding.

The law schools, along with American universities in general, are also voicing their discontent with reliance on standardised tests for admission and their inclusion as ranking indicators. The rationale for this is that the tests supposedly discourage universities from offering aid according to need and favour those who can afford expensive test-prep courses.

Sometimes this is expanded into the argument that since there is a relationship between test scores and wealth then that is the only thing that tests measure and so they cannot measure anything else that might be related to academic ability. 

The problem here is that standardised tests do have a substantial relationship with intelligence, although not as much as they used to, which in turn has a strong relationship with academic and career success. Dropping the tests means that schools will have to rely on high school and college grades, which have been increasingly diluted over the last few decades, or on recommendations, interviews, and personal essays, which have little or no predictive validity and can be easily prepped or gamed.

It appears that American academia is retreating from its mission of producing highly intelligent and productive graduates and has embraced the goal of socialisation into the currently dominant ideology. Students will be admitted and graduated and faculty recruited according to their doctrinal conformity and their declared identity.

USN has gone some way to meeting the demands of the schools but that will probably not be enough. Already there are calls to have a completely new ranking system or to do away with rankings altogether.


 




Saturday, February 25, 2023

Global Trends in Innovation: Evidence from SCImago

We are now approaching the third decade of global university rankings. They have had a mixed impact. The original Shanghai rankings published in 2003 were a salutary shock for universities in continental Europe and contributed to a wave of restructuring and excellence initiatives. On the other hand, rankings with unstable and unreliable methodologies are of little use to anyone except for the public relations departments of wealthy Western universities. 

In contrast, the SCImago Institutions Rankings, published by a Spanish research organisation, with thousands of universities, hospitals, research institutes, companies and other organisations, can be quite informative, especially the Innovation and Societal Rankings.

The Innovation Rankings, which are based on three measures of patent citations and applications, included 4,019 organisations of various kinds in 2009. The top spot was held by the Centre National de la Recherche Scientifique in France, followed by Harvard, the National Institutes of Health in the USA, Stanford, and MIT.

Altogether the top 20 in 2009 consisted of  ten universities, nine American plus the University of Tokyo, and ten non-university organisations, three American, two German, two French, two multinational, and the Chinese Academy of Sciences (CAS) in 14th place. 

Fast forward to 2022 and we now have 8,084 institutions. First place now goes to CAS, followed by the State Grid Corporation of China, Deep Mind Technologies, a British AI firm, the Chinese Ministry of Education, and Samsung Corp.

Now the top twenty includes exactly two universities: Tsinghua in 14th place and Harvard in 20th. The rest are companies, health organisations, and government agencies. The nationality assigned by SCImago for these eighteen is: Multinational, eight; USA, six; China, four; and the UK and South Korea, one each.

What about those high flying US universities of 2009? Stanford has fallen from 4th place to 67th, the University of Michigan from 13th to 249th, the University of Washington from 16th to 234th.

The relative -- and probably now absolute as well -- decline of American academic research has been well documented. It seems that the situation is even more dire for the innovative capability of US universities. But the technological torch is passing not only to Chinese universities and research centres but also to US and Multinational corporations.



Saturday, February 04, 2023

Aggregate Ranking from BlueSky Thinking

 

In recent years there have been attempts to construct rankings that combine several global rankings. The University of New South Wales has produced an aggregate ranking based on the “Big Three” rankings, the Shanghai ARWU, Times Higher Education (THE), and QS. AppliedHE of Singapore has produced a ranking that combines these three plus Leiden Ranking and Webometrics.

The latest aggregate ranking is from BlueSky Thinking, a website devoted to research and insights in higher education. This aggregates the big three rankings plus the Best Global Universities published by US News.

There are some noticeable differences between the rankings. The University of California Berkeley is fourth in the US News rankings but 27th in QS. The National University of Singapore is 11th in QS but 71st in ARWU.

The top of the aggregate ranking is unremarkable. Harvard leads followed by Stanford, MIT, Cambridge, and Oxford.

There have been some significant changes over the last five years, with universities in Mainland China, Hong Kong, France, Germany, Saudi Arabia, and Australia recording significant improvements, while a number of US institutions, including Ohio State University, Boston University and the University of Minnesota, have fallen.

 

Saturday, January 28, 2023

Another Sustainability Ranking

 

People and Planet is a British student network concerned with environmental and social justice. It has just published a league table that claims to measure the environmental and ethical performance of UK universities.

The top five universities are Cardiff Metropolitan, Bedfordshire, Manchester Metropolitan, Reading, and University of the Arts London. It is interesting that this league table shows almost no overlap with the other rankings that claim to assess commitment or contributions to sustainability.

Here, in order, are all six British universities in the latest UI GreenMetric ranking: Nottingham, Nottingham Trent, Warwick, Glasgow Caledonian, Loughborough, Teesside.

The top five British universities in the THE Impact Rankings are Newcastle, Manchester, Glasgow, Leicester, King's College London. For the QS Sustainability rankings we have: Edinburgh, Glasgow, Oxford, Newcastle, Cambridge.

There is some similarity between the QS Sustainability and the THE Impact Rankings, because both give prominence to research on environmental topics. But even this is quite modest compared to the much greater overlap between conventional research based rankings such as Leiden, Shanghai or URAP (Middle East Technical University).

This surely raises serious questions about the trend to rankings based on sustainability. If the rankers produce league tables that show such a modest degree of correlation then we have to ask whether there is any point at all to the exercise.
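One way to put a number on that overlap, sketched below with invented rank lists, is the Spearman rank correlation across the universities that two tables both rank; a coefficient near zero would mean the tables barely agree on what sustainability performance is.

```python
# Minimal sketch with invented data: Spearman rank correlation between
# two sustainability league tables, over the universities they share.

from scipy.stats import spearmanr

table_a = {"U1": 1, "U2": 2, "U3": 3, "U4": 4, "U5": 5, "U6": 6}
table_b = {"U1": 5, "U2": 1, "U3": 6, "U4": 2, "U5": 4, "U6": 3}

shared = sorted(set(table_a) & set(table_b))
rho, _ = spearmanr([table_a[u] for u in shared],
                   [table_b[u] for u in shared])
print(f"Spearman rho = {rho:.2f}")   # near 0: little agreement
```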



 



Friday, January 20, 2023

Implications of the new QS methodology

QS have announced that the world rankings due to appear in 2023 will have a new methodology. This is likely to produce significant changes in the scores and ranks of some universities even if there are no significant changes in the underlying data. 

There will no doubt be headlines galore about how dynamic leadership and teamwork have transformed institutions or how those miserable Scrooges in government have been crushing higher education by withholding needed funds.

The first change is that the weighting of the academic survey will be reduced from 40% to 30%. This is quite sensible: 40% is far too high for any one indicator. It remains, however, the largest single indicator and it remains one that tends to favour the old elite or those universities that can afford expensive marketing consultants, at the expense of emerging institutions. The employer survey weight will go up from 10% to 15%.

Next, the weighting of faculty student ratio has been cut from 20% to 10%. Again this is not a bad idea. This metric is quite easy to manipulate and has only a modest relationship to teaching quality, for which it is sometimes supposed to be a proxy.

What has not changed is the citations per faculty indicator. This is unfortunate since rankers can get very different results by tweaking the methodology just a bit. It would have been a big improvement if QS had used several different metrics for citations and/or publications, something that Times Higher Education has just got round to doing.

Then there are three new indicators: international research network, graduate employability, and sustainability.

This means that international indicators will now account for a 15% weighting, adding a further bias towards English-speaking universities, or those in small countries adjoining larger neighbours with similar languages and cultures, and working against China and India.

The introduction of a sustainability metric is questionable. It requires a considerable amount of institutional data collecting and this will tend to favour schools with the resources and ambitions to jump through the rankers' hoops.

On the surface, it seems that these changes will be a modest  improvement. However, I suspect that one effect of the changes will be a spurious boost for the scores and ranks of the elite Western and English-speaking  universities who can mobilise partners and alumni for the surveys, nurture their global networks, and collect the data required to compete in the rankings.


Monday, November 07, 2022

The QS Sustainability Rankings

 


Do we need to measure social and environmental impact?

Sunday, October 23, 2022

Australia and the THE World Rankings

 

The Latest Rankings

The latest Times Higher Education (THE) world rankings have just been announced at a summit in New York. Around the world political leaders, mass media, and academics have been proclaiming their delight about their universities rising in the rankings. Australian universities are especially fascinated by them, sometimes to the point of unhealthy obsession.

Study Australia reports that "Australia shines again." Insider Guides finds it "particularly exciting" that six Australian universities in the top 200 have climbed the charts. Monash University is celebrating how it has "skyrocketed" 13 places, further proof of its world-leading status.

It is unfortunate that Australian media and administrators are so concerned with these rankings. They are not the only global rankings and certainly not the most reliable, although they are apparently approved by universities in the traditional elite or their imitators. They are not totally without value, but they do need a lot of deconstruction to get to any sort of meaningful insight.

Transparency

One problem with the THE rankings, to be painfully repetitive, is that they are far from transparent. Three of their five current “pillars” consist of more than one indicator so we cannot be sure exactly what is contributing to a rise or fall. If, for example, a university suddenly improves for THE’s teaching pillar that might be because its income has increased, or the number of faculty has increased, or the number of students has decreased, or it has awarded more doctorates or fewer bachelor’s degrees, or it has got more votes in THE’s reputation survey, or a combination of two or more of these.

THE's citations indicator, which purportedly measures research impact or research quality, stands alone but it is also extremely opaque. To calculate a university’s score for citations you have to work out the number of citations in 8,000 “boxes” (300 plus fields multiplied by five years of publication multiplied by five types of documents) and compare them to the world average. Add them up and then apply the country bonus, the square root of the national impact score, to half of the university’s score. Then calculate Z scores. For practical purposes this indicator is a black box into which masses of data disappear, are chopped up, shuffled around, processed, reconstituted and then turned into numbers and ranks that are, to say the least, somewhat counter-intuitive.
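For readers who want to see the shape of the calculation, here is a minimal sketch in Python of the procedure as described above. All the numbers are invented, and the real implementation has thousands of cells and a final Z-score step; the point is only to make the moving parts visible.

```python
# Minimal sketch of the citation normalisation described above; all
# numbers invented. Each "box" is a (field, year, document type) cell.

import math

# (citations, world-average citations for that paper's box)
papers = [(12, 10.0), (3, 6.0), (40, 8.0), (0, 4.0)]

# Compare each paper with its box's world average, then average the ratios
raw_impact = sum(cites / world_avg for cites, world_avg in papers) / len(papers)

# Country bonus: half the score is divided by the square root of the
# national impact score, boosting universities in low-impact countries
national_impact = 0.64                      # invented national impact score
adjusted = 0.5 * raw_impact + 0.5 * raw_impact / math.sqrt(national_impact)

print(round(raw_impact, 2), round(adjusted, 2))   # Z-scores would follow
```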

This indicator, which accounts for a 30% weighting, has produced some remarkable results over the last decade, with a succession of improbable institutions soaring into the upper reaches of this metric. This year’s tables are no exception. The world leader is Arak University of Medical Sciences, Iran, followed by Cankaya University, Turkey, Duy Tan University, Vietnam, Golestan University of Medical Sciences, Iran, and Jimma University, Ethiopia. Another two Iranian medical universities are in the top 25. They may not last long. Over the last few years quite a lot of universities have appeared briefly at the top and then in a few years slumped to a much lower position.

One of the more interesting things about the current success of the THE rankings is the apparent suspension of critical thought among the superlatively credentialed and accredited leaders of the academic world. One wonders how those professors, rectors and deans who gather at the various summits, seminars, webinars, and masterclasses would react to a graduate student who wrote a research paper that claimed that Arak University of Medical Sciences leads the world for “research quality”, Istanbul Technical University for “knowledge transfer”, or Macau University of Science and Technology for “international outlook”.

Volatility

Not only do the rankings lack transparency they are also extremely volatile. The top fifty list, or even the top one hundred, is reasonably stable but after that THE has seen some quite remarkable and puzzling ascents and descents. There have been methodological changes and there is a big one coming next year but that alone does not explain why there should be such dramatic changes. One cause of instability in the rankings is the citations indicator which is constructed so that one or a few researchers, often those working on the Gates-funded Global Burden of Disease Study (GBDS), can have a massively disproportionate impact.

Another possible cause of volatility is that the number of ranked institutions is not fixed. If the rankings expand, new universities will usually be at the lower end of the scale. The effect is that the mean score for each indicator is lowered, and this will affect the final score for every institution, since the standardised scores that appear in the published tables are based on means and deviations.
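A toy example, with invented scores, makes the mechanism concrete: because the published scores are standardised against the mean and standard deviation of the whole ranked population, merely adding low-scoring newcomers changes every institution's standardised score even when no raw score has moved.

```python
# Invented scores: adding low-scoring newcomers shifts everyone's
# standardised (Z) score because the mean and deviation both change.

import statistics

def z_scores(raw):
    mu, sigma = statistics.mean(raw), statistics.pstdev(raw)
    return [(x - mu) / sigma for x in raw]

incumbents = [80, 70, 60, 50, 40]
expanded = incumbents + [20, 15, 10]       # newly ranked institutions

print(round(z_scores(incumbents)[0], 2))   # leader's Z before expansion: 1.41
print(round(z_scores(expanded)[0], 2))     # same raw 80, Z now 1.50
```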

There may be other reasons for the volatility of this year’s rankings. Fluctuating exchange rates may have affected reported income data, international students’ numbers may have fallen or even recovered. Some universities might have performed better in the surveys of teaching or research.

 

Australian universities rising and falling

Some Australian universities appear to have been quite mobile this year. In some cases, this has a lot to do with the citation indicator. Two years ago, Bond University was ranked in the 501 – 600 band and 26th in Australia. Now it is tenth in Australia and in the world top 300, driven largely by a remarkable rise in the citations score from 56.4 to 99.7. A lot of that seems to have come from a small number of publications relating to the 2020 PRISMA statement which amassed a massive number of citations in 2021 and 2022.

Another example is Australian Catholic University. In 2018 it was in the world 501-600 band and this year it is in band 251-300. This is mainly due to an improvement in its citations score from 59.5 to 98.5, the result of a series of articles between 2017 and 2020 related to the multi-author and massively cited GBDS.

The problem with relying on citations to get ahead in the THE rankings is that if the researchers who have been racking up the citations move on or retire the scores will eventually decline as their papers pass outside the period for counting publications. This might have happened with the University of Canberra which has benefitted from GBDS papers published between 2015 and 2018. This year, however, the 2015 and 2016 papers no longer count, and the result is that Canberra’s citation score has fallen from 98.6 to 92.6 and its world rank from 170th to 250-300. A university might even start falling just because its peers have started racking up scores of 99 plus for citations.

This is similar to the trajectory of quite a few international universities that have risen and fallen in the wake of a few highly cited papers such as Babol Noshirvani University of Technology, Iran, the Indian Institute of Technology Ropar, the University of Occupational and Environmental Health, Japan, Durban University of Technology, South Africa, and Nova Southeastern University, USA.

Citations have a lot to do with Australia's success in the THE rankings. All the Australian universities in the world rankings have a higher score for citations than for research, which is measured by publications, reputation, and research income, and six have citation scores in the 90s. Compare that with Japan, where the highest citation score is 82.8, and leading universities do better for research than for citations. If THE had taken some of the weight from citations and given it to research, Australian universities might be in a different position.

Are the THE rankings any use?

Over the long term the THE rankings might have some value in charting the general health of an institution or a national system. Should a university fall steadily across several indicators despite changes in methodology and despite proclaimed excellence initiatives, then that might be a sign of systemic decline.

The success of Australian universities in the THE rankings might represent genuine progress but it is necessary to identify exactly why they are rising and how sustainable that progress is.

The rankings certainly should not be used to punish or reward researchers and teachers for "success" or "failure" in the rankings, to allocate funds, or to attract talented faculty or wealthy students.

Other rankings

The THE rankings are not the only game in town or in the world. In fact, for most purposes there are several rankings that are no worse and probably a lot better than THE. It would be a good idea for Australian universities, students, and stakeholders to shop around a bit.

For a serious analysis of research quantity and quality there are straightforward rankings of research conducted by universities or research centres, such as the Shanghai Ranking, the CWTS Leiden Ranking, University Ranking by Academic Performance, or the National Taiwan University rankings. They can be a bit boring since they do not change very much from year to year, but they are at least reasonably competent technically and they rely on data that is fairly objective and transparent.

For prospective graduate and professional students, the research-based rankings might be helpful since the quality of research is likely to have an effect, even if an unpredictable one, on the quality of postgraduate and professional instruction.

For undergraduate students there is not really much that is directly relevant to their needs. The QS employability rankings, the employer opinion survey in the QS world rankings, the Emerging/Trendence employability rankings, and the student quality section in the Center for World University Rankings tables, now based in the Emirates, can all provide some useful insights.

Next year?

It seems that THE has finally steeled itself to introduce a set of changes. The precise effect is unclear except that the world rankings look to be getting even more complex and even more burdensome for the underpaid drones toiling away to collect, process and transmit the data THE requires of its “customers”. It is not clear exactly how this will affect Australian universities.

No doubt Australian deans and rectors will be wondering what lies ahead of them in the 2024 rankings coming next year. But not to worry. THE is offering “bespoke” shadow rankings that will tell them how they would have done if the new methodology had been applied this year. 

 

 

 

 

 

Sunday, August 21, 2022

California in the Shanghai Rankings

Global rankings are often misleading and uninformative, especially those that have eccentric methodologies or are subject to systematic gaming. But if their indicators are objective and reliable over several years, they can tell us something about shifts in the international distribution of research excellence.

I would like to look at 20 years of the Shanghai Rankings from the first edition in 2003 to the most recent, published this week. The first thing that anyone notices is of course the remarkable rise of China -- not Asia in general -- and the relative decline of the USA. These rankings can also be used to find regional trends within nations. Take a look at California universities. In 2003 California was the research star of the US with six universities in the world top twenty. Two decades later that number has fallen to five with the University of California (UC) San Diego falling from 14th to 21st place.

That is symptomatic of a broader trend. UC Santa Barbara has fallen from 25th to 57th, the University of Southern California from 40th to 55th, and UC Riverside from 88th to the 201-300 band. 

American universities in nearly all the states have been falling and have, for the most part, been replaced by Chinese institutions. But even within the USA, California has been drifting downwards. Caltech has gone from 3rd to 7th, UC San Francisco, a medical school, from 11th to 15th, and UC Davis from 27th to the 40-54 band.

This is not universal. Stanford is still second in the USA in 2022 while UC Los Angeles (UCLA) has risen from 13th to 11th.

But overall California is falling. Of the thirteen universities in the top 500 in 2003, nine had fallen in the US table by 2022; two, UC Santa Cruz and UCLA, rose; and two remained in the same rank. The decline is especially apparent in the Publications metric, which is based on recent articles in the Web of Science.

Recent events in California, including learning loss during the pandemic, the abandonment of standardised testing, and the imposition of political loyalty tests for faculty, suggest that the decline is not going to be halted or reversed any time soon.

 





Tuesday, July 19, 2022

What's the Matter with Harvard?

When the first global ranking was published by Shanghai Jiao Tong University back in 2003, the top place was taken by Harvard. It was the same for the rankings that followed in 2004, Webometrics and the THES-QS World University Rankings. Indeed, at that time any international ranking that did not put Harvard at the top would have been regarded as faulty.

Is Harvard Declining?

But since then Harvard has been dethroned by a few rankings. Now MIT leads in the QS world rankings, Oxford is first in THE's, and the Chinese Academy of Sciences leads the Nature Index. Recently Caltech deposed Harvard at the top of the Round University Rankings, now published in Georgia.

It is difficult to get excited about Oxford leading Harvard in the THE rankings. A table that purports to show Macau University of Science and Technology as the world's most international university, Asia University Taiwan as the most innovative, and An Najah National University as the best for research impact need not be taken too seriously.

Losing out to MIT in the QS world rankings probably does not mean very much either. Harvard is at a serious disadvantage here for international students and international faculty.

Harvard and Leiden Ranking

On the other hand, the performance of Harvard in CWTS Leiden Ranking, which is generally respected in the global research community,  might tell us that something is going on. Take a look at the total number of publications for the period 2017-20 (using the default settings and parameters). There we can see Harvard at the top with 35,050 publications followed by Zhejiang and Shanghai Jiao Tong Universities.

But it is rather different for publications in the broad subject fields. Harvard is still in the lead for Biomedical Sciences and for Social Sciences and Humanities. For Mathematics and Computer Science, however, the top twenty consists entirely of Mainland Chinese universities. The best non-Mainland institution is Nanyang Technological University in Singapore. Harvard is 128th.

You could ask whether this is just a matter of quantity rather than quality. So let's turn to another Leiden indicator, the percentage of publications in the top 10% of journals for Mathematics and Computer Science. Even here China is in the lead, although somewhat precariously. Changsha University of Science and Technology tops the table and Harvard is in fifth place.

The pattern for Physical Sciences and Engineering is similar. The top 19 for publications are Chinese with the University of Tokyo in 20th place. However, for those in the top 10% Harvard still leads. It seems then that Harvard is still ahead for upmarket publications in physics and engineering but a growing and substantial amount of  research is done by China, a few other parts of Asia, and perhaps some American outposts of scientific excellence such as MIT and Caltech.

The Rise of China

The trend seems clear. China is heading towards industrial and scientific hegemony, and eventually Peking, Tsinghua, Fudan, Zhejiang, and a few others will, if nothing changes, surpass the Ivy League, the Group of Eight, and Oxbridge, although it will take longer for the more expensive and demanding fields of research. Perhaps the opportunity will be lost in the next few years if there is another proletarian cultural revolution in China or if Western universities change course.

What Happened to Harvard's Money?

It is standard to claim that the success or failure of universities is dependent on the amount of money they receive. The latest edition of the annual Nature Index tables was accompanied by headlines proclaiming that China's recent success in high-impact research was the result of a long-term investment programme.

Money surely had a lot to do with it, but there needs to be a bit of caution here. The higher education establishment has a clear vested interest in getting as much money from the public purse as it can and is inclined to claim that any decline in the rankings is a result of hostility to higher education.

Tracing the causes of Harvard's decline, we should consult the latest edition of the Round University Rankings, now based in Georgia, which provides ranks for 20 indicators. In 2021 Harvard was first but this year it was second, replaced by Caltech. So what happened? Looking more closely we see that in 2021 Harvard was 2nd for financial sustainability and in 2022 it was 357th. That suggests a catastrophic financial collapse. So maybe there has been a financial disaster over at Harvard and the media simply have not noticed bankrupt professors jumping out of their offices, Nobel laureates hawking their medals, or mendicant students wandering the streets with tin cups.

Zooming in a bit, it seems that, if the data is accurate, there has been a terrible collapse in Harvard's financial fortunes. For institutional income per academic staff Harvard's rank has gone from 21st to 891st.

Exiting sarcasm mode for a moment, it is of course impossible that there has actually been such a catastrophic fall in income. I suspect that what we have here is something similar to what happened  to Trinity College Dublin  a few years ago when someone forgot the last six zeros when filling out the form for the THE world rankings.

So let me borrow a flick knife from my good friend Occam and propose that what happened to Harvard in the Round University Rankings was simply that somebody left off the zeros at the end of the institutional income number when submitting data to Clarivate Analytics, who do the statistics for RUR. I expect next year the error will be corrected, perhaps without anybody admitting that anything was wrong.

So, there was no substantial reason why Harvard lost ground to Caltech in the Round Rankings this year. Still it does say something that such a mistake could occur and that nobody in the administration noticed or had the honesty to say anything. That is perhaps symptomatic of deeper problems within American academia. We can then expect the relative decline of Harvard and the rise of Chinese universities and a few others in Asia to continue.