Saturday, March 18, 2023

SCImago Innovation Rankings: The East-West Gap Gets Wider

The decline of western academic research becomes more apparent every time a ranking with a stable and moderately accurate methodology is published. This will not be obvious if one just looks at the top ten, or even the top fifty, of the better known rankings. Harvard, Stanford, and MIT are usually still there at the top and Oxford and Cambridge are cruising along in the top twenty or the top thirty.

But take away the metrics that measure inherited intellectual capital such as the Nobel and Fields laureates in the Shanghai rankings or the reputation surveys in the QS, THE, and US world rankings, and the dominance of the West appears ever more precarious. This is confirmed if we turn from overall rankings to subject and field tables.

Take a look at the most recent edition of the CWTS Leiden Ranking, which is highly reputed among researchers although much less so among the media. For sheer number of publications overall, Harvard still holds the lead, although Zhejiang, Shanghai Jiao Tong, and Tsinghua are closing in, and there are more Chinese schools in the top 30. Chinese dominance is reduced if we move to the top 10% of journals, but it may be just a matter of time before China takes the lead there as well.

But click through to physical sciences and engineering. The top 19 places are held by Mainland Chinese universities, with the University of Tokyo coming in at 20. MIT is there at 33, Texas A&M at 55, and Purdue at 62. Again, the Chinese presence is diluted, probably just for the moment, if we switch to the top 10% or 1% of journals.

Turning to developments in applied research, the shift to China and away from the West appears even greater.

The SCImago Institutions rankings are rather distinctive. In addition to the standard measures of research activity, there are also metrics for innovation and societal impact. Also, they include the performance of government agencies, hospitals, research centres and companies.

The innovation rankings combine three measures of patent activity. Patents are problematic for comparing universities but they can establish broad long-term trends. 

Here are the top 10 for Innovation in 2009:

1.   Centre National de la Recherche Scientifique

2.   Harvard University 

3.   National Institutes of Health, USA

4.   Stanford University 

5.   Massachusetts Institute of Technology

6.   Institut National de la Santé et de la Recherche Médicale

7.   Johns Hopkins University 

8.   University of California Los Angeles

9.   Howard Hughes Medical Institute 

10.  University of Tokyo.

And here they are for 2023:

1.   Chinese Academy of Sciences 

2.   State Grid Corporation of China  

3.   Ministry of Education PRC

4.   DeepMind

5.   Ionis Pharmaceuticals

6.   Google Inc, USA

7.   Alphabet Inc 

8.   Tsinghua University

9.   Huawei Technologies Co Ltd

10.  Google International LLC.

What happened to the high flying universities of 2009?  Harvard is in 57th place, MIT in 60th, Stanford 127th, Johns Hopkins 365th, and Tokyo in 485th. 

It seems that the torch of innovation has left the hands of American, European, and Japanese universities and research centres and has been passed to multinational, Chinese, and American companies and research bodies, plus a few Chinese universities. I am not sure where the loyalties of the multinational institutions lie, if indeed they have any at all.




Saturday, March 04, 2023

US News and the Law Schools

There has always been a tension between the claim by commercial rankers that they provide insights and data for students and other stakeholders and the need to keep on the good side of those institutions that can provide them with status, credibility, and perhaps even lucrative consultancies.

A recent example is Yale, Harvard, Berkeley and other leading law schools declaring that they will "boycott", "leave", "shun", or "withdraw from" the US News (USN) law school rankings. USN has announced that it will make some concessions to the schools although it seems that, for some of them at least, this will not be enough. It is possible that this revolt of the elites will spread to other US institutions and other rankings. Already Harvard Medical School has declared that it will follow suit and boycott the medical school rankings.

At first sight, it would seem that the law schools are performing an act of extraordinary generosity or self-denial. Yale has held first place since the rankings began, and the others who have joined the supposed boycott seem to be mainly from the upper levels of the ranking, while those who have not seem to be mostly from the middle and lower. To "abandon" a ranking that has served the law school elite so well for many years is a bit odd, to say the least.

But Yale and the others are not actually leaving or withdrawing from the rankings. That is something they cannot do. The data used by US News is mostly from public sources or if it is supplied by the schools it can be replaced by public data. The point of the exercise seems to be to persuade US News to review their methodology so that it conforms to the direction where Yale law school and other institutions want to go.

We can be sure that the schools have a good idea how they will fare if the current methodology continues and what is likely to happen if there are changes. It is now standard practice in the business to model how institutions will be affected by possible tweaks in ranking methodology.

So what was Yale demanding? It wanted fellowships to be given the same weighting as graduate legal jobs. This would appear reasonable on the surface, but the fellowships will be under the control of Yale, so this would add a metric dependent on the ability to fund such fellowships. Yale also wanted debt-forgiveness programmes to be counted in the rankings. Again, this is something dependent on the schools having enough money to spare.

For a long time the top law schools have been in the business of supplying bright and conscientious young graduates. Employers have been happy to pay substantial salaries to the graduates of the famous "top 14" schools since they appear more intelligent and productive than those from run of the mill institutions.

The top law schools have been able to do this by rigorous selection procedures, including standardised tests and college grades. Basically, they have selected for intelligence and conscientiousness, and perhaps for a certain amount of agreeableness and conformity. There is some deception here, perhaps including self-deception. Yale and the rest of the elite claim that they are doing something remarkable by producing outstanding legal talent, but in fact they are just recruiting the new students with the greatest potential, or at least they did until quite recently.

If schools cannot select for such attributes, they will have problems convincing future employers that their graduates do in fact possess them. The law school graduate premium will then erode, and future lawyers will be reluctant to take on never-ending debt to enter a career that is increasingly precarious and unrewarding.

The law schools, along with American universities in general, are also voicing their discontent with reliance on standardised tests for admission and their inclusion as ranking indicators. The rationale for this is that the tests supposedly discourage universities from offering aid according to need and favour those who can afford expensive test prep courses.

Sometimes this is expanded into the argument that since there is a relationship between test scores and wealth then that is the only thing that tests measure and so they cannot measure anything else that might be related to academic ability. 

The problem here is that standardised tests do have a substantial relationship with intelligence, although not as much as they used to, which in turn has a strong relationship with academic and career success. Dropping the tests means that schools will have to rely on high school and college grades, which have been increasingly diluted over the last few decades, or on recommendations, interviews, and personal essays, which have little or no predictive validity and can easily be prepped or gamed.

It appears that American academia is retreating from its mission of producing highly intelligent and productive graduates and has embraced the goal of socialisation into the currently dominant ideology. Students will be admitted and graduated, and faculty recruited, according to their doctrinal conformity and their declared identity.

USN has gone some way to meeting the demands of the schools but that will probably not be enough. Already there are calls to have a completely new ranking system or to do away with rankings altogether.

Saturday, February 25, 2023

Global Trends in Innovation: Evidence from SCImago

We are now approaching the third decade of global university rankings. They have had a mixed impact. The original Shanghai rankings published in 2003 were a salutary shock for universities in continental Europe and contributed to a wave of restructuring and excellence initiatives. On the other hand, rankings with unstable and unreliable methodologies are of little use to anyone except for the public relations departments of wealthy Western universities. 

In contrast, the SCImago Institutions Rankings, published by a Spanish research organisation, with thousands of universities, hospitals, research institutes, companies and other organisations, can be quite informative, especially the Innovation and Societal Rankings.

The Innovation Rankings, which are based on three measures of patent citations and applications, included 4019 organisations of various kinds in 2009. The top spot was held by the Centre National de la Recherche Scientifique in France, followed by Harvard, the National Institutes of Health in the USA, Stanford, and MIT.

Altogether the top 20 in 2009 consisted of  ten universities, nine American plus the University of Tokyo, and ten non-university organisations, three American, two German, two French, two multinational, and the Chinese Academy of Sciences (CAS) in 14th place. 

Fast forward to 2022 and we now have 8084 institutions. First place now goes to CAS, followed by the State Grid Corporation of China, Deep Mind Technologies, a British AI firm, the Chinese Ministry of Education, and Samsung Corp.

Now, the top twenty includes exactly two universities, Tsinghua in 14th place and Harvard in 20th. The rest are companies, health organisations, and government agencies. The nationality assigned by SCImago for these eighteen is Multinational eight, USA six, China four, and UK and South Korea one each.

What about those high flying US universities of 2009? Stanford has fallen from 4th place to 67th, the University of Michigan from 13th to 249th, the University of Washington from 16th to 234th.

The relative -- and probably now absolute as well -- decline of American academic research has been well documented. It seems that the situation is even more dire for the innovative capability of US universities. But the technological torch is passing not only to Chinese universities and research centres but also to US and Multinational corporations.



Saturday, February 04, 2023

Aggregate Ranking from BlueSky Thinking

 

In recent years there have been attempts to construct rankings that combine several global rankings. The University of New South Wales has produced an aggregate ranking based on the “Big Three” rankings, the Shanghai ARWU, Times Higher Education (THE), and QS. AppliedHE of Singapore has produced a ranking that combines these three plus Leiden Ranking and Webometrics.

The latest aggregate ranking is from BlueSky Thinking, a website devoted to research and insights in higher education. This aggregates the big three rankings plus the Best Global Universities published by US News.

There are some noticeable differences between the rankings. The University of California Berkeley is fourth in the US News rankings but 27th in QS. The National University of Singapore is 11th in QS but 71st in ARWU.

The top of the aggregate ranking is unremarkable. Harvard leads followed by Stanford, MIT, Cambridge, and Oxford.

There have been some significant changes over the last five years, with universities in Mainland China, Hong Kong, France, Germany, Saudi Arabia, and Australia recording significant improvements, while a number of US institutions, including Ohio State University, Boston University and the University of Minnesota, have fallen.

 

Saturday, January 28, 2023

Another Sustainability Ranking

 

People and Planet is a British student network concerned with environmental and social justice. It has just published a league table that claims to measure the environmental and ethical performance of UK universities.

The top five universities are Cardiff Metropolitan, Bedfordshire, Manchester Metropolitan, Reading, and University of the Arts London. It is interesting that this league table shows almost no overlap with the other rankings that claim to assess commitment or contributions to sustainability.

Here are all six British universities in the latest UI GreenMetric ranking, in order: Nottingham, Nottingham Trent, Warwick, Glasgow Caledonian, Loughborough, Teesside.

The top five British universities in the THE Impact Rankings are Newcastle, Manchester, Glasgow, Leicester, King's College London. For the QS Sustainability rankings we have: Edinburgh, Glasgow, Oxford, Newcastle, Cambridge.

There is some similarity between the QS Sustainability and the THE Impact Rankings, because both give prominence to research on environmental topics. But even this is quite modest compared to the much greater overlap between conventional research based rankings such as Leiden, Shanghai or URAP (Middle East Technical University).

This surely raises serious questions about the trend toward rankings based on sustainability. If the rankers produce league tables that show such a modest degree of correlation, then we have to ask whether there is any point at all to the exercise.
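The degree of agreement between two league tables can be made more precise than eyeballing the top five. A minimal sketch, using invented rank positions (not the actual tables above) and the standard Spearman rank-correlation formula, assuming no tied ranks:

```python
def spearman(rank_a, rank_b):
    """Spearman rank correlation for two rankings of the same items (no ties)."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Invented positions of the same five universities in two league tables:
table_x = [1, 2, 3, 4, 5]
table_y = [5, 3, 1, 2, 4]   # heavy disagreement

identical = spearman(table_x, table_x)   # 1.0: perfect agreement
divergent = spearman(table_x, table_y)   # negative: the tables barely agree
```

A correlation near 1 would indicate the rankers are measuring much the same thing; values near zero or below suggest they are not.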

Friday, January 20, 2023

Implications of the new QS methodology

QS have announced that the world rankings due to appear in 2023 will have a new methodology. This is likely to produce significant changes in the scores and ranks of some universities even if there are no significant changes in the underlying data. 

There will no doubt be headlines galore about how dynamic leadership and teamwork have transformed institutions or how those miserable Scrooges in government have been crushing higher education by withholding needed funds.

The first change is that the weighting of the academic survey will be reduced from 40% to 30%. This is quite sensible: 40% is far too high for any one indicator. It remains, however, the largest single indicator and it remains one that tends to favour the old elite or those universities that can afford expensive marketing consultants, at the expense of emerging institutions. The employer survey weight will go up from 10% to 15%.

Next, the weighting of faculty student ratio has been cut from 20% to 10%. Again this is not a bad idea. This metric is quite easy to manipulate and has only a modest relationship to teaching quality, for which it is sometimes supposed to be a proxy.

What has not changed is the citations per faculty indicator. This is unfortunate since rankers can get very different results by tweaking the methodology just a bit. It would have been a big improvement if QS had used several different metrics for citations and/or publications, something that Times Higher Education has just got round to doing.
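To see how sensitive ranks are to such tweaks, here is a toy calculation. All the scores and weights below are invented for illustration; none of them are QS's actual numbers:

```python
# Two hypothetical universities can swap ranks under a small methodological
# tweak, even though the underlying data has not changed at all.
def overall(scores, weights):
    """Weighted sum of indicator scores."""
    return sum(scores[k] * w for k, w in weights.items())

uni_a = {"survey": 90, "citations": 60}   # strong reputation, weaker research
uni_b = {"survey": 70, "citations": 85}   # the reverse

old_weights = {"survey": 0.6, "citations": 0.4}   # survey-heavy weighting
new_weights = {"survey": 0.4, "citations": 0.6}   # citation-heavy weighting

# Under the old weights A leads (78 vs 76); under the new weights B leads (72 vs 79).
```

This is why a methodology change can generate "dramatic rises" that reflect nothing more than a shift in weights.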

Then there are three new indicators: international research network, graduate employability, and sustainability.

This means that international indicators will now account for a 15% weighting, adding a further bias towards English-speaking universities, or those in small countries adjoining larger neighbours with similar languages and cultures, and working against China and India.

The introduction of a sustainability metric is questionable. It requires a considerable amount of institutional data collecting and this will tend to favour schools with the resources and ambitions to jump through the rankers' hoops.

On the surface, it seems that these changes will be a modest improvement. However, I suspect that one effect of the changes will be a spurious boost for the scores and ranks of the elite Western and English-speaking universities who can mobilise partners and alumni for the surveys, nurture their global networks, and collect the data required to compete in the rankings.


Monday, November 07, 2022

The QS Sustainability Rankings

 


Do we need to measure social and environmental impact?

Sunday, October 23, 2022

Australia and the THE World Rankings

 

The Latest Rankings

The latest Times Higher Education (THE) world rankings have just been announced at a summit in New York. Around the world political leaders, mass media, and academics have been proclaiming their delight about their universities rising in the rankings. Australian universities are especially fascinated by them, sometimes to the point of unhealthy obsession.

Study Australia reports that "Australia shines again." Insider Guides finds it "particularly exciting" that six Australian universities in the top 200 have climbed the charts. Monash University is celebrating how it has "skyrocketed" 13 places, further proof of its world-leading status.

It is unfortunate that Australian media and administrators are so concerned with these rankings. They are not the only global rankings and certainly not the most reliable, although they are apparently approved by universities in the traditional elite or their imitators. They are not totally without value, but they do need a lot of deconstruction to get to any sort of meaningful insight.

Transparency

One problem with the THE rankings, to be painfully repetitive, is that they are far from transparent. Three of their five current “pillars” consist of more than one indicator so we cannot be sure exactly what is contributing to a rise or fall. If, for example, a university suddenly improves for THE’s teaching pillar that might be because its income has increased, or the number of faculty has increased, or the number of students has decreased, or it has awarded more doctorates or fewer bachelor’s degrees, or it has got more votes in THE’s reputation survey, or a combination of two or more of these.

THE's citations indicator, which purportedly measures research impact or research quality, stands alone but it is also extremely opaque. To calculate a university’s score for citations you have to work out the number of citations in 8,000 “boxes” (300 plus fields multiplied by five years of publication multiplied by five types of documents) and compare them to the world average. Add them up and then apply the country bonus, the square root of the national impact score, to half of the university’s score. Then calculate Z scores. For practical purposes this indicator is a black box into which masses of data disappear, are chopped up, shuffled around, processed, reconstituted and then turned into numbers and ranks that are, to say the least, somewhat counter-intuitive.
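As a rough sketch of the arithmetic described above, here is the field-normalisation step with invented numbers and only three of the thousands of cells. The 50:50 blend for the country bonus is my reading of THE's published description, not official code:

```python
import math

# Hypothetical toy data: citations per paper for one university versus the
# world average, broken down into (field, year, doc-type) cells. Real THE
# data would have ~8,000 such cells (300+ fields x 5 years x 5 doc types).
cells = [
    # (university citations/paper, world citations/paper)
    (4.0, 2.0),
    (1.5, 3.0),
    (6.0, 2.0),
]

def field_normalised_impact(cells):
    """Average of university-vs-world impact ratios across all cells."""
    return sum(uni / world for uni, world in cells) / len(cells)

def country_adjusted(raw_score, country_avg_score):
    """Country bonus: half the score is left alone, the other half is
    divided by the square root of the national average impact."""
    return 0.5 * raw_score + 0.5 * raw_score / math.sqrt(country_avg_score)

raw = field_normalised_impact(cells)                 # (2.0 + 0.5 + 3.0) / 3
score = country_adjusted(raw, country_avg_score=0.64)
```

Note what the blend implies: a university in a country with a weak national average (below 1) gets its score inflated, which is one reason improbable institutions can surge up this indicator.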

This indicator, which accounts for a 30% weighting, has produced some remarkable results over the last decade, with a succession of improbable institutions soaring into the upper reaches of this metric. This year’s tables are no exception. The world leader is Arak University of Medical Sciences, Iran, followed by Cankaya University, Turkey, Duy Tan University, Vietnam, Golestan University of Medical Sciences, Iran, and Jimma University, Ethiopia. Another two Iranian medical universities are in the top 25. They may not last long. Over the last few years quite a lot of universities have appeared briefly at the top and then in a few years slumped to a much lower position.

One of the more interesting things about the current success of the THE rankings is the apparent suspension of critical thought among the superlatively credentialed and accredited leaders of the academic world. One wonders how those professors, rectors and deans who gather at the various summits, seminars, webinars, and masterclasses would react to a graduate student who wrote a research paper that claimed that Arak University of Medical Sciences leads the world for “research quality”, Istanbul Technical University for “knowledge transfer”, or Macau University of Science and Technology for “international outlook”.

Volatility

Not only do the rankings lack transparency, they are also extremely volatile. The top fifty list, or even the top one hundred, is reasonably stable, but after that THE has seen some quite remarkable and puzzling ascents and descents. There have been methodological changes, and there is a big one coming next year, but that alone does not explain why there should be such dramatic changes. One cause of instability in the rankings is the citations indicator, which is constructed so that one or a few researchers, often those working on the Gates-funded Global Burden of Disease Study (GBDS), can have a massively disproportionate impact.

Another possible cause of volatility is that the number of ranked institutions is not fixed. If the rankings expand new universities will usually be at the lower end of the scale and the effect of this is that the mean score for each indicator is lowered and this will affect the final score for every institution since the standardised scores that appear in the published tables are based on means and deviations.
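A toy illustration of this effect, with invented scores:

```python
# How adding weak newcomers to a ranking shifts every incumbent's
# standardised score, even though the incumbents' raw data is unchanged.
def z_scores(raw):
    """Standardise a list of raw scores (population mean and SD)."""
    n = len(raw)
    mean = sum(raw) / n
    sd = (sum((x - mean) ** 2 for x in raw) / n) ** 0.5
    return [(x - mean) / sd for x in raw]

before = [80, 60, 40]        # three ranked universities
after = before + [20, 10]    # two low-scoring newcomers join the table

z_before = z_scores(before)
z_after = z_scores(after)
# The top university's raw score (80) never changed, yet its standardised
# score rises simply because the newcomers dragged the mean down.
```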

There may be other reasons for the volatility of this year’s rankings. Fluctuating exchange rates may have affected reported income data, international students’ numbers may have fallen or even recovered. Some universities might have performed better in the surveys of teaching or research.

 

Australian universities rising and falling

Some Australian universities appear to have been quite mobile this year. In some cases, this has a lot to do with the citation indicator. Two years ago, Bond University was ranked in the 501 – 600 band and 26th in Australia. Now it is tenth in Australia and in the world top 300, driven largely by a remarkable rise in the citations score from 56.4 to 99.7. A lot of that seems to have come from a small number of publications relating to the 2020 PRISMA statement which amassed a massive number of citations in 2021 and 2022.

Another example is Australian Catholic University. In 2018 it was in the world 501-600 band and this year it is in band 251-300. This is mainly due to an improvement in its citations score from 59.5 to 98.5, the result of a series of articles between 2017 and 2020 related to the multi-author and massively cited GBDS.

The problem with relying on citations to get ahead in the THE rankings is that if the researchers who have been racking up the citations move on or retire, the scores will eventually decline as their papers pass outside the period for counting publications. This might have happened with the University of Canberra, which has benefitted from GBDS papers published between 2015 and 2018. This year, however, the 2015 and 2016 papers no longer count, and the result is that Canberra's citation score has fallen from 98.6 to 92.6 and its world rank from 170th to the 251-300 band. A university might even start falling just because its peers have started racking up scores of 99 plus for citations.

This is similar to the trajectory of quite a few international universities that have risen and fallen in the wake of a few highly cited papers such as Babol Noshirvani University of Technology, Iran, the Indian Institute of Technology Ropar, the University of Occupational and Environmental Health, Japan, Durban University of Technology, South Africa, and Nova Southeastern University, USA.

Citations have a lot to do with Australia's success in the THE rankings. All the Australian universities in the world rankings have a higher score for citations than for research, which is measured by publications, reputation, and research income, and six have citation scores in the 90s. Compare that with Japan, where the highest citation score is 82.8 and leading universities do better for research than for citations. If THE had taken some of the weight from citations and given it to research, Australian universities might be in a different position.

Are the THE rankings any use?

Over the long term the THE rankings might have some value in charting the general health of an institution or a national system. Should a university fall steadily across several indicators despite changes in methodology and despite proclaimed excellence initiatives, then that might be a sign of systemic decline.

The success of Australian universities in the THE rankings might represent genuine progress but it is necessary to identify exactly why they are rising and how sustainable that progress is.

The rankings certainly should not be used to punish or reward researchers and teachers for “success” or “failure” in the rankers, to allocate funds, or to attract talented faculty or wealthy students.

Other rankings

The THE rankings are not the only game in town or in the world. In fact, for most purposes there are several rankings that are no worse and probably a lot better than THE. It would be a good idea for Australian universities, students and stakeholders to shop around a bit.

For a serious analysis of research quantity and quality there are straightforward research rankings produced by universities or research centres, such as the Shanghai Ranking, the CWTS Leiden Ranking, University Ranking by Academic Performance, or the National Taiwan University rankings. They can be a bit boring, since they do not change very much from year to year, but they are at least reasonably competent technically and they rely on data that is fairly objective and transparent.

For prospective graduate and professional students, the research-based rankings might be helpful, since the quality of research is likely to have an effect, even if an unpredictable one, on the quality of postgraduate and professional instruction.

For undergraduate students there is not really much that is directly relevant to their needs. The QS employability rankings, the employer opinion survey in the QS world rankings, the Emerging/Trendence employability rankings, and the student quality section in the Center for World University Rankings tables, now based in the Emirates, can all provide some helpful insights.

Next year?

It seems that THE has finally steeled itself to introduce a set of changes. The precise effect is unclear except that the world rankings look to be getting even more complex and even more burdensome for the underpaid drones toiling away to collect, process and transmit the data THE requires of its “customers”. It is not clear exactly how this will affect Australian universities.

No doubt Australian deans and rectors will be wondering what lies ahead of them in the 2024 rankings coming next year. But not to worry. THE is offering “bespoke” shadow rankings that will tell them how they would have done if the new methodology had been applied this year. 

Sunday, August 21, 2022

California in the Shanghai Rankings

Global rankings are often misleading and uninformative, especially those that have eccentric methodologies or are subject to systematic gaming. But if their indicators are objective and reliable over several years, they can tell us something about shifts in the international distribution of research excellence.

I would like to look at 20 years of the Shanghai Rankings from the first edition in 2003 to the most recent, published this week. The first thing that anyone notices is of course the remarkable rise of China -- not Asia in general -- and the relative decline of the USA. These rankings can also be used to find regional trends within nations. Take a look at California universities. In 2003 California was the research star of the US with six universities in the world top twenty. Two decades later that number has fallen to five with the University of California (UC) San Diego falling from 14th to 21st place.

That is symptomatic of a broader trend. UC Santa Barbara has fallen from 25th to 57th, the University of Southern California from 40th to 55th, and UC Riverside from 88th to the 201-300 band. 

American universities in nearly all the states have been falling and have, for the most part, been replaced by Chinese institutions. But even within the USA California has been drifting downwards. Caltech has gone from 3rd to 7th, UC San Francisco, a medical school, from 11th to 15th, and UC Davis from 27th to band 40-54.

This is not universal. Stanford is still second in the USA in 2022 while UC Los Angeles (UCLA) has risen from 13th to 11th.

But overall California is falling. Of the thirteen universities in the top 500 in 2003, nine had fallen in the US table by 2022, two, UC Santa Cruz and UCLA, had risen, and two remained in the same rank. The decline is especially apparent in the Publications metric, which is based on recent articles in the Web of Science.

Recent events in California, including learning loss during the pandemic, the abandonment of standardised testing, and the imposition of political loyalty tests for faculty, suggest that the decline is not going to be halted or reversed any time soon.