Sunday, May 08, 2022

Has China Really Stopped Rising? Looking at the QS Subject Rankings

For the last few years the publication of global rankings has led to headlines about the rise of Asia. If these were to be believed we would expect a few Asian universities to be orbiting above the stratosphere by now.

The Asian ascent was always somewhat exaggerated. It was true largely for China and perhaps Southeast Asia and the Gulf States. Japan, however, has been relatively stable or even declining a bit, and India so far has made little progress as far as research or innovation is concerned. Now, it seems that the Chinese research wave may be slowing down. The latest edition of the QS subject rankings suggests that the quality of Chinese research is levelling off and perhaps even declining.

A bit of explanation here. QS publishes rankings for broad subject fields, such as Arts and Humanities, and for narrow subject areas, such as Archaeology. All tables include indicators for H-index, citations per paper, academic review, and employer review, with varying weightings. This year, QS has added a new indicator, International Research Network (IRN), based on the number of an institution's international collaborators and the countries or territories they come from. The indicator was introduced with "broad" -- does that mean not unanimous? -- support from the QS Advisory Board. Chinese universities do much less well here than on the other indicators.
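To see why adding a single indicator can move a university's overall position even when nothing else changes, here is a minimal sketch of how a weighted composite score behaves. The indicator names mirror those above, but all weights and scores are hypothetical -- QS's actual weightings vary by subject and are not reproduced here.

```python
# Minimal sketch of a weighted composite ranking score.
# All weights and scores below are hypothetical, for illustration only.

def composite_score(scores, weights):
    """Weighted average of indicator scores (each on a 0-100 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[name] * w for name, w in weights.items())

# Hypothetical indicator scores for one university in one subject area.
scores = {
    "academic_review": 90.0,
    "employer_review": 85.0,
    "citations_per_paper": 88.0,
    "h_index": 92.0,
    "irn": 60.0,  # a weak International Research Network score
}

# Hypothetical weightings before and after adding the IRN indicator.
weights_before = {
    "academic_review": 0.4,
    "employer_review": 0.2,
    "citations_per_paper": 0.2,
    "h_index": 0.2,
}
weights_after = {
    "academic_review": 0.4,
    "employer_review": 0.2,
    "citations_per_paper": 0.15,
    "h_index": 0.15,
    "irn": 0.1,
}

before = composite_score(scores, weights_before)  # 89.0
after = composite_score(scores, weights_after)    # 86.0
print(before, after)
```

Even though none of the four original indicator scores changed, the composite drops once weight is shifted to a weak new indicator -- which is the pattern the broad-field tables appear to show for Chinese universities.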

With QS, as with the other rankings, we should always be careful when there is any sort of methodological change. The first rule of ranking analysis is that any non-trivial change in rank is likely to be the result of methodological changes.

So let's take a look at the broad field tables. In Arts and Humanities the top Chinese university is Peking University, which fell seven places from 36th to 43rd between 2021 and 2022.

It was the same for other broad areas. In Engineering and Technology, Tsinghua fell from 10th to 14th, and in Natural Sciences from 15th to 23rd. (In this table Peking moved slightly ahead of Tsinghua into 21st place.) In Social Sciences and Management Peking went from 21st to 26th.

There was one exception. In Life Sciences and Medicine Peking rose from 62nd to 53rd, although its overall score remained the same at 79.

However, before assuming that this is evidence of Chinese decline, we should note the possible impact of the new indicator, where Chinese institutions, including Peking and Tsinghua, do relatively poorly. In Life Sciences and Medicine, every single one of the 22 Chinese universities listed does better for H-Index and Citations than for IRN.

It looks as though the ostensible fall of Chinese universities is partly or largely due to QS's addition of the IRN metric.

Looking at Citations per Paper, which is a fairly good proxy for research quality, we find that for most subject areas the best Chinese universities have improved since last year. In Engineering and Technology Tsinghua has risen from 89.1 to 89.6. In Life Sciences and Medicine Peking has gone from 79.2 to 80.6, and in Social Sciences and Management from 89.7 to 90.7.

In Natural Sciences, Tsinghua had a citations score of 88.6. It fell this year, and Tsinghua was surpassed by Peking, with a score of 90.1.

If Citations per Paper is considered the arbiter of research excellence, then Chinese universities have been improving over the last year, and the apparent decline in the broad subject areas is largely the result of the new indicator. One wonders if the QS management knew this was going to happen.

That is not the end of the discussion. There may well be areas where the Chinese advance is faltering or at least reaching a plateau and this might be revealed by a scrutiny of the narrow subject tables.



Monday, March 28, 2022

Where does reputation come from?

THE announced the latest edition of its reputation rankings last October. The amount of information is quite limited: scores are given for only the top fifty universities. But even that provides a few interesting insights.

First, there is really no point in providing separate data for teaching and research reputation. The correlation between the two for the top fifty is .99. This is unsurprising. THE surveys researchers who have published in Scopus indexed journals and so there is a very obvious halo effect. Respondents have no choice but to refer to their knowledge of research competence when trying to assess teaching performance. If THE are going to improve their current methodology they need to recognise that their reputation surveys are measuring the same thing. Maybe they could try to find another source of respondents for the teaching survey, such as school advisors, students or faculty at predominantly teaching institutions. 
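For readers who want to check a figure like that .99 themselves, here is a minimal sketch of a Pearson correlation computation. The score lists below are invented for illustration; they are not THE's published reputation scores, which cover only the top fifty universities.

```python
# Minimal sketch: Pearson correlation between two sets of reputation scores.
# The numbers below are invented, not THE's published data.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical teaching and research reputation scores for five universities.
teaching = [100.0, 86.2, 74.0, 61.5, 55.3]
research = [100.0, 87.9, 75.1, 60.8, 54.0]

r = pearson(teaching, research)
print(round(r, 3))
```

When the two score lists track each other as closely as reputation surveys apparently do, the coefficient comes out very near 1 -- exactly the halo-effect pattern described above, with both surveys effectively measuring one underlying reputation.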

Next, after plugging in a few indicators from other rankings, it is clear that the metrics most closely associated with teaching and research reputation are publications in Nature and Science (Shanghai), highly cited researchers (Shanghai), and papers in highly reputed journals (Leiden).

The correlation with scores in the RUR and QS reputation rankings, citations (THE and QS), and international faculty was modest.

There was no correlation at all with the proportion of papers with female or male authors (Leiden).

So it seems that the best way to acquire a reputation for good teaching and research is to publish papers in the top journals and get lots of citations. That, of course, applies only to this very limited group of institutions.



Sunday, March 20, 2022

What should Rankers Do About the Ukraine Crisis?

Over the last few days there have been calls for the global rankers to boycott or delist Russian universities to protest the Russian invasion of Ukraine. There have also been demands that journals reject submissions from Russian authors and that universities and research bodies stop collaborating with Russian researchers.

So far, four European ranking agencies have announced some sort of sanctions.

U-Multirank has announced that Russian universities will be suspended "until they again share in the core values of the European higher education area."

QS will not promote Russia as a study area and will pause business engagement. It will also redact Russian universities from new rankings.

Webometrics will "limit the value added information" for Russian and Belarusian universities.

Times Higher Education (THE) will stop business activities with Russia but will not remove Russian universities from its rankings. 

The crisis has highlighted a fundamental ambiguity in the nature of global rankings. Are they devices for promoting the business interests of institutions or do they provide relevant and useful information for researchers, students and the public?

Refraining from doing business with Russia until it withdraws from Ukraine is a welcome rebuke to the current government. If, however, rankings contain useful information about Russian scientific and research capabilities then that information should continue to be made available.