Friday, March 29, 2019

The THE-QS duopoly

Strolling around the exhibition hall at the APAIE conference in Kuala Lumpur, I gathered a pile of promotional literature from various universities.

As expected, a lot of this referred to international university rankings. Here are some examples.

Ritsumeikan Asia Pacific University, Japan: THE Japan University Rankings 21st; QS Asia University Rankings, 100% for internationalisation.

Yonsei University, Korea: QS Asia University Rankings 19th

Hanyang University, Korea: QS, Reuters Asia's Innovative 100 universities

Sabancı University, Turkey: THE

University of Malaya: QS World Rankings 87th

Hasanuddin University: QS Asian Rankings, Webometrics

Keio University, Japan: QS, THE, Asia Innovative Universities

Novosibirsk State University, Russia: QS World, EECA and subject rankings

China University of Geosciences: US News Best Global Universities.

Mahidol University, Thailand, cites US News, GreenMetric, THE, National Taiwan University, uniRank, URAP, and QS.

The QS-THE duopoly seems to be holding up fairly well, but there are signs that some universities are exploring other international rankings.

Also, in a report on Malaysia, Prof Nordin Yahaya of Universiti Teknologi Malaysia referred to URAP, produced by the Middle East Technical University, to measure the country's research performance.




Thursday, March 28, 2019

Global University Rankings and Southeast Asia

Paper presented at Asia-Pacific Association for International Education, Kuala Lumpur 26 March 2019

Background
Global rankings began in a small way in 2003 with the publication of the first edition of the Shanghai Rankings. These were quite simple, comprising six indicators that measured scientific research. Their purpose was to show how far Chinese universities had to go to reach world class status. Public interest was limited although some European universities were shocked to find how far they were behind English-speaking institutions.
Then came the Times Higher Education Supplement (THES) – Quacquarelli Symonds (QS) World University Rankings. Their methodology was very different from that of Shanghai, relying heavily on a survey of academic opinion. In most parts of the world interest was limited and the rankings received little respect, but Malaysia was different. The country's flagship university, Universiti Malaya (UM), reached the top one hundred, an achievement that was cause for great if brief celebration. That achievement was the result of an error on the part of the rankers, QS, and in 2005 UM crashed out of the top 100.

Current Ranking Scene
International rankings have made substantial progress over the last decade and a half. In 2003 there was one, Shanghai. Now, according to the IREG Inventory, there are 45 international rankings, of which 17 are global, plus subject, regional, system, business school and sub-rankings.
They cover a broad range of data that could be of interest to students, researchers, policy makers and other stakeholders. They include metrics such as numbers of faculty and students, income, patents, web activity, publications, books, conferences, reputation surveys, and contributions to environmental sustainability.

Rankings and Southeast Asia
For Malaysia the publication of the THES-QS rankings in 2004 was the beginning of years of interest, perhaps obsession, with the rankings. The country has devoted resources and support to gain favourable places in the QS rankings.
Singapore has emphasised both the QS and THE rankings since the two organisations' unpleasant divorce in 2009. It has hosted the THE academic summit and has performed well in nearly all rankings, especially the THE and QS world rankings.
A few universities in Thailand, Indonesia and the Philippines have been included at the lower levels of rankings such as those published by the University of Leiden, National Taiwan University, Scimago, THE and QS.
Other countries have shown less interest. Myanmar and Cambodia are included only in the Webometrics and uniRank rankings, which list thousands of institutions with even the slightest pretension to being a university or college.

Inclusion and Performance
There is considerable variation in the inclusiveness of the rankings. There are five Southeast Asian universities in the Shanghai Rankings and 3,192 in Webometrics.
Among Southeast Asian countries Singapore is clearly the best performer, followed by Malaysia, while Myanmar is the worst.

Targets
The declaration of targets with regard to rankings is a common strategy across the world. Malaysia has a specific policy of getting universities into the QS rankings: four in the top 200, two in the top 100 and one in the top 25.
In Thailand the 20-year national strategy aims at getting at least five Thai universities into the top 100 of the world rankings.
Indonesia wants to get five specified universities into the QS top 500 by 2019 and a further six by 2024.

The Dangers of Rankings
The cruel reality is that we cannot escape rankings. If all the current rankings were banned and thrown into an Orwellian memory hole, we would simply revert to the informal and subjective rankings that prevailed before.
If we must have formal rankings then they should be as valid and accurate as possible; they should take account of universities' varying missions, size and clientele; and they should be as comprehensive as possible.
To ignore the data that rankings can provide is to seriously limit public awareness. At the moment Southeast Asian universities and governments seem interested mainly or only in the QS rankings or perhaps the THE rankings.
To focus on any single ranking could be self-defeating. Take a look at Malaysia's position in the QS rankings. It is obvious that UM, Malaysia's leading university in most rankings, does very much better in the QS rankings than in every other ranking, except the GreenMetric rankings.
Why is this? The QS rankings allot a 40% weighting to a survey of academic opinion, supposedly about research, more than any other ranking. They also allow universities to influence the composition of survey respondents, by submitting names or by alerting researchers to the sign-up facility where they can take part in the survey.
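To see how much that weighting matters, here is a minimal illustrative calculation, sketched in Python. The indicator scores below are invented for illustration, not actual QS data; the weights are those of the QS World University Rankings methodology at the time (40% academic reputation, 10% employer reputation, 20% faculty-student ratio, 20% citations per faculty, 5% international faculty, 5% international students). The point is that a university with a strong survey result can earn a respectable composite score even when its citations score is modest.

# Hypothetical illustration: the indicator scores below are invented, not real QS data.
weights = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

scores = {
    "academic_reputation": 90,    # strong result in the opinion survey
    "employer_reputation": 85,
    "faculty_student_ratio": 70,
    "citations_per_faculty": 40,  # weak citations performance
    "international_faculty": 60,
    "international_students": 55,
}

# Composite score is the weighted sum of the indicator scores.
composite = sum(weights[k] * scores[k] for k in weights)
print(f"composite score: {composite:.2f}")  # about 72.25, despite a citations score of only 40

Because the survey carries twice the weight of citations, a weak citations score costs relatively little, which is consistent with the pattern described above.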
To their credit, QS have published the number of survey respondents by country. The largest number is from the USA with almost as many from the UK. The third largest number of respondents is from Malaysia, more than China and India combined. Malaysian universities do much better in the academic survey than they do for citations.
It is problematic to present UM as a top 100 university. It has a good reputation among local and regional researchers but is not doing so well on the other metrics, especially research of the highest quality.
There is also a serious risk that UM's strong performance in the QS rankings will prove precarious. Already countries like Russia, Colombia, Iraq and Kazakhstan are increasing their representation in the QS survey, and more will join them. The top Chinese universities are targeting the Shanghai rankings, but one day the second tier may try out for the QS rankings.
Also, any university that relies too much on the QS rankings could easily be a victim of methodological changes. QS has, with good reason, revamped its methodology several times and this can easily affect the scores of universities through no fault or credit of their own. This may have happened again during the collection of data for this year’s rankings. QS recently announced that universities can either submit names of potential respondents or alert researchers to the sign-up facility but not, as in previous years, both. Universities that have not responded to this change may well suffer a reduced score in the survey indicators.
If not QS, should another ranking be used for benchmarking and targets? Some observers claim that Asian universities should opt for the THE rankings, which are alleged to be more rigorous and sophisticated, and are certainly more prestigious.
That would be a mistake. The value of the THE rankings, though not their price, is drastically reduced by their lack of transparency, which makes it impossible, for example, to tell whether a change in the research score results from an increase in publications, a decline in the number of staff, an improved reputation, or an increase in research income.
Then there is the THE citations indicator. This can only be described as bizarre and ridiculous.
Here are some of the universities that appeared in the top 50 of last year's citations indicator, which supposedly measures research influence or quality: Babol Noshirvani University of Technology, Brighton and Sussex Medical School, Reykjavik University, Anglia Ruskin University, Jordan University of Science and Technology, and Vita-Salute San Raffaele University.

Proposals
1. It is not a good idea to use any single ranking, but if one is to be used then it should be one that is methodologically stable and technically competent and does not emphasise a single indicator. For research, probably the best bet would be the Leiden Ranking. If a ranking is needed that includes metrics that might be related to teaching and learning then the Round University Rankings would be helpful.
2. Another approach would be to encourage universities to target more than one ranking.
3. A regional database should be created that would provide information about ranks and scores in all relevant rankings, along with data about faculty, students, income, publications, citations and so on.
4. Regional universities should work to develop measures of the effectiveness of teaching and learning.

Links