Rankings are everywhere. Like a cleverly constructed virus, they spread into every corner and are almost impossible to delete. They are used for immigration policy, advertising, promotion, and recruitment. Here is the latest example.
A tweet from Eduardo Urias, noted by Stephen Curry, reported that an advertisement for an assistant professorship at Maastricht University included the requirement that candidates "should clearly state the (THE, QS, or FT business school) ranking of the university of their highest degree."
The sentence has since been removed, but one wonders why the relevant committee at Maastricht could not be trusted to look up the rankings themselves, and why they asked about those specific rankings, which might not be the most relevant or accurate. Maastricht is a very good university, especially in the social sciences (I knew that anyway, and I checked with the Leiden Ranking), so why should it take rankings into account instead of looking at applicants' graduate school records and publications?
Even though that sentence was removed, this one remains.
"Maastricht University is currently ranked fifth in the top of Young Universities under 50 years."
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Sunday, June 30, 2019
Monday, June 17, 2019
Are Malaysian Universities Going Backwards?
International university rankings have become very popular in Malaysia, perhaps obsessively so. There is also a lot of commentary in the media, usually not very well informed.
Are Malaysian universities going backwards?
Murray Hunter writing in Eurasia Review thinks so. His claim is supported entirely by their relatively poor performance in the Times Higher Education (THE) world and Asian university rankings.
(By the way, Hunter refers to "THES", but that name changed several years ago.)
Hunter apparently is one of those who are unaware of the variety and complexity of the current international university ranking scene. The IREG international inventory lists 45 rankings and is already in need of updating. Many of these cover more institutions than THE, some are much more technically competent, and some include more indicators.
THE's is not the only ranking available, and it is not very helpful for any institution seeking to make genuine improvements. It bundles eleven indicators into groups, so it is very difficult to work out exactly what contributed to a deterioration or an improvement. The two metrics that stand alone have produced some amusing but questionable results: Babol Noshirvani University of Technology first for research impact, Anadolu University first for industry income.
It really is no disgrace to do badly in these rankings.
Hunter's article is a mirror image of the excitement in the Malaysian media about the rise of Malaysian universities in the QS rankings, a rise that seems to be largely the result of massive Malaysian participation in the QS academic survey, an indicator with a disproportionate weighting of 40%.
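A quick back-of-the-envelope calculation shows why a 40% survey weight can dominate a composite score. This is an illustrative sketch, not QS's actual formula: the indicator names, the non-survey weights, and all the scores below are invented for the example; only the 40% academic survey weight comes from the text.

```python
# Illustrative sketch (not QS's actual formula): how a single indicator
# with a 40% weight can dominate a composite score. Indicator names,
# other weights, and all scores are hypothetical.
weights = {
    "academic_survey": 0.40,   # the disproportionately weighted indicator
    "employer_survey": 0.10,
    "faculty_student": 0.20,
    "citations":       0.20,
    "international":   0.10,
}

def composite(scores):
    """Weighted sum of indicator scores."""
    return sum(weights[k] * scores[k] for k in weights)

# Two hypothetical universities, identical on everything except the survey:
base = {"employer_survey": 60, "faculty_student": 60,
        "citations": 60, "international": 60}
low_survey  = composite({**base, "academic_survey": 50})
high_survey = composite({**base, "academic_survey": 80})

# A 30-point swing on the survey alone moves the composite by 12 points,
# enough to leapfrog many places in a crowded league table.
print(low_survey, high_survey)
```

The point of the sketch: a coordinated national effort to nominate respondents for the survey can move the composite more than years of real improvement on the other indicators combined.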
Malaysian universities have been celebrating their rise in the QS world rankings for some time. That is perhaps a bit more reasonable than getting excited about the THE rankings but still not very helpful.
We need to use a broad range of rankings. For a start, take a look at the Leiden Ranking for the quantity and quality of research. For total publications, Universiti Malaya (UM) has risen from 509th place in 2006-09 to 112th in 2014-17.
For the percentage of publications among the top 1% most cited, the most selective indicator, its rank has risen from 824th in 2006-09 to 221st in 2014-17.
Turning to the Moscow-based Round University Rankings for a more general assessment, we find that UM has risen from 269th in 2016 to 156th in 2019 (and 76th for teaching).
Malaysian universities, at least the best known ones, are making significant and substantial progress in stable and reliable global rankings.
At the end of the article Hunter says that "(t)he fact that Universiti Tunku Abdul Rahman (UTAR) has run into second best Malaysian University in less than 20 years of existence as a university is telling about the plight of Malaysian public universities."
Actually, it says nothing except that THE has a flawed methodology for counting citations. UTAR's performance in the THE rankings is the result of one talented researcher working on the Global Burden of Disease project, limited research output, a bonus for location in a country with a modest impact score, and a refusal to use fractional counting.
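The fractional-counting point is worth making concrete. Under full counting, every affiliated institution receives all of a paper's citations; under fractional counting, credit is divided by authorship share. The sketch below uses entirely hypothetical numbers (it is not THE's actual normalisation, which also adjusts for field and year), but it shows how one hyper-cited consortium paper, like a Global Burden of Disease publication with hundreds of co-authors, can transform a small institution's average.

```python
# Illustrative sketch of full vs fractional citation counting.
# All numbers are hypothetical; this is not THE's actual normalisation.

def citations_per_paper(papers, fractional=False):
    """Average citation credit per paper.

    papers: list of (citations, authors_from_institution, total_authors).
    Full counting credits the institution with ALL citations of every
    paper it appears on; fractional counting scales by authorship share.
    """
    total = 0.0
    for cites, local_authors, total_authors in papers:
        if fractional:
            total += cites * local_authors / total_authors
        else:
            total += cites
    return total / len(papers)

# A small institution: nine ordinary papers, plus one hyper-cited
# consortium paper with 500 co-authors, only one of them local.
papers = [(10, 2, 3)] * 9 + [(5000, 1, 500)]

print(citations_per_paper(papers))                   # full counting: 509.0
print(citations_per_paper(papers, fractional=True))  # fractional: about 7.0
```

One paper, one local co-author, and full counting turn an average of about seven citations per paper into over five hundred, which is exactly the kind of artefact that can catapult a young university past established rivals in a citations indicator.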