International university rankings have become very popular in Malaysia, perhaps obsessively so. There is also a lot of commentary in the media, usually not very well informed.
Are Malaysian universities going backwards?
Murray Hunter writing in Eurasia Review thinks so. His claim is supported entirely by their relatively poor performance in the Times Higher Education (THE) world and Asian university rankings.
(By the way, Hunter refers to "THES", but the magazine dropped the "Supplement" from its name several years ago.)
Hunter apparently is one of those who are unaware of the variety and complexity of the current international university ranking scene. The IREG international inventory lists 45 rankings and is already in need of updating. Many of these cover more institutions than THE, some are much more technically competent, and some include more indicators.
THE's is not the only ranking available, and it is not very helpful for any institution seeking to make genuine improvements. It bundles eleven indicators into groups, so it is very difficult to work out exactly what contributed to a deterioration or an improvement. The two metrics that stand alone have produced some amusing but questionable results: Babol Noshirvani University of Technology first for research impact, Anadolu University first for industry income.
It really is no disgrace to do badly in these rankings.
Hunter's article is a mirror image of the excitement in the Malaysian media about the rise of Malaysian universities in the QS rankings, which seems to be largely the result of massive Malaysian participation in the QS academic survey, which has a disproportionate weighting of 40%.
Malaysian universities have been celebrating their rise in the QS world rankings for some time. That is perhaps a bit more reasonable than getting excited about the THE rankings but still not very helpful.
We need to use a broad range of rankings. For a start, take a look at the Leiden Ranking for quantity and quality of research. For total publications Universiti Malaya (UM) has risen from 509th place in 2006-09 to 112th in 2014-17.
For the percentage of publications in the top 1% most frequently cited, the most selective indicator, its rank has risen from 824th in 2006-09 to 221st in 2014-17.
Turning to the Moscow-based Round University Rankings for a more general assessment, we find that UM has risen from 269th in 2016 to 156th in 2019 (76th for teaching).
Malaysian universities, at least the best known ones, are making significant and substantial progress in stable and reliable global rankings.
At the end of the article Hunter says that "(t)he fact that Universiti Tunku Abdul Rahman (UTAR) has run into second best Malaysian University in less than 20 years of existence as a university is telling about the plight of Malaysian public universities."
Actually, it says nothing except that THE has a flawed methodology for counting citations. UTAR's performance in the THE rankings is the result of one talented researcher working for the Global Burden of Disease project, limited research output, a bonus for location in a country with a modest impact score and a refusal to use fractional counting.
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Monday, June 17, 2019
Tuesday, May 14, 2019
Bangladeshi Universities Should Forget about Their Websites
Bangladesh has a lot of universities but none of them have succeeded in getting into the list of 417 universities included in the THE Asian University Rankings. The country's performance is worse than anywhere else in South Asia. Even Nepal and Sri Lanka have managed one university each.
Once again, it seems that the media and university administrators have been persuaded that there is only one set of international rankings, those produced by Times Higher Education (THE), and that the many others, which are documented in the IREG Inventory of International Rankings, do not exist.
The response of university bureaucrats shows a lack of awareness of current university rankings and their methodologies. Dhaka University's head claimed, in an interview with the Dhaka Tribune, that if the university had provided the necessary information on its website it would be in a "prestigious position". He apparently went on to say that the problem was that the website was not up to date and that a dean has been assigned to discuss the matter.
THE does not use data from university websites. It collects and processes information submitted by institutions, bibliometric data from the Scopus database and responses to surveys. It makes no difference to THE or other rankers whether the website was updated yesterday or a decade ago.
The Vice Chancellor of Shahjalal University of Science and Technology spoke about research papers not being published on websites or noted in annual reports. Again, this makes no difference to THE or anyone else.
He was, however, correct to note that bureaucratic restrictions on the admission of foreign students would reduce the scores in those rankings that count international students as an indicator.
Universities in Bangladesh need to do some background research into the current ranking scene before they attempt to get ranked. They should be aware of the rapidly growing number of rankings. THE is not the only international ranking and it is probably unsuitable for universities in countries like Bangladesh that do not have very much income or established reputations and are unable to participate in citation-rich global projects.
They should look at rankings with a more appropriate methodology. Dhaka University, for example, is currently ranked 504th among universities in the Scimago Institutions Rankings, which include patents, altmetrics, and web size as well as research.
Bangladeshi universities should first review the current rankings and make a note of their procedures and requirements, and also consider the resources available to collect and submit data.
It would probably be a good idea to focus on Scimago and the research-focussed URAP rankings. If universities want to try for a research-plus-teaching ranking that requires institutions to submit data, then it would be better to contact the Global Institutional Profiles Project to get into the Round University Rankings, or QS, with the objective of leveraging their local reputations with academics and employers.
Saturday, April 13, 2019
Do we really need a global impact ranking?
Sixteen years ago there was just one international university ranking, the Shanghai Academic Ranking of World Universities (ARWU). Since then rankings have proliferated. We have world rankings, regional rankings, subject rankings, business school rankings, young university rankings, employability rankings, systems rankings, and best student cities.
As if this wasn't enough, there is now a "global impact" ranking published by Times Higher Education (THE). This was announced with a big dose of breathless hyperbole, as though it was something revolutionary and unprecedented. Not quite. Before THE's ranking there was the GreenMetric ranking published by Universitas Indonesia, which measured universities' contribution to sustainability through indicators like water, waste, transportation, and education and research.
THE was doing something more specific and perhaps more ambitious, measuring adherence to the Sustainable Development Goals (SDGs) proclaimed by the UN. Universities could submit data on eleven of the seventeen goals, and a minimum of four were counted for the overall rankings, with one, partnership for the goals, being mandatory.
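The selection rule just described can be sketched in a few lines: the mandatory "partnership for the goals" score always counts, plus the university's three best scores among the other submitted SDGs. The weights used here (22% for the mandatory goal, 26% for each of the other three) follow the published 2019 scheme, but treat them as an assumption of this sketch rather than a description of THE's actual implementation.

```python
def impact_overall(sdg_scores, mandatory="SDG17"):
    """Illustrative sketch of the impact-ranking scoring rule:
    the mandatory SDG always counts, plus the three best scores
    among the other SDGs a university submitted.
    Weights are assumed (22% / 26% x 3), not taken from THE directly."""
    # Best three scores excluding the mandatory goal
    others = sorted(
        (score for goal, score in sdg_scores.items() if goal != mandatory),
        reverse=True,
    )[:3]
    return 0.22 * sdg_scores[mandatory] + sum(0.26 * s for s in others)

# A university submitting five goals: SDG5 (the weakest non-mandatory
# score) is dropped, SDG3, SDG4 and SDG6... no, SDG3/SDG4/SDG5 are kept.
scores = {"SDG17": 50, "SDG3": 80, "SDG4": 70, "SDG5": 60, "SDG6": 40}
print(impact_overall(scores))  # 0.22*50 + 0.26*(80+70+60) = 65.6
```

One consequence of picking each university's three best goals, as critics of the methodology have noted, is that institutions are compared on different baskets of indicators.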
The two rankings have attracted different respondents so perhaps they are complementary rather than competitive. The GreenMetric rankings include 66 universities from Indonesia, 18 from Malaysia and 61 from the USA compared to 7, 9 and 31 in the THE impact rankings. On the other hand, the THE rankings have a lot more universities from Australia and the UK. It is noticeable that China is almost entirely absent from both (2 universities in GreenMetric and 3 in THE's).
But is there really any point in a global impact ranking? Some universities in the West seem to be doing a fairly decent job of producing research in the natural sciences, although no doubt much of it is mediocre or worse, and there is also a lot of politically correct nonsense being produced in the humanities and social sciences. They have been far less successful in teaching undergraduates and providing them with the skills required by employers and professional and graduate schools. It is surely debatable whether universities should be concerned about the UN sustainable development goals before they have figured out how to fulfil their teaching mission.
Similarly, rankers have become quite adept at measuring and comparing research output and quality. There are several technically competent rankings which look at research from different viewpoints: the Shanghai ARWU, which counts long-dead Nobel and Fields laureates; the National Taiwan University ranking, which counts publications over an eleven-year period; Scimago, which includes patents; URAP, with 2,500 institutions; and the US News Best Global Universities, which includes books and conferences.
The THE world ranking is probably the least useful of the research-dominant rankings. It gives a 30% weighting to research, which is assessed by three indicators: reputation, publications per staff, and research income per staff. An improvement in the score for research could result from an improved reputation for research, a reduction in the number of academic staff, an increase in the number of publications, an increase in research funding, or a combination of some or all of these. Students and stakeholders who want to know exactly why the research prowess of a university is rising or falling will not find THE very helpful.
The THE world and regional rankings also have a citations indicator derived from normalised citation impact. Citations are benchmarked against documents in 300+ fields, five document types and five years of publications. Further, citations to documents with fewer than a thousand authors are not fractionalised. Further again, self-citations are allowed. And again, there is a regional modification or country bonus applied to half of the indicator, dividing a university's impact score by the square root of the score of the country in which it is located. This means that every university except those in the country with the highest score goes up, some a bit and some a lot.
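The arithmetic of the country bonus is worth spelling out, because it explains why universities in low-scoring countries gain so much. The sketch below assumes country scores on a 0-1 scale and applies the modification to half the indicator, as described above; it is an illustration of the mechanism, not THE's actual code.

```python
import math

def adjusted_citation_score(university_score, country_score):
    """Rough sketch of THE's regional modification: half the citations
    indicator is left as-is, and half is divided by the square root of
    the country's overall citation score (assumed here to lie in (0, 1]).
    The lower the country score, the bigger the bonus."""
    bonus_half = university_score / math.sqrt(country_score)
    return 0.5 * university_score + 0.5 * bonus_half

# A university with a raw score of 40 in a country scoring 0.25:
# half of 40, plus half of 40/sqrt(0.25) = 40/0.5 = 80, giving 60.
print(adjusted_citation_score(40, 0.25))  # 60.0
# The same university in the top-scoring country (1.0) gets no boost.
print(adjusted_citation_score(40, 1.0))   # 40.0
```

The square root softens the effect, but a university in a country with a very low national citation score can still see a modest raw score inflated well past that of peers in high-performing systems.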
The result of all this is a bit of a mess. Over the last few years we have seen institutions rise to glory at the top of the citations indicator that should never have been there, usually because they have succeeded in combining a small number of publications with participation in a mega-project with hundreds of authors and affiliated universities and thousands of citations. Top universities for research impact in the 2018-19 world rankings include Babol Noshirvani University of Technology, the University of Reykjavik, the Brighton and Sussex Medical School and Anglia Ruskin University.
There is something disturbing about university leaders queuing up to bask in the approval of an organisation that seems to think that Babol Noshirvani University of Technology has a greater research influence than anywhere else in the world. The idea that a ranking organisation that cannot publish a plausible list of influential research universities should have the nerve to start talking about measuring global impact is quite surprising.
Most rankers have done better at evaluating research than THE. At least they have not produced indicators as ridiculous as the normalised citations indicator. Teaching, especially undergraduate teaching, is another matter. Attempts to capture the quality of university teaching have been far from successful. Rankers have tried to measure inputs such as income or faculty resources or have conducted surveys but these are at best very indirect indicators. It seems strange that they should now turn their attention to various third missions.
Of course, research and teaching are not the only things that universities do. But until international ranking organisations have worked out how to effectively compare universities for the quality of learning and teaching or graduate employability, it seems premature to start trying to measure anything else.
It is likely though that many universities will welcome the latest THE initiative. Many Western universities faced with declining standards and funding and competition from the East will welcome the opportunity to find something where they can get high scores that will help with branding and promotion.
Where is the real educational capital of the world?
Here is another example of how rankings, especially those produced by Times Higher Education (THE), are used to mislead the public.
The London Post has announced that London is the Higher Educational Capital of the World for 2019. Support for this claim is provided by four London universities appearing in the top 40 of the THE World University Rankings, a verdict that has, unsurprisingly, been welcomed by London Mayor Sadiq Khan.
In addition, THE has Oxford and Cambridge as first and second in the world in their overall rankings and QS has declared London to be the Best Student City.
THE is not the only global ranking. There are now several others, and none of them have Oxford in first place. Most of them give the top spot to Harvard, although in the QS world rankings it is MIT and in the GreenMetric rankings Wageningen.
Also, if we look at the number of universities in the top 50 of the Shanghai rankings we cannot see London as the undisputed HE capital of the world. Using this simple criterion it would be New York with three, Columbia, New York University and Rockefeller.
Then come Boston, Paris, Chicago and London with two each.