Rankings are everywhere. Like a cleverly constructed virus, they are all over the place and almost impossible to eradicate. They are used for immigration policy, advertising, promotion, and recruitment. Here is the latest example.
A tweet from Eduardo Urias, noted by Stephen Curry, reported that an advertisement for an assistant professorship at Maastricht University included the requirement that candidates "should clearly state the (THE, QS, of FT business school) ranking of the university of their highest degree."
The sentence has since been removed, but one wonders why the relevant committee at Maastricht could not be trusted to look up university ranks for itself, and why it should ask about those specific rankings, which might not be the most relevant or accurate. Maastricht is a very good university, especially for the social sciences (I knew that anyway, and I checked with the Leiden Ranking), so why should it need to take rankings into account instead of looking at applicants' graduate school records and publications?
Even though that sentence was removed, this one remains.
"Maastricht University is currently ranked fifth in the top of Young Universities under 50 years."
Monday, June 17, 2019
Are Malaysian Universities Going Backwards?
International university rankings have become very popular in Malaysia, perhaps obsessively so. There is also a lot of commentary in the media, usually not very well informed.
Are Malaysian universities going backwards?
Murray Hunter writing in Eurasia Review thinks so. His claim is supported entirely by their relatively poor performance in the Times Higher Education (THE) world and Asian university rankings.
(By the way, Hunter refers to "THES", but the name changed to THE several years ago.)
Hunter apparently is one of those who are unaware of the variety and complexity of the current international university ranking scene. The IREG international inventory lists 45 rankings and is already in need of updating. Many of these cover more institutions than THE, some are much more technically competent, and some include more indicators.
THE's ranking is not the only one available, and it is not very helpful for any institution seeking to make genuine improvements. It bundles eleven indicators into groups so that it is very difficult to work out exactly what contributed to a deterioration or an improvement. The two metrics that stand alone have produced some amusing but questionable results: Babol Noshirvani University of Technology first for research impact, Anadolu University for industry income.
It really is no disgrace to do badly in these rankings.
Hunter's article is a mirror image of the excitement in the Malaysian media about the rise of Malaysian universities in the QS rankings, which seems to be largely the result of massive Malaysian participation in the QS academic survey, an indicator with a disproportionate weighting of 40%.
Malaysian universities have been celebrating their rise in the QS world rankings for some time. That is perhaps a bit more reasonable than getting excited about the THE rankings but still not very helpful.
We need to use a broad range of rankings. For a start, take a look at the Leiden Ranking for the quantity and quality of research. For total publications, Universiti Malaya (UM) has risen from 509th place in 2006-09 to 112th in 2014-17.
For the percentage of publications among the top 1% most cited, the most selective indicator, its rank has risen from 824th in 2006-09 to 221st in 2014-17.
Turning to the Moscow-based Round University Rankings for a more general assessment, we find that UM has risen from 269th in 2016 to 156th in 2019 (76th for teaching).
Malaysian universities, at least the best known ones, are making significant and substantial progress in stable and reliable global rankings.
At the end of the article Hunter says that "(t)he fact that Universiti Tunku Abdul Rahman (UTAR) has run into second best Malaysian University in less than 20 years of existence as a university is telling about the plight of Malaysian public universities."
Actually, it says nothing except that THE has a flawed methodology for counting citations. UTAR's performance in the THE rankings is the result of one talented researcher working on the Global Burden of Disease project, limited research output, a bonus for location in a country with a modest impact score, and a refusal to use fractional counting.
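To see why fractional counting matters, consider a toy example. Under whole counting every affiliated institution gets full credit for a mega-paper's citations, which can swamp the average of a university with little other output; fractional counting divides the credit among the contributing institutions. The sketch below uses invented numbers and illustrates only the principle, not THE's actual pipeline.

```python
# Toy illustration (invented numbers): how one mega-paper with many
# affiliations inflates a small university's average citation impact
# under whole counting, and how fractional counting damps the effect.

def average_impact(papers, fractional=False):
    """papers: list of (citations, n_affiliations) tuples for one university."""
    credits = [
        citations / n_affiliations if fractional else citations
        for citations, n_affiliations in papers
    ]
    return sum(credits) / len(credits)

# Three ordinary papers plus one Global-Burden-of-Disease-style
# mega-paper: thousands of citations shared across 500 institutions.
papers = [(3, 1), (5, 1), (2, 1), (4000, 500)]

print(average_impact(papers))                   # whole counting: 1002.5
print(average_impact(papers, fractional=True))  # fractional counting: 4.5
```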
Tuesday, May 14, 2019
Bangladeshi Universities Should Forget about Their Websites
Bangladesh has a lot of universities, but none of them has succeeded in getting into the list of 417 universities included in the THE Asian University Rankings. The country's performance is worse than anywhere else in South Asia: even Nepal and Sri Lanka have managed one university each.
Once again, it seems that the media and university administrators have been persuaded that there is only one set of international rankings, those produced by Times Higher Education (THE), and that the many others, which are documented in the IREG Inventory of International Rankings, do not exist.
The response of university bureaucrats shows a lack of awareness of current university rankings and their methodologies. Dhaka University's head claimed, in an interview with the Dhaka Tribune, that if the university had provided the necessary information on its website it would be in a "prestigious position". He apparently went on to say that the problem was that the website was not up to date and that a dean had been assigned to discuss the matter.
THE does not use data from university websites. It collects and processes information submitted by institutions, bibliometric data from the Scopus database and responses to surveys. It makes no difference to THE or other rankers whether the website was updated yesterday or a decade ago.
The Vice Chancellor of Shahjalal University of Science and Technology spoke about research papers not being published on websites or noted in annual reports. Again, this makes no difference to THE or anyone else.
He was, however, correct to note that bureaucratic restrictions on the admission of foreign students would reduce the scores in those rankings that count international students as an indicator.
Universities in Bangladesh need to do some background research into the current ranking scene before they attempt to get ranked. They should be aware of the rapidly growing number of rankings. THE is not the only international ranking and it is probably unsuitable for universities in countries like Bangladesh that do not have very much income or established reputations and are unable to participate in citation-rich global projects.
They should look at rankings with a more appropriate methodology. Dhaka University, for example, is currently ranked 504th among universities in the Scimago Institutions Rankings, which include patents, altmetrics, and web size as well as research.
Bangladeshi universities should first review the current rankings, note their procedures and requirements, and consider the resources available to collect and submit data.
It would probably be a good idea to focus on Scimago and the research-focused URAP rankings. If universities want to try for a research-plus-teaching ranking that requires institutions to submit data, then it would be better to contact the Global Institutional Profiles Project to get into the Round University Rankings, or QS, with the objective of leveraging their local reputations with academics and employers.
Saturday, April 13, 2019
Do we really need a global impact ranking?
Sixteen years ago there was just one international university ranking, the Shanghai Academic Ranking of World Universities (ARWU). Since then rankings have proliferated. We have world rankings, regional rankings, subject rankings, business school rankings, young university rankings, employability rankings, systems rankings, and best student cities.
As if this wasn't enough, there is now a "global impact" ranking published by Times Higher Education (THE). This was announced with a big dose of breathless hyperbole, as though it were something revolutionary and unprecedented. Not quite. Before THE's ranking there was the GreenMetric ranking published by Universitas Indonesia, which measured universities' contribution to sustainability through indicators like water, waste, transportation, and education and research.
THE was doing something more specific and perhaps more ambitious, measuring adherence to the Sustainable Development Goals proclaimed by the UN. Universities could submit data about eleven of the seventeen goals, and a minimum of four were counted for the overall rankings, with one, partnership for the goals, being mandatory.
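As a sketch of how an overall score can be assembled under such rules: the mandatory partnership goal always counts, and the university's three best remaining goals are added to it. The equal weighting below is an assumption for illustration; THE's published weights differ.

```python
def impact_overall(sdg_scores, mandatory="SDG17"):
    """Combine submitted SDG scores (0-100) into an overall score.

    The mandatory partnership goal (SDG17) always counts; the best
    three of the other submitted goals are added. Equal weighting is
    an illustrative assumption, not THE's actual weighting scheme.
    """
    others = sorted(
        (score for goal, score in sdg_scores.items() if goal != mandatory),
        reverse=True,
    )[:3]
    counted = [sdg_scores[mandatory]] + others
    return sum(counted) / len(counted)

# A university submitting five of the seventeen goals:
scores = {"SDG17": 70, "SDG3": 85, "SDG4": 60, "SDG5": 90, "SDG11": 40}
print(impact_overall(scores))  # (70 + 90 + 85 + 60) / 4 = 76.25
```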
The two rankings have attracted different respondents so perhaps they are complementary rather than competitive. The GreenMetric rankings include 66 universities from Indonesia, 18 from Malaysia and 61 from the USA compared to 7, 9 and 31 in the THE impact rankings. On the other hand, the THE rankings have a lot more universities from Australia and the UK. It is noticeable that China is almost entirely absent from both (2 universities in GreenMetric and 3 in THE's).
But is there really any point in a global impact ranking? Some universities in the West seem to be doing a fairly decent job of producing research in the natural sciences, although no doubt much of it is mediocre or worse, and there is also a lot of politically correct nonsense being produced in the humanities and social sciences. They have been far less successful in teaching undergraduates and providing them with the skills required by employers and professional and graduate schools. It is surely debatable whether universities should be concerned about the UN sustainable development goals before they have figured out how to fulfill their teaching mission.
Similarly, rankers have become quite adept at measuring and comparing research output and quality. There are several technically competent rankings which look at research from different viewpoints: the Shanghai ARWU, which counts long-dead Nobel and Fields laureates; the National Taiwan University ranking, which counts publications over an eleven-year period; Scimago, which includes patents; URAP, with 2,500 institutions; and the US News Best Global Universities, which includes books and conferences.
The THE world ranking is probably the least useful of the research-dominant rankings. It gives a 30% weighting to research, which is assessed by three indicators: reputation, publications per staff, and research income per staff. An improvement in the score for research could result from an improved reputation for research, a reduction in the number of academic staff, an increase in the number of publications, an increase in research funding, or a combination of some or all of these. Students and stakeholders who want to know exactly why the research prowess of a university is rising or falling will not find THE very helpful.
The THE world and regional rankings also have a citations indicator derived from normalised citation impact. Citations are benchmarked against documents in 300+ fields, five document types, and five years of publication. Further, citations to documents with fewer than a thousand authors are not fractionalised. Further again, self-citations are allowed. And again, there is a regional modification or country bonus applied to half of the indicator, dividing a university's impact score by the square root of the score of the country in which it is located. This means that every university except those in the country with the highest score goes up, some a bit and some a lot.
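The arithmetic of that country bonus is straightforward. Here is a minimal sketch, assuming both the university's citation score and the country score have been normalised to the 0-1 range; the numbers are illustrative, not THE's published data.

```python
from math import sqrt

def regional_modification(raw_score, country_score):
    """Blend the raw citation score with a country-adjusted version.

    raw_score: a university's normalised citation score (assumed 0-1).
    country_score: its country's aggregate citation score (assumed 0-1).
    Half the indicator is left unadjusted; the other half is divided by
    the square root of the country score, which boosts universities in
    low-scoring countries and leaves the top country unchanged.
    """
    return 0.5 * raw_score + 0.5 * (raw_score / sqrt(country_score))

print(regional_modification(0.40, 0.25))  # 0.5*0.40 + 0.5*0.80 = 0.60
print(regional_modification(0.40, 1.00))  # no bonus in the top country: 0.40
```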
The result of all this is a bit of a mess. Over the last few years we have seen institutions that should never have been there rise to glory at the top of the citations indicator, usually because they have succeeded in combining a small number of publications with participation in a mega-project with hundreds of authors and affiliated universities and thousands of citations. Top universities for research impact in the 2018-19 world rankings include Babol Noshirvani University of Technology, the University of Reykjavik, the Brighton and Sussex Medical School, and Anglia Ruskin University.
There is something disturbing about university leaders queuing up to bask in the approval of an organisation that seems to think that Babol Noshirvani University of Technology has a greater research influence than anywhere else in the world. That a ranking organisation that cannot publish a plausible list of influential research universities should have the nerve to start talking about measuring global impact is quite surprising.
Most rankers have done better at evaluating research than THE. At least they have not produced indicators as ridiculous as the normalised citations indicator. Teaching, especially undergraduate teaching, is another matter. Attempts to capture the quality of university teaching have been far from successful. Rankers have tried to measure inputs such as income or faculty resources or have conducted surveys but these are at best very indirect indicators. It seems strange that they should now turn their attention to various third missions.
Of course, research and teaching are not the only things that universities do. But until international ranking organisations have worked out how to effectively compare universities for the quality of learning and teaching or graduate employability, it seems premature to start trying to measure anything else.
It is likely, though, that many universities will welcome the latest THE initiative. Western universities faced with declining standards and funding and with competition from the East will welcome the opportunity to find something where they can get high scores that will help with branding and promotion.
Where is the real educational capital of the world?
Here is another example of how rankings, especially those produced by Times Higher Education (THE), are used to mislead the public.
The London Post has announced that London is the Higher Educational Capital of the World for 2019. Support for this claim is provided by four London universities appearing in the top 40 of the THE World University Rankings, a result which, unsurprisingly, has been welcomed by London Mayor Sadiq Khan.
In addition, THE has Oxford and Cambridge as first and second in the world in their overall rankings and QS has declared London to be the Best Student City.
THE is not the only global ranking. There are now several others and none of them have Oxford in first place. Most of them give the top spot to Harvard, although in the QS world rankings it is MIT and in the GreenMetric rankings Wageningen.
Also, if we look at the number of universities in the top 50 of the Shanghai rankings, we cannot see London as the undisputed HE capital of the world. Using this simple criterion it would be New York with three: Columbia, New York University and Rockefeller.
Then come Boston, Paris, Chicago and London with two each.
Saturday, April 06, 2019
Resources alone may not be enough
Universitas 21 has just published its annual ranking of higher education systems. There are four criteria, each containing several metrics: resources, connectivity, environment, and output.
The ranking has received a reasonable amount of media coverage although not as much as THE or QS.
A comparison of the ranks for the Resources indicator, comprising five measures of expenditure, and for Output, which includes research, citations, performance on rankings, graduation rates and enrolments, produces some interesting insights. There are countries such as Denmark and Switzerland that do well for both. China, Israel and some European countries seem to be very good at getting a high output from the resources available. There are others, including Turkey, Brazil, Saudi Arabia, and Malaysia, that appear to have adequate or more than adequate resources but whose rank for output is not so high.
These are of course limited indicators and it could perhaps just be a matter of time before the resources produce the desired results. The time for panic or celebration may not have arrived yet. Even so, it does seem that some countries or cultures are able to make better use of their resources than others.
The table below orders countries according to the difference between their ranks for resources and for output. Ireland is 20 places higher for output than it is for resources. India is seven places lower.
The relatively poor performance for Singapore is surprising given that country's reputation for all round excellence. Possibly there is a point where expenditure on higher education runs into diminishing or even negative returns.
Country and rank difference (rank for resources minus rank for output; positive means a better rank for output):

China +20
Ireland +20
Russia +18
Greece +16
Hungary +14
Italy +14
UK +11
Israel +10
Slovenia +10
South Korea +10
Australia +8
USA +8
Spain +7
Taiwan +4
Bulgaria +3
Germany +3
Iran +3
Netherlands +3
Japan +3
Czech Republic +2
Belgium +1
Croatia +1
Romania +1
Thailand +1
Finland 0
France 0
Indonesia 0
New Zealand 0
Canada -1
Denmark -1
Portugal -1
Argentina -2
Norway -2
Poland -2
South Africa -2
Switzerland -2
Hong Kong -4
Ukraine -5
Sweden -5
India -7
Chile -9
Singapore -9
Austria -11
Mexico -13
Serbia -13
Slovakia -14
Turkey -14
Brazil -16
Saudi Arabia -25
Malaysia -28
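The computation behind the table is a simple rank subtraction. A minimal sketch follows; the resource and output ranks are invented so that the differences match four of the table's entries, since the underlying U21 ranks are not reproduced here.

```python
# Sketch of the rank-difference computation behind the table above.
# The ranks are invented for illustration; only the differences are
# chosen to match the table's entries for these four countries.
resources_rank = {"Ireland": 28, "Denmark": 3, "India": 43, "Malaysia": 15}
output_rank = {"Ireland": 8, "Denmark": 4, "India": 50, "Malaysia": 43}

differences = {
    country: resources_rank[country] - output_rank[country]
    for country in resources_rank
}

# Order from over-performers to under-performers relative to resources.
for country, diff in sorted(differences.items(), key=lambda kv: -kv[1]):
    print(f"{country:10s} {diff:+d}")
# Ireland +20, Denmark -1, India -7, Malaysia -28
```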
Thursday, April 04, 2019
What to do to get into the rankings?
I have been asked this question quite a few times. So finally here is an attempt to answer it.
If you represent a university that is not listed in any rankings, except uniRank and Webometrics, but you want to be, what should you do?
Where are you now?
The first thing to do is to find out where you are in the global hierarchy of universities.
Here the Webometrics rankings are very helpful. These are now a mixture of web activity and research indicators and provide a rank for over 28,000 universities or places that might be considered universities, colleges, or academies of some sort.
If you are ranked in the bottom half of Webometrics then frankly it would be better to concentrate on not going bankrupt and getting or staying accredited.
But if you are in the top 10,000 or so then you might be able to think about getting somewhere in some sort of ranking.
Where do you want to be?
Nearly everybody in higher education who is not hibernating has heard of the Times Higher Education (THE) world and regional rankings. Some also know about the Quacquarelli Symonds (QS) or the Shanghai rankings. But there are now many more rankings that are just as good as, or in some cases better than, the "big three".
According to the IREG inventory published last year there are now at least 45 international university rankings, of which 17 are global, the rest covering business schools, subjects, systems and regions, and there will be more to come. The inventory provides links and some basic preliminary information about all the rankings but it already needs updating.
The methodology and public visibility of the global rankings varies enormously. So, first you have to decide what sort of university you are and what you want to be. You also need to think about exactly what you want from a ranking, whether it is fuel for the publicity machine or an accurate and valid assessment of research performance.
If you want to be a small, high-quality, research-led institution with lavish public and private funding, something like Caltech, then the THE world rankings would probably be appropriate. They measure income in three different ways, no matter how wastefully it is spent, and most of the indicators are scaled according to the number of staff or students. They also have a citations indicator which favours research-intensive institutions like Stanford or MIT along with some improbable places like Babol Noshirvani University of Technology, Brighton and Sussex Medical School or Reykjavik University.
If, however, your goal is to be a large comprehensive research and teaching university then the QS or the Russia-based Round University Rankings might be a better choice. The latter has all but one of the THE rankings' metrics, plus another eight, all with sensible weightings.
If you are a research postgraduate-only university then you would not be eligible for the overall rankings produced by QS or THE but you could be included in the Shanghai Rankings.
Data Submission
Most rankings rely on publicly accessible information. However, these global rankings include information submitted by the ranked institutions: QS World Rankings, THE World Rankings, Round University Rankings, US News Best Global Universities, U-Multirank, and UI GreenMetric. Collecting, verifying and submitting data can be a very tiresome task, so it would be well to consider whether there are sufficient informed and conscientious staff available. U-Multirank is especially demanding in the amount and quality of data required.
List of Global Rankings
Here is the list of the 17 global rankings included in the IREG inventory with comments about the kind of university that is likely to do well in them.
CWTS Leiden Ranking
This is a research-only ranking by a group of bibliometric experts at Leiden University. There are several indicators, starting with the total number of publications, headed by Harvard followed by the University of Toronto, and ending with the percentage of publications among the top 1% most cited, headed by Rockefeller University.
CWUR World University Rankings
Now produced out of UAE, this is an unusual and not well-known ranking that attempts to measure alumni employment and the quality of education and faculty. At the top it generally resembles more conventional rankings.
Emerging/Trendence Global University Employability Rankings
Published in but not produced by THE, these are based on a survey of employers in selected countries and rank only 150 universities.
Nature Index
A research ranking based on a very select group of journals. It also includes non-university institutions; the current leader is the Chinese Academy of Sciences. This ranking is relevant only for universities aiming for the very top levels of research in the natural sciences.
National Taiwan University Rankings
A research ranking of current publications and citations and those over an eleven-year period. It favours big universities, with the current top ten including the University of Toronto and the University of Michigan.
QS World University Rankings
If you are confident of building a local reputation then this is the ranking for you. There is a 40% weighting for academic reputation and 10% for employer reputation. Southeast Asian universities often do well in this ranking.
Webometrics
This now has two measures of web activity, one of citations and one of publications. It measures quantity rather than quality so there is a chance here for mass market institutions to excel.
Reuters Top 100 Innovative Universities
This is definitely for the world's technological elite.
Round University Rankings
These rankings combine survey and institutional data from Clarivate's Global Institutional Profiles Project with bibliometric data from the Web of Science Core Collection. They are the most balanced and comprehensive of the general world rankings, although hardly known outside Russia.
Scimago Institution Rankings
These combine indicators of research, innovation (measured by patents), and web activity. They tend to favour larger universities that are strong in technology.
Shanghai Academic Ranking of World Universities (ARWU)
These are the oldest of the global rankings with a simple and stable methodology. They are definitely biased towards large, rich, old research universities with strengths in the natural sciences and a long history of scientific achievement.
THE World University Rankings
The most famous of the international rankings, they claim to be sophisticated, rigorous, trusted and so on, but are biased towards UK universities. The citations indicator is hopelessly and amusingly flawed. There are a number of spin-offs that might be of interest to non-elite universities, such as regional, reputation, young university and, now, global impact rankings.
U-Multirank
Contains masses of information about things that other rankings neglect but would be helpful mainly to universities looking for students from Europe.
UI GreenMetric Ranking
This is published by Universitas Indonesia and measures universities' contribution to environmental sustainability. Includes a lot of Southeast Asian universities but not many from North America. Useful for eco-conscious universities.
uniRank University Ranking
This is based on web popularity derived from several sources. In many parts of Africa it serves as a measure of general quality.
University Ranking by Academic Performance
A research ranking produced by the Middle East Technical University in Ankara that ranks 2,500 universities. It is little known outside Turkey but I noticed recently that it was used in a presentation at a conference in Malaysia.
US News Best Global Universities
Sometimes counted as one of the big four but hardly ever the big three, this is a balanced research ranking that includes 1,250 universities. For American universities it is a useful complement to US News' America's Best Colleges.
You will have to decide whether to take a short-term approach to rankings, recruiting staff from the Highly Cited Researchers list, admitting international students regardless of ability, sending papers to marginal journals and conferences, and signing up for citation-rich mega-projects, or to concentrate on the underlying attributes of an excellent university: admitting students and appointing and promoting faculty for their cognitive skills and academic ability, encouraging genuine and productive collaboration, and nurturing local talent.
The first may produce quick results or nice bonuses for administrators but it can leave universities at the mercy of the methodological tweaking of the rankers, as Turkish universities found out in 2015.
The second will take years or decades to make a difference, and unfortunately that may be too long for journalists and policy makers.
Friday, March 29, 2019
The THE-QS duopoly
Strolling around the exhibition hall at the APAIE conference in Kuala Lumpur, I gathered a pile of promotional literature from various universities.
As expected, a lot of this referred to international university rankings. Here are some examples.
Ritsumeikan Asia Pacific University, Japan: THE Japan University Rankings 21st, QS World Rankings Asia 100% for internationalisation.
Yonsei University, Korea: QS Asia University Rankings 19th
Hanyang University, Korea: QS, Reuters Asia's Innovative 100 universities
Sabanci University, Turkey: THE
University of Malaya: QS world rankings 87th
Hasanuddin University: QS Asian Rankings, Webometrics
Keio University, Japan: QS, THE, Asia Innovative Universities
Novosibirsk State University, Russia: QS World, EECA and subject rankings
China University of Geosciences: US News Best Global Universities
Mahidol University, Thailand: US News, GreenMetric, THE, National Taiwan University, uniRank, URAP, and QS
The QS-THE duopoly seems to be holding up fairly well, but there are signs that some universities are exploring other international rankings.
Also, in a report on Malaysia, Prof Nordin Yahaya of Universiti Teknologi Malaysia referred to URAP, produced by the Middle East Technical University, to measure the country's research performance.
Thursday, March 28, 2019
Global University Rankings and Southeast Asia
Paper presented at the Asia-Pacific Association for International Education, Kuala Lumpur, 26 March 2019
Background
Global rankings began in a small way in 2003 with the publication of the first edition of the Shanghai Rankings. These were quite simple, comprising six indicators that measured scientific research. Their purpose was to show how far Chinese universities had to go to reach world-class status. Public interest was limited, although some European universities were shocked to find how far they were behind English-speaking institutions.
Then came the Times Higher Education Supplement (THES) – Quacquarelli Symonds (QS) World University Rankings. Their methodology was very different from that of Shanghai, relying heavily on a survey of academic opinion. In most parts of the world interest was limited and the rankings received little respect, but Malaysia was different. The country's flagship university, Universiti Malaya (UM), reached the top one hundred, an achievement that was cause for great if brief celebration. That achievement was the result of an error on the part of the rankers, QS, and in 2005 UM crashed out of the top 100.
Current Ranking Scene
International rankings have made substantial progress over the last decade and a half. In 2003 there was one, Shanghai. Now, according to the IREG Inventory, there are 45 international rankings, of which 17 are global, plus subject, regional, system, business school and sub-rankings.

They cover a broad range of data that could be of interest to students, researchers, policy makers and other stakeholders. They include metrics like the number of faculty and students, income, patents, web activity, publications, books, conferences, reputation surveys, and contributions to environmental sustainability.
Rankings and Southeast Asia
For Malaysia the publication of the THES-QS rankings in 2004 was the beginning of years of interest, perhaps obsession, with the rankings. The country has devoted resources and support to gaining favourable places in the QS rankings.

Singapore has emphasised both the QS and THE rankings since that unpleasant divorce in 2009. It has hosted the THE academic summit and has performed well in nearly all rankings, especially the THE and QS world rankings.

A few universities in Thailand, Indonesia and the Philippines have been included at the lower levels of rankings such as those published by the University of Leiden, National Taiwan University, Scimago, THE and QS.

Other countries have shown less interest. Myanmar and Cambodia are included only in the Webometrics and uniRank rankings, which include thousands of places with the slightest pretension of being a university or college.
Inclusion and Performance
There is considerable variation in the inclusiveness of the rankings: there are five Southeast Asian universities in the Shanghai Rankings and 3,192 in Webometrics.

Among Southeast Asian countries Singapore is clearly the best performer, followed by Malaysia, while Myanmar is the worst.
Targets
The declaration of targets with regard to rankings is a common strategy across the world. Malaysia has a specific policy of getting universities into the QS rankings: four in the top 200, two in the top 100, and one in the top 25.

In Thailand the 20-year national strategy aims at getting at least five Thai universities into the top 100 of the world rankings.

Indonesia wants to get five specified universities into the QS top 500 by 2019 and a further six by 2024.
The Dangers of Rankings
The cruel reality is that we cannot escape rankings. If all the current rankings were banned and thrown into an Orwellian memory hole then we would simply revert to the informal and subjective rankings that prevailed before.

If we must have formal rankings then they should be as valid and accurate as possible, they should take account of the varying missions of universities and their size and clientele, and they should be as comprehensive as possible.
To ignore the data that rankings can provide is to seriously limit public awareness. At the moment Southeast Asian universities and governments seem interested mainly or only in the QS rankings, or perhaps the THE rankings.

To focus on any single ranking could be self-defeating. Take a look at Malaysia's position in the QS rankings. It is obvious that UM, Malaysia's leading university in most rankings, does very much better in the QS rankings than in every other ranking, except the GreenMetric rankings.

Why is this? The QS rankings allot a 40% weighting to a survey of academic opinion, supposedly about research, more than any other ranking. They allow universities to influence the composition of survey respondents, by submitting names or by alerting researchers to the sign-up facility where they can take part in the survey.

To their credit, QS have published the number of survey respondents by country. The largest number is from the USA, with almost as many from the UK. The third largest number of respondents is from Malaysia, more than China and India combined. Malaysian universities do much better in the academic survey than they do for citations.
It is problematical to present UM as a top-100 university. It has a good reputation among local and regional researchers but is not doing so well in the other metrics, especially research of the highest quality.

There is also a serious risk that performance in the QS rankings will prove precarious. Already countries like Russia, Colombia, Iraq, and Kazakhstan are increasing their representation in the QS survey. More will join them. The top Chinese universities are targeting the Shanghai rankings, but one day the second tier may try out for the QS rankings.

Also, any university that relies too much on the QS rankings could easily be a victim of methodological changes. QS has, with good reason, revamped its methodology several times, and this can easily affect the scores of universities through no fault or credit of their own. This may have happened again during the collection of data for this year's rankings. QS recently announced that universities can either submit names of potential respondents or alert researchers to the sign-up facility but not, as in previous years, both. Universities that have not responded to this change may well suffer a reduced score in the survey indicators.
If not QS, should another ranking be used for benchmarking and targets? Some observers claim that Asian universities should opt for the THE rankings, which are alleged to be more rigorous and sophisticated and are certainly more prestigious.

That would be a mistake. The value of the THE rankings, but not their price, is drastically reduced by their lack of transparency, so that it is impossible, for example, to tell whether a change in the score for research results from an increase in publications, a decline in the number of staff, an improved reputation or an increase in research income.

Then there is the THE citations indicator. This can only be described as bizarre and ridiculous.

Here are some of the universities that appeared in the top 50 of last year's citations indicator, which supposedly measures research influence or quality: Babol Noshirvani University of Technology, Brighton and Sussex Medical School, Reykjavik University, Anglia Ruskin University, Jordan University of Science and Technology, and Vita-Salute San Raffaele University.
Proposals
1. It is not a good idea to use any single ranking, but if one is to be used then it should be one that is methodologically stable and technically competent and does not emphasise a single indicator. For research, probably the best bet would be the Leiden Ranking. If a ranking is needed that includes metrics that might be related to teaching and learning then the Round University Rankings would be helpful.

2. Another approach would be to encourage universities to target more than one ranking.

3. A regional database should be created that would provide information about ranks and scores in all relevant rankings and data about faculty, students, income, publications, citations and so on.

4. Regional universities should work to develop measures of the effectiveness of teaching and learning.