I have been asked this question quite a few times. So finally here is an attempt to answer it.
If you represent a university that is not listed in any rankings, except uniRank and Webometrics, but you want to be, what should you do?
Where are you now?
The first thing to do is to find out where you are in the global hierarchy of universities.
Here the Webometrics rankings are very helpful. These are now a mixture of web activity and research indicators and provide a rank for over 28,000 universities or places that might be considered universities, colleges, or academies of some sort.
If you are ranked in the bottom half of Webometrics then frankly it would be better to concentrate on not going bankrupt and getting or staying accredited.
But if you are in the top 10,000 or so then you might be able to think about getting somewhere in some sort of ranking.
Where do you want to be?
Nearly everybody in higher education who is not hibernating has heard of the Times Higher Education (THE) world and regional rankings. Some also know about the Quacquarelli Symonds (QS) or the Shanghai rankings. But there are now many more rankings that are just as good as, or in some cases better than, the "big three".
According to the IREG inventory published last year there are now at least 45 international university rankings including business school, subject, system and regional rankings, of which 17 are global rankings, and there will be more to come. This inventory provides links and some basic preliminary information about all the rankings but it already needs updating.
The methodology and public visibility of the global rankings varies enormously. So, first you have to decide what sort of university you are and what you want to be. You also need to think about exactly what you want from a ranking, whether it is fuel for the publicity machine or an accurate and valid assessment of research performance.
If you want to be a small high quality research led institution with lavish public and private funding, something like Caltech, then the THE world rankings would probably be appropriate. They measure income three different ways, no matter how wastefully it is spent, and most of the indicators are scaled according to number of staff or students. They also have a citations indicator which favours research intensive institutions like Stanford or MIT along with some improbable places like Babol Noshirvani University of Technology, Brighton and Sussex Medical School or Reykjavik University.
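The effect of scaling by staff numbers can be shown with a toy sketch. This is not THE's actual formula, and all figures are invented; it only illustrates why per-capita normalisation flatters small, well-funded institutions over large comprehensive ones.

```python
# Invented figures: a small, richly funded institute versus a large
# comprehensive university. "income_m" is research income in millions.
universities = {
    "Small Rich Institute": {"income_m": 500, "staff": 100},
    "Big Comprehensive U": {"income_m": 2000, "staff": 1000},
}

# Absolute income favours the big university; income per staff member
# (the kind of scaling THE applies to most indicators) favours the
# small one.
for name, d in universities.items():
    per_staff = d["income_m"] / d["staff"]
    print(f"{name}: total {d['income_m']}m, per staff {per_staff:.1f}m")
```

The big university wins on the raw total (2000 vs 500) but loses on the scaled indicator (2.0 vs 5.0 per staff member), which is why a Caltech-like profile suits these rankings.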
If, however, your goal is to be a large comprehensive research and teaching university, then the QS or the Russia-based Round University Rankings might be a better choice. The latter includes all but one of the THE metrics plus another eight, all with sensible weightings.
If you are a research postgraduate-only university then you would not be eligible for the overall rankings produced by QS or THE but you could be included in the Shanghai Rankings.
Data Submission
Most rankings rely on publicly accessible information. However, these global rankings include information submitted by the ranked institutions: the QS world rankings, THE world rankings, Round University Rankings, US News Best Global Universities, U-Multirank, and UI GreenMetric. Collecting, verifying and submitting data can be a very tiresome task, so it would be well to consider whether there are sufficient informed and conscientious staff available. U-Multirank is especially demanding in the amount and quality of data required.
List of Global Rankings
Here is the list of the 17 global rankings included in the IREG inventory with comments about the kind of university that is likely to do well in them.
CWTS Leiden Ranking
This is a research only ranking by a group of bibliometric experts at Leiden University. There are several indicators starting with the total number of publications, headed by Harvard followed by the University of Toronto, and ending with the percentage of publications in the top 1% of journals, headed by Rockefeller University.
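The contrast between Leiden's size-dependent and size-independent indicators can be sketched as follows. The function and data are invented for illustration: P is a simple publication count, and the share indicator is the fraction of a university's papers above a (hypothetical) world top-1% citation threshold, a simplified stand-in for Leiden's PP(top 1%).

```python
def leiden_style_indicators(pub_citations, top1_threshold):
    """pub_citations: citation counts of one university's papers.
    top1_threshold: citation count marking the world's top 1%.
    Returns (P, share of papers at or above the threshold)."""
    p = len(pub_citations)
    top = sum(1 for c in pub_citations if c >= top1_threshold)
    return p, top / p if p else 0.0

# A Harvard-sized output wins on P; a small elite institution like
# Rockefeller can win on the top-1% share. Invented data:
big = [5, 8, 3] * 1000 + [400] * 30   # 3,030 papers, 30 highly cited
small = [50, 60, 500, 700]            # 4 papers, 2 highly cited

print(leiden_style_indicators(big, 300))    # high P, low share
print(leiden_style_indicators(small, 300))  # low P, share of 0.5
```

This is why the indicator you choose to report determines which kind of university comes out on top.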
CWUR World University Rankings
Now produced out of UAE, this is an unusual and not well-known ranking that attempts to measure alumni employment and the quality of education and faculty. At the top it generally resembles more conventional rankings.
Emerging/Trendence Global University Employability Rankings
Published in but not produced by THE, these are based on a survey of employers in selected countries and rank only 150 universities.
Nature Index
A research ranking based on a very select group of journals. It also includes non-university institutions; the current leader is the Chinese Academy of Sciences. This ranking is relevant only for universities aiming at the very top levels of research in the natural sciences.
National Taiwan University Rankings
A research ranking of current publications and citations and those over a period of eleven years. It favours big universities with the current top ten including the University of Toronto and the University of Michigan.
QS World University Rankings
If you are confident of building a local reputation then this is the ranking for you. There is a 40% weighting for academic reputation and 10% for employer reputation. Southeast Asian universities often do well in this ranking.
Webometrics
This now has two measures of web activity, one of citations and one of publications. It measures quantity rather than quality so there is a chance here for mass market institutions to excel.
Reuters Top 100 Innovative Universities
This is definitely for the world's technological elite.
Round University Rankings
These rankings combine survey and institutional data from Clarivate's Global Institutional Profiles Project with bibliometric data from the Web of Science Core Collection. They are the most balanced and comprehensive of the general world rankings, although hardly known outside Russia.
Scimago Institution Rankings
These combine indicators of research, innovation measured by patents and web activity. They tend to favour larger universities that are strong in technology.
Shanghai Academic Ranking of World Universities (ARWU)
These are the oldest of the global rankings with a simple and stable methodology. They are definitely biased towards large, rich, old research universities with strengths in the natural sciences and a long history of scientific achievement.
THE World University Rankings
The most famous of the international rankings, they claim to be sophisticated, rigorous, trusted etc but are biased towards UK universities. The citations indicator is hopelessly and amusingly flawed. There are a number of spin-offs that might be of interest to non-elite universities such as regional, reputation, young universities and, now, global impact rankings.
U-Multirank
Contains masses of information about things that other rankings neglect but would be helpful mainly to universities looking for students from Europe.
UI GreenMetric Ranking
This is published by Universitas Indonesia and measures universities' contribution to environmental sustainability. Includes a lot of Southeast Asian universities but not many from North America. Useful for eco-conscious universities.
uniRank University Ranking
This is based on web popularity derived from several sources. In many parts of Africa it serves as a measure of general quality.
University Ranking by Academic Performance
A research ranking produced by the Middle East Technical University in Ankara that ranks 2,500 universities. It is little known outside Turkey but I noticed recently that it was used in a presentation at a conference in Malaysia.
US News Best Global Universities
Sometimes counted as one of the big four but hardly ever the big three, this is a balanced research ranking that includes 1,250 universities. For American universities it is a useful complement to US News' America's Best Colleges.
You will have to decide whether to take a short-term approach to rankings: recruiting staff from the Highly Cited Researchers list, admitting international students regardless of ability, sending papers to marginal journals and conferences, and signing up for citation-rich mega-projects. Or you can concentrate on the underlying attributes of an excellent university: admitting students and appointing and promoting faculty for their cognitive skills and academic ability, encouraging genuine and productive collaboration, and nurturing local talent.
The first may produce quick results or nice bonuses for administrators but it can leave universities at the mercy of the methodological tweaking of the rankers, as Turkish universities found out in 2015.
The latter will take years or decades to make a difference and unfortunately that may be too long for journalists and policy makers.
Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Thursday, April 04, 2019
Friday, March 29, 2019
The THE-QS duopoly
Strolling around the exhibition hall at the APAIE conference in Kuala Lumpur, I gathered a pile of promotional literature from various universities.
As expected, a lot of this referred to international university rankings. Here are some examples.
Ritsumeikan Asia Pacific University, Japan: THE Japan University Rankings 21st, QS World Rankings Asia 100% for internationalisation.
Yonsei University, Korea: QS Asia University Rankings 19th
Hanyang University, Korea: QS, Reuters Asia's Innovative 100 universities
Sabancı University, Turkey: THE
University of Malaya: QS world rankings 87th
Hasanuddin University: QS Asian Rankings, Webometrics
Keio University, QS, THE, Asia Innovative Universities
Novosibirsk State University, Russia: QS World, EECA and subject rankings
China University of Geosciences: US News Best Global Universities.
Mahidol University, Thailand: US News, GreenMetric, THE, National Taiwan University, uniRank, URAP, and QS.
The QS-THE duopoly seems to be holding up fairly well but there are signs that some universities are exploring other international rankings.
Also, in a report on Malaysia, Prof Nordin Yahaya of Universiti Teknologi Malaysia referred to URAP, produced by the Middle East Technical University, to measure the country's research performance.
Thursday, March 28, 2019
Global University Rankings and Southeast Asia
Paper presented at the Asia-Pacific Association for International Education, Kuala Lumpur, 26 March 2019
Background
Global rankings began in a small way in 2003 with the publication of the first edition of the Shanghai Rankings. These were quite simple, comprising six indicators that measured scientific research. Their purpose was to show how far Chinese universities had to go to reach world-class status. Public interest was limited, although some European universities were shocked to find how far they were behind English-speaking institutions.
Then came the Times Higher Education Supplement (THES)–Quacquarelli Symonds (QS) World University Rankings. Their methodology was very different from that of Shanghai, relying heavily on a survey of academic opinion. In most parts of the world interest was limited and the rankings received little respect, but Malaysia was different. The country's flagship university, Universiti Malaya (UM), reached the top one hundred, an achievement that was cause for great if brief celebration. That achievement was the result of an error on the part of the rankers, QS, and in 2005 UM crashed out of the top 100.
Current Ranking Scene
International rankings have made substantial progress over the last decade and a half. In 2003 there was one, Shanghai. Now, according to the IREG Inventory, there are 45 international rankings, of which 17 are global, plus subject, regional, system, business school and sub-rankings.
They cover a broad range of data that could be of interest to students, researchers, policy makers and other stakeholders, including metrics such as numbers of faculty and students, income, patents, web activity, publications, books, conferences, reputation surveys, and contributions to environmental sustainability.
Rankings and Southeast Asia
For Malaysia the publication of the THES-QS rankings in 2004 was the beginning of years of interest, perhaps obsession, with the rankings. The country has devoted resources and support to gaining favourable places in the QS rankings.
Singapore has emphasised both the QS and THE rankings since their unpleasant divorce in 2009. It has hosted the THE academic summit and has performed well in nearly all rankings, especially the THE and QS world rankings.
A few universities in Thailand, Indonesia and the Philippines have been included at the lower levels of rankings such as those published by the University of Leiden, National Taiwan University, Scimago, THE and QS.
Other countries have shown less interest. Myanmar and Cambodia are included only in the Webometrics and uniRank rankings, which include thousands of places with the slightest pretension to being a university or college.
Inclusion and Performance
There is considerable variation in the inclusiveness of the rankings. There are five Southeast Asian universities in the Shanghai Rankings and 3,192 in Webometrics.
Among Southeast Asian countries Singapore is clearly the best performer, followed by Malaysia, while Myanmar is the worst.
Targets
The declaration of targets with regard to rankings is a common strategy across the world. Malaysia has a specific policy of getting universities into the QS rankings: four in the top 200, two in the top 100, and one in the top 25.
In Thailand the 20-year national strategy aims at getting at least five Thai universities into the top 100 of the world rankings.
Indonesia wants to get five specified universities into the QS top 500 by 2019 and a further six by 2024.
The Dangers of Rankings
The cruel reality is that we cannot escape rankings. If all the current rankings were banned and thrown into an Orwellian memory hole, we would simply revert to the informal and subjective rankings that prevailed before.
If we must have formal rankings, then they should be as valid and accurate as possible, they should take account of the varying missions, sizes and clienteles of universities, and they should be as comprehensive as possible.
To ignore the data that rankings can provide is to seriously limit public awareness. At the moment Southeast Asian universities and governments seem interested mainly or only in the QS rankings, or perhaps the THE rankings.
To focus on any single ranking could be self-defeating. Take a look at Malaysia's position in the QS rankings. It is obvious that UM, Malaysia's leading university in most rankings, does very much better in the QS rankings than in every other ranking, except the GreenMetric rankings.
Why is this? The QS rankings allot a 40% weighting to a survey of academic opinion, supposedly about research, more than any other ranking. They allow universities to influence the composition of survey respondents, by submitting names or by alerting researchers to the sign-up facility where they can take part in the survey.
To their credit, QS have published the number of survey respondents by country. The largest number is from the USA, with almost as many from the UK. The third largest number of respondents is from Malaysia, more than China and India combined. Malaysian universities do much better in the academic survey than they do for citations.
It is problematical to present UM as a top 100 university. It has a good reputation among local and regional researchers but is not doing so well on the other metrics, especially research of the highest quality.
There is also a serious risk that performance in the QS rankings is precarious. Already countries like Russia, Colombia, Iraq, and Kazakhstan are increasing their representation in the QS survey. More will join them. The top Chinese universities are targeting the Shanghai rankings, but one day the second tier may try out for the QS rankings.
Also, any university that relies too much on the QS rankings could easily be a victim of methodological changes. QS has, with good reason, revamped its methodology several times, and this can easily affect the scores of universities through no fault or credit of their own. This may have happened again during the collection of data for this year's rankings. QS recently announced that universities can either submit names of potential respondents or alert researchers to the sign-up facility, but not, as in previous years, both. Universities that have not responded to this change may well suffer a reduced score in the survey indicators.
If not QS, should another ranking be used for benchmarking and targets? Some observers claim that Asian universities should opt for the THE rankings, which are alleged to be more rigorous and sophisticated, and are certainly more prestigious.
That would be a mistake. The value of the THE rankings, but not their price, is drastically reduced by their lack of transparency, so that it is impossible, for example, to tell whether a change in the score for research results from an increase in publications, a decline in the number of staff, an improved reputation or an increase in research income.
Then there is the THE citations indicator, which can only be described as bizarre and ridiculous.
Here are some of the universities that appeared in the top 50 of last year's citations indicator, which supposedly measures research influence or quality: Babol Noshirvani University of Technology, Brighton and Sussex Medical School, Reykjavik University, Anglia Ruskin University, Jordan University of Science and Technology, and Vita-Salute San Raffaele University.
Proposals
1. It is not a good idea to use any single ranking, but if one is to be used then it should be one that is methodologically stable and technically competent and does not emphasise a single indicator. For research, probably the best bet would be the Leiden Ranking. If a ranking is needed that includes metrics that might be related to teaching and learning, then the Round University Rankings would be helpful.
2. Another approach would be to encourage universities to target more than one ranking.
3. A regional database should be created that would provide information about ranks and scores in all relevant rankings, along with data about faculty, students, income, publications, citations and so on.
4. Regional universities should work to develop measures of the effectiveness of teaching and learning.
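The regional database proposed above could start very simply. A minimal sketch, assuming an SQLite store; the table and column names are hypothetical, and the one sample entry uses UM's QS rank of 87 mentioned earlier (the score value is invented).

```python
import sqlite3

# In-memory database for illustration; a real deployment would use a file.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE university (
    id INTEGER PRIMARY KEY,
    name TEXT,
    country TEXT
);
CREATE TABLE ranking_entry (
    university_id INTEGER REFERENCES university(id),
    ranking TEXT,        -- e.g. 'QS', 'Leiden', 'URAP'
    year INTEGER,
    rank_position INTEGER,
    score REAL           -- overall score, where the ranking publishes one
);
""")

conn.execute("INSERT INTO university VALUES (1, 'Universiti Malaya', 'Malaysia')")
conn.execute("INSERT INTO ranking_entry VALUES (1, 'QS', 2019, 87, 61.3)")

# Look up every ranking position recorded for one university.
row = conn.execute(
    "SELECT ranking, rank_position FROM ranking_entry WHERE university_id = 1"
).fetchone()
print(row)  # ('QS', 87)
```

Holding all rankings side by side in one table is the point: divergences like UM's QS rank versus its positions elsewhere would be immediately visible.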