I have been asked this question quite a few times. So finally here is an attempt to answer it.
If you represent a university that is not listed in any rankings except uniRank and Webometrics, but you want it to be, what should you do?
Where are you now?
The first thing to do is to find out where you are in the global hierarchy of universities.
Here the Webometrics rankings are very helpful. These are now a mixture of web activity and research indicators and provide a rank for over 28,000 universities or places that might be considered universities, colleges, or academies of some sort.
If you are ranked in the bottom half of Webometrics then frankly it would be better to concentrate on not going bankrupt and getting or staying accredited.
But if you are in the top 10,000 or so then you might be able to think about getting somewhere in some sort of ranking.
Where do you want to be?
Nearly everybody in higher education who is not hibernating has heard of the Times Higher Education (THE) world and regional rankings. Some also know about the Quacquarelli Symonds (QS) or the Shanghai rankings. But there are now many more rankings that are just as good as, or in some cases better than, the "big three".
According to the IREG inventory published last year there are now at least 45 international university rankings including business school, subject, system and regional rankings, of which 17 are global rankings, and there will be more to come. This inventory provides links and some basic preliminary information about all the rankings but it already needs updating.
The methodology and public visibility of the global rankings varies enormously. So, first you have to decide what sort of university you are and what you want to be. You also need to think about exactly what you want from a ranking, whether it is fuel for the publicity machine or an accurate and valid assessment of research performance.
If you want to be a small high quality research led institution with lavish public and private funding, something like Caltech, then the THE world rankings would probably be appropriate. They measure income three different ways, no matter how wastefully it is spent, and most of the indicators are scaled according to number of staff or students. They also have a citations indicator which favours research intensive institutions like Stanford or MIT along with some improbable places like Babol Noshirvani University of Technology, Brighton and Sussex Medical School or Reykjavik University.
If, however, your goal is to be a large comprehensive research and teaching university then the QS or the Russia-based Round University Rankings might be a better choice. The latter has all the metrics of the THE rankings except one plus another eight, all with sensible weightings.
If you are a research postgraduate-only university then you would not be eligible for the overall rankings produced by QS or THE but you could be included in the Shanghai Rankings.
Data Submission
Most rankings rely on publicly accessible information. However, the following global rankings include information submitted by the ranked institutions: the QS World University Rankings, THE World University Rankings, Round University Rankings, US News Best Global Universities, U-Multirank, and UI GreenMetric. Collecting, verifying and submitting data can be a very tiresome task, so it would be wise to consider whether there are sufficient informed and conscientious staff available. U-Multirank is especially demanding in the amount and quality of data required.
List of Global Rankings
Here is the list of the 17 global rankings included in the IREG inventory with comments about the kind of university that is likely to do well in them.
CWTS Leiden Ranking
This is a research only ranking by a group of bibliometric experts at Leiden University. There are several indicators starting with the total number of publications, headed by Harvard followed by the University of Toronto, and ending with the percentage of publications in the top 1% of journals, headed by Rockefeller University.
CWUR World University Rankings
Now produced out of the UAE, this is an unusual and not well-known ranking that attempts to measure alumni employment and the quality of education and faculty. At the top it generally resembles more conventional rankings.
Emerging/Trendence Global University Employability Rankings
Published in but not produced by THE, these are based on a survey of employers in selected countries and rank only 150 universities.
Nature Index
A research ranking based on a very select group of journals. It also includes non-university institutions. The current leader is the Chinese Academy of Sciences. This ranking is relevant only for those universities aiming for the very top levels of research in the natural sciences.
National Taiwan University Rankings
A research ranking of current publications and citations and those over a period of eleven years. It favours big universities with the current top ten including the University of Toronto and the University of Michigan.
QS World University Rankings
If you are confident of building a local reputation then this is the ranking for you. There is a 40% weighting for academic reputation and 10% for employer reputation. Southeast Asian universities often do well in this ranking.
Webometrics
This now has two measures of web activity, one of citations and one of publications. It measures quantity rather than quality so there is a chance here for mass market institutions to excel.
Reuters Top 100 Innovative Universities
This is definitely for the world's technological elite.
Round University Rankings
These rankings combine survey and institutional data from Clarivate's Global Institutional Profiles Project and bibliometric data from the Web of Science Core Collection. They are the most balanced and comprehensive of the general world rankings, although hardly known outside Russia.
Scimago Institution Rankings
These combine indicators of research, innovation measured by patents and web activity. They tend to favour larger universities that are strong in technology.
Shanghai Academic Ranking of World Universities (ARWU)
These are the oldest of the global rankings with a simple and stable methodology. They are definitely biased towards large, rich, old research universities with strengths in the natural sciences and a long history of scientific achievement.
THE World University Rankings
The most famous of the international rankings, they claim to be sophisticated, rigorous, trusted and so on, but are biased towards UK universities. The citations indicator is hopelessly and amusingly flawed. There are a number of spin-offs that might be of interest to non-elite universities such as regional, reputation, young universities and, now, global impact rankings.
U-Multirank
Contains masses of information about things that other rankings neglect but would be helpful mainly to universities looking for students from Europe.
UI GreenMetric Ranking
This is published by Universitas Indonesia and measures universities' contribution to environmental sustainability. Includes a lot of Southeast Asian universities but not many from North America. Useful for eco-conscious universities.
uniRank University Ranking
This is based on web popularity derived from several sources. In many parts of Africa it serves as a measure of general quality.
University Ranking by Academic Performance
A research ranking produced by the Middle East Technical University in Ankara that ranks 2,500 universities. It is little known outside Turkey but I noticed recently that it was used in a presentation at a conference in Malaysia.
US News Best Global Universities
Sometimes counted as one of the big four but hardly ever the big three, this is a research ranking that is balanced and includes 1,250 universities. For American universities it is a useful complement to US News' America's Best Colleges.
You will have to decide whether to take a short-term approach to rankings, by recruiting staff from the Highly Cited Researchers list, admitting international students regardless of ability, sending papers to marginal journals and conferences, and signing up for citation-rich mega-projects, or to concentrate on the underlying attributes of an excellent university: admitting students and appointing and promoting faculty for their cognitive skills and academic ability, encouraging genuine and productive collaboration, and nurturing local talent.
The first may produce quick results or nice bonuses for administrators but it can leave universities at the mercy of the methodological tweaking of the rankers, as Turkish universities found out in 2015.
The latter will take years or decades to make a difference and unfortunately that may be too long for journalists and policy makers.
Friday, March 29, 2019
The THE-QS duopoly
Strolling around the exhibition hall at the APAIE conference in Kuala Lumpur, I gathered a pile of promotional literature from various universities.
As expected, a lot of this referred to international university rankings. Here are some examples.
Ritsumeikan Asia Pacific University, Japan: THE Japan University Rankings 21st, QS World Rankings Asia 100% for internationalisation.
Yonsei University, Korea: QS Asia University Rankings 19th
Hanyang University, Korea: QS, Reuters Asia's Innovative 100 universities
Sabancı University, Turkey: THE
University of Malaya: QS world rankings 87th
Hasanuddin University: QS Asian Rankings, Webometrics
Keio University, Japan: QS, THE, Asia Innovative Universities
Novosibirsk State University, Russia: QS World, EECA and subject rankings
China University of Geosciences: US News Best Global Universities.
Mahidol University, Thailand: cites US News, GreenMetric, THE, National Taiwan University, uniRank, URAP, and QS.
The QS-THE duopoly seems to be holding up fairly well, but there are signs that some universities are exploring other international rankings.
Also, in a report on Malaysia, Prof Nordin Yahaya of Universiti Teknologi Malaysia referred to URAP, produced by the Middle East Technical University, to measure the country's research performance.
Thursday, March 28, 2019
Global University Rankings and Southeast Asia
Paper presented at the Asia-Pacific Association for International Education, Kuala Lumpur, 26 March 2019
Background
Global rankings began in a small way in 2003 with the publication of the first edition of the Shanghai Rankings. These were quite simple, comprising six indicators that measured scientific research. Their purpose was to show how far Chinese universities had to go to reach world-class status. Public interest was limited, although some European universities were shocked to find how far they were behind English-speaking institutions.
Then came the Times Higher Education Supplement (THES) – Quacquarelli Symonds (QS) World University Rankings. Their methodology was very different from that of Shanghai, relying heavily on a survey of academic opinion. In most parts of the world interest was limited and the rankings received little respect, but Malaysia was different. The country's flagship university, Universiti Malaya (UM), reached the top one hundred, an achievement that was cause for great if brief celebration. That achievement was the result of an error on the part of the rankers, QS, and in 2005 UM crashed out of the top 100.
Current Ranking Scene
International rankings have made substantial progress over the last decade and a half. In 2003 there was one, Shanghai. Now, according to the IREG Inventory, there are 45 international rankings of which 17 are global, plus subject, regional, system, business school and sub-rankings.

They cover a broad range of data that could be of interest to students, researchers, policy makers and other stakeholders. They include metrics like number of faculty and students, income, patents, web activity, publications, books, conferences, reputation surveys, and contributions to environmental sustainability.
Rankings and Southeast Asia
For Malaysia the publication of the THES-QS rankings in 2004 was the beginning of years of interest, perhaps obsession, with the rankings. The country has devoted resources and support to gain favourable places in the QS rankings.

Singapore has emphasised both the QS and THE rankings since that unpleasant divorce in 2009. It has hosted the THE academic summit and has performed well in nearly all rankings, especially the THE and QS world rankings.
A few universities in Thailand, Indonesia and the Philippines have been included at the lower levels of rankings such as those published by the University of Leiden, National Taiwan University, Scimago, THE and QS.

Other countries have shown less interest. Myanmar and Cambodia are included only in the Webometrics and uniRank rankings, which include thousands of places with the slightest pretension of being a university or college.
Inclusion and Performance
There is considerable variation in the inclusiveness of the rankings. There are five Southeast Asian universities in the Shanghai Rankings and 3,192 in Webometrics.

Among Southeast Asian countries Singapore is clearly the best performer, followed by Malaysia, while Myanmar is the worst.
Targets
The declaration of targets with regard to rankings is a common strategy across the world. Malaysia has a specific policy of getting universities into the QS rankings: four in the top 200, two in the top 100 and one in the top 25.

In Thailand the 20-year national strategy aims at getting at least five Thai universities into the top 100 of the world rankings.

Indonesia wants to get five specified universities into the QS top 500 by 2019 and a further six by 2024.
The Dangers of Rankings
The cruel reality is that we cannot escape rankings. If all the current rankings were banned and thrown into an Orwellian memory hole, we would simply revert to the informal and subjective rankings that prevailed before.

If we must have formal rankings then they should be as valid and accurate as possible, they should take account of the varying missions of universities and their size and clientele, and they should be as comprehensive as possible.
To ignore the data that rankings can provide is to seriously limit public awareness. At the moment Southeast Asian universities and governments seem interested mainly or only in the QS rankings, or perhaps the THE rankings.
To focus on any single ranking could be self-defeating. Take a look at Malaysia's position in the QS rankings. It is obvious that UM, Malaysia's leading university in most rankings, does very much better in the QS rankings than in every other ranking except the GreenMetric rankings.
Why is this? The QS rankings allot a 40% weighting to a survey of academic opinion, supposedly about research, more than any other ranking. They allow universities to influence the composition of survey respondents, by submitting names or by alerting researchers to the sign-up facility where they can take part in the survey.
To their credit, QS have published the number of survey respondents by country. The largest number is from the USA, with almost as many from the UK. The third largest number of respondents is from Malaysia, more than China and India combined. Malaysian universities do much better in the academic survey than they do for citations.
It is problematic to present UM as a top 100 university. It has a good reputation among local and regional researchers but is not doing so well in the other metrics, especially research of the highest quality.
There is also a serious risk that performance in the QS rankings is precarious. Already countries like Russia, Colombia, Iraq, and Kazakhstan are increasing their representation in the QS survey. More will join them. The top Chinese universities are targeting the Shanghai rankings, but one day the second tier may try out for the QS rankings.
Also, any university that relies too much on the QS rankings could easily be a victim of methodological changes. QS has, with good reason, revamped its methodology several times and this can easily affect the scores of universities through no fault or credit of their own. This may have happened again during the collection of data for this year's rankings. QS recently announced that universities can either submit names of potential respondents or alert researchers to the sign-up facility but not, as in previous years, both. Universities that have not responded to this change may well suffer a reduced score in the survey indicators.
If not QS, should another ranking be used for benchmarking and targets? Some observers claim that Asian universities should opt for the THE rankings, which are alleged to be more rigorous and sophisticated and are certainly more prestigious.
That would be a mistake. The value of the THE rankings, but not their price, is drastically reduced by their lack of transparency, so that it is impossible, for example, to tell whether a change in the score for research results from an increase in publications, a decline in the number of staff, an improved reputation or an increase in research income.
Then there is the THE citations indicator. This can only be described as bizarre and ridiculous.
Here are some of the universities that appeared in the top 50 of last year's citations indicator, which supposedly measures research influence or quality: Babol Noshirvani University of Technology, Brighton and Sussex Medical School, Reykjavik University, Anglia Ruskin University, Jordan University of Science and Technology, and Vita-Salute San Raffaele University.
Proposals
1. It is not a good idea to use any single ranking, but if one is to be used then it should be one that is methodologically stable and technically competent and does not emphasise a single indicator. For research, probably the best bet would be the Leiden Ranking. If a ranking is needed that includes metrics that might be related to teaching and learning then the Round University Rankings would be helpful.
2. Another approach would be to encourage universities to target more than one ranking.
3. A regional database should be created that would provide information about ranks and scores in all relevant rankings and data about faculty, students, income, publications, citations and so on.
4. Regional universities should work to develop measures of the effectiveness of teaching and learning.
Wednesday, February 13, 2019
Rankings Uproar in Singapore
Singapore has been doing very well in the global university rankings lately. In the recent QS world rankings the National University of Singapore (NUS) and Nanyang Technological University (NTU) are 11th and 12th respectively and 23rd and 51st in the Times Higher Education (THE) rankings.
Their performance in other rankings is good but not so remarkable. In the Shanghai rankings NUS is 85th and NTU 96th. In the Round University Rankings, which combine teaching and research indicators, they are 50th and 73rd. In the CWUR rankings, which attempt to measure the quality of students and faculty, they are 103rd and 173rd. In the CWTS Leiden Ranking publications count, they are 34th and 66th.
The universities have performed well across a broad range of indicators and in nearly all of the international rankings. They do, however, perform much better in the QS and THE rankings than in others.
There are now at least 45 international rankings of various kinds, including global, regional, subject, system and business school rankings. See the IREG Inventory of International Rankings which already needs updating. Singapore usually ignores these except for the QS and the THE world and regional rankings. The NUS website proudly displays the ranks in the QS and THE rankings and mentions Reuters Top 75 Asian Innovative Universities but there is nothing about the others.
Singapore's success is the result of a genuine drive for excellence in research, powered by a lot of money and rigorous selection, but there also seems to be a careful and perhaps excessive cultivation of links with the UK rankers.
A few years ago, for example, World Scientific Publishing, which used to provide a database for the THES-QS academic survey, published a book co-authored by an NTU academic on how to succeed in the IELTS exam. It was entitled Top the IELTS: Opening the Gates to Top QS-ranked Universities.
There have been complaints that Singapore has become obsessed with rankings, that local researchers and teachers are marginalised, and that the humanities and social sciences are neglected.
An article in the Singapore magazine Today reported that there had been a high and damaging turnover of faculty in the humanities and social sciences. They were oppressed by demands for publications and citations, key performance indicators which were often changed without warning, an emphasis on the hard sciences and a damaging pursuit of glory in the rankings. Several faculty have departed and this has supposedly had an adverse effect on departments in the humanities and social sciences.
I sympathise with scholars outside the harder sciences who have to deal with bureaucrats unfamiliar with the publication and citation practices of various disciplines. I recall once attending an interview with an Asian university for a job teaching English where a panel of engineers and administrators wanted to know why my publications were so few. First on the list was a book of 214 pages, which would be the equivalent of 20 papers in the natural sciences. It was not co-authored, which would make it the equivalent in bulk of about eighty natural science papers with four authors apiece. Next was an article that was one of the most frequently cited papers in the field of genre analysis. But this was ignored. Numbers were the only thing that mattered.
But it seems that the two leading Singapore universities have not in fact neglected the disciplines outside the STEM subjects. In the QS arts and humanities ranking NUS is 18th in the world and NTU 61st. In THE's they are 32nd and in the 101-125 band.
It is also not entirely correct to suggest that the rankings generally discriminate against the social sciences and humanities. Both QS and THE now use normalisation to make sure that citation and publication counts and other metrics give due weight to those disciplines. It is certainly true that the Shanghai rankings do not count scholarship in the humanities but they do not seem to get much publicity in Singapore.
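For readers who want a concrete sense of what normalisation means here, below is a minimal sketch of the general principle of field normalisation, using made-up papers and a simple field average as the baseline. It illustrates the idea only; it is not QS's or THE's exact procedure.

```python
# Illustrative sketch of field normalisation (not any ranker's exact method):
# each paper's citations are divided by the average for its field and year,
# so a well-cited history paper can score as highly as a well-cited biology paper.
from collections import defaultdict

papers = [
    # (field, year, citations) -- hypothetical data
    ("history", 2017, 6),
    ("history", 2017, 2),
    ("biology", 2017, 120),
    ("biology", 2017, 40),
]

totals = defaultdict(lambda: [0, 0])   # (field, year) -> [sum of citations, paper count]
for field, year, cites in papers:
    totals[(field, year)][0] += cites
    totals[(field, year)][1] += 1

for field, year, cites in papers:
    total, count = totals[(field, year)]
    baseline = total / count
    # normalised score: 1.0 means the world average for that field and year
    print(field, year, cites, round(cites / baseline, 2))
```

On this measure the history paper with 6 citations and the biology paper with 120 both come out above their field averages, which is the whole point of the adjustment.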
The big problem with Singapore's approach to rankings is that it is too concerned with the QS rankings with their emphasis on reputation surveys and international orientation and the THE rankings which also have reputation and international indicators and three different measures of income. This has resulted in ranking successes that even Singaporeans found difficult to believe. Does NTU really have an academic reputation greater than that of Johns Hopkins, Duke, and King's College London? Meanwhile other rankings that are more stable and reliable are simply ignored.
The Today article has been withdrawn, apparently for legal reasons. There may be genuine concerns about defamation but it seems that someone is a bit heavy-handed. This may be self-defeating since the dissident academics are unlikely to get very much public sympathy. One complained that his door had been defaced while he was on sabbatical. Another took to his bed for days at a time because of the frustration of dealing with the bureaucracy.
There has been more debate since. Linda Lim and Pang Eng Fong, emeriti of Michigan State University and Singapore Management University, argue in the South China Morning Post that the emphasis on rankings is damaging to Singapore because it discourages academics from doing research that is relevant to social policy. They argue that citations are a key metric in the rankings and that the top journals favor research that is of theoretical significance in STEM subjects rather than local research in the humanities and social sciences.
This seems exaggerated. Citations count for 30% of the THE rankings, which is probably too much, and 20% of QS's. Both of them, and other rankings, now have processes that reduce the bias to the natural sciences in citations, publications and reputation surveys. In fact QS have claimed that the weighting given to its academic reputation indicator was precisely to give a level playing field to the humanities and social sciences.
They refer to Teo You Yenn, a researcher at NTU, who has written a book for a commercial publisher that has received a lot of attention but is unlikely to advance her career.
Dr Teo, however, is an Associate Professor and has a very respectable publishing record of articles in leading journals on family, migration, inequality, gender, and poverty in Singapore. Some are highly cited, although not as much as papers in medical research or particle physics. It seems that a focus on elite journals and rankings has done nothing to stop research on social policy issues.
The state and the universities are unlikely to be swayed from their current policy. It would, however, be advisable for them to think about their focus on the QS and THE rankings. Reputation, financial and international indicators are the backbone of Singapore's ranking success. But they can be easily emulated by other countries with supportive governments and the help of benchmarking and reputation management schemes.
Friday, February 08, 2019
Are Turkish Universities Declining? More misuse of Rankings
Sorry to get repetitive.
Another article has appeared offering the Times Higher Education (THE) world rankings as evidence for the decline of national universities.
Matin Gurcan in Al-Monitor argues that Turkish universities are in decline for several reasons, including uncontrolled boutique private universities, excessive government control, academic purges, lack of training for research staff and rampant nepotism.
We have been here before. But are Turkish universities really in decline?
The main evidence offered is that there are fewer Turkish universities at the higher levels of the THE rankings. The other rankings that are now available are ignored.
It is typical of the current state of higher education journalism that many commentators seem unaware that there are now many university rankings and that some of them are as valid and accurate as THE's if not better. The ascendancy of the THE is largely a creation of a lazy and compliant media.
Turkish universities have certainly fallen in the THE rankings.
In 2014 there were six Turkish universities in the world's top 500 and four in the top 200. Leading the pack was Middle East Technical University (METU) in 85th place, up from the 221-250 band in 2013.
A year later there were four in the top 500 and none in the top 200. METU was in the 500-600 band.
Nepotism, purges, and lack of training were not the cause. They were as relevant here as goodness was to Mae West's diamonds. What happened was that in 2015 THE made a number of changes to the methodology of its flagship citations indicator. The country adjustment, which favoured universities in countries with low citation counts, was reduced. There was a switch from Web of Science to Scopus as the data source. Citations to mega-papers, such as those emanating from the CERN projects with thousands of contributors and thousands of citations, were no longer counted.
Some Turkish universities were heavily over-invested in the CERN project, which took them to an unusually high position in 2014. In 2015 they went crashing down the THE rankings largely as a result of the methodological adjustments.
Other rankings such as URAP and National Taiwan University show that Turkish universities, especially METU, have declined but not nearly as much or as quickly as the THE rankings appear to show.
In the Round University Rankings there were seven Turkish universities in the top 500 in 2014, six in 2015, and seven in 2018. METU was 375th in 2014, 308th in 2015, and 437th in 2018: a significant decline but much less than the one recorded by THE.
Meanwhile the US News Best Global Universities rankings show three Turkish universities, including METU, in the top 500.
I do not dispute that Turkish universities have problems or the significance of the trends mentioned by Matin Gurcan. The evidence of the rankings is that they are declining, at least in comparison with other universities, especially in Asia. The THE world rankings are not, however, a good source of evidence.
Friday, January 18, 2019
Top universities for research impact in the emerging economies
It is always interesting to read the reactions of the good and the great of the academic world to the latest rankings from Times Higher Education (THE). Rectors, vice chancellors and spokespersons of various kinds gloat over their institutions' success. Opposition politicians and journalists demand rolling heads if a university falls.
What nobody seems to do is take a look at the ranking methodology or the indicator scores, which can often say as much about a university's success or failure as any amount of government funding, working as a team or dynamic leadership.
The citations indicator in the latest THE Emerging Economies Rankings is supposed to measure research impact and officially accounts for twenty per cent of the total weighting. In practice its weighting is effectively larger because universities in every country except the one with the most impact benefit from a country adjustment.
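To see why such an adjustment inflates the indicator's effective weight, here is a stylised sketch of how a country adjustment of this general kind can work. The square-root form, the 50:50 blend and the figures are illustrative assumptions, not THE's published formula.

```python
import math

# Stylised country adjustment: part of a university's citation score is replaced
# by a version divided by the square root of its country's average score, so
# universities in low-citation countries get a boost relative to the leading country.
def adjusted_score(university_score, country_average, blend=0.5):
    boosted = university_score / math.sqrt(country_average)
    return (1 - blend) * university_score + blend * boosted

print(adjusted_score(50.0, country_average=1.00))  # leading country: 50.0, no boost
print(adjusted_score(50.0, country_average=0.25))  # low-citation country: 75.0
```

The same raw performance yields a much higher score for a university in a weak-citation country, which is why the indicator can matter more than its nominal twenty per cent.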
Here are the top ten universities in the emerging world for research impact according to THE.
1. Jordan University of Science and Technology
2. Cyprus University of Technology
3. University of Desarrollo
4. Diego Portales University
5. Southern University of Science and Technology China
6. University of Crete
7. University of Cape Town
8. Indian Institute of Technology Indore
9. Pontifical Javeriana University
10. University of Tartu
Friday, January 11, 2019
Where does prestige come from? Age, IQ, research or money?
Prestige is a wonderful thing. Universities use it to attract capable students, prolific researchers, grants and awards and, of course, to rise in the global rankings.
This post was inspired by a series of tweets that started with a claim that the prestige of universities was dependent on student IQ.
That is a fairly plausible idea. There is good evidence that employers expect universities to guarantee that graduates have a certain level of cognitive ability, are reasonably conscientious and, especially in recent years, conform to prevailing social and political orthodoxy. At the moment, general cognitive ability appears to be what contributes most to graduate employability although it may be less important than it used to be.
Then there was a suggestion that when it came to prestige it was actually age and research that mattered. Someone also said that it might be money.
So I have compared these metrics or proxies with universities' scores on various reputation surveys, which could be indicative, perhaps not perfectly, of their prestige.
I have taken the median ACT or SAT scores of admitted students at the top fifty selective colleges in the USA as a substitute for IQ, with which they have a high correlation. The data is from the supplement to a paper in the Journal of Intelligence by Wai, Brown and Chabris.
The endowments of those colleges and the financial sustainability scores in the Round University Rankings are used to measure money. The number of research publications listed in the latest CWTS Leiden Ranking represents research.
I have looked at the correlations of these with the reputation scores in the rankings by QS (academic reputation and employer reputation), Times Higher Education (THE) (research reputation and teaching reputation), RUR (research reputation, teaching reputation and reputation outside region), and the Emerging/Trendence survey of graduate employability.
Since we are looking at a small fraction of the world's institutions in just one country the generalisability of this exercise is limited.
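For anyone wishing to replicate or extend the exercise, a minimal sketch of the calculation follows; the file and column names are hypothetical placeholders for a hand-assembled dataset.

```python
import pandas as pd

# Hypothetical dataset: one row per selective US college, with the four proxies
# and the reputation scores collected from the various rankings.
df = pd.read_csv("colleges.csv")

proxies = ["age", "sat_act_median", "endowment", "leiden_publications"]
reputation = ["qs_academic", "qs_employer", "the_teaching", "the_research",
              "rur_teaching", "rur_research", "rur_outside_region",
              "trendence_employability"]

# Pearson correlations of every proxy with every reputation indicator
corr = df[proxies + reputation].corr().loc[proxies, reputation]
print(corr.round(2))
```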
So what do we come up with? First, there are several highly selective liberal arts colleges in the US that are overlooked by international rankings. About half of the top 50 schools by SAT/ACT scores in the US do not show in the global rankings. An international undergraduate student wanting to study in the USA would do well to look beyond these rankings and think about places that are still highly selective such as Harvey Mudd, Pomona and Amherst Colleges.
Let's take a look at the four attributes. Age doesn't matter. There is no significant correlation between an institution's age and any of the reputation indicators. The lowest correlation, -.15, is with the RUR world research reputation indicator and the highest, but still not significant, .36, is with the THE teaching reputation indicator.
Research, however, is important. The correlation between total publications in the most recent Leiden Ranking and the reputation indicators varies from .48, for RUR reputation outside region, to .63, for THE teaching reputation.
So are standardised test scores. There is a significant correlation between SAT/ACT scores and the reputation indicators in the QS, RUR and Emerging/Trendence survey, ranging from .48 for the RUR world research reputation and reputation outside region to .72 for the Emerging/Trendence ranking. But the correlation with the THE teaching and research reputation indicators is not significant.
The RUR composite financial sustainability indicator correlates highly with the QS, RUR and Emerging/Trendence reputation indicators, ranging from .47 for the QS employer survey to .71 for the RUR world teaching reputation score, but not with the THE indicators, where it is .15 for research and .16 for teaching.
Endowment value appears to be the biggest influence on reputation. It correlates significantly with all reputation indicators, ranging from .42 for the RUR world research reputation indicator to .72 for Emerging/Trendence.
Of the four inputs the one that has the highest correlation with the three RUR reputation indicators, .71, .63, and .64 and the QS academic survey, .59, is financial sustainability.
Endowment value has the highest correlation with the QS employer survey, .57, and the two THE indicators, .66 and .71. Endowment and SAT are joint top for the Emerging/Trendence employability survey, .72.
So it seems that the best way to a good reputation, at least for selective American colleges, would be money. Test scores and research output can also help. But age doesn't matter.
Monday, December 03, 2018
Interesting Times for Times Higher?
Changes may be coming for the "universities Bible", aka Times Higher Education, and its rankings, events, consultancies and so on.
It seems that TES Global is selling off its very lucrative cash cow and that, in addition to private equity firms, the RELX Group, which owns Scopus, and Clarivate Analytics are in a bidding war.
Scopus currently provides the data for the THE rankings and Clarivate used to. If one of them wins the war there may be implications for the THE rankings, especially for the citations indicator.
If anybody has information about what is happening please send a comment.
Thursday, November 15, 2018
THE uncovers more pockets of research excellence
I don't want to do this. I really would like to start blogging about whether rankings should measure third missions or developing metrics for teaching and learning. But I find it difficult to stay away from the THE rankings, especially the citations indicator.
I have a couple of questions. If someone can help please post a comment here.
Do the presidents, vice-chancellors, directors, generalissimos, or whatever of universities actually look at or get somebody to look at the indicator scores of the THE world rankings and their spin-offs?
Does anyone ever wonder how a ranking that produces such imaginative and strange results for research influence, measured by citations, can command the respect and trust of those hard-headed engineers, MBAs and statisticians running the world's elite universities?
These questions are especially relevant as THE are releasing subject rankings. Here are the top universities in the world for research impact (citations) in various subjects. For computer science and engineering they refer to last year's rankings.
Clinical, pre-clinical and health: Tokyo Metropolitan University
Life Sciences: MIT
Physical sciences: Babol Noshirvani University of Technology
Psychology: Princeton University
Arts and humanities: Université de Versailles Saint-Quentin-en-Yvelines
Education: Kazan Federal University
Law: Iowa State University
Social sciences: Stanford University
Business and economics: Dartmouth College
Computer Science: Princeton University
Engineering and technology: Victoria University, Australia.
https://www.timeshighereducation.com/world-university-rankings/by-subject
Saturday, November 10, 2018
A modest suggestion for THE
A few years ago the Shanghai rankings did an interesting tweak on their global rankings. They deleted the two indicators that counted Nobel and Fields awards and produced an Alternative Ranking.
There were some changes. The University of California San Diego and the University of Toronto did better while Princeton and Vanderbilt did worse.
Perhaps it is time for Times Higher Education (THE) to consider doing something similar for their citations indicator. Take a look at their latest subject ranking, Clinical, Pre-clinical and Health. Here are the top ten for citations, supposedly a measure of research impact or influence.
1. Tokyo Metropolitan University
2. Auckland University of Technology
3. Metropolitan Autonomous University, Mexico
4. Jordan University of Science and Technology
5. University of Canberra
6. Anglia Ruskin University
7. University of the Philippines
8. Brighton and Sussex Medical School
9. Pontifical Javeriana University, Colombia
10. University of Lorraine.
If THE started producing alternative subject rankings without the citations indicator they would be a bit less interesting but a lot more credible.
Friday, November 02, 2018
Ranking Rankings: Measuring Stability
I have noticed that some rankings are prone to a large amount of churning. Universities may rise or fall dozens of places over a year, sometimes as a result of methodological changes, changes in the number or type of universities ranked, errors and corrections of errors (fortunately rare these days), changes in data collection and reporting procedures, or because there is a small number of data points.
Some ranking organisations like to throw headlines around about who's up or down, the rise of Asia, the fall of America, and so on. This is a trivialisation of any serious attempt at the comparative evaluation of universities, which do not behave like volatile financial markets. Universities are generally fairly stable institutions: most of the leading universities of the early twentieth century are still here while the Ottoman, Hohenzollern, Hapsburg and Romanov empires are long gone.
Reliable rankings should not be expected to show dramatic changes from year to year, unless there has been radical restructuring like the recent wave of mergers in France. The validity of a ranking system is questionable if universities bounce up or down dozens, scores, even hundreds of ranks every year.
The following table shows the volatility of the global rankings listed in the IREG Inventory of international rankings. U-Multirank is not listed because it does not provide overall ranks and UniRank and Webometrics do not give access to previous editions.
Average rank change is the number of places that each of the top thirty universities has fallen or climbed between the two most recent editions of the ranking.
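For clarity, here is a rough sketch of how such a figure can be computed, using hypothetical rank dictionaries in place of the published tables.

```python
def average_rank_change(previous, current, top_n=30):
    """Mean absolute rank movement of the previous edition's top-N universities."""
    top = sorted(previous, key=previous.get)[:top_n]
    changes = [abs(current[u] - previous[u]) for u in top if u in current]
    return sum(changes) / len(changes)

# Toy example with three universities and made-up ranks
previous = {"University A": 1, "University B": 2, "University C": 3}
current = {"University A": 2, "University B": 1, "University C": 6}
print(average_rank_change(previous, current, top_n=3))  # about 1.67 places on average
```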
The most stable rankings are the Shanghai ARWU, followed by the US News global rankings and the National Taiwan University rankings. The GreenMetric rankings, Reuters Innovative Universities and the high-quality research indicator of the Leiden Ranking show the highest levels of volatility.
This is a very limited exercise. We might get different results if we examined all of the universities in the rankings or analysed changes over several years.
| Rank | Ranking | Country | Average rank change |
|------|---------|---------|---------------------|
| 1 | Shanghai ARWU | China | 0.73 |
| 2 | US News Best Global Universities | USA | 0.83 |
| 3 | National Taiwan University Rankings | Taiwan | 1.43 |
| 4 | THE World University Rankings | UK | 1.60 |
| 5 | Round University Rankings | Russia | 2.28 |
| 6 | University Ranking by Academic Performance | Turkey | 2.23 |
| 7 | QS World University Rankings | UK | 2.33 |
| 8 | Nature Index | UK | 2.60 |
| 9 | Leiden Ranking Publications | Netherlands | 2.77 |
| 10 | Scimago | Spain | 3.43 |
| 11 | Emerging/Trendence | France | 3.53 |
| 12 | Center for World University Ranking | UAE | 4.60 |
| 13 | Leiden Ranking % Publications in top 1% | Netherlands | 4.77 |
| 14 | Reuters Innovative Universities | USA | 6.17 |
| 15 | UI GreenMetric | Indonesia | 13.14 |