Originally published in Wonkhe, 27/10/2015
Are global rankings losing their credibility?
Richard is an academic and expert on university rankings. He writes
in depth on rankings at his blog: University
Rankings Watch.
Oct 27th 2015
The international university ranking scene has become increasingly
complex, confusing and controversial. It also seems that the big name brands
are having problems balancing popularity with reliability and validity. All
this is apparent from the events of the last two months which have seen the
publication of several major rankings.
In addition, a Russian organisation, Round University Ranking (RUR), has
produced another set of league tables. Apart from a news item on
the website of the International Ranking Expert Group, these rankings have
received almost no attention outside Russia, Eastern Europe and the CIS. This
is very unfortunate since they do almost everything that the other rankings do
and contain information that the others do not.
One sign of the growing complexity of the ranking scene is that US News (USN), QS,
ARWU and THE are producing a variety of by-products including
rankings of new universities, subject rankings, best cities for students,
reputation rankings and regional rankings, with no doubt more to come. They are
also assessing more universities than ever before. THE used to take pride in
ranking only a small elite group of world universities. Now they are talking
about being open and inclusive and have ranked 800 universities this year, as
did QS, while USN has expanded from 500 to 750 universities. Only the Shanghai rankers
have remained content with a mere 500 universities in their general rankings.
Academic Ranking of World Universities (ARWU)
All three of the brand name rankings have faced issues of credibility.
The Shanghai ARWU has had a problem with the massive recruitment of adjunct
faculty by King Abdulaziz University (KAU) in Jeddah. This was initially aimed
at the highly cited researchers indicator in the ARWU, which simply counts the
number of researchers affiliated with universities, no matter whether their affiliation
has lasted an academic lifetime or began the day before ARWU did the
counting. The Shanghai rankers deftly dealt with this issue by simply not
counting secondary affiliations in the new lists of highly cited researchers
supplied by Thomson Reuters in 2014.
That, however, did not resolve the problem entirely. Those researchers
have not stopped listing KAU as a secondary affiliation, and even if they no
longer affect the highly cited researchers indicator they can still help a
lot with publications and papers in Nature and Science,
both of which are counted in the ARWU. These part-timers – and some may not
even be that – have already ensured that KAU, according to ARWU, is the top
university in the world for publications in mathematics.
The issue of secondary affiliation is one that is likely to become a
serious headache for rankers, academic publishers and databases in the next few
years. Already, undergraduate teaching in American universities is dominated by
a huge reserve army of adjuncts. It is not impossible that in the near future
some universities may find it very easy to offer minimal part-time contracts to
talented researchers in return for listing as an affiliation and then see a
dramatic improvement in ranking performance.
ARWU’s problem with the highly cited researchers coincided with Thomson
Reuters producing a new list and announcing that the old one would no longer be
updated. Last year, Shanghai combined the old and new lists and this produced
substantial changes for some universities. This year they continued with the
two lists and there was relatively little movement in this indicator or in the
overall rankings. But next year they will drop the old list altogether and just
use the new one and there will be further volatility. ARWU have, however, listed the
number of highly cited researchers in the old and new lists, so
most universities should be aware of what is coming.
Quacquarelli Symonds (QS) World University Rankings
The Quacquarelli Symonds (QS) World University Rankings have been
regarded with disdain by many British and American academics although they do
garner some respect in Asia and Latin America. Much of the criticism has
been directed at the academic reputation survey which is complex, opaque and,
judging from QS’s regular anti-gaming measures, susceptible to influence from
universities. There have also been complaints about the staff-student ratio
indicator being a poor proxy for teaching quality, and about the bias of the citations
per faculty indicator towards medicine and against engineering, the social
sciences, and the arts and humanities.
QS have decided to reform their
citations indicator by treating the five large subject groups
as contributing equally to the indicator score. In addition, QS omitted papers,
most of them in physics, with a very large number of listed authors, and
averaged responses to the surveys over a period of five years in an attempt to
make the rankings less volatile.
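As a rough sketch of what this kind of faculty-area normalisation involves (the area names, weights and figures below are illustrative assumptions, not QS's actual method or data), each of the five subject groups can be scored against its own world benchmark and then given an equal share of the final indicator:

```python
import statistics

# Illustrative faculty areas; QS groups subjects into five broad areas.
FACULTY_AREAS = ["arts_humanities", "engineering_technology",
                 "life_sciences_medicine", "natural_sciences",
                 "social_sciences_management"]

def normalised_citations_score(university, world_average):
    """Average each area's citation ratio so that every faculty area
    contributes a fifth of the score, preventing citation-heavy fields
    such as medicine from swamping the indicator."""
    ratios = [university[area] / world_average[area]
              for area in FACULTY_AREAS]
    return statistics.mean(ratios)

# Invented figures: high medical citations now lift only one fifth
# of the score instead of dominating the whole indicator.
world = {area: 1.0 for area in FACULTY_AREAS}
uni = dict(zip(FACULTY_AREAS, [0.6, 1.4, 3.0, 1.1, 0.9]))
print(normalised_citations_score(uni, world))  # 1.4
```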
The result of all this was that some universities rose and others fell.
Imperial College London went from 2nd to 8th while the London School of Economics rose from 71st to 35th. In Italy, the Polytechnics of Milan
and Turin got a big boost while venerable universities suffered dramatic
relegation. Two Indian institutions moved into the top two hundred; some Irish
universities such as Trinity College Dublin, University College Dublin and
University College Cork went down, while others such as the National University of
Ireland Galway and the University of Limerick went up.
There has always been a considerable amount of noise in these rankings
resulting in part from small fluctuations in the employer and academic surveys.
In the latest rankings these combined with methodological changes to produce
some interesting fluctuations. Overall the general pattern was that
universities that emphasise the social sciences, the humanities and engineering
have improved at the expense of those that are strong in physics and medicine.
Perhaps the most remarkable of this year’s changes was the rise of two
Singaporean universities, the National University of Singapore (NUS) and
Nanyang Technological University (NTU), to 12th and 13th place respectively, a change
that has met with some scepticism even in Singapore. They are now above Yale,
EPF Lausanne and King’s College London. While the changes to the citations
component were significant, another important reason for the rise of these two
universities was their continuing remarkable performance in the academic and
employer surveys. NUS is in the top ten in the world for academic reputation
and employer reputation with a perfect score of 100, presumably rounded up, in
each. NTU is 52nd for the academic survey and 39th for employer with scores in the nineties for both.
Introducing a moderate degree of field normalisation was probably a
smart move. QS were able to reduce the distortion resulting from the database's
bias towards medical research without risking the multiplication of strange results
that have plagued the THE citations indicator. They have not,
however, attempted to reform the reputation surveys, which continue to have a
combined 50% weighting, and until they do so these rankings are unlikely to
achieve full recognition from the international academic community.
Times Higher Education (THE) World University Rankings
The latest THE world rankings were published on
September 30th and, like QS, THE have done some tweaking of
their methodology. They had broken with Thomson Reuters at the end of
2014 and started using data from Scopus, while doing the analysis and
processing in-house. They were able to analyse many more papers and citations
and conduct a more representative survey of research and postgraduate
supervision. In addition they omitted multi-author papers and
reduced the impact of the “regional modification”.
Consequently there was a large dose of volatility. The results were so
different from those of 2014 that they seemed to reflect an entirely new
system. THE did, to their credit, do the decent thing and
state that direct comparisons should not be made to previous years. That,
however, did not stop scores of universities and countries around the world
from announcing their success. Those that had suffered have for the most part
kept quiet.
There were some remarkable changes. At the very top, Oxford and
Cambridge surged ahead of Harvard which fell to sixth place. University College
Dublin, in contrast to the QS rankings, rose, as did Twente, Moscow State,
the Karolinska Institute and ETH Zurich.
On the other hand, many universities in France, Korea, Japan and Turkey
suffered dramatic falls. Some of those universities had been participants in
the CERN projects and so had benefitted in 2014 from the huge number of
citations derived from their papers. Some were small and produced few papers so
those citations were divided by a small number of papers. Some were located in
countries that performed poorly and so got help from a “regional modification”
(the citation impact score of the university is divided by the square root of
the average citation impact score of the whole country). Such places suffered
badly from this year’s changes.
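A short illustration of the formula just quoted may help (the numbers are invented; this is only a sketch of the stated arithmetic, not THE's code):

```python
from math import sqrt

def regionally_modified_impact(university_impact, country_average_impact):
    # THE's stated formula: divide a university's citation impact score
    # by the square root of its country's average citation impact score.
    return university_impact / sqrt(country_average_impact)

# Invented numbers: the same raw impact of 1.0 is worth twice as much
# in a country averaging 0.25 as in one averaging 1.0, which is why
# weakening the modification hit universities in low-scoring countries.
print(regionally_modified_impact(1.0, 0.25))  # 2.0
print(regionally_modified_impact(1.0, 1.0))   # 1.0
```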
It is a relief that THE have finally done something
about the citations indicator and it would be excellent if they continued with
further reforms such as fractional counting, reducing the indicator’s overall
weighting, not counting self-citations and secondary affiliations, and getting
rid of the regional modification altogether.
Unfortunately, if the current round of reforms represents an improvement,
and on balance it probably does, then the very different results of 2014 and
before call into question THE’s repeated claims to be trusted, robust
and sophisticated. If the University of Twente deserves to be in the top 150
this year then the 2014 rankings which had them outside the top 200 could not
possibly be valid. If the Korea Advanced Institute of Science and Technology
(KAIST) fell 66 places then either the 2015 rankings or those of 2014 were
inaccurate, or they both were. Unless there is some sort of major restructuring
such as an amalgamation of specialist schools or the shedding of inconvenient
junior colleges or branch campuses, large organisations like universities
simply do not and cannot change that much over the course of 12 months or less.
It would have been more honest, although probably not commercially
feasible, for THE to declare that they were starting with a
completely new set of rankings and to renounce the 2009-14 rankings in the way
that they had disowned the rankings produced in cooperation with QS between
2004 and 2008. THE seem to be trying to trade on the basis of
their trusted methodology while selling results suggesting that that
methodology is far from trustworthy. They are of course doing just what a
business has to do. But that is no reason why university administrators and
academic experts should be so tolerant of such a dubious product.
These rankings also contain quite a few small or specialised
institutions that would appear to be on the borderline of a reasonable
definition of an “independent university with a broad range of subjects”:
Scuola Normale Superiore di Pisa and Scuola Superiore Sant’Anna, both part of
the University of Pisa system, Charité-Universitätsmedizin Berlin, an affiliate
of two universities, St George’s, University of London, a medical school,
Copenhagen Business School, Rush University, the academic branch of a private
hospital in Chicago, the Royal College of Surgeons in Ireland, and the National
Research Nuclear University (MEPhI) in Moscow, which specialises in physics. Even if THE have
not been too loose about who is included, the high scores achieved by such
narrowly focussed institutions call the validity of the rankings into
question.
Round University Rankings
In general the THE rankings have received a broad and
respectful response from the international media and university
managers, and criticism has largely been confined to outsiders and
specialists. This is in marked contrast to the RUR rankings released
early in September, which are based entirely on data
supplied by Thomson Reuters, THE’s data provider and analyst
until last year. They contain a total of 20 indicators, including 12 of the
13 in the THE rankings. Unlike THE, RUR do not bundle
indicators together in groups, so it is possible to tell exactly why
universities are performing well or badly.
The RUR rankings are not elegantly presented but the content is more
transparent than THE, more comprehensive than QS, and apparently
less volatile than either. It is a strong indictment of the international
higher education establishment that these rankings are ignored while THE’s are
followed so avidly.
Best Global Universities
The second edition of the US News’s Best Global
Universities was published at the beginning of October. The US
News is best known for the ranking of American colleges and
universities and it has been cautious about venturing into the global arena.
These rankings are fairly similar to the Shanghai ARWU, containing only
research indicators and making no pretence to measure teaching or graduate
quality. The methodology avoids some elementary mistakes. It does not give too
much weight to any one indicator, with none getting more than 12.5%, and
measures citations in three different ways. For eight indicators, a log
transformation was applied before the calculation of z-scores in order to rein
in outliers and statistical anomalies.
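A hedged sketch of that two-step treatment (invented data; US News does not publish its code) shows why taking logs first matters: one extreme value no longer sits many standard deviations above the rest.

```python
import math
import statistics

def log_then_z(values):
    """Log-transform a right-skewed indicator, then standardise it.
    The log step compresses extreme outliers before z-scores are taken,
    so a single huge count cannot dominate the indicator."""
    logged = [math.log(v) for v in values]  # assumes positive values
    mu = statistics.mean(logged)
    sigma = statistics.stdev(logged)
    return [(x - mu) / sigma for x in logged]

# Invented publication counts with one extreme outlier.
print(log_then_z([800, 1200, 1500, 3000, 60000]))
```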
This year US News went a little way towards reducing
the rankers’ obsession with citations by including conferences and books in the
list of criteria.
Since they do not include any non-research indicators these rankings are
essentially competing with the Shanghai ARWU and it is possible that they may
eventually become the first choice for internationally mobile graduate
students.
But at the moment it seems that the traditional media and the higher
education establishment have lost none of their fascination with the snakes and
ladders game of THE and QS.