Global
university rankings are now nearly a decade and a half old. The Shanghai
rankings (Academic Ranking of World Universities or ARWU) began in 2003,
followed a year later by Webometrics and the THES-QS rankings, which, after an
unpleasant divorce, became the Times Higher Education (THE)
and the Quacquarelli Symonds (QS) world rankings. Since then the number of
rankings, with a variety of audiences and methodologies, has expanded.
We now have several research-based rankings: University Ranking by Academic
Performance (URAP) from Turkey, the National Taiwan University Rankings, Best
Global Universities from US News, and the Leiden Ranking, as well as rankings
that attempt to assess and compare something other than research, such as the
Round University Rankings from Russia and U-Multirank from the European Union.
And, of course, we also have subject rankings, regional rankings, even
age-group rankings.
It is interesting that some of these rankings have developed beyond the
original founders of global ranking. The Leiden Ranking is now the gold
standard for the analysis of publications and citations. The Round University
Rankings from Russia use the same Web of Science database that THE did until
2014 and include 12 of the 13 indicators used by THE, plus another eight, in a
more sensible and transparent arrangement. However, both receive only a
fraction of the attention given to the THE rankings.
The research rankings from Turkey and Taiwan are similar to the Shanghai
rankings, but without the indicators for elderly or long-departed Nobel
laureates and Fields medalists, and with a more coherent methodology.
U-Multirank is almost alone in trying to get at things that might be of
interest to prospective undergraduate students.
It is regrettable that an article by Professor Brian Leiter of the University
of Chicago in the Chronicle of Higher Education, 'Academic Ethics: To Rank or
Not to Rank', ignores such developments and mentions only the original "Big
Three": Shanghai, QS and THE. This is perhaps forgivable, since the
establishment media, including THE and the Chronicle, and leading state and
academic bureaucrats have until recently paid very little attention to
innovative developments in university ranking. Leiter attacks the QS rankings
and proposes that academics boycott them while trying to improve the THE
rankings.
It is a
little odd that Leiter should be so caustic, not entirely without justification,
about QS while apparently being unaware of similar or greater problems with THE.
He begins by saying that QS stands for "quirky silliness". I would not
disagree with that, although in recent years QS has been getting less silly. I
have been as sarcastic as anyone about the failings of QS: see here and here
for some amusing commentary.
But the suggestion that QS is uniquely bad in contrast to THE is wide of the
mark. There are many issues with the QS methodology, especially with its
employer and academic surveys, and it has often announced placings that seem
very questionable, such as Nanyang Technological University (NTU) ahead of
Princeton and Yale, or the University of Buenos Aires in the world top 100,
largely as a result of a suspiciously good performance in the survey
indicators.
The oddities of the QS rankings are, however, no worse than some of the
absurdities that THE has served up in its world and regional rankings. We have
had places like Cadi Ayyad University in Marrakesh, Morocco, Middle East
Technical University in Turkey, Federico Santa Maria Technical University in
Chile, Alexandria University in Egypt, and Veltech University in India rise to
ludicrously high places, sometimes just for a year or two, as the result of a
few papers or even a single highly cited author.
I am not entirely persuaded that NTU deserves its top-12 placing in the QS
rankings. You can see here QS's unconvincing reply to a question that I
submitted. QS claims that NTU's excellence is shown by its success in
attracting foreign faculty, students and collaborators, but when you are in a
country where people show their passports to drive to the dentist, being
international is no great accomplishment. Even so, NTU is evidently world
class as far as engineering and computer science are concerned, and it is not
impossible that it could reach an undisputed overall top-ten or top-twenty
ranking within the next decade.
While the THE top ten or twenty or even fifty looks quite reasonable, apart
from Oxford in first place, there are many anomalies as soon as we start
breaking the rankings apart by country or indicator, and THE has pushed some
very weird data in recent years. Look at these places that are supposed to be
regional or international centers of across-the-board research excellence as
measured by citations: St George's University of London, Brandeis University,
the Free University of Bozen-Bolzano, King Abdulaziz University, the
University of Iceland, Veltech University. If QS is silly, what are we to call
a ranking where Anglia Ruskin University is supposed to have a greater
research impact than Chicago, Cambridge or Tsinghua?
Leiter starts his article by pointing out that the QS academic survey is
largely driven by the geographical distribution of its respondents and by the
halo effect. This is very probably true, and to that I would add that a lot of
the responses to academic surveys of this kind are likely driven by simple
self-interest: academics voting for their alma mater or current employer. QS
does not allow respondents to vote for their current employer, but they can
vote for their alma mater, and also for grant providers or collaborators.
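To make the screening issue concrete, here is a minimal sketch, in Python, of the kind of vote filter described above. The field names, the data and the single exclusion rule are my assumptions for illustration, not QS's actual pipeline.

```python
# Hypothetical sketch of survey-vote screening: votes cast for the
# respondent's own employer are discarded, while votes for an alma
# mater, funder or collaborator pass through. Field names and data
# are invented; this is not QS's actual pipeline.
from collections import Counter

def screen_votes(responses):
    """Tally nominations, excluding self-votes for the employer."""
    tallies = Counter()
    for r in responses:
        for institution in r["nominations"]:
            if institution != r["employer"]:  # the one exclusion rule
                tallies[institution] += 1
    return tallies

responses = [
    {"employer": "University A", "nominations": ["University A", "University B"]},
    {"employer": "University C", "nominations": ["University B", "University C"]},
]
print(screen_votes(responses))  # Counter({'University B': 2})
```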
He says that "QS does not, however, disclose the geographic distribution of
its survey respondents, so the extent of the distorting effect cannot be
determined". This is not true of the overall survey. QS does in fact give very
detailed figures about the origin of its respondents, and there is good
evidence here of probable distorting effects. There are, for example, more
responses from Taiwan than from Mainland China, and almost as many from
Malaysia as from Russia. QS does not, however, go down to subject level when
listing geographic distribution.
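One rough way to gauge "the extent of the distorting effect" from the disclosed figures is to compare each country's share of survey responses with its share of world research output. The sketch below does exactly that; every number in it is invented, purely for illustration.

```python
# Hypothetical illustration: compare each country's share of survey
# responses with its share of world research output. A ratio well
# above 1 suggests over-representation. All numbers are invented.
survey_responses = {"Taiwan": 1200, "China": 1100, "Malaysia": 800, "Russia": 850}
papers_published = {"Taiwan": 28000, "China": 430000, "Malaysia": 15000, "Russia": 60000}

total_responses = sum(survey_responses.values())
total_papers = sum(papers_published.values())

for country, n in survey_responses.items():
    response_share = n / total_responses
    output_share = papers_published[country] / total_papers
    print(f"{country}: over-representation ratio = {response_share / output_share:.1f}")
```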
He then refers to the case of University College Cork (UCC) asking faculty to
solicit friends in other institutions to vote for UCC. This is definitely a
bad practice, but it was in violation of QS guidelines and QS has
investigated. I do not know what came of the investigation, but it is worth
noting that the message would not have broken any rules had it referred to the
THE survey.
On balance, I would agree that THE's survey methodology is less dubious than
QS's and less likely to be influenced by energetic PR campaigns. It would
certainly be a good idea if the weighting of the QS survey were reduced and if
there were more rigorous screening and classification of potential
respondents.
But I think we also have to bear in mind that QS does prohibit respondents
from voting for their own universities and it does average results out over a
five-year period (formerly three years).
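As a rough illustration of what such multi-year averaging does to a volatile survey signal, here is a minimal sketch; the five-year window matches QS's stated practice, but the scores are invented.

```python
# Hypothetical: a five-year moving average damps a one-off spike in a
# university's yearly survey score. All scores are invented.
def moving_average(scores, window=5):
    """Average each year's score with up to `window - 1` preceding years."""
    out = []
    for i in range(len(scores)):
        chunk = scores[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

yearly = [60, 62, 95, 61, 63, 64]      # one suspicious spike in year 3
print(moving_average(yearly))          # the spike is diluted across five years
```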
It is interesting that while THE does not usually combine and average survey
results, it did so in the 2016-17 world rankings, combining the 2015 and 2016
survey results. This was, I suspect, because of a substantial drop in 2016 in
the percentage of respondents from the arts and humanities, which would, if
unadjusted, have caused a serious problem for UK universities, especially
those in the Russell Group.
Leiter then goes on to condemn QS for its dubious business practices. He
reports that THE dropped QS because of those practices. That is what THE says,
but it is widely rumoured within the rankings industry that THE was also
interested in the financial advantages of a direct partnership with Thomson
Reuters rather than getting data through QS.
He also refers to QS's hosting a series of "World Class events" where
university leaders pay $950 for "seminar, dinners, coffee breaks" and "learn
best practice for branding and marketing your institution through case studies
and expert knowledge", and to the QS Stars plan, in which universities pay to
be audited by QS in return for stars that they can use for promotion and
advertising. I would add to his criticism that the Stars program has
apparently undergone a typical "grade inflation", with the number of five-star
universities increasing all the time.
Also, QS offers specific consulting services, and it has a large number of
clients from around the world, although there are many more from Australia and
Indonesia than from Canada and the US. Of the three from the US, one is MIT,
which has been number one in the QS world rankings since 2012, a position it
probably achieved after a change in the way in which faculty were classified.
It would, however, be misleading to suggest that THE is any better in this
respect. Since 2014 it has pursued a serious and unapologetic "monetisation of
data" program.
There are events such as the forthcoming world "academic summit" where, for
1,199 GBP (standard university rate) or 2,200 GBP (corporate rate), delegates
can get "Exclusive insight into the 2017 Times Higher Education World
University Rankings at the official launch and rankings masterclass," plus a
"prestigious gala dinner, drinks reception and other networking events". THE
also provides a variety of benchmarking and performance analysis services;
branding, advertising and reputation management campaigns; and a range of
silver and gold profiles, including adverts and sponsored supplements. THE's
data clients include some illustrious names, like the National University of
Singapore and Trinity College Dublin, plus some less well-known places such as
Federico Santa Maria Technical University, Orebro University, King Abdulaziz
University, National Research Nuclear University MEPhI Moscow, and Charles
Darwin University.
Among THE's activities are regional events that promise "partnership
opportunities for global thought leaders" and where rankings like the WUR "are
presented at these events with our award-winning data team on hand to explain
them, allowing institutions better understanding of their findings".
At some of these summits the rankings presented are trimmed and tweaked, and
somehow the hosts emerge in a favourable light. In February 2015, for example,
THE held a Middle East and North Africa (MENA) summit that included a
"snapshot ranking" that put Texas A and M University Qatar, a branch campus
that offers nothing but engineering courses, in first place and Qatar
University in fourth. The ranking consisted of precisely one indicator out of
the 13 that make up THE's world university rankings: field- and
year-normalised citations. United Arab Emirates University (UAEU) was 11th and
the American University of Sharjah in the UAE 14th.
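For readers unfamiliar with the indicator, here is a minimal sketch of field- and year-normalised citation impact in the style of the Leiden Ranking's MNCS; THE's actual implementation differs in detail, and all the figures below are invented. It also shows why a single highly cited paper can catapult a small institution up such a ranking.

```python
# Hypothetical sketch of field- and year-normalised citation impact:
# each paper's citations are divided by the world average for its field
# and year, and the ratios are averaged. One megahit paper can therefore
# dominate a small portfolio. All figures are invented.
world_average = {("Engineering", 2014): 4.0, ("Medicine", 2014): 10.0}

papers = [
    {"field": "Engineering", "year": 2014, "citations": 400},  # the megahit
    {"field": "Medicine",    "year": 2014, "citations": 5},
    {"field": "Medicine",    "year": 2014, "citations": 8},
]

ratios = [p["citations"] / world_average[(p["field"], p["year"])] for p in papers]
score = sum(ratios) / len(ratios)
print(f"normalised citation score: {score:.2f}")  # about 33.8, driven by one paper
```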
The next MENA summit was held in January 2016 in Al Ain in the UAE. There was
no snapshot this time, and the methodology for the MENA rankings included all
13 indicators used in THE's world rankings. Host-country universities were now
in fifth (UAEU) and eighth place (American University of Sharjah). Texas A and
M Qatar was not ranked and Qatar University fell to sixth place.
Something similar happened in Africa. In 2015, THE went to the University of
Johannesburg for a summit that brought together "outstanding global thought
leaders from industry, government, higher education and research" and which
unveiled THE's Africa ranking, based on citations (with the innovation of
fractional counting), which put the host university in ninth place and the
University of Ghana in twelfth.
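Fractional counting simply divides the credit for a multi-institution paper among the contributors instead of giving each of them full credit. Here is a minimal sketch with invented data; real Leiden-style implementations work at the author or address level and add field normalisation on top.

```python
# Hypothetical sketch of fractional counting: a paper's citations are
# split equally among the institutions on its author list, so merely
# co-signing a mega-collaboration paper earns far less credit than
# under full counting. All figures are invented.
from collections import defaultdict

papers = [
    {"citations": 300, "institutions": ["Host U", "U A", "U B"]},
    {"citations": 30,  "institutions": ["U A"]},
]

full, fractional = defaultdict(float), defaultdict(float)
for p in papers:
    share = p["citations"] / len(p["institutions"])
    for inst in p["institutions"]:
        full[inst] += p["citations"]   # full counting: everyone gets it all
        fractional[inst] += share      # fractional counting: credit is split

print(dict(full))        # {'Host U': 300.0, 'U A': 330.0, 'U B': 300.0}
print(dict(fractional))  # {'Host U': 100.0, 'U A': 130.0, 'U B': 100.0}
```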
In 2016 the show moved on to the University of Ghana, where another ranking
was produced based on all 13 world ranking indicators. This time the
University of Johannesburg did not take part, and the University of Ghana went
from 12th place to 7th.
I may have missed something, but so far I see no sign of THE Africa or MENA
summits planned for 2017. If that is so, then African and MENA university
leaders are to be congratulated for a very healthy scepticism.
To be
fair, THE does not seem to have done any methodological tweaking for this year’s
Asian, Asia Pacific and Latin American rankings.
Leiter concludes that American academics should boycott the QS survey but not
THE's, and that they should lobby THE to improve its survey practices. That, I
suspect, is pretty much a nonstarter. QS has never had much of a presence in
the US anyway, and THE is unlikely to change significantly as long as its
commercial dominance goes unchallenged and as long as scholars and
administrators fail to see through its PR wizardry. It would be better for
everybody to start looking beyond the "Big Three" rankings.