Malaysia has had a complicated relationship with global university rankings. There was a fleeting moment of glory in 2004 when Universiti Malaya, the national flagship, leaped into the top 100 of the THES-QS world rankings. Sadly, it turned out that this was the result of an error by the rankers who thought that ethnic minorities were international faculty and students. Since then the country's leading universities have gone up and down, usually because of methodological changes rather than any merit or fault of their own.
Recently though, Malaysia seems to have adopted sensible, if not always popular, policies and made steady advances in the Shanghai rankings. There are now three universities in the top 500, UM, Universiti Sains Malaysia (USM) and Universiti Kebangsaan Malaysia (UKM). UM has been rising since 2011 although it fell a bit last year because of the loss of a single highly cited researcher listed in the Thomson Reuters database.
The Shanghai rankings rely on public records and focus on research in the sciences. For a broader-based ranking with a consistent methodology and teaching metrics we can take a look at the Round University Rankings, where UM is 268th overall. Across the 20 metrics included in these rankings UM's scores range from very good for number of faculty and reputation (except outside the region) to poor for doctoral degrees and normalised citations.
The story told by these rankings is that Malaysia is making steady progress in providing resources and facilities, attracting international students and staff, and producing a substantial amount of research in the natural sciences. But going beyond that is going to be very difficult. Citation counts indicate that Malaysian research gets little attention from the rest of the world. The Shanghai rankings report that UM has zero scores for highly cited researchers and papers in Nature and Science.
In this year's QS world rankings, UM reached 114th place overall and there are now hopes that it will soon reach the top 100. But it should be noted that UM's profile is very skewed with a score of 65.7 for academic reputation and 24.3 for citations per faculty. Going higher without an improvement in research quality will be very challenging since the reputation curve becomes very steep at this level, with dozens of survey responses needed just to go up a few points.
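To make concrete why such a lopsided profile caps the overall result, here is a minimal sketch of how a QS-style overall score is assembled as a weighted sum of indicator scores. The weights are the commonly cited QS weightings; UM's reputation and citations figures are the ones quoted above, while the other four indicator values are purely hypothetical placeholders.

```python
# Sketch: a QS-style overall score as a weighted sum of indicator scores (0-100).
# Weights are the commonly cited QS weightings; only the two UM figures marked
# "reported" come from the text above, the rest are hypothetical.

QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

um_scores = {
    "academic_reputation": 65.7,    # reported
    "citations_per_faculty": 24.3,  # reported
    "employer_reputation": 70.0,    # hypothetical
    "faculty_student_ratio": 80.0,  # hypothetical
    "international_faculty": 60.0,  # hypothetical
    "international_students": 50.0, # hypothetical
}

def overall(scores, weights):
    """Weighted sum of indicator scores."""
    return sum(weights[k] * scores[k] for k in weights)

base = overall(um_scores, QS_WEIGHTS)
boosted = dict(um_scores, citations_per_faculty=um_scores["citations_per_faculty"] + 10)
print(f"overall: {base:.1f}")
print(f"with 10 more points for citations: {overall(boosted, QS_WEIGHTS):.1f}")
```

On these assumptions a ten-point rise in citations per faculty lifts the overall score by just two points (20% of ten), while squeezing additional points out of an already high reputation score becomes harder and harder at this level, as noted above.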
It might be better if Malaysia focused more on the Shanghai rankings, the Round University Rankings and the US News Best Global Universities. Progress in these rankings is often slow and gradual but their results are usually fairly consistent and reliable.
Tuesday, August 08, 2017
Excellent Series on Rankings
I have just come across a site, ACCESS, that includes a lot of excellent material on university rankings by Ruth A Pagell, who is Emeritus Faculty Librarian at Emory University and Adjunct Faculty at the University of Hawaii.
I'll provide specific links to some of the articles later
Go here
Saturday, August 05, 2017
There is no such thing as free tuition
It is reported that the Philippines is introducing free tuition in state universities. It will not really be free. The government will have to find P100 billion from a possible "re-allocation of resources."
If there is a graduate premium for degrees from Philippine universities then this measure will increase existing social inequalities and result in a transfer of wealth from the working class and small businesses to the privileged educated classes.
Unless lecturers work for nothing and buildings and facilities materialize, Hogwarts style, out of nothing, tuition is never free.
Who educates the world's leaders?
According to Times Higher Education (THE), the UK has educated more heads of state and government than any other country. The USA is a close second followed by France. No doubt this will get a lot of publicity as the THE summit heads for London but, considering the state of the world, is it really something to be proud of?
Thursday, August 03, 2017
America's Top Colleges: 2017 Rankings
America's Top Colleges is published by Forbes business magazine. It is an unabashed assessment of institutions from the viewpoint of the student as investor. The metrics are post-graduate success, debt, student experience, graduation rate and academic success.
The top three colleges are Harvard, Stanford and Yale.
The top three liberal arts colleges are Pomona, Claremont McKenna and Williams.
The top three low debt private colleges are College of the Ozarks, Berea College and Princeton.
The top three STEM colleges are MIT, Caltech and Harvey Mudd College.
Wednesday, August 02, 2017
Ranking Rankings
Hobsons, the education technology company, has produced a ranking of global university rankings. The information provided is very limited and I hope there will be more in a while. Here are the top five according to a survey of international students inbound to the USA.
1. QS World University Rankings
2. THE World University Rankings
3. Shanghai ARWU
4. US News Best Global Universities
5. Center for World University Rankings (formerly published at King Abdulaziz University).
University of Bolton head thinks he's worth his salary
George Holmes, Vice-Chancellor of the University of Bolton, with a salary of GBP 220,120 and owner of a yacht and a Bentley, is not ashamed of his earnings. According to an article by Camilla Turner in the Daily Telegraph, he says that he has had a very successful career and he hopes his students will get good jobs and have Bentleys.
The university is ranked 86th in the Guardian 2018 league table, which reports that 59.2% of graduates have jobs or are in postgraduate courses six months after graduation. It does not appear in the THE or QS world rankings.
Webometrics puts it 105th in the UK and 1846th in the world so I suppose he could claim to be head of a top ten per cent university.
Perhaps Bolton should start looking for the owner of a private jet for its next Vice-Chancellor. It might do even better.
Tuesday, August 01, 2017
Highlights from the Princeton Review
Here are the top universities in selected categories in the latest Best Colleges Ranking from Princeton Review. The rankings are based entirely on survey data and are obviously subjective and vulnerable to sampling error.
Most conservative students: University of Dallas, Texas
Most liberal students: Reed College, Oregon
Best campus food: University of Massachusetts Amherst
Happiest students: Vanderbilt University, Tennessee
Party schools: Tulane University, Louisiana
Don't inhale: US Coast Guard Academy, Connecticut
Best college library: University of Chicago, Illinois
Best-run college: University of Richmond, Virginia
Most studious students: Harvey Mudd College, California
Most religious students: Thomas Aquinas College, California
Least religious students: Reed College, Oregon
Best athletic facilities: Auburn University, Alabama.
Monday, July 31, 2017
The world is safe for another year
The Princeton Review has just published the results of its annual survey of 382 US colleges with 62 lists of various kinds. I'll publish a few of the highlights later but for the moment here is one which should make everyone happy.
"Don't inhale" refers to not using marijuana. Four of the top five places are held by service academies (Coast Guard, Naval, Army, Air Force).
The academies also get high scores in the stone-cold sober rankings (opposite of party schools) so everyone can feel a bit safer when they sleep tonight.
Wednesday, July 19, 2017
Comments on an Article by Brian Leiter
Global university rankings are now nearly a decade and a half old. The Shanghai rankings (Academic Ranking of World Universities or ARWU) began in 2003, followed a year later by Webometrics and the THES-QS rankings which, after an unpleasant divorce, became the Times Higher Education (THE) and the Quacquarelli Symonds (QS) world rankings. Since then the number of rankings with a variety of audiences and methodologies has expanded.

We now have several research-based rankings: University Ranking by Academic Performance (URAP) from Turkey, the National Taiwan University Rankings, Best Global Universities from US News and Leiden Ranking, as well as rankings that include some attempt to assess and compare something other than research, the Round University Rankings from Russia and U-Multirank from the European Union. And, of course, we also have subject rankings, regional rankings, even age group rankings.
It is interesting that some of these rankings have developed beyond the original founders of global rankings. Leiden Ranking is now the gold standard for the analysis of publications and citations. The Russian rankings use the same Web of Science database that THE did until 2014 and include 12 of the 13 indicators used by THE plus another eight, in a more sensible and transparent arrangement. However, both of these receive only a fraction of the attention given to the THE rankings.

The research rankings from Turkey and Taiwan are similar to the Shanghai rankings but without the elderly or long-departed Fields and Nobel award winners and with a more coherent methodology. U-Multirank is almost alone in trying to get at things that might be of interest to prospective undergraduate students.
It is regrettable that an article by Professor Brian Leiter of the University of Chicago in the Chronicle of Higher Education, 'Academic Ethics: To Rank or Not to Rank', ignores such developments and mentions only the original "Big Three": Shanghai, QS and THE. This is perhaps forgivable since the establishment media, including THE and the Chronicle, and leading state and academic bureaucrats have until recently paid very little attention to innovative developments in university ranking. Leiter attacks the QS rankings and proposes that they should be boycotted while trying to improve the THE rankings.

It is a little odd that Leiter should be so caustic, not entirely without justification, about QS while apparently being unaware of similar or greater problems with THE.

He begins by saying that QS stands for "quirky silliness". I would not disagree with that although in recent years QS has been getting less silly. I have been as sarcastic as anyone about the failings of QS: see here and here for an amusing commentary.
But the suggestion that QS is uniquely bad in contrast to THE is way off the target. There are many issues with the QS methodology, especially with its employer and academic surveys, and it has often announced placings that seem very questionable, such as Nanyang Technological University (NTU) ahead of Princeton and Yale or the University of Buenos Aires in the world top 100, largely as a result of a suspiciously good performance in the survey indicators.

The oddities of the QS rankings are, however, no worse than some of the absurdities that THE has served up in its world and regional rankings. We have had places like Cadi Ayyad University in Marrakesh, Morocco, Middle East Technical University in Turkey, Federico Santa Maria Technical University in Chile, Alexandria University and Veltech University in India rise to ludicrously high places, sometimes just for a year or two, as the result of a few papers or even a single highly cited author.
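For readers unfamiliar with how indicators of this kind behave, the sketch below shows in very simplified form why a few papers or a single highly cited author can do this to a small institution: each paper's citations are divided by a world average for its field and year, and the institution's score is the mean of those ratios, so one extreme paper spread over a small output dominates everything. The world averages and paper counts here are made up for illustration, and the real calculation behind THE's citations metric involves further adjustments.

```python
# Very simplified field- and year-normalised citation score.
# Each paper's citations are divided by a (hypothetical) world average for its
# field and year; the institution's score is the mean of those ratios.

WORLD_AVERAGE = {
    ("medicine", 2014): 10.0,    # hypothetical citations per paper
    ("engineering", 2014): 6.0,  # hypothetical citations per paper
}

def normalised_impact(papers):
    """papers: list of (field, year, citations) tuples."""
    ratios = [cites / WORLD_AVERAGE[(field, year)] for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# A small institution: nine ordinary engineering papers plus one hyper-cited paper.
small_uni = [("engineering", 2014, 3)] * 9 + [("medicine", 2014, 2000)]
# A large institution: a thousand solidly cited medical papers.
large_uni = [("medicine", 2014, 15)] * 1000

print(f"small institution: {normalised_impact(small_uni):.1f}")  # about 20
print(f"large institution: {normalised_impact(large_uni):.1f}")  # 1.5
```

The small institution's mean is swamped by the single outlier, which is roughly the mechanism behind the cases listed above.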
I am not entirely persuaded that NTU deserves its top 12 placing in the QS rankings. You can see here QS's unconvincing reply to a question that I provided. QS claims that NTU's excellence is shown by its success in attracting foreign faculty, students and collaborators, but when you are in a country where people show their passports to drive to the dentist, being international is no great accomplishment. Even so, it is evidently world class as far as engineering and computer science are concerned and it is not impossible that it could reach an undisputed overall top ten or twenty ranking in the next decade.

While the THE top ten or twenty or even fifty looks quite reasonable, apart from Oxford in first place, there are many anomalies as soon as we start breaking the rankings apart by country or indicator, and THE has pushed some very weird data in recent years. Look at these places supposed to be regional or international centers of across-the-board research excellence as measured by citations: St George's University of London, Brandeis University, the Free University of Bozen-Bolzano, King Abdulaziz University, the University of Iceland, Veltech University. If QS is silly, what are we to call a ranking where Anglia Ruskin University is supposed to have a greater research impact than Chicago, Cambridge or Tsinghua?
Leiter starts his article by pointing out that the QS academic survey is largely driven by the geographical distribution of its respondents and by the halo effect. This is very probably true, and to that I would add that a lot of the responses to academic surveys of this kind are likely driven by simple self-interest, academics voting for their alma mater or current employer. QS does not allow respondents to vote for the latter but they can vote for the former and also vote for grant providers or collaborators.

He says that "QS does not, however, disclose the geographic distribution of its survey respondents, so the extent of the distorting effect cannot be determined". This is not true of the overall survey. QS does in fact give very detailed figures about the origin of its respondents and there is good evidence here of probable distorting effects. There are, for example, more responses from Taiwan than from Mainland China, and almost as many from Malaysia as from Russia. QS does not, however, go down to subject level when listing geographic distribution.
He then refers to the case of University College Cork (UCC) asking faculty to solicit friends in other institutions to vote for UCC. This is definitely a bad practice, but it was in violation of QS guidelines and QS have investigated. I do not know what came of the investigation but it is worth noting that the message would not have been an issue if it had referred to the THE survey.

On balance, I would agree that THE's survey methodology is less dubious than QS's and less likely to be influenced by energetic PR campaigns. It would certainly be a good idea if the weighting of the QS survey was reduced and if there was more rigorous screening and classification of potential respondents.

But I think we also have to bear in mind that QS does prohibit respondents from voting for their own universities and it does average results out over a five-year period (formerly three years).
It is interesting that while THE does not usually combine and average survey results, it did so in the 2016-17 world rankings, combining the 2015 and 2016 survey results. This was, I suspect, probably because of a substantial drop in 2016 in the percentage of respondents from the arts and humanities that would, if unadjusted, have caused a serious problem for UK universities, especially those in the Russell Group.

Leiter then goes on to condemn QS for its dubious business practices. He reports that THE dropped QS because of those practices. That is what THE says, but it is widely rumoured within the rankings industry that THE was also interested in the financial advantages of a direct partnership with Thomson Reuters rather than getting data from QS.
He also refers to QS's hosting a series of "World Class events" where world university leaders pay $950 for "seminar, dinners, coffee breaks" and "learn best practice for branding and marketing your institution through case studies and expert knowledge", and the QS Stars plan, where universities pay to be audited by QS in return for stars that they can use for promotion and advertising. I would add to his criticism that the Stars program has apparently undergone a typical "grade inflation" with the number of five-star universities increasing all the time.

Also, QS offers specific consulting services and it has a large number of clients from around the world, although there are many more from Australia and Indonesia than from Canada and the US. Of the three from the US, one is MIT, which has been number one in the QS world rankings since 2012, a position it probably achieved after a change in the way in which faculty were classified.

It would, however, be misleading to suggest that THE is any better in this respect. Since 2014 it has launched a serious and unapologetic "monetisation of data" program.
There are events such as the forthcoming world "academic summit" where, for 1,199 GBP (standard university) or 2,200 GBP (corporate), delegates can get "Exclusive insight into the 2017 Times Higher Education World University Rankings at the official launch and rankings masterclass", plus a "prestigious gala dinner, drinks reception and other networking events". THE also provides a variety of benchmarking and performance analysis services, branding, advertising and reputation management campaigns and a range of silver and gold profiles, including adverts and sponsored supplements. THE's data clients include some illustrious names like the National University of Singapore and Trinity College Dublin plus some less well-known places such as Federico Santa Maria Technical University, Orebro University, King Abdulaziz University, National Research Nuclear University MEPhI Moscow, and Charles Darwin University.

Among THE's activities are regional events that promise "partnership opportunities for global thought leaders" and where rankings like "the WUR are presented at these events with our award-winning data team on hand to explain them, allowing institutions better understanding of their findings".
At some of these summits the rankings presented are trimmed and tweaked and somehow the hosts emerge in a favourable light. In February 2015, for example, THE held a Middle East and North Africa (MENA) summit that included a "snapshot ranking" that put Texas A and M University Qatar, a branch campus that offers nothing but engineering courses, in first place and Qatar University in fourth. The ranking consisted of precisely one indicator out of the 13 that make up THE's world university rankings: field- and year-normalised citations. United Arab Emirates University (UAEU) was 11th and the American University of Sharjah in the UAE 14th.

The next MENA summit was held in January 2016 in Al Ain in the UAE. There was no snapshot this time and the methodology for the MENA rankings included the 13 indicators of THE's world rankings. Host country universities were now in fifth (UAEU) and eighth place (American University of Sharjah). Texas A and M Qatar was not ranked and Qatar University fell to sixth place.
Something similar happened in Africa. In 2015, THE went to the University of Johannesburg for a summit that brought together "outstanding global thought leaders from industry, government, higher education and research" and which unveiled THE's Africa ranking, based on citations (with the innovation of fractional counting), that put the host university in ninth place and the University of Ghana in twelfth.

In 2016 the show moved on to the University of Ghana, where another ranking was produced based on all 13 world ranking indicators. This time the University of Johannesburg did not take part and the University of Ghana went from 12th place to 7th.

I may have missed something but so far I do not see any sign of THE Africa or MENA summits planned for 2017. If so, then African and MENA university leaders are to be congratulated for a very healthy scepticism.
To be fair, THE does not seem to have done any methodological tweaking for this year's Asian, Asia Pacific and Latin American rankings.

Leiter concludes that American academics should boycott the QS survey but not THE's and that they should lobby THE to improve its survey practices. That, I suspect, is pretty much a nonstarter. QS has never had much of a presence in the US anyway and THE is unlikely to change significantly as long as its commercial dominance goes unchallenged and as long as scholars and administrators fail to see through its PR wizardry. It would be better for everybody to start looking beyond the "Big Three" rankings.
Monday, July 03, 2017
Proving anything you want from rankings
It seems that university rankings can be used to prove almost anything that journalists want to prove.

Ever since the Brexit referendum, experts and pundits of various kinds have been muttering about the dread disease that is undermining, or about to undermine, the research prowess of British universities. The malignity of Brexit is so great that it can send its evil rays back from the future.

Last year, as several British universities tumbled down the Quacquarelli Symonds (QS) world rankings, the Independent claimed that "[p]ost-Brexit uncertainty and long-term funding issues have seen storm clouds gather over UK higher education in this year's QS World University Rankings".

It is difficult to figure out how anxiety about a vote that took place on June 23rd 2016 could affect a ranking based on institutional data for 2014 and bibliometric data from the previous five years.
It is just about possible that some academics or employers might have woken up on June 24th to see that their intellectual inferiors had joined the orcs to raze the ivory towers of Baggins University and Bree Poly and then rushed to send a late response to the QS opinion survey. But QS, to their credit, have taken steps to deal with that sort of thing by averaging out survey responses over a period of five years.

European and American universities have been complaining for a long time that they do not get enough money from the state and that their performance in the global rankings is undermined because they do not get enough international students or researchers. That is a bit more plausible. After all, income does account for three separate indicators in the Times Higher Education (THE) world rankings so reduced income would obviously cause universities to fall a bit. The scandal over Trinity College Dublin's botched rankings data submission showed precisely how much a given increase in reported total income (with research and industry income in a constant proportion) means for the THE world rankings. International metrics account for 10% of the QS rankings and 7.5% of the THE world rankings. Whether a decline in income or the number of international students has a direct effect, or indeed any effect at all, on research output or the quality of teaching is quite another matter.
The problem with claims like this is that the QS and THE rankings are very blunt instruments that should not be used to make year-by-year analyses or to influence government or university policy. There have been several changes in methodology, there are fluctuations in the distribution of survey responses by region and subject, and the average scores for indicators may go up and down as the number of participants changes. All of these mean that it is very unwise to make extravagant assertions about university quality based on what happens in those rankings.

Before making any claim based on ranking changes it would be a good idea to wait a few years until the impact of any methodological change has passed through the system.
Another variation in this genre is the recent claim in the Daily Telegraph that "British universities are slipping down the world rankings, with experts blaming the decline on pressure to admit more disadvantaged students."

Among the experts is Alan Smithers of the University of Buckingham, who is reported as saying "universities are no longer free to take their own decisions and recruit the most talented students which would ensure top positions in league tables".

There is certainly good evidence that British university courses are becoming much less rigorous. Every year reports come in about declining standards everywhere. The latest is the proposal at Oxford to allow students to take take-home exams instead of timed ones.
But it is unlikely that this could show up in the QS or THE rankings. None of the global rankings has a metric that measures the attributes of graduates except perhaps the QS employer survey. It is probable that a decline in the cognitive skills of admitted undergraduate students would eventually trickle up to the quality of research students and then to the output and quality of research, but that is not something that could happen in a single year, especially when there is so much noise generated by methodological changes.

The cold reality is that university rankings can tell us some things about universities and how they change over perhaps half a decade, and some metrics are better than others, but it is an exercise in futility to use overall rankings, or indicators subject to methodological tweaking, to argue about how political or economic changes are impacting western universities.
The latest improbable claim about rankings is that Oxford's achieving parity with Cambridge in the THE reputation rankings was the result of a positive image created by appointing its first female Vice-Chancellor.

Phil Baty, THE's editor, is reported as saying that 'Oxford University's move to appoint its first female Vice Chancellor sent a "symbolic" wave around the world which created a positive image for the institution among academics.'

There is a bit of a problem here. Louise Richardson was appointed Vice-Chancellor in January 2016. The polling for the 2016 THE reputation rankings took place between January and March 2016. One would expect that if the appointment of Richardson had any effect on academic opinion at all then it would be in those months. It certainly seems more likely than an impact that was delayed for more than a year. If the appointment did affect the reputation rankings then it was apparently a negative one, for Oxford's score fell massively from 80.4 in 2015 to 67.6 in 2016 (compared to 100 for Harvard in both years).
So, did Oxford suffer in 2016 because spiteful curmudgeons were infuriated by an upstart intruding into the dreaming spires?

The collapse of Oxford in the 2016 reputation rankings and its slight recovery in 2017 almost certainly had nothing to do with the new Vice-Chancellor.

Take a look at the table below. Oxford's reputation score tracks the percentage of THE survey responses from the arts and humanities: it goes up when there are more respondents from those subjects and goes down when there are fewer (a quick correlation check, sketched after the table, bears this out). This is the case for British universities in general and also for Cambridge, except for this year.

The general trend since 2011 has been for the gap between Cambridge and Oxford to fall steadily, and that trend began before Oxford acquired a new Vice-Chancellor, although it accelerated and finally erased the gap this year.

What is unusual about this year's reputation ranking is not that Oxford recovered as the number of arts and humanities respondents increased but that Cambridge continued to fall.
I wonder if it has something to do with Cambridge's "disastrous" performance in the THE research impact (citations) indicator in recent years. In the 2014-15 world rankings Cambridge was 28th, behind places like Federico Santa Maria Technical University and Bogazici University. In 2015-16 it was 27th, behind St Petersburg Polytechnic University. But a greater humiliation came in the 2016-17 rankings. Cambridge fell to 31st in the world for research impact. Even worse, it was well behind Anglia Ruskin University, a former art school. For research impact Cambridge University wasn't the best university in Europe or England. It wasn't even the best in Cambridge, at least if you trusted the sophisticated THE rankings.

Rankings are not entirely worthless and if they did not exist no doubt they would somehow be invented. But it is doing nobody any good to use them to promote the special interests of university bureaucrats and insecure senior academics.
Table: Scores in THE reputation rankings

Year | Oxford | Cambridge | Gap  | % responses arts and humanities
2011 | 68.6   | 80.7      | 12.1 | --
2012 | 71.2   | 80.7      | 9.5  | 7%
2013 | 73.0   | 81.3      | 8.3  | 10.5%
2014 | 67.8   | 74.3      | 6.5  | 9%
2015 | 80.4   | 84.3      | 3.9  | 16%
2016 | 67.6   | 72.2      | 4.6  | 9%
2017 | 69.1   | 69.1      | 0    | 12.5%
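Since all the figures are in the table, the claim that Oxford's score tracks the arts and humanities share is easy to check. Below is a minimal sketch using the six years for which the percentage is reported; with only six data points the result is suggestive rather than conclusive.

```python
# Correlation between Oxford's THE reputation score and the share of survey
# responses from the arts and humanities, 2012-2017 (figures from the table above).
from statistics import correlation  # requires Python 3.10+

oxford = [71.2, 73.0, 67.8, 80.4, 67.6, 69.1]
arts_humanities_pct = [7, 10.5, 9, 16, 9, 12.5]

print(f"Pearson r = {correlation(oxford, arts_humanities_pct):.2f}")  # about 0.74
```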
Sunday, June 18, 2017
Comparing the THE and QS Academic Reputation Surveys
Times Higher Education (THE) has just published its 2017 reputation rankings which include 100 universities. These are based on a survey distributed between January and March of this year and will be included, after standardisation, in the 2017-18 (or 2018) World University Rankings scheduled for publication in a few months. In the forthcoming world rankings the reputation survey will be divided into two metrics in the research and teaching indicator groups, with a combined weighting of 33 percent. The survey asked about research and postgraduate teaching but since the correlation between these two questions is very high there is effectively only one indicator.
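For readers wondering what "after standardisation" involves: raw indicator values such as survey vote counts are not directly comparable across indicators, so rankers typically convert them to z-scores and then to cumulative-probability scores on a 0-100 scale. The sketch below illustrates that general idea with made-up vote counts; it is a generic illustration, not THE's exact procedure.

```python
# Illustration of z-score standardisation: map raw values (e.g. survey vote
# counts) to 0-100 via the normal CDF. Not THE's exact procedure.
from statistics import NormalDist, mean, stdev

def standardise(values):
    """Convert raw values to 0-100 cumulative-probability scores."""
    mu, sigma = mean(values), stdev(values)
    nd = NormalDist()
    return [100 * nd.cdf((v - mu) / sigma) for v in values]

raw_votes = [2500, 900, 400, 150, 60]  # hypothetical vote counts for five universities
print([round(score, 1) for score in standardise(raw_votes)])
```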
The QS world rankings released last week included scores derived from two surveys, one of academics with a 40% weighting and one of employers with 10%. The academic survey was concerned only with research.
The methodology of the THE survey is relatively simple. The respondents are drawn from the database of researchers with publications in Scopus indexed journals, in other words those who get to be listed as corresponding author. THE claims that this makes them experienced senior researchers although in many parts of the world being a member or leader of a research team often has more to do with politics than merit.
In contrast, the QS methodology has changed quite a lot over the last few years. It began with scouring the mailing lists of World Scientific, a Singapore based academic publisher with links to Imperial College London, then adding various other channels including lists supplied by institutions and sign up facilities for potential respondents. The result is a survey that appears more inclusive than THE's with more respondents from outside the elite but one whose validity may be rather suspect.
The THE ranking found that there were six super-brand universities that stood out from everyone else: Harvard, MIT, Stanford, Cambridge, Oxford, and Berkeley. There is a big gap between Berkeley and number seven Princeton, and then a long smooth slope continues.
After that, the ranking is dominated by English speaking universities, with the USA contributing 42, the UK 10, Canada 3 and Australia 3. East Asia and the Chinese diaspora (Hong Kong, Taiwan and Singapore) are fairly well represented, while South and Central Asia, the Middle East and Africa are absent.
For any survey a great deal depends on how the forms are distributed. Last year, the THE survey had a lot more responses from the social sciences, including economics and business studies, and fewer from the arts and the humanities, and that contributed to some Asian universities rising and some British ones falling.
Such falls are typically attributed in the education establishment media to anxiety about the looming horrors of Brexit, the vicious snatching of research funds and the rising tide of hostility to international students.
This year British universities did a bit better in the THE reputation ranking, with five going up, three staying put and three going down. No doubt we will soon hear about the invigorating effects of Brexit and the benefits of austerity. Perhaps also it might have something to do with the number of survey responses from the arts and humanities going up from 9% to 12.5%, something that would surely benefit UK universities.
The QS reputation indicator has the same universities in the top six but not in quite the same order: Cambridge, fourth in THE, is second in the QS indicator. After that it starts looking very different. Number seven is the University of Tokyo, which THE puts in 11th place for academic reputation. Other Asian universities do much better in the QS indicator: the National University of Singapore is 11th (27th in THE), Nanyang Technological University Singapore is 50th (THE 81-90 band), Peking University is 14th (THE 17th) and Chulalongkorn University Thailand is 99th (not in the THE top 100).
It is noticeable that Latin American universities such as the University of Sao Paulo, the University of Buenos Aires and the Pontifical Catholic University of Chile get a higher placing in the QS indicator than they do in the THE ranking as do some Southern European universities such as Barcelona, Sapienza and Bologna.
The THE reputation ranking gives us a snapshot of the current views of the world's academic elite and probably underestimates the rising universities of Greater China and Korea. QS cast their nets further and have probably caught a few of tomorrow's world class institutions although I suspect that the Latin American high fliers, apart from Sao Paulo, are very overrated.
Thursday, June 15, 2017
The Abuse and Use of Rankings
International university rankings have become a substantial industry since the first appearance of the Shanghai rankings (Academic Ranking of World Universities or ARWU) back in 2003. The various rankings are now watched closely by governments and media, and for some students they play a significant role in choosing universities. They have become a factor in national higher education policies and are an important element in the race to enter and dominate the lucrative transnational higher education market. In Malaysia a local newspaper, Utusan Malaysia, recently had a full page on the latest QS world rankings including a half page of congratulations from the Malaysian Qualifications Agency for nine universities who are part of a state-backed export drive.
Reaction to international rankings often goes to one of two extremes, either outright rejection or uncritical praise, sometimes descending into grovelling flattery that would make Uriah Heep ashamed (the revered QS rankings, Phil Baty a thought leader). The problem with the first, which is certainly very understandable, is that it is unrealistic. If every international ranking suddenly stopped publication we would just have, as we did before, an informal ranking system based largely on reputation, stereotypes and prejudice.
On the other hand, many academics and bureaucrats find rankings very useful. It is striking that university administrators, the media and national governments have been so tolerant of some of the absurdities that Times Higher Education (THE) has announced in recent years. Recently, THE's Asian rankings had Veltech University as the third best university in India and the best in Asia for research impact, the result of exactly one researcher assiduously citing himself. This passed almost unnoticed in the Indian press and seems to have aroused no great interest among Indian academics apart from a couple of blog posts. Equally, when Universiti Tunku Abdul Rahman (UTAR), a private Malaysian university, was declared to be the second best university in the country and the best for research impact, on the strength of a single researcher's participation in a high-profile global medical project, there was no apparent response from anyone.
International rankings have also become a weapon in the drive by universities to maintain or increase their access to public funds. British and Irish universities often complain that their fall in the rankings is all the fault of the government for not providing enough money. Almost any result in the better known rankings can be used to prop up the narrative of western universities starved of funds and international researchers and students.
Neither of these two views is really valid. Rankings can tell us a great deal about the way that higher education and research are going. The early Shanghai rankings indicated that China was a long way behind the West and that research in continental Europe was inferior to that in the USA. A recent analysis by Nature Index shows that American research is declining and that the decline is concentrated in diverse Democrat-voting states such as California, Massachusetts, Illinois and New York.

But if university rankings are useful they are not equally so, and neither are the various indicators from which they are constructed.
Ranking indicators that rely on self-submitted information should be mistrusted. Even if everybody concerned is fanatically honest, there are many ways in which data can be manipulated, massaged, refined, defined and redefined, analysed and distorted as it makes its way from branch campuses, affiliated colleges and research institutes through central administration to the number-munching programs of the rankers.

Then of course there are the questionable validation processes within the ranking organisations. There was a much-publicised case concerning Trinity College Dublin where for two years in a row the rankers missed an error of orders of magnitude in the data submitted for three income indicators.
Any metric that measures inputs rather than outputs should be approached with caution, including THE's measures of income, which amount to a total weighting of 10.75%. THE and QS both have indicators that count staff resources. It is interesting to have this sort of information but there is no guarantee that having loads of money or staff will lead to quality, whether of research, teaching or anything else.

Reputation survey data is also problematic. It is obviously subjective, although that is not necessarily a bad thing, and everything depends on the distribution of responses between countries, disciplines, subjects and levels of seniority. Take a look at the latest QS rankings and the percentages of respondents from various countries.
Canada has 3.5% of survey respondents and China has 1.7%.
Australia has 4% and Russia 4.2%.
Kazakhstan has 2.1% and India 2.3%.
There ought to be a sensible middle road between rejecting rankings altogether and passively accepting the errors, anomalies and biases of the popular rankers.
Universities and governments should abide by a self-denying ordinance and reject ranking results that challenge common sense or contradict accepted national rankings. I remember a few years ago someone at Duke University saying that they were puzzled why the THES-QS rankings put the school in first place for faculty-student ratio when this contradicted data in the US News rankings. Few, if any, major universities or higher education ministers seem to have done anything like this lately.
It would also be a good idea if universities and governments stopped looking at rankings holistically and started setting targets according to specific indicators. A high-flying research university could refer to the Leiden Ranking, the Nature Index or the Nature & Science and Publications indicators in ARWU. New universities could target a place in the Excellence indicator of the Webometrics rankings, which lists 5,777 institutions as having some sort of research presence.
As for the teaching mission, the most directly relevant indicators are the QS employer survey in the world rankings, the QS Graduate Employability Index, and the Global University Employability Ranking published by THE.
Governments and universities would be advised not to get too excited about a strong performance in the rankings. What the rankings have given, the rankings can take away.