Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Thursday, October 29, 2015
Worth Reading 2
University ranking methodologies. An interview with Ben Sowter about the Quacquarelli Symonds World University Ranking
Alberto Baccini, Antonio Banfi, Giuseppe De Nicolao, Paola Galimberti
RT. A Journal on Research Policy & Evaluation 1 (2015)
Thursday, October 22, 2015
Even the Spectator reads the THE rankings
The influence of the global rankings, especially the Times Higher Education (THE) World University Rankings, appears to have no limits.
An article by Harry Mount in the Spectator describes the changing educational background of the leaders of the Labour Party. The top ranks used to be filled by graduates of Oxford (Denis Healey, Harold Wilson, Tony Blair, the Milibands, Ed Balls), Cambridge (Tristram Hunt) and Edinburgh (Gordon Brown).
Now they have been replaced by the alumni of Brunel and Birkbeck (John McDonnell), Sussex (Hilary Benn and Owen Smith), Nottingham (Michael Dugher), Westminster (Gloria De Piero) and Hull (Tom Watson and Rosie Winterton). Jeremy Corbyn lasted a year at the Polytechnic of North London, now London Metropolitan University.
Mount observes that Oxford was second in the latest edition of the THE world rankings, Hull 401st and London Metropolitan unranked.
It is only fair to point out that participation in the THE rankings is voluntary so maybe London Metropolitan could have been ranked if they had bothered to send in the data.
Not everyone is impressed by the THE rankings. "Tony Dark" comments:
"Amusing to note the reference to the Times Higher Education world ranking: this allegedly authoritative table is produced by a handful of hacks, and their hired statisticians, from a journal so insignificant that hardly anyone even in universities reads it. The other allegedly authoritative table, emanating from an organisation called QS, is largely driven by another clique of journos who split from the Times Higher . And the heads of multi million pound universities quail before the wondrous listings generated by these miniscule cabals. A mad world, my masters."
Sunday, October 18, 2015
Going Up and Going Down
A revised version of a previous post has been posted at University World News. Readers are welcome to comment here.
Sunday, October 11, 2015
More on Politics and Rankings
The Higher Education Minister of Malaysia has praised the country's leading university, Universiti Malaya (UM) for getting into the top 150 of the Quacquarelli Symonds (QS) World University Rankings. He also noted that UM and other Malaysian universities had done well in the QS subject rankings.
The problem with relying on QS or Times Higher Education (THE) is that they are prone to volatility because of their reliance on reputation surveys, which can be unstable outside the top dozen or so universities. Things have been made worse this year by methodological changes. In the case of QS, one change was to give more credit to citations in the humanities and social sciences, thereby helping universities that publish mainly or entirely in English.
A more consistent view of university performance might be found in the Shanghai or US News rankings.
Rankings Become Big Politics
University performance in global rankings has become a favorite weapon of politicians around the world. Scotland's First Minister has noted that there are five Scottish universities in the Times Higher Education World University Rankings and that the Scottish government will "continue to work with our universities to make sure that they continue to be that fantastic success story".
She did not mention that there are only two Scottish universities in the top 200 of the Shanghai rankings and in the US News Best Global Universities.
Thursday, October 08, 2015
Tokyo Metropolitan University is Still in the Japanese Top Ten
Until recently Tokyo Metropolitan University had an advertisement with Times Higher Education proclaiming its perfect score of 100 for citations. This year the score fell to 72.2, and so now it just says "TMU ranks 9th among Japanese universities in the Times Higher Education World University Rankings 2015-2016".
I hope they got a discount.
Saturday, October 03, 2015
Where Should Rankers get Data From?
Times Higher Education (THE) have started publishing some basic university statistics on their rankings page: number of students, student-staff ratio, international students and female-male ratio.
Already some observers have noted that the data does not always match that found in institutional and official sources. I have heard that the number of students given for several German universities is significantly lower than that found in other sources.
The Online Citizen in Singapore has found that the island's two leading tertiary institutions, the National University of Singapore and Nanyang Technological University, claim 34% and 33% international students respectively on the THE site, although in 2013 the Minister of Education had said that the proportion of international students in Singaporean universities was only 16%.
There are several plausible and innocent explanations for this and similar discrepancies. It could be that part-time students, branch campuses, online students, permanent residents, research institutes, or commuters living in Malaysia are counted in one set of figures but not the other.
But there is a serious and general problem with institutional data for university rankings. Even if everybody concerned is completely honest, there are many points at which ambiguous definitions, conflicting estimates, and duplication or omission of data can undermine the accuracy of ranking indicators. In the case of Germany, there might be some argument over whether doctoral candidates count as students or as teaching and/or research staff.
QS used to have a validation hierarchy: national statistics first, followed by institutional data, data from websites, old data, third-party data and smart averages, in that order. If it is still applied rigorously, this would be the best approach.
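To make the idea concrete, here is a minimal sketch in Python of how such a hierarchy might be applied: each data point is taken from the most trusted source that actually reports it, and the source used is recorded so discrepancies can be traced. The source labels, field names and figures are hypothetical illustrations, not QS's actual implementation.

# A minimal sketch of a source-validation hierarchy of the kind QS is said
# to have used. Source names and numbers are invented for illustration.
PRIORITY = [
    "national_statistics",   # most trusted
    "institutional_return",
    "university_website",
    "previous_year",
    "third_party",
    "smart_average",         # last resort
]

def resolve(field, sources):
    """Return (value, source) for a field, taking the most trusted source that reports it."""
    for source in PRIORITY:
        value = sources.get(source, {}).get(field)
        if value is not None:
            return value, source
    return None, None

# Example: conflicting student counts for one hypothetical university.
reported = {
    "institutional_return": {"students": 41000, "intl_students": 0.34},
    "national_statistics": {"students": 38500},
    "previous_year": {"intl_students": 0.16},
}

print(resolve("students", reported))       # (38500, 'national_statistics')
print(resolve("intl_students", reported))  # (0.34, 'institutional_return')

The point of recording the winning source is that a ranker could then explain, for any published figure, why it differs from what a university or a ministry reports.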
I understand that both QS and THE reserve the right to overrule institutional data, although how strictly they do so I do not know. THE have a particularly difficult task since they allow universities to opt in or out as they please. Should THE be too strict about the data supplied, a university might simply decide not to be ranked for a year.
On balance, it is probably good sense for ranking organisations to rely on publicly accessible data when they can and to minimise input from universities.
Friday, October 02, 2015
Very Interesting Rankings from Times Higher Education
The latest edition of the Times Higher Education (THE) World University Rankings has just been published, along with a big dose of self-flattery and congratulations to the winners of what is beginning to look more like a lottery than an objective exercise in comparative assessment.
The background to the story is that at the end of last year THE broke with their data suppliers Thomson Reuters (TR) and announced the dawn of a new era of transparency and accountability.
There were quite a few things wrong with the THE rankings, especially with the citations indicator, which supposedly measured research impact and was given nearly a third of the total weighting. This meant that THE was faced with a serious dilemma. Keeping the old methodology would be a problem, but radical reform would raise the question of why THE would want to change what they claimed was a uniquely trusted and sophisticated methodology with carefully calibrated indicators.
It seems that THE have decided to make a limited number of changes but to postpone making a decision about other issues.
They have broadened the academic reputation survey, sending out forms in more languages and getting more responses from outside the USA. Respondents are now drawn from authors with publications in the Scopus database, which is much larger than the Web of Science, and the information about publications and citations now comes from Scopus as well. In addition, THE have excluded 649 "freakish" multi-author papers from their calculations and diluted the effect of the regional modification that boosted the scores in the citations indicator of low-performing countries.
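For readers curious about the mechanics, here is a rough sketch, based on my understanding of the previously published methodology rather than anything THE has confirmed for this year, of what the regional modification and its dilution amount to. The function name and all the numbers are invented for illustration.

# Rough sketch: as I understand it, a university's field-normalized citation
# impact was divided by the square root of its country's average impact,
# which flatters universities in low-citation countries. Diluting the
# modification means blending the adjusted and unadjusted figures.
# All values below are invented.
from math import sqrt

def citations_score(university_impact, country_avg_impact, dilution=0.5):
    """Blend the unadjusted impact with the regionally modified impact."""
    modified = university_impact / sqrt(country_avg_impact)
    return (1 - dilution) * university_impact + dilution * modified

# A university with modest impact in a low-citation country:
print(citations_score(0.6, 0.4, dilution=1.0))  # full modification, about 0.95
print(citations_score(0.6, 0.4, dilution=0.5))  # diluted, about 0.77

Whatever the exact weights, the direction of the effect is clear: the less the country adjustment counts, the more universities that previously benefited from it will fall.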
These changes have led to implausible fluctuations with some institutions rising or falling dozens or hundreds of places. Fortunately for THE, the latest winners are happy to trumpet their success and the losers so far seem to have lapsed into an embarrassed silence.
When they were published on the 30th of September the rankings provided lots of headline fodder about who was up or down.
The Irish Times announced that the rankings showed Trinity College Dublin had fallen while University College Dublin was rising.
In the Netherlands the University of Twente bragged about its "sensationally higher scores".
Study International asserted that "Asia Falters" and that Britain and the US were still dominant in higher education.
The London Daily Telegraph claimed that European universities were matching the US.
The Hindu found something to boast about by noting that India was at last the equal of co-BRICS member Brazil.
Russian media celebrated the remarkable achievement of Lomonosov Moscow State University in rising 35 places.
And, of course, the standard THE narrative was trotted out again: British universities are wonderful, but they will only go on being wonderful if they are given as much money as they want and are allowed to admit as many overseas students as they want.
The latest rankings support this narrative of British excellence by showing Oxford and Cambridge overtaking Harvard, which was pushed into sixth place. But is such a claim believable? Has anything happened in the labs or lecture halls at any of those places between 2014 and 2015 to cause such a shift?
In reality, what probably happened was that the Oxbridge duo were not actually doing anything better this year but that Harvard's eclipse came from a large drop, from 92.9 to 83.6 points, for THE's composite teaching indicator. Did Harvard's teaching really deteriorate over twelve months? It is more likely that there were relatively fewer American respondents in the THE survey, but one cannot be sure because there are four other statistics bundled into the indicator.
While British universities appeared to do well, French ones appeared to perform disastrously. The École Normale Supérieure recorded a substantial gain, going from 78th to 54th place, but every other French institution in the rankings fell, sometimes by dozens of places. École Polytechnique went from 61st place to 101st, Université Paris-Sud from 120th to 188th, and the University of Strasbourg from the 201-225 band to 301-350, in every case because of a substantial fall in the citations indicator. If switching to Scopus was intended to help non-English-speaking countries, it did not do France any good.
Meanwhile, the advance of Asia has apparently come to an end or gone into screeching reverse. Many Asian universities slipped down the ladder, although the top Chinese schools held their ground. Some Japanese and Korean universities fell dozens of places. The University of Tokyo went from 23rd to 43rd place, largely because of a fall in the citations indicator from 74.7 points to 60.9, and the University of Kyoto from 59th to 88th, with another drop in the score for citations. Among the casualties was Tokyo Metropolitan University, which used to advertise its perfect score of 100 for citations on the THE website. This year, stripped of the citations for mega-papers in physics, its citation score dropped to a rather tepid 72.2.
The Korean flagships have also foundered. Seoul National University fell 35 places and the Korea Advanced Institute of Science and Technology (KAIST) fell 66, largely because of a decline in the scores for teaching and research. Pohang University of Science and Technology (POSTECH) fell 50 places, losing points in all indicators except income from industry.
The most catastrophic fall was in Turkey. There were four Turkish universities in the top 200 last year. All of them have dropped out. Several Turkish universities contributed to the Large Hadron Collider project, with its multiple authors and multiple citations, and they also benefited from producing comparatively few research papers and from the regional modification, which gave them artificially high scores for the citations indicator in 2014 but not this year.
The worst case was Middle East Technical University, which held 85th place in 2014, helped by an outstanding score of 92 for citations and reasonable scores for the other indicators. This year it was in the 501-600 band, with reduced scores for everything except Industry Income and a very low score of 28.8 for citations.
The new rankings appear to have restored the privilege given to medical research. In the upper reaches we find St George's, University of London, a medical school which according to THE is the world's leading university for research impact; Charité - Universitätsmedizin Berlin, a teaching hospital affiliated to Humboldt University and the Free University of Berlin; and Oregon Health and Science University.
It also appears that THE's methodology continues to give an undeserved advantage to small or specialized institutions such as the Scuola Superiore Sant’Anna in Pisa, which does not appear to be a truly independent university, the Copenhagen Business School, and Rush University in Chicago, the academic branch of a private hospital.
These rankings appear so far to have got a good reception in the mainstream press, although it is likely that before long we will hear some negative reactions from independent experts and from Japan, Korea, France, Italy and the Middle East.
THE, however, have just postponed the hard decisions that they will eventually have to make.