I am feeling a bit embarrassed. In a recent post I wrote about the Shanghai Rankings (ARWU) being a bit boring (which is good) because university ranks usually do not change very much. But then I noticed that a couple of Australian universities did very well in the latest rankings. One of them, the Australian National University (ANU), has risen a spectacular (for ARWU) 31 places over last year. The Financial Review says that "[u]niversity scientific research has boosted the position of two Australian universities in a global ranking of higher education providers."
The ranking is ARWU and the rise in the ranking is linked to the economic contribution of Australian universities, especially those in the Group of Eight.
So how well did Australian universities do? The top performer, as in previous years, is the University of Melbourne, which went up a spot to 38th place. Two other universities went up a lot in a very un-Shanghainese way: ANU, already mentioned, from 69th to 38th place, and the University of Sydney from 83rd to 68th.
The University of Queensland was unchanged in 55th place while Monash fell from 78th to 91st and the University of Western Australia from 91st to 93rd.
How did ANU and Sydney do it? The ANU scores for Nobel and Fields awards were unchanged. Publications were up a bit and papers in Nature and Science down a bit.
What made the difference was the score for highly cited researchers, derived from lists kept by Clarivate Analytics, which rose from 15.4 to 23.5, a gain of 8.1 or, after weighting, 1.62 points of the overall score. ANU's total score rose by 1.9 between 2017 and 2018, so those highly cited researchers accounted for most of the improvement.
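For anyone who wants to check that arithmetic, here is a minimal sketch, assuming ARWU's published weighting in which the highly cited researchers (HiCi) indicator carries 20 per cent of the overall score; the indicator scores are the ones quoted above.

```python
# Rough check of the HiCi arithmetic, assuming ARWU's published 20% weighting
# for the highly cited researchers indicator. Indicator scores are the ones
# quoted in the post (2017 ranking: 15.4; 2018 ranking: 23.5).
HICI_WEIGHT = 0.20

hici_2017 = 15.4
hici_2018 = 23.5

indicator_gain = hici_2018 - hici_2017        # 8.1 points on the HiCi indicator
weighted_gain = indicator_gain * HICI_WEIGHT  # about 1.62 points of the overall score

print(f"HiCi indicator gain: {indicator_gain:.1f}")
print(f"Contribution to overall score: {weighted_gain:.2f}")
```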
In 2016 ANU had two researchers on the list, which was used for the 2017 rankings. One of them was also on the 2017 list, used for the 2018 rankings. In 2017 there were six ANU highly cited researchers: one from the previous year, one who had moved from MIT, and four long-serving ANU researchers.
Let's be clear. ANU has not been handing out unusual contracts or poaching from other institutions. It has grown its own researchers and should be congratulated.
But using an indicator where a single researcher can lift a top 100 university seven or eight places is an invitation to perverse consequences. ARWU should consider whether it is time to explore other measures of research impact.
The improved score for the University of Sydney resulted from an increase between 2016 and 2017 in the number of articles published in the Science Citation Index Expanded and the Social Sciences Citation Index.
1 comment:
It might be time for ARWU to consider a major rethink of the Hi-Ci indicator. The presence of a single Hi-Ci researcher can result in a ranking of more than 100 places higher for a university ranked around 500th. A more accurate and stable measure of research impact would be the total highly cited article count for a university over a five-year period. Many of us use this indicator and find it more useful for benchmarking than the number of Hi-Ci researchers.