In the previous post I referred to the vulnerabilities that have developed in the most popular world rankings (THE, QS and Shanghai ARWU): indicators with large weightings that can be influenced by universities that know how to work the system, or that are sometimes just plain lucky.
In the latest QS rankings, four universities from Mexico, Chile, Brazil and Argentina have scores above 90 for the academic reputation indicator, which carries a 40% weighting. All of these universities have low scores for citations per faculty, which seems at odds with a stellar research reputation. In three cases QS does not even list the score in its main table.
I have spent so much time on the normalised citation indicator in the THE world and regional rankings that I can hardly bear to revisit the issue. I will just mention the long list of universities that have achieved improbable glory through a few researchers, or sometimes just one, contributing to a multi-author international physics, medical or genetics project.
The Shanghai rankings were once known for their stability but have become more volatile in recent years. The villain here is the highly cited researchers indicator, which has a 20% weighting and counts the scientists included in the lists now published by Clarivate Analytics.
It seems that several universities have now become aware that recruiting a couple of extra highly cited researchers to the faculty can give them a significant boost in these rankings. Equally, if they should be so careless as to lose one or two, the ranking consequences could be most unfortunate.
In 2016 a single highly cited researcher was worth 10.3 points on the HiCi indicator in the Shanghai rankings, or 2.06 points on the overall score once the 20% weighting is applied, which is the difference between 500th and 386th place. That is a good deal, certainly much better than hiring a team of consultants or sending staff to excruciating transformational sharing sessions.
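The arithmetic above can be sketched in a few lines, assuming the overall ARWU score is a simple weighted sum of indicator scores (the 20% HiCi weight and the 10.3-point figure are from the post; the function name is mine):

```python
# Hypothetical sketch: convert a change on the HiCi indicator into
# overall ARWU points under the 20% weighting described in the post.
HICI_WEIGHT = 0.20

def overall_contribution(hici_points: float) -> float:
    """Weighted contribution of HiCi indicator points to the overall score."""
    return hici_points * HICI_WEIGHT

# One highly cited researcher in 2016 was worth about 10.3 HiCi points,
# i.e. roughly 2.06 points on the overall score.
print(round(overall_contribution(10.3), 2))
```

This is only the linear weighting step; the HiCi indicator score itself is normalised by ShanghaiRanking in ways not reproduced here.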
Of course, as the number of HiCi researchers increases, the value of each additional one diminishes, so it would make little difference if a top 20 or 30 university added or lost a couple of researchers.
Take a look at some changes in the Shanghai rankings between 2016 and 2017. Kyoto University fell three places, from 32nd to 35th, losing 0.5 points (from 37.2 to 36.7). This was due to a fall in the number of highly cited researchers from seven to five, which cut the HiCi score by 2.7 points, or a weighted 0.54 points on the overall score.
McMaster University rose from 83rd to 66th, gaining 2.5 overall points. Its HiCi score went from 32.4 to 42.3, equivalent to 1.98 weighted overall points, reflecting an increase in the number of such researchers from 10 to 15.
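The two cases above can be checked with the same weighting arithmetic, assuming a simple 20% weighted sum (the deltas are the post's figures; the helper name is mine):

```python
# Hypothetical check of the Kyoto and McMaster figures under a flat
# 20% HiCi weighting, as described in the post.
HICI_WEIGHT = 0.20

def weighted(hici_delta: float) -> float:
    """Translate a change in the HiCi score into overall points."""
    return hici_delta * HICI_WEIGHT

kyoto_delta = -2.7              # HiCi score change, 2016 -> 2017
mcmaster_delta = 42.3 - 32.4    # HiCi score rose from 32.4 to 42.3

print(round(weighted(kyoto_delta), 2))     # about -0.54 overall points
print(round(weighted(mcmaster_delta), 2))  # about 1.98 overall points
```

Note that McMaster's overall score rose 2.5 points while the HiCi change accounts for 1.98 of them, so other indicators contributed the remainder.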
Further down the charts, the University of Hamburg rose from 256th, with an overall score of 15.46, to 188th, with a score of 18.69, brought about largely by an improvement in its HiCi score from zero to 15.4, the result of acquiring two researchers.
Meanwhile, the Ecole Polytechnique in Paris fell from 303rd place to 434th, partly because of the loss of its only highly cited researcher.
It is time for ShanghaiRanking to start looking around for a Plan B for its citations indicator.