Universities in Russia, India, Pakistan, Malaysia, Indonesia and other places often promise that one day they will be in the top 100, 50 or even 10 of one of the international rankings. It would be interesting to see how much money they propose to spend.
An intriguing aspect of this paper is the concept of noise. The authors find that the USNWR rankings show a lot of volatility, with universities bouncing up and down for no particular reason, and that any change of four places or fewer can be regarded as a random fluctuation that should not give journalists or university administrators a heart attack or send them strutting around the campus.
One of the authors, a former vice provost at Johns Hopkins, told interviewers:
'“the trustees would go bananas” when Johns Hopkins dropped in the rankings. The administration would then have to explain what had happened.
“Every year Hopkins went from 15 to 16 to 15 to 16 – and I thought, ‘What a silly waste of energy,’” Kuncl said in an interview Monday. (Johns Hopkins is currently No. 12.)
The paper found that small movements up or down in the rankings are more or less irrelevant. For most universities in the top 40, any movement of two spots or less should be considered noise, the paper said. For colleges outside the top 40, moves up or down of four spots should be thought of as noise, too.'
The amount of noise generated by a ranking is probably a good inverse indicator of its reliability. We are, after all, dealing with institutions that receive millions of dollars in funds, produce thousands of papers and tens of thousands of citations, enroll thousands of students, graduate some of them, employ hundreds of bureaucrats, faculty and adjuncts, and so on. We should not expect massive fluctuations from year to year.
I have calculated the average movement up or down the top 100 in Shanghai Jiao Tong University's Academic Ranking of World Universities (ARWU) between 2011 and 2012 and between 2012 and 2013, and in the Times Higher Education (THE) and Quacquarelli Symonds (QS) World University Rankings between 2012 and 2013. A university falling out of the top 100 altogether was counted as falling to 101st place.
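For clarity, here is a minimal sketch in Python of that calculation; the university names and ranks are hypothetical placeholders, not the published tables.

```python
# Average absolute movement in rank between two editions of a ranking.
# A university that drops out of the top 100 is treated as falling to 101st.
OUT_OF_TOP_100 = 101

def average_movement(ranks_year1, ranks_year2):
    """Average absolute change in rank for universities in year1's top 100."""
    moves = []
    for university, old_rank in ranks_year1.items():
        new_rank = ranks_year2.get(university, OUT_OF_TOP_100)
        moves.append(abs(new_rank - old_rank))
    return sum(moves) / len(moves)

# Hypothetical example: three universities, one of which drops out entirely.
year1 = {"Alpha U": 17, "Beta U": 18, "Gamma U": 86}
year2 = {"Alpha U": 18, "Beta U": 17}            # Gamma U falls below 100
print(average_movement(year1, year2))            # (1 + 1 + 15) / 3 = 5.67 or so
```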
In 2013 ARWU had a problem with Thomson Reuters, which was supposed to be preparing a new list of highly cited researchers, so the scores for that indicator were simply recycled from the 2012 rankings. This reduced the volatility of the 2013 rankings somewhat, so changes between 2011 and 2012 were also analysed.
Starting with the top 20 of ARWU, there was an average change of 0.25 places between 2012 and 2013 and 0.15 between 2011 and 2012.
Between 2011 and 2012 the University of California San Francisco fell from 17th to 18th place, Johns Hopkins rose from 18th to 17th and Tokyo University fell from 20th to 21st. There were no other changes in the top twenty: three one-place moves spread over twenty universities gives 3/20 = 0.15.
Moving on to the QS top 20 between 2012 and 2013, there were some significant changes, including Stanford rising from 15th to 7th and the University of Michigan falling from 17th to 22nd. The average change was 1.7 places.
At the top, the THE world rankings were somewhat less volatile than QS but much more so than ARWU. The average change was 1.2 places, and the biggest mover was University College London, which fell from 17th to 21st.
So university administrators should be concerned about any change within the top twenty of the ARWU but should not be bothered about a change of one or two places in the THE or QS rankings.
Moving on to the top 100, the average change in the ARWU was 1.66 places between 2012 and 2013 and 2.01 between 2011 and 2012.
The biggest change between 2011 and 2012 was Goettingen, which fell from 86th to the 101–150 band.
In the QS rankings between 2012 and 2013 the average change in the top 100 was 3.97 places. Substantial changes included Boston University, which fell from 64th to 79th, and the University of Birmingham, which rose from 77th to 62nd.
In the THE top 100 the average change was 5.36 places. Notable changes included Lund University falling from 82nd to 123rd, Montreal falling from 84th to 106th and King's College London rising from 57th to 38th.
So it can be concluded that the ARWU rankings are the most reliable. For the top 100, QS is more reliable than THE, but the reverse is the case for the top 20.
How to explain these differences? To be certain it would be necessary to look at the separate indicators in the three rankings, but here are some thoughts.
The dominant indicator in the QS rankings is the academic survey, which has a weighting of 40%. Any fluctuation in the survey could therefore have a disproportionate effect on the overall score. The most important single indicator in the THE rankings is Citations: Research Influence, which has a 30% weighting but contributes a higher proportion of the total score because the regional adjustment gives an extra boost to countries with a limited research base. In contrast, no indicator in the Shanghai rankings has a weighting of more than 20%.
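To see why the weighting matters, here is a back-of-the-envelope sketch with invented scores (none of these numbers come from any actual ranking): the same five-point dip in a single indicator shifts the composite score twice as far under a 40% weighting as under a 20% one.

```python
# Illustration with invented scores: a five-point swing in one indicator moves
# the weighted composite further when that indicator carries a heavier weight.

def composite(scores, weights):
    """Weighted composite score; weights are fractions that sum to 1."""
    return sum(scores[name] * weights[name] for name in weights)

weights_qs_like = {"survey": 0.40, "other": 0.60}    # one dominant indicator
weights_arwu_like = {"survey": 0.20, "other": 0.80}  # no indicator above 20%

before = {"survey": 80.0, "other": 70.0}
after = {"survey": 75.0, "other": 70.0}              # five-point dip in one indicator

qs_like_shift = composite(before, weights_qs_like) - composite(after, weights_qs_like)
arwu_like_shift = composite(before, weights_arwu_like) - composite(after, weights_arwu_like)
print(round(qs_like_shift, 2), round(arwu_like_shift, 2))  # 2.0 1.0
```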
The THE rankings include inputs such as income. An injection of research funds from a corporation would immediately improve a university's position in the income from industry, research income and total institutional income indicators. It would be a few years before the funds produced an improvement, if they ever did, in the publications indicator in ARWU, and even longer in the Citations per Faculty indicator in the QS rankings.
ARWU uses publicly available data that can be easily checked and is unlikely to fluctuate very much from year to year. THE and QS also use data submitted by institutions. There is room for error as data flows from branch campuses and research centres to the central administration and then to the rankers. QS also has the option of replacing institutional data with figures from third-party sources.
So everybody should relax when reading this year's rankings, unless your university has risen or fallen by more than two spots in ARWU, four in the QS rankings or six in THE's.