Cambridge and Harvard
After the break with THE, QS decided to continue with the old methodology of the 2004-2009 rankings. At least, that is what they said. It was therefore surprising to see that, according to data provided by QS, there were in fact a number of noticeable rises and falls between 2009 and 2010, although nothing like as many as in previous years.
For example, the University of Munich fell from 66th place to 98th, the Free University of Berlin from 70th to 94th and Stockholm University from 168th to 215th, while University College Dublin rose from 114th to 89th and Wurzburg from 309th to 215th.
But perhaps the most remarkable news was that Cambridge replaced Harvard as the world's best university. In every other ranking Harvard is well ahead.
So how did it happen? According to Martin Ince, “Harvard has taken more students since the last rankings were compiled without an equivalent increase in the number of academics.”
In other words, Harvard should have had a lower faculty-student ratio and therefore a lower score for this indicator. That is in fact what happened: Harvard's score went from 98 to 97.
Ince also says that there was an "improvement in staffing levels" at Cambridge, presumably meaning that there was an increase in the number of faculty relative to the number of students. Between 2009 and 2010 Cambridge's score for the faculty-student ratio remained the same at 100, which, since 100 is the ceiling for the indicator, is at least consistent with Ince's claim.
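To make the mechanics concrete, here is a minimal sketch of how a faculty-student ratio indicator of this kind responds when student numbers rise without a matching rise in faculty. All of the figures are hypothetical, and scaling against the best ratio in the table is an illustrative rule rather than QS's published method:

def ratio_score(faculty, students, best_ratio):
    # Scale the faculty-student ratio against the best performer so that
    # the top scorer receives 100 (an illustrative rule, not QS's own).
    return round(100 * (faculty / students) / best_ratio, 1)

best = 9_000 / 18_000                      # hypothetical best ratio in the table
print(ratio_score(4_500, 20_000, best))    # 45.0 -- before the intake rises
print(ratio_score(4_500, 21_000, best))    # 42.9 -- same faculty, more students

On a rule like this, a university that adds students without adding faculty loses a couple of points, which is the direction of Harvard's slip from 98 to 97.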
In addition to this, there was a "significant growth in the number of citations per faculty member" for Cambridge. It is not impossible that the number of citations racked up by Cambridge rose relative to Harvard, but the QS indicator counts citations over a five-year period, so even a substantial increase in publications or citations would take several years to feed through fully into this indicator. Note also that the indicator is citations per faculty, and the number of faculty at Cambridge appears to have gone up relative to Harvard, so we would expect any increase in citations to be cancelled out by a similar increase in faculty.
It looks a little odd, then, that for this indicator Cambridge's score rose from 89 to 93, four points, which at the indicator's 20 per cent weighting is worth 0.8 in the weighted total score. That, by the way, was the difference between Harvard and Cambridge in 2009.
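A quick back-of-the-envelope check of that arithmetic, using the 20 per cent weighting QS attaches to citations per faculty. The raw citation and faculty numbers are hypothetical and simply illustrate why a rise in citations can be cancelled by a matching rise in faculty:

citations_weight = 0.20                     # QS weighting for citations per faculty

# Hypothetical raw numbers: citations up 8 per cent, faculty also up 8 per cent.
old_ratio = 250_000 / 5_000                 # citations per faculty member
new_ratio = 270_000 / 5_400                 # numerator and denominator grew together
print(old_ratio, new_ratio)                 # 50.0 50.0 -- the indicator is unchanged

# The four-point rise Cambridge actually recorded on this indicator:
indicator_rise = 93 - 89
print(indicator_rise * citations_weight)    # 0.8 -- the 2009 gap between the two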
The oddity is compounded when we look at other high-ranking universities. Between 2009 and 2010 Leiden's score for citations per faculty rose from 97 to 99, Emory's from 90 to 95, Oxford's from 80 to 84 and Florida's from 70 to 75.
It would at first sight appear plausible that if Harvard, the top scorer in both years, did worse on this indicator, then everybody or nearly everybody else would do better. But if we look at universities further down the table, we find the opposite. Between 2009 and 2010 Bochum's score for this indicator fell from 43 to 34, Ghent's from 43 to 37, Belfast's from 44 to 35 and so on.
Could it be that there was some subtle and unannounced change in the method by which the raw scores were transformed into indicator scores? Is it just a coincidence that the change was sufficient to erase the difference between Harvard and Cambridge?
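For illustration only, the sketch below shows how much the transformation from raw data to indicator scores can matter. Both methods, scaling against the top scorer and converting z-scores into percentile-style scores, are assumptions chosen for the example rather than QS's published procedures, and the raw figures are invented:

from statistics import NormalDist, mean, pstdev

raw = [20.0, 7.1, 6.9, 5.0, 3.2]            # invented citations-per-faculty values

def scale_to_top(values):
    # Old-style scaling: the top scorer gets 100, everyone else pro rata.
    top = max(values)
    return [round(100 * v / top, 1) for v in values]

def z_to_percentile(values):
    # Alternative scaling: z-scores mapped to percentile-style scores.
    mu, sd = mean(values), pstdev(values)
    return [round(100 * NormalDist().cdf((v - mu) / sd), 1) for v in values]

for method in (scale_to_top, z_to_percentile):
    scores = method(raw)
    print(method.__name__, scores,
          "gap between first and second:", round(scores[0] - scores[1], 1))

The same raw data produce a noticeably different gap between first and second place under the two rules, so an unannounced switch of this kind could indeed move a pair of closely matched universities past each other.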
http://www.wiziq.com/tutorial/90743-QS-World-University_Rankings-top-500
Looking at other citation rankings it is clear that Harvard (and other top US schools) outperforms Cambridge on a per-faculty basis. THE and QS obviously manipulate the numbers in favor of the British universities. Also, is "faculty" defined differently at Harvard and Cambridge? Would all supervisors (for tutorials) at Cambridge be counted as faculty, for instance? It is known that some supervisors are just grad students...