Part 2
It looks as though Duke University achieved its outstanding score on the Times Higher Education Supplement (THES) world university rankings in 2005 largely through another blunder by THES's consultants, QS. What is scandalous about this is that none of the statisticians, mathematicians and educational management experts at Duke noticed this or, if they did notice it, that they kept quiet about it.
Between 2004 and 2005 Duke went zooming up the THES rankings from 57th to 11th place. This would be truly remarkable if the THES rankings were at all accurate. It would mean that the university had in twelve months recruited hundreds of new faculty, multiplied the number of its international students, doubled its research output, convinced hundreds of academics and employers around the world of its superlative qualities, or some combination of the above. If this did not happen, it means that there must have been something wrong with THES's figures.
So how did Duke achieve its extraordinary rise? First, we had better point out that when you are constructing rankings based on thousands of students, hundreds of faculty, tens of thousands of research papers or the views of hundreds or thousands of reviewers, you are not going to get very much change from year to year if the ranking is reliable. This is why we can be fairly confident that the Shanghai Jiao Tong University index, limited though it is in many respects, is accurate. This, unfortunately, also makes it rather boring. Notice the near-total lack of interest in the latest edition of the index, which came out a few weeks ago. It is hard to get excited about Tokyo inching up one place to number 19. Wait another four decades and it will be challenging Harvard! Compare that with the catastrophic fall of Universiti Malaya between 2004 and 2005 on the THES index and the local media uproar that ensued. That was much more interesting. What a pity that it was all the result of a research error, or a "clarification of data".
Or look at what happened to Duke. Last September it crept up from number 32 to 31 on the Shanghai Jiao Tong ranking, reversing its fall from 31 in 2003 to 32 in 2004. Who is going to get excited about that?
Anyway, let's have a look at the THES rankings in detail. A brief summary, which could be passed over by those familiar with the rankings, is that in 2005 it gave a weighting of 40% to a peer review by "research-active academics", 10% to a rating by employers of "internationally mobile graduates", 20% to faculty-student ratio, 10% to the proportion of international faculty and international students, and 20% to the citations of research papers per faculty member.
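To make the arithmetic that follows easier to check, here is a minimal sketch of how those weights combine, assuming (as the published tables suggest) that each component score is already scaled from 0 to 100 and that the overall score is a simple weighted sum. The component names are mine, not QS's.

```python
# Minimal sketch of the 2005 THES weighting scheme, assuming each component
# score is already on a 0-100 scale and the overall score is a weighted sum.
WEIGHTS = {
    "peer_review": 0.40,                  # "research-active academics"
    "employer_review": 0.10,              # employers of "internationally mobile graduates"
    "faculty_student_ratio": 0.20,
    "international_faculty_students": 0.10,
    "citations_per_faculty": 0.20,
}

def overall_score(components):
    """Weighted sum of component scores, each assumed to be out of 100."""
    return sum(WEIGHTS[name] * score for name, score in components.items())
```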
I will just mention the peer review section and then go on to the faculty-student ratio where the real scandal can be found.
Peer Review
Duke got a score of 61 on the peer review compared to a top score (Berkeley) of 665 in 2004. This is equivalent to a score of 9.17 out of 100. In 2005 it got a score of 36 compared with the top score of 100 (Harvard), effectively almost quadrupling its score. To some extent, this is explained by the fact that everybody except Berkeley went up on this measure between 2004 and 2005. But Duke did much better than most. In 2004 Duke was noticeably below the mean score of the 164 universities that appeared in the top 200 in both years, but in 2005 it was slightly above the mean of these universities. Its position on this measure rose from 95th to 64th.
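For anyone who wants to check the arithmetic, here is the rough calculation behind that claim. It is a sketch only; rescaling the 2004 raw score against that year's top scorer is my assumption about how the two years can be put on the same footing.

```python
# Rescale Duke's 2004 raw peer-review score against the 2004 top score
# (Berkeley, 665), then compare with its 2005 score, already out of 100.
duke_2004 = 61 / 665 * 100              # about 9.17 out of 100
duke_2005 = 36
print(round(duke_2004, 2))              # 9.17
print(round(duke_2005 / duke_2004, 1))  # about 3.9, i.e. nearly quadrupled
```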
How is this possible? How could there be such a dramatic rise in the academic peers' opinions of Duke University? Remember that the reviewers of 2005 included those of 2004, so there must have been a very dramatic shift towards Duke among the additional reviewers in 2005.
A genuine change in opinion is thoroughly implausible. Two other explanations are more likely. One is that QS polled many more academics from the east coast or the south of the United States in 2005, perhaps because of a perceived bias towards California in 2004. The other is that the Duke academics invited to serve on the panel passed the survey to others who, in a spontaneous or organised fashion, returned a large number of responses to QS.
Faculty-Student Ratio
Next, take a look at the scores for faculty-student ratio. Duke did well on this category in 2004 with a score of 66. In that year the top scorer was the Ecole Normale Supérieure (ENS) in Paris which, with 1,800 students and 900 faculty according to QS, would have had 0.50 faculty per student. Duke's 66 would therefore correspond to about 0.33 faculty per student (66 per cent of ENS's 0.50, assuming the score is simply proportional to the ratio). This would be a very good ratio if true.
In 2005, the top scorer was the Ecole Polytechnique in Paris, which supposedly had 2,468 students and 1,900 faculty, or 0.77 faculty per student. ENS's score went down to 65, which is exactly what you would expect if its ratio remained unchanged at 0.50. Duke's score in 2005 was 56, which works out at about 0.43 faculty per student.
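Here is the back-calculation spelled out, on the assumption (mine, since QS does not publish its method) that the published score is simply the university's faculty-per-student ratio divided by the top scorer's ratio, times 100.

```python
# Back-calculating faculty-per-student ratios from the published scores,
# assuming score = (university's ratio / top scorer's ratio) * 100.
ens_2004 = 900 / 1800                            # 0.50 -- top score (100) in 2004
duke_2004 = 66 / 100 * ens_2004                  # about 0.33 faculty per student

polytechnique_2005 = 1900 / 2468                 # about 0.77 -- top score (100) in 2005
ens_2005 = ens_2004 / polytechnique_2005 * 100   # about 65, matching ENS's published score
duke_2005 = 56 / 100 * polytechnique_2005        # about 0.43 faculty per student
```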
The QS datafile for Duke gives the following demographic figures:

No. of faculty: 6,244
No. of international faculty: 825
No. of students: 12,223
However, Duke's web site currently gives figures of 13,088 students, 1,595 tenure and tenure-track faculty, and 923 others (professors of the practice, lecturers, research professors and medical associates). This makes a total of 2,518 tenure, tenure-track and other regular faculty.
So where on Earth did QS find another 3,700 plus faculty?
Take a look at this data provided by Duke. Notice anything?
STUDENTS: enrollment (full-time), Fall 2005
Undergraduate: 6,244
So what happened is that someone at QS confused the number of undergraduate students with the number of faculty, and nobody at QS or at THES noticed. Perhaps nobody at Duke noticed either, although I find that hard to believe. The mistake produced a grossly inflated score for Duke on the faculty-student ratio component and contributed to its undeserved ascent in the rankings.
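As a very rough check on the size of the inflation, here is a back-of-the-envelope calculation using Duke's own figures and the same assumed ratio-to-top-scorer scoring sketched above; a more careful calculation will have to wait.

```python
# Back-of-the-envelope corrected score, assuming the same method as above.
actual_ratio = 2518 / 13088   # about 0.19, from Duke's own published figures
top_ratio = 1900 / 2468       # about 0.77 (Ecole Polytechnique, 2005 top scorer)
print(round(actual_ratio / top_ratio * 100))  # roughly 25, versus the 56 Duke was given
```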
Anyway, it's time to stop and post this, and then work out what Duke's score really should have been.