Top Universities Ranked by Research Impact
The THES-QS World University Rankings, and their bulky offspring, the Guide to the World’s Top Universities (London: QS Quacquarelli Symonds), are strange documents, full of obvious errors and repeated contradictions. Thus, the Guide contains data on student-faculty ratios that are completely different from those used in the top 200 rankings published in the THES, even while it talks about how robust such a measure is. The Guide also provides, for each of the top 100 universities, a figure for research impact, that is, the number of citations divided by the number of papers. In other words, it indicates how interesting other researchers found each institution’s research. These figures completely undermine the credibility of the “peer review” as a measure of research expertise.
The table below is a re-ranking of the THES top 100 universities for 2006 by research impact, and therefore by overall quality of research. This is not by any means a perfect measure. For a start, the natural sciences and medicine do a lot more citing than other disciplines, and this might favor some universities more than others. Nonetheless, it is very suggestive, and it is so radically different from the THES-QS peer review and the overall ranking that it provides further evidence of the invalidity of the latter.
Cambridge and Oxford, ranked second and third by THES-QS, only manage to achieve thirtieth and twenty-first places for research impact.
Notice that, in comparison with their research impact scores, the following universities are overrated by THES-QS: Imperial College London, Ecole Normale Superieure, Ecole Polytechnique, Peking, Tsing Hua, Tokyo, Kyoto, Hong Kong, Chinese University of Hong Kong, National University of Singapore, Nanyang Technological University, Australian National University, Melbourne, Sydney, Monash, Indian Institutes of Technology, Indian Institutes of Management.
The following are underrated by THES-QS: Washington University in St Louis, Pennsylvania State University, University of Washington, Vanderbilt, Case Western Reserve, Boston, Pittsburgh, Wisconsin, Lausanne, Erasmus, Basel, Utrecht, Munich, Wageningen, Birmingham.
The number on the left is the ranking by research impact, i.e. the number of citations divided by the number of papers. The number to the right of each university is its research impact score. The number in brackets is its overall position in the THES-QS 2006 rankings. A double dash indicates that no research impact figure was given. (A short script reproducing the calculation appears after the table.)
1 Harvard 41.3 (1)
2 Washington St Louis 35.5 (48)
3 Yale 34.7 (4)
4 Stanford 34.6 (6)
5 Caltech 34 (7)
6 Johns Hopkins 33.8 (23)
7 UC San Diego 33 (44)
8 MIT 32.8 (4)
9= Pennsylvania State University 30.8 (99)
9= Princeton 30.8 (10)
11 Chicago 30.7 (11)
12= Emory 30.3 (56)
12= Washington 30.3 (84)
14 Duke 29.9 (13)
15 Columbia 29.7 (12)
16 Vanderbilt 29.4 (53)
17 Lausanne 29.2 (89)
18 University of Pennsylvania 29 (26)
19 Erasmus 28.3 (92)
20 UC Berkeley 28 (8)
21= UC Los Angeles 27.5 (31)
21= Oxford 27.5 (3)
23 Case Western Reserve 27.4 (60)
24 Boston 27.2 (66)
25 Pittsburgh 27.1 (88)
26 Basel 26.7 (75)
27= New York University 26.4 (43)
27= Texas at Austin 26.4 (32)
29 Geneva 26.2 (39)
30= Northwestern 25.8 (42)
30= Cambridge 25.8 (2)
32 Dartmouth College 25.6 (61)
33 Cornell 25.5 (15)
34 Rochester 25.1 (48)
35 Michigan 25 (29)
36 University College London 24.9 (25)
37 Brown 24.1 (54)
38 McGill 23.6 (21)
39 Edinburgh 23.4 (33)
40 Toronto 23 (27)
41 Amsterdam 21.6 (69)
42 Wisconsin 21.5 (79)
43= Utrecht 21.4 (95)
43= Ecole Normale Superieure Lyon 21.4 (72)
45 ETH Zurich 21.2 (24)
46 Heidelberg 20.8 (58)
47 British Columbia 20.6 (50)
48 Carnegie Mellon 20.5 (35)
49= Imperial College London 20.4 (9)
49= Ecole Normale Superieure Paris 20.4 (18)
51 King’s College London 20.1 (48)
52 Bristol 20 (64)
53= Trinity College Dublin 19.9 (78)
53= Copenhagen 19.9 (54)
53= Glasgow 19.9 (81)
56 Munich 19.8 (98)
57 Technical University Munich 19.4 (82)
58= Birmingham 19.1 (90)
58= Catholic University of Louvain 19.1 (76)
60 Tokyo 18.7 (19)
61 Illinois 18.6 (77)
62 Osaka 18.4 (70)
63 Wageningen 18.1 (97)
64 Kyoto 18 (29)
65 Australian National University 17.9 (16)
66 Vienna 17.9 (87)
67 Manchester 17.3 (40)
68 Catholic University of Leuven 17 (96)
69= Melbourne 16.8 (22)
69= New South Wales 16.8 (41)
71 Nottingham 16.6 (85)
72 Sydney 15.9 (35)
73= Pierre-et-Marie-Curie 15.7 (93)
73= Monash 15.7 (38)
75 Otago 15.5 (79)
76 Queensland 15.3 (45)
77 Auckland 14.8 (46)
78= EPF Lausanne 14.3 (64)
78= Macquarie 14.3 (82)
78= Leiden 14.3 (90)
81 Eindhoven University of Technology 13.4 (67)
82= Warwick 13.3 (73)
82= Delft University of Technology 13.3 (86)
84 Ecole Polytechnique 13.2 (37)
85 Hong Kong 12.6 (33)
86 Hong Kong Uni Science and Technology 12.2 (58)
87 Chinese University of Hong Kong 11.9 (50)
88 Seoul National University 10.9 (63)
89 National University of Singapore 10.4 (19)
90 National Autonomous University of Mexico 9.8 (74)
91 Peking 8 (14)
92 Lomonosov Moscow State 6 (93)
93 Nanyang Technological University 5.6 (61)
94 Tsing Hua 5.4 (28)
95 LSE 4.4 (17)
96 Indian Institutes of Technology 3 (57)
97 SOAS 2.5 (70)
98 Indian Institutes of Management 1.9 (68)
Queen Mary London -- (99)
Sciences Po -- (52)
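For anyone who wants to reproduce this kind of re-ranking, the calculation is simple: divide each institution's citation count by its paper count and sort in descending order. Below is a minimal Python sketch; the citation and paper counts are illustrative placeholders chosen to reproduce a few of the impact scores above, not the actual Thomson ISI figures behind the Guide.

```python
# Minimal sketch of the re-ranking above: research impact = citations / papers.
# The counts below are illustrative placeholders, not the actual Thomson ISI
# figures used in the Guide.
universities = {
    # name: (citations, papers, THES-QS 2006 overall rank)
    "Harvard": (413_000, 10_000, 1),
    "Cambridge": (258_000, 10_000, 2),
    "Washington St Louis": (71_000, 2_000, 48),
}

# Compute citations per paper for each institution.
impact = {name: c / p for name, (c, p, _) in universities.items()}

# Sort by research impact, highest first, and print each university with its
# new rank, impact score, and original THES-QS rank in brackets.
# (Ties, shown as "9=" and so on in the table above, are not handled here.)
for rank, (name, score) in enumerate(
    sorted(impact.items(), key=lambda kv: kv[1], reverse=True), start=1
):
    thes_rank = universities[name][2]
    print(f"{rank} {name} {score:.1f} ({thes_rank})")
```

Run on these placeholder counts, the script prints Harvard at 41.3, Washington St Louis at 35.5 and Cambridge at 25.8, matching the corresponding rows of the table.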
Thursday, March 01, 2007
THES-QS Bias Chart
LevDau has been kind enough to reproduce my post "More Problems with Method" and to add a couple of very interesting graphs. What he has done is to calculate a bias ratio: the number of THES-QS reviewers reported on the topuniversities site divided by the number of highly cited researchers listed by Thomson ISI. The higher the number, the more the THES-QS review is biased towards that country; the lower the number, the more it is biased against it. Some countries do not appear because they had nobody at all in the highly cited list.
If we chose a less rigorous definition of research expertise, such as the number of papers published rather than the number of highly cited researchers, the bias might be somewhat reduced. It would certainly not, however, be removed. In any case, if we are talking about the gold standard of ranking, then the best researchers would surely be the most qualified to judge the merits of their peers. (A short sketch of the calculation follows the chart below.)
Bias in the THES-QS peer review (Selected Countries)
Iran 25
India 23.27
Singapore 23
Pakistan 23
China 19
Mexico 17
South Korea 9
Taiwan 3.22
Australia 1.82
Hong Kong 1.79
Finland 1.53
New Zealand 1.47
France 1
UK 0.86
Israel 0.77
Germany 0.43
Japan 0.22
USA 0.14
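As a rough illustration, here is a Python sketch of the bias-ratio calculation. The per-country counts are made-up placeholders chosen so that the resulting ratios match a few of the chart entries above; they are not necessarily the actual counts from topuniversities or Thomson ISI that LevDau used.

```python
# Sketch of the bias ratio: THES-QS reviewers per country divided by
# Thomson ISI highly cited researchers per country. The counts below are
# placeholders, not the actual topuniversities or Thomson ISI figures.
reviewers = {"India": 256, "UK": 120, "USA": 140, "Vietnam": 3}
highly_cited = {"India": 11, "UK": 140, "USA": 1000}

bias = {}
for country, n_reviewers in reviewers.items():
    n_cited = highly_cited.get(country, 0)
    if n_cited == 0:
        # Countries with nobody at all in the highly cited list are
        # omitted, as in the chart above.
        continue
    bias[country] = n_reviewers / n_cited

# Ratios above 1 suggest the review over-represents a country relative to
# its pool of highly cited researchers; ratios below 1 suggest the reverse.
for country, ratio in sorted(bias.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{country} {ratio:.2f}")
```

With these placeholder counts the script prints India 23.27, UK 0.86 and USA 0.14, matching those rows of the chart, and drops Vietnam for having no highly cited researchers.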