Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Wednesday, May 09, 2007
The Princeton Review (registration required) has published a list of the best value colleges in the US.
Here is what they say about their methodology:
"We chose the schools that appear on our Top Ten Best Value Public and Private Colleges ranking lists based on institutional data we collected from more than 650 schools during the 2005-2006 academic year and our surveys of students attending them. Broadly speaking, the factors we weighed covered undergraduate academics, costs, and financial aid.
More specifically, academic factors included the quality of students the schools attracted, as measured by admissions credentials, as well as how students rated their academic experiences. Cost considerations were tuition, room and board, and required fees.
Financial aid factors included the average gift aid (grants, scholarships, or free money) awarded to students, the average percentage of financial need met for students who demonstrated need, the percentage of students with financial need whose need was fully met by the school, the percentage of graduating students who took out loans to pay for school, and the average debt of those students. We also took into consideration how satisfied students were with the financial aid packages they received."
There are a few questions that should be asked about the methodology, especially concerning the student surveys, but this approach may be more useful for undergraduate students than that of the THES-QS and Shanghai Jiao Tong rankings.
The top 10 best value private colleges for undergraduates are:
1. Rice University
2. Williams College
3. Grinnell College
4. Swarthmore College
5. Thomas Aquinas College
6. Wabash College
7. Whitman College
8. Amherst College
9. Scripps College
10. Harvard College
The top 10 best value public colleges are:
1. New College of Florida
2. Truman State University
3. University of North Carolina at Asheville
4. University of Virginia
5. University of California at Berkeley
6. University of California at San Diego
7. University of California at Santa Cruz
8. University of Minnesota, Morris
9. University of Wisconsin-Madison
10. St. Mary's College of Maryland
Thursday, May 03, 2007
Eric Beerkens at Beerkens' Blog has written some excellent posts on the internationalization of higher education.
A recent one concerns QS Quacquarelli Symonds Ltd (QS), who were responsible for collecting data for a ranking of business schools by Fortune magazine. It seems that QS committed a major blunder by leaving out the Kenan-Flagler Business School at the University of North Carolina at Chapel Hill, one of the top American business schools and one that regularly appears among the high fliers in other business school rankings. Apparently QS confused it with North Carolina State University’s College of Management. They also left out the Boston University School of Business. Beerkens refers to an article in the Economist (subscription required) and remarks:
“After reading the first line, I thought: 'again!?' Yep... Quacquarelli Symonds Ltd (QS) did it again.”
Beerkens then points out that this is not the first time that QS has produced flawed research, referring – for which many thanks – to this blog and others. He concludes:
“It's rather disappointing that reputable publications like THES and Forbes use the services of companies like QS. QS clearly doesn't have any clue about the global academic market and has no understanding of the impact that their rankings are having throughout the world. There has been a lot of critique about the indicators that they use, but at least we can see these indicators. It are the mistakes and the biases that are behind the indicators that make it unacceptable!”
There was a vigorous response from the University of North Carolina. They pointed out that QS had admitted to not contacting the university about the rankings, using outdated information and getting the University of North Carolina mixed up with North Carolina State University. QS did not employ any proper procedures for verification and validation, apparently failed to check against other rankings, gave wrong or outdated information about salaries and presented data from 2004 or 2005 while claiming that they referred to 2006.
Fortune has done the appropriate and honest, although probably expensive, thing and removed the rankings from its website.
What is remarkable about this is the contrast between Fortune and the THES. All of the errors committed by QS with regard to the Fortune rankings are paralleled in the World University Rankings. QS has, for example, grossly inflated the scores of the Ecole Normale Superieure in Paris in 2004 and the Ecole Polytechnique in 2005 by counting part-time faculty as full-time, and done the same for Duke University -- QS does seem to have bad luck in North Carolina, doesn’t it? -- in 2005 by counting undergraduate students as faculty and in 2006 by counting faculty twice. They have also used a database from a Singapore-based academic publishing company that specializes in Asia-Pacific publications to produce a survey supposed to represent world academic opinion, conducted that survey with an apparent response rate of less than one per cent, and got the names of universities wrong -- Beijing University and the Official University of California among others.
It is probably unrealistic for THES to remove the rankings from its website. Still, they could at the very least start looking around for another consultant.
This is a draft of a review that may appear shortly in an academic journal.
Guide to the World’s Top Universities, John O’Leary, Nunzio Quacquarelli and Martin Ince. QS Quacquarelli Symonds Ltd.: London. 2006.
The THES (Times Higher Education Supplement)-QS World University Rankings have aroused massive interest throughout the world of higher education, nowhere more so than in East and Southeast Asia. Very few university teachers and administrators in the region can be unaware of the apparent dramatic collapse of quality at Universiti Malaya, which was in fact nothing of the sort. That the collapse was nothing more than an error by THES’s consultants, and that the error was only belatedly corrected, has done little to diminish public fascination.
Now, QS Quacquarelli Symonds, the consultants who compiled the data for the rankings, have published a large 512-page volume. The book, written by John O’Leary and Martin Ince of THES and Nunzio Quacquarelli of QS, comes with impressive endorsements. It is published in association with IELTS, TOEFL and ETS, names that quite a few Asian students and teachers will know, and is distributed by Blackwell Publishing of Oxford. At the top of the front cover, there is a quotation from Tim Rogers, former Head of Student Recruitment and Admissions, London School of Economics: “A must-have book for anyone seeking a quality university education at home and abroad.” Tim Rogers, by the way, has been a consultant for QS.
The Guide to the World’s Top Universities certainly contains a large amount of material. There are thirteen chapters as follows.
1. Welcome to the world’s first top university guide
2. Ranking the world’s universities
3. How to choose a university and course
4. The benefits of studying abroad
5. What career? Benefits of a top degree
6. Tips for applying to university
7. What parents need to know -- guide to study costs and more
8. Financing and scholarships
9. The world’s top 200 universities. This is the ranking that was published last year in the THES.
10. The world’s top universities by subject. This was also published in the THES.
11. The top 100 university profiles. This provides two pages of information about each university.
12. The top ten countries
13. Directory of over 500 top world universities.
Basically, there are two parts. The earlier chapters mostly consist of advice that is generally interesting, well written and sensible. Later, we have data about various characteristics of the universities, often ranking them in order. The latter comprise much of the book. The profiles of the top 100 universities take up 200 pages and the directory of 500 plus universities another 140.
So, is this a must-have book? At £19.99, $35.95 or €28.50 the answer has to be not really. Maybe it would be a good idea to glance through the earlier advisory chapters but as a source of information and evaluation it is not worth the money. First of all, there are serious problems with the information presented in the rankings, the profiles and the directory. The book’s credibility is undermined by a succession of errors, indicating an unacceptable degree of carelessness. At 35 dollars or 20 pounds we surely have the right to expect something a little better, especially from the producers of what is supposed to be “the gold standard” of university rankings.
Thus we find that the Technical University of Munich appears twice in the profiles in positions 82 (page 283) and 98 (page 313). The latter should be the University of Munich. In the directory the University of Munich is provided with an address in Dortmund (page 407). The Technical University of Helsinki is listed twice in the directory (pages 388 and 389). A number of Swiss universities are located in Sweden (pages 462 and 463). The authors cannot decide whether there is only one Indian Institute of Technology and one Indian Institute of Management (page 416) or several (pages 231 and 253). New Zealand is spelt ‘New Zeland’ (page 441). The profile for Harvard repeats the same information in the factfile under two different headings (page 119). There is something called the ‘Official University of California, Riverside’ on page 483. Kyungpook National University in Korea has a student faculty ratio of zero (page 452). Something that is particularly irritating is that the authors or their assistants still cannot get the names of Malaysian universities right. So we find ‘University Putra Malaysia’ on page 435 and ‘University Sains Malaysia’ on page 436. After that famous blunder about Universiti Malaya’s international students and faculty one would expect the authors to be a bit more careful.
Still, we must give some credit. At least the book has at last started to use the right name for China’s best or second best university – Peking University, not Beijing University -- and ‘University of Kebangsaan Malaysia’ in the 2006 rankings in the THES has now been corrected to ‘Universiti Kebangsaan Malaysia’.
The Guide really gets confusing, to put it mildly, when it comes to the number of students and faculty. A perceptive observer will note that the data for student-faculty ratio in the top 200 rankings reproduced in chapter 9 are completely different from those in the profiles in chapter 11 and the directory in chapter 13.
For example, in the rankings Duke University, in North Carolina, is given a score of 100, indicating the best student faculty ratio. Going to QS’s topuniversities website we find that Duke supposedly has 11,106 students and 3,192 faculty, representing a ratio of 3.48 students per faculty. But then we turn to the profile and see that Duke is assigned a ratio of 16.7 students per faculty (page 143). On the same page we are told that Duke has 6,301 undergraduates and 4,805 postgraduates and “just under 1,600 faculty”. That makes a ratio of about 6.94. So, Duke has 3.48 or 6.94 or 16.7 students per faculty. Not very helpful.
Looking at Yale University, the book tells us on the same page (127) that the student faculty ratio is 34.3 and that the university has “around 10,000 students” and 3,333 faculty, a ratio of 3 students for each faculty member.
On page 209 we are told that the University of Auckland has a student-faculty ratio of 13.5 and in the adjacent column that it has 2,000 academic staff and 41,209 students, a ratio of 20.6. Meanwhile, the top 200 rankings give it a faculty-student score of 38, which works out at a ratio of 9.2. So, take your pick from 9.2, 13.5 and 20.6.
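The arithmetic behind these contradictions is easy to check. A short Python sketch, using only the figures quoted above:

```python
# Student-faculty ratios implied by the figures quoted above, taken from
# the Guide's profiles and QS's topuniversities website.

def ratio(students, faculty):
    """Students per faculty member, to two decimal places."""
    return round(students / faculty, 2)

# Duke: the website's figures, and the profile's "just under 1,600 faculty"
duke_website = ratio(11_106, 3_192)   # 3.48 -- yet the profile prints 16.7
duke_profile = ratio(11_106, 1_600)   # 6.94 -- a third, different answer

# Yale: the profile's own numbers imply 3.0, but it prints 34.3
yale = ratio(10_000, 3_333)

# Auckland: 41,209 students and 2,000 staff imply 20.6; the page prints 13.5
auckland = ratio(41_209, 2_000)

print(duke_website, duke_profile, yale, auckland)  # 3.48 6.94 3.0 20.6
```

Whichever figure is right, they cannot all be.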
The data for research expertise is also contradictory. Universities in Australia and China get excellent scores for the “peer review” of best research in the rankings of the top 200 universities in chapter 9 but get relatively poor scores for research impact. The less glamorous American universities like Boston and Pittsburgh get comparatively low scores for peer review of research but actually do excellent research.
Errors and contradictions like these seriously diminish the book’s value as a source of information.
It would not be a good idea to buy this book although it might be worth looking at the early chapters if you can borrow it from a library. To judge the overall global status of a university, the best bet would be to look east and turn to the Shanghai Jiao Tong University Index, available on the Internet, which ranks the top 500 universities. This index focuses entirely on research but there is usually at least a modest relationship between research activity and other variables such as the quality of the undergraduate student intake and teaching performance. Those thinking about going to the US should look at the US News and World Report’s America’s Best Colleges. Anyone concerned about costs -- who isn’t? -- should look at Kiplinger’s Index, which calculates the value for money of American universities. Incidentally, the fifth place here goes to the State University of New York at Binghamton, which is not even mentioned in the Guide. The Times (which is not the same as the Times Higher Education Supplement) and Guardian rankings are good for British universities.
Students who are not certain about going abroad or who are thinking about going to a less well known local institution could try doing a Google Scholar search for evidence of research proficiency and a Yahoo search for miscellaneous activity. Whatever you do, it is not a good idea to rely on any one source alone and certainly not this one.
Friday, April 27, 2007
Geoffrey Alderman, currently a visiting research fellow at the Institute of Historical Research, University of London, has an article in the Guardian about the decline of standards in British universities. He refers to a case at Bournemouth University where officials overrode a decision by a professor and examination board to fail thirteen students. Apparently, the officials thought it unreasonable that students were required to do any reading to pass the course. He also comments on the remarkable increase in the number of first-class degrees at the University of Liverpool. Professor Alderman is clear that part of the problem is with the current obsession with rankings:
"Part of the answer lies in the league-table culture that now permeates the sector. The more firsts and upper seconds a university awards, the higher its ranking is likely to be. So each university looks closely at the grading criteria used by its league-table near rivals, and if they are found to be using more lenient grading schemes, the argument is put about that "peer" institutions must do the same. The upholding of academic standards is thus replaced by a grotesque "bidding" game, in which standards are inevitably sacrificed on the alter of public image - as reflected in newspaper rankings."
Similarly, it seems that in the US large numbers of students are being pushed through universities for no other reason than to improve graduation rates and therefore scores on the US News and World Report rankings.
Tuesday, April 24, 2007
QS Quacquarelli Symonds, the consultants responsible for the THES-QS World University Rankings, have now placed data for 540 universities, complete with scores for the various components, on their topuniversities website (registration required). This reveals more dramatically than before the disparity between scores by some universities on the “peer review” and scores for citations per faculty, a measure of research quality. Below are the top 20 universities in the world according to the THES-QS “peer review” by research-active academics who were asked to select the universities that are best for research. In curved brackets to the right is the position of the universities in the 2006 rankings according to the number of citations per faculty.
Notice that some universities, including Sydney, Melbourne, Australian National University and the National University of Singapore perform dramatically better on the peer review than on citations per faculty. Melbourne, rated the tenth best university in the world for research by the THES-QS peer reviewers, is 189th for citations per faculty while the National University of Singapore, in twelfth place for the peer review, comes in at 170th for citations per faculty. The most devastating disparity is for Peking University, 11th on the “peer review” and 352nd for citations, behind, among others, Catania, Brunel, Sao Paulo, Strathclyde and Jyväskylä. Once again, this raises the question of how universities whose research is regarded so lightly by other researchers could be voted among the best for research. Oxford, Cambridge and Imperial College London are substantially overrated by the peer review. Kyoto is somewhat overrated while the American universities, with the exception of Chicago, have roughly the same place that would be indicated by the citations per faculty position.
Of course, part of the problem could be with the citations per faculty. I am fairly confident that the data for citations, which is collected by Evidence Ltd, is accurate but less certain about the number of faculty. I have noted already that if a university increases its score for student-faculty ratio by increasing the number of faculty it would suffer a drop in the citations per faculty score. For most universities the trade-off would be worth it since the gap between the very good and the good is much greater for citations than for student-faculty ratio. So, if there has been some inflating of the number of faculty, however and by whom it was done, then this would have an adverse impact on the figures for citations per faculty.
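The trade-off described here can be made concrete with a toy example; the numbers below are invented purely for illustration, not taken from any ranking:

```python
# Toy illustration (all numbers hypothetical): inflating the reported
# faculty count improves the student-faculty ratio but depresses
# citations per faculty by exactly the same factor.

students = 20_000
citations = 50_000

honest_faculty = 1_000
inflated_faculty = 2_000

# Fewer students per faculty member looks better in the rankings...
assert students / inflated_faculty < students / honest_faculty
# ...but citations per faculty falls correspondingly.
assert citations / inflated_faculty < citations / honest_faculty

print(students / inflated_faculty, citations / inflated_faculty)  # 10.0 25.0
```

Because the rankings reward the first ratio far less than they punish the second at the top end, the net effect of inflated faculty numbers shows up most clearly in the citations column.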
I have therefore included the positions of these universities according to their score for articles in the Science Citation Index-expanded and Social Science Citation Index in 2005 in the Shanghai Jiao Tong rankings. This is not the same as the THES measure. It covers one year only and is based on the number of papers, not number of citations. It therefore measures overall research output of a certain minimum quality rather than the impact of that research on other researchers. The position according to this index is indicated in square brackets.
We can see that Cambridge and Oxford do not do as badly as they did on citations per faculty. Perhaps they produced research characterised by quantity more than quality or perhaps the difference is a result of inflated faculty numbers. Similarly the performance of Peking, National University of Singapore, Melbourne and Sydney is not as mediocre on this measure as it is on THES’s citations per faculty.
Nonetheless the disparity still persists. Oxford, Cambridge, Imperial College and universities in Asia and Australia are still overrated by the THES-QS review.
1. Cambridge (46) [15]
2. Oxford (63) [17]
3. Harvard (2) [1]
4. Berkeley (7) [9]
5. Stanford (3) [10]
6. MIT (4) [29]
7. Yale (20) [27]
8. Australian National University (83) [125]
9. Tokyo (15) [2]
10. Melbourne (189) [52]
11. Peking (352) [50]
12. National University of Singapore (170) [111]
13. Princeton (10) [96]
14. Imperial College London (95) [23]
15. Sydney (171) [46]
16. Toronto (18) [3]
17. Kyoto (42) [8]
18. Cornell (16)
19. UCLA (19) [21]
20. Chicago (47) [55]
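One crude way to quantify these disparities is to subtract each university's peer review rank from its citations-per-faculty rank; a large positive gap suggests the peer review flatters the institution. A sketch using the rank pairs in round brackets above:

```python
# (peer review rank, citations-per-faculty rank) from the 2006 list above.
data = {
    "Cambridge": (1, 46), "Oxford": (2, 63), "Harvard": (3, 2),
    "Berkeley": (4, 7), "Stanford": (5, 3), "MIT": (6, 4),
    "Yale": (7, 20), "Australian National University": (8, 83),
    "Tokyo": (9, 15), "Melbourne": (10, 189), "Peking": (11, 352),
    "National University of Singapore": (12, 170), "Princeton": (13, 10),
    "Imperial College London": (14, 95), "Sydney": (15, 171),
    "Toronto": (16, 18), "Kyoto": (17, 42), "Cornell": (18, 16),
    "UCLA": (19, 19), "Chicago": (20, 47),
}

# Sort by the gap between the two ranks, largest first.
gaps = sorted(data, key=lambda u: data[u][1] - data[u][0], reverse=True)
for u in gaps[:3]:
    peer, cites = data[u]
    print(f"{u}: peer review {peer}, citations {cites}, gap {cites - peer}")
```

Peking tops the list with a gap of 341 places, followed by Melbourne and the National University of Singapore, which is exactly the pattern described above.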
Monday, April 23, 2007
A new book on university rankings, The World-Class University and Ranking: Aiming Beyond Status, has appeared and has been reviewed by Martin Ince in the Times Higher Education Supplement (THES). It is edited by Liu Nian Cai of Shanghai Jiao Tong University and Jan Sadlak. You can read the review here (subscription required). I hope eventually to review the book myself.
I must admit to being rather amused by one comment by Ince. He says:
"Although one of its editors is the director of the Shanghai rankings, The World-Class University and Ranking largely reflects university concerns at being ranked. Many contributors regard ranking as an unwelcome new pressure on academics and institutions. Much is made of the "Berlin principles" for ranking, a verbose and pompous 16-point compilation that includes such tips as "choose indicators according to their relevance and validity". The Shanghai rankings themselves fall at the third principle, the need to recognise diversity, because they rank the world's universities almost exclusively on science research. But the principles are silent on the most important point they should have contained - the need for rankings to be independent and not be produced by universities or education ministries."
I would not argue about the desirability of rankings being independent of university or government bureaucracies but there is far greater danger in rankings that are dominated by the commercial interests of newspapers.
Friday, April 20, 2007
The Guardian has announced that the Higher Education Funding Council for England (Hefce) will investigate university league tables. Its chief executive, David Eastwood, said that the council will examine the rankings produced by the Guardian, the Times and the Sunday Times and whether university policies are influenced by attempts to improve their scores.
The report continues:
'World tables compiled by Shanghai Jiao Tong University and the Times Higher Education Supplement will also be surveyed. The University of Manchester, for example, has made it clear that its strategy is to climb the international rankings, which include factors like the number of Nobel prizewinners. The university has pledged to recruit five Nobel laureates in the next few years.
Prof Eastwood said league tables were now part of the higher education landscape "as one of a number of sources of information available to potential students".
He added: "Hefce has an interest in the availability of high quality useful information for students and the sector's other stakeholders. The league table methodologies are already the subject of debate and academic comment. We plan to commission up-to-date research to explore higher education league table and ranking methodologies and the underlying data, with the intention of stimulating informed discussion.'
Thursday, April 19, 2007
There is a very interesting piece by Ahmad Ismail at highnetworth -- acknowledgement to Education in Malaysia -- that argues that the economic value of an overseas university education for a Malaysian student is minimal. There are, no doubt, going to be questions about the assumptions behind the financial calculations and there are of course other reasons for studying abroad.
Even so, if students themselves typically gain little or nothing economically from studying in another country and if their parents suffer a great deal and if students or taxpayers in the host country in one way or another have to pick up the tab (at Cornell it takes USD1,700 to recruit an international student) then one wonders what internationalisation has to do with university quality. And one wonders why THES and QS consider it so important.
A letter to the Cornell Daily Sun from Mao Ye, a student-elected trustee, suggests increasing the recruitment of international students in order to boost the university's position in the US News and World Report rankings.
The question arises of whether the international students would add anything to the quality of an institution. If they do, then surely they would be recruited anyway. But if students are admitted for no other reason than to boost scores in the rankings, they may well contribute to a decline in the overall quality of the students.
"Two critical ways to improve Cornell’s ranking are to increase the number of applications and to increase the yield rate of admitted students. To achieve this goal, no one can overlook the fact that international applications to all U.S. institutions have recently increased at a very fast pace. For Cornell, the applications from China increased by 42.9 percent in 2005 and 47.5 percent in 2006. We also saw a 40 percent increase in applications from India last year. By my estimation, if international applications continue to grow at the current rate, in 10 years there will be more than 10,000 foreign applications received by the Cornell admissions office. Therefore, good performance in the international market will have a significant positive impact on our ranking in U.S. News and World Report.
How might we get more international students to apply? It’s actually very easy. We can have different versions of application materials, each in various students’ native languages, highlighting Cornell’s achievements in that country and addressing the specific concerns of students from that country. I checked the price and realized we do not need more than $500 to translate the whole application package into Chinese. If we focus translation on the crucial information for Chinese applicants, the cost is as low as $50. Comparatively, this is lower than the cost of recruiting one undergraduate student to a university, which costs an average of $1,700 per student, based on the calculations of Prof. Ronald Ehrenburg, industrial and labor relations. Staff, students, parents and Cornell as a whole will all benefit greatly from this plan."
Monday, March 05, 2007
The THES-QS World Universities Rankings, and its bulky offspring, Guide to the World’s Top Universities (London: QS Quacquarelli Symonds), are strange documents, full of obvious errors and repeated contradictions. Thus, we find that the Guide has data about student-faculty ratios that are completely different from those used in the top 200 rankings published in the THES, while talking about how robust such a measure is. Also, if we look at the Guide we notice that for each of the top 100 universities it provides a figure for research impact, that is, the number of citations divided by the number of papers. In other words it indicates how interesting other researchers found the research of each institution. These figures completely undermine the credibility of the “peer review” as a measure of research expertise.
The table below is a re-ranking of the THES top 100 universities for 2006 by research impact and therefore by overall quality of research. This is not by any means a perfect measure. For a start, the natural sciences and medicine do a lot more citing than other disciplines and this might favor some universities more than others. Nonetheless it is very suggestive, and it is so radically different from the THES-QS peer review and the overall ranking that it provides further evidence of the invalidity of the latter.
Cambridge and Oxford, ranked second and third by THES-QS, only manage to achieve thirtieth and twenty-first places for research impact.
Notice that in comparison to their research impact scores the following universities are overrated by THES-QS: Imperial College London, Ecole Normale Superieure, Ecole Polytechnique, Peking, Tsing Hua, Tokyo, Kyoto, Hong Kong, Chinese University of Hong Kong, National University of Singapore, Nanyang Technological University, Australian National University, Melbourne, Sydney, Monash, Indian Institutes of Technology, Indian Institutes of Management.
The following are underrated by THES-QS: Washington University in St Louis, Pennsylvania State University, University of Washington, Vanderbilt, Case Western Reserve, Boston, Pittsburgh, Wisconsin, Lausanne, Erasmus, Basel, Utrecht, Munich, Wageningen, Birmingham.
The number on the left is the ranking by research impact, i.e. the number of citations divided by the number of papers. The number to the right of the universities is the research impact. The number in brackets is the overall ranking in the THES-QS 2006 rankings.
1 Harvard 41.3 (1)
2 Washington St Louis 35.5 (48)
3 Yale 34.7 (4)
4 Stanford 34.6 (6)
5 Caltech 34 (7)
6 Johns Hopkins 33.8 (23)
7 UC San Diego 33 (44)
8 MIT 32.8 (4)
9= Pennsylvania State University 30.8 (99)
9= Princeton 30.8 (10)
11 Chicago 30.7 (11)
12= Emory 30.3 (56)
12= Washington 30.3 (84)
14 Duke 29.9 (13)
15 Columbia 29.7 (12)
16 Vanderbilt 29.4 (53)
17 Lausanne 29.2 (89)
18 University of Pennsylvania 29 (26)
19 Erasmus 28.3 (92)
20 UC Berkeley 28 (8)
21= UC Los Angeles 27.5 (31)
21= Oxford 27.5 (3)
23 Case Western Reserve 27.4 (60)
24 Boston 27.2 (66)
25 Pittsburgh 27.1 (88)
26 Basel 26.7 (75)
27= New York University 26.4 (43)
27= Texas at Austin 26.4 (32)
29 Geneva 26.2 (39)
30= Northwestern 25.8 (42)
30= Cambridge 25.8 (2)
32 Dartmouth College 25.6 (61)
33 Cornell 25.5 (15)
34 Rochester 25.1 (48)
35 Michigan 25 (29)
36 University College London 24.9 (25)
37 Brown 24.1 (54)
38 McGill 23.6 (21)
39 Edinburgh 23.4 (33)
40 Toronto 23 (27)
41 Amsterdam 21.6 (69)
42 Wisconsin 21.5 (79)
43= Utrecht 21.4 (95)
43= Ecole Normale Superieure Lyon 21.4 (72)
45 ETH Zurich 21.2 (24)
46 Heidelberg 20.8 (58)
47 British Columbia 20.6 (50)
48 Carnegie Mellon 20.5 (35)
49= Imperial College London 20.4 (9)
49= Ecole Normale Superieure Paris 20.4 (18)
51 King’s College London 20.1 (48)
52 Bristol 20 (64)
53= Trinity College Dublin 19.9 (78)
53= Copenhagen 19.9 (54)
53= Glasgow 19.9 (81)
56 Munich 19.8 (98)
57 Technical University Munich 19.4 (82)
58= Birmingham 19.1 (90)
58= Catholic University of Louvain 19.1 (76)
60 Tokyo 18.7 (19)
61 Illinois 18.6 (77)
62 Osaka 18.4 (70)
63 Wageningen 18.1 (97)
64 Kyoto 18 (29)
65 Australian National University 17.9 (16)
66 Vienna 17.9 (87)
67 Manchester 17.3 (40)
68 Catholic University of Leuven 17 (96)
69= Melbourne 16.8 (22)
69= New South Wales 16.8 (41)
71 Nottingham 16.6 (85)
72 Sydney 15.9 (35)
73= Pierre-et-Marie-Curie 15.7 (93)
73= Monash 15.7 (38)
75 Otago 15.5 (79)
76 Queensland 15.3 (45)
77 Auckland 14.8 (46)
78= EPF Lausanne 14.3 (64)
78= Macquarie 14.3 (82)
78= Leiden 14.3 (90)
81 Eindhoven University of Technology 13.4 (67)
82= Warwick 13.3 (73)
82= Delft University of Technology 13.3 (86)
84 Ecole Polytechnique 13.2 (37)
85 Hong Kong 12.6 (33)
86 Hong Kong University of Science and Technology 12.2 (58)
87 Chinese University of Hong Kong 11.9 (50)
88 Seoul National University 10.9 (63)
89 National University of Singapore 10.4 (19)
90 National Autonomous University of Mexico 9.8 (74)
91 Peking 8 (14)
92 Lomonosov Moscow State 6 (93)
93 Nanyang Technological University 5.6 (61)
94 Tsing Hua 5.4 (28)
95 LSE 4.4 (17)
96 Indian Institutes of Technology 3 (57)
97 SOAS 2.5 (70)
98 Indian Institutes of Management 1.9 (68)
Queen Mary London -- (99)
Sciences Po -- (52)
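The re-ranking behind the table is mechanical: sort by the research impact figure (citations divided by papers) rather than by the overall THES-QS score. A sketch using a handful of rows from the table above:

```python
# Research impact (citations per paper) for a sample of universities,
# with figures taken from the table above; the comment notes each
# university's overall 2006 THES-QS rank for comparison.
impact = {
    "Harvard": 41.3,              # THES-QS overall rank 1
    "Washington St Louis": 35.5,  # THES-QS overall rank 48
    "Oxford": 27.5,               # THES-QS overall rank 3
    "Cambridge": 25.8,            # THES-QS overall rank 2
    "Peking": 8.0,                # THES-QS overall rank 14
    "LSE": 4.4,                   # THES-QS overall rank 17
}

# Sort by impact, highest first.
reranked = sorted(impact, key=impact.get, reverse=True)
print(reranked)
```

Harvard stays on top, but Washington University in St Louis, 48th overall in THES-QS, leapfrogs both Oxford and Cambridge, while Peking and LSE fall to the bottom of the sample.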
Thursday, March 01, 2007
LevDau has been kind enough to reproduce my post "More Problems with Method" and to add a couple of very interesting graphs. What he has done is to calculate a bias ratio, which is the number of THES-QS reviewers reported on the topuniversities site divided by the number of highly cited researchers listed by Thomson ISI. The higher the number, the more biased the THES-QS review is towards that country; the lower the number, the more biased against. Some countries do not appear because they had nobody at all on the highly cited list.
If we choose a less rigorous definition of research expertise, such as the number of papers published rather than the number of highly cited researchers, then the bias might be somewhat reduced. It would certainly not, however, be removed. In any case, if we are talking about the gold standard of ranking then the best researchers would surely be most qualified to judge the merits of their peers.
Bias in the THES-QS peer review (Selected Countries)
Iran 25
India 23.27
Singapore 23
Pakistan 23
China 19
Mexico 17
South Korea 9
Taiwan 3.22
Australia 1.82
Hong Kong 1.79
Finland 1.53
New Zealand 1.47
France 1
UK 0.86
Israel 0.77
Germany 0.43
Japan 0.22
USA 0.14
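The calculation behind the table above is easy to reproduce. A minimal sketch in Python, using the respondent counts and Thomson ISI highly-cited counts quoted elsewhere on this blog (only a handful of countries shown; the figures are those I have reported, not an official dataset):

```python
# Bias ratio = THES-QS survey respondents per Thomson ISI highly cited researcher.
respondents = {"India": 256, "Taiwan": 29, "Australia": 191,
               "UK": 378, "Japan": 53, "USA": 532}
highly_cited = {"India": 11, "Taiwan": 9, "Australia": 105,
                "UK": 439, "Japan": 246, "USA": 3825}

for country in respondents:
    ratio = respondents[country] / highly_cited[country]
    print(f"{country}: {ratio:.2f}")
# India comes out at 23.27 and the USA at 0.14, matching the table above.
```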
Monday, February 26, 2007
The THES-QS rankings of 2005 and 2006 are heavily weighted towards its so-called peer review, which receives 40% of the total ranking score. No other section gets more than 20 %. The “peer review” is supposed to be a survey of research active academics from around the world. One would therefore expect it to be based on a representative sample of the international research community, or “global opinion”, as THES claimed in 2005. It is, however, nothing of the sort.
The review was based on e-mails sent to people included on a database purchased from World Scientific Publishing Company. This is a publishing company that was founded in 1981. It now has 200 employees at its main office in Singapore. There are also subsidiary offices in New Jersey, London, Hong Kong, Taipei, Chennai, Beijing and Singapore. It claims to be the leading publisher of scientific journals and books in the Asia-Pacific region.
World Scientific has several subsidiaries. These include Imperial College (London) Press, which publishes books and journals on engineering, medicine, information technology, environmental technology and management, Pan Stanford Publishing of Singapore, which publishes in such fields as nanoelectronics, spintronics biomedical engineering and genetics, and KH Biotech Services Singapore who specialise in biotechnology, pharmaceuticals, food and agriculture; consultancy, training and conference organisation services. It also distributes books and journals produced for The National Academies Press (based in Washington, D.C.) in most countries in Asia (but not in Japan).
World Scientific has particularly close links with China, especially with Peking University. Their newsletter of November 2005 reports that:
”The last few years has seen the rapid growth of China's economy and academic sectors. Over the years, World Scientific has been actively establishing close links and putting down roots in rapidly growing China”
Another report describes a visit from Chinese university publishers:
”In August 2005, World Scientific Chairman Professor K. K. Phua, was proud to receive a delegation from the China University Press Association. Headed by President of Tsinghua University Press Professor Li Jiaqiang, the delegation comprised presidents from 18 top Chinese university publishing arms. The parties exchanged opinions on current trends and developments in the scientific publishing industry in China as well as Singapore. Professor Phua shared many of his experiences and expressed his interest in furthering collaboration with Chinese university presses. “
World Scientific has also established very close links with Peking University:
”World Scientific and Peking University's School of Mathematical Sciences have, for many years, enjoyed a close relationship in teaching, research and academic publishing. To further improve the close cooperation, a "World Scientific - Peking University Work Room" has been set up in the university to serve the academic communities around the world, and to provide a publishing platform to enhance global academic exchange and cooperation. World Scientific has also set up a biannual "World Scientific Scholarship" in the Peking School of Mathematical Sciences. The scholarship, totaling RMB 30,000 per annum and administered by the university, aims to reward and encourage students and academics with outstanding research contributions.”
Here are some of the titles published by the company:
China Particuology
Chinese Journal of Polymer Science
Asia Pacific Journal of Operational Research
Singapore Economic Review
China: an International Journal
Review of Pacific Basin Financial Markets and Policies
Asian Case Research Journal
It should be clear by now that World Scientific is active mainly in the Asia-Pacific region, with an outpost in London. It seems more than likely that its database, which might be the list of subscribers or its mailing list, would be heavily biased towards the Asia-Pacific region. This goes a long way towards explaining why Chinese, Southeast Asian and Australasian universities do so dramatically better on the peer review than they do on the citations count or any other measure of quality.
I find it inconceivable that QS were unaware of the nature of World Scientific when they purchased the database and sent out the e-mails. To claim that the peer review is in any sense an international survey is absurd. QS have produced what may some day become a classical example of how bad sampling technique can destroy the validity of any survey.
Monday, February 19, 2007
Professor Simon Marginson of the University of Melbourne has made some very appropriate comments to The Age about the THES - QS rankings and Australian universities.
Professor Marginson told The Age that a lack of transparency in the rankings method means that universities could be damaged through no fault of their own.
'"Up to now, we in Australian universities have done better out of the Times rankings than our performance on other indicators would suggest," he said. "But it could all turn around and start working against us, too."
The Times rankings are volatile because surveys of employers and academics are open to manipulation, subjectivity and reward marketing over research, Professor Marginson said.'
The admitted extraordinarily low response rate to the THES-QS "peer review", combined with the over-representation of Australian "research-active academics" among the respondents, is sufficient to confirm Professor Marginson's remarks about the rankings.
Friday, February 16, 2007
Another problem with the peer review section of the THES-QS World University Rankings is that it is extremely biased against certain countries and biased in favour of certain others. Here is an incomplete list of countries where respondents to the peer review survey are located and the number of respondents.
USA 532
UK 378
India 256
Australia 191
Canada 153
Malaysia 112
Germany 103
Indonesia 93
Singapore 92
China 76
Japan 53
France 56
Mexico 51
Thailand 37
Israel 36
Iran 31
Taiwan 29
South Korea 27
Hong Kong 25
New Zealand 25
Pakistan 23
Finland 23
Nigeria 20
How far does the above list reflect the distribution of research expertise throughout the world? Here is a list of the same countries with the number of academics listed in Thomson ISI Highly Cited Researchers.
USA 3,825
UK 439
India 11
Australia 105
Canada 172
Malaysia 0
Germany 241
Indonesia 0
Singapore 4
China (excluding Hong Kong) 4
Japan 246
France 56
Mexico 3
Thailand 0
Israel 47
Iran 1
Taiwan 9
South Korea 3
Hong Kong 14
New Zealand 17
Pakistan 1
Finland 15
Nigeria 0
The number of highly cited scholars is not a perfect measure of research activity -- for one thing, some disciplines cite more than others -- but it does give us a broad picture of the research expertise of different countries.
The peer review is outrageously biased against the United States, extremely biased against Japan, and very biased against Canada, Israel, and European countries such as France, Germany, Switzerland and the Netherlands.
On the other hand, there is a strong bias towards China (less so Taiwan and Hong Kong), India, Southeast Asia and Australia.
Now we know why Cambridge does so much better in the peer review than Harvard despite an inferior research record, why Peking University is apparently among the best in the world, why there are so many Australian universities in the top 200, and why the world's academics supposedly cite Japanese researchers copiously but cannot bring themselves to vote for them in the peer review.
Thursday, February 15, 2007
QS Quacquarelli Symonds have published additional information on their web site concerning the selection of the initial list of universities and the administration of the "peer review". I would like to focus on just one issue for the moment, namely the response rate to the e-mail survey. Ben Sowter of QS had already claimed to have surveyed more than 190,000 academics to produce the review. He had said:
"Peer Review: Over 190,000 academics were emailed a request to complete our online survey this year. Over 1600 responded - contributing to our response universe of 3,703 unique responses in the last three years. Previous respondents are given the opportunity to update their response." (THES-QS World University Rankings _ Methodology)
This is a response rate of about 0.8%, less than 1%. I had assumed that the figure of 190,000 was a typographical error and that it should have been 1,900. A response rate of 80% would have been on the high side, but perhaps respondents were highly motivated by being included in the ranks of "smart people" or winning a BlackBerry organiser.
However, the new information provided appears to suggest that QS did survey such a large number.
"So, each year, phase one of the peer review exercise is to invite all previous reviewers to return and update their opinion. Then we purchase two databases, one of 180,000 international academics from the World Scientific (based in Singapore) and another of around 12,000 from Mardev - focused mainly on Arts & humanities which is poorly represented in the former.
We examine the responses carefully and discard any test responses and bad responses and look for any non-academic responses that may have crept in. " (Methodology-- The Peer Review)
There is a gap between "we purchase" and "we examine the responses", but the implication is that about 192,000 academics were sent e-mails.
If this is the case then we have an extraordinarily low response rate, probably a record in the history of survey research. Kim Sheehan, in an article in the Journal of Computer-Mediated Communication, reports that 31 studies of e-mail surveys show a mean response rate of about 37%. Response rates have been declining in recent years, but even in 2004 the mean response rate was about 24%.
Either QS did not send out so many e-mails, or there was something wrong with the database, or something else went wrong. Whatever the case, such a low response rate is in itself enough to render a survey invalid. An explanation is needed.
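The arithmetic is simple enough to lay out. A quick check, taking the figures at face value from QS's own statements quoted above:

```python
# Response rate implied by QS's own figures.
emails_sent = 190_000   # "Over 190,000 academics were emailed"
responses = 1_600       # "Over 1600 responded"

rate = responses / emails_sent * 100
print(f"Response rate: {rate:.2f}%")        # about 0.84%, i.e. under 1%

# Sheehan's benchmarks for e-mail surveys, for comparison (per cent):
typical_rate = 37   # mean across 31 studies
recent_rate = 24    # circa 2004
print(f"Shortfall vs the typical rate: about {typical_rate / rate:.0f}x")
```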
Wednesday, February 14, 2007
The Technical University of Munich has pulled off a major feat. It has been awarded not one but two places among the world's top 100 universities in the THES-QS book, Guide to the World's Top Universities. The Guide has also managed to move a major university several hundred miles.
In 2006 the THES -- QS world university rankings placed the Technical University of Munich in 82nd place and the University of Munich at 98th.
The new THES-QS Guide has profiles of the top 100 universities. On page 283 and in 82nd place we find the Technical University Munich. Its address is given as "Germany". How very helpful. The description is clearly that of the Technical University and so is the data in the factfile.
On page 313 the Technical University Munich appears again, now in 98th place. The description is identical to that on page 283, but the information in the factfile is different and appears to refer to the (Ludwig-Maximilians) University of Munich. The university is given an address in Dortmund, in a completely different state, and the web site appears to be that of the University of Munich.
Turning to the directory, we find that "Universitat Munchen" is listed, again with an address in Dortmund, and the Technische Universität München is on page 409, without an address. This time the data for the two universities appears to be correct.
Sunday, February 11, 2007
A Robust Measure
There is something very wrong with the THES-QS Guide to the World’s Top Universities, recently published in association with Blackwell’s of London. I am referring to the book’s presentation of two completely different sets of data for student faculty ratio.
In the Guide, it is claimed that this ratio “is a robust measure and is based on data gathered by QS from universities or from national bodies such as the UK’s Higher Education Statistics Agency, on a prescribed definition of staff and students” (p 75).
Chapter 9 of the book consists of the ranking of the world’s top 200 universities originally published in the THES in October 2006. The rankings consist of an overall score for each university and scores for various components one of which is for the number of students per faculty. This section accounted for 20% of the total ranking. Chapter 11 consists of profiles of the top 100 universities, which among other things, include data for student faculty ratio. Chapter 12 is a directory of over 500 universities which in most cases also includes the student faculty ratio.
Table 1 below shows the top ten universities in the world according to the faculty student score in the university rankings, which is indicated in the middle column. It is possible to reconstruct the process by which the scores in THES rankings were calculated by referring to QS’s topuniversities site which provides information, including numbers of students and faculty, about each university in the top 200, as well as more than 300 others.
There can be no doubt that the data on the web site is that from which the faculty student score has been calculated. Thus Duke has, according to QS, 11,106 students and 3,192 faculty or a ratio of 3.48 students per faculty which was converted to a score of 100. Harvard has 24,648 students and 3,997 faculty, a ratio of 6.17, which was converted to a score of 56. MIT has 10,320 students and 1,253 faculty, a ratio of 8.24 converted to a score of 42 and so on. There seems, incidentally, to have been an error in calculating the score for Princeton. The right hand column in table 1 shows the ratio of students per faculty, based on the data provided in the rankings for the ten universities with the best score on this component.
Table 1
University ...................................... score ..... students per faculty
1. Duke ......................................... 100 ........ 3.48
2. Yale ......................................... 93 ......... 3.74
3. Eindhoven University of Technology ........... 92 ......... 3.78
4. Rochester .................................... 91 ......... 3.82
5. Imperial College London ...................... 88 ......... 4.94
6. Sciences Po Paris ............................ 86 ......... 4.05
7= Tsing Hua, PRC ............................... 84 ......... 4.14
7= Emory ........................................ 84 ......... 4.14
9= Geneva ....................................... 81 ......... 4.30
9= Wake Forest .................................. 81 ......... 4.30
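The benchmarking implied by these numbers appears to be straightforward: each university's students-per-faculty ratio is divided into the best (lowest) ratio and scaled to 100. A sketch, assuming that is indeed the formula QS used, with the student and faculty counts taken from the topuniversities site as quoted above:

```python
# Reconstruct the faculty-student score: benchmark every ratio against the best.
data = {  # (students, faculty) as reported on QS's topuniversities site
    "Duke": (11_106, 3_192),
    "Harvard": (24_648, 3_997),
    "MIT": (10_320, 1_253),
}

ratios = {u: s / f for u, (s, f) in data.items()}
best = min(ratios.values())   # Duke's 3.48 students per faculty

for u, r in ratios.items():
    score = round(100 * best / r)
    print(f"{u}: ratio {r:.2f}, score {score}")
# Reproduces the published scores: Duke 100, Harvard 56, MIT 42.
```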
Table 2 shows the eleven best universities ranked for students per faculty according to the profile and directory in the Guide. It may need to be revised after another search. You will notice immediately that there is no overlap at all between the two lists. The student faculty ratio in the profile and directory is indicated in the right hand column.
Table 2
University ...................................................... students per faculty
1. Kyongpook National University, Korea ......................... 0
2. University of California at Los Angeles (UCLA) ............... 0.6
3= Pontificia Universidade Catolica do Rio de Janeiro, Brazil ... 3.8
3= Ecole Polytechnique Paris .................................... 3.8
5. Ljubljana, Slovenia .......................................... 3.9
6= Kanazawa, Japan .............................................. 4.0
6= Oulu, Finland ................................................ 4.0
8= Edinburgh .................................................... 4.1
8= Trento, Italy ................................................ 4.1
10= Utrecht, Netherlands ........................................ 4.3
10= Fudan, PRC .................................................. 4.3
The figures for Kyongpook and UCLA are obviously simple data entry errors. The figure for Ecole Polytechnique might not be grotesquely wrong if part-timers were included. But I remain very sceptical about such low ratios for universities in Brazil, China, Finland and Slovenia.
Someone who was looking for a university with a commitment to teaching would end up with dramatically different results depending on whether he or she checked the rankings or the profile and directory. A search of the first would produce Duke, Yale, Eindhoven and so on. A search of the second would produce (I’ll assume even the most naïve student would not believe the ratios for Kyongpook and UCLA) Ecole Polytechnique, Ljubljana, Kanazawa and so on.
Table 3 below compares the figures for student faculty ratio derived from the rankings on the left with those given in the profile and directory sections of the Guide, on the right.
Table 3.
University ............................. rankings ..... profile/directory
Duke ................................... 3.48 .......... 16.7
Yale ................................... 3.74 .......... 34.3
Eindhoven University of Technology ..... 3.78 .......... 31.1
Rochester .............................. 3.82 ........... 7.5
Imperial College London ................ 4.94 ........... 6.6
Sciences Po, Paris ..................... 4.05 .......... 22.5
Tsing Hua .............................. 4.14 ........... 9.3
Emory .................................. 4.14 ........... 9.9
Geneva ................................. 4.30 ........... 8.4
Wake Forest ............................ 4.30 .......... 16.1
UCLA .................................. 10.20 ........... 0.6
Ecole Polytechnique, Paris ............. 5.4 ............ 3.8
Edinburgh .............................. 8.3 ............ 4.1
Utrecht ............................... 13.9 ............ 4.3
Fudan ................................. 19.3 ............ 4.3
There seems to be no relationship whatsoever between the ratios derived from the rankings and those given in the profiles and directory.
Logically, there are three possibilities. The ranking data is wrong. The directory data is wrong. Both are wrong. It is impossible for both to be correct.
In a little while, I shall try to figure out where QS got the data for both sets of statistics. I am beginning to wonder, though, whether they got them from anywhere.
To call the faculty student score a robust measure is ridiculous. As compiled and presented by THES and QS, it is as robust as a pile of dead jellyfish.
Friday, February 09, 2007
Guide to the World’s Top Universities
Guide to the World’s Top Universities: Exclusively Featuring the Official Times Higher Education Supplement QS World University Rankings. John O’Leary, Nunzio Quacquarelli and Martin Ince (QS Quacquarelli Symonds Limited/Blackwell Publishing 2006)
Here are some preliminary comments on the THES-QS guide. A full review will follow in a few days.
The Times Higher Education Supplement and QS Quacquarelli Symonds have now produced a book, published in association with Blackwell’s. The book incorporates the 2006 world university rankings of 200 universities and the rankings by peer review of the top 100 universities in disciplinary areas. It also contains chapters on topics such as choosing a university, the benefits of studying abroad and tips for applying to university. There are profiles of the top 100 universities in the THES-QS rankings and a directory containing data about over 500 universities.
The book is attractively produced and contains a large amount of information. A superficial glance would suggest that it would be a very valuable resource for anybody thinking about applying to university or anybody comparing universities for any reason. Unfortunately, this would be a mistake.
There are far too many basic errors. Here is a list, almost certainly incomplete. Taken individually they may be trivial but collectively they create a strong impression of general sloppiness.
“University of Gadjah Mada” (p91). Gadjah Mada was a person not a place.
In the factfile for Harvard (p119) the section Research Impact by Subject repeats information given in the previous section on Overall Research Performance.
The factfile for Yale (p 127) reports a Student Faculty Ratio of 34.3 , probably ten times too high.
The directory (p 483) provides data about something called the “Official University of Califormia, Riverside”. No doubt someone was cutting and pasting from the official university website.
Zurich, Geneva, St Gallen and Lausanne are listed as being in Sweden (p 462-3)
Kyungpook National University, Korea, has a Student faculty Ratio of 0:1. (p 452)
New Zealand is spelt New Zeland (p441)
There is a profile for the Indian Institutes of Technology [plural] (p 231) but the directory refers to only one in New Delhi (p 416).
Similarly, there is a profile for the Indian Institutes of Management [plural] (p 253) but the directory refers to one in Lucknow (p416)
On p 115 we find the “University of Melbourneersity”
On p 103 there is a reference to "SUNY" (State University of New York) that does not specify which of the four university centres of the SUNY system is referred to.
Malaysian universities are given the bahasa rojak (salad language) treatment and are referred to as University Putra Malaysia and University Sains Malaysia. (p437-438)
UCLA has a student faculty ratio of 0.6:1 (p483)
There will be further comments later.
Monday, February 05, 2007
The Rise of Seoul National University
One remarkable feature of the THES-QS world university rankings has been the rise of the Seoul National University (SNU) in the Republic of Korea from 118th place in 2004 to 93rd in 2005 and then to 63rd in 2006. This made SNU the eleventh best university in Asia in 2006 and placed it well above any other Korean university.
This was accomplished in part by a rise in the peer review score from 39 to 43. Also, SNU scored 13 on the recruiter rating compared with zero in 2005. However, the most important factor seems to be an improvement in the faculty student score from 14 in 2005 to 57 in 2006.
How did this happen? If we are to believe QS, it was because of a remarkable expansion in the number of SNU’s faculty. In 2005, according to QS’s topgraduate site, SNU had a total of 31,509 students and 3,312 faculty, or 9.51 students per faculty. In 2006, again according to QS, SNU had 30,120 students and 4,952 faculty, a ratio of 6.08. The numbers provided for students seem reasonable. SNU’s site refers to 28,074 students. It is not implausible that QS’s figures included some categories, such as non-degree, part-time or off-campus students, that were not counted by SNU.
The number of faculty is, however, another matter. The SNU site refers to 28,074 students and 1,927 full time equivalent faculty members. There are also “1,947 staff members”. It is reasonable to assume that the latter are non-teaching staff such as technicians and librarians.
Further down the SNU site, things begin to get confusing. As of 1st April 2006, according to the site, there were 3,834 “teaching faculty” and 1,947 “educational staff”. Presumably these are the same as the earlier 1,947 “staff members”.
The mystery now is how 1,927 full time equivalent faculty grew to 3,834 teaching faculty. The latter figure would seem to be completely wrong if only because one would expect teaching faculty to be fewer than total faculty.
Since 1,927 full time equivalent faculty plus 1,947 staff members adds up to 3,874, a little more than 3,834, it could be that “faculty” and “staff” were combined to produce a total for “teaching faculty”.
Another oddity is that SNU has announced on this site that it has a student-faculty ratio of 4.6. I am baffled as to how this particular statistic was arrived at.
QS should, I suppose, get some credit for not accepting this thoroughly implausible claim. Its ratio of 6.08 is, however, only slightly better and seems dependent on accepting a figure of 4,952 faculty. Unless somebody has been fabricating data out of very thin air, the most plausible explanation I can think of is that QS constructed the faculty statistic from a source that did something like taking the already inflated number of teaching faculty and then adding the professors. Perhaps the numbers were obtained in the course of a telephone conversation over a bad line.
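The competing figures are easy to line up. A quick sketch using the numbers quoted above (the "faculty plus staff" addition is my hypothesis from the preceding paragraphs, not anything SNU or QS has confirmed):

```python
# Student-faculty ratios implied by the various SNU figures.
qs_2005 = 31_509 / 3_312    # QS's 2005 data -> about 9.51
qs_2006 = 30_120 / 4_952    # QS's 2006 data -> about 6.08
print(f"QS ratios: {qs_2005:.2f} (2005), {qs_2006:.2f} (2006)")

# Hypothesis: "teaching faculty" = full-time-equivalent faculty + non-teaching staff.
fte_faculty, staff = 1_927, 1_947
print(fte_faculty + staff)  # 3,874 -- suspiciously close to the 3,834 "teaching faculty"

# A ratio using SNU's own student count and the "visual statistics" faculty figure.
print(f"{28_074 / 1_733:.1f} students per faculty")  # about 16
```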
And the real ratio? On the SNU site there is a "visual statistics page" that refers to 1,733 "faculty members" in 2006. This seems plausible. Also, just have a look at what the MBA Dauphine-Sorbonne-Renault programme, which has partnerships with Asian and Latin American universities, says:
"Founded in 1946, Seoul National University (SNU) marked the opening of the first national university in modern Korean history. As an indisputable leader of higher education in Korea, SNU has maintained the high standard of education in liberal arts and sciences. With outstanding cultural and recreational benefits, SNU offers a wide variety of entertainment opportunities in the college town of Kwanak and in the city of Seoul.
SNU began with one graduate school and nine colleges, and today SNU has 16 colleges, 3 specialized graduate schools , 1 graduate school, 93 research institutes, and other supporting facilities, which are distributed over 2 campuses.
Currently, SNU has a student enrollment of approximately 30,600 degree candidates, including 27,600 undergraduates and 3,000 graduates. Also SNU has approximately 700 foreign students from 70 different countries. Maintaining the faculty of student ratio of 1:20, over 1,544 faculty and around 30 foreign professors are devoted to teaching SNU students to become leaders in every sector of Korean Society.
With the ideal of liberal education and progressive visions for the 21st century, SNU will continue to take a leading position as the most prestigious, research-oriented academic university in South Korea. " (my italics)
A Student-faculty ratio of around 20 seems far more realistic than the 4.6 claimed by SNU or QS's 6.08. An explanation would seem to be in order from SNU and from QS.
Monday, January 29, 2007
On January 23rd I wrote to John O’Leary, Editor of the Times Higher Education Supplement concerning the data for Duke University in the 2006 world university rankings. I had already pointed out that the 2006 data appeared to be inaccurate and that, since Duke had the best score in the faculty–student section against which the others were benchmarked, all the scores in this section and therefore all the overall scores were inaccurate. There has to date been no reply. I am therefore publishing this account of how the data for Duke may have been constructed.
It has been clear for some time that the score given to Duke for this section of the rankings and the underlying data reported on the web sites of THES’s consultants, QS Quacquarelli Symonds, were incorrect and that Duke should not have been the highest scorer in 2006 on this section. Even the Duke administration has expressed its surprise at the published data. What has not been clear is how QS could have come up with data so implausible and so different from those provided by Duke itself. I believe I have worked out how QS probably constructed these data, which have placed Duke at the top of this part of the rankings so that it has become the benchmark for every other score.
In 2005 Duke University made an impressive ascent up the rankings, from 52nd to 11th. This rise was due in large part to a remarkable score for faculty-student ratio. In that year Duke was reported by QS, on their topgraduates site, to have a total of 12,223 students, comprising 6,248 undergraduates and 5,975 postgraduates, and 6,244 faculty, producing a ratio of 1.96 students per faculty. The figure for faculty was clearly an error, since Duke itself claimed to have only 1,595 tenure and tenure-track faculty; it was almost certainly caused by someone entering the number of undergraduate students at Duke, 6,244 in the fall of 2005, into the space for faculty on the QS database. In any case, someone should have pointed out that large non-specialist institutions, no matter how lavishly they are funded, simply do not have fewer than two students per faculty.
In 2006 the number of faculty and students listed on QS’s topuniversities site was not so obviously incredible and erroneous but was still quite implausible.
According to QS, there were in 2006 11,106 students at Duke, of whom 6,301 were undergraduates and 4,805 postgraduates. It is unbelievable that a university could reduce the number of its postgraduate students by over a thousand, based on QS’s figures, or about two thousand, based on data on the Duke web site, in the course of a single year.
There were in 2006, according to QS, 3,192 faculty at Duke. This is not quite as incredible as the number claimed in 2005 but is still well in excess of the number reported on the Duke site.
So where did the figures, which have placed Duke at the top of the faculty student ratio component in 2006, come from? The problem evidently faced by whoever compiled the data is that the Duke site has not updated its totals of students and faculty since the fall of 2005, but has provided partial information about admissions and graduations, which was apparently used in an attempt to estimate enrollment for the fall of 2006.
If you look at the Duke site you will notice that there is some information about admissions and graduations. At the start of the academic year of 2005 – 2006 (the “class of 2009”) 1,728 undergraduates were admitted and between July 1st, 2005 and June 30th, 2006 1,670 undergraduate degrees were conferred.
So, working from the information provided by Duke about undergraduate students we have;
6,244 - 1,670 + 1,728 = 6,302
The QS site indicates 6,301 undergraduate students in 2006.
It seems likely that the number of undergraduates in the fall of 2006 was calculated by adding the number of admissions in the fall of 2005 (it should actually have been the fall of 2006) to the number enrolled in the fall of 2005 and deducting the number of degrees conferred between July 2005 and June 2006. The total thus obtained differs by one digit from that listed by the QS site. This is most probably a simple data entry error. The total obtained by this method would not of course be completely valid since it did not take account of students leaving for reasons other than receiving a degree. It would, however, probably be not too far off the correct number.
The number of postgraduate students is another matter. It appears that there was a botched attempt to use the same procedure to calculate the number of graduate students in 2006. The problem, though, was that the Duke site does not indicate enrollment of postgraduate students in that year. In the fall of 2005 there were 6,844 postgraduate students. Between July 2005 and June 2006 2,348 postgraduate and professional degrees were awarded, according to the Duke site. This leaves 4,496 postgraduate students. The QS topuniversities site reports that there were 4,805 postgraduates in 2006. This is a difference of 309.
So where did the extra 309 postgraduates come from? Almost certainly the answer is provided by the online Duke news of September 6, 2006 which refers to a total of 1,687 first year undergraduate students composed of 1,378 entering the Trinity College of Arts and Science (Trinity College is the old name of Duke retained for the undergraduate school) and 309 undergraduates entering the Pratt School of Engineering. The total number of admissions is slightly different from the number given on the Duke main page but this may be explained by last minute withdrawals or a data entry error.
So it looks as if someone at QS took the number of postgraduate students in 2005, deducted the number of degrees awarded, and added the students admitted to the Pratt School of Engineering in the fall of 2006, arriving at the total of 4,805 for 2006. This is way off the mark: the 309 students admitted to the School of Engineering are not postgraduates, as is evident from their inclusion in the class of 2010, and no postgraduate admissions of any kind were counted. The result is that Duke appears, erroneously, to have lost about 2,000 postgraduate students between 2005 and 2006.
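Again, the arithmetic of this reconstruction is easy to verify. The following sketch assumes, as argued above, that the 309 Pratt School undergraduates were mistakenly added to the postgraduate total:

```python
postgrads_fall_2005 = 6844     # postgraduate students, fall 2005 (Duke site)
postgrad_degrees = 2348        # postgraduate and professional degrees, 7/2005 - 6/2006
pratt_first_years = 309        # first-year UNDERGRADUATES, Pratt School, fall 2006

remainder = postgrads_fall_2005 - postgrad_degrees
print(remainder)                         # 4496
print(remainder + pratt_first_years)     # 4805: exactly the QS figure
```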
The undergraduate and postgraduate figures were then apparently combined on the QS site to produce a total of 11,106 students, or about 1,000 fewer than QS reported in 2005 and about 2,000 fewer than indicated by Duke for that year.
What about the number of faculty? Here, QS’s procedure appears to get even dodgier. The Duke site refers to 1,595 tenure and tenure-track faculty; the QS site refers to 3,192 faculty. Where does the difference come from? The answer ought to be obvious, and I am embarrassed to admit that it took me a couple of hours to work it out. 1,595 multiplied by 2 is 3,190, exactly 2 less than QS’s figure. The slight difference is probably another data entry error or perhaps an earlier error of addition.
The Duke site contains a table of faculty classified according to school – Arts and Sciences, Engineering, Divinity and so on, adding up to 1,595 – and then classified according to status – full, associate and assistant professors – again adding up to 1,595. It would seem likely that someone assumed that the two tables referred to separate groups of faculty and added them together.
So, having reduced the number of students by omitting postgraduate admissions and doubled the number of faculty by counting them twice, QS seem to have come up with a ratio of 3.48 students per faculty member. This gave Duke the best score for this part of the ranking, against which all other scores were calibrated. The standardized score of 100 should in fact have been given to Yale, assuming, perhaps optimistically, that its ratio has been calculated correctly.
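The two errors combine to reproduce QS's ratio exactly. A quick sketch, again on the assumption that my reconstruction of the method is right:

```python
qs_students = 6301 + 4805    # undergraduates + "postgraduates" on the QS site
qs_faculty = 1595 * 2 + 2    # faculty counted twice, plus a stray 2

print(qs_students)                       # 11106
print(qs_faculty)                        # 3192
print(round(qs_students / qs_faculty, 2))  # 3.48 students per faculty member
```

A ratio of 3.48 students per faculty member at a large general university should itself have been a red flag.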
It follows that every score for the faculty-student ratio is incorrect, and therefore that every overall score is incorrect.
If there is another explanation for the unbelievable Duke statistics, I would be glad to hear it. But I think that if there is going to be a claim that an official at Duke provided information that is so obviously incorrect, then the details of the communication should be provided. If information was obtained from another source, although I do not see any way that it could have been, it should be indicated. Whatever the source of the error, someone at QS ought to have checked the score of the top university in each component and should have realized immediately that major universities do not reduce the number of their students so dramatically in a single year and keep it secret. Nor is it plausible that a large general university could have a ratio of 3.48 students per faculty member.
To show that this reconstruction of QS’s methods is mistaken would require nothing more than indicating the source of the data and an e-mail address or citation by which it could be verified.
Friday, January 12, 2007
Something very odd has been going on at the University of Technology Sydney (UTS), if we can believe QS Quacquarelli Symonds, THES's consultants.
In 2005, according to QS, UTS had a faculty of 866, of whom 253 were international. The latter figure is definitely not real information but simply represents 29% of the total faculty, which is QS's estimate or guess for Australian universities in general. This should have given UTS a score of 53 for the international faculty component of the 2005 world university rankings, although the score actually given was 33, presumably the result of a data entry error. UTS was ranked 87th in the 2005 rankings.
In 2006, according to QS, the number of faculty at UTS increased dramatically to 1,224. However, the number of international faculty dropped to precisely zero. Partly as a result of this UTS's position in the rankings fell to 255.
Meanwhile, UTS itself reports that it has 2,576 full time equivalent faculty.
On the 21st of December I received a message from John O'Leary, Editor of THES, saying that he had sent my questions about the world university rankings to QS and that he hoped to get back to me in the new year, since UK companies often have an extended Christmas break.
Assuming it started on December 25th, the break has now lasted for 18 days.
Monday, January 01, 2007
That is the opinion of the Gadfly, a blog run by four Harvard undergraduates, of the THES world rankings. Here is a quotation:
"The Times Higher Education Supplement (THES) just released their global rankings, and it’s an utter scandal. Rife with errors of calculation, consistency and judgment, it is a testament not only to this ridiculous urge to rank everything but also to the carelessness with which important documents can be compiled."
The post concludes:
"One cannot help but think that the THES rankings are a British ploy to feel good about Oxford and Cambridge, the former of which is having a hard time pushing through financial reforms. Both are really universities who should be doing better, and are not. It may explain why Cambridge ups Harvard on the THES peer review, despite the fact that it lags behind Harvard under almost every other criteria, like citations per faculty, and citations per paper in specific disciplines."
Bangor is Very Naughty
Bangor University in Wales has apparently been fiddling with its exam results in order to boost its position in university rankings (not, this time, the THES world rankings). One wonders how much more of this sort of thing goes on. Anyway, here is an extract from the report in the THES. Contrary to what many people in Asia and the US think, the THES and the Times are separate publications.
And congratulations to Sam Burnett.
Bangor University was accused this week of lowering its academic standards with a proposal to boost the number of first-class degrees it awards.
According to a paper leaked to The Times Higher, the university agreed a system for calculating student results that would mean that about 60 per cent of graduates would obtain either a first or an upper-second class degree in 2007, compared with about 52 per cent under the current system.
The paper, by pro vice-chancellor Tom Corns, says that the university's key local rival, Aberystwyth University, "awarded 6.7 per cent more first and upper-second class degrees than we did". At the time, this helped place Bangor eight positions below Aberystwyth in The Times 2005 league table of universities.
He says: "We must redress the balance with all expedition", meaning the reforms are likely to take effect for 2007 graduates rather than for the 2007 entry cohort.
The move prompted heavy criticism this week. Alan Smithers, director of the Centre for Education and Employment Research at Buckingham University, said: "Hitherto, universities have been trusted to uphold degree standards, but such behaviour calls into question the desirability of continuing to allow them free rein in awarding their own degrees. Perhaps there should be an independent regulatory body."
He suggested that a body such as the Qualifications and Curriculum Authority, which regulates schools' exam awards, could be set up for higher education.
Sam Burnett, president of Bangor student union, said that Bangor had been "very naughty".
"The issue isn't about the system that should be in place... University figures seem to have identified the quickest way to boost Bangor up the league tables and will cheapen degrees in the process. Maybe it would be easier just to add 5 per cent to everyone's scores next July."
Thursday, December 21, 2006
John O'Leary, editor of the THES, has replied to my open letter:
Dear Mr Holmes
Thank you for your email about our world rankings. As you have raised a number of detailed points, I have forwarded it to QS for their comments. I will get back to you as soon as those have arrived but I suspect that may be early in the New Year, since a lot of UK companies have an extended Christmas break.
Best wishes
John O'Leary
Editor
The Times Higher Education Supplement
If nothing else, perhaps we will find out how long an extended Christmas break is.