Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Monday, February 26, 2007
The THES-QS rankings of 2005 and 2006 are heavily weighted towards its so-called peer review, which receives 40% of the total ranking score. No other component gets more than 20%. The "peer review" is supposed to be a survey of research-active academics from around the world. One would therefore expect it to be based on a representative sample of the international research community, or "global opinion", as THES claimed in 2005. It is, however, nothing of the sort.
The review was based on e-mails sent to people included in a database purchased from World Scientific Publishing Company. This is a publishing company that was founded in 1981. It now has 200 employees at its main office in Singapore, with subsidiary offices in New Jersey, London, Hong Kong, Taipei, Chennai and Beijing. It claims to be the leading publisher of scientific journals and books in the Asia-Pacific region.
World Scientific has several subsidiaries. These include Imperial College (London) Press, which publishes books and journals on engineering, medicine, information technology, environmental technology and management; Pan Stanford Publishing of Singapore, which publishes in such fields as nanoelectronics, spintronics, biomedical engineering and genetics; and KH Biotech Services Singapore, which specialises in biotechnology, pharmaceuticals, food and agriculture, along with consultancy, training and conference organisation services. World Scientific also distributes books and journals produced by The National Academies Press (based in Washington, D.C.) in most countries in Asia (but not in Japan).
World Scientific has particularly close links with China, especially with Peking University. Their newsletter of November 2005 reports that:
”The last few years has seen the rapid growth of China's economy and academic sectors. Over the years, World Scientific has been actively establishing close links and putting down roots in rapidly growing China”
Another report describes a visit from Chinese university publishers:
”In August 2005, World Scientific Chairman Professor K. K. Phua, was proud to receive a delegation from the China University Press Association. Headed by President of Tsinghua University Press Professor Li Jiaqiang, the delegation comprised presidents from 18 top Chinese university publishing arms. The parties exchanged opinions on current trends and developments in the scientific publishing industry in China as well as Singapore. Professor Phua shared many of his experiences and expressed his interest in furthering collaboration with Chinese university presses. “
World Scientific has also established very close links with Peking University:
”World Scientific and Peking University's School of Mathematical Sciences have, for many years, enjoyed a close relationship in teaching, research and academic publishing. To further improve the close cooperation, a "World Scientific - Peking University Work Room" has been set up in the university to serve the academic communities around the world, and to provide a publishing platform to enhance global academic exchange and cooperation. World Scientific has also set up a biannual "World Scientific Scholarship" in the Peking School of Mathematical Sciences. The scholarship, totaling RMB 30,000 per annum and administered by the university, aims to reward and encourage students and academics with outstanding research contributions.”
Here are some of the titles published by the company:
China Particuology
Chinese Journal of Polymer Science
Asia Pacific Journal of Operational Research
Singapore Economic Review
China: An International Journal
Review of Pacific Basin Financial Markets and Policies
Asian Case Research Journal
It should be clear by now that World Scientific is active mainly in the Asia-Pacific region, with an outpost in London. It seems more than likely that its database, which might be the list of subscribers or its mailing list, would be heavily biased towards the Asia-Pacific region. This goes a long way towards explaining why Chinese, Southeast Asian and Australasian universities do so dramatically better on the peer review than they do on the citations count or any other measure of quality.
I find it inconceivable that QS were unaware of the nature of World Scientific when they purchased the database and sent out the e-mails. To claim that the peer review is in any sense an international survey is absurd. QS have produced what may some day become a classic example of how bad sampling technique can destroy the validity of a survey.
Monday, February 19, 2007
Professor Simon Marginson of the University of Melbourne has made some very appropriate comments to The Age about the THES - QS rankings and Australian universities.
Professor Marginson told The Age that a lack of transparency in the rankings method means that universities could be damaged through no fault of their own.
'"Up to now, we in Australian universities have done better out of the Times rankings than our performance on other indicators would suggest," he said. "But it could all turn around and start working against us, too."
The Times rankings are volatile because surveys of employers and academics are open to manipulation, subjectivity and reward marketing over research, Professor Marginson said.'
The admitted, extraordinarily low response rate to the THES-QS "peer review", combined with the overrepresentation of Australian "research-active academics" among the respondents, is enough to confirm Professor Marginson's remarks about the rankings.
Friday, February 16, 2007
Another problem with the peer review section of the THES-QS World University Rankings is that it is extremely biased against certain countries and biased in favour of certain others. Here is an incomplete list of countries where respondents to the peer review survey are located and the number of respondents.
USA 532
UK 378
India 256
Australia 191
Canada 153
Malaysia 112
Germany 103
Indonesia 93
Singapore 92
China 76
France 56
Japan 53
Mexico 51
Thailand 37
Israel 36
Iran 31
Taiwan 29
South Korea 27
Hong Kong 25
New Zealand 25
Pakistan 23
Finland 23
Nigeria 20
How far does the above list reflect the distribution of research expertise throughout the world? Here is a list of the same countries with the number of academics listed in Thomson ISI Highly Cited Researchers.
USA 3,825
UK 439
India 11
Australia 105
Canada 172
Malaysia 0
Germany 241
Indonesia 0
Singapore 4
China (excluding Hong Kong) 4
France 56
Japan 246
Mexico 3
Thailand 0
Israel 47
Iran 1
Taiwan 9
South Korea 3
Hong Kong 14
New Zealand 17
Pakistan 1
Finland 15
Nigeria 0
The number of highly cited scholars is not a perfect measure of research activity -- for one thing, some disciplines cite more than others -- but it does give us a broad picture of the research expertise of different countries.
The peer review is outrageously biased against the United States, extremely biased against Japan, and very biased against Canada, Israel and European countries such as France, Germany, Switzerland and the Netherlands.
On the other hand, there is a strong bias towards China (less so Taiwan and Hong Kong), India, Southeast Asia and Australia.
Now we know why Cambridge does so much better in the peer review than Harvard despite an inferior research record, why Peking University is apparently among the best in the world, why there are so many Australian universities in the top 200, and why the world's academics supposedly cite Japanese researchers copiously but cannot bring themselves to vote for them in the peer review.
Thursday, February 15, 2007
QS Quacquarelli Symonds have published additional information on their web site concerning the selection of the initial list of universities and the administration of the "peer review". I would like to focus on just one issue for the moment, namely the response rate to the e-mail survey. Ben Sowter of QS had already claimed to have surveyed more than 190,000 academics to produce the review. He had said:
"Peer Review: Over 190,000 academics were emailed a request to complete our online survey this year. Over 1600 responded - contributing to our response universe of 3,703 unique responses in the last three years. Previous respondents are given the opportunity to update their response." (THES-QS World University Rankings _ Methodology)
This is a response rate of about 0.8%, less than 1%. I had assumed that the figure of 190,000 was a typographical error and that it should have been 1,900. A response rate of 80% would have been on the high side, but perhaps respondents were highly motivated by being included in the ranks of "smart people" or by the prospect of winning a BlackBerry organiser.
However, the new information provided appears to suggest that QS did survey such a large number.
"So, each year, phase one of the peer review exercise is to invite all previous reviewers to return and update their opinion. Then we purchase two databases, one of 180,000 international academics from the World Scientific (based in Singapore) and another of around 12,000 from Mardev - focused mainly on Arts & humanities which is poorly represented in the former.
We examine the responses carefully and discard any test responses and bad responses and look for any non-academic responses that may have crept in. " (Methodology-- The Peer Review)
There is a gap between "we purchase" and "we examine the responses", but the implication is that about 192,000 academics were sent e-mails.
If this is the case then we have an extraordinarily low response rate, probably a record in the history of survey research. Kim Sheehan, in an article in the Journal of Computer-Mediated Communication, reports that 31 studies of e-mail surveys show a mean response rate of about 37%. Response rates have been declining in recent years, but even in 2004 the mean response rate was about 24%.
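To make the arithmetic explicit, here is a minimal sketch of the calculation, using only the figures quoted above (the 192,000 total is simply the sum of the two purchased databases, and the benchmark rates are Sheehan's):

```python
# Rough response-rate arithmetic for the THES-QS "peer review" e-mail survey,
# using only the figures quoted above. The 192,000 total is simply the sum of
# the two purchased databases and is therefore an approximation.

emails_sent = 180_000 + 12_000   # World Scientific list + Mardev list
responses = 1_600                # responses claimed for the latest year

response_rate = responses / emails_sent
print(f"Response rate: {response_rate:.2%}")          # about 0.83%

# Benchmarks from Sheehan's review of e-mail surveys, for comparison only
mean_rate_31_studies = 0.37
mean_rate_2004 = 0.24
print(f"Typical e-mail survey response is roughly "
      f"{mean_rate_31_studies / response_rate:.0f} times higher")
```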
Either QS did not send out so many e-mails, or there was something wrong with the database, or something else is wrong. Whatever it is, such a low response rate is in itself enough to render a survey invalid. An explanation is needed.
Wednesday, February 14, 2007
The Technical University of Munich has pulled off a major feat. It has been awarded not one but two places among the world's top 100 universities in the THES-QS book, Guide to the World's Top Universities. The Guide has also managed to move a major university several hundred miles.
In 2006 the THES-QS world university rankings placed the Technical University of Munich in 82nd place and the University of Munich at 98th.
The new THES-QS Guide has profiles of the top 100 universities. On page 283 and in 82nd place we find the Technical University Munich. Its address is given as "Germany". How very helpful. The description is clearly that of the Technical University and so is the data in the factfile.
On page 313 the Technical University Munich appears again, now in 98th place. The description is identical to that on page 283, but the information in the factfile is different and appears to refer to the (Ludwig-Maximilians) University of Munich. The university is given an address in Dortmund, in a completely different state, and the web site appears to be that of the University of Munich.
Turning to the directory, we find that "Universitat Munchen" is listed, again with an address in Dortmund, and the Technische Universitat Munchen is on page 409, without an address. This time the data for the two universities appear to be correct.
Sunday, February 11, 2007
A Robust Measure
There is something very wrong with the THES-QS Guide to the World’s Top Universities, recently published in association with Blackwell’s of London. I am referring to the book’s presentation of two completely different sets of data for student faculty ratio.
In the Guide, it is claimed that this ratio “is a robust measure and is based on data gathered by QS from universities or from national bodies such as the UK’s Higher Education Statistics Agency, on a prescribed definition of staff and students” (p 75).
Chapter 9 of the book consists of the ranking of the world's top 200 universities originally published in the THES in October 2006. The rankings consist of an overall score for each university and scores for various components, one of which is for the number of students per faculty member. This component accounted for 20% of the total ranking. Chapter 11 consists of profiles of the top 100 universities, which, among other things, include data for student-faculty ratio. Chapter 12 is a directory of over 500 universities which in most cases also includes the student-faculty ratio.
Table 1 below shows the top ten universities in the world according to the faculty student score in the university rankings, which is indicated in the middle column. It is possible to reconstruct the process by which the scores in THES rankings were calculated by referring to QS’s topuniversities site which provides information, including numbers of students and faculty, about each university in the top 200, as well as more than 300 others.
There can be no doubt that the data on the web site are those from which the faculty-student score was calculated. Thus Duke has, according to QS, 11,106 students and 3,192 faculty, or a ratio of 3.48 students per faculty member, which was converted to a score of 100. Harvard has 24,648 students and 3,997 faculty, a ratio of 6.17, which was converted to a score of 56. MIT has 10,320 students and 1,253 faculty, a ratio of 8.24, converted to a score of 42, and so on. There seems, incidentally, to have been an error in calculating the score for Princeton. The right-hand column in Table 1 shows the ratio of students per faculty, based on the data provided in the rankings, for the ten universities with the best score on this component.
Table 1
Rank   University                             Faculty-student score   Students per faculty
1      Duke                                   100                     3.48
2      Yale                                    93                     3.74
3      Eindhoven University of Technology      92                     3.78
4      Rochester                               91                     3.82
5      Imperial College London                 88                     4.94
6      Sciences Po Paris                       86                     4.05
7=     Tsing Hua, PRC                          84                     4.14
7=     Emory                                   84                     4.14
9=     Geneva                                  81                     4.30
9=     Wake Forest                             81                     4.30
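Incidentally, the conversion from ratio to score appears to be a simple normalisation against the best (lowest) ratio: score ≈ 100 × best ratio ÷ ratio. QS has not, as far as I know, published the formula, so the sketch below is only a reconstruction from the numbers quoted above, but it reproduces the scores for Duke, Harvard, MIT, Yale and Eindhoven:

```python
# Inferred conversion from students-per-faculty ratio to the THES-QS
# faculty-student score: normalise each ratio against the best (lowest)
# ratio and scale to 100. This is a reconstruction from the published
# numbers, not a formula stated by QS.

ratios = {
    "Duke": 3.48,
    "Harvard": 6.17,
    "MIT": 8.24,
    "Yale": 3.74,
    "Eindhoven": 3.78,
}

best = min(ratios.values())  # Duke's 3.48 students per faculty member

for name, ratio in ratios.items():
    score = round(100 * best / ratio)
    print(f"{name}: ratio {ratio} -> score {score}")

# Output matches the published scores: Duke 100, Harvard 56, MIT 42,
# Yale 93 and Eindhoven 92. Not every published score fits this rule
# exactly (the text above notes an apparent error for Princeton).
```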
Table 2 shows the eleven best universities ranked for students per faculty according to the profile and directory in the Guide. It may need to be revised after another search. You will notice immediately that there is no overlap at all between the two lists. The student faculty ratio in the profile and directory is indicated in the right hand column.
Table 2
Rank   University                                                    Students per faculty
1      Kyungpook National University, Korea                          0
2      University of California at Los Angeles (UCLA)                0.6
3=     Pontificia Universidade Catolica do Rio de Janeiro, Brazil    3.8
3=     Ecole Polytechnique Paris                                     3.8
5      Ljubljana, Slovenia                                           3.9
6=     Kanazawa, Japan                                               4.0
6=     Oulu, Finland                                                 4.0
8=     Edinburgh                                                     4.1
8=     Trento, Italy                                                 4.1
10=    Utrecht, Netherlands                                          4.3
10=    Fudan, PRC                                                    4.3
The figures for Kyungpook and UCLA are obviously simple data entry errors. The figure for the Ecole Polytechnique might not be grotesquely wrong if part-timers were included. But I remain very sceptical about such low ratios for universities in Brazil, China, Finland and Slovenia.
Someone who was looking for a university with a commitment to teaching would end up with dramatically different results depending on whether he or she checked the rankings or the profiles and directory. A search of the first would produce Duke, Yale, Eindhoven and so on. A search of the second would produce (I'll assume even the most naive student would not believe the ratios for Kyungpook and UCLA) the Ecole Polytechnique, Ljubljana, Kanazawa and so on.
Table 3 below compares the figures for student faculty ratio derived from the rankings on the left with those given in the profile and directory sections of the Guide, on the right.
Table 3
University                             Ratio from rankings   Ratio from profiles/directory
Duke                                   3.48                  16.7
Yale                                   3.74                  34.3
Eindhoven University of Technology     3.78                  31.1
Rochester                              3.82                  7.5
Imperial College London                4.94                  6.6
Sciences Po, Paris                     4.05                  22.5
Tsing Hua                              4.14                  9.3
Emory                                  4.14                  9.9
Geneva                                 4.30                  8.4
Wake Forest                            4.30                  16.1
UCLA                                   10.20                 0.6
Ecole Polytechnique, Paris             5.4                   3.8
Edinburgh                              8.3                   4.1
Utrecht                                13.9                  4.3
Fudan                                  19.3                  4.3
There seems to be no relationship whatsoever between the ratios derived from the rankings and those given in the profiles and directory.
Logically, there are three possibilities: the ranking data are wrong, the directory data are wrong, or both are wrong. It is impossible for both to be correct.
In a little while, I shall try to figure out where QS got the data for both sets of statistics. I am beginning to wonder, though, whether they got them from anywhere.
To call the faculty student score a robust measure is ridiculous. As compiled and presented by THES and QS, it is as robust as a pile of dead jellyfish.
Friday, February 09, 2007
Guide to the World’s Top Universities
Guide to the World’s Top Universities: Exclusively Featuring the Official Times Higher Education Supplement QS World University Rankings. John O’Leary, Nunzio Quacquarelli and Martin Ince (QS Quacquarelli Symonds Limited/Blackwell Publishing 2006)
Here are some preliminary comments on the THES-QS guide. A full review will follow in a few days.
The Times Higher Education Supplement and QS Quacquarelli Symonds have now produced a book, published in association with Blackwell's. The book incorporates the 2006 world university rankings of 200 universities and the rankings by peer review of the top 100 universities in disciplinary areas. It also contains chapters on topics such as choosing a university, the benefits of studying abroad and tips for applying to university. There are profiles of the top 100 universities in the THES-QS rankings and a directory containing data about over 500 universities.
The book is attractively produced and contains a large amount of information. A superficial glance would suggest that it would be a very valuable resource for anybody thinking about applying to university or anybody comparing universities for any reason. Unfortunately, this would be a mistake.
There are far too many basic errors. Here is a list, almost certainly incomplete. Taken individually they may be trivial but collectively they create a strong impression of general sloppiness.
“University of Gadjah Mada” (p91). Gadjah Mada was a person not a place.
In the factfile for Harvard (p119) the section Research Impact by Subject repeats information given in the previous section on Overall Research Performance.
The factfile for Yale (p 127) reports a student-faculty ratio of 34.3, probably ten times too high.
The directory (p 483) provides data about something called the “Official University of Califormia, Riverside”. No doubt someone was cutting and pasting from the official university website.
Zurich, Geneva, St Gallen and Lausanne are listed as being in Sweden (p 462-3).
Kyungpook National University, Korea, has a student-faculty ratio of 0:1 (p 452).
New Zealand is spelt New Zeland (p 441).
There is a profile for the Indian Institutes of Technology [plural] (p 231) but the directory refers to only one, in New Delhi (p 416).
Similarly, there is a profile for the Indian Institutes of Management [plural] (p 253) but the directory refers to only one, in Lucknow (p 416).
On p 115 we find the "University of Melbourneersity".
On p 103 there is a reference to "SUNY" (State University of New York) that does not specify which of the four university centres of the SUNY system is referred to.
Malaysian universities are given the bahasa rojak (salad language) treatment and are referred to as University Putra Malaysia and University Sains Malaysia (p 437-438).
UCLA has a student-faculty ratio of 0.6:1 (p 483).
There will be further comments later.
Monday, February 05, 2007
The Rise of Seoul National University
One remarkable feature of the THES-QS world university rankings has been the rise of the Seoul National University (SNU) in the Republic of Korea from 118th place in 2004 to 93rd in 2005 and then to 63rd in 2006. This made SNU the eleventh best university in Asia in 2006 and placed it well above any other Korean university.
This was accomplished in part by a rise in the peer review score from 39 to 43. Also, SNU scored 13 on the recruiter rating, compared with zero in 2005. However, the most important factor seems to be an improvement in the faculty-student score from 14 in 2005 to 57 in 2006.
How did this happen? If we are to believe QS, it was because of a remarkable expansion in the number of SNU's faculty. In 2005, according to QS's topgraduate site, SNU had a total of 31,509 students and 3,312 faculty, or 9.51 students per faculty member. In 2006, again according to QS, SNU had 30,120 students and 4,952 faculty, a ratio of 6.08. The numbers provided for students seem reasonable. SNU's site refers to 28,074 students. It is not implausible that QS's figures included some categories, such as non-degree, part-time or off-campus students, that were not counted by SNU.
The number of faculty is, however, another matter. The SNU site refers to 28,074 students and 1,927 full time equivalent faculty members. There are also “1,947 staff members”. It is reasonable to assume that the latter are non-teaching staff such as technicians and librarians.
Further down the SNU site, things begin to get confusing. As of 1 April 2006, according to the site, there were 3,834 "teaching faculty" and 1,947 "educational staff". Presumably these are the same as the earlier 1,947 "staff members".
The mystery now is how 1,927 full time equivalent faculty grew to 3,834 teaching faculty. The latter figure would seem to be completely wrong if only because one would expect teaching faculty to be fewer than total faculty.
Since 1,927 full time equivalent faculty plus 1,947 staff members adds up to 3,874, a little more than 3,834, it could be that "faculty" and "staff" were combined to produce a total for "teaching faculty".
Another oddity is that SNU has announced on this site that it has a student-faculty ratio of 4.6. I am baffled as to how this particular statistic was arrived at.
QS should, I suppose, get some credit for not accepting this thoroughly implausible claim. Its ratio of 6.08 is, however, only slightly better and seems to depend on accepting a figure of 4,952 faculty. Unless somebody has been fabricating data out of very thin air, the most plausible explanation I can think of is that QS constructed the faculty statistic from a source that did something like taking the already inflated number of teaching faculty and then adding the professors. Perhaps the numbers were obtained in the course of a telephone conversation over a bad line.
And the real ratio? On the SNU site there is a "visual statistics page" that refers to 1,733 "faculty members" in 2006. This seems plausible. Also, just have a look at what the MBA Dauphine-Sorbonne-Renault programme, which has partnerships with Asian and Latin American universities, says:
"Founded in 1946, Seoul National University (SNU) marked the opening of the first national university in modern Korean history. As an indisputable leader of higher education in Korea, SNU has maintained the high standard of education in liberal arts and sciences. With outstanding cultural and recreational benefits, SNU offers a wide variety of entertainment opportunities in the college town of Kwanak and in the city of Seoul.
SNU began with one graduate school and nine colleges, and today SNU has 16 colleges, 3 specialized graduate schools , 1 graduate school, 93 research institutes, and other supporting facilities, which are distributed over 2 campuses.
Currently, SNU has a student enrollment of approximately 30,600 degree candidates, including 27,600 undergraduates and 3,000 graduates. Also SNU has approximately 700 foreign students from 70 different countries. Maintaining the faculty of student ratio of 1:20, over 1,544 faculty and around 30 foreign professors are devoted to teaching SNU students to become leaders in every sector of Korean Society.
With the ideal of liberal education and progressive visions for the 21st century, SNU will continue to take a leading position as the most prestigious, research-oriented academic university in South Korea. " (my italics)
A student-faculty ratio of around 20 seems far more realistic than the 4.6 claimed by SNU or QS's 6.08. An explanation would seem to be in order from SNU and from QS.
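To put the competing figures side by side, here is a small sketch that recomputes the ratios implied by the various counts quoted above; the pairings other than QS's own are my own juxtapositions of figures from different pages, not official SNU statistics:

```python
# Student-faculty ratios for Seoul National University implied by the various
# student and faculty counts quoted above. The pairings other than QS's own
# are my juxtapositions of figures from different pages, not official SNU
# statistics.

figures = {
    "QS 2005 (31,509 students / 3,312 faculty)": 31_509 / 3_312,
    "QS 2006 (30,120 students / 4,952 faculty)": 30_120 / 4_952,
    "SNU site (28,074 students / 1,927 FTE faculty)": 28_074 / 1_927,
    "SNU 'visual statistics' (28,074 students / 1,733 faculty)": 28_074 / 1_733,
    "SNU's own claimed ratio": 4.6,
    "Ratio quoted by the MBA Dauphine description": 20.0,
}

for label, ratio in figures.items():
    print(f"{label}: {ratio:.1f} students per faculty member")

# The counts published by SNU itself imply roughly 14-16 students per faculty
# member, far closer to the ~20 quoted by the partner programme than to QS's
# 6.08 or SNU's claimed 4.6.
```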