Sunday, February 11, 2007

A Robust Measure

There is something very wrong with the THES-QS Guide to the World’s Top Universities, recently published in association with Blackwell’s of London. I am referring to the book’s presentation of two completely different sets of data for student-faculty ratio.

In the Guide, it is claimed that this ratio “is a robust measure and is based on data gathered by QS from universities or from national bodies such as the UK’s Higher Education Statistics Agency, on a prescribed definition of staff and students” (p 75).


Chapter 9 of the book consists of the ranking of the world’s top 200 universities originally published in the THES in October 2006. The rankings consist of an overall score for each university and scores for various components, one of which is the number of students per faculty. This component accounted for 20% of the total score. Chapter 11 consists of profiles of the top 100 universities, which, among other things, include data for student-faculty ratio. Chapter 12 is a directory of over 500 universities which in most cases also includes the student-faculty ratio.

Table 1 below shows the top ten universities in the world according to the faculty-student score in the university rankings, which is indicated in the middle column. It is possible to reconstruct the process by which the scores in the THES rankings were calculated by referring to QS’s topuniversities site, which provides information, including numbers of students and faculty, about each university in the top 200, as well as more than 300 others.

There can be no doubt that the data on the web site are those from which the faculty-student score was calculated. Thus Duke has, according to QS, 11,106 students and 3,192 faculty, or a ratio of 3.48 students per faculty, which was converted to a score of 100. Harvard has 24,648 students and 3,997 faculty, a ratio of 6.17, which was converted to a score of 56. MIT has 10,320 students and 1,253 faculty, a ratio of 8.24, converted to a score of 42, and so on. (There seems, incidentally, to have been an error in calculating the score for Princeton.) The right-hand column in Table 1 shows the ratio of students per faculty, based on the data provided in the rankings, for the ten universities with the best score on this component.
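The arithmetic is simple enough to reproduce. Here is a minimal sketch in Python of the apparent calculation, assuming (this is my inference, not QS’s published method) that each university’s ratio is benchmarked against the best ratio and scaled to 100:

```python
# A sketch of the apparent benchmarking, assuming the score is
# 100 * (benchmark ratio / university's ratio), rounded to the nearest integer.
# The student and faculty figures are those listed on QS's topuniversities site.
universities = {
    "Duke":    (11106, 3192),
    "Harvard": (24648, 3997),
    "MIT":     (10320, 1253),
}

ratios = {name: students / faculty
          for name, (students, faculty) in universities.items()}
benchmark = min(ratios.values())  # Duke's 3.48 students per faculty

for name, ratio in ratios.items():
    score = round(100 * benchmark / ratio)
    print(f"{name}: {ratio:.2f} students per faculty -> score {score}")
# Duke: 3.48 -> 100, Harvard: 6.17 -> 56, MIT: 8.24 -> 42
```

These reproduce the published scores exactly, which is why I am confident the web site data are the source.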

Table 1

University ............................................ Score ....... Students per faculty

1. Duke ............................................... 100 ......... 3.48
2. Yale ............................................... 93 .......... 3.74
3. Eindhoven University of Technology ................. 92 .......... 3.78
4. Rochester .......................................... 91 .......... 3.82
5. Imperial College London ............................ 88 .......... 4.94
6. Sciences Po Paris .................................. 86 .......... 4.05
7= Tsing Hua, PRC ..................................... 84 .......... 4.14
7= Emory .............................................. 84 .......... 4.14
9= Geneva ............................................. 81 .......... 4.30
9= Wake Forest ........................................ 81 .......... 4.30

Table 2 shows the eleven best universities ranked by students per faculty according to the profile and directory sections of the Guide; the student-faculty ratio is indicated in the right-hand column. (The list may need to be revised after another search.) You will notice immediately that there is no overlap at all between the two lists.

Table 2

University ............................................ Students per faculty

1. Kyungpook National University, Korea ............... 0
2. University of California at Los Angeles (UCLA) ..... 0.6
3= Pontificia Universidade Catolica do Rio de Janeiro, Brazil .. 3.8
3= Ecole Polytechnique Paris .......................... 3.8
5. Ljubljana, Slovenia ................................ 3.9
6= Kanazawa, Japan .................................... 4.0
6= Oulu, Finland ...................................... 4.0
8= Edinburgh .......................................... 4.1
8= Trento, Italy ...................................... 4.1
10= Utrecht, Netherlands .............................. 4.3
10= Fudan, PRC ........................................ 4.3

The figures for Kyungpook and UCLA are obviously simple data entry errors. The figure for the Ecole Polytechnique might not be grotesquely wrong if part-timers were included. But I remain very sceptical about such low ratios for universities in Brazil, China, Finland and Slovenia.

Someone looking for a university with a commitment to teaching would end up with dramatically different results depending on whether he or she checked the rankings or the profile and directory. A search of the first would produce Duke, Yale, Eindhoven and so on. A search of the second would produce (I’ll assume even the most naïve student would not believe the ratios for Kyungpook and UCLA) the Ecole Polytechnique, Ljubljana, Kanazawa and so on.

Table 3 below compares the figures for student faculty ratio derived from the rankings on the left with those given in the profile and directory sections of the Guide, on the right.

Table 3

University ............................... Rankings ratio ....... Profile/directory ratio

Duke ..................................... 3.48 ................. 16.7
Yale ..................................... 3.74 ................. 34.3
Eindhoven University of Technology ....... 3.78 ................. 31.1
Rochester ................................ 3.82 ................. 7.5
Imperial College London .................. 4.94 ................. 6.6
Sciences Po, Paris ....................... 4.05 ................. 22.5
Tsing Hua ................................ 4.14 ................. 9.3
Emory .................................... 4.14 ................. 9.9
Geneva ................................... 4.30 ................. 8.4
Wake Forest .............................. 4.30 ................. 16.1
UCLA ..................................... 10.20 ................ 0.6
Ecole Polytechnique, Paris ............... 5.4 .................. 3.8
Edinburgh ................................ 8.3 .................. 4.1
Utrecht .................................. 13.9 ................. 4.3
Fudan .................................... 19.3 ................. 4.3

There seems to be no relationship whatsoever between the ratios derived from the rankings and those given in the profiles and directory.

Logically, there are three possibilities: the ranking data are wrong; the directory data are wrong; or both are wrong. It is impossible for both to be correct.

In a little while, I shall try to figure out where QS got the data for both sets of statistics. I am beginning to wonder, though, whether they got them from anywhere.

To call the faculty-student score a robust measure is ridiculous. As compiled and presented by THES and QS, it is as robust as a pile of dead jellyfish.

Friday, February 09, 2007

Guide to the World’s Top Universities

Guide to the World’s Top Universities: Exclusively Featuring the Official Times Higher Education Supplement QS World University Rankings. John O’Leary, Nunzio Quacquarelli and Martin Ince (QS Quacquarelli Symonds Limited/Blackwell Publishing 2006)

Here are some preliminary comments on the THES-QS guide. A full review will follow in a few days.

The Times Higher Education Supplement and QS Quacquarelli Symonds have now produced a book, published in association with Blackwell’s. The book incorporates the 2006 world university rankings of 200 universities and the rankings by peer review of the top 100 universities in disciplinary areas. It also contains chapters on topics such as choosing a university, the benefits of studying abroad and tips for applying to university. There are profiles of the top 100 universities in the THES-QS rankings and a directory containing data about over 500 universities.

The book is attractively produced and contains a large amount of information. A superficial glance would suggest that it would be a very valuable resource for anybody thinking about applying to university or anybody comparing universities for any reason. Unfortunately, this would be a mistake.

There are far too many basic errors. Here is a list, almost certainly incomplete. Taken individually they may be trivial but collectively they create a strong impression of general sloppiness.

“University of Gadjah Mada” (p91). Gadjah Mada was a person not a place.

In the factfile for Harvard (p119) the section Research Impact by Subject repeats information given in the previous section on Overall Research Performance.

The factfile for Yale (p 127) reports a Student Faculty Ratio of 34.3, probably ten times too high.

The directory (p 483) provides data about something called the “Official University of California, Riverside”. No doubt someone was cutting and pasting from the official university website.

Zurich, Geneva, St Gallen and Lausanne are listed as being in Sweden (p 462-3).

Kyungpook National University, Korea, has a Student Faculty Ratio of 0:1 (p 452).

New Zealand is spelt “New Zeland” (p 441).

There is a profile for the Indian Institutes of Technology [plural] (p 231) but the directory refers to only one in New Delhi (p 416).

Similarly, there is a profile for the Indian Institutes of Management [plural] (p 253) but the directory refers to only one, in Lucknow (p 416).

On p 115 we find the “University of Melbourneersity”

On p 103 there is a reference to “SUNY” (State University of New York) that does not specify which of the four university centres of the SUNY system is referred to.

Malaysian universities are given the bahasa rojak (salad language) treatment and are referred to as University Putra Malaysia and University Sains Malaysia (p 437-438).

UCLA has a student-faculty ratio of 0.6:1 (p 483).

There will be further comments later.



Monday, February 05, 2007

The Rise of Seoul National University

One remarkable feature of the THES-QS world university rankings has been the rise of Seoul National University (SNU) in the Republic of Korea, from 118th place in 2004 to 93rd in 2005 and then to 63rd in 2006. This made SNU the eleventh-best university in Asia in 2006 and placed it well above any other Korean university.

This was accomplished in part by a rise in the peer review score from 39 to 43. Also, SNU scored 13 on the recruiter rating, compared with zero in 2005. However, the most important factor seems to be an improvement in the faculty-student score from 14 in 2005 to 57 in 2006.

How did this happen? If we are to believe QS, it was because of a remarkable expansion in the number of SNU’s faculty. In 2005, according to QS’s topgraduate site, SNU had a total of 31,509 students and 3,312 faculty, or 9.51 students per faculty. In 2006, again according to QS, SNU had 30,120 students and 4,952 faculty, a ratio of 6.08. The numbers provided for students seem reasonable. SNU’s site refers to 28,074 students, and it is not implausible that QS’s figures included some categories, such as non-degree, part-time or off-campus students, that were not counted by SNU.
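The ratios follow directly from QS’s own figures:

```python
# Student-faculty ratios implied by QS's figures for SNU.
print(round(31509 / 3312, 2))   # 9.51 students per faculty in 2005
print(round(30120 / 4952, 2))   # 6.08 students per faculty in 2006
```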

The number of faculty is, however, another matter. The SNU site refers to 28,074 students and 1,927 full time equivalent faculty members. There are also “1,947 staff members”. It is reasonable to assume that the latter are non-teaching staff such as technicians and librarians.

Further down the SNU site, things begin to get confusing. As of 1st April 2006, according to the site, there were 3,834 “teaching faculty” and 1,947 “educational staff”. Presumably the latter are the same as the earlier 1,947 “staff members”.

The mystery now is how 1,927 full time equivalent faculty grew to 3,834 teaching faculty. The latter figure would seem to be completely wrong if only because one would expect teaching faculty to be fewer than total faculty.

Since 1,927 full time equivalent faculty plus 1,947 staff members add up to 3,874, a little bit more than 3,834, it could be that “faculty” and “staff” were combined to produce a total for “teaching faculty”.

Another oddity is that SNU has announced on this site that it has a student-faculty ratio of 4.6. I am baffled as to how this particular statistic was arrived at.

QS should, I suppose, get some credit for not accepting this thoroughly implausible claim. Its ratio of 6.08 is, however, only slightly better and seems to depend on accepting a figure of 4,952 faculty. Unless somebody has been fabricating data out of very thin air, the most plausible explanation I can think of is that QS constructed the faculty statistic from a source that did something like taking the already inflated number of teaching faculty and then adding the professors. Perhaps the numbers were obtained in the course of a telephone conversation over a bad line.

And the real ratio? On the SNU site there is a "visual statistics page" that refers to 1,733 "faculty members" in 2006. This seems plausible. Also, just have a look at what the MBA Dauphine-Sorbonne-Renault programme, which has partnerships with Asian and Latin American universities, says:

"Founded in 1946, Seoul National University (SNU) marked the opening of the first national university in modern Korean history. As an indisputable leader of higher education in Korea, SNU has maintained the high standard of education in liberal arts and sciences. With outstanding cultural and recreational benefits, SNU offers a wide variety of entertainment opportunities in the college town of Kwanak and in the city of Seoul.
SNU began with one graduate school and nine colleges, and today SNU has 16 colleges, 3 specialized graduate schools , 1 graduate school, 93 research institutes, and other supporting facilities, which are distributed over 2 campuses.
Currently, SNU has a student enrollment of approximately 30,600 degree candidates, including 27,600 undergraduates and 3,000 graduates. Also SNU has approximately 700 foreign students from 70 different countries. Maintaining the faculty of student ratio of 1:20, over 1,544 faculty and around 30 foreign professors are devoted to teaching SNU students to become leaders in every sector of Korean Society.
With the ideal of liberal education and progressive visions for the 21st century, SNU will continue to take a leading position as the most prestigious, research-oriented academic university in South Korea. " (my italics)
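A quick check, using only the two figures in the quotation above, bears out the 1:20 claim:

```python
# Check the quoted SNU figures against the claimed 1:20 ratio.
students = 30600   # "approximately 30,600 degree candidates"
faculty = 1544     # "over 1,544 faculty"
print(round(students / faculty, 1))   # 19.8 students per faculty
```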

A student-faculty ratio of around 20 seems far more realistic than the 4.6 claimed by SNU or QS's 6.08. An explanation would seem to be in order from SNU and from QS.

Monday, January 29, 2007

More About Duke

On January 23rd I wrote to John O’Leary, Editor of the Times Higher Education Supplement concerning the data for Duke University in the 2006 world university rankings. I had already pointed out that the 2006 data appeared to be inaccurate and that, since Duke had the best score in the faculty–student section against which the others were benchmarked, all the scores in this section and therefore all the overall scores were inaccurate. There has to date been no reply. I am therefore publishing this account of how the data for Duke may have been constructed.

It has been clear for some time that the score given to Duke for this section of the rankings and the underlying data reported on the web sites of THES’s consultants, QS Quacquarelli Symonds, were incorrect and that Duke should not have been the highest scorer in 2006 on this section. Even the Duke administration has expressed its surprise at the published data. What has not been clear is how QS could have come up with data so implausible and so different from those provided by Duke itself. I believe I have worked out how QS probably constructed these data, which have placed Duke at the top of this part of the rankings so that it has become the benchmark for every other score.

In 2005 Duke University made an impressive ascent up the rankings, from 52nd to 11th. This rise was due in large part to a remarkable score for faculty-student ratio. In that year Duke was reported by QS on their topgraduates site to have a total of 12,223 students, comprising 6,248 undergraduates and 5,975 postgraduates, and 6,244 faculty, producing a ratio of 1.96 students per faculty. The figure for faculty was clearly an error, since Duke itself claimed to have only 1,595 tenure and tenure-track faculty, and was almost certainly caused by someone entering the number of undergraduate students at Duke, 6,244 in the fall of 2005, into the space for faculty on the QS database. In any case, someone should have pointed out that large non-specialist institutions, no matter how lavishly they are funded, simply do not have fewer than two students per faculty.

In 2006 the number of faculty and students listed on QS’s topuniversities site was not so obviously incredible and erroneous but was still quite implausible.

According to QS, there were in 2006 11,106 students at Duke, of whom 6,301 were undergraduates and 4,805 postgraduates. It is unbelievable that a university could reduce the number of its postgraduate students by over a thousand, based on QS’s figures, or about two thousand, based on data on the Duke web site, in the course of a single year.

There were in 2006, according to QS, 3,192 faculty at Duke. This is not quite as incredible as the number claimed in 2005 but is still well in excess of the number reported on the Duke site.

So where did the figures that placed Duke at the top of the faculty-student component in 2006 come from? The problem evidently faced by whoever compiled the data is that the Duke site has not updated its totals of students and faculty since the fall of 2005, but it has provided partial information about admissions and graduations, which appears to have been used in an attempt to estimate enrollment for the fall of 2006.

If you look at the Duke site you will notice that there is some information about admissions and graduations. At the start of the academic year of 2005 – 2006 (the “class of 2009”) 1,728 undergraduates were admitted and between July 1st, 2005 and June 30th, 2006 1,670 undergraduate degrees were conferred.

So, working from the information provided by Duke about undergraduate students, we have:

6,244 - 1,670 + 1,728 = 6,302

The QS site indicates 6,301 undergraduate students in 2006.

It seems likely that the number of undergraduates in the fall of 2006 was estimated by taking the number enrolled in the fall of 2005, deducting the number of degrees conferred between July 2005 and June 2006, and adding the number of admissions in the fall of 2005 (it should really have been the fall of 2006). The total thus obtained differs by one digit from that listed on the QS site, most probably a simple data entry error. A total obtained by this method would not, of course, be completely accurate, since it takes no account of students leaving for reasons other than receiving a degree, but it would probably not be too far off the correct number.

The number of postgraduate students is another matter. It appears that there was a botched attempt to use the same procedure to calculate the number of graduate students in 2006. The problem, though, was that the Duke site does not indicate enrollment of postgraduate students in that year. In the fall of 2005 there were 6,844 postgraduate students. Between July 2005 and June 2006 2,348 postgraduate and professional degrees were awarded, according to the Duke site. This leaves 4,496 postgraduate students. The QS topuniversities site reports that there were 4,805 postgraduates in 2006. This is a difference of 309.

So where did the extra 309 postgraduates come from? Almost certainly the answer is provided by the online Duke news of September 6, 2006, which refers to a total of 1,687 first-year undergraduate students, composed of 1,378 entering the Trinity College of Arts and Sciences (Trinity College is the old name of Duke, retained for the undergraduate school) and 309 undergraduates entering the Pratt School of Engineering. The total number of admissions is slightly different from the number given on the Duke main page, but this may be explained by last-minute withdrawals or a data entry error.

So it looks like someone at QS took the number of postgraduate students in 2005, deducted the number of degrees awarded, added the students admitted to the Pratt School of Engineering in the fall of 2006, and came up with the total of 4,805 for 2006. This is way off the mark, because the 309 students admitted to the School of Engineering are not postgraduates, as is evident from their inclusion in the class of 2010, and because no postgraduate admissions of any kind were counted. The result is that Duke appears, erroneously, to have lost about 2,000 postgraduate students between 2005 and 2006.

The undergraduate and postgraduate figures were then apparently combined on the QS site to produce a total of 11,106 students, about 1,000 fewer than QS reported in 2005 and about 2,000 fewer than indicated by Duke for that year.

What about the number of faculty? Here, QS’s procedure appears to get even dodgier. The Duke site refers to 1,595 tenure and tenure-track faculty; the QS site refers to 3,192 faculty. Where does the difference come from? The answer ought to be obvious, and I am embarrassed to admit that it took me a couple of hours to work it out: 1,595 multiplied by 2 is 3,190, exactly 2 less than QS’s figure. The slight difference is probably another data entry error, or perhaps an earlier error of addition.

The Duke site contains a table of faculty classified according to school – Arts and Sciences, Engineering, Divinity and so on adding up to 1,595 and then classified according to status – full, associate and assistant professors, again adding up to 1,595. It would seem likely that someone assumed that the two tables referred to separate groups of faculty and then added them together.

So, having reduced the number of students by leaving out postgraduate admissions, and having doubled the number of faculty by counting them twice, QS seem to have come up with a ratio of 3.48 students per faculty. This gave Duke the best score for this part of the ranking, against which all other scores were calibrated. The standardized score of 100 should in fact have gone to Yale, assuming, perhaps optimistically, that Yale’s ratio has been calculated correctly.
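Putting the pieces together, here is a minimal sketch of the hypothesized procedure. To be clear, the steps are my reconstruction, not anything QS has confirmed; all the input figures are from the Duke and QS sites as cited above.

```python
# Hypothesized reconstruction of QS's 2006 Duke figures (my inference only).

# Undergraduates: fall 2005 enrollment, minus degrees conferred July 2005 -
# June 2006, plus fall 2005 admissions.
undergraduates = 6244 - 1670 + 1728   # = 6,302; QS lists 6,301

# Postgraduates: fall 2005 enrollment, minus degrees awarded, plus the 309
# *undergraduate* admissions to the Pratt School of Engineering.
postgraduates = 6844 - 2348 + 309     # = 4,805; QS lists 4,805

# Faculty: Duke's 1,595 tenure and tenure-track faculty counted twice.
faculty = 1595 * 2                    # = 3,190; QS lists 3,192

students = 6301 + postgraduates       # = 11,106, exactly QS's student total
print(round(students / 3192, 2))      # 3.48 students per faculty
```

Every intermediate total lands on, or within a digit or two of, a figure QS actually published, which is what makes this reconstruction hard to dismiss as coincidence.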

It follows that every score for the faculty student ratio is incorrect and therefore that every overall score is incorrect.

If there is another explanation for the unbelievable Duke statistics, I would be glad to hear it. But if there is going to be a claim that an official at Duke provided information that is so obviously incorrect, then the details of the communication should be provided. If the information was obtained from another source, although I do not see how it could have been, that source should be indicated. Whatever the origin of the error, someone at QS ought to have checked the score of the top university in each component and should have realized immediately that major universities do not reduce the number of their students so dramatically in a single year and keep it secret. Nor is it plausible that a large general university could have a ratio of 3.48 students per faculty.

To show that this reconstruction of QS’s methods is mistaken would require nothing more than indicating the source of the data and an e-mail address or citation by which it could be verified.

Friday, January 12, 2007

And Then There Were None

Something very odd has been going on at the University of Technology Sydney (UTS), if we can believe QS Quacquarelli Symonds, THES's consultants.

In 2005, according to QS, UTS had a faculty of 866, of whom 253 were international. The latter figure is definitely not real information but simply represents 29% of the total faculty, which is QS's estimate, or guess, for Australian universities in general. This should have given UTS a score of 53 for the international faculty component of the 2005 world university rankings, although the score actually given was 33, presumably the result of a data entry error. UTS was ranked 87th in the 2005 rankings.
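A quick check of the percentage (my arithmetic, based on QS's published counts):

```python
# QS's "international faculty" figure for UTS looks like a fixed percentage,
# not a real count: 253 out of 866 is 29.2%.
print(round(253 / 866 * 100, 1))   # 29.2
```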

In 2006, according to QS, the number of faculty at UTS increased dramatically to 1,224. However, the number of international faculty dropped to precisely zero. Partly as a result of this, UTS's position in the rankings fell to 255th.

Meanwhile, UTS itself reports that it has 2,576 full time equivalent faculty.
How Long is an Extended Christmas Break?

On the 21st of December I received a message from John O'Leary, Editor of THES, saying that he had sent my questions about the world university rankings to QS and that he hoped to get back to me in the new year, since UK companies often have an extended Christmas break.

Assuming it started on December 25th, the break has now lasted for 18 days.

Monday, January 01, 2007

A Disgrace in Every Sense of the Word

That is the opinion of the THES world rankings held by the Gadfly, a blog run by four Harvard undergraduates. Here is a quotation:

"The Times Higher Education Supplement (THES) just released their global rankings, and it’s an utter scandal. Rife with errors of calculation, consistency and judgment, it is a testament not only to this ridiculous urge to rank everything but also to the carelessness with which important documents can be compiled."

The post concludes:

"One cannot help but think that the THES rankings are a British ploy to feel good about Oxford and Cambridge, the former of which is having a hard time pushing through financial reforms. Both are really universities who should be doing better, and are not. It may explain why Cambridge ups Harvard on the THES peer review, despite the fact that it lags behind Harvard under almost every other criteria, like citations per faculty, and citations per paper in specific disciplines."

Bangor is Very Naughty


Bangor University in Wales has apparently been fiddling with its exam results in order to boost its position in university rankings (not, this time, the THES world rankings). One wonders how much more of this sort of thing goes on. Anyway, here is an extract from the report in the THES. (Contrary to what many people in Asia and the US think, the THES and the Times are separate publications.)


And congratulations to Sam Burnett.


Bangor University was accused this week of lowering its academic standards with a proposal to boost the number of first-class degrees it awards.

According to a paper leaked to The Times Higher, the university agreed a system for calculating student results that would mean that about 60 per cent of graduates would obtain either a first or an upper-second class degree in 2007, compared with about 52 per cent under the current system.

The paper, by pro vice-chancellor Tom Corns, says that the university's key local rival, Aberystwyth University, "awarded 6.7 per cent more first and upper-second class degrees than we did". At the time, this helped place Bangor eight positions below Aberystwyth in The Times 2005 league table of universities.

He says: "We must redress the balance with all expedition", meaning the reforms are likely to take effect for 2007 graduates rather than for the 2007 entry cohort.

The move prompted heavy criticism this week. Alan Smithers, director of the Centre for Education and Employment Research at Buckingham University, said: "Hitherto, universities have been trusted to uphold degree standards, but such behaviour calls into question the desirability of continuing to allow them free rein in awarding their own degrees. Perhaps there should be an independent regulatory body."

He suggested that a body such as the Qualifications and Curriculum Authority, which regulates schools' exam awards, could be set up for higher education.

Sam Burnett, president of Bangor student union, said that Bangor had been "very naughty".

"The issue isn't about the system that should be in place... University figures seem to have identified the quickest way to boost Bangor up the league tables and will cheapen degrees in the process. Maybe it would be easier just to add 5 per cent to everyone's scores next July."