Tuesday, September 05, 2006

The Fastest Way into the THES TOP 200

In a little while the latest edition of the THES rankings will be out. There will be protests from those who fail to make the top 200, 300 or 500 and much self-congratulation from those included. Also, of course, THES and its consultants, QS, will, directly or indirectly, make a lot of money from the whole business.

If you search the web you will find that QS and THES have been quite busy over the last year or so promoting their rankings and giving advice about what to do to get into the top 200. Some of their advice is not very helpful. Thus, Nunzio Quacquarelli, director of QS, told a seminar in Kuala Lumpur in November 2005 that producing more quality research was one way of moving up in the rankings. This is not necessarily a bad thing, but it will be at least a decade before any quality research can be completed, written up, submitted for publication, revised, finally accepted, published, and then cited by another researcher whose work goes through the same processes. Only then will research start to push a university into the top 200 or 100 by boosting its score for citations per faculty.

Something less advertised is that once a university has got onto the list of 300 universities (so far this has been decided by peer review), there is a very simple way of boosting its position in the rankings. It is also not unlikely that several universities have already realized this.

Pause for a minute and review the THES methodology. They gave a weighting of 40 per cent to a review of universities by other academics, 10 per cent to a rating by employers, 20 per cent to the ratio of faculty to students, 10 per cent to the proportion of international faculty and students, and 20 per cent to the number of citations per faculty. In 2005 the top-scoring institution in each category was given a score of 100 and the scores of the others were scaled accordingly.
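
To make the arithmetic concrete, here is a minimal sketch of how that scoring appears to work, assuming each component is simply rescaled so that the top scorer gets 100 and the rescaled components are then added with the weights above. The function names and example figures are mine, not THES's or QS's.

```python
# Sketch of a THES-style score, assuming each component is rescaled so that
# the top institution gets 100 and the components are then summed with the
# weights quoted above. The figures in the examples are illustrative only.

WEIGHTS = {
    "peer_review": 0.40,            # review by other academics
    "employer_review": 0.10,        # rating by employers
    "faculty_student": 0.20,        # ratio of faculty to students
    "international": 0.10,          # international faculty and students
    "citations_per_faculty": 0.20,  # citations per faculty member
}

def rescale(raw):
    """Scale one component's raw scores so the best institution gets 100."""
    top = max(raw.values())
    return {uni: 100 * score / top for uni, score in raw.items()}

def overall(rescaled):
    """Weighted total for one university from its rescaled component scores."""
    return sum(WEIGHTS[name] * score for name, score in rescaled.items())

# Rescaling 2004 citation scores (where the top scorer got 400) onto the 2005
# scale reproduces the "divide by four" used later in this post:
print(rescale({"MIT": 400, "Ecole Polytechnique": 59}))
# {'MIT': 100.0, 'Ecole Polytechnique': 14.75}

# A hypothetical university somewhere in the second hundred:
print(overall({"peer_review": 20, "employer_review": 15,
               "faculty_student": 15, "international": 25,
               "citations_per_faculty": 7}))   # 16.4
```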

Getting back to boosting ratings, first take a look at the 2004 and 2005 scores for citations per faculty. Comparison is a bit difficult because the top scorer was given a score of 400 in 2004 but one of 100 in 2005 (it is MIT in both cases). What immediately demands attention is that there are some very dramatic changes between 2004 and 2005.

For example, Ecole Polytechnique in Paris fell from 14.75 (dividing the THES figures by four, since top-ranked MIT was given a score of 400 in 2004) to 4, ETH Zurich from 66.5 to 8, and McGill in Canada from 21 to 8.

This, at first sight, is more than a bit strange. The figures are supposed to refer to ten-year periods, so that in 2005 the citations for the earliest year would be dropped and those for another year added. You would not expect very much change from year to year, since the figures for 2004 and 2005 overlap a great deal.

But it is not only citations that we have to consider. The score is actually based on citations per faculty member. So, if the number of faculty goes up and the number of citations remains the same then the score for citations per faculty goes down.
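
As a toy illustration with invented numbers (the citation total and the headcounts below are made up), doubling the apparent faculty halves the citations-per-faculty figure even though research output has not changed at all:

```python
citations = 12000     # hypothetical ten-year citation count, much the same from year to year
faculty_2004 = 800    # hypothetical headcount reported in 2004
faculty_2005 = 1600   # the same place, now also counting part-timers, say

print(citations / faculty_2004)  # 15.0 citations per faculty member
print(citations / faculty_2005)  # 7.5  -- the score halves although nothing about research changed
```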

This in fact is what happened to a lot of universities. If we look at the score for citations per faculty and then the score for faculty-student ratio there are several cases where they change proportionately but in opposite directions.

So, going back to the three examples given above, between 2004 and 2005 Ecole Polytechnique went up from 23 to 100 to become the top scorer for faculty-student ratio, ETH Zurich from 4 to 37, and McGill from 23 to 42. Notice that the rise in the faculty-student ratio score is roughly proportionate to the fall in the score for citations per faculty.
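
Here is a quick check of that rough proportionality, using the figures quoted above; the match is only approximate, as you would expect if student numbers and citation counts also moved a little:

```python
# Scores taken from this post; the 2004 citation scores are already divided
# by four to put them on the same scale as 2005.
scores = {
    #                       citations per faculty   faculty-student
    #                          2004    2005           2004   2005
    "Ecole Polytechnique": ((14.75, 4.0),           (23, 100)),
    "ETH Zurich":          ((66.5,  8.0),           (4,  37)),
    "McGill":              ((21.0,  8.0),           (23, 42)),
}

for name, ((cit04, cit05), (fs04, fs05)) in scores.items():
    fall = cit04 / cit05   # factor by which citations per faculty dropped
    rise = fs05 / fs04     # factor by which the faculty-student score rose
    print(f"{name}: citations fell {fall:.1f}x, faculty-student rose {rise:.1f}x")
```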

I am not the first person to notice the apparent dramatic collapse of research activity at ETH Zurich. Norbert Staub in ETH Life International was puzzled by this. It looks as though ETH Zurich did not stop doing research; rather, it apparently acquired something like eight times as many teachers.

It seems pretty obvious that what happened to these institutions is that the apparent number of faculty went up between 2004 and 2005. This led to a rise in the score for faculty-student ratio and a fall in the score for citations per faculty.

You might ask, so what? If a university goes up on one measure and goes down on another surely the total score will remain unchanged.

Not always. THES has indexed the scores to the top-scoring university, so that in 2005 the top scorer gets 100 for both faculty-student ratio and citations per faculty. But the gap between the top university for faculty-student ratio and run-of-the-mill places in, say, the second hundred is much smaller than it is for citations per faculty. For example, take a look at the faculty-student scores of the ten universities starting at position 100: 15, 4, 13, 10, 23, 16, 13, 29, 12, 23. Then look at their scores for citations per faculty: 7, 1, 8, 6, 0, 12, 9, 14, 12, 7.

That means that many universities can, like Ecole Polytechnique, gain much more by raising their faculty-student ratio score than they lose through a lower citations-per-faculty score. Not all, of course: ETH Zurich suffered badly as a result of this faculty inflation.
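
A back-of-the-envelope calculation shows the asymmetry. The sketch below uses the 20 per cent weight that each of the two components carries; it ignores the rescaling of everyone else's scores when a new top scorer appears, and the "typical" figures are simply taken from the lists above:

```python
WEIGHT = 0.20  # weight of faculty-student ratio, and also of citations per faculty

def net_change(fs_score, cit_score, inflation):
    """Approximate change in the weighted total if the apparent faculty count
    is multiplied by `inflation` while students and citations stay the same."""
    new_fs = min(100, fs_score * inflation)   # faculty-student score rises
    new_cit = cit_score / inflation           # citations-per-faculty score falls
    return WEIGHT * (new_fs - fs_score) + WEIGHT * (new_cit - cit_score)

# A run-of-the-mill university around position 100 (scores of roughly 15 and 7):
print(net_change(15, 7, 2))      # about +2.3 points overall -- a clear gain

# ETH Zurich's 2004 scores (4 and 66.5) with an eightfold faculty inflation:
print(net_change(4, 66.5, 8))    # about -6 points -- here faculty inflation backfires
```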

So what is going on? Are we really to believe that in 2005 Ecole Polytechnique quadrupled its teaching staff, ETH Zurich increased its eightfold, and McGill's nearly doubled? This is totally implausible. The only explanation that makes any sort of sense is that either QS or the institutions concerned were counting their teachers differently in 2004 and 2005.

The likeliest explanation for Ecole Polytechnique's remarkable change is simply that in 2004 only full-time staff were counted but in 2005 part-time staff were counted as well. It is well known that many staff of the Grandes Ecoles of France are also employed by neighbouring research institutes and universities, although exactly how many is hard to find out. If anyone can suggest any other explanation, please let me know.

Going through the rankings, we find that there are quite a few other universities affected by what we might call "faculty inflation", with faculty-student scores jumping between 2004 and 2005: EPF Lausanne from 13 to 64, Eindhoven from 11 to 54, the University of California at San Francisco from 39 to 91, Nagoya from 19 to 35, and Hong Kong from 8 to 17.

So, having got through the peer review, this is how to get a boost in the rankings. Just inflate the number of teachers and deflate the number of students.

Here are some ways to do it. Wherever possible, hire part-time teachers but don't differentiate between full-time and part-time. Announce that every graduate student is a teaching assistant, even if they just have to do a bit of marking, and count them as teaching staff. Make sure anyone who leaves is designated emeritus or emerita and kept on the books. Never sack anyone, but keep him or her suspended. Count everybody at branch campuses and on off-campus programmes. Classify all administrative appointees as teaching staff.

It will also help to keep the official number of students down. A few possible ways: don't count part-time students, don't count branch campuses, and count enrolment at the end of the semester, after some students have dropped out.

2 comments:

  1. Anonymous, 12:38 PM

    Hi, I read your blog with interest. I am involved in trying to rate UK universities by the attention that students will get from their tutors, i.e. the staff/student ratio. It is also a campaign to try to highlight some of the disgraceful practices that some universities use to save money. The site is http://WillISeeMyTutor.com; we would be delighted if you would let us have your opinion.

  2. I'm sure it is possible!... About what time do you think?!
