Friday, September 07, 2012

Rating Rankings

A caustic comment from the University of Adelaide:
"University rankings would have to be the worst consumer ratings in the retail market. In no other area are customers so badly served by consumer ratings as in the global student market," said Professor Bebbington. "The international rankings must change, or student consumers worldwide will eventually stop using them.

"Next to buying a house, choosing a university education is for most students, the largest financial commitment they will ever make. A degree costs more than a car, but if consumer advice for cars was this poor, there would be uproar.

"Students the world over use rankings for advice on which particular teaching program, at what campus, to enrol in. Most don't realise that many of the rankings scarcely measure teaching or the campus experience at all. They mostly measure research outcomes." For this reason, said Professor Bebbington, half the universities in the US, including some outstanding institutions, remain unranked.

He went on to discuss the inconsistency between university ranking results and the quality of the learning experience. According to Professor Bebbington, such a contradiction should come as no surprise: "Anyone who knows exactly what the rankings actually measure knows they have little bearing on the quality of the education."

Another problem was an increasing number of ranking systems, each producing different results. "With some, we have seen universities shift 40 places in a year, simply because the methodology has changed, when the universities have barely changed at all," he said. "It leaves students and parents hopelessly confused."

The Jiao Tong rankings in particular favour the natural sciences and scarcely reflect other fields, according to Professor Bebbington. "Moreover, they assume all universities have research missions. Those dedicated to serving their State or region through teaching, rather than competing in the international research arena, may be outstanding places to study, but at present are unranked.

"What is needed is a ranking that offers different lists for undergraduate teaching programs, international research-focussed programs, and regionally focussed programs. We need a ranking that measures all disciplines and is not so focussed on hard science."

The University of Adelaide has been steadily improving in the Shanghai ARWU rankings, although not in the two indicators that measure current research of the highest quality: publications in Nature and Science, and highly cited researchers. It has also improved considerably in the QS rankings, from 107th in 2006 to 92nd in 2011. One wonders what Professor Bebbington is complaining about. Has he seen next week's results?

Still, he has a point. The ARWU rankings have only a very indirect measure of teaching, alumni who have won Nobel Prizes and Fields Medals, while QS uses a very blunt instrument, faculty-student ratio. It is possible to do well on this indicator by recruiting large numbers of research staff who never do any teaching at all.

Times Higher Education rankings director Phil Baty has a reply in The Australian.
As yet unpublished research by international student recruitment agency IDP will show university rankings remain one of the very top information sources for international students when choosing where to study.

If they are making such a major investment in their future, the overall reputation of the institution is paramount. The name on the degree certificate is a global passport to a lifelong career.

Broad composite rankings - even those that say nothing about teaching and learning - will always matter to students.

But that does not mean the rankers do not have to improve.

The Times Higher Education league table, due for publication on October 4, is the only ranking to take teaching seriously. We employ five indicators (worth 30 per cent) dedicated to offering real insight into the teaching environment, based on things such as a university's resources, staff-student ratio and undergraduate-postgraduate mix.

Times Higher Education has made genuine efforts to capture some factors that may have relevance to teaching. Its academic survey includes a question about teaching, although it refers only to postgraduate teaching. The other indicators are necessarily somewhat indirect: income per academic, doctorates awarded, and the ratio of doctoral students to undergraduates.

If THE are to improve their learning environment criterion, one option might be a properly organised, vetted and verified survey of undergraduate students, perhaps based on university email records.
