Monday, September 10, 2012
University of Adelaide Falls out of the Top 100
News about the latest QS World University Rankings is beginning to trickle out. This is from The Australian:
The rest of the world is catching up with Australian universities, dragging down the sector's relative performance in the latest QS global rankings.
The league table, released this morning, shows Australia has one fewer institution in the QS top 100 for 2012-13, with the University of Adelaide sliding 10 places to 102.
Sunday, September 09, 2012
Will There be a New Number One?
One reason why QS and Times Higher Education get more publicity than the Shanghai ARWU, HEEACT and other rankings is that they periodically produce interesting surprises. Last year Caltech replaced Harvard as number one in the THE rankings and Tokyo overtook Hong Kong as the best Asian university. Two years ago Cambridge pushed Harvard aside at the top of the QS rankings.
Will there be another change this year?
There is an intriguing interview with Les Ebdon, the UK government's "university access tsar", in the Daily Telegraph. Ebdon claims that leading British universities are in danger of losing their world class status unless they start admitting more students from state schools who may be somewhat less academically qualified. Perhaps he knows something.
So if Cambridge slips and is replaced by Harvard, MIT or Yale as QS number one (if it is Oxford or Imperial, QS will lose all credibility), we can expect comments that Cambridge should start listening to him before it's too late.
I suspect that if there is a new number one it might have something to do with the QS employer review. Since this is a sign-up survey and the numbers of respondents are quite small, it would not take many additional responses to push Harvard or MIT into first place.
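To see just how volatile a small opt-in survey can be, here is a minimal sketch in Python (all vote counts are invented, and this is not QS's actual scoring method): raw employer mentions are rescaled so the leader scores 100, and a recruitment drive worth a couple of dozen extra responses is enough to flip first place.

```python
# A minimal sketch with invented vote counts, not QS's actual method:
# raw survey mentions rescaled so the highest count maps to 100.

def rescale(votes):
    """Rescale raw mention counts so the leader scores 100."""
    top = max(votes.values())
    return {name: round(100 * n / top, 1) for name, n in votes.items()}

votes = {"Cambridge": 410, "Harvard": 405, "MIT": 398}   # hypothetical
print(rescale(votes))   # {'Cambridge': 100.0, 'Harvard': 98.8, 'MIT': 97.1}

# Fifteen extra sign-up responses are enough to change the number one.
votes["MIT"] += 15
print(rescale(votes))   # {'Cambridge': 99.3, 'Harvard': 98.1, 'MIT': 100.0}
```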
With regard to THE, the problem there is that normalising everything by country, year and/or field is a potential source of instability. If there is a vigorous debate with lots of citations about an obscure article by a Harvard researcher in a little cited field it could dramatically boost the score on the citations indicator.
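For illustration, here is a deliberately simplified sketch of field normalisation (the baselines and paper counts are invented, and the real procedure behind the THE citations indicator is more elaborate): each paper's citations are divided by a world average for its field, so one well-cited article in a little-cited field can drag the whole mean upwards.

```python
# Illustrative only: a crude field-normalised citation score in which each
# paper's citations are divided by the world average for its field.

def normalised_impact(papers, baselines):
    """Mean of (citations / expected citations for the paper's field)."""
    ratios = [cites / baselines[field] for field, cites in papers]
    return sum(ratios) / len(ratios)

baselines = {"oncology": 25.0, "obscure_field": 0.5}   # hypothetical world averages

# Ninety-nine solidly cited mainstream papers, plus one hotly debated
# article in a field where papers are hardly ever cited.
papers = [("oncology", 30)] * 99 + [("obscure_field", 40)]
print(round(normalised_impact(papers), 2))   # 1.99

# The outlier alone contributes 40 / 0.5 = 80 to the sum of ratios,
# lifting the institution's mean from 1.2 to nearly 2.0.
```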
Getting a good score in the THE rankings also depends on what a school is being compared to. Last year, Hong Kong universities slumped because they were taken out of China (with low average scores) and classified as a separate country (with high average scores), so that their relative scores were lower. If they are put back in China they will go up this year and there will be a new number one Asian university.
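The comparison-group effect can be sketched the same way (all scores invented): an identical raw score produces a very different standardised result depending on whether a Hong Kong university is benchmarked against the mainland Chinese mean or against its high-scoring local peers.

```python
import statistics

# Invented scores: the same raw value benchmarked against two different
# country groups, as happens when Hong Kong is reclassified.

def z_score(value, group):
    """Standard score of `value` relative to a comparison group."""
    return (value - statistics.mean(group)) / statistics.stdev(group)

raw = 70.0
china_group = [40, 45, 50, 55, 70]        # low national average
hong_kong_group = [62, 66, 70, 74]        # high average among local peers

print(round(z_score(raw, china_group), 2))       # 1.56 -- looks outstanding
print(round(z_score(raw, hong_kong_group), 2))   # 0.39 -- looks ordinary
```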
So anybody want to bet on Harvard making a come back this year? Or Hong Kong regaining the top Asian spot from Tokyo in the THE rankings?
Friday, September 07, 2012
Rating Rankings
A caustic comment from the University of Adelaide:
"University rankings would have to be the worst consumer ratings in the retail market. In no other area are customers so badly served by consumer ratings as in the global student market," said Professor Bebbington. "The international rankings must change, or student consumers worldwide will eventually stop using them.
"Next to buying a house, choosing a university education is for most students, the largest financial commitment they will ever make. A degree costs more than a car, but if consumer advice for cars was this poor, there would be uproar.
"Students the world over use rankings for advice on which particular teaching program, at what campus to enrol in. Most don't realise that many of the rankings scarcely measure teaching or the campus experience at all. They mostly measure research outcomes." For this reason, said Professor Bebbington, half the universities in the US, including some very outstanding institutions, remain unranked.
He went on to discuss the inconsistency of university ranking results against the quality of the learning experience. According to Professor Bebbington, such a contradiction should come as no surprise: "Anyone who knows exactly what the rankings actually measure knows they have little bearing on the quality of the education."
Another problem was an increasing number of ranking systems, each producing different results. "With some, we have seen universities shift 40 places in a year, simply because the methodology has changed, when the universities have barely changed at all," he said. "It leaves students and parents hopelessly confused."
The Jiao Tong rankings in particular favour natural sciences and scarcely reflect other fields, according to Professor Bebbington. "Moreover, they assume all universities have research missions. Those dedicated to serving their State or region through teaching, rather than competing in the international research arena, may be outstanding places to study, but at present are unranked.
"What is needed is a ranking that offers different lists for undergraduate teaching programs, international research-focussed programs, and regionally focussed programs. We need a ranking that measures all disciplines and is not so focussed on hard science."
The University of Adelaide has been steadily improving in the Shanghai ARWU rankings, although not in the two indicators that measure current research of the highest quality, publications in Nature and Science and highly cited researchers. It also improved quite a bit in the QS rankings from 107th in 2006 to 92nd in 2011. One wonders what Professor Bebbington is complaining about. Has he seen next week's results?
Still, he has a point. The ARWU rankings have only a very indirect measure of teaching (alumni who have won Nobel Prizes and Fields Medals), while QS uses a very blunt instrument, the faculty-student ratio. It is possible to do well on this indicator by recruiting large numbers of research staff who never do any teaching at all.
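A toy example with hypothetical headcounts makes the point: because the indicator counts all academic staff, research-only hires improve the ratio even though no extra teaching takes place.

```python
# Hypothetical headcounts; the faculty-student ratio counts all academic
# staff, whether or not they teach.

students = 20000
teaching_staff = 1000
research_only_staff = 0

def students_per_faculty(students, faculty):
    return students / faculty

print(students_per_faculty(students, teaching_staff + research_only_staff))  # 20.0

# Recruit 600 research staff who never see an undergraduate:
research_only_staff = 600
print(students_per_faculty(students, teaching_staff + research_only_staff))  # 12.5
```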
Times Higher Education rankings director Phil Baty has a reply in The Australian:
As yet unpublished research by international student recruitment agency IDP will show university rankings remain one of the very top information sources for international students when choosing where to study.
If they are making such a major investment in their future, the overall reputation of the institution is paramount. The name on the degree certificate is a global passport to a lifelong career.
Broad composite rankings - even those that say nothing about teaching and learning - will always matter to students.
But that does not mean the rankers do not have to improve.
The Times Higher Education league table, due for publication on October 4, is the only ranking to take teaching seriously. We employ five indicators (worth 30 per cent) dedicated to offering real insight into the teaching environment, based on things such as a university's resources, staff-student ratio and undergraduate-postgraduate mix.
Times Higher Education has made genuine efforts to capture some factors that may have relevance to teaching. Their academic survey has a question about teaching, although it asks only about postgraduate teaching. The other indicators are necessarily somewhat indirect: income per academic, doctorates awarded, and the ratio of doctoral students to undergraduates.
If THE is to improve its learning environment criterion, one option might be a properly organised, vetted and verified survey of undergraduate students, perhaps based on university email records.
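As a sketch of what such a survey might look like, and this is one possible design rather than anything THE has proposed, one could draw a stratified random sample from institutional email records so that each year of study is proportionally represented, then invite only the sampled addresses.

```python
import random

# One possible design, not THE's method: a stratified random sample of
# undergraduate email addresses, proportional to the size of each cohort.
# All records below are hypothetical.

def stratified_sample(records, frac, seed=42):
    """records maps year of study -> list of email addresses."""
    rng = random.Random(seed)
    sample = []
    for year, emails in records.items():
        k = max(1, round(len(emails) * frac))
        sample.extend(rng.sample(emails, k))
    return sample

records = {
    "year1": [f"y1_{i}@uni.example" for i in range(5000)],
    "year2": [f"y2_{i}@uni.example" for i in range(4600)],
    "year3": [f"y3_{i}@uni.example" for i in range(4300)],
}
invitees = stratified_sample(records, frac=0.02)
print(len(invitees))   # about 2% of each cohort, 278 addresses in total
```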
Saturday, September 01, 2012
From Preschool to PISA
We still don't have a ranking of kindergartens, although no doubt that will happen one day. But there is now a country ranking of early education, prepared by the Economist Intelligence Unit for the Lien Foundation, "a Singapore philanthropic house noted for its model of radical philanthropy" (Economist print edition, 30/06/12). Details can be found here.
The ranking combines four indicators: Availability, Affordability, Quality of the Preschool Environment, and Social Context, "which examines how healthy and ready for school children are".
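The EIU will have its own weighting scheme, which is not given here; purely to show how four indicator scores collapse into a single league-table position, here is a sketch with assumed equal weights and invented country scores.

```python
# Assumed equal weights and invented scores -- not the EIU's actual figures.
WEIGHTS = {"availability": 0.25, "affordability": 0.25,
           "quality": 0.25, "social_context": 0.25}

def composite(scores):
    """Weighted sum of a country's four indicator scores."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

countries = {
    "Finland": {"availability": 92, "affordability": 88, "quality": 95, "social_context": 90},
    "UK":      {"availability": 85, "affordability": 70, "quality": 88, "social_context": 84},
    "India":   {"availability": 35, "affordability": 35, "quality": 30, "social_context": 25},
}
ranking = sorted(countries, key=lambda c: composite(countries[c]), reverse=True)
print(ranking)   # ['Finland', 'UK', 'India']
```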
The top five are:
1. Finland
2. Sweden
3. Norway
4. UK
5. Belgium
The bottom five are:
41. Vietnam
42. China
43. Philippines
44. Indonesia
45. India
It is interesting to compare this with the performance of fifteen-year-old students as measured by the 2009 PISA report. Finland, which is top of the preschool ranking, also scores well on all three sections of the PISA rankings: Reading, Science and Maths.
However, the United Kingdom, which is ranked 4th in the preschool rankings, does no better than the OECD average in the PISA rankings for Reading and Maths, although it does not do too badly in Science.
At the bottom of the preschool rankings we find China and India. In the PISA Plus ranking India was represented by just two states, and they performed miserably.
On the other hand, China, whose preschool system is ranked 42nd out of 45, does very well, ranking first in all three sections. Or rather, Shanghai does very well.
A warning is needed here. Mainland China is represented in the PISA rankings only by Shanghai. The results would almost certainly be very different if they covered the whole of China, including its huge rural hinterland. It is likely that if all of China had been assessed, its scores would have been something like Taiwan's (Chinese Taipei): 495 instead of 556 for Reading, 543 instead of 600 for Maths, and 520 instead of 575 for Science.
Even so, it is striking that the UK could have such highly ranked preschools and such modest secondary school achievement, while China has lowly ranked preschools but such high scores at secondary level, whether we consider Shanghai alone or use Taiwan's scores to estimate what the nationwide level might be.
There are other apparent anomalies. For example, Singapore is only slightly better than Mexico on the preschool rankings but forges well ahead on the PISA rankings.
It could be that preschool education does not have much to do with long-term achievement. Also, some of the criteria, such as how healthy children are, may not have much to do with anything that a preschool does or can do. Nor should we forget that the preschool ranking deals with inputs while the PISA rankings measure performance.
Furthermore, it is likely that the culture, social structure and demography of contemporary Asia, Europe and the Americas explain some of these differences in the effect of preschool education.