Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Tuesday, December 22, 2015
Worth Reading 4: Rankings influence public perceptions of German universities
Der Ranking-Effekt: Zum Einfluss des „Shanghai-Rankings“ auf die medial dargestellte Reputation deutscher Universitäten (The Ranking Effect: How the "Shanghai Ranking" influences the mediated reputation of German universities)
Tim Hegglin · Mike S. Schäfer
Publizistik (2015) 60:381–402 DOI 10.1007/s11616-015-0246-
English Abstract
Increasingly, universities find themselves in a competition about public visibility and reputation in which media portrayals play a crucial role. But universities are complex, heterogeneous institutions which are difficult to compare. University rankings offer a seemingly simple solution for this problem: They reduce the complexity inherent to institutions of higher education to a small number of measures and easy-to-understand ranking tables – which may be particularly attractive for media as they conform to news values and media preferences. Therefore, we analyze whether the annual publications of the “Shanghai Ranking” influence media coverage about the included German universities. Based on a content analysis of broadsheet print media, our data show that a ranking effect exists: After the publication of the Ranking results, included universities are presented as more reputable in the media. This effect is particularly strong among better ranked universities. It does not, however, increase over a 10-year time period.
Thanks to Christian Scholz of the University of Hamburg for alerting me to this paper.
Monday, December 21, 2015
Worth Reading 3
Matthew David, Fabricating World Class: Global university league tables, status differentiation and myths of global competition
accepted for publication in the British Journal of Sociology of Education
This paper finds that UK media coverage of global university rankings is strongly biased towards the Russell Group, which supposedly consists of elite research-intensive universities; that it emphasises the superiority of some US universities; and that it interprets whatever happens in the rankings as evidence that British universities, especially those in the Russell Group, need and deserve as much money as they want.
For example, he quotes the Daily Mail as saying in 2007 that "Vice chancellors are now likely to seize on their strong showing [in the THES-QS world university rankings] to press the case for the £3,000 a-year cap on tuition fees to be lifted when it is reviewed in 2009," while the Times in 2008, when UK universities slipped, said: "Vice chancellors and commentators voiced concern that, without an increase in investment, Britain's standing as a first-class destination for higher education could be under threat".
The media and elite universities also claim repeatedly that lavishly funded Asian universities are overtaking the impoverished and neglected schools of the West.
David argues that none of this is supported by the actual data of the rankings. He looks at the top 200 of the three well-known rankings (QS, THE and ARWU) up to 2012.
I would agree with most of these conclusions, especially the argument that the rankings data he uses do not support either US superiority or the rise of Asia.
I would go further and suggest that the changes to the QS rankings in 2008 and 2015, the ad hoc adjustments to the employer survey in 2011 and 2012, the changes in the rules for the submission of data, the variations in the degree of engagement with the rankings, and the instability resulting from the unstable pool from which ranked universities are drawn render the QS rankings invalid as a measure of anything but the most obvious trends.
Similarly, the THE rankings, launched in 2010, underwent substantial changes in 2011 and again in 2015. Between those years there were fluctuations for many universities, partly because a few papers could have a disproportionate impact on the citations indicator and partly because the pool of ranked universities from which indicator means are calculated is unstable.
If, however, we take the Shanghai rankings over the course of eleven years and look at the full list of five hundred ranked universities, then we do find that Asia, or more accurately some of it, is rising.
The number of Chinese universities in the ARWU top 500 rose from 16 in 2004 to 44 in 2015, the number of South Korean universities from 8 to 12, and the number of Australian universities from 14 to 20. The number of Indian universities, however, remained unchanged at three, while the number of Japanese universities fell from 36 to 18.
David does not argue that Asia is not rising, merely that looking at the top level of the rankings does not show that it is.
What is probably more important in the long run is the comparative performance not of universities but of secondary school systems. Here the future of the US, the UK and continental Europe does indeed look bleak while that of East Asia and the Chinese diaspora is very promising.
Saturday, December 19, 2015
Go East Young (and Maybe not so Young) Man and Woman!
Mary Collins, a distinguished immunologist at University College London (UCL), is leaving to take up an academic appointment in Japan. Going with her is her husband, the Nobel laureate Tim Hunt, who was the victim of a particularly vicious witch hunt over some allegedly sexist remarks made at a dinner. She had apparently applied for the Japanese post before the uproar, but her departure was hastened by the disgraceful way he was treated by UCL.
Could this be the beginning of a massive drain of academic talent from the West to Asia? What would it take to persuade people like Nicholas Christakis, Erika Christakis, Joshua Richwine, Mark Regnerus, Andrea Quenette, K.C. Johnson, and Matt Tyler to trade in abuse and harassment by the "progressive" academic establishment for a productive scholarly or administrative career in Korea, Japan, China or the Pacific rim?
Meanwhile Russia and the Arab Gulf are also stepping up their recruitment of foreign scientists. Has Mary Collins started a new trend?
Friday, December 18, 2015
THE Adjectival Hyperbole
Times Higher Education (THE) has always had, or tried to have, a good opinion of itself and its rankings.
A perennial highlight of the ranking season is the flurry of adjectives used by THE to describe its summits (prestigious, exclusive) and the rankings and their methodology. Here is a selection:
"the sharper, deeper insights revealed by our new and more rigorous world rankings"
"Robust, transparent and sophisticated"
"the most comprehensive, sophisticated and balanced global rankings in the world."
"a dramatically improved ranking system"
"our dramatic innovations"
"our tried, trusted and comprehensive combination of 13 performance indicators remains in place, with the same carefully calibrated weightings"
"our most comprehensive, inclusive and insightful World University Rankings to date"
The problem is that if the rankings are so robust and sophisticated, then what is the point of a dramatic improvement? If there is a dramatic improvement one year, is there a need for more dramatic improvements the next? And are there no limits to the rigor of the methodology and the sharpness and depth of the insights?
Monday, December 14, 2015
Why are university bosses paid so much?
Times Higher Education (THE) has an article by Ellie Bothwell about the earnings of university heads in the USA and the UK. The US data is from the Chronicle of Higher Education.
The sums paid are in some cases extraordinary. Maybe Lee Bollinger of Columbia deserves $4,615,230 but $1,634,000 for the head of Tulane?
On the other side of the Atlantic the biggest earner is the head of Nottingham Trent University. To the lay reader that makes about as much sense as the manager of Notts County or Plymouth Argyle outearning the manager of Manchester City or Chelsea.
THE argues that there is little correlation between the salaries of the top earning twenty American and British university heads and university prestige as measured by position in the overall THE world rankings.
It would actually be very surprising if a large correlation were found, since there is an obvious restriction-of-range effect when only the top 20 are considered. If we looked at the entire spectrum of salaries we would almost certainly get a much greater correlation. I suspect that THE is trying to deflect criticism that its rankings measure wealth and age rather than genuine quality.
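To illustrate what restriction of range does, here is a minimal simulation in Python. The figures are randomly generated stand-ins, not the actual salary or ranking data, so the snippet demonstrates only the statistical point, not the real-world result.

```python
# Minimal sketch of the restriction-of-range effect. All figures are
# randomly generated stand-ins, not real salary or ranking data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 150                                        # hypothetical number of universities
quality = rng.normal(50, 10, n)                # stand-in for an overall ranking score
salary = 0.8 * quality + rng.normal(0, 6, n)   # salaries partly driven by "quality"

r_full, _ = pearsonr(salary, quality)          # correlation across the whole range

top20 = np.argsort(salary)[-20:]               # keep only the 20 best-paid heads
r_top20, _ = pearsonr(salary[top20], quality[top20])

print(f"full sample r = {r_full:.2f}, top-20 r = {r_top20:.2f}")
# The top-20 correlation is typically much weaker than the full-sample one,
# even though salary and quality are strongly related overall.
```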
THE do not give any numbers, so I have calculated the correlations between the salaries of the US heads and the overall scores in the brand-name rankings. Maybe I'll get round to the British salaries next week.
The Pearson correlation coefficient between the salaries of the 20 most highly paid university heads in the US and overall THE world rankings scores is only .259, which is not statistically significant.
The correlation is greater when we compare salaries with the US News (USN) America's Best Colleges and the Shanghai Academic Ranking of World Universities. The top 20 US salaries have a .362 correlation with the overall scores in the 2015 America's Best Colleges (not significant) and .379 (significant at the 0.05 level [1 tailed]) with the total scores in the 2015 ARWU.
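As a rough check on those significance statements (a sketch of the arithmetic, not a recalculation from the underlying data), the critical correlation for a sample of 20 at the 0.05 level, one-tailed, can be computed directly:

```python
# Critical value of Pearson's r for n = 20 at alpha = .05, one-tailed.
from math import sqrt
from scipy.stats import t

n = 20
df = n - 2
t_crit = t.ppf(0.95, df)                   # one-tailed critical t
r_crit = t_crit / sqrt(t_crit**2 + df)     # convert critical t to critical r

print(f"critical r (n = {n}, one-tailed) = {r_crit:.3f}")
# Comes out at roughly 0.38, which is why .379 (ARWU) just clears the bar
# while .362 (USN) and .259 (THE) do not.
```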
That suggests that American university heads are recruited with the object of doing well in the things that count in the USN rankings and even more so in the Shanghai rankings. Or perhaps that the THE rankings are not so good at measuring the things that the heads are supposed to do.
Of course, if we looked at the whole range of university salaries and university quality there would probably be different results.
By the way, there is almost zero correlation between the top 20 salaries and university size as measured by the number of students.
Thursday, December 03, 2015
Not as Elite as They Thought
British higher education is very definitely not a flat system. There is an enormous difference between Oxford or LSE and the University of Bolton or the University of East London in terms of research output and quality, graduate outcomes, public perceptions, student attributes and just about anything else you could think of.
The most obvious dividing line in the UK university world is between the post-1992 and pre-1992 universities. The former were mostly polytechnics run by local authorities that did not award their own degrees, provided sub-degree courses and did little research.
Another line was drawn in 1994. Named after the hotel (only four stars but it is "old", "famous", "grand" and "impressive") where the inaugural meeting was held, the Russell Group now has 24 members, including of course Oxford and Cambridge, and claims to include only elite research intensive universities. Definitely no riff-raff.
The home page of the group gives a good idea of its priorities:
Our universities are global leaders in research, but it is vital they receive sufficient funding and support
A high-quality, research-led education requires proper funding at both undergraduate and postgraduate level
Collaboration with business is a key part of the work of our universities but Government could foster more innovation
Our universities are global businesses competing for staff, students and funding with the best in the world.
Like all good clubs, membership is not cheap. In 2012 the Universities of Durham, Exeter and York and Queen Mary, University of London paid £500,000 apiece to join.
They may have been wasting their money.
A paper by Vikki Boliver of Durham University, whose research does not appear to have received any sort of funding, finds that analysis of data on research activity, teaching quality, economic resources, academic selectivity and socioeconomic student mix reveals four tiers within UK tertiary education (see the sketch after this list). They are:
- A very small premier league composed of Oxford and Cambridge
- A second tier composed of 22 members of the Russell Group plus 17 of the other old universities -- the first three alphabetically are Aberdeen, Bath and Birmingham
- A third tier with 13 old and 54 post-1992 universities -- starting with Abertay, Aberystwyth and Arts University Bournemouth
- A fourth tier of 19 post-1992 universities -- starting with Anglia Ruskin, Bishop Grosseteste and University College Birmingham.
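Boliver's own data and code are not reproduced here, but the general shape of this kind of tier analysis can be sketched with an off-the-shelf clustering routine. Everything in the snippet below is hypothetical: the file name, the column names and the choice of k-means with four clusters are illustrative assumptions, not her actual method.

```python
# Hypothetical sketch of grouping UK universities into tiers by clustering
# on a handful of indicators. File name and column names are invented;
# this is an illustration, not Boliver's own analysis.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("uk_universities.csv")          # hypothetical input file
indicators = ["research_activity", "teaching_quality", "economic_resources",
              "academic_selectivity", "low_ses_share"]    # invented column names

X = StandardScaler().fit_transform(df[indicators])        # put indicators on one scale
df["tier"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

print(df.groupby("tier")[indicators].mean())              # profile of each cluster
print(df.groupby("tier").size())                          # universities per tier
```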
It looks like some of the Russell Group are in danger of descending into the abyss of the Tier Three riff-raff.
Incidentally, taking a look at the well-known world rankings, the US News Best Global Universities has a gap of 12 places between Cambridge, second of the Tier 1 universities, and Imperial College, best of the Tier 2 schools.
The Shanghai rankings similarly have a gap of ten places between Oxford and University College London.
But there are only four places in the THE World University Rankings between Cambridge and Imperial and one between Oxford and UCL in the QS world rankings.
Another finding is that the differences in teaching quality between the old and new universities are relatively minor compared with the differences in the amount and impact of research.
Does that explain why the Russell Group are so hostile to initiatives like AHELO and U-Multirank?