Tuesday, December 22, 2015

Worth Reading 4: Rankings influence public perceptions of German universities



Der Ranking-Effekt. Zum Einfluss des „Shanghai-Rankings“ auf die medial dargestellte Reputation deutscher Universitäten

Tim Hegglin · Mike S. Schäfer

Publizistik (2015) 60:381–402 DOI 10.1007/s11616-015-0246-

English Abstract

Increasingly, universities find themselves in a competition about public visibility and reputation in which media portrayals play a crucial role. But universities are complex, heterogeneous institutions which are difficult to compare. University rankings offer a seemingly simple solution for this problem: They reduce the complexity inherent to institutions of higher education to a small number of measures and easy-to-understand ranking tables – which may be particularly attractive for media as they conform to news values and media preferences. Therefore, we analyze whether the annual publications of the “Shanghai Ranking” influence media coverage about the included German universities. Based on a content analysis of broadsheet print media, our data show that a ranking effect exists: After the publication of the Ranking results, included universities are presented as more reputable in the media. This effect is particularly strong among better ranked universities. It does not, however, increase over a 10-year time period.

English title: The Ranking Effect: How the “Shanghai Ranking” influences the mediated reputation of German universities

Thanks to Christian Scholz of the University of Hamburg for alerting me to this paper.

Monday, December 21, 2015

Worth Reading 3

Matthew David, Fabricating World Class: Global university league tables, status differentiation and myths of global competition

accepted for publication in the British Journal of Sociology of Education


This paper finds that UK media coverage of global university rankings is strongly biased towards the Russell Group, which supposedly consists of elite research-intensive universities; emphasises the superiority of some US universities; and interprets whatever happens in the rankings as evidence that British universities, especially those in the Russell Group, need and deserve as much money as they want.

For example, he quotes the Daily Mail as saying in 2007 that "Vice chancellors are now likely to seize on their strong showing [in the THES-QS world university rankings] to press the case for the £3,000-a-year cap on tuition fees to be lifted when it is reviewed in 2009," while The Times in 2008, when UK universities slipped, said: "Vice chancellors and commentators voiced concern that, without an increase in investment, Britain's standing as a first-class destination for higher education could be under threat."

The media and elite universities also claim repeatedly that lavishly funded Asian universities are overtaking the impoverished and neglected schools of the West.

David argues that none of this is supported by the actual data of the rankings. He looks at the top 200 of the three well-known rankings (QS, THE and ARWU) up to 2012.

I would agree with most of these conclusions, especially the argument that the rankings data he uses do not support either US superiority or the rise of Asia.

I would go further and suggest that changes to the QS rankings in 2008 and 2015, ad hoc adjustments to the employer survey in 2011 and 2012, changes in the rules for the submission of data, variations in the degree of engagement with the rankings, and the instability of the pool from which ranked universities are drawn would render the QS rankings invalid as a measure of any but the most obvious trends.

Similarly, the THE rankings, which started in 2010, underwent substantial changes in 2011 and then in 2015. Between those years there were fluctuations for many universities, both because a few papers could have a disproportionate impact on the citations indicator and because the pool of ranked universities from which indicator means are calculated is unstable.

If, however, we take the Shanghai rankings over the course of eleven years and look at the full top 500, then we do find that Asia, or more accurately some of it, is rising.

The number of Chinese universities in the ARWU top 500 rose from 16 in 2004 to 44 in 2015. The number of South Korean universities rose from 8 to 12, and the number of Australian universities from 14 to 20.

But the number of Indian universities remained unchanged at three, while the number of Japanese universities fell from 36 to 18.

David does not argue that Asia is not rising, merely that looking at the top level of the rankings does not show that it is.

What is probably more important in the long run is the comparative performance not of universities but of secondary school systems. Here the future of the US, the UK and continental Europe does indeed look bleak, while that of East Asia and the Chinese diaspora is very promising.



Saturday, December 19, 2015

Go East Young (and Maybe not so Young) Man and Woman!

Mary Collins, a distinguished immunologist at University College London (UCL), is leaving to take up an academic appointment in Japan. Going with her is her husband Tim Hunt, a Nobel laureate, who was the victim of a particularly vicious witch hunt over some allegedly sexist remarks made over dinner. She had apparently applied for the Japanese post before the uproar, but her departure was hastened by the disgraceful way he was treated by UCL.

Could this be the beginning of a massive drain of academic talent from the West to Asia? What would it take to persuade people like Nicholas Christakis, Erika Christakis, Joshua Richwine, Mark Regnerus, Andrea Quenette, K.C. Johnson and Matt Tyler to trade in abuse and harassment by the "progressive" academic establishment for a productive scholarly or administrative career in Korea, Japan, China or the Pacific Rim?

Meanwhile Russia and the Arab Gulf are also stepping up their recruitment of foreign scientists. Has Mary Collins started a new trend?

Friday, December 18, 2015

THE Adjectival Hyperbole

Times Higher Education (THE) has always had, or tried to have, a good opinion of itself and its rankings.

A perennial highlight of the ranking season is the flurry of adjectives used by THE to describe its summits (prestigious, exclusive), its rankings and their methodology. Here is a selection:

"the sharper, deeper insights revealed by our new and more rigorous world rankings"

"Robust, transparent and sophisticated"


"the most comprehensive, sophisticated and balanced global rankings in the world."

"dramatically improved ranking system"

"our dramatic innovations"

"our tried, trusted and comprehensive combination of 13 performance indicators remains in place, with the same carefully calibrated weightings" 

"our most comprehensive, inclusive and insightful World University Rankings to date"

The problem is that if the rankings are so robust and sophisticated, then what is the point of a dramatic improvement? If there is a dramatic improvement one year, is there a need for more dramatic improvements in the next? And are there no limits to the rigour of the methodology or the sharpness and depth of the insights?