The Efficiency Rankings
Times Higher Education has a story about a study by Dirk Van Damme, head of the Centre for Educational Research and Innovation at the OECD, which will be presented at the Global University Summit in Whitehall, London, from 28 to 30 May.
The Summit "brings an invitation-only audience of leaders from the world’s foremost
universities, senior policy-makers and international business executives
to London in 2013." It is a "prestigious event" held in a "spectacular setting" and is sponsored by the University of Warwick, Times Higher Education, Thomson Reuters and UK Universities International Unit. Speakers include Vince Cable, Boris Johnson, the Russian ambassador and heads of various universities from around the world.
What Professor Van Damme has done is to treat the THE World University Rankings Research Indicator scores as an input and the Research Influence (Citations) scores as an output. The output scores are divided by the input scores and the result is a measure of the efficiency with which the inputs are turned into citations, which, as we all know, is the main function of the modern university.
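For what it is worth, the arithmetic is as simple as it sounds. Here is a minimal sketch with invented indicator scores (the numbers and variable names are mine, not Van Damme's):

```python
# A minimal sketch of the "efficiency" ratio described above.
# The indicator scores are invented for illustration only.
research_score = 30.0   # THE Research indicator score (the "input")
citations_score = 90.0  # THE Citations / Research Influence score (the "output")

efficiency = citations_score / research_score
print(f"efficiency = {efficiency:.2f}")  # 3.00
```

Note that the ratio is entirely indifferent to what the research score is made of, a point that matters below.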
According to THE:
"The input indicator takes scaled and normalised measures of research
income and volume into account, and also considers reputation, while the
output indicator looks at citations to institutional papers in Thomson
Reuters’ Web of Science database, normalised for subject differences.
Professor
van Damme said that the results - which show that university systems
outside the Anglo-American elite are able to realise and increase
outputs with much lower levels of input - did not surprise him.
“For
example, Switzerland really invests in the right types of research. It
has a few universities in which it concentrates resources, and they do
very well,” he said.
Previous studies have found the UK to have
the most efficient research system on measures of citation per
researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With
efficiency I mean the total research capacity of an institution,
including its productivity, divided by its impact. The UK is not doing
badly at all, but other countries are doing better, such as Ireland,
which has a very low research score but a good citations score,” he
said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low
efficiency scores for China and South Korea reflected the countries’
problems in translating their huge investment into outputs, he added."
One hesitates to be negative about a paper presented at a prestigious event in a spectacular setting to an invitation-only audience, but this is frankly rather silly.
I would accept that income can be regarded as an input, but surely not reputation and surely not volume of publications. Also, unless Van Damme's methodology has undisclosed refinements, he is treating research scores as having the same value regardless of whether they are composed mainly of scores for reputation, for number of publications or for research income.
Then there is the time period concerned. Research income is income for a single year, while publications are drawn from a five-year period. These are then compared with citations over a six-year period. In effect, the paper is asking how research income for 2010 produced citations in the years 2006-2011 to papers published in the years 2006-2010. A university is certainly being remarkably efficient if its 2010 income is producing citations in 2006, 2007, 2008 and 2009.
Turning to the citations side of the equation, it should be recalled that the THE citations indicator includes an adjustment by which a university's citation impact score is divided by the square root of the citation impact score for its country as a whole. In other words, a university located in a country where papers are not cited very much gets a big boost, and the lower the national citation impact score, the bigger the boost. This is why Hong Kong universities suffered reduced scores when Thomson Reuters stopped counting them as part of China for citation purposes and put them in their own separate category.
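To see how much work that square root does, consider a toy example. This is my reconstruction from THE's published description, not Thomson Reuters' actual code, and the numbers are invented:

```python
import math

# Toy illustration of the regional adjustment described above:
# a university's citation impact is divided by the square root
# of its country's overall citation impact.
def adjusted_impact(university_impact: float, country_impact: float) -> float:
    return university_impact / math.sqrt(country_impact)

# The same raw university score in two different countries:
print(adjusted_impact(1.0, country_impact=1.0))   # 1.0  (high-citation country)
print(adjusted_impact(1.0, country_impact=0.25))  # 2.0  (low-citation country)
```

An identical citation record is worth twice as much merely because the surrounding country is cited a quarter as often.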
So it is not surprising that universities from outside the Anglo-Saxon elite do well for citations and thus appear to be very efficient: Thomson Reuters' methodology gives such universities a very substantial boost simply for being located in countries that are less productive in terms of citations.
None of this is new. In 2010 Van Damme did something similar at a seminar in London.
Van Damme analyses only the top 200 universities in the THE rankings. It would surely be more interesting to analyse the top 400, whose scores are obtainable from an iPad/iPhone app.
So here are the top ten universities in the world according to the efficiency with which they turn income, reputation and publications into citations. The procedure is to divide the citations score from the 2012 THE rankings by the research indicator score (a short script for replicating the exercise appears below).
1. Tokyo Metropolitan University
2. Moscow State Engineering Physics Institute
3. Florida Institute of Technology
4. Southern Methodist University
5. University of Hertfordshire
6. University of Portsmouth
7. King Mongkut's University of Technology
8. Vigo University
9. Creighton University
10. Fribourg University
No doubt the good and the great of the academic world assembled in Whitehall will make a trip to Portsmouth or even to Vigo or Creighton if they can find them on the map.
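Anyone who wants to check these tables can reproduce the exercise in a few lines once the indicator scores are copied out of the app. A minimal sketch, assuming the scores have been saved to a CSV file with hypothetical column names of my own choosing:

```python
import csv

# Rank universities by citations score divided by research score.
# "the400.csv" and its column names (name, research, citations)
# are hypothetical; the scores come from the THE top-400 app.
with open("the400.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    row["efficiency"] = float(row["citations"]) / float(row["research"])

ranked = sorted(rows, key=lambda r: r["efficiency"], reverse=True)
print("Most 'efficient': ", [r["name"] for r in ranked[:10]])
print("Least 'efficient':", [r["name"] for r in ranked[-10:]])
```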
And now for the hall of shame. Here are the bottom ten of the THE top 400, ranked according to efficiency as measured by citations indicator scores divided by research scores. The heads of these failing institutions will no doubt be packing their bags and looking for jobs as junior administrative assistants at technical colleges in Siberia or the upper Amazon.
391. Tsinghua University
392. Chinese University of Hong Kong
393. National Taiwan University
394. National Chiao Tung University
395. Tilburg University
396. Delft University of Technology
397. Seoul National University
398. State University of Campinas
399. Sao Paulo University
400. Lomonosov Moscow State University
In a little while I hope to publish the full 400 after I have finished being sarcastic about the QS subject rankings.