Monday, July 29, 2013

Shopping Around

It seems that many universities are now targeting specific rankings. One example is Universiti Malaya, which submits data to QS but has so far not taken part in the THE rankings.

Now there is a report about the University of Canberra:

"The University of Canberra will spend $15 million over the next five years on some of the world's top researchers as the university pushes to break into the world rankings by 2018
The university has budgeted $3 million a year to attract 10 ''high performing'' researchers in five specialist areas: governance, environment, communication, education and health.
The recruitment drive started last week with advertising in the London Times Higher Education supplement [sic], with the paper's ranking of ''young'' universities the target of UC's campaign, with 13 Australian universities already in their top 100.
''We've decided to aim for [that] one particular ranking, although that will probably mean we'll hit some of the targets for many of the rankings, because there are an overlapping set of criteria that are used,'' Professor Frances Shannon, the university's deputy vice-chancellor of research, said."

It looks like the university is aiming not just at the THE under-50 rankings but specifically at the citations indicator, which rewards high levels of citation in fields that usually attract few citations.

This may be another case where the pursuit of ranking glory undermines the overall quality of a university.

"The recruitment drive comes after the university was criticised this month for axing language courses to try to combat government funding cuts while continuing its sponsorship of the Brumbies rugby team."

Friday, July 19, 2013

A bad idea but not really new

From Times Higher Education

University teachers everywhere are subject to this sort of pressure but it is unusual for it to be stated so explicitly.

"A university put forward plans to assess academics’ performance according to the number of students receiving at least a 2:1 for their modules, Times Higher Education can reveal.
According to draft guidance notes issued by the University of Surrey - and seen by THE - academics were to be required to demonstrate a “personal contribution towards achieving excellence in assessment and feedback” during their annual appraisals.
Staff were to be judged on the “percentage of students receiving a mark of 60 per cent or above for each module taught”, according to the guidance form, issued in June 2012, which was prefaced by a foreword from Sir Christopher Snowden, Surrey’s vice-chancellor, who will be president of Universities UK from 1 August.
“The intention of this target is not to inflate grades unjustifiably but to ensure that levels of good degrees sit comfortably within subject benchmarks and against comparator institutions,” the document explained.
After “extensive negotiations” with trade unions, Surrey dropped the proposed “average target mark”, with replacement guidance instead recommending that staff show there to be “a normal distribution of marks” among students."

Friday, July 12, 2013

Serious Wonkiness

Alex Usher at HESA had a post on the recent THE Under-50 Rankings. Here is an excerpt about the Reputation and Citations indicators.

"But there is some serious wonkiness in the statistics behind this year’s rankings which bear some scrutiny.  Oddly enough, they don’t come from the reputational survey, which is the most obvious source of data wonkiness.  Twenty-two percent of institutional scores in this ranking come from the reputational ranking; and yet in the THE’s reputation rankings (which uses the same data) not a single one of the universities listed here had a reputational score high enough that the THE felt comfortable releasing the data.  To put this another way: the THE seemingly does not believe that the differences in institutional scores among the Under-50 crowd are actually meaningful.  Hmmm.

No, the real weirdness in this year’s rankings comes in citations, the one category which should be invulnerable to institutional gaming.  These scores are based on field-normalized, 5-year citation averages; the resulting institutional scores are then themselves standardized (technically, they are what are known as z-scores).   By design, they just shouldn’t move that much in a single year.  So what to make of the fact that the University of Warwick’s citation score jumped 31% in a single year, Nanyang Polytechnic’s by 58%, or UT Dallas’ by a frankly insane 93%?  For that last one to be true, Dallas would have needed to have had 5 times as many citations in 2011 as it did in 2005.  I haven’t checked or anything, but unless the whole faculty is on stims, that probably didn’t happen.  So there’s something funny going on here."

Here is my comment on his post.

Your comment at University Ranking Watch and your post at your blog raise a number of interesting issues about the citations indicator in the THE-TR World University Rankings and the various spin-offs.

You point out that the scores for the citations indicator rose at an unrealistic rate between 2011 and 2012 for some of the new universities in the 100 Under 50 Rankings and ask how this could possibly reflect an equivalent rise in the number of citations.

Part of the explanation is that the scores for all indicators and nearly all universities in the WUR, and not just for the citations indicator and a few institutions, rose between 2011 and 2012. The mean overall score of the top 402 universities in 2011 was 44.3 and for the top 400 universities in 2012 it was 49.5.

The mean scores for every single indicator or group of indicators in the top 400 (402 in 2011) have also risen although not all at the same rate. Teaching rose from 37.9 to 41.7, International Outlook from 51.3 to 52.4, Industry Income from 47.1 to 50.7, Research from 36.2 to 40.8 and Citations from 57.2 to 65.2.

Notice that the scores for citations are higher than for the other indicators in 2011 and that the gap further increases in 2012.

This means that the citations indicator had a disproportionate effect on the rankings in 2011, one that became even more disproportionate in 2012.

It should be remembered that the scores for the indicators are z-scores: they measure not the absolute number of citations but the distance, in standard deviations, from the mean number of normalised citations of all the universities analysed. That mean is not the mean of the 200 universities listed in the printed and online rankings, nor of the 400 included in the iPad/iPhone app, but of the total number of universities that have asked to be ranked. That number seems to have increased by a few hundred between 2011 and 2012 and will no doubt go on increasing over the next few years, although probably at a steadily decreasing rate.

Most of the newcomers to the world rankings have overall and indicator scores that are lower than those of the universities in the top 200 or even the top 400. That means that the mean of the unprocessed scores on which the z-scores are based fell between 2011 and 2012, so the overall and indicator scores of the elite universities increased regardless of what happened to the underlying raw data.
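
To see how this works, here is a minimal Python sketch. The figures are invented for illustration, not actual THE data, and the scores are plain z-scores; THE's further rescaling of z-scores to a 0-100 range is left out.

from statistics import mean, stdev

def z_scores(values):
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

elite = [8.0, 7.0, 6.0, 5.0]                     # hypothetical raw citation impact of four elite universities
pool_2011 = elite + [3.0] * 20                   # hypothetical pool of universities ranked in 2011
pool_2012 = elite + [3.0] * 20 + [1.5] * 15      # lower-scoring newcomers join the pool in 2012

print([round(z, 2) for z in z_scores(pool_2011)[:4]])   # elite z-scores against the 2011 pool
print([round(z, 2) for z in z_scores(pool_2012)[:4]])   # same raw values, but higher z-scores against the 2012 pool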

However, the indicator scores did not all increase at the same rate. The scores for the citations indicator, as noted, were much higher in 2011 and 2012 than those for the other indicators. This is probably because the gap between the top 200 or 400 universities and those just below the elite is greater for citations than it is for indicators like income, publications and internationalisation. After all, most people would probably accept that internationally recognised research is a major factor in distinguishing world-class universities from those that are merely good.

Another point about the citations indicator is that after the field- and year-normalised citation score for each university is calculated, it is adjusted by a “regional modification”: the score is divided by the square root of the average score for the country in which the university is located. So if University A has a score of 3.0 citations per paper and the average for its country is also 3.0, the score is divided by the square root of 3, approximately 1.73, giving a result of 1.73. If University B has the same score of 3.0 citations per paper but its country's average is just 1.0 citation per paper, the final score is 3.0 divided by the square root of 1, which is 1, giving a result of 3.0.

University B therefore gets a much higher final score for citations even though its number of citations per paper is exactly the same as University A's. The reason is simply that each university is being compared with all the other universities in its country: the lower the national average, the bigger the boost from the regional modification. The citations indicator is therefore not just measuring the number of citations a university's papers attract but also, in effect, the gap between the bulk of a country's universities and the elite that make it into the top 200 or 400.
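
To make the arithmetic concrete, here is a small Python sketch of the regional modification as just described. It repeats the invented University A and University B figures above; it is an illustration of the calculation, not Thomson Reuters' actual code.

import math

def regional_modification(university_impact, country_average):
    # Divide the university's normalised citation impact by the square
    # root of the average impact for its country.
    return university_impact / math.sqrt(country_average)

print(round(regional_modification(3.0, 3.0), 2))   # University A: 3.0 / sqrt(3) = about 1.73
print(round(regional_modification(3.0, 1.0), 2))   # University B: 3.0 / sqrt(1) = 3.0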

It is possible then that a university might be helped into the top 200 or 400 by having a high score for citations that resulted from being better than other universities in a particular country that were performing badly.

It is also possible that if a country’s research performance took a dive, perhaps because of budget cuts, with the overall number of citations per paper declining, this would lead to an improvement in the score for citations of a university that managed to remain above the national average.

It is quite likely that -- assuming the methodology remains unchanged -- if countries like Italy, Portugal or Greece experience a fall in research output as a result of economic crises, their top universities will get a boost for citations because they are benchmarked against a lower national average.

Looking at the specific places mentioned, it should be noted once again that Thomson Reuters do not simply count the number of citations per paper but compare them with the mean citations for papers in particular fields published in particular years and cited in particular years.

Thus a paper in applied mathematics published in a journal in 2007 and cited in 2007, 2008, 2009, 2010, 2011 and 2012 will be compared to all papers in applied maths published in 2007 and cited in those years.

If it is usual for a paper in a specific field to receive few citations in the year of publication or the year after, then even a moderate number of citations can have a disproportionate effect on the citations score.
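
Here is a rough Python sketch of the kind of field- and year-normalisation described above. The baseline (expected citation) figures and paper counts are invented for illustration; the real baselines are calculated by Thomson Reuters from the Web of Science and are not public.

def normalised_impact(papers, baselines):
    # papers: list of (field, year, citations) tuples
    # baselines: expected citations for a paper in that field and year
    ratios = [citations / baselines[(field, year)] for field, year, citations in papers]
    return sum(ratios) / len(ratios)

# Invented baselines: a rarely cited field versus a heavily cited one.
baselines = {("applied mathematics", 2007): 2.0, ("astrophysics", 2009): 40.0}
papers = [("applied mathematics", 2007, 6), ("astrophysics", 2009, 80)]

print(normalised_impact(papers, baselines))   # 6 citations in a rarely cited field count for more (ratio 3.0) than 80 in a heavily cited one (ratio 2.0)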

It is very likely that Warwick’s increased score for citations in 2012 had a lot to do with participation in a number of large-scale astrophysical projects that involved many institutions and produced a larger-than-average number of citations in the years after publication. In June 2009, for example, the Astrophysical Journal Supplement Series published ‘The Seventh Data Release of the Sloan Digital Sky Survey’, with contributions from 102 institutions, including Warwick. In 2009 it received 45 citations, against a journal average of 13. (The average for the field is known to Thomson Reuters, but it is unlikely that anyone else has the technical capability to work it out.) In 2010 the paper was cited 262 times, against a journal average of 22; in 2011 it was cited 392 times, against a journal average of 19.

This and similar publications have contributed to an improved performance for Warwick, one that was enhanced by the relatively modest number of total publications by which the normalised citations were divided.

With regard to Nanyang Technological University, it seems that a significant role was played by a few highly cited publications in Chemical Reviews in 2009 and in Nature in 2009 and 2010.

As for the University of Texas at Dallas, my suspicion was that publications by faculty at the University of Texas Southwestern Medical Center had been included, a claim that had been made about the QS rankings a few years ago.  Thomson Reuters have, however, denied this and say they have observed unusual behaviour by UT Dallas which they interpret as an improvement in the way that affiliations are recorded. I am not sure exactly what this means but assume that the improvement in the citations score is an artefact of changes in the way data is recorded rather than any change in the number or quality of citations.

There will almost certainly be more of this in the 2013 and 2014 rankings.
 

Tuesday, July 02, 2013

The Complete Efficiency Rankings

At last. Here are the complete Efficiency Rankings, measuring the efficiency with which universities turn inputs into citations. I am using the method of Professor Dirk van Damme, which is to divide the scores for Citations: Research Influence in the THE World University Rankings by the scores for Research: Volume, Income and Reputation; a short sketch of the calculation follows the quotation below. Here is the method as cited in a previous post:

"The input indicator takes scaled and normalised measures of research income and volume into account, and also considers reputation, while the output indicator looks at citations to institutional papers in Thomson Reuters’ Web of Science database, normalised for subject differences.
Professor van Damme said that the results - which show that university systems outside the Anglo-American elite are able to realise and increase outputs with much lower levels of input - did not surprise him.
“For example, Switzerland really invests in the right types of research. It has a few universities in which it concentrates resources, and they do very well,” he said.
Previous studies have found the UK to have the most efficient research system on measures of citation per researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With efficiency I mean the total research capacity of an institution, including its productivity, divided by its impact. The UK is not doing badly at all, but other countries are doing better, such as Ireland, which has a very low research score but a good citations score,” he said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low efficiency scores for China and South Korea reflected the countries’ problems in translating their huge investment into outputs, he added."
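
For readers who want to reproduce the arithmetic, here is a minimal Python sketch of the calculation described above: each university's efficiency score is its THE Citations (research influence) score divided by its Research (volume, income and reputation) score. The university names and scores below are placeholders, not actual THE figures.

# Placeholder scores, not actual THE data.
the_scores = {
    "University X": {"citations": 90.0, "research": 30.0},
    "University Y": {"citations": 85.0, "research": 70.0},
}

# Efficiency = citations score divided by research score.
efficiency = {name: s["citations"] / s["research"] for name, s in the_scores.items()}

for name, ratio in sorted(efficiency.items(), key=lambda item: item[1], reverse=True):
    print(f"{name}: {ratio:.2f}")   # University X: 3.00, University Y: 1.21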

1.  Tokyo Metropolitan
2.  Moscow Engineering Physics Institute
3.  Florida Institute of Technology
4.  Southern Methodist
5.  Hertfordshire
6.  Portsmouth
7= King Mongkut's University of Technology
7= Vigo
9.  Creighton
10.  Fribourg
11.  King Abdulaziz
12.  University of the Andes
13.  Trieste
14.  Renmin
15.  Medical University of Vienna
16.  Polytechnic University Valencia
17=  Bayreuth
18=  Montana
19.   Mainz
20.   Ferrara
21.   Drexel
22.   Valencia
23.   Linz
24.   Crete
25.   Colorado School of Mines
26.  Technical University of Dresden
27.   Innsbruck
28.  Nurnberg
29= Dauphine
29= Wake Forest
29= Maryland Baltimore County
32. St George's London
33.  William and Mary College
34. Hong Kong Baptist
35.  Basel
36.  Texas San Antonio
37.  Duisburg
38.  Lyon 1
39.  Wurzburg
40.  Charles Darwin
41.  Wayne State
42. Northeastern
43.  Bicocca
44. Royal Holloway
45.  Koc
46.  Georgia University of Health Science
47.  Modena
48.  Dundee
49.  Southern Denmark
50= IIT Roorkee
50= Pompeu Fabra
52.  Graz
53= Oregon
53= Diderot
55.   Bielefeld
56.   Munster
57.  Waikato
58= Grenoble
59= East Anglia
60= Bonn
61=  Pavia
62.  ENS Lyon
63.  Eastern Finland
64.  Padua
65.  Brandeis
66.  Aberystwyth
67.  Tulane
68.  Tubingen
69= Warsaw
70= Sun Yat Sen
71= Keele
72.  Tromso
73.  Brunel
74.  Liege
75.  Queen Mary
76=  Vermont
77=  Trento
78.  Turin
79.  Jyvaskyla
80.  Carleton
81.  Kansas
82.  California Riverside
83.  SUNY Stony Brook
84=  George Washington
85=  Pisa
86.  Tasmania
87.  George Mason
88.  Boston College
89=  Oregon State
90=  Texas Dallas
91.  Trinity College Dublin
92= University Science and Technology China
92= Murdoch
92= Cincinnati
92= Galway
92= Yeshiva
97= Tufts
97= Minho
99. Miami
100.  Lehigh
101. Technical University Denmark
102= Rice
102= Iceland
104.  California Santa Cruz
104= Milan
106.  Montpellier 2
107.  Frankfurt
108= Bergen
109=  Strasbourg
110.  Victoria
111.  Rochester
112.  Cork
113.  Dartmouth
114.  Oklahoma
115.  Birkbeck
116.  Porto
117.  Canterbury
118= Newcastle  UK
118= Notre Dame
118= University College Dublin
121. Binghamton
122.  Aveiro
123=  Kiel
123= Sussex
125.  Temple
126.  Aachen
127=  Fribourg
127=  Queens Belfast
127= Colorado Boulder
130.  Iowa State
131. Tokyo Medical Dental
132= Autonomous Madrid
132= Swedish Agriculture
132= Tampere
135= Deakin
135= Barcelona
137= Stockholm
137= Stirling
139.  Laval
140.  Durham
141.  Bangor
142= Aberdeen
142= Vanderbilt
144.  Istanbul Technical
145.  Nanjing
146= Exeter
146= Emory
146= Leicester
149.  Southampton
150.  Paris Mines
151. Vrije Universiteit Brussel
152.  Polytechnic Milan
153. Kwazulu-Natal
154= Linkoping
154= Bilkent
154= Heriot-Watt
154= Bologna
158= Wyoming
158= Utah
158= Massey
161= Glasgow
161= Bern
163.  ENS Paris
164.  Zurich
165= Case Western Reserve
166= California Irvine
167= Tartu
168= Wellington
169= Salento
170.  South Carolina
171.  York UK
172.  Aalto
173= Curie
173= Macquarie
173= Boston
176= Delaware
177= Copenhagen
178= Hannover
179.  Norwegian University of Science and Technology
180.  Antwerp
181= Dalhousie
181= Rensselaer Polytechnic Institute
183= Konstanz
184= Paris Sud
185.  Technical University Munich
186.  Lancaster
187.  Waseda
188.  Otago
189.  Arizona State
190= SUNY Albany
190= Gottingen
190= Autonomous Barcelona
193= Cape Town
194= St Andrews
195= Colorado State
195= Bath
195= Wollongong
198= Tsukuba
198= Simon Fraser
198= Liverpool
198= Umea
202= Geneva
202= Newcastle Australia
204= Universite Libre Bruxelles
204= Virginia
206= Lausanne
206= Louvain
208= Connecticut
208= Georgetown
208= York Canada
211.  EPF Lausanne
212= North Carolina State
212= Bristol
212= Aalborg
212= Free University Amsterdam
216= Indiana
216= Kentucky
218. Maryland College Park
219.  Karlsruhe Institute of Technology
220= University Technology Sydney
220= Iowa
222.  Charles
223.  Flinders
224.  Cardiff
225= Auckland
225= Oslo
227.  Pittsburgh
228= Heidelberg
228= Guelph
228= Washington State
228= Sheffield
232= Chinese University Hong Kong
232= Strathclyde
234= Ottawa
234= Gothenburg
234= Washington St Louis
237.  Medical South Carolina
238= McMaster
238= Brown
238= National Sun Yat Sen
238= Reading
242.  Ecole Polytechnique
243.  Helsinki
244= Quebec
244= National Central Taiwan     
246.  Bogazici
247= Southern California
247= Arizona
249.  Keio
250= Houston
250= Stellenbosch
250= Kings College London
250= Darmstadt
250= Western Australia
255= Pohang
255= IIT Bombay
257= Wageningen
257= Manitoba
259= South Australia
259= Nagoya
261= Leeds
261= UC Santa Barbara
261= Nijmegen
261= Jagiellonian
265= New York University
265= Calgary
265= Ohio State
268.  Aarhus
269= Witwatersrand
269= North Carolina Chapel Hill
269= Michigan State
269= Fudan
273= Bochum
273= Munich
275= SUNY Buffalo
275= Adelaide
275= Sapienza
278= Utrecht
278= Edinburgh
278= Queensland University of Technology
281= Lund
281= Ghent
283.  Erasmus
284= Massachusetts
284= Illinois Chicago
284= Nottingham
287= Eindhoven
287= Amsterdam
289.  UC San Diego
290.  Birmingham
291= Western Ontario
291= Twente
293= Washington Seattle
293= Duke
295= Penn State
295= NUI Maynooth
297= Maastricht
297= Groningen
297= Columbia
297= Leiden
297= Georgia
302.  UC Davis
303= Southern Florida
303= Chalmers University of Technology
305= Minnesota
305= Essex
305= Manchester
305= Georgia Institute of Technology
309= Rutgers
309= Texas at Austin
311= Northwestern
311= Warwick
311= Vienna
311= MIT
315.  Johns Hopkins
316= Wisconsin Madison
316= Carnegie Mellon
318.  Alberta
319.  Pennsylvania
320= Hong Kong University of Science and Technology
320= Kyushu
322= Chicago
322= Vienna University of Technology
324= Queensland
324= Montreal
326.  British Columbia
327= Yale
327= Imperial College London
327= UCLA
327= Hebrew University of Jerusalem
327= Karolinska
332= Melbourne
332= Humboldt
332= National Tsinghua Taiwan
332= Cambridge
332= Harvard
332= Stanford
338= Monash 
338= Princeton
338= Caltech
338= Michigan
338= UC Berkeley
338= Cornell
344= Waterloo
344= KTH Sweden
344= Missouri
347.  University College London
348= Oxford
348= Middle East Technical University
350.  Yonsei
351= Toronto
351= Illinois Urbana-Champaign
351= Peking
351= Leuven
355= Zhejiang
355= Hokkaido
355= Hong Kong Polytechnic University
355= McGill
359= ETH Zurich
359= Tokyo Institute of Technology
361= Berlin
361= Uppsala
363= Korea
363= Sydney
365= Florida
365= New South Wales
367= Australian National
367= Tohoku
367= Purdue
367= Technion
371= Surrey
371= IIT Kharagpur
373= KAIST
373= Texas A and M
375. Virginia Polytechnic Institute
376= Osaka
376= Nanyang Technological University
376= Shanghai Jiao Tong
379.  LSE
380.  Sungkyunkwan
381.  Sharif University of Technology
382.  Tokyo
383= National Taiwan University of Science and Technology
383= National Autonomous University of Mexico
385= Kyoto
385= National University of Singapore
387.  Loughborough
388.  National Cheng Kung
389.  Tel Aviv
390= Hong Kong
390= Tsinghua
392.  Chinese University of Hong Kong
393.  National Taiwan
394.  National Chiao Tung
395.  Tilburg
396.  Delft
397.  Seoul National
398.  State University Campinas
399.  Sao Paulo
400.  Moscow State

Monday, July 01, 2013

Competition and Controversy in Global Rankings

My article, 'Competition and Controversy in Global Rankings', can be accessed here.