One sign of the reliability of the Shanghai Academic Ranking of World Universities is that at the top there is little change from year to year. It is difficult to get excited about Tokyo slipping one place to join University College London in 21st position, although I must admit the Federal Institute of Technology in Zurich moving up two whole places is rather intriguing.
These rankings are best used to study changes over a few years. Since 2004, according to data provided by the Shanghai rankers, the following countries have increased their membership of the world's elite of the top 100 universities:
Australia +3
Israel +2
USA +1
Switzerland +1
Netherlands +1
Denmark +1
Belgium +1
These countries have seen universities leave the top 100:
Germany -3
Japan -2
UK -2
Sweden -1
Italy -1
Austria -1
At the very top there is no sign of the erosion of English-speaking dominance (academically, I think Israel can be classed as English-speaking). If anything, it is being extended, although with a shift from the UK to the US and Australia.
Looking at the top 500, which we might consider to include world-class research universities, the picture is different. From 2004 to 2013 the following changes occurred:
China +26
Australia +5
Saudi Arabia +4
South Korea +3
Portugal +3
Brazil +2
New Zealand +2
Spain +1
Sweden +1
Turkey +1
Malaysia +1
Slovenia +1
Iran +1
Egypt +1
Croatia +1
Chile +1
Serbia +1
Mexico +1
USA -21
Japan -16
Germany -5
UK -5
Italy -4
France -2
India -2
Switzerland -1
Netherlands -1
Denmark -1
Hungary -1
Here the big story is the relative decline of the US, Northern Europe, Japan and India, and the rise of China and, to a lesser extent, Australia, Korea, Southwest Asia, Southern Europe (except Italy) and Latin America.
There is very little sign of any Asian renaissance outside Greater China and Korea, and maybe the Middle East. India has actually lost ground over the last decade and there is now only one institution in the top 500 from the whole of South and Central Asia.
Wednesday, August 14, 2013
The Webometrics Methodology
Isidro F. Aguillo of Webometrics has kindly sent me a summary of the methodology:
- The ranking intends to measure the global performance of universities using the web only as a proxy. Web design is mostly irrelevant; web contents are key if the web policy intends to mirror all the university missions on the web.
- We have a MODEL for the weighting of the variables in the composite indicator. It is the traditional "impact factor" developed several decades ago in bibliometrics, adapted to the web: a ratio of 1:1 (50%:50%) between ACTIVITY and IMPACT.
- For measuring IMPACT (visibility? impact? quality?) there are three alternatives: prestige surveys (THE, QS), peer citations (Leiden, NTU, URAP) or link visibility (the number of external inlinks or backlinks). We use this last option because in this way we acknowledge a larger diversity of activities and missions and (very important) a huge number of users, a truly global audience.
- Personally I have two important objectives with the ranking. First, I am a scholar (scholar.google.com/citations?user=SaCSbeoAAAAJ) who is paid by the Spanish government to do scientific research, so the ranking provides me with a lot of valuable data useful for analysis and papers. Second, I have a "political" agenda, that is, supporting Open Access initiatives.
- So, for measuring ACTIVITY the key issue is considering full-text documents, so Openness consists of the number of files in pdf, doc, ppt and similar formats.
- An important innovation in the three latest editions is the Excellence indicator, which is not really web related but intends to acknowledge research-intensive institutions. The data is provided by Scimago and reflects the top 10% most cited papers in 21 disciplines.
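By way of illustration, here is a minimal sketch of a 50:50 activity/impact composite of the kind described above. The rank-based combination, the function names and the figures are my own assumptions for illustration, not Webometrics' actual procedure or code.

```python
# Hypothetical sketch of a 50:50 activity/impact composite, as described above.
# Webometrics publishes ranks per indicator; the exact way ranks or scores are
# combined here is an assumption for illustration only.

def rank_positions(values):
    """Return a dict mapping each institution to its rank (1 = best)."""
    ordered = sorted(values, key=values.get, reverse=True)
    return {name: pos for pos, name in enumerate(ordered, start=1)}

def composite_rank(activity, impact):
    """Combine activity and impact ranks with equal (50:50) weight."""
    a_rank = rank_positions(activity)
    i_rank = rank_positions(impact)
    combined = {u: 0.5 * a_rank[u] + 0.5 * i_rank[u] for u in activity}
    return sorted(combined, key=combined.get)  # lower combined rank = better

# Toy example with made-up web-presence and inlink counts
activity = {"Univ A": 120_000, "Univ B": 80_000, "Univ C": 200_000}
impact = {"Univ A": 90_000, "Univ B": 70_000, "Univ C": 40_000}
print(composite_rank(activity, impact))  # ['Univ A', 'Univ C', 'Univ B']
```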
The Webometrics Rankings
The July 2013 Webometrics rankings have just been published. The top five are:
1. Harvard
2. MIT
3. Stanford
4. UC Berkeley
5. UCLA
In first place in various regions are:
Latin America: Sao Paulo
Europe: Oxford
Asia: National University of Singapore
Africa: Kwazulu Natal
Arab World: King Saud University
Oceania: Australian National University
Caribbean: University of Puerto Rico Mayaguez
Middle East: Tel Aviv
South Asia: IIT Bombay
Eastern and Central Europe: Lomonosov Moscow State University
Saturday, August 03, 2013
The Forbes Rankings
Forbes has just released its Best College list, which is compiled by the Center for College Affordability and Productivity. The index reflects student needs; the indicators include quality of teaching, student debt and graduate employability.
Stanford is at the top, Harvard is eighth and Caltech 18th. The armed forces academies and small liberal arts colleges do well.
The top five are:
1. Stanford
2. Pomona College
3. Princeton
4. Yale
5. Columbia
Monday, July 29, 2013
Shopping Around
It seems that many universities are now targeting specific rankings. One example is Universiti Malaya, which submits data to QS but so far has not taken part in the THE rankings.
Now there is a report about the University of Canberra:
"The University of Canberra will spend $15 million over the next five years on some of the world's top researchers as the university pushes to break into the world rankings by 2018
The university has budgeted $3 million a year to attract 10 ''high performing'' researchers in five specialist areas: governance, environment, communication, education and health.
The recruitment drive started last week with advertising in the London Times Higher Education supplement [sic], with the paper's ranking of ''young'' universities the target of UC's campaign, with 13 Australian universities already in their top 100.
''We've decided to aim for [that] one particular ranking, although that will probably mean we'll hit some of the targets for many of the rankings, because there are an overlapping set of criteria that are used,'' Professor Frances Shannon, the university's deputy vice-chancellor of research, said."
It looks like the university is aiming not just at the THE under-50 rankings but specifically at the citations indicator, which rewards high levels of citations in fields that are usually not cited very much.
This may be another case where the pursuit of ranking glory undermines the overall quality of a university.
"The recruitment drive comes after the university was criticised this month for axing language courses to try to combat government funding cuts while continuing its sponsorship of the Brumbies rugby team."
Friday, July 19, 2013
A bad idea but not really new
From Times Higher Education
University teachers everywhere are subject to this sort of pressure but it is unusual for it to be stated so explicitly.
"A university put forward plans to assess academics’ performance according to the number of students receiving at least a 2:1 for their modules, Times Higher Education can reveal.
According to draft guidance notes issued by the University of Surrey - and seen by THE - academics were to be required to demonstrate a “personal contribution towards achieving excellence in assessment and feedback” during their annual appraisals.
Staff were to be judged on the “percentage of students receiving a mark of 60 per cent or above for each module taught”, according to the guidance form, issued in June 2012, which was prefaced by a foreword from Sir Christopher Snowden, Surrey’s vice-chancellor, who will be president of Universities UK from 1 August.
“The intention of this target is not to inflate grades unjustifiably but to ensure that levels of good degrees sit comfortably within subject benchmarks and against comparator institutions,” the document explained.
After “extensive negotiations” with trade unions, Surrey dropped the proposed “average target mark”, with replacement guidance instead recommending that staff show there to be “a normal distribution of marks” among students."
Friday, July 12, 2013
Serious Wonkiness
Alex Usher at HESA had a post on the recent THE Under-50 Rankings. Here is an excerpt about the Reputation and Citations indicators.
"But there is some serious wonkiness in the statistics behind this year’s rankings which bear some scrutiny. Oddly enough, they don’t come from the reputational survey, which is the most obvious source of data wonkiness. Twenty-two percent of institutional scores in this ranking come from the reputational ranking; and yet in the THE’s reputation rankings (which uses the same data) not a single one of the universities listed here had a reputational score high enough that the THE felt comfortable releasing the data. To put this another way: the THE seemingly does not believe that the differences in institutional scores among the Under-50 crowd are actually meaningful. Hmmm.
No, the real weirdness in this year’s rankings comes in citations, the one category which should be invulnerable to institutional gaming. These scores are based on field-normalized, 5-year citation averages; the resulting institutional scores are then themselves standardized (technically, they are what are known as z-scores). By design, they just shouldn’t move that much in a single year. So what to make of the fact that the University of Warwick’s citation score jumped 31% in a single year, Nanyang Polytechnic’s by 58%, or UT Dallas’ by a frankly insane 93%? For that last one to be true, Dallas would have needed to have had 5 times as many citations in 2011 as it did in 2005. I haven’t checked or anything, but unless the whole faculty is on stims, that probably didn’t happen. So there’s something funny going on here."
Here is my comment on his post.
"But there is some serious wonkiness in the statistics behind this year’s rankings which bear some scrutiny. Oddly enough, they don’t come from the reputational survey, which is the most obvious source of data wonkiness. Twenty-two percent of institutional scores in this ranking come from the reputational ranking; and yet in the THE’s reputation rankings (which uses the same data) not a single one of the universities listed here had a reputational score high enough that the THE felt comfortable releasing the data. To put this another way: the THE seemingly does not believe that the differences in institutional scores among the Under-50 crowd are actually meaningful. Hmmm.
No, the real weirdness in this year’s rankings comes in citations, the one category which should be invulnerable to institutional gaming. These scores are based on field-normalized, 5-year citation averages; the resulting institutional scores are then themselves standardized (technically, they are what are known as z-scores). By design, they just shouldn’t move that much in a single year. So what to make of the fact that the University of Warwick’s citation score jumped 31% in a single year, Nanyang Polytechnic’s by 58%, or UT Dallas’ by a frankly insane 93%? For that last one to be true, Dallas would have needed to have had 5 times as many citations in 2011 as it did in 2005. I haven’t checked or anything, but unless the whole faculty is on stims, that probably didn’t happen. So there’s something funny going on here."
Here is my comment on his post.
"Your comment at University Ranking Watch and your post at your blog raise a number of interesting issues about the citations indicator in the THE-TR World University Rankings and the various spin-offs.
You point out that the scores for the citations indicator rose at an unrealistic rate between 2011 and 2012 for some of the new universities in the 100 Under 50 Rankings and ask how this could possibly reflect an equivalent rise in the number of citations.
Part of the explanation is that the scores for all indicators and nearly all universities in the WUR, and not just for the citations indicator and a few institutions, rose between 2011 and 2012. The mean overall score of the top 402 universities in 2011 was 44.3 and for the top 400 universities in 2012 it was 49.5.
The mean scores for every single indicator or group of indicators in the top 400 (402 in 2011) have also risen, although not all at the same rate. Teaching rose from 37.9 to 41.7, International Outlook from 51.3 to 52.4, Industry Income from 47.1 to 50.7, Research from 36.2 to 40.8 and Citations from 57.2 to 65.2.
Notice that the scores for citations are higher than for the other indicators in 2011 and that the gap further increases in 2012.
This means that the citations indicator had a disproportionate effect on the rankings in 2011, one that became more disproportionate in 2012.
It should be remembered that the scores for the indicators are z-scores and therefore measure not the absolute number of citations but the distance in standard deviations from the mean number of normalised citations of all the universities analysed. The mean is not the mean of the 200 universities listed in the printed and online rankings, or of the 400 included in the iPad/iPhone app, but the mean of the total number of universities that have asked to be ranked. That number seems to have increased by a few hundred between 2011 and 2012 and will no doubt go on increasing over the next few years, although probably at a steadily decreasing rate.
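To make the standardisation point concrete, a z-score is simply an institution's distance from the mean in standard deviation units. The formula below is the generic one, with my own notation; it is not Thomson Reuters' exact procedure, which also converts the standardised values to a 0 to 100 scale.

```latex
z_i = \frac{x_i - \bar{x}}{s},
\qquad \bar{x} = \frac{1}{N}\sum_{j=1}^{N} x_j
```

Here x_i is university i's normalised citation impact, and the mean and standard deviation are taken over all N universities that submitted data. If the newcomers added in 2012 pulled the mean down, every established university's z-score rose even where its own citation impact did not change.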
Most of the newcomers to the world rankings have overall scores and indicator scores that are lower than those of the universities in the top 200 or even the top 400. That means that the mean of the unprocessed scores on which the z-scores are based decreased between 2011 and 2012, so that the overall and indicator scores of the elite universities increased regardless of what happened to the underlying raw data.
However, they did not increase at the same rate. The scores for the citations indicator, as noted, were much higher in 2011 and 2012 than they were for the other indicators. It is likely that this was because the difference between the top 200 or 400 universities and those just below the elite is greater for citations than it is for indicators like income, publications and internationalisation. After all, most people would probably accept that internationally recognised research is a major factor in distinguishing world-class universities from those that are merely good.
Another point about the citations indicator is that after the score for field- and year-normalised citations for each university is calculated, it is adjusted according to a "regional modification". This means that the score, after normalisation for year and field, is divided by the square root of the average for the country in which the university is located. So if University A has a score of 3.0 citations per paper and the average for the country is 3.0, then the score will be divided by 1.73, the square root of 3, and the result is 1.73. If a university in country B has the same score of 3.0 citations per paper but the overall average is just 1.0 citation per paper, the final score will be 3.0 divided by the square root of 1, which is 1, and the result is 3.
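A minimal sketch of that regional modification, assuming only the simple division described above (normalised citation impact over the square root of the country average); the function name and figures are illustrative, not Thomson Reuters' actual code.

```python
import math

def regionally_modified_score(university_impact, country_average_impact):
    """Divide a university's field-normalised citation impact by the
    square root of its country's average impact, as described above."""
    return university_impact / math.sqrt(country_average_impact)

# University A: impact 3.0 in a country averaging 3.0 -> 3.0 / 1.73 = ~1.73
print(regionally_modified_score(3.0, 3.0))
# University B: impact 3.0 in a country averaging 1.0 -> 3.0 / 1.0 = 3.0
print(regionally_modified_score(3.0, 1.0))
```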
University B therefore gets a much higher final score for citations even though the number of citations per paper is exactly the same as University A's. The reason for the apparently higher score is simply that the two universities are being compared to all the other universities in their country. The lower the score for universities in general, the higher the regional modification for specific universities. The citations indicator is not just measuring the number of citations produced by universities but also, in effect, the difference between the bulk of a country's universities and the elite that make it into the top 200 or 400.
It is possible, then, that a university might be helped into the top 200 or 400 by having a high score for citations that resulted from being better than other universities in a particular country that were performing badly.
It is also possible that if a country's research performance took a dive, perhaps because of budget cuts, with the overall number of citations per paper declining, this would lead to an improvement in the score for citations of a university that managed to remain above the national average.
It is quite likely that -- assuming the methodology remains unchanged -- if countries like Italy, Portugal or Greece experience a fall in research output as a result of economic crises, their top universities will get a boost for citations because they are benchmarked against a lower national average.
Looking at the specific places mentioned, it should be noted once again that Thomson Reuters do not simply count the number of citations per paper but compare them with the mean citations for papers in particular fields published in particular years and cited in particular years.
Thus a paper in applied mathematics published in a journal in 2007 and cited in 2007, 2008, 2009, 2010, 2011 and 2012 will be compared to all papers in applied maths published in 2007 and cited in those years.
If it is usual for a paper in a specific field to receive few citations in the year of publication or the year after, then even a moderate number of citations can have a disproportionate effect on the citations score.
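In generic terms, the field- and year-normalised impact being described is an average of each paper's citations divided by the expected citations for its field and publication year. The notation below is mine, not anything published by Thomson Reuters.

```latex
\text{impact}_u = \frac{1}{P_u} \sum_{p=1}^{P_u} \frac{c_p}{e(f_p,\, y_p)}
```

Here c_p is the number of citations received by paper p from university u, and e(f_p, y_p) is the world average for papers in the same field and year. When that expected value is small, as it is for recent papers in lightly cited fields, a single well-cited paper produces a very large ratio, which is the mechanism described in the paragraph above.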
It is very likely that Warwick's increased score for citations in 2012 had a lot to do with participation in a number of large-scale astrophysical projects that involved many institutions and produced a larger than average number of citations in the years after publication. In June 2009, for example, the Astrophysical Journal Supplement Series published 'The seventh data release of the Sloan Digital Sky Survey' with contributions from 102 institutions, including Warwick. In 2009 it received 45 citations; the average for the journal was 13. The average for the field is known to Thomson Reuters but it is unlikely that anyone else has the technical capability to work it out. In 2010 the paper was cited 262 times: the average for the journal was 22. In 2011 it was cited 392 times: the average for the journal was 19.
This and similar publications have contributed to an improved performance for Warwick, one that was enhanced by the relatively modest number of total publications by which the normalised citations were divided.
With regard to Nanyang Technological University, it seems that a significant role was played by a few highly cited publications in Chemical Reviews in 2009 and in Nature in 2009 and 2010.
As for the University of Texas at Dallas, my suspicion was that publications by faculty at the University of Texas Southwestern Medical Center had been included, a claim that had been made about the QS rankings a few years ago. Thomson Reuters have, however, denied this and say they have observed unusual behaviour by UT Dallas which they interpret as an improvement in the way that affiliations are recorded. I am not sure exactly what this means but assume that the improvement in the citations score is an artefact of changes in the way data is recorded rather than any change in the number or quality of citations.
There will almost certainly be more of this in the 2013 and 2014 rankings."
Tuesday, July 02, 2013
The Complete Efficiency Rankings
At last. Here are the complete Efficiency Rankings, measuring the efficiency with which universities turn inputs into citations. I am using the method of Professor Dirk van Damme, which is to divide the scores for Citations: Research Influence in the THE World University Rankings by the scores for Research: Volume, Income and Reputation. Here is the method as cited in a previous post:
"The input indicator takes scaled and normalised measures of research income and volume into account, and also considers reputation, while the output indicator looks at citations to institutional papers in Thomson Reuters’ Web of Science database, normalised for subject differences.
Professor van Damme said that the results - which show that university systems outside the Anglo-American elite are able to realise and increase outputs with much lower levels of input - did not surprise him.
“For example, Switzerland really invests in the right types of research. It has a few universities in which it concentrates resources, and they do very well,” he said.
Previous studies have found the UK to have the most efficient research system on measures of citation per researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With efficiency I mean the total research capacity of an institution, including its productivity, divided by its impact. The UK is not doing badly at all, but other countries are doing better, such as Ireland, which has a very low research score but a good citations score,” he said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low efficiency scores for China and South Korea reflected the countries’ problems in translating their huge investment into outputs, he added."
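Before the list, a minimal sketch of the calculation used here, assuming nothing more than the division described above (the THE Citations: Research Influence score divided by the Research: Volume, Income and Reputation score); the function name and the example figures are mine.

```python
def efficiency_score(citations_score, research_score):
    """Van Damme-style efficiency: THE 'Citations: Research Influence' score
    divided by the 'Research: Volume, Income and Reputation' score."""
    return citations_score / research_score

# Made-up illustration: a university with a modest research score but a high
# citations score gets a much better efficiency ratio than a research giant.
print(efficiency_score(90.0, 20.0))  # 4.5
print(efficiency_score(95.0, 90.0))  # ~1.06
```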
1. Tokyo Metropolitan
2. Moscow Engineering Physics Institute
3. Florida Institute of Technology
4. Southern Methodist
5. Hertfordshire
6. Portsmouth
7= King Mongkut's University of Technology
7= Vigo
9. Creighton
10. Fribourg
11. King Abdulaziz
12. University of the Andes
13. Trieste
14. Renmin
15. Medical University of Vienna
16. Polytech University Valencia
17= Bayreuth
18= Montana
19. Mainz
20. Ferrara
21. Drexel
22. Valencia
23. Linz
24. Crete
25. Colorado School of Mines
26. Technical University of Dresden
27. Innsbruck
28. Nurnberg
29= Dauphine
29= Wake Forest
29= Maryland Baltimore County
32. St George's London
33. William and Mary College
34. Hong Kong Baptist
35. Basel
36. Texas San Antonio
37. Duisburg
38. Lyon 1
39. Wurzburg
40. Charles Darwin
41. Wayne State
42. Northeastern
43. Bicocca
44. Royal Holloway
45. Koc
46. Georgia University of Health Science
47. Modena
48. Dundee
49. Southern Denmark
50= IIT Roorkee
50= Pompeu Fabra
52. Graz
53= Oregon
53= Diderot
55. Bielefeld
56. Munster
57. Waikato
58= Grenoble
59= East Anglia
60= Bonn
61= Pavia
62. ENS Lyon
63. Eastern Finland
64. Padua
65. Brandeis
66. Aberystwyth
67. Tulane
68. Tubingen
69= Warsaw
70= Sun Yat Sen
71= Keele
72. Tromso
73. Brunel
74. Liege
75. Queen Mary
76= Vermont
77= Trento
78. Turin
79. Jyvaskyla
80. Carleton
81. Kansas
82. California Riverside
83. SUNY Stony Brook
84= George Washington
85= Pisa
86. Tasmania
87. George Mason
88. Boston College
89= Oregon State
90= Texas Dallas
91. Trinity College Dublin
92= University Science and Technology China
92= Murdoch
92= Cincinnati
92= Galway
92= Yeshiva
97= Tufts
97= Minho
99. Miami
100. Lehigh
101. Technical University Denmark
102= Rice
102= Iceland
104. California Santa Cruz
104= Milan
106. Montpellier 2
107. Frankfurt
108= Bergen
109= Strasbourg
110. Victoria
111. Rochester
112. Cork
113. Dartmouth
114. Oklahoma
115. Birkbeck
116. Porto
117. Canterbury
118= Newcastle UK
118= Notre Dame
118= University College Dublin
121. Binghamton
122. Aveiro
123= Kiel
123= Sussex
125. Temple
126. Aachen
127= Fribourg
127= Queens Belfast
127= Colorado Boulder
130. Iowa State
131. Tokyo Medical Dental
132= Autonomous Madrid
132= Swedish Agriculture
132= Tampere
135= Deakin
135= Barcelona
137= Stockholm
137= Stirling
139. Laval
140. Durham
141. Bangor
142= Aberdeen
142= Vanderbilt
144. Istanbul Technical
145. Nanjing
146= Exeter
146= Emory
146= Leicester
149. Southampton
150. Paris Mines
151. Vrije Universiteit Brussel
152. Polytechnic Milan
153. Kwazulu-Natal
154= Linkoping
154= Bilkent
154= Heriot-Watt
154= Bologna
158= Wyoming
158= Utah
158= Massey
161= Glasgow
161= Bern
163. ENS Paris
164. Zurich
165= Case Western Reserve
166= California Irvine
167= Tartu
168= Wellington
169= Salento
170. South Carolina
171. York UK
172. Aalto
173= Curie
173= Macquarie
173= Boston
176= Delaware
177= Copenhagen
178= Hannover
179. Norwegian University of Science and Technology
180. Antwerp
181= Dalhousie
181= Rensselaer Polytechnic Institute
183= Konstanz
184= Paris Sud
185. Technical University Munich
186. Lancaster
187. Waseda
188. Otago
189. Arizona State
190= SUNY Albany
190= Gottingen
190= Autonomous Barcelona
193= Cape Town
194= St Andrews
195= Colorado State
195= Bath
195= Wollongong
198= Tsukuba
198= Simon Fraser
198= Liverpool
198= Umea
202= Geneva
202= Newcastle Australia
204= Universite Libre Bruxelles
204= Virginia
206= Lausanne
206= Louvain
208= Connecticut
208= Georgetown
208= York Canada
211. EPF Lausanne
212= North Carolina State
212= Bristol
212= Aalborg
212= Free University Amsterdam
216= Indiana
216= Kentucky
218. Maryland College Park
219. Karlsruhe Institute of Technology
220= University Technology Sydney
220= Iowa
222. Charles
223. Flinders
224. Cardiff
225= Auckland
225= Oslo
227. Pittsburgh
228= Heidelberg
228= Guelph
228= Washington State
228= Sheffield
232= Chinese University Hong Kong
232= Strathclyde
234= Ottawa
234= Gothenburg
234= Washington St Louis
237. Medical South Carolina
238= McMaster
238= Brown
238= National Sun Yat Sen
238= Reading
242. Ecole Polytechnique
243. Helsinki
244= Quebec
244= National Central Taiwan
246. Bogazici
247= Southern California
247= Arizona
249. Keio
250= Houston
250= Stellenbosch
250= Kings College London
250= Darmstadt
250= Western Australia
255= Pohang
255= IIT Bombay
257= Wageningen
257= Manitoba
259= South Australia
259= Nagoya
261= Leeds
261= UC Santa Barbara
261= Nijmegen
261= Jagiellon
265= New York University
265= Calgary
265= Ohio State
268. Aarhus
269= Witwatersrand
269= North Carolina Chapel Hill
269= Michigan State
269= Fudan
273= Bochum
273= Munich
275= SUNY Buffalo
275= Adelaide
275= Sapienza
278= Utrecht
278= Edinburgh
278= Queensland University of Technology
281= Lund
281= Ghent
283. Erasmus
284= Massachusetts
284= Illinois Chicago
284= Nottingham
287= Eindhoven
287= Amsterdam
289. UC San Diego
290. Birmingham
291= Western Ontario
291= Twente
293= Washington Seattle
293= Duke
295= Penn State
295= NUI Maynooth
297= Maastricht
297= Groningen
297= Columbia
297= Leiden
297= Georgia
302. UC Davis
303= Southern Florida
303= Chalmers University of Technology
305= Minnesota
305= Essex
305= Manchester
305= Georgia Institute of Technology
309= Rutgers
309= Texas at Austin
311= Northwestern
311= Warwick
311= Vienna
311= MIT
315. Johns Hopkins
316= Wisconsin Madison
316= Carnegie Mellon
318. Alberta
319. Pennsylvania
320= Hong Kong University of Science and Technology
320= Kyushu
322= Chicago
322= Vienna University of Technology
324= Queensland
324= Montreal
326. British Columbia
327= Yale
327= Imperial College London
327= UCLA
327= Hebrew University of Jerusalem
327= Karolinska
332= Melbourne
332= Humboldt
332= National Tsinghua Taiwan
332= Cambridge
332= Harvard
332= Stanford
338= Monash
338= Princeton
338= Caltech
338= Michigan
338= UC Berkeley
338= Cornell
344= Waterloo
344= KTH Sweden
344= Missouri
347. University College London
348= Oxford
348= Middle East Technical University
350. Yonsei
351= Toronto
351= Illinois Urbana-Champaign
351= Peking
351= Leuven
355= Zhejiang
355= Hokkaido
355= Hong Kong Polytechnic University
355= McGill
359= ETH Zurich
359= Tokyo Institute of Technology
361= Berlin
361= Uppsala
363= Korea
363= Sydney
365= Florida
365= New South Wales
367= Australian National
367= Tohoku
367= Purdue
367= Technion
371= Surrey
371= IIT Kharagpur
373= KAIST
373= Texas A and M
375. Virginia Polytechnic Institute
376= Osaka
376= Nanyang Technological University
376= Shanghai Jiao Tong
379. LSE
380. Sungkyunkwan
381. Sharif University of Technology
382. Tokyo
383= National Taiwan University of Science and Technology
383= National Autonomous University of Mexico
385= Kyoto
385= National University of Singapore
387. Loughborough
388. National Cheng Kung
389. Tel Aviv
390= Hong Kong
390= Tsinghua
392. Chinese University of Hong Kong
393. National Taiwan
394. National Chiao Tung
395. Tilburg
396. Delft
397. Seoul National
398. State University Campinas
399. Sao Paulo
400. Moscow State
"The input indicator takes scaled and normalised measures of research income and volume into account, and also considers reputation, while the output indicator looks at citations to institutional papers in Thomson Reuters’ Web of Science database, normalised for subject differences.
Professor van Damme said that the results - which show that university systems outside the Anglo-American elite are able to realise and increase outputs with much lower levels of input - did not surprise him.
“For example, Switzerland really invests in the right types of research. It has a few universities in which it concentrates resources, and they do very well,” he said.
Previous studies have found the UK to have the most efficient research system on measures of citation per researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With efficiency I mean the total research capacity of an institution, including its productivity, divided by its impact. The UK is not doing badly at all, but other countries are doing better, such as Ireland, which has a very low research score but a good citations score,” he said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low efficiency scores for China and South Korea reflected the countries’ problems in translating their huge investment into outputs, he added."
1. Tokyo Metroplitan
2. Moscow Engineering Physics Institute
3. Florida Institute of Technology
4. Southern Methodist
5. Hertfordshire
6. Portsmouth
7= King Mongkut's University of Technology
7= Vigo
9. Creighton
10. Fribourg
11. King Abdulaziz
12. University of the Andes
13. Trieste
14. Renmin
15. Medical University of Vienna
16. Polytech University Valencia
17= Beyreuth
18= Montana
19. Mainz
20. Ferrara
21. Drexel
22. Valencia
23. Linz
24. Crete
25. Colorado School of Mines
26. Technical University of Dresden
27. Innsbruck
28. Nurnberg
29= Dauphine
29= Wake Forest
29= Maryland Baltimore County
32. St George's London
33. William and Mary College
34. Hong Kong Baptist
35. Basel
36. Texas San Antonio
37. Duisberg
38. Lyon 1
39. Wurzburg
40. Charles Darwin
41. Wayne State
42. Northeastern
43. Bicocca
44. Royal Holloway
45. Koc
46. Georgia University of Health Science
47. Modena
48. Dundee
49. Southern Denmark
50= IIT Roorkhee
50= Pompeu Fabra
52. Graz
53= Oregon
53= Diderot
55. Bielfeld
56. Munster
57. Waikato
58= Grenoble
59= East Anglia
60= Bonn
61= Pavia
62. ENS Lyon
63. Eastern Finland
64. Padua
65. Brandeis
66. Aberystwyth
67. Tulane
68. Tubingen
69= Warsaw
70= Sun Yat Sen
71= Keele
72. Tromso
73. Brunel
74. Liege
75. Queen Mary
76= Vermont
77= Trento
78. Turin
79. Jyvaskyla
80. Carleton
81. Kansas
82. California Riverside
83. SUNY Stony Brook
84= George Washington
85= Pisa
86. Tasmania
87. George Mason
88. Boston College
89= Oregon State
90= Texas Dallas
91. Trinity College Dublin
92= University Science and Technology China
92= Murdoch
92= Cinncinati
92= Galway
92= Yeshiva
97= Tufts
97= Minho
99. Miami
100. Lehigh
101. Technical University Denmark
102= Rice
102= Iceland
104. California Santa Cruz
104= Milan
106. Monpellier 2
107. Frankfurt
108= Bergen
109= Strasbourg
110. Victoria
111. Rochester
112. Cork
113. Dartmouth
114. Oklahoma
115. Birkbeck
116. Porto
117. Canterbury
118= Newcastle UK
118= Notre Dame
118= University College Dublin
121. Binghamton
122. Aveiro
123= Kiel
123= Sussex
125. Temple
126. Aachen
127= Fribourg
127= Queens Belfast
127= Colorado Boulder
130. Iowa State
131. Tokyo Medical Dental
132= Autonomous Madrid
132= Swedish Agriculture
132= Tempere
135= Deakin
135= Barcelona
137= Stockholm
137= Stirling
139. Laval
140. Durham
141. Bangor
142= Aberdeen
142= Vanderbilt
144. Istanbul Technical
145. Nanjing
146= Exeter
146= Emory
146= Leicester
149. Southamton
150. Paris Mines
151. Vrije Universiteit Brussel
152. Polytechnic Milan
153. Kwazulu-Natal
154= Linkoping
154= Bilkent
154= Herriot-Watt
154= Bologna
158= Wyoming
158= Utah
158= Massey
161= Glasgow
161= Bern
163. ENS Paris
164. Zurich
165= Case Western Reserve
166= California Irvine
167= Tartu
168= Wellington
169= Salento
170. South Carolina
171. York UK
172. Aalto
173= Curie
173= Macquarie
173= Boston
176= Delaware
177= Copenhagen
178= Hannover
179. Norway University of Science and Technology
180. Antwerp
181= Dalhousie
181= Renselaer Polytechnic Institute
183= Konstanz
184= Paris Sud
185. Technical University Munich
186. Lancaster
187. Waseda
188. Otago
189. Arizona State
190= SUNY Albany
190= Gottingen
190= Autonomous Barcelona
193= Cape Town
194= St Andrews
195= Colorado State
195= Bath
195= Wollongong
198= Tsukuba
198= Simon Fraser
198= Liverpool
198= Umea
202= Geneva
202= Newcastle Australia
204= Universite Libre Bruxelles
204= Virginia
206= Lausanne
206= Louvain
208= Connecticut
208= Georgetown
208= York Canada
211. EPF Lausanne
212= North Carolina State
212= Bristol
212= Aalborg
212= Free University Amsterdam
216= Indiana
216= Kentucky
218. Maryland College Park
219. Karlsruhe Institute technology
220= University Technology Sydney
220= Iowa
222. Charles
223. Flinders
224. Cardiff
225= Auckland
225= Oslo
227. Pittsburgh
228= Heidelberg
228= Guelph
228= Washington State
228= Sheffield
232= Chinese University Hong Kong
232= Strathclyde
234= Ottawa
234= Gotherberg
234= Washington St Louis
237. Medical South Carolina
238= McMaster
238= Brown
238= National Sun Yat Sen
238= Reading
242. Ecole Polytechnique
243. Helsinki
244= Quebec
244= National Central Taiwan
246. Bogazici
247= Southern California
247= Arizona
249. Keio
250= Houston
250= Stellenbosch
250= Kings College London
250= Darmstadt
250= Western Australia
255= Pohang
255= IIT Bombay
257= Wageningen
257= Manitoba
259= South Australia
259= Nagoya
261= Leeds
261= UC Santa Barbara
261= Nijmegen
261= Jagiellon
265= New York University
265= Calgary
265= Ohio State
268. Aarhus
269= Witwatersrand
269= North Carolina Chapel Hill
269= Michigan State
269= Fudan
273= Bochum
273= Munich
275= SUNY Buffalo
275= Adelaide
275= Sapienza
278= Utrecht
278= Edinburgh
278= Queensland University of Technology
281= Lund
281= Ghent
283. Erasmus
284= Massachusetts
284= Illinois Chicago
284= Nottingham
287= Eindhoven
287= Amsterdam
289. UC San Diego
290. Birmingham
291= Western Ontario
291= Twente
293= Washington Seattle
293= Duke
295= Penn State
295= NUI Maynooth
297= Maastricht
297= Groningen
297= Columbia
297= Leiden
297= Georgia
302. UC Davis
303= Southern Florida
303= Chalmers University of Technology
305= Minnesota
305= Essex
305= Manchester
305= Georgia Institute of Technology
309= Rutgers
309= Texas at Austin
311= Northwestern
311= Warwick
311= Vienna
311= MIT
315. Johns Hopkins
316= Wisconsin Madison
316= Carnegie Mellon
318. Alberta
319. Pennsylvania
320= Hong Kong University of Science and Technology
320= Kyushu
322= Chicago
322= Vienna University of Technology
324= Queensland
324= Montreal
326. British Columbia
327= Yale
327= Imperial College London
327= UCLA
327= Hebrew University of Jerusalem
327= Karolinska
332= Melbourne
332= Humboldt
332= National Tsinghua Taiwan
332= Cambridge
332= Harvard
332= Stanford
338= Monash
338= Princeton
338= Caltech
338= Michigan
338= UC Berkeley
338= Cornell
344= Waterloo
344= KHT Sweden
344= Missouri
347. University College London
348= Oxford
348= Middle East Technical University
350. Yonsei
351= Toronto
351= Illinois Urbana Champagne
351= Peking
351= Leuven
355= Zhejiang
355= Hokkaido
355= Hong Kong Polytechnic University
355= McGill
359= ETH Zurich
359= Tokyo Institute of Technology
361= Berlin
361= Uppsala
363= Korea
363= Sydney
365= Florida
365= New South Wales
367= Australian National
367= Tohoku
367= Purdue
367= Technion
371= Surrey
371= IIT Kharagpur
373= KAIST
373= Texas A and M
375. Virginia Polytechnic Institute
376= Osaka
376= Nanyang Technological University
376= Shanghai Jiao Tong
379. LSE
380. Sungkyunkwan
381. Sharif University of Technology
382. Tokyo
383= National Taiwan University of Science and Technology
383= National Autonomous University of Mexico
385= Kyoto
385= National University of Singapore
387. Loughborough
388. National Cheng Kung
389. Tel Aviv
390= Hong Kong
390= Tsinghua
392. Chinese University of Hong Kong
393. National Taiwan
394. National Chiao Tung
395. Tilburg
396. Delft
397. Seoul National
398. State University Campinas
399. Sao Paulo
400. Moscow State
Tuesday, June 25, 2013
What about a Research Influence Ranking?
Keeping up with the current surge of global university rankings is becoming next to impossible. Still, there are a few niches that remain unoccupied. One might be a ranking of universities according to their ability to spread new knowledge around the world. So it might be a good idea to have a Research Influence Ranking based on the citations indicator in the Times Higher Education -- Thomson Reuters World University Rankings.
Thomson Reuters are the world's leading collectors and analysts of citations data, so such an index ought to provide an invaluable data source for governments, corporations and other stakeholders deciding where to place research funding. Data for 400 universities can be found on the THE iPhone/iPad app.
The top place in the world would be jointly held by Rice University in Texas and Moscow State Engineering Physics Institute, closely followed by MIT and the University of California Santa Cruz.
Then there are the first places in various regions and countries. (MEPhI would be first in Europe and Rice in the US and North America.)
Canada
University of Toronto
Latin America
University of the Andes, Colombia
United Kingdom (and Western Europe)
Royal Holloway London
Africa
University of Cape Town
Middle East
Koc University, Turkey
Asia (and Japan)
Tokyo Metropolitan University
ASEAN
King Mongkut's University of Technology, Thailand
Australia and the Pacific
University of Melbourne
On second thoughts, perhaps not such a good idea.
Monday, June 24, 2013
Bad Mood Rising
In 2006 I tried to get an article published in the Education section of the Guardian, that fearless advocate of radical causes and scourge of the establishment, outlining the many flaws and errors in the Times Higher Education Supplement -- Quacquarelli Symonds (as they were then) World University Rankings, especially its "peer review". Unfortunately, I was told that they would be wary of publishing an attack on a direct rival. That was how University Ranking Watch got started.
Since then QS and Times Higher Education have had an unpleasant divorce, with the latter now teaming up with Thomson Reuters. New rankings have appeared, some of them only to rapidly disappear -- there was one from Wuhan and another from Australia but they seem to have vanished. The established rankings are spinning off subsidiary rankings at a bewildering rate.
As the higher education bubble collapses in the West, everything is getting more competitive, including rankings, and everybody -- except ARWU -- seems to be getting rather bad-tempered.
Rankers and academic writers are no longer wary about "taking a pop" at each other. Recently, there has been an acrimonious exchange between Ben Sowter of QS and Simon Marginson of Melbourne University. This has gone so far as to include the claim that QS has used the threat of legal action to try to silence critics.
"[Ben] Sowter [of QS] does not mention that his company has twice threatened publications with legal action when publishing my bona fide criticisms of QS. One was The Australian: in that case QS prevented my criticisms from being aired. The other case was University World News, which refused to pull my remarks from its website when threatened by QS with legal action.
If Sowter and QS would address the points of criticism of their ranking and their infamous star system (best described as 'rent a reputation'), rather than attacking their critics, we might all be able to progress towards better rankings. That is my sole goal in this matter. As long as the QS ranking remains deficient in terms of social science, I will continue to criticise it, and I expect others will also continue to do so."
Meanwhile the Leiter Reports has a letter from "a reader in the UK".
THES DID drop QS for methodological reasons. The best explanation is here: http://www.insidehighered.com/views/2010/03/15/baty
But there may have been more to it? Clearly QS's business practices leave an awful lot to be desired. See: http://www.computerweekly.com/news/1280094547/Quacquarelli-Symonds-pays-80000-for-using-unlicensed-software
Also I understand that the "S" from QS -- Matt Symonds -- walked out on the company due to exasperation with the business practices. He has been airbrushed from QS history, but can be found at: https://twitter.com/SymondsGSB
And as for the reputation survey, there was also this case of blatant manipulation: http://www.insidehighered.com/news/2013/04/08/irish-university-tries-recruit-voters-improve-its-international-ranking
And of course there's the high-pressure sales: http://www.theinternationalstudentrecruiter.com/how-to-become-a-top-500-university/
And the highly lucrative "consultancy" to help universities rise up the rankings: http://www.iu.qs.com/projects-and-services/consulting/
There are "opportunities" for branding -- a snip at just $80,000 -- with QS Showcase: http://qsshowcase.com/main/branding-opportunities/
Or what about some relaxing massage, or a tennis tournament and networking with the staff who compile the rankings: http://www.qsworldclass.com/6thqsworldclass/
Perhaps most disturbing of all is the selling of dubious Star ratings: http://www.nytimes.com/2012/12/31/world/europe/31iht-educlede31.html?pagewanted=all&_r=0
Keep up the good work. It's an excellent blog.
All of this is true, although I cannot get very excited about the use of pirated software, and the bit about relaxing massage is rather petty -- I assume it is something to do with having a conference in Thailand. Incidentally, I don't think anyone from THE sent this since the reader refers to THES (the S for Supplement was removed in 2008).
This is all a long way from the days when journalists refused to take pops at their rivals, even when they knew the rankings were a bit rum.
Sunday, June 23, 2013
Times Higher Education Under 50s Rankings
Times Higher Education has now published its ranking of universities less than fifty years old.
The top five are:
1. Pohang University of Science and Technology
2. EPF Lausanne
3. Korea Advanced Institute of Science and Technology
4. Hong Kong University of Science and Technology
5. University of California, Irvine
They are quite a bit different from the QS young universities rankings. In a while I hope to provide a detailed comparison.
Saturday, June 22, 2013
Citation Cartels
An article by Paul Jump in Times Higher Education describes how Thomson Reuters have been excluding an increasing number of journals from their Journal Citation Reports for "anomalous citation patterns" which now includes not just self-citation but excessive mutual citation.
Surely it is now time for Thomson Reuters to stop counting self-citations for the Research Influence indicator in the THE World University Rankings. The threat of the self-citations of Dr El Naschie "of" Alexandria University has receded but there are others who would have a big impact on the rankings if they ever move to a university with a low volume of publications.
TR may not want to follow QS, who no longer count self-citations in their rankings, but excluding excessive mutual citation as well would put them one up again.
Wednesday, June 12, 2013
Uncanny Insight into Ranker Psychology
I just said that QS would announce its Young University Rankings now that THE has indicated the launch date for its rankings at Wellington College next week.
Actually it was just a few hours.
Anyway, here are the top five.
1. Hong Kong University of Science and Technology
2. Nanyang Technological University
3. Warwick
4. KAIST
5. City University of Hong Kong
Tuesday, June 11, 2013
Prestigious Ranking Watch
Times Higher Education will be launching their Top 100 Under-50 Universities Rankings, which, in case you have forgotten, is prestigious, at Wellington College (which is a school, not a college) in another eight days.
Does anybody want to bet on the QS under-50 rankings appearing in a few days?
Meanwhile, the THE World Rankings will be published at the THE World Academic Summit in Singapore in October. See here. And yes, they are prestigious.
Monday, June 10, 2013
The QS Latin American Rankings
The QS Latin American Rankings show some interesting variations in methodology. The academic survey has a weight of 30%, compared to 40% in the World Rankings, and the employer survey a weight of 20%, compared to 10%.
Instead of 20% for citations per faculty there is 10% for papers per faculty and 10% for citations per paper. Since there are great variations according to the measures used to count research output and influence, as shown by the recent Leiden Ranking, this is very sensible.
Faculty-student ratio is reduced from 20% to 10%, and international students and international faculty are removed. There is now 10% for the proportion of staff with PhDs and 10% for web impact.
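As a rough illustration of how those weights combine, here is a sketch based only on the percentages listed above; the indicator names and scores are my own illustrative inventions, and QS's actual normalisation of indicator scores is not reproduced.

```python
# Weights for the QS Latin American Rankings as described above (sums to 100%).
WEIGHTS = {
    "academic_survey": 0.30,
    "employer_survey": 0.20,
    "papers_per_faculty": 0.10,
    "citations_per_paper": 0.10,
    "faculty_student_ratio": 0.10,
    "staff_with_phd": 0.10,
    "web_impact": 0.10,
}

def overall_score(indicator_scores):
    """Weighted sum of (already normalised) indicator scores out of 100."""
    return sum(WEIGHTS[name] * score for name, score in indicator_scores.items())

# Made-up indicator scores for a hypothetical university
example = {
    "academic_survey": 95, "employer_survey": 88, "papers_per_faculty": 70,
    "citations_per_paper": 60, "faculty_student_ratio": 80,
    "staff_with_phd": 90, "web_impact": 85,
}
print(round(overall_score(example), 1))  # 84.6
```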
Here are the top five.
1. Universidade de Sao Paulo
2. Pontificia Universida Catolica de Chile
3. Universidade Estadual de Campinas
4. Universidad de Los Andes Colombia
5. Universidad de Chile
Sunday, June 02, 2013
The Global Gender Index
Times Higher Education has just published an article on the Global Gender Index produced in collaboration with Thomson Reuters. This consists of calculating the percentage of female academics among those universities included in the top 400 of the Times Higher Education World University Rankings and producing a percentage for each country.
This is a rather dubious exercise. The data is reported by the institutions themselves and, as the UK HE International Unit has pointed out, such data may not always be reliable. In addition, the universities that volunteer to be ranked by THE and Thomson Reuters may not necessarily be representative of the higher education sector in general. The global, research-orientated universities that make it into the top 400 may be even less so.
The report finds that everywhere women make up less than half of the academic workforce and that the numbers are lowest in Japan, followed by Taiwan. Numbers are nearly equal in Turkey.
Predictably, the article includes a call for universities to be ranked according to how far they have achieved gender equity among academic staff and a suggestion that East Asian countries should learn from Turkey.
There is a question that needs to be considered. If universities in countries like Taiwan and Japan are poised to overtake the West, as QS and THE are constantly warning us, should we be so eager to conclude that they have something, or indeed anything, to learn from Turkey or from Northern Europe?
Wednesday, May 29, 2013
Here is the full text of my article, "The QS university rankings by subject: Warning needed", published in the Philippine Daily Inquirer on May 27, 2013.
It is time for the Philippines to think about constructing its own objective and transparent ranking or rating systems for its colleges and universities that would learn from the mistakes of the international rankers.
The ranking of universities is getting to be big business these days. There are quite a few rankings appearing from Scimago, Webometrics, University Ranking of Academic Performance (from Turkey), the Taiwan Rankings, plus many national rankings.
No doubt there will be more to come.
In addition, the big three of the ranking world—Quacquarelli Symonds (QS), Times Higher Education and Shanghai Jiao Tong University’s Academic Ranking of World Universities—are now producing a whole range of supplementary products, regional rankings, new university rankings, reputation rankings and subject rankings.
There is nothing wrong, in principle, with ranking universities. Indeed, it might be in some ways a necessity. The problem is that there are very serious problems with the rankings produced by QS, even though they seem to be better known in Southeast Asia than any of the others.
This is especially true of the subject rankings.
No new data
The QS subject rankings, which have just been released, do not contain new data. They are mostly based on data collected for last year’s World University Rankings—in some cases extracted from the rankings and, in others, recombined or recalculated.
There are four indicators used in these rankings. They are weighted differently for the different subjects and, in two subjects, only two of the indicators are used.
The four indicators are:
A survey of academics or people who claim to be academics or used to be academics, taken from a variety of sources. This is the same indicator used in the world rankings. Respondents were asked to name the best universities for research.
A survey of employers, which seems to comprise anyone who chooses to describe himself or herself as an employer or a recruiter.
The number of citations per paper. This is a change from the world rankings, where the calculation was citations per faculty.
H-index. This is easier to illustrate than to define: it is the largest number h such that a university has published at least h papers that have each been cited at least h times. If a university publishes one paper and that paper is cited once, its index is one; if at least two of its papers are cited at least twice each, the index is two, and so on (see the code sketch below). It is a way of combining the quantity of research with its quality, as measured by influence on other researchers.
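For anyone who prefers code to definitions, here is a minimal sketch of the h-index calculation; it is my own illustration, not QS's implementation.

def h_index(citation_counts):
    """Largest h such that at least h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

print(h_index([1]))           # one paper cited once -> 1
print(h_index([2, 2]))        # two papers cited twice each -> 2
print(h_index([5, 3, 2, 1]))  # -> 2: three papers with at least 3 citations would be needed for h = 3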
Out of these four indicators, three are about research and one is about the employability of a university’s graduates.
These rankings are not at all suitable for use by students wondering where they should go to study, whether at undergraduate or graduate level.
The only part that could be of any use is the employer review and that has a weight ranging from 40 percent for accounting and politics to 10 percent for arts and social science subjects, like history and sociology.
But even if the rankings are to be used just to evaluate the quantity or quality of research, they are frankly of little use. They are dominated by the survey of academic opinion, which is not of professional quality.
There are several ways in which people can take part in the survey. They can be nominated by a university, they can sign up themselves, they can be recommended by a previous respondent or they can be asked because they have subscribed to an academic journal or an online database.
Apart from checking that they have a valid academic e-mail address, it is not clear whether QS makes any attempt to check whether the survey respondents are really qualified to make any judgements about research.
Not plausible
The result is that the academic survey and also the employer survey have produced results that do not appear plausible.
In recent years, there have been some odd results from QS surveys. My personal favorite is the New York University Tisch School of the Arts, which set up a branch in Singapore in 2007 and graduated its first batch of students from a three-year Film course in 2010. In the QS Asian University Rankings of that year, the Singapore branch got zero for the other criteria (presumably the school did not submit data) but it was ranked 149th in Asia for academic reputation and 114th for employer reputation.
Not bad for a school that had yet to produce any graduates when the survey was taken early in the year.
In all of the subject rankings this year, the two surveys account for at least half of the total weighting and, in two cases, Languages and English, all of it.
Consequently, while some of the results for some subjects may be quite reasonable for the world top 50 or the top 100, after that they are sometimes downright bizarre.
The problem is that although QS has a lot of respondents worldwide, when it gets down to the subject level there can be very few. In pharmacy, for example, there are only 672 for the academic survey and in materials science 146 for the employer survey. Since the leading global players will get a large share of the responses, this means that universities further down the list will be getting a handful of responses for the survey. The result is that the order of universities in any subject in a single country like the Philippines can be decided by just one or two responses to the surveys.
Another problem is that, after a few obvious choices like Harvard, MIT (Massachusetts Institute of Technology), Tokyo, most respondents probably rely on a university’s general reputation and that can lead to all sorts of distortions.
Many of the subject rankings at the country level are quite strange. Sometimes they even include universities that do not offer courses in that subject. We have already seen that there are universities in the Philippines that are ranked for subjects that they do not teach.
Somebody might say that maybe they are doing research in a subject while teaching in a department with a different name, such as an economic historian teaching in the economics department but publishing in history journals and getting picked up by the academic survey for history.
Maybe, but it would not be a good idea for someone who wants to study history to apply to that particular university.
Another example is from Saudi Arabia, where King Fahd University of Petroleum and Minerals was apparently top for history, even though it does not have a history department or indeed anything where you might expect to find a historian. There are several universities in Saudi Arabia that may not teach history very well but at least they do actually teach it.
These subject rankings may have a modest utility for students who can pick or choose among top global universities and need some idea whether they should study engineering at SUNY (State University of New York) Buffalo (New York) or Leicester (United Kingdom) or linguistics at Birmingham or Michigan.
But they are of very little use for anyone else.
Tuesday, May 28, 2013
The QS university rankings by subject: Warning needed
My article in the Philippine Daily Inquirer can be accessed here.
Sunday, May 26, 2013
The QS Subject Rankings: Not Everybody is Impressed
The subject rankings just released by QS seem to be a shrewd marketing move. Dozens of universities around the world have learnt that they have been ranked for something by the renowned and revered QS, which will look good in their promotional literature.
Some people are not impressed. Brian Leiter, the law scholar and philosopher, asks whether they are a fraud on the public. See here for his answer.
Why are they so worried?
The UK HE International Unit represents the views of the British university sector and is cooperating with Times Higher Education and Thomson Reuters in the organising of this week's Global University Summit in London, a Prestigious Event in a Spectacular Setting.
It has just issued a policy statement about the slowly emerging U-Multirank project, which is also discussed by David Jobbins in University World News.
The Unit has expressed a number of concerns. These include the overcrowding of the league table market, reliance on self-reported data which lack validity, the combining of incommensurate variables to create a league table, the risk of "becoming a blunt instrument that would not allow different strengths across an institution to be recognised" and diverting EU funds from other priorities. It claims that U-Multirank may "harm rather than benefit the sector."
It is difficult to see why the International Unit is getting so concerned. I agree that self-reported data may lack validity, but the QS and Times Higher Education global rankings also include such data. The combining of incommensurate variables is the essence of ranking. Sometimes blunt instruments are appropriate. A scalpel is of little use for hammering nails, and a tool like U-Multirank may have uses which existing rankings do not.
As for the 2 million Euros, this is trivial compared with some of the things the EU has been wasting money on in recent years.
Saturday, May 25, 2013
The Efficiency Rankings
Times Higher Education has a story about a study by Dirk Van Damme, head of the Centre for Educational Research and Innovation at the OECD. This will be presented at the Global University Summit held in Whitehall, London from the 28th to the 30th May.
The Summit "brings an invitation-only audience of leaders from the world’s foremost universities, senior policy-makers and international business executives to London in 2013." It is a "prestigious event" held in a "spectacular setting" and is sponsored by the University of Warwick, Times Higher Education, Thomson Reuters and UK Universities International Unit. Speakers include Vince Cable, Boris Johnson, the Russian ambassador and heads of various universities from around the world.
What Professor Van Damme has done is to treat the THE World University Rankings Research Indicator scores as an input and the Research Influence (Citations) scores as an output. The output scores are divided by the input scores and the result is a measure of the efficiency with which the inputs are turned into citations, which, as we all know, is the main function of the modern university.
According to THE:
"The input indicator takes scaled and normalised measures of research income and volume into account, and also considers reputation, while the output indicator looks at citations to institutional papers in Thomson Reuters’ Web of Science database, normalised for subject differences.
Professor van Damme said that the results - which show that university systems outside the Anglo-American elite are able to realise and increase outputs with much lower levels of input - did not surprise him.
“For example, Switzerland really invests in the right types of research. It has a few universities in which it concentrates resources, and they do very well,” he said.
Previous studies have found the UK to have the most efficient research system on measures of citation per researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With efficiency I mean the total research capacity of an institution, including its productivity, divided by its impact. The UK is not doing badly at all, but other countries are doing better, such as Ireland, which has a very low research score but a good citations score,” he said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low efficiency scores for China and South Korea reflected the countries’ problems in translating their huge investment into outputs, he added."
One hesitates to be negative about a paper presented at a prestigious event in a spectacular setting to an invitation only audience but this is frankly rather silly.
I would accept that income can be regarded as an input but surely not reputation and surely not volume of publications. Also, unless Van Damme's methodology has undisclosed refinements he is treating research scores as having the same value regardless of whether they are composed mainly of scores for reputation or for number of publications or for research income.
Then there is the time period concerned. Research income is income for one year; publications are drawn from a five-year period. These are then compared with citations over a six-year period. So the paper is asking how research income for 2010 produced citations in the years 2006-2011 of papers published in the years 2006-2010. A university is certainly being remarkably efficient if its 2010 income is producing citations in 2006, 2007, 2008 and 2009.
Turning to the citations side of the equation, it should be recalled that the THE citations indicator includes an adjustment by which the citation impact score for universities is divided by the square root of the citation impact score for the country as a whole. In other words a university located in a country where papers are not cited very much gets a big boost and the lower the national citation impact score the bigger the boost for the university. This is why Hong Kong universities suffered reduced scores when Thomson Reuters took them out of China when counting citations and put them in their own separate category.
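In arithmetic terms the adjustment looks something like the sketch below; the numbers are invented and this is only my reading of the published description, not Thomson Reuters' actual code.

import math

def adjusted_impact(university_impact, country_impact):
    """Divide a university's citation impact by the square root of the
    national average citation impact, as described above."""
    return university_impact / math.sqrt(country_impact)

print(adjusted_impact(1.0, 0.25))  # 2.0: a big boost in a weakly cited country
print(adjusted_impact(1.0, 1.00))  # 1.0: no boost where the national average is 1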
So, it is not surprising that universities from outside the Anglo-Saxon elite do well for citations and thus appear to be very efficient. Thomson Reuters methodology gives such universities a very substantial weighting just for being located in countries that are less productive in terms of citations.
None of this is new. In 2010 Van Damme did something similar at a seminar in London.
Van Damme is just analysing the top 200 universities in the THE rankings. It would surely be more interesting to analyse the top 400 whose scores are obtainable from an iPad/iPhone app.
So here are the top ten universities in the world according to the efficiency with which they turn income, reputation and publications into citations. The procedure, sketched in code after the lists below, is simply to divide the citations score from the 2012 THE rankings by the research indicator score.
1. Tokyo Metropolitan University
2. Moscow State Engineering Physics Institute
3. Florida Institute of Technology
4. Southern Methodist University
5. University of Hertfordshire
6. University of Portsmouth
7. King Mongkut's University of Technology
8. Vigo University
9. Creighton University
10. Fribourg University
No doubt the good and the great of the academic world assembled in Whitehall will make a trip to Portsmouth or even to Vigo or Creighton if they can find them on the map.
And now for the hall of shame. Here are the bottom ten of the THE top 400, ranked according to efficiency as measured by citations indicator scores divided by research scores. The heads of these failing institutions will no doubt be packing their bags and looking for jobs as junior administrative assistants at technical colleges in Siberia or the upper Amazon.
391. Tsinghua University
392. Chinese University of Hong Kong
393. National Taiwan University
394. National Chiao Tung University
395. Tilburg University
396. Delft University of Technology
397. Seoul National University
398. State University of Campinas
399. Sao Paulo University
400. Lomonosov Moscow State University
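The lists above come from nothing more elaborate than dividing one indicator score by another and sorting. A minimal sketch, with invented scores rather than the real THE data:

# Efficiency as citations indicator score divided by research indicator
# score, then ranked; the universities and scores here are placeholders.
scores = {
    "University A": {"citations": 98.0, "research": 11.0},
    "University B": {"citations": 45.0, "research": 82.0},
}

efficiency = {name: s["citations"] / s["research"] for name, s in scores.items()}

for name, value in sorted(efficiency.items(), key=lambda item: item[1], reverse=True):
    print(f"{name}: {value:.2f}")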
In a little while I hope to publish the full 400 after I have finished being sarcastic about the QS subject rankings.
Friday, May 24, 2013
Update on IREG Approval
- The International Ranking Experts Group has also given its approval to the national ranking produced by the Perspektywy Education Foundation of Poland.
- The approval given to the QS World, Asian and Latin American Rankings does not apply to the QS Stars.
Saturday, May 18, 2013
The First IREG Audit
QS is the first ranking organisation to get the seal of approval from the International Ranking Experts Group (IREG) for its World, Asian and Latin American rankings.
The IREG audit process would appear on the surface to be quite rigorous. Take a look at the audit manual. There are a number of criteria, some of which sound quite daunting but are not really so. For example, Criterion 8 says:
"If rankings are using composite indicators the weights of the individual indicators have to be published. Changes in weights over time should be limited and due to methodological or conception-related considerations."
Fair enough, but there is nothing about how weighting should be distributed across the indicators in the first place. Forty per cent for the academic survey in the QS rankings?
Some criteria are obvious, such as providing a contact address. Others, such as "organisational measures that enhance the credibility of rankings", are so vague that they mean very little.
The basic principle of the audit is that ranking organisations are given scores ranging from 1 (not sufficient/not applied) to 6 (distinguished) on the various criteria, with a double weighting for core criteria. The maximum score is 180 and, according to the manual:
"On the bases of the assessment scale described
above, the threshold for a positive audit decision will
be 60 per cent of the maximum total score. This
means the average score on the individual criteria
has to be slightly higher than “adequate”. In order
to establish the IREG Ranking Audit as a quality
label none of the core criteria must be assessed
with a score lower than three."
So a positive result could mean that an organisation is distinguished in everything. It could also mean that it is on average slightly higher than adequate. It would be interesting to know which applies to QS.
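As a rough illustration of how that scoring works, consider the sketch below. The criterion names and scores are invented; only the 1-6 scale, the double weighting of core criteria, the 60 per cent threshold and the minimum of three on core criteria come from the manual.

def audit_decision(scores, core):
    """Weighted total with core criteria counted double; a positive decision
    needs 60% of the maximum and no core criterion below three."""
    weight = lambda criterion: 2 if criterion in core else 1
    total = sum(weight(c) * s for c, s in scores.items())
    maximum = 6 * sum(weight(c) for c in scores)
    return total >= 0.6 * maximum and all(scores[c] >= 3 for c in core)

# A uniform score of four passes comfortably: 4/6 of the maximum is about 67%,
# above the 60% threshold, without a single "distinguished" score.
example = {"transparency": 4, "methodology": 4, "publication": 4}
print(audit_decision(example, core={"methodology"}))  # True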
I do not know whether the auditors had any criticisms to make. If not it is difficult to see the point of the exercise. If they did it would be nice to know what they were.
QS are to be commended for submitting to the audit, even though it probably was not very searching. Still, it seems that the ranking world needs more and better monitoring and observation.
Wednesday, May 15, 2013
QS Rankings by Subject
QS have produced their annual subject rankings. At the top there are no real surprises and, while there is certainly room for argument, I do not think that anyone will be shocked by the top ten or twenty in each subject.
The university with the most number one positions is Harvard, with ten:
Medicine
Biology
Psychology
Pharmacy and Pharmacology
Earth and Marine Sciences
Politics and International Studies
Law
Economics and Econometrics
Accounting and Finance
Education
MIT has seven:
Computer Science
Chemical Engineering
Electrical Engineering
Mechanical Engineering
Physics and Astronomy
Chemistry
Materials Science
Then there is Berkeley with exactly the four you would expect:
Environmental Science
Statistics and Operational Research
Sociology
Communication and Media Studies
Oxford has three:
Philosophy
Modern Languages
Geography
Cambridge another three:
History
Linguistics
Mathematics
Imperial College London is top for Civil Engineering and University of California, Davis for Agriculture and Forestry.
These rankings are based on the academic opinion survey, the employer survey, citations per paper and the h-index, a measure of both output and influence that limits the effect of outliers, in proportions that vary from subject to subject. They are very research-focused, which is unfortunate since there seems to be a consensus emerging at conferences and seminars that the THE-TR rankings are for policy makers, the Shanghai ARWU for researchers and the QS rankings for undergraduate students.
Outside the top fifty or top one hundred there are some oddities resulting from the small number of responses. I will leave it to specialists to find them.