
Tuesday, March 04, 2014

Reactions to the QS Subject Rankings

It looks as though the QS subject rankings are a big hit. Here is just a sample of headlines and quotations from around the world.

World Ranking Recognises Agricultural Excellence at Lincoln [New Zealand]

CEU [Central European University, Hungary] Programs Rank Among the World's Top 100

Boston-Area Schools Rank Top in the World in These 5 Fields

"Cardiff has been ranked as one of the top universities in the world in a number of different subjects, according to a recent international league table."

NTU [National Taiwan University] leads local universities making QS rankings list

Swansea University continues to excel in QS world subject rankings

Penn State Programs Rank Well in 2014 QS World Rankings by Subject

Anna Varsity [India] Enters Top 250 in QS World Univ Rankings

Moscow State University among 200 best in the world

New Ranking Says Harvard And MIT Are The Best American Universities For 80% of Academic Subjects

QS: The University of Porto ranked among the best in the world

4 Indian Institutions in 2014 World Ranking

"The Institute of Education [London] has been ranked as the world's leading university for Education in the 2014 QS World University Rankings."

Nine UvA [University of Amsterdam] subject areas listed in QS World University Rankings top 50

"The University of Newcastle's [Australia] Civil and Structural Engineering discipline has surged in the QS World University Rankings by Subject list"


Sunday, March 02, 2014

The QS Subject Rankings: Reposting

QS have come out with their 2014 University Rankings by Subject, three months earlier than last year. Maybe this is to get ahead of Times Higher Education, whose latest Reputation Rankings will be published next week.

The methodology of these rankings has not changed since last year, so I am just reposting my article, which was first published in the Philippine Daily Inquirer on 27 May 2013 and then reposted here on 29 May 2013.



The QS University Rankings by Subject: Warning 

It is time for the Philippines to think about constructing its own objective and transparent ranking or rating systems for its colleges and universities that would learn from the mistakes of the international rankers.

The ranking of universities is getting to be big business these days. There are quite a few rankings appearing from Scimago, Webometrics, University Ranking of Academic Performance (from Turkey), the Taiwan Rankings, plus many national rankings.

No doubt there will be more to come.

In addition, the big three of the ranking world—Quacquarelli Symonds (QS), Times Higher Education and Shanghai Jiao Tong University’s Academic Ranking of World Universities—are now producing a whole range of supplementary products, regional rankings, new university rankings, reputation rankings and subject rankings.

There is nothing wrong, in principle, with ranking universities. Indeed, it might be in some ways a necessity. The problem is that there are very serious problems with the rankings produced by QS, even though they seem to be better known in Southeast Asia than any of the others.
This is especially true of the subject rankings.

No new data

The QS subject rankings, which have just been released, do not contain new data. They are mostly based on data collected for last year’s World University Rankings—in some cases extracted from the rankings and, in others, recombined or recalculated.

There are four indicators used in these rankings. They are weighted differently for the different subjects and, in two subjects, only two of the indicators are used.

The four indicators are:
A survey of academics, or of people who claim to be academics or used to be academics, drawn from a variety of sources. This is the same indicator used in the world rankings. Respondents were asked to name the best universities for research.
A survey of employers, which seems to comprise anyone who chooses to describe himself or herself as an employer or a recruiter.
The number of citations per paper. This is a change from the world rankings, where the calculation was citations per faculty member.
H-index. This is easier to illustrate than to define. If a university publishes one paper and that paper is cited once, it gets an index of one. If it publishes two or more papers and at least two of them are cited at least twice each, the index is two, and so on. It is a way of combining the quantity of research with its quality, as measured by influence on other researchers (a minimal sketch of the calculation follows this list).
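For readers who want to see the arithmetic, here is a minimal sketch of how an h-index can be computed from a list of per-paper citation counts. The function and the example data are my own illustration, not QS's actual code or data.

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h papers
    have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with these citation counts give an h-index of 3.
print(h_index([10, 4, 3, 1, 0]))  # -> 3
```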

Out of these four indicators, three are about research and one is about the employability of a university’s graduates.

These rankings are not at all suitable for use by students wondering where they should go to study, whether at undergraduate or graduate level.

The only part that could be of any use is the employer review and that has a weight ranging from 40 percent for accounting and politics to 10 percent for arts and social science subjects, like history and sociology.

But even if the rankings are to be used just to evaluate the quantity or quality of research, they are frankly of little use. They are dominated by the survey of academic opinion, which is not of professional quality.

There are several ways in which people can take part in the survey. They can be nominated by a university, they can sign up themselves, they can be recommended by a previous respondent or they can be asked because they have subscribed to an academic journal or an online database.

Apart from checking that they have a valid academic e-mail address, it is not clear whether QS makes any attempt to check whether the survey respondents are really qualified to make any judgements about research.

Not plausible

The result is that the academic survey and also the employer survey have produced results that do not appear plausible.

In recent years, there have been some odd results from QS surveys. My personal favorite is the New York University Tisch School of the Arts, which set up a branch in Singapore in 2007 and graduated its first batch of students from a three-year Film course in 2010. In the QS Asian University Rankings of that year, the Singapore branch got zero for the other criteria (presumably the school did not submit data) but it was ranked 149th in Asia for academic reputation and 114th for employer reputation.

Not bad for a school that had yet to produce any graduates when the survey was taken early in the year.

In all of the subject rankings this year, the two surveys account for at least half of the total weighting and, in two cases, Languages and English, all of it.

Consequently, while some of the results for some subjects may be quite reasonable for the world top 50 or the top 100, after that they are sometimes downright bizarre.

The problem is that although QS has a lot of respondents worldwide, when it gets down to the subject level there can be very few. In pharmacy, for example, there are only 672 for the academic survey and in materials science 146 for the employer survey. Since the leading global players will get a large share of the responses, this means that universities further down the list will be getting a handful of responses for the survey. The result is that the order of universities in any subject in a single country like the Philippines can be decided by just one or two responses to the surveys.

Another problem is that, after a few obvious choices like Harvard, MIT (Massachusetts Institute of Technology) and Tokyo, most respondents probably rely on a university’s general reputation, and that can lead to all sorts of distortions.

Many of the subject rankings at the country level are quite strange. Sometimes they even include universities that do not offer courses in that subject. We have already seen that there are universities in the Philippines that are ranked for subjects that they do not teach.

Somebody might say that maybe they are doing research in a subject while teaching in a department with a different name, such as an economic historian teaching in the economics department but publishing in history journals and getting picked up by the academic survey for history.

Maybe, but it would not be a good idea for someone who wants to study history to apply to that particular university.

Another example is from Saudi Arabia, where King Fahd University of Petroleum and Minerals was apparently top for history, even though it does not have a history department or indeed anything where you might expect to find a historian. There are several universities in Saudi Arabia that may not teach history very well but at least they do actually teach it.

These subject rankings may have a modest utility for students who can pick or choose among top global universities and need some idea whether they should study engineering at SUNY (State University of New York) Buffalo (New York) or Leicester (United Kingdom) or linguistics at Birmingham or Michigan.

But they are of very little use for anyone else.

Tuesday, February 18, 2014

The New Webometrics Rankings

The latest Webometrics rankings are out.

In the overall rankings the top five are:

1.  Harvard
2.  MIT
3.  Stanford
4.  Cornell
5.  Columbia.

Looking at the indicators one by one, the top five for presence (number of webpages in the main webdomain) are:

1.  Karolinska Institute
2.  National Taiwan University
3.  Harvard
4.  University of California San Francisco
5.  PRES Universite de Bordeaux.

The top five for impact (number of external inlinks received from third parties) are:

1.  University of California Berkeley
2.  MIT
3.  Harvard
4.  Stanford
5.  Cornell.

The top five for openness (number of rich files published in dedicated websites) are:

1.  University of California San Francisco
2.  Cornell
3.  Pennsylvania State University
4.  University of Kentucky
5.  University of Hong Kong.

The top five for excellence (number of papers in the 10% most cited category) are:

1.  Harvard
2.  Johns Hopkins
3.  Stanford
4.  UCLA
5.  Michigan

Thursday, February 06, 2014

The Best Universities for Research

It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings start appearing in August. We have had young university rankings, best student cities, most international universities and BRICS rankings.

Something is missing though: a ranking of top universities for research. So, to assuage the pent-up demand, here are the top 20 universities for research according to six different ranking indicators. There is considerable variation, with only two universities, Harvard and Stanford, appearing in every list.

First, the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five-year period.

1.   Harvard
2.   Tokyo
3.   Toronto
4.   Tsinghua
5.   Sao Paulo
6.   Michigan Ann Arbor
7.   Johns Hopkins
8.   UCLA
9.   Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia

Next we have the normalized impact scores from Scimago, which measure citations to research publications while taking the field into account. This might be considered a measure of the quality of research rather than the quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers.
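As a rough illustration of field normalization: a score of this kind compares each paper's citations with the world average for its field and then averages the ratios. The field labels and baseline values below are invented for the example; they are not Scimago's actual reference values.

```python
# Minimal sketch of a field-normalized citation impact score.
# Each paper carries its citation count and its field; the score is the
# mean ratio of actual citations to the (hypothetical) world average
# for that field.
world_average = {"physics": 8.0, "history": 2.0}  # invented baselines

papers = [
    {"field": "physics", "citations": 16},  # twice the field average
    {"field": "history", "citations": 1},   # half the field average
]

ratios = [p["citations"] / world_average[p["field"]] for p in papers]
impact = sum(ratios) / len(ratios)
print(impact)  # -> 1.25, i.e. 25 percent above the world average
```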

1.   MIT
2.   Harvard
3.   University of California San Francisco
4=  Stanford
4=  Princeton
6.   Duke
7.   Rice
8.   Chicago
9=  Columbia
9=  University of California Berkeley
9=  University of California Santa Cruz
12.  University Of California Santa Barbara
13.  Boston University
14= Johns Hopkins
14= University of Pennsylvania
16.  University of California San Diego
17= UCLA
17= University of Washington
17= Washington University of St Louis
20.  Oxford

The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field, so medical schools and technological institutes can do very well.

1.   Weizmann Institute of Science
2.   Caltech
3.   Rockefeller University
4.   Harvard
5.   Stanford
6.   Gwangju Institute of Science and Technology
7.   UCLA
8.   University of California San Francisco
9.   Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell

The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low impact scores. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, providing that total publications do not rise too much.
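THE and Thomson Reuters do not spell out the regional modification formula here, so the sketch below is only an assumption for illustration: it divides each university's normalized impact by the square root of its country's average impact, which is one way such an adjustment could boost universities in low-impact countries.

```python
import math

# Hypothetical "regional modification": divide a university's normalized
# citation impact by the square root of its country's average impact.
# The formula and the numbers are assumptions for illustration only.
country_average = {"US": 1.4, "RU": 0.5}

def modified_impact(raw_impact, country):
    return raw_impact / math.sqrt(country_average[country])

print(round(modified_impact(1.4, "US"), 2))  # -> 1.18
print(round(modified_impact(1.4, "RU"), 2))  # -> 1.98, same raw impact, bigger boost
```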

1=  MIT
1=  Tokyo Metropolitan University
3=  University of California Santa Cruz
3=  Rice
5.   Caltech
6.   Princeton
7.   University of California Santa Barbara
8.   University of California Berkeley
9=  Harvard
9=  Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14.  University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19.  Washington University of St Louis
20.  Boston College

The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year so this indicator could become less stable.

1.   Harvard
2.   Stanford
3.   MIT
4.   University of California Berkeley
5.   Princeton
6.   Michigan Ann Arbor
7.   University of California San Diego
8.   Yale
9.   University of Pennsylvania
10.   UCLA
11=  Caltech
11=  Columbia
13.   University of Washington
14.   Cornell
15.   Cambridge
16.   University of California San Francisco
17.   Chicago
18.   University of Wisconsin Madison
19.   University of Minnesota Twin Cities
20.   Oxford


Finally, the MNCS indicator from the Leiden Ranking, which is the average number of field-normalized citations per paper. A few widely cited papers in the right discipline can have a disproportionate effect: the high placing for Gottingen results from a single computer science paper whose citation is required for intellectual property reasons.

1.    MIT
2.    Gottingen
3.    Princeton
4.    Caltech
5.    Stanford
6.    Rice
7.    University of California Santa Barbara
8.    University of California Berkeley
9.    Harvard
10.  University of California Santa Cruz
11.  EPF Lausanne
12.  Yale
13.  University of California San Francisco
14.  Chicago
15.  University of California San Diego
16.  Northwestern
17.  University of Colorado Boulder
18.  Columbia
19.  University of Texas Austin
20.  UCLA




Friday, October 04, 2013

MIT and TMU are the most influential research universities in the world

I hope to comment extensively on the new Times Higher Education - Thomson Reuters rankings in a while but for the moment here is a comment on the citations indicator.

Last year Times Higher Education and Thomson Reuters solemnly informed the world that the two most influential places for research were Rice University in Texas and the Moscow State Engineering Physics Institute (MEPhI).

Now, the top two for Citations: research influence are MIT, which sounds a bit more sensible than Rice, and Tokyo Metropolitan University. Rice has slipped very slightly and MEPhI has disappeared from the general rankings because it was realised that it is a single-subject institution. I wonder how they worked that out.

That may be a bit unfair. What about that paper on opposition politics in central Russia in the 1920s?

Tokyo Metropolitan University's success at first seems rather odd because it also has a very low score for Research, which probably means that it has a poor reputation for research, does not receive much funding, has few graduate students and/or publishes few papers. So how could its research be so influential?

The answer is that it was one of scores of contributors to a couple of multi-authored publications on particle physics and a handful of widely cited papers in genetics and also produced few papers overall. I will let Thomson Reuters explain how that makes it into a pocket or a mountain of excellence.

Tuesday, September 10, 2013

The QS Rankings

The QS World University Rankings top 200 have just been published. The top 800 will be released later today.

The top ten are:

1.  MIT
2.  Harvard
3.  Cambridge
4.  University College London
5.  Imperial College London
6.  Oxford
7.  Stanford
8. Yale
9. Chicago
10. Caltech

Wednesday, August 14, 2013

The Webometrics Rankings

The July 2013 Webometrics rankings have just been published. The top five are:

1.  Harvard
2.  MIT
3.  Stanford
4.  UC Berkeley
5.  UCLA

In first place in various regions are:

Latin America: Sao Paulo
Europe: Oxford
Asia: National University of Singapore
Africa: Kwazulu Natal
Arab World: King Saud University
Oceania: Australian National University
Caribbean: University of Puerto Rico Mayaguez
Middle East: Tel Aviv
South Asia: IIT Bombay
Eastern and Central Europe: Lomonosov Moscow State University


Tuesday, July 02, 2013

The Complete Efficiency Rankings

At last. Here are the complete Efficiency Rankings, measuring the efficiency with which universities turn inputs into citations. I am using the method of Professor Dirk van Damme, which is to divide the scores for Citations: Research Influence in the THE World University Rankings by the scores for Research: Volume, Income and Reputation. Here is the method as cited in a previous post:

"The input indicator takes scaled and normalised measures of research income and volume into account, and also considers reputation, while the output indicator looks at citations to institutional papers in Thomson Reuters’ Web of Science database, normalised for subject differences.
Professor van Damme said that the results - which show that university systems outside the Anglo-American elite are able to realise and increase outputs with much lower levels of input - did not surprise him.
“For example, Switzerland really invests in the right types of research. It has a few universities in which it concentrates resources, and they do very well,” he said.
Previous studies have found the UK to have the most efficient research system on measures of citation per researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With efficiency I mean the total research capacity of an institution, including its productivity, divided by its impact. The UK is not doing badly at all, but other countries are doing better, such as Ireland, which has a very low research score but a good citations score,” he said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low efficiency scores for China and South Korea reflected the countries’ problems in translating their huge investment into outputs, he added."
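Here is a minimal sketch of the calculation described above: divide each university's Citations (research influence) score by its Research (volume, income and reputation) score. The university names and scores are invented for illustration, not the actual THE data.

```python
# Efficiency = Citations (research influence) score / Research score,
# following the van Damme approach described above.
universities = {
    "University A": {"citations": 90.0, "research": 30.0},
    "University B": {"citations": 80.0, "research": 80.0},
}

efficiency = {
    name: scores["citations"] / scores["research"]
    for name, scores in universities.items()
}

for name, ratio in sorted(efficiency.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {ratio:.2f}")
# University A: 3.00
# University B: 1.00
```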




1.  Tokyo Metropolitan
2.  Moscow Engineering Physics Institute
3.  Florida Institute of Technology
4.  Southern Methodist
5.  Hertfordshire
6.  Portsmouth
7= King Mongkut's University of Technology
7= Vigo
9.  Creighton
10.  Fribourg
11.  King Abdulaziz
12.  University of the Andes
13.  Trieste
14.  Renmin
15.  Medical University of Vienna
16.  Polytech University Valencia
17=  Bayreuth
18=  Montana
19.   Mainz
20.   Ferrara
21.   Drexel
22.   Valencia
23.   Linz
24.   Crete
25.   Colorado School of Mines
26.  Technical University of Dresden
27.   Innsbruck
28.  Nurnberg
29= Dauphine
29= Wake Forest
29= Maryland Baltimore County
32. St George's London
33.  William and Mary College
34. Hong Kong Baptist
35.  Basel
36.  Texas San Antonio
37.  Duisburg
38.  Lyon 1
39.  Wurzburg
40.  Charles Darwin
41.  Wayne State
42. Northeastern
43.  Bicocca
44. Royal Holloway
45.  Koc
46.  Georgia University of Health Science
47.  Modena
48.  Dundee
49.  Southern Denmark
50= IIT Roorkee
50= Pompeu Fabra
52.  Graz
53= Oregon
53= Diderot
55.   Bielefeld
56.   Munster
57.  Waikato
58= Grenoble
59= East Anglia
60= Bonn
61=  Pavia
62.  ENS Lyon
63.  Eastern Finland
64.  Padua
65.  Brandeis
66.  Aberystwyth
67.  Tulane
68.  Tubingen
69= Warsaw
70= Sun Yat Sen
71= Keele
72.  Tromso
73.  Brunel
74.  Liege
75.  Queen Mary
76=  Vermont
77=  Trento
78.  Turin
79.  Jyvaskyla
80.  Carleton
81.  Kansas
82.  California Riverside
83.  SUNY Stony Brook
84=  George Washington
85=  Pisa
86.  Tasmania
87.  George Mason
88.  Boston College
89=  Oregon State
90=  Texas Dallas
91.  Trinity College Dublin
92= University Science and Technology China
92= Murdoch
92= Cincinnati
92= Galway
92= Yeshiva
97= Tufts
97= Minho
99. Miami
100.  Lehigh
101. Technical University Denmark
102= Rice
102= Iceland
104.  California Santa Cruz
104= Milan
106.  Montpellier 2
107.  Frankfurt
108= Bergen
109=  Strasbourg
110.  Victoria
111.  Rochester
112.  Cork
113.  Dartmouth
114.  Oklahoma
115.  Birkbeck
116.  Porto
117.  Canterbury
118= Newcastle  UK
118= Notre Dame
118= University College Dublin
121. Binghamton
122.  Aveiro
123=  Kiel
123= Sussex
125.  Temple
126.  Aachen
127=  Fribourg
127=  Queens Belfast
127= Colorado Boulder
130.  Iowa State
131. Tokyo Medical Dental
132= Autonomous Madrid
132= Swedish Agriculture
132= Tampere
135= Deakin
135= Barcelona
137= Stockholm
137= Stirling
139.  Laval
140.  Durham
141.  Bangor
142= Aberdeen
142= Vanderbilt
144.  Istanbul Technical
145.  Nanjing
146= Exeter
146= Emory
146= Leicester
149.  Southampton
150.  Paris Mines
151. Vrije Universiteit Brussel
152.  Polytechnic Milan
153. Kwazulu-Natal
154= Linkoping
154= Bilkent
154= Heriot-Watt
154= Bologna
158= Wyoming
158= Utah
158= Massey
161= Glasgow
161= Bern
163.  ENS Paris
164.  Zurich
165= Case Western Reserve
166= California Irvine
167= Tartu
168= Wellington
169= Salento
170.  South Carolina
171.  York UK
172.  Aalto
173= Curie
173= Macquarie
173= Boston
176= Delaware
177= Copenhagen
178= Hannover
179.  Norway University of Science and Technology
180.  Antwerp
181= Dalhousie
181= Rensselaer Polytechnic Institute
183= Konstanz
184= Paris Sud
185.  Technical University Munich
186.  Lancaster
187.  Waseda
188.  Otago
189.  Arizona State
190= SUNY Albany
190= Gottingen
190= Autonomous Barcelona
193= Cape Town
194= St Andrews
195= Colorado State
195= Bath
195= Wollongong
198= Tsukuba
198= Simon Fraser
198= Liverpool
198= Umea
202= Geneva
202= Newcastle Australia
204= Universite Libre Bruxelles
204= Virginia
206= Lausanne
206= Louvain
208= Connecticut
208= Georgetown
208= York Canada
211.  EPF Lausanne
212= North Carolina State
212= Bristol
212= Aalborg
212= Free University Amsterdam
216= Indiana
216= Kentucky
218. Maryland College Park
219.  Karlsruhe Institute of Technology
220= University Technology Sydney
220= Iowa
222.  Charles
223.  Flinders
224.  Cardiff
225= Auckland
225= Oslo
227.  Pittsburgh
228= Heidelberg
228= Guelph
228= Washington State
228= Sheffield
232= Chinese University Hong Kong
232= Strathclyde
234= Ottawa
234= Gothenburg
234= Washington St Louis
237.  Medical South Carolina
238= McMaster
238= Brown
238= National Sun Yat Sen
238= Reading
242.  Ecole Polytechnique
243.  Helsinki
244= Quebec
244= National Central Taiwan     
246.  Bogazici
247= Southern California
247= Arizona
249.  Keio
250= Houston
250= Stellenbosch
250= Kings College London
250= Darmstadt
250= Western Australia
255= Pohang
255= IIT Bombay
257= Wageningen
257= Manitoba
259= South Australia
259= Nagoya
261= Leeds
261= UC Santa Barbara
261= Nijmegen
261= Jagiellonian
265= New York University
265= Calgary
265= Ohio State
268.  Aarhus
269= Witwatersrand
269= North Carolina Chapel Hill
269= Michigan State
269= Fudan
273= Bochum
273= Munich
275= SUNY Buffalo
275= Adelaide
275= Sapienza
278= Utrecht
278= Edinburgh
278= Queensland University of Technology
281= Lund
281= Ghent
283.  Erasmus
284= Massachusetts
284= Illinois Chicago
284= Nottingham
287= Eindhoven
287= Amsterdam
289.  UC San Diego
290.  Birmingham
291= Western Ontario
291= Twente
293= Washington Seattle
293= Duke
295= Penn State
295= NUI Maynooth
297= Maastricht
297= Groningen
297= Columbia
297= Leiden
297= Georgia
302.  UC Davis
303= Southern Florida
303= Chalmers University of Technology
305= Minnesota
305= Essex
305= Manchester
305= Georgia Institute of Technology
309= Rutgers
309= Texas at Austin
311= Northwestern
311= Warwick
311= Vienna
311= MIT
315.  Johns Hopkins
316= Wisconsin Madison
316= Carnegie Mellon
318.  Alberta
319.  Pennsylvania
320= Hong Kong University of Science and Technology
320= Kyushu
322= Chicago
322= Vienna University of Technology
324= Queensland
324= Montreal
326.  British Columbia
327= Yale
327= Imperial College London
327= UCLA
327= Hebrew University of Jerusalem
327= Karolinska
332= Melbourne
332= Humboldt
332= National Tsinghua Taiwan
332= Cambridge
332= Harvard
332= Stanford
338= Monash 
338= Princeton
338= Caltech
338= Michigan
338= UC Berkeley
338= Cornell
344= Waterloo
344= KTH Sweden
344= Missouri
347.  University College London
348= Oxford
348= Middle East Technical University
350.  Yonsei
351= Toronto
351= Illinois Urbana Champaign
351= Peking
351= Leuven
355= Zhejiang
355= Hokkaido
355= Hong Kong Polytechnic University
355= McGill
359= ETH Zurich
359= Tokyo Institute of Technology
361= Berlin
361= Uppsala
363= Korea
363= Sydney
365= Florida
365= New South Wales
367= Australian National
367= Tohoku
367= Purdue
367= Technion
371= Surrey
371= IIT Kharagpur
373= KAIST
373= Texas A and M
375. Virginia Polytechnic Institute
376= Osaka
376= Nanyang Technological University
376= Shanghai Jiao Tong
379.  LSE
380.  Sungkyunkwan
381.  Sharif University of Technology
382.  Tokyo
383= National Taiwan University of Science and Technology
383= National Autonomous University of Mexico
385= Kyoto
385= National University of Singapore
387.  Loughborough
388.  National Cheng Kung
389.  Tel Aviv
390= Hong Kong
390= Tsinghua
392.  Chinese University of Hong Kong
393.  National Taiwan
394.  National Chiao Tung
395.  Tilburg
396.  Delft
397.  Seoul National
398.  State University Campinas
399.  Sao Paulo
400.  Moscow State
























Tuesday, June 25, 2013

What about a Research Influence Ranking?

Keeping up with the current surge of global university rankings is becoming next to impossible. Still, there are a few niches that remain unoccupied. One might be a ranking of universities according to their ability to spread new knowledge around the world. So it might be a good idea to have a Research Influence Ranking based on the citations indicator in the Times Higher Education -- Thomson Reuters World University Rankings.

Thomson Reuters are the world's leading collectors and analysts of citations data, so such an index ought to provide an invaluable data source for governments, corporations and other stakeholders deciding where to place research funding. Data for 400 universities can be found on the THE iPhone/iPad app.

The top place in the world would be jointly held by Rice University in Texas and Moscow State Engineering Physics Institute, closely followed by MIT and the University of California Santa Cruz.

Then there are the first places in various regions and countries. (MEPhI would be first in Europe and Rice in the US and North America.)

Canada
University of Toronto

Latin America
University of the Andes, Colombia

United Kingdom (and Western Europe)
Royal Holloway London

Africa
University of Cape Town

Middle East
Koc University, Turkey

Asia (and Japan)
Tokyo Metropolitan University

ASEAN
King Mongkut's University of Technology, Thailand

Australia and the Pacific
University of Melbourne

On second thoughts, perhaps not such a good idea.


Wednesday, May 29, 2013

Here is the full text of my article on the QS Subject Rankings published in the Philippine Daily Inquirer.

 

The QS university rankings by subject: Warning needed

It is time for the Philippines to think about constructing its own objective and transparent ranking or rating systems for its colleges and universities that would learn from the mistakes of the international rankers.

The ranking of universities is getting to be big business these days. There are quite a few rankings appearing from Scimago, Webometrics, University Ranking of Academic Performance (from Turkey), the Taiwan Rankings, plus many national rankings.

No doubt there will be more to come.

In addition, the big three of the ranking world—Quacquarelli Symonds (QS), Times Higher Education and Shanghai Jiao Tong University’s Academic Ranking of World Universities—are now producing a whole range of supplementary products, regional rankings, new university rankings, reputation rankings and subject rankings.

There is nothing wrong, in principle, with ranking universities. Indeed, it might be in some ways a necessity. The problem is that there are very serious problems with the rankings produced by QS, even though they seem to be better known in Southeast Asia than any of the others.
This is especially true of the subject rankings.

No new data

The QS subject rankings, which have just been released, do not contain new data. They are mostly based on data collected for last year’s World University Rankings—in some cases extracted from the rankings and, in others, recombined or recalculated.

There are four indicators used in these rankings. They are weighted differently for the different subjects and, in two subjects, only two of the indicators are used.

The four indicators are:
A survey of academics, or of people who claim to be academics or used to be academics, drawn from a variety of sources. This is the same indicator used in the world rankings. Respondents were asked to name the best universities for research.
A survey of employers, which seems to comprise anyone who chooses to describe himself or herself as an employer or a recruiter.
The number of citations per paper. This is a change from the world rankings, where the calculation was citations per faculty member.
H-index. This is easier to illustrate than to define. If a university publishes one paper and that paper is cited once, it gets an index of one. If it publishes two or more papers and at least two of them are cited at least twice each, the index is two, and so on. It is a way of combining the quantity of research with its quality, as measured by influence on other researchers.

Out of these four indicators, three are about research and one is about the employability of a university’s graduates.

These rankings are not at all suitable for use by students wondering where they should go to study, whether at undergraduate or graduate level.

The only part that could be of any use is the employer review and that has a weight ranging from 40 percent for accounting and politics to 10 percent for arts and social science subjects, like history and sociology.

But even if the rankings are to be used just to evaluate the quantity or quality of research, they are frankly of little use. They are dominated by the survey of academic opinion, which is not of professional quality.

There are several ways in which people can take part in the survey. They can be nominated by a university, they can sign up themselves, they can be recommended by a previous respondent or they can be asked because they have subscribed to an academic journal or an online database.
Apart from checking that they have a valid academic e-mail address, it is not clear whether QS makes any attempt to check whether the survey respondents are really qualified to make any judgements about research.

Not plausible

The result is that the academic survey and also the employer survey have produced results that do not appear plausible.

In recent years, there have been some odd results from QS surveys. My personal favorite is the New York University Tisch School of the Arts, which set up a branch in Singapore in 2007 and graduated its first batch of students from a three-year Film course in 2010. In the QS Asian University Rankings of that year, the Singapore branch got zero for the other criteria (presumably the school did not submit data) but it was ranked 149th in Asia for academic reputation and 114th for employer reputation.

Not bad for a school that had yet to produce any graduates when the survey was taken early in the year.

In all of the subject rankings this year, the two surveys account for at least half of the total weighting and, in two cases, Languages and English, all of it.

Consequently, while some of the results for some subjects may be quite reasonable for the world top 50 or the top 100, after that they are sometimes downright bizarre.

The problem is that although QS has a lot of respondents worldwide, when it gets down to the subject level there can be very few. In pharmacy, for example, there are only 672 for the academic survey and in materials science 146 for the employer survey. Since the leading global players will get a large share of the responses, this means that universities further down the list will be getting a handful of responses for the survey. The result is that the order of universities in any subject in a single country like the Philippines can be decided by just one or two responses to the surveys.

Another problem is that, after a few obvious choices like Harvard, MIT (Massachusetts Institute of Technology) and Tokyo, most respondents probably rely on a university’s general reputation, and that can lead to all sorts of distortions.

Many of the subject rankings at the country level are quite strange. Sometimes they even include universities that do not offer courses in that subject. We have already seen that there are universities in the Philippines that are ranked for subjects that they do not teach.

Somebody might say that maybe they are doing research in a subject while teaching in a department with a different name, such as an economic historian teaching in the economics department but publishing in history journals and getting picked up by the academic survey for history.
Maybe, but it would not be a good idea for someone who wants to study history to apply to that particular university.

Another example is from Saudi Arabia, where King Fahd University of Petroleum and Minerals was apparently top for history, even though it does not have a history department or indeed anything where you might expect to find a historian. There are several universities in Saudi Arabia that may not teach history very well but at least they do actually teach it.

These subject rankings may have a modest utility for students who can pick or choose among top global universities and need some idea whether they should study engineering at SUNY (State University of New York) Buffalo (New York) or Leicester (United Kingdom) or linguistics at Birmingham or Michigan.

But they are of very little use for anyone else.


Wednesday, May 15, 2013

QS Rankings by Subject

QS have produced their annual subject rankings. At the top there are no real surprises and, while there is certainly room for argument, I do not think that anyone will be shocked by the top ten or twenty in each subject.

The university with the most number-one positions is Harvard, with ten:

Medicine
Biology
Psychology
Pharmacy and Pharmacology
Earth and Marine Sciences
Politics and International Studies
Law
Economics and Econometrics
Accounting and Finance
Education

MIT has seven:
Computer Science
Chemical Engineering
Electrical Engineering
Mechanical Engineering
Physics and Astronomy
Chemistry
Materials Science

Then there is Berkeley with exactly the four you would expect:
Environmental Science
Statistics and Operational Research
Sociology
Communication and Media Studies

Oxford has three:

Philosophy
Modern Languages
Geography

Cambridge has another three:
History
Linguistics
Mathematics


Imperial College London is top for Civil Engineering and University of California, Davis for Agriculture and Forestry.


These rankings are based on the academic opinion survey, the employer survey, citations per paper and the h-index, a measure of both output and influence that eliminates outliers, in proportions that vary for each subject. They are very research-focused, which is unfortunate since there seems to be a consensus emerging at conferences and seminars that the THE-TR rankings are for policy makers, the Shanghai ARWU for researchers and the QS rankings for undergraduate students.

Outside the top fifty or top one hundred there are some oddities resulting from the small number of responses. I will leave it to specialists to find them.











Saturday, April 20, 2013

The Leiden Ranking

The Leiden ranking for 2013 is out. This is produced by the Centre for Science and Technology Studies (CWTS) at Leiden University and represents pretty much the state of the art in assessing research publications and citations.

A variety of indicators are presented with several different settings, but no overall winner is declared, which means that these rankings are not going to get the publicity given to QS and Times Higher Education.

Here are top universities, using the default settings provided by CWTS.

Total Publications: Harvard
Citations per Paper: MIT
Normalised Citations per Paper: MIT
Quality of Publications: MIT

There are also indicators for international and industrial collaboration that I hope to discuss later.

It is also noticeable that high flyers in the Times Higher Education citations indicator, Alexandria University, Moscow Engineering Physics Institute (MEPhI), Hong Kong Baptist University, Royal Holloway, do not figure at all in the Leiden Ranking. What happened to them?

How could MEPhI, equal first in the world for research influence according to THE and Thomson Reuters, fail to even show up in the normalised citation indicator in the Leiden Ranking?

Firstly, Leiden have collected data for the top 500 universities in the world according to number of publications in the Web of Science. That would have been sufficient to keep these institutions out of the rankings.

In addition, Leiden use fractionalised counting as a default setting, so that the impact of multiple-author publications is divided by the number of university addresses. This would drastically reduce the impact of publications like the Review of Particle Physics.
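As a rough sketch of the idea, with invented numbers rather than Leiden's actual data, fractional counting shares a heavily co-authored paper's citations among the contributing institutions instead of crediting them in full to each one.

```python
# Full vs. fractional counting of one heavily co-authored publication.
# The numbers are invented for illustration.
citations = 3000      # citations to one collaborative review
institutions = 120    # number of contributing university addresses

full_credit = citations                       # full counting: each address gets 3000
fractional_credit = citations / institutions  # fractional counting: each gets 25

print(full_credit, round(fractional_credit, 1))  # -> 3000 25.0
```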

Also, by field Leiden means five broad subject groups, whereas Thomson Reuters appears to use a larger number (21 if they use the same system as they do for highly cited researchers). There is accordingly more chance of anomalous cases having a great influence in the THE rankings.

THE and Thomson Reuters would do well to look at the multi-authored, and most probably soon to be multi-cited, papers that were published in 2012 and look at the universities that could do well in 2014 if the methodology remains unchanged.


Tuesday, April 02, 2013

Combining Rankings

Meta University Ranking has combined the latest ARWU, QS and THE World Rankings. Universities are ordered by average place, so Harvard comes top (i.e. has the lowest score) with an average of 2.67 (1st in ARWU, 3rd in QS and 4th in THE).

After that there is MIT, Cambridge, Caltech and Oxford.

Wednesday, March 06, 2013

The THE Reputation Rankings

Times Higher Education have published their reputation rankings based on data collected from the World University Rankings of 2012.

They are not very interesting. Which is exactly what they should be. When rankings show massive changes from one year to another a certain amount of scepticism is required.

The same six, Harvard, MIT, Stanford, Berkeley, Oxford and Cambridge are well ahead of everybody else as they were in 2012 and in 2011.

Taking a quick look at the top fifty, there is little movement between 2011 and 2013. Four universities, from the US, Japan, the Netherlands and Germany, have dropped out. In their place there is one more from Korea, one more from the UK and two more from Australia.

I was under the impression that Australian universities were facing savage cuts in research funding and were going to be deserted by international students and researchers.

Maybe it is the other universities that are being cut, or maybe a bit of bloodletting is good for the health.

I also noticed that the number of respondents went down a bit in 2012. It could be that the academic world is beginning to suffer from ranking fatigue.

Saturday, December 15, 2012

The Taiwan Rankings

It is unfortunate that the "big three" of the international ranking scene -- ARWU (Shanghai), THE and QS -- receive a disproportionate amount of public attention while several research-based rankings are largely ignored. Among them is the National Taiwan University Ranking, which until this year was run by the Higher Education Evaluation and Accreditation Council of Taiwan.

The rankings, which are based on the ISI databases, assign a weighting of 25% to research productivity (number of articles over the last 11 years, number of articles in the current year), 35% to research impact (number of citations over the last 11 years, number of citations in the current year, average number of citations over the last 11 years) and 40% to research excellence (h-index over the last 2 years, number of highly cited papers, number of articles in the current year in highly cited journals).
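As a rough sketch of how weights like these combine indicator scores into an overall result: the scores below are invented, and the real ranking aggregates several sub-indicators within each category, but the arithmetic is the same weighted sum.

```python
# Weighted composite using the 25/35/40 weights described above.
# Indicator scores (0-100) are invented for illustration.
weights = {"productivity": 0.25, "impact": 0.35, "excellence": 0.40}
scores = {"productivity": 70.0, "impact": 85.0, "excellence": 90.0}

overall = sum(weights[k] * scores[k] for k in weights)
print(overall)  # -> 83.25
```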

Rankings by field and subject are also available.

There is no attempt to assess teaching or student quality and publications in the arts and humanities are not counted.

These rankings are a valuable supplement to the Shanghai ARWU. The presentation of data over 11-year and one-year periods allows quick comparisons of changes over a decade.

Here are the top ten.

1. Harvard
2. Johns Hopkins
3. Stanford
4. University of Washington at Seattle
5. UCLA
6. University of Michigan Ann Arbor
7. Toronto
8. University of California Berkeley
9. Oxford
10. MIT

High-flyers in other rankings do not do especially well here. Princeton is 52nd, Caltech 34th, Yale 19th and Cambridge 15th, most probably because they are relatively small or have strengths in the humanities.

Sunday, November 18, 2012

Article in University World News

Ranking’s research impact indicator is skewed

Saturday, October 27, 2012

More on MEPhI

Right after putting up the post on Moscow State Engineering Physics Institute and its "achievement" in getting the maximum score for research impact in the latest THE - TR World University Rankings, I found this exchange on Facebook.  See my comments at the end.

  • Valery Adzhiev So, the best university in the world in the "citation" (i.e. "research influence") category is Moscow State Engineering Physics Institute with maximum '100' score. This is remarkable achivement by any standards. At the same time it scored in "research" just 10.6 (out of 100) which is very, very low result. How on earth that can be?
  • Times Higher Education World University Rankings Hi Valery,

    Regarding MEPHI’s high citation impact, there are two causes: Firstly they have a couple of extremely highly cited papers out of a very low volume of papers. The two extremely highly cited papers are skewing what would ordinarily be a very good normalized citation impact to an even higher level.

    We also apply "regional modification" to the Normalized Citation Impact. This is an adjustment that we make to take into account the different citation cultures of each country (because of things like language and research policy). In the case of Russia, because the underlying citation impact of the country is low it means that Russian universities get a bit of a boost for the Normalized Citation Impact.

    MEPHI is right on the boundary for meeting the minimum requirement for the THE World University Rankings, and for this reason was excluded from the rankings in previous years. There is still a big concern with the number of papers being so low and I think we may see MEPHI’s citation impact change considerably over time as the effect of the above mentioned 2 papers go out of the system (although there will probably be new ones come in).

    Hope this helps to explain things.
    THE
  • Valery Adzhiev Thanks for your prompt reply. Unfortunately, the closer look at that case only adds rather awkward questions. "a couple of extremely highly cited papers are actually not "papers": they are biannual volumes titled "The Review of Particle Physics" that ...See More
  • Valery Adzhiev I continue. There are more than 200 authors (in fact, they are "editors") from more than 100 organisation from all over the world, who produce those volumes. Look: just one of them happened to be affiliated with MEPhI - and that rather modest fact (tha...See More
  • Valery Adzhiev Sorry, another addition: I'd just want to repeat that my point is not concerned only with MEPhI - Am talking about your methodology. Look at the "citation score" of some other universities. Royal Holloway, University of London having justt 27.7 in "res...See More
  • Alvin See Great observations, Valery.
  • Times Higher Education World University Rankings Hi Valery,

    Thanks again for your thorough analysis. The citation score is one of 13 indicators within what is a balanced and comprehensive system. Everything is put in place to ensure a balanced overall result, and we put our methodology up online for
    ...See More
  • Andrei Rostovtsev This is in fact rather philosofical point. There are also a number of very scandalous papers with definitively negative scientific impact, but making a lot of noise around. Those have also high contribution to the citation score, but negative impact t...See More

    It is true that two extremely highly cited publications combined with a low total number of publications skewed the results, but what is equally or perhaps more important is that these citations occur in the first year or two after publication, when citations tend to be relatively infrequent compared to later years. The 2010 publication is a biennial review, like the 2008 publication, that will be cited copiously for two years, after which it will no doubt be superseded by the 2012 edition.

    Also, we should note that in the ISI Web of Science, the 2008 publication is classified as "physics, multidisciplinary". Papers listed as multidisciplinary generally get relatively few citations, so if the publication was compared to other multidisciplinary papers it would get an even larger weighting.
    Valery has an excellent point when he notes that these publications have over 100 authors or contributors each (I am not sure whether they are actual researchers or administrators). Why, then, did all the other contributors not boost their institutions' scores to similar heights? Partly because they were not in Russia and therefore did not get the regional weighting, but also because they were publishing many more papers overall than MEPhI.

    So basically, A. Romaniouk, who contributed 1/173rd of one publication, was considered as having more research impact than hundreds of researchers at Harvard, MIT, Caltech etc. producing hundreds of papers cited hundreds of times. Sorry, but is this a ranking of research quality or a lottery?

    The worst part of THE's reply is this:

    Thanks again for your thorough analysis. The citation score is one of 13 indicators within what is a balanced and comprehensive system. Everything is put in place to ensure a balanced overall result, and we put our methodology up online for all to see (and indeed scrutinise, which everyone is entitled to do).

    We welcome feedback, are constantly developing our system, and will definitely take your comments on board.

    The system is not balanced. Citations have a weighting of 30%, much more than any other indicator. Even the research reputation survey has a weighting of only 18%. And to describe as comprehensive an indicator which allows a fraction of one or two publications to surpass massive amounts of original and influential research is really plumbing the depths of absurdity.

    I am just about to finish comparing the scores for research and research impact for the top 400 universities. There is a statistically significant correlation, but it is quite modest. When research reputation, volume of publications and research income show such a modest correlation with research impact, it is time to ask whether there is a serious problem with this indicator.

    Here is some advice for THE and TR.

    • First, and surely very obvious, if you are going to use field normalisation then calculate the score for discipline groups (natural sciences, social sciences and so on) and aggregate the scores. So give MEPhI a 100 for physical or natural sciences if you think they deserve it, but not for the arts and humanities.
    • Second, and also obvious, introduce fractional counting, that is dividing the number of citations by the number of authors of the cited paper.
    • Do not count citations to summaries, reviews or compilations of research.
    • Do not count citations of commercial material about computer programs. This would reduce the very high and implausible score for Gottingen, which is derived from a single publication.
    • Do not assess research impact with only one indicator. See the Leiden ranking for the many ways of rating research.
    • Consider whether it is appropriate to have a regional weighting. This is after all an international ranking.
    • Reduce the weighting for this indicator.
    • Do not count self-citations. Better yet, do not count citations from researchers at the same university.
    • Strictly enforce your rule about not including single-subject institutions in the general rankings.
    • Increase the threshold number of publications for inclusion in the rankings from two hundred to four hundred.