
Sunday, May 28, 2017

The View from Leiden

Ranking experts are constantly warning about the grim fate that awaits the universities of the West if they are not provided with all the money that they want and given complete freedom to hire staff and recruit students from anywhere that they want. If this does not happen they will be swamped by those famously international Asian universities dripping with funds from indulgent patrons.

The threat, if we are to believe the prominent rankers of Times Higher Education (THE), QS and Shanghai Ranking Consultancy, is always looming but somehow never quite arrives. The best Asian performer in the THE world rankings is the National University of Singapore (NUS) in 24th place followed by Peking University in 29th. The QS World University Rankings have NUS 12th, Nanyang Technological University 13th and Tsinghua University 24th. The Academic Ranking of World Universities published in Shanghai puts the University of Tokyo in 20th place and Peking University in 71st.

These rankings are in one way or another significantly biased towards Western European and North American institutions and against Asia. THE has three separate indicators that measure income, adding up to a combined weighting of 10.75%. Both QS and THE have reputation surveys. ARWU gives a 30% weighting to Nobel and Fields award winners, some of whom won several decades ago.

Let's take a look at a set of rankings that is technically excellent, namely the Leiden Ranking. The producers do not provide an overall score. Instead it is possible to create a variety of rankings: total publications, publications by subject group, and publications in the top 50%, 10% and 1% of journals. Users can also select fractional or absolute counting and change the minimum threshold for the number of publications.
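
To make the counting options concrete, here is a minimal sketch of full (the "absolute" option) versus fractional counting. The paper data and university names are invented for illustration; the actual Leiden Ranking works from Web of Science affiliation records.

```python
from collections import defaultdict

# Toy affiliation data, invented for illustration: each paper lists the
# university of each of its authors.
papers = [
    ["Harvard", "Harvard", "Toronto"],
    ["Zhejiang"],
    ["Harvard", "Zhejiang", "Zhejiang", "Zhejiang"],
]

full = defaultdict(float)        # each contributing university gets 1 per paper
fractional = defaultdict(float)  # a paper's single credit is shared by authors

for authors in papers:
    for uni in dict.fromkeys(authors):       # unique universities on the paper
        full[uni] += 1
    for uni in authors:
        fractional[uni] += 1 / len(authors)  # credit per authorship share

print(dict(full))        # Harvard 2.0, Toronto 1.0, Zhejiang 2.0
print(dict(fractional))  # Harvard ~0.92, Toronto ~0.33, Zhejiang 1.75
```

Fractional counting stops a university collecting full credit for papers to which its authors contributed only a small share, which is why the choice of counting method can reorder universities with many multi-authored papers.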

Here is the top ten, using the default settings, publications 2012-15, fractional counting, minimum threshold of 100 papers. Rankings for publications in 2006-09 are in brackets.

1. Harvard (1)
2. Toronto (2)
3. Zhejiang (14)
4. Michigan (3)
5. Shanghai Jiao Tong (37)
6. Johns Hopkins (5)
7. Sao Paulo (8)
8. Stanford (9)
9. Seoul National University (23)
10. Tokyo (4)

Tsinghua University is 11th, up from 32nd in 2006-09 and Peking University is 15th, up from 54th. What is interesting about this is not just that East Asian universities are moving into the highest level of research universities but how rapidly they are doing so.

No doubt there are many who will say that this is a matter of quantity and that what really counts is not the number of papers but their reception by other researchers. There is something to this. If we look at publications in the top 1% of journals (by frequency of citation) the top ten include six US universities headed by Harvard, three British and one Canadian.

Tsinghua is 28th, Zhejiang is 50th, Peking 62nd, Shanghai Jiao Tong 80th, Seoul National University 85th. Right now it looks like publication in the most reputed journals is dominated by English-speaking universities. But in the last few years Chinese and Korean universities have advanced rapidly: Peking 119th to 62nd, Zhejiang 118th to 50th, Shanghai Jiao Tong 112th to 80th, Tsinghua 101st to 28th, Seoul National University 107th to 85th.

It seems that in a few years East Asia will dominate the elite journals and will take the lead for quality as well as quantity.

Moving on to subject group rankings, Tsinghua University is in first place for mathematics and computer sciences. The top ten consists of nine Chinese and one Singaporean university. The best US performer is MIT in 16th place, the best British Imperial College London in 48th.

When we look at the top 1 % of journals, Tsinghua is still on top, although MIT moves up to 4th place and Stanford is 5th. 

The Asian tsunami has already arrived. East Asian universities, mainly Chinese and Chinese diaspora, are dominant or becoming dominant in the STEM subjects, leaving the humanities and social sciences to the US.

There will of course be debate about what happened. Maybe money had something to do with it. But it also seems that western universities are becoming much less selective about student admissions and faculty appointments. If you admit students who write #BlackLivesMatter 100 times on their application forms or impose ideological tests for faculty appointment and promotion you may succeed in imposing political uniformity, but you will have serious problems trying to compete with the Gaokao-hardened students and researchers of Chinese universities.

Friday, October 04, 2013

MIT and TMU are the most influential research universities in the world

I hope to comment extensively on the new Times Higher Education - Thomson Reuters rankings in a while but for the moment here is a comment on the citations indicator.

Last year Times Higher Education and Thomson Reuters solemnly informed the world that the two most influential places for research were Rice University in Texas and the Moscow State Engineering Physics Institute (MEPhI).

Now, the top two for Citations: research influence are MIT, which sounds a bit more sensible than Rice, and Tokyo Metropolitan University. Rice has slipped very slightly and MEPhI has disappeared from the general rankings because it was realised that it is a single-subject institution. I wonder how they worked that out.

That may be a bit unfair. What about that paper on opposition politics in central Russia in the 1920s?

Tokyo Metropolitan University's success at first seems rather odd because it also has a very low score for Research, which probably means that it has a poor reputation for research, does not receive much funding, has few graduate students and/or publishes few papers. So how could its research be so influential?

The answer is that it was one of scores of contributors to a couple of multi-authored publications on particle physics and a handful of widely cited papers in genetics, while producing few papers overall. I will let Thomson Reuters explain how that makes it into a pocket or a mountain of excellence.
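
A toy calculation shows how this works when an indicator averages normalised citation scores over a university's papers. All numbers below are invented; the actual Thomson Reuters methodology is more elaborate.

```python
from statistics import mean

# Invented normalised citation scores: 1.0 means a paper is cited exactly
# as often as the world average for its field and year.
big_university = [1.1] * 2000 + [30.0] * 2   # large output, two hit papers
tiny_university = [0.8] * 48 + [60.0] * 2    # small output, two mega-cited
                                             # consortium physics papers

print(round(mean(big_university), 2))   # 1.13: the hits are diluted
print(round(mean(tiny_university), 2))  # 3.17: two papers dominate the score
```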

Sunday, March 18, 2012

The THE Reputation Rankings

Times Higher Education has produced its third reputation ranking, based on a survey of researchers published in ISI-indexed journals. The top ten are:

1.  Harvard
2.  MIT
3.  Cambridge
4.  Stanford
5.  UC Berkeley
6.  Oxford
7.  Princeton
8.  Tokyo
9.  UCLA
10. Yale

This does not look all that dissimilar to the academic survey indicator in the 2011 QS World University Rankings. The top ten there is as follows:

1.  Harvard
2.  Cambridge
3.  Oxford
4.  UC Berkeley
5.  Stanford
6.  MIT
7.  Tokyo
8.  UCLA
9. Princeton
10. Yale

Once we step outside the top ten there are some differences. The National University of Singapore is 23rd in these rankings but 11th in the QS academic survey, possibly because QS still has respondents on its list from the time when it used World Scientific, a Singapore-based publishing company.

The Middle East Technical University in Ankara is in the top 100 (in the QS academic survey it is not even in the top 300), sharing the 90-100 band with Ecole Polytechnique, Bristol and Rutgers. At first glance this seems surprising since its research output is exceeded by other universities in the Middle East. But the technical excellence of the University Ranking by Academic Performance, which is produced at METU, suggests that its research might be of a high quality.

Tuesday, April 12, 2011

QS Engineering Rankings

QS have started to publish detailed subject rankings based on citations per paper over five years and their surveys of academics and employers. The first of these is engineering. There are five subfields: Computer Science and Information Systems, Chemical Engineering, Civil and Structural Engineering, Electrical and Electronic Engineering, and Mechanical, Aeronautical and Manufacturing.

For Civil and Structural Engineering the weighting is 50% for the academic survey, 30% for the employers' survey and 20% for citations per paper. For the others it is 40%, 30% and 30%.
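
As a sketch of how such weighted composites work, here are the two weighting schemes applied to the same hypothetical indicator scores; only the weights come from QS's published scheme, the scores are invented.

```python
# Combine indicator scores (0-100) using the weights QS states for its
# engineering subject tables. The example scores are hypothetical.
def composite(academic: float, employer: float, citations: float,
              weights: tuple) -> float:
    w_academic, w_employer, w_citations = weights
    return w_academic * academic + w_employer * employer + w_citations * citations

CIVIL = (0.5, 0.3, 0.2)   # civil and structural engineering
OTHERS = (0.4, 0.3, 0.3)  # the other four subfields

print(composite(90, 80, 70, CIVIL))   # 83.0
print(composite(90, 80, 70, OTHERS))  # 81.0
```

Note how the same university scores differently depending only on the weighting, which is worth remembering when comparing positions across subfields.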

MIT, not surprisingly, is top in each of the five engineering fields that are ranked. In general, the upper levels of these rankings seem reasonable. However, a look at the details, especially in the bottom half, 100-200 places, raises some questions.

One basic problem is that as QS make finer distinctions, they have to rely on smaller sets of data. There were 285 respondents to the academic survey for chemical engineering and 394 for civil and structural engineering. For the employer survey there were 836 for computer science. Each respondent to the academic survey was allowed to nominate up to 40 universities but usually the number was much lower than this. Around the 151-200 level the number of responses would surely have been very low. Similarly, the number of papers counted in each field varied considerably from 43,222 in civil and structural engineering to 514,95 in electrical and electronic engineering. We should therefore be rather sceptical about these rankings.

Something that is noticeable is that there is a reasonably high correlation between the scores for the academic survey and the employer survey. For electrical engineering it is .682, chemical engineering .695, civil engineering .695, computer science .722.

But there is no correlation at all between the citations per paper indicator and the surveys. For electrical engineering it is .064 between citations and academic survey and -.004 between citations and the employer survey. It is the same for the other subfields. None of the correlations are statistically significant.
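
Correlations like these are easy to check against the published indicator scores. A minimal sketch, with short placeholder arrays standing in for the real QS data:

```python
from scipy.stats import pearsonr

# Placeholder indicator scores; substitute the published QS columns for
# the ranked universities to reproduce correlations like those above.
academic_survey = [98.0, 95.5, 91.2, 88.0, 76.4]
employer_survey = [97.1, 93.0, 92.5, 84.2, 79.9]
citations_per_paper = [61.0, 88.3, 55.2, 90.1, 70.7]

r, p = pearsonr(academic_survey, employer_survey)
print(f"surveys: r = {r:.3f}, p = {p:.3f}")              # strong correlation

r, p = pearsonr(academic_survey, citations_per_paper)
print(f"survey vs citations: r = {r:.3f}, p = {p:.3f}")  # close to zero
```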

Looking at the top universities for the three indicators, we see the same familiar places in each of the subfields according to the surveys: MIT, Stanford, Cambridge, Berkeley, Oxford, Harvard, Imperial College London, Melbourne, Caltech.

But looking at the top scorers for citations per paper, we find a much more varied and unfamiliar array of institutions: New York University, Wageningen, Dartmouth College, Notre Dame, Aalborg, Athens, Lund, Uppsala, Drexel, Tufts, IIT Roorkee, University of Washington, Rice, University of Massachusetts.

The agreement of employers and academics about the quality of engineering programs, even though the surveys refer to different aspects, research and graduate employability, suggests that the surveys are moderately accurate, at least for the top hundred or so.

However, the lack of any correlation at all between the citations indicator and the surveys needs some explanation. It could be that citations have identified up-and-coming superstars. Perhaps the number of papers is so low in the various subfields that the indicator does not mean very much. Perhaps citations have been so manipulated in recent years -- see the case of Alexandria University -- that they are no longer a robust indicator of quality.

Thursday, October 02, 2014

Which universities have the greatest research influence?

Times Higher Education (THE) claims that its Citations: Research Influence indicator, prepared by Thomson Reuters (TR), is the flagship of its World University Rankings. It is strange, then, that the magazine has never published a research influence ranking, although that ought to be just as interesting as its Young Universities Ranking, Reputation Rankings or gender index.

So let's have a look at the top 25 universities in the world this year, ranked for research influence as measured by field- and year-normalised citations calculated by Thomson Reuters.

Santa Cruz and Tokyo Metropolitan have the same impact as MIT. Federico Santa Maria Technical University is ahead of Princeton. Florida Institute of Technology beats Harvard. Bogazici University and Scuola Normale Superiore do better than Oxford and Cambridge.

Are they serious?

Apparently. There will be an explanation in the next post. Meanwhile go and check if you don't believe me. And let me know if there's any dancing in the streets of Valparaiso, Pisa, Golden or Istanbul.


Rank and Score for Citations: Research Influence 2014-15 THE World Rankings

Rank University Score
1= University of California Santa Cruz 100
1= MIT 100
1= Tokyo Metropolitan University 100
4 Rice University 99.9
5= Caltech 99.7
5= Federico Santa Maria Technical University, Chile 99.7
7 Princeton University 99.6
8= Florida Institute of Technology 99.2
8= University of California Santa Barbara 99.2
10= Stanford University 99.1
10= University of California Berkeley 99.1
12= Harvard University 98.9
12= Royal Holloway University of London 98.9
14 University of Colorado Boulder 97.4
15 University of Chicago 97.3
16= Washington University in St Louis 97.1
16= Colorado School of Mines 97.1
18 Northwestern University 96.9
19 Bogazici University, Turkey 96.8
20 Duke University 96.6
21= Scuola Normale Superiore Pisa, Italy 96.4
21= University of California San Diego 96.4
23 Boston College 95.9
24 Oxford University 95.5
25= Brandeis University 95.3
25= UCLA 95.3
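
For readers unfamiliar with the method, field- and year-normalisation divides each paper's citations by the world average for papers of the same field and year, then averages those ratios over a university's output. Here is a minimal sketch with invented toy records; the real calculation uses hundreds of fields and further adjustments, including the regional modification.

```python
from collections import defaultdict

# Toy records (university, field, year, citations), invented for
# illustration. The world baseline is the average number of citations
# of all papers sharing a field and year.
papers = [
    ("MIT", "physics", 2012, 40),
    ("MIT", "physics", 2012, 10),
    ("Tokyo Metropolitan", "physics", 2012, 200),  # one consortium mega-paper
    ("Tokyo Metropolitan", "history", 2012, 1),
    ("Oxford", "history", 2012, 3),
]

totals = defaultdict(int)
counts = defaultdict(int)
for _, field, year, cites in papers:
    totals[(field, year)] += cites
    counts[(field, year)] += 1
baselines = {key: totals[key] / counts[key] for key in totals}

impact = defaultdict(list)
for uni, field, year, cites in papers:
    impact[uni].append(cites / baselines[(field, year)])

for uni, ratios in impact.items():
    print(uni, round(sum(ratios) / len(ratios), 2))
# MIT 0.3, Tokyo Metropolitan 1.45, Oxford 1.5
```

On figures like these, a small institution with one blockbuster paper can sit level with or above a large research university, which is exactly the pattern in the table above.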

Monday, September 10, 2012

MIT is Number One

The new QS Rankings are out.

MIT has replaced Cambridge in first place.

Thursday, March 10, 2011

The THE Reputation Rankings

Times Higher Education have constructed a reputation ranking from the data collected for last year's World University Rankings. There is a weighting of two thirds for research and one third for postgraduate teaching. The top five are:

1.  Harvard
2.  MIT
3.  Cambridge
4.  UC Berkeley
5.  Stanford

Scores are given only for the top fifty universities. Then another fifty are sorted in bands of ten without scores. Evidently, the number of responses favouring universities outside the top 100 was so small that it was not worth listing.

This means that the THE reputational survey reveals significant differences between Harvard and MIT or between Cambridge and Oxford but it would be of no help to those wondering whether to study or work at the University of Cape Town or the University of Kwazulu-Natal or Trinity College Dublin or University College Dublin.

The scores for research reputation (top fifty for total reputation scores only) show a moderate correlation with the THE citations indicator (.422) and, perhaps surprisingly, a higher correlation with the citations per faculty score on the QS World University Rankings of 2010 (.538).

Looking at the QS academic survey, which asked only about research, we can see that there was an insignificant correlation of .213 between the QS scores and the score for citations per faculty in the QS rankings (THE reputation ranking top 50 only). However, there was a higher correlation between the QS survey and the THE citations indicator of .422, the same as that between the THE research reputation scores and the THE citations indicator.

Comparing the two research surveys with a third party, the citations indicator in the Scimago 2010 rankings, the THE research reputation survey did better with a correlation of .438 compared to an insignificant .188 for the QS academic survey.

This seems to suggest that the THE reputational survey does a better job at differentiating between the world's elite universities. But once we leave the top 100 it is perhaps less helpful and there may still be a role for the QS rankings.

Saturday, February 11, 2017

What was the greatest ranking insight of 2016?

It is now difficult to imagine a world without university rankings. If they did not exist we would have to make judgements and decisions based on the self-serving announcements of bureaucrats and politicians, reputations derived from the achievements of past decades and popular and elite prejudices.

Rankings sometimes tell us things that are worth hearing. The first edition of the Shanghai rankings revealed emphatically that venerable European universities such as Bologna, the Sorbonne and Heidelberg were lagging behind their Anglo-Saxon competitors. More recently, the rise of research-based universities in South Korea and Hong Kong and the relative stagnation of Japan has been documented by global rankings. The Shanghai ARWU also shows the steady decline in the relative research capacity of a variety of US institutions including Wake Forest University, Dartmouth College, Wayne State University, the University of Oregon and Washington State University.

International university rankings have developed a lot in recent years and, with their large databases and sophisticated methodology, they can now provide us with an expanding wealth of "great insights into the strengths and shifting fortunes" of major universities.

So what was the greatest ranking insight of 2016?  Here are the first three on my shortlist. I hope to add a few more over the next couple of weeks. If anybody has suggestions I would be happy to publish them.

One. Cambridge University isn't even the best research university in Cambridge.
You may have thought that Cambridge University was one of the best research universities in the UK or Europe, perhaps even the best. But when it comes to research impact, as measured by field- and year-normalised citations with a 50% regional modification, it isn't even the best in Cambridge. That honour, according to THE, goes to Anglia Ruskin University, a former art school. Even more remarkable is that this achievement was due to the work of a single researcher. I shall keep the name a secret in case his or her office becomes a stopping point for bus tours.

Two. The University of Buenos Aires and the Pontifical Catholic University of Chile rival the top European, American and Australian universities for graduate employability. 
The top universities for graduate employability according to the Quacquarelli Symonds (QS) employer survey are pretty obvious: Harvard, Oxford, Cambridge, MIT, Stanford. But it seems that there are quite a few Latin American universities in the world top 100 for employability. The University of Buenos Aires is 25th and the Pontifical Catholic University of Chile 28th in last year's QS world rankings employer survey indicator. Melbourne is 23rd, ETH 26th, Princeton 32nd and New York University 36th.

Three. King Abdulaziz University is one of the world's leading universities for engineering.
The conventional wisdom seems settled: pick three or four from MIT, Harvard, Stanford, Berkeley, perhaps even a star rising in the East like Tsinghua or the National University of Singapore. But in the Shanghai field rankings for Engineering last year fifth place went to King Abdulaziz University in Jeddah. For highly cited researchers in engineering it is second in the world, surpassed only by Stanford.


Tuesday, July 05, 2011

QS Subject Rankings for the Social Sciences

QS have released their subject rankings for the social sciences based on data gathered during last year's rankings.

The overall rankings are not surprising. Here are the top three in each subject.

Sociology
1.  Harvard
2.  UC Berkeley
3.  Oxford

Statistics and Operational Research
1.  Stanford
2.  Harvard
3.  UC Berkeley

Politics and International Studies
1.  Harvard
2.  Oxford
3.  Cambridge

Law
1.  Harvard
2.  Oxford
3.  Cambridge

Economics and Econometrics
1.  Harvard
2.  MIT
3. Stanford

Accounting and Finance
1.  Harvard
2.  Oxford
3.  MIT

The top three in the citations per paper indicator are, in most cases, rather different. Are these pockets of excellence or something else?

Sociology
1=  Boston College
1=  Munich
3.   Florida State University

Statistics and Operational Research
1.  Aarhus
2.  Helsinki
3.  Erasmus University Rotterdam

Politics and International Studies
1.  Yale
2.  Oslo
3.  Rutgers

Law
1.  Victoria University of Wellington
2.  Koln
3.  Munster

Economics and Econometrics
1.  Dartmouth
2.  Harvard
3.  Princeton

Accounting and Finance
1.  University of Pennsylvania
2=  Harvard
2=  North Carolina at Chapel Hill

Monday, April 18, 2016

Round University Rankings


The latest Round University Rankings have been released by the Russian company RUR Rankings Agency. These are essentially holistic rankings that attempt to go beyond the measurement of research output and quality. There are twenty indicators, although some of them, such as Teaching Reputation, International Teaching Reputation and Research Reputation, or International Students and International Bachelors, are so similar that the information they provide is limited.

Basically these rankings cover much the same ground as the Times Higher Education (THE) World University Rankings. The income from industry indicator is not included but there are an additional eight indicators. The data is taken from Thomson Reuters' Global Institutional Profiles Project (GIPP) which was used by THE for their rankings from 2010 to 2014.

Unlike THE, which lumps its indicators together into groups, the RUR lists the scores for each indicator separately in the university profiles. In addition, the rankings provide data for seven consecutive years, from 2010 to 2016, an unusual opportunity to examine in detail the development of universities across 20 indicators. This is not the case with other rankings, which have fewer indicators or have changed their methodology.

It should be noted that participation in the GIPP is voluntary and therefore the universities in each edition could be different. For example, in 2015, 100 universities dropped out of the project and 62 joined.

It is, however, possible to examine a number of claims that have been made about changes in university quality over the last few years. I will take a look at these in the next few posts.

For the moment, here are the top five in the overall rankings and the dimension rankings.

Overall
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Chicago


Teaching
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Duke

Research
1.   Caltech
2.   Harvard
3.   Stanford
4.   Northwestern University
5.   Erasmus University Rotterdam

International Diversity
1.   EPF Lausanne
2.   Imperial College London
3.   National University of Singapore
4.   University College London
5.   Oxford

Financial Sustainability
1.   Caltech
2.   Harvard
3.   Scuola Normale Superiore Pisa
4.   Pohang University of Science and Technology
5.   Karolinska Institute

Unfortunately these rankings have received little or no recognition outside Russia. Here are some examples of the coverage they have received in Russian media.


MIPT entered the top four universities in Russia according to the Round University Ranking

Russian Universities in the lead in terms of growth in the international ranking of Round University Ranking

TSU [Tomsk State University] has entered the 100 best universities for the quality of teaching

[St Petersburg]

Russian universities to top intl rankings by 2020 – Education Minister Livanov to RT


Monday, August 27, 2012

The Shanghai Rankings 3

Two of the indicators in the Shanghai rankings measure research achievement at the highest level. The highly cited researchers indicator is based on a list of those scientists who have been cited most frequently by other researchers. Since ARWU counts current but not past affiliations of researchers, it is possible for a university to boost its score by recruiting researchers. This indicator might then be seen as signalling a willingness to invest in and to retain international talent and hence a sign of future excellence.

The top five for this indicator are:

1. Harvard
2.  Stanford
3.  UC Berkeley
4.  MIT
5.  Princeton

A lot of US state universities and non-Ivy League schools do well on this indicator: the University of Michigan (6th), University of Washington (13th), University of Minnesota (19th), Penn State (23rd), and Rutgers (42nd).

Before this year, the methodology for this indicator was simple: if a highly cited researcher had two affiliations then there was a straightforward fifty-fifty division. Things were complicated when King Abdulaziz University (KAU) in Jeddah signed up scores of researchers on part-time contracts, a story recounted in Science. ARWU has responded deftly by asking researchers to indicate how their time is divided if they have joint affiliations, and this seems to have deflated KAU's score considerably while having little or no effect on anyone else.
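
A sketch of the difference between the two attribution rules, with an invented researcher and invented time fractions:

```python
# Old ARWU rule: a highly cited researcher with two affiliations was
# split fifty-fifty. New rule: the researcher reports the actual division
# of time. All names and fractions here are hypothetical.
def credit_shares(affiliations: dict) -> dict:
    total = sum(affiliations.values())
    return {uni: share / total for uni, share in affiliations.items()}

old_rule = credit_shares({"Home University": 1.0, "KAU": 1.0})
new_rule = credit_shares({"Home University": 0.9, "KAU": 0.1})

print(old_rule)  # {'Home University': 0.5, 'KAU': 0.5}
print(new_rule)  # {'Home University': 0.9, 'KAU': 0.1}
```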

The top five universities for papers in Nature and Science are:

1.  Harvard
2.  Stanford
3.  MIT
4.  UC Berkeley
5.  Cambridge

High fliers on this indicator include several specialised science and medical institutions such as Imperial College London, Rockefeller University, Karolinska Institutet and the University of Texas Southwestern Medical Center.

Thursday, September 30, 2010

The THE Life Sciences Ranking (subscription required)

First is MIT, then Harvard, then Stanford. Nothing to argue about there.

For research impact (i.e. citations) the order is:

1. MIT
2. University of Barcelona
3. Harvard
4. Princeton
5. Stanford
6. Oxford
7. Dundee
8. Hong Kong
9. Monash
10. Berkeley

In this subject group, the citations indicator gets a 37.5% weighting.

Sunday, August 19, 2012

The Shanghai Rankings 2

The Shanghai Rankings get more interesting when we look at the individual indicators. Here are the 2012 top five for Alumni who have won Nobel prizes and Fields medals.

1. Harvard
2. Cambridge
3. MIT
4. Berkeley
5. Columbia

In the top fifty for this indicator there are the Ecole Normale Superieure, Moscow State University, the Technical University of Munich, Goettingen, Strasbourg and the City College of New York.

Essentially, this indicator allows universities that have seen better decades to gain a few points from an academic excellence that has long been in decline. City College of New York is an especially obvious victim of politics and bureaucracy.

The top five in the Awards indicator, faculty who have won Nobel prizes and Fields medals, are:

1.  Harvard
2.  Cambridge
3.  Princeton
4.  Chicago
5.  MIT

The top fifty includes the Universities of Buenos Aires, Heidelberg, Paris Dauphine, Bonn, Munich and Freiburg. Again, this indicator may be a pale reflection of past glory rather than a sign of future accomplishments.





Tuesday, November 15, 2011

The THE Subject Rankings

The ranking season has drawn to a close, or at least it will when we have digested the feasibility report from the European Commission's U-Multirank project. Meanwhile, to tie up some loose ends, here are the top three from each of THE's subject group rankings.

Engineering and Technology

1.  Caltech
2.  MIT
3.  Princeton

Arts and Humanities

1.  Stanford
2.  Harvard
3.  Chicago

Clinical, Pre-Clinical and Health

1.  Oxford
2.  Harvard
3.  Imperial College London

Life Sciences

1.  Harvard
2.  MIT
3.  Cambridge

Physical Sciences

1.  Caltech
2.  Princeton
3.  UC Berkeley

Social Sciences

To be posted on the 17th of November.

Sunday, November 18, 2012

Article in University World News

Ranking’s research impact indicator is skewed

Wednesday, May 04, 2011

New QS Rankings

QS have just released their Life Sciences rankings based on their employer and academic surveys and citations per paper.

Here are the top five for medicine, biology and psychology.

Medicine

1.  Harvard
2.  Cambridge
3.  MIT
4.  Oxford
5.  Stanford

Biological Sciences

1.  Harvard
2.  MIT
3.  Cambridge
4.  Oxford
5.  Stanford

Psychology

1.  Harvard
2.  Cambridge
3.  Stanford
4.  Oxford
5.  UC Berkeley

Sunday, September 09, 2012

Will There be a New Number One?

One reason why QS and Times Higher Education get more publicity than the Shanghai ARWU, HEEACT and other rankings is that they periodically produce interesting surprises. Last year Caltech replaced Harvard as number one in the THE rankings and Tokyo overtook Hong Kong as the best Asian university. Two years ago Cambridge pushed Harvard aside at the top of the QS rankings.

Will there be another change this year?

There is an intriguing interview with Les Ebdon, the UK government's "university access tsar", in the Daily Telegraph. Ebdon claims that leading British universities are in danger of losing their world class status unless they start admitting more students from state schools who may be somewhat less academically qualified. Perhaps he knows something.

So if Cambridge slips and is replaced by Harvard, MIT or Yale as QS number one (if it is Oxford or Imperial QS will lose all credibility) we can expect comments that Cambridge should start listening to him before it's too late.

I suspect that if there is a new number one it might have something to do with the QS employer review. Since this is a sign-up survey and since the numbers are quite small, it would not take many additional responses to push Harvard or MIT into first place.

With regard to THE, the problem there is that normalising everything by country, year and/or field is a potential source of instability. If there is a vigorous debate, with lots of citations, about an obscure article by a Harvard researcher in a little-cited field, it could dramatically boost the score on the citations indicator.
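
A quick worked example of the mechanism, with invented numbers:

```python
# In a field where papers average 2 citations, one much-discussed article
# counts for far more than the same citations would in a crowded field.
field_average = {"obscure_field": 2.0, "molecular_biology": 40.0}
citations = 100

print(citations / field_average["obscure_field"])      # 50.0 times the norm
print(citations / field_average["molecular_biology"])  # 2.5 times the norm

# Averaged into a modest portfolio, one 50x paper moves the needle a lot.
portfolio = [1.0] * 100 + [50.0]                  # 100 average papers plus one
print(round(sum(portfolio) / len(portfolio), 2))  # 1.49, up from 1.00
```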

Getting a good score in the THE rankings also depends on what a school is being compared to. Last year, Hong Kong universities slumped because they were taken out of China (with low average scores) and classified as a separate country (with high average scores), so that their relative scores were lower. If they are put back in China they will go up this year and there will be a new number one Asian university.

So anybody want to bet on Harvard making a come back this year? Or Hong Kong regaining the top Asian spot from Tokyo in the THE rankings?

Wednesday, September 22, 2010

Selected Comments from Times Higher Education


Mike Reddin 17 September, 2010
World university rankings take national ranking systems from the ridiculous to the bizarre. Two of the most glaring are made more so by these latest meta analyses.
Number One: R&D funding is scored not by its quality or contribution to learning or understanding but by the amount of money spent on that research; it ranks expensive research higher than cheap research; it ranks a study of 'many things' better than the study of a 'few things'; it ranks higher the extensive and expensive pharmacological trial than the paper written in tranquility over the weekend. I repeat, it does not score 'contribution to knowledge'.

Number Two. Something deceptively similar happens in the ranking of citations. We rank according to number alone - not 'worth' - not whether the paper merited writing in the first place, not whether we are the better for or the worse without it, not whether it adds to or detracts from the sum of human knowledge. Write epic or trash .... as long as it is cited, you score. Let me offer utter rubbish - the more of you that denounce me the better; as long as you cite my name and my home institution.

Which brings me full circle: the 'rankings conceit' equates research / knowledge / learning / thinking / understanding with institutions - in this case, universities and universities alone. Our ranking of student 'outcomes' (our successes/failure as individuals on many scales) wildly presumes that they flow from 'inputs' (universities). Do universities *cause* these outcomes - do they add value to those they have admitted? Think on't. Mike Reddin http://www.publicgoods.co.uk/



jorge Sanchez 18 September, 2010
this is ridiculous~ LSE was placed 67 in the previous year and THE decided to end relations with QS because of this issue. now since THE is no longer teaming up with QS, how could you possibly explain this anomaly by placing LSE ranked 86 in the table????


Mark 18 September, 2010
where is the "chinese university of Hong Kong in the table??? it is no longer in the top 200 best universities....

last year was in the top 50 now is off the table??? is this a serious ranking?????


Of course it's silly 18 September, 2010
Just look at the proposition that teaching is better if you have a higher proportion of doctoral students to undergraduate students.

This is just plainly silly, as 10 seconds thinking about the reputation of teaching in the US will tell you: liberal arts colleges offer extraordinary teaching in the absence of PhD programmes.



Matthew H. Kramer 18 September, 2010
Though some tiers of these rankings are sensible, there are some bizarre anomalies. Mirabile dictu, the University of Texas doesn't appear at all; the University of Virginia is ridiculously low at 72; NYU is absurdly low at 60; the University of Hong Kong is preposterously overrated at 21. Moreover, as has been remarked in some of the previous comments -- and as is evident from a glance at the rankings -- the criteria hugely favor technical institutes. The rank of MIT at 3 is credible, because MIT is outstanding across the board. However, Cal Tech doesn't belong at 2, and Imperial (which has no programs at all in the humanities and social sciences) certainly doesn't belong at 9. Imperial and especially Cal Tech are outstanding in what they do, but neither of them is even close to outstanding across the gamut of subjects that are covered by any full-blown university. I hope that some of these anomalies will be eliminated through further adjustments in the criteria. The exclusion of Texas is itself sufficiently outlandish to warrant some major modifications in those criteria.



Matthew H. Kramer 18 September, 2010
Weird too is the wholesale exclusion of Israeli universities. Hebrew University, Tel Aviv University, and Technion belong among the top 200 in any credible ranking of the world's universities.


Neil Fazel 19 September, 2010
No Sharif, no U. Texas, no Technion. Another ranking to be ignored.


OZ academic 20 September, 2010
While the criteria seem to be OK, although they might be debated, how to carry out the statistical analyses and how to collect the data are the issues for the validity of the poll. The omission of Chinese University of Hong Kong, in the inclusion of the Hong Kong Baptist University and Hong Kong Polytechnic University in the world's top 200 universities, seems to be very "mysterious" to me. As I understand the Chinese University of Hong Kong is more or less of a similar standard in teaching and research in comparison to the Hong Kong University and the Hong Kong University of Science and Technology, but they have some slight edges over the Hong Kong Baptist University and the Hong Kong Polytechnic University. I wonder if there are mix-ups in the data collection processes. If this is true, then there are disputes in this poll not only in the criteria of assessment but also in the accuracy in data collections and analyses.

Wednesday, November 01, 2006

The Best Universities for Biomedicine?

THES has published a list of the world's 100 best universities for biomedicine. This is based, like the other subject rankings, on peer review. Here are the top twenty according to the THES reviewers.

1. Cambridge
2. Harvard
3. Oxford
4. Imperial College London
5. Stanford
6. Johns Hopkins
7. Melbourne
8. Beijing (Peking)
9. National University of Singapore
10. Berkeley
11. Yale
12. Tokyo
13. MIT
14. University of California at San Diego
15. Edinburgh
16. University College London
17. Kyoto
18. Toronto
19. Monash
20. Sydney

Here are the top twenty according to citations per paper, a measure of the quality of research.


1. MIT
2. Caltech
3. Princeton
4. Berkeley
5. Stanford
6. Harvard
7. Oxford
8. University of California at San Diego
9. Cambridge
10. Yale
11. Washington (St Louis)
12. Johns Hopkins
13. ETH Zurich
14. Duke
15. Dundee
16. University of Washington
17. Chicago
18. Vanderbilt
19. Columbia
20. UCLA

The two lists are quite different. Here are the positions according to citations per paper of some of the universities that were in the top twenty for the peer review:

University College London -- 24
Edinburgh -- 25
Imperial College London -- 28
Tokyo -- 34
Toronto -- 35
Kyoto -- 36
Monash -- 52
Melbourne -- 58
Sydney -- 67
National University of Singapore -- 74
Beijing -- 78=

Again, there is a consistent pattern of British, Australian and East Asian universities doing dramatically better in the peer review than in citations per paper. How did they acquire such a remarkable reputation if their research was of such undistinguished quality? Did they acquire a reputation for producing a large quantity of mediocre research?

Notice that Cambridge, with the top score for peer review, produces research of a quality inferior, according to QS's data, to eight universities, seven of which are in the US and four in California.

There are also 23 universities that produced insufficient papers to be counted by the consultants. Thirteen are in Asia, five in Australia and New Zealand, four in Europe and one in the US. How did they acquire such a remarkable reputation while producing so little research? Was the little research they did of a high quality?

Saturday, October 28, 2006

The Best Universities for Technology?

The Times Higher Education Supplement (THES) have published a list of the supposed top 100 universities in the world in the field of technology. The list purports to be based on the opinion of experts in the field. However, like the ranking for science, it cannot be considered valid. First, let us compare the top 20 universities according to peer review and then the top 20 according to the data provided by THES for citations per paper, a reasonable measure of the quality of research.

First, the peer review:

1. MIT
2. Berkeley
3. Indian Institutes of Technology (all of them)
4. Imperial College London
5. Stanford
6. Cambridge
7. Tokyo
8. National University of Singapore
9. Caltech
10. Carnegie-Mellon
11. Oxford
12. ETH Zurich
13. Delft University of Technology
14. Tsing Hua
15. Nanyang Technological University
16. Melbourne
17. Hong Kong University of Science and Technology
18. Tokyo Institute of Technology
19. New South Wales
20. Beijing (Peking University)

Now, the top twenty ranked according to citations per paper:

1. Caltech
2. Harvard
3. Yale
4. Stanford
5. Berkeley
6. University of California at Santa Barbara
7. Princeton
8. Technical University of Denmark
9. University of California at San Diego
10. MIT
11. Oxford
12. University of Pennsylvania
13. Pennsylvania State University
14. Cornell
15. Johns Hopkins
16. Boston
17. Northwestern
18. Columbia
19. Washington (St. Louis)
20. Technion (Israel)

Notice that the Indian Institutes of Technology, Tokyo, National University of Singapore, Nanyang Technological University, Tsing Hua, Melbourne, New South Wales and Beijing are not ranked in the top 20 according to quality of published research. Admittedly, it is possible that in this field a substantial amount of research consists of unpublished reports for state organizations or private companies but this would surely be more likely to affect American rather than Asian or Australian universities.

Looking a bit more closely at some of the universities in the top twenty for technology according to the peer review, we find that, when ranked for citations per paper, Tokyo is in 59th place, National University of Singapore 70th, Tsing Hua 86th, Indian Institutes of Technology 88th, Melbourne 35th, New South Wales 71st, and Beijing 76th. Even Cambridge, sixth in the peer review, falls to 29th.

Again, there are a large number of institutions that did not even produce enough papers to be worth counting, raising the question of how they could be sufficiently well known for there to be peers to vote for them. This is the list:

Indian Institutes of Technology
Korean Advanced Institute of Science and Technology
Tokyo Institute of Technology
Auckland
Royal Institute of Technology Sweden
Indian Institutes of Management
Queensland University of Technology
Adelaide
Sydney Technological University
Chulalongkorn
RMIT
Fudan
Nanjing

Once again there is a very clear pattern of the peer review massively favoring Asian and Australasian universities. Once again, I can see no other explanation than an overrepresentation of these regions, and a somewhat less glaring one of Europe, in the survey of peers combined with questions that allow or encourage respondents to nominate universities from their own regions or countries.

It is also rather disturbing that once again Cambridge does so much better on the peer review than on citations. Is it possible that THES and QS are manipulating the peer review to create an artificial race for supremacy – “Best of British Closing in on Uncle Sam’s finest”. Would it be cynical to suspect that next year Cambridge and Harvard will be in a circulation-boosting race for the number one position?

According to citations per paper, Harvard was 4th for science, second for technology and 6th for biomedicine while Cambridge was 19th, 29th and 9th.

For the peer review, Cambridge was first for science, 6th for technology and first for biomedicine. Harvard was 4th, 23rd and second.

Overall, there is no significant relationship between the peer review and research quality as measured by citations per paper. The correlation between the two is .169, which is statistically insignificant. For the few Asian universities that produced enough research to be counted, the correlation is .009, effectively no better than chance.

At the risk of being boringly repetitive, it is becoming clearer and clearer that the THES rankings, especially the peer review component, are devoid of validity.