
Wednesday, September 12, 2012

What happened to MIT and Cambridge?

QS now has a new world number one university. Massachusetts Institute of Technology (MIT) has replaced Cambridge and overtaken Harvard.

Unfortunately, this change probably means very little.

Overall the change was very slight. MIT rose from 99.21 to 100 while Cambridge fell from 100 to 99.8.

There was no change in the two surveys that account for half of the weighting. MIT and Cambridge both scored 100 for the academic and the employer surveys in 2011 and in 2012.

On the citations per faculty indicator, Cambridge did quite a bit better this year, rising from 92.7 to 97, while MIT fell slightly from 99.6 to 99.3. This could mean that, compared to front-runner Caltech, Cambridge has produced more articles, had its articles cited more often, or increased its faculty numbers, or that there was some combination of the three.

For faculty student ratio, Cambridge fell slightly while MIT's score remained the same. For international students both fell slightly.

What made the difference was the international faculty indicator. Cambridge's score went from 98.4 to 98.2 while MIT's rose from 50 to 86.4, which means 1.82 more points in the total ranking, more than enough to overcome Cambridge's improvement in citations and pull slightly ahead.
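As a quick check of that arithmetic, here is a minimal sketch, assuming the indicator scores feed linearly into the overall score at QS's published 5% weighting for international faculty:

```python
# QS's published weighting for the international faculty indicator.
WEIGHT = 0.05

# MIT's year-on-year change on that indicator (2011 -> 2012).
gain = (86.4 - 50.0) * WEIGHT

print(round(gain, 2))  # 1.82 extra points in the overall score
```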

Having done some rapid switching between the ranking scores and university statistics, I would estimate that a score of 50 represents about 15% international faculty and a score of 86 about 30%.

It is most unlikely that MIT has in one year recruited about 150 international faculty while getting rid of a similar number of American faculty. We would surely have heard about it. After all, even the allocation of office space at MIT makes national headlines. Even more so if they had boosted the total number of faculty.

International faculty is a notoriously difficult statistic for data collectors. "International" could mean anything from getting a degree abroad to being a temporary visiting scholar. QS are quite clear that they mean current national status but this may not always reach the branch campuses, institutes, departments and programs where data is born before starting the long painful journey to the world rankings.

I suspect that what happened in the case of MIT is that somebody somewhere told somebody somewhere that permanent residents should be counted as international or that faculty who forgot to fill out a form were moved into the international category or something like that.

All this draws attention to what may have been a major mistake by QS, that is, configuring the surveys so that a large number of universities are squashed together at the top. For the academic survey, there are 11 universities with a score of 100 and another 17 with a score of 99 to 99.9. Consequently, differentiating between universities at the top depends largely on data about students and faculty submitted by institutions themselves. Even if they are totally scrupulous about finding and disseminating data, there are all sorts of things that can cause problems at each stage of the process.

I have not heard any official reaction yet from MIT. I believe that there are some people there who are quite good at counting things, so maybe there will be a comment or an explanation soon.






Wednesday, January 11, 2012

The end of the university as we know it?


MIT has already been putting its course materials online for anyone to access free of charge. Now they are going a step further.

"MIT today announced the launch of an online learning initiative internally called “MITx.” MITx will offer a portfolio of MIT courses through an online interactive learning platform that will:
  • organize and present course material to enable students to learn at their own pace
  • feature interactivity, online laboratories and student-to-student communication
  • allow for the individual assessment of any student’s work and allow students who demonstrate their mastery of subjects to earn a certificate of completion awarded by MITx
  • operate on an open-source, scalable software infrastructure in order to make it continuously improving and readily available to other educational institutions.
MIT expects that this learning platform will enhance the educational experience of its on-campus students, offering them online tools that supplement and enrich their classroom and laboratory experiences. MIT also expects that MITx will eventually host a virtual community of millions of learners around the world."

There are a lot of questions that come to mind. Will students be assessed according to the same standards as conventional MIT students? If someone accumulates sufficient certificates of completion, will they be entitled to an MITx degree? What will happen if employers and graduate schools start accepting MITx certificates as equivalent to standard academic credentials? If so, will MIT be able to resist the temptation to start charging hefty fees for a certificate?

MIT may, perhaps unwittingly, have started a process that will end with universities becoming something very different.

Sunday, August 14, 2011

Press release from Shanghai

Here is the press release from Shanghai Jiao Tong University giving more details about this year's rankings.

Monday, August 15, 2011
Shanghai, People's Republic of China
The Center for World-Class Universities of Shanghai Jiao Tong University released today the 2011 Academic Ranking of World Universities (ARWU), marking its 9th consecutive year of measuring the performance of top universities worldwide.
Harvard University tops the 2011 list; other Top 10 universities are: Stanford, MIT, Berkeley, Cambridge, Caltech, Princeton, Columbia, Chicago and Oxford. In Continental Europe, ETH Zurich (23rd) in Switzerland takes first place, followed by Paris-Sud (40th) and Pierre and Marie Curie (41st) in France. The best ranked universities in Asia are University of Tokyo (21st) and Kyoto University (24th) in Japan.
Three universities are ranked among Top 100 for the first time in the history of ARWU: University of Geneva (73rd), University of Queensland (88th) and University of Frankfurt (100th). As a result, the number of Top 100 universities in Switzerland, Australia and Germany increases to 4, 4 and 6 respectively.
Ten universities first enter into Top 500, among them University of Malaya in Malaysia and University of Zagreb in Croatia enable their home countries to be represented, together with other 40 countries, in the 2011 ARWU list.
Progress of universities in Middle East countries is remarkable. King Saud University in Saudi Arabia first appears in Top 300; King Fahd University of Petroleum & Minerals in Saudi Arabia, Istanbul University in Turkey and University of Teheran in Iran move up in Top 400 for the first time; Cairo University in Egypt is back to Top 500 after five years of staggering outside.
The number of Chinese universities in Top 500 increases to 35 in 2011, with National Taiwan University, Chinese University of Hong Kong, and Tsinghua University ranked among Top 200.
The Center for World-Class Universities of Shanghai Jiao Tong University also released the 2011 Academic Ranking of World Universities by Broad Subject Fields (ARWU-FIELD) and 2011 Academic Ranking of World Universities by Subject Field (ARWU-SUBJECT). Top 100 universities in five broad subject fields and in five selected subject fields are listed, where the best five universities are:
Natural Sciences and Mathematics – Harvard, Berkeley, Princeton, Caltech and Cambridge
Engineering/Technology and Computer Sciences – MIT, Stanford, Berkeley, UIUC and Georgia Tech
Life and Agriculture Sciences – Harvard, MIT, UC San Francisco, Cambridge and Washington (Seattle)
Clinical Medicine and Pharmacy – Harvard, UC San Francisco, Washington (Seattle), Johns Hopkins and Columbia
Social Sciences – Harvard, Chicago, MIT, Berkeley and Columbia
Mathematics – Princeton, Harvard, Berkeley, Stanford and Cambridge
Physics – MIT, Harvard, Caltech,Princeton and Berkeley
Chemistry – Harvard, Berkeley, Stanford, Cambridge and ETH Zurich
Computer Science – Stanford, MIT, Berkeley, Princeton and Harvard
Economics/Business – Harvard, Chicago, MIT, Berkeley and Columbia
The complete lists and detailed methodologies can be found at the Academic Ranking of World Universities website at http://www.ShanghaiRanking.com/.
Academic Ranking of World Universities (ARWU): Starting from 2003, ARWU has been presenting the world top 500 universities annually based on a set of objective indicators and third-party data. ARWU has been recognized as the precursor of global university rankings and the most trustworthy list. ARWU uses six objective indicators to rank world universities, including the number of alumni and staff winning Nobel Prizes and Fields Medals, number of highly cited researchers selected by Thomson Scientific, number of articles published in journals of Nature and Science, number of articles indexed in Science Citation Index - Expanded and Social Sciences Citation Index, and per capita performance with respect to the size of an institution. More than 1000 universities are actually ranked by ARWU every year and the best 500 are published.
Center for World-Class Universities of Shanghai Jiao Tong University (CWCU): CWCU has been focusing on the study of world-class universities for many years, published the first Chinese-language book titled world-class universities and co-published the first English book titled world-class universities with European Centre for Higher Education of UNESCO. CWCU initiated the "International Conference on World-Class Universities" in 2005 and organizes the conference every second year, which attracts a large number of participants from all major countries. CWCU endeavors to build databases of major research universities in the world and clearinghouse of literature on world-class universities, and provide consultation for governments and universities.
Contact: Dr. Ying CHENG at ShanghaiRanking@gmail.com

Friday, April 01, 2011

Best Grad Schools

The US News Graduate School Rankings were published on March 15th. Here are the top universities in various subject areas.

Business: Stanford

Education: Vanderbilt

Engineering:  MIT

Law:  Yale

Medical: Harvard

Biology: Stanford

Chemistry: Caltech, MIT, UC Berkeley

Computer Science: Carnegie-Mellon, MIT, Stanford, UC Berkeley

Earth Sciences: Caltech, MIT

Mathematics: MIT

Physics: Caltech, Harvard, MIT, Stanford

Statistics: Stanford

Library and Information Studies: Illinois at Urbana-Champaign

Criminology: Maryland -- College Park

Economics: Harvard, MIT, Princeton, Chicago

English: UC Berkeley

History: Princeton

Political Science: Harvard, Princeton, Stanford

Psychology: Stanford, UC Berkeley

Sociology: UC Berkeley

Public Affairs: Syracuse

Fine Arts: Rhode Island School of Design

Saturday, March 01, 2025

China, AI, and Rankings


Recently we have seen the crumbling of many illusions. It now seems hard to believe but only a few weeks ago we were assured that President Biden was as sharp as a fiddle or as fit as a tack or something. Also, the Russian economy was collapsing under the weight of Western sanctions. Or again, the presidential race was running neck and neck, and probably heading for a decisive Democrat vote, foretold by that state-of-the-art poll from Iowa.

An equally significant illusion was the supremacy of Western, especially Anglophone, science and scholarship. The remarkable growth of Asian research has often been dismissed as imitative and uncreative and anyway much less important than the amazing things Western universities are doing for sustainability and diversity.

The two big UK rankings, THE and QS, highly regarded by governments and media, have been instrumental in the underestimation of Chinese science and the overestimation of that of the West. Oxford is in first place in the THE world rankings and in no other, while MIT leads the QS world rankings and no other. Indeed, Leiden Ranking, probably the most respected ranking among actual researchers, has them in 25th and 91st place for publications.

The myopia of the Western rankers has been revealed by recent events in the world of AI. The release of the large language model (LLM) DeepSeek has caused much soul searching among western academics and scientists. It looks as good as ChatGPT and the others, probably better, and, it seems, very much cheaper. There will likely be more to come in the near future. The researchers and developers were mainly “researchers and developers from China’s elite universities, with minimal overseas education,” according to DeepSeek itself, including Peking University, Tsinghua University, Zhejiang University, Beihang University, Shanghai Jiao Tong University and Nanjing University. There are some overseas links, Monash, Stanford, Texas, but these are less significant.

Some of the anguish or the excitement may be premature. DeepSeek may inspire another Sputnik moment, although that does seem rather unlikely at the moment, and Western companies and institutions may surge ahead again. Also, I suspect, the cheapness may have been exaggerated. Like its western counterparts, DeepSeek has places that it would prefer not to go to – Tiananmen Square and the Uighurs among others – and that could undermine its validity in the long run.

But it is a remarkable achievement nonetheless and it is yet another example of the emerging technological prowess of the Chinese economy. We have seen China build a network of high-speed railways. Compare that with the infamous Los Angeles to San Francisco railroad. Compare China’s military modernization with the state of European navies and armies.

We might add, compare the steady advance of Chinese universities in the output and quality of research and innovation with the stagnation and decline of western academia. The main western rankers, THE and QS, have consistently rated American and British universities more favourably than those in Asia, especially China. Recently it seems that the two dominant rankers have been doing their best to lend a hand to western universities while holding back those in Asia. THE started their Impact rankings with the intention of allowing universities to show the wonderful things they are doing to promote sustainability, an opportunity that has been seized by some Canadian, Australian, and British universities but totally ignored by China. QS has introduced a new sustainability indicator into its world rankings, in which Chinese universities do not do well.

 

AI Rankings

QS and THE have been especially unobservant about the rise of China in computer science, and more specifically in the field of AI. This is in contrast to those rankings based largely on research and derived from public verifiable data.

There are currently four rankings that focus on AI. QS has a ranking for Data Science and Artificial Intelligence and it is very much dominated by Western universities. The top 20 includes 10 US institutions and none from Mainland China, although it does include the Hong Kong University of Science and Technology and the Chinese University of Hong Kong. Massachusetts Institute of Technology is in first place and the best Mainland university is Tongji in equal 36th place.

Now let’s look at EduRank, a rather obscure firm, probably located in California, whose methodology might be based on publications, citations and other metrics. Here the top 20 for AI has 15 US universities, with Stanford University in first place. The best performing Chinese university is Tsinghua at 9th place.  

University Ranking of Academic Performance (URAP) is published by the Middle East Technical University in Ankara. Their most recent AI ranking has Tsinghua in first place with Carnegie Mellon in 10th. The top 20 has 12 Mainland universities and only three American.

The US News Best Global Universities ranking for AI is even more emphatic in its assertion of Chinese superiority. Twelve out of the top 20 universities for AI are Mainland Chinese, with Tsinghua at number one. The best US university is Carnegie Mellon University in 29th place, well behind a few universities from Australia, Hong Kong, and Singapore.

 

Computer Science Rankings

Turning to the broader field of Computer Science, the THE Computer Science rankings have Oxford in first place, MIT in third, and Peking University in twelfth. Similarly, QS has a Computer Science and Information Systems subject ranking, the most recent edition of which shows MIT first, Oxford fourth, and Tsinghua eleventh.

In contrast, in the National Taiwan University Computer Science Rankings Tsinghua is first, Stanford seventh, and Oxford 171st (!). According to the Scimago Institutions Rankings for universities, Tsinghua is first for Computer Science, MIT 9th, Oxford 22nd. The Iran-based ISC World University Rankings for Computer and Information Sciences place Tsinghua first, MIT 11th, and Oxford 18th. In the US News Best Global Universities Computer Science and Engineering ranking Tsinghua is first, MIT fifth, and Oxford 18th.

In the Shanghai subject rankings MIT is still just ahead of Tsinghua, mainly because of the World Class Output metric which includes international academic awards since 1991.

It seems then that QS, THE, and EduRank have significantly exaggerated the capabilities of elite Western universities in AI and Computer Science generally and underestimated those of Chinese and other Asian schools. It seems ironic that THE and, to a lesser extent, QS are regarded as arbiters of excellence while URAP, Scimago, the National Taiwan University rankings, and even US News are largely ignored.

 

 

Thursday, June 16, 2011

The QS Arts and Humanities Rankings

See here for the complete rankings.

Here are the top five in each indicator, academic survey, employer survey, citations per paper of the QS subject rankings.

There is nothing surprising about the leaders in the two surveys. But the citations indicator is another matter. Perhaps QS has followed Times Higher in uncovering "clear pockets of excellence". Would any specialists out there like to comment on Newcastle University (the English one, not the Australian) and Durham as first for history -- something to do with proximity to Hadrian's Wall? What about Brown for Philosophy, Stellenbosch for Geography and Area Studies and Padua for linguistics?

English Language and Literature
Academic survey
1.  Harvard
2.  Oxford
3.  Cambridge
4.  UC Berkeley
5.  Yale

Employer Survey
1.  Oxford
2.  Cambridge
3.  Harvard
4.  MIT
5.  UC Los Angeles

No ranking for citations

Modern Languages
Academic Survey
1.  Harvard
2.  UC Berkeley
3.  Oxford
4.  Cambridge
5.  Cornell

Employer Survey
1.  Harvard
2.  Oxford
3.  Cambridge
4.  MIT
5.  Stanford

No rankings for citations

History
Academic Survey
1.  Harvard
2.  Cambridge
3.  Oxford
4.  Yale
5.  UC Berkeley

Employer Survey
1. Oxford
2.  Harvard
3.  Cambridge
4.  University of Pennsylvania
5. Yale

Citations per Paper
1=  Newcastle (UK)
1=  Durham
3.   Liverpool
4.   George Washington
5.   University of Washington

Philosophy
Academic Survey
1.  Oxford
2.  Harvard
3.  Cambridge
4.  UC Berkeley
5.  Princeton

Employer Survey
1.  Cambridge
2.  Harvard
3.  Oxford
4.  MIT
5.  UC Berkeley

Citations per Paper
1.  Brown
2.  Melbourne
3.  MIT
4=  Rutgers
4=  Zurich


Geography and Area Studies
Academic survey
1.  UC Berkeley
2.  Cambridge
3.  Oxford
4.  Harvard
5.  Tokyo

Employer Survey
1.  Harvard
2.  Cambridge
3.  Oxford
4.  MIT
5.  UC Berkeley

Citations per Paper
1.  Stellenbosch
2. Lancaster
3.  Durham
4.  Queen Mary London
5.  University of Kansas


Linguistics
Academic Survey
1.  Cambridge
2.  Oxford
3.  Harvard
4.  UC Berkeley
5.  Stanford

Employer Survey
1.  Harvard
2.  Oxford
3.  MIT
4.  UC Berkeley
5.  Melbourne

Citations per Paper
1.  Padua
2.  Boston University
3.  York University (UK)
4.  Princeton
5.  Harvard

Sunday, September 11, 2016

Waiting for the THE world rankings



The world, having recovered from the shocks of the Shanghai, QS and RUR rankings, now waits for the THE world rankings, especially the research impact indicator measured by field normalised citations.

It might be helpful to show the top 5 universities for this criterion since 2010-11.

2010-11
1. Caltech
2. MIT
3. Princeton
4. Alexandria University
5. UC Santa Cruz

2011-12
1. Princeton
2. MIT
3. Caltech
4. UC Santa Barbara
5. Rice University

2012-13
1. Rice University
2. National Research Nuclear University MEPhI
3. MIT
4. UC Santa Cruz
5. Princeton

2013-14
1. MIT
2. Tokyo Metropolitan University
3. Rice University
4. UC Santa Cruz
5. Caltech

2014-15
1. MIT
2. UC Santa Cruz
3. Tokyo Metropolitan University
4. Rice University
5. Caltech

2015-16
1. St George's, University of London
2. Stanford University
3. UC Santa Cruz
4. Caltech
5. Harvard

Notice that no university has been in the top five for citations in every year.

Last year THE introduced some changes to this indicator, one of which was to exclude papers with more than 1000 authors from the citation count. This, along with a dilution of the regional modification that gave a bonus to universities in low scoring countries, had a devastating effect on some universities in France, Korea, Japan, Morocco, Chile and Turkey.

The citations indicator has always been an embarrassment to THE, throwing up a number of improbable front runners aka previously undiscovered pockets of excellence. Last year they introduced some reforms but not enough. It would be a good idea for THE to get rid of the regional modification altogether, to introduce full scale fractional counting, to reduce the weighting assigned to citations, to exclude self-citations and secondary affiliations and to include more than one measure of research impact and research quality.

Excluding the papers, mainly in particle physics, with 1,000 plus "authors" meant avoiding the bizarre situation where a contributor to a single paper with 2,000 authors and 2,000 citations would get the same credit as 1,000 authors writing a thousand papers each of which had been cited twice.

But this measure also meant that some of the most significant scientific activity of the century would not be counted in the rankings. The best solution would have been fractional counting, distributing the citations among all of the institutions or contributors, and in fact THE did this for their pilot African rankings at the University of Johannesburg.
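For illustration, here is a minimal sketch of full fractional counting, distributing a paper's citations among institutions in proportion to their share of the author list (the function name and data shapes are mine, not anything THE has published):

```python
from collections import Counter

def fractional_credit(citations, author_affiliations):
    """Split a paper's citations across institutions in proportion to
    each institution's share of the author list (full fractional counting)."""
    counts = Counter(author_affiliations)
    total = len(author_affiliations)
    return {inst: citations * n / total for inst, n in counts.items()}

# A 2,000-author paper with 2,000 citations: an institution that
# contributed one author gets credit for exactly one citation.
affiliations = ["CERN"] * 1999 + ["Small University"]
print(fractional_credit(2000, affiliations)["Small University"])  # 1.0
```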

Now, THE have announced a change for this year's rankings. According to their data chief Duncan Ross:

" Last year we excluded a small number of papers with more than 1,000 authors. I won’t rehearse the arguments for their exclusion here, but we said at the time that we would try to identify a way to re-include them that would prevent the distorting effect that they had on the overall metric for a few universities.


This year they are included – although they will be treated differently from other papers. Every university with researchers who author a kilo-author paper will receive at least 5 per cent credit for the paper – rising proportionally to the number of authors that the university has.
This is the first time that we have used a proportional measure in our citations score, and we will be monitoring it with interest.

We’re also pleased that this year the calculation of the Times Higher Education World University Rankings has been subject to independent audit by professional services firm PricewaterhouseCoopers (PwC). "
This could have perverse consequences. If an institution has one contributor to a 1,000-author paper with 2,000 citations then that author will get 2,000 citations for the university. But if there are 1,001 authors then he or she would bring in only 100 citations, the 5 per cent minimum.
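To see the cliff edge in numbers, here is a sketch of the rule as described (my reading of Ross's wording; THE has not published the exact formula):

```python
def citation_credit(citations, total_authors, univ_authors):
    """Papers with up to 1,000 authors are counted in full; kilo-author
    papers give each contributing university at least 5% of the credit,
    rising with its share of the author list."""
    if total_authors <= 1000:
        return float(citations)
    return citations * max(0.05, univ_authors / total_authors)

print(citation_credit(2000, 1000, 1))  # 2000.0, just under the threshold
print(citation_credit(2000, 1001, 1))  # 100.0, down to the 5% floor
```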

It is possible that we will see a cluster of papers with 998, 999, 1000 authors as institutions remove their researchers from the author lists or project leaders start capping the number of contributors.

This could be a way of finding out if research intensive universities really do care about the THE rankings.

Similarly, QS now excludes papers with more than ten contributing institutions. If researchers are concerned about the QS rankings they will ensure that the number of institutions does not go above ten. Let's see if we start getting large numbers of papers with ten institutions but none or few with 11, 12, 13 and so on.

I am wondering why THE would bother introducing this relatively small change. Wouldn't it make more sense to introduce a lot of small changes all at once and get the resulting volatility over and done with?

I wonder if this has something to do with the THE world academic summit being held at Berkeley on 26-28 September in cooperation with UC Berkeley. Last year Berkeley fell from 8th to 13th in the THE world rankings. Since it is a contributor to several multi-contributor papers it is possible that the partial re-inclusion of hyper-papers will help the university back into the top ten.



Wednesday, September 12, 2012

Disappointing

Here is a comment from MIT News. There is nothing about whether MIT did in fact recruit a load of new international faculty between 2011 and 2012.


For the first time, MIT has been ranked as the world’s top university in the QS World University Rankings. The No. 1 ranking moves the Institute up two spots from its third-place ranking last year; two years ago, MIT was ranked fifth.

The full 2012-13 rankings — published by Quacquarelli Symonds, an organization specializing in education and study abroad — can be found at http://www.topuniversities.com/. QS rankings are based on research quality, graduate employment, teaching quality, and an assessment of the global diversity of faculty and students.

MIT was also ranked the world’s top university in 11 of 28 disciplines ranked by QS, including all five in the “engineering and technology” category: computer science, chemical engineering, civil engineering, electrical engineering and mechanical engineering.

QS also ranked the Institute as the world’s best university in chemistry, economics and econometrics, linguistics, materials science, mathematics, and physics and astronomy.
The Institute ranked among the top five institutions worldwide in another five QS disciplines: accounting and finance (2), biological sciences (2), statistics and operational research (3), environmental sciences (4) and communication and media studies (5).

Rounding out the top five universities in the QS ranking were the University of Cambridge, Harvard University, University College London and the University of Oxford.

Tuesday, April 20, 2010

Graduate School Rankings



US News has released its annual ranking of American graduate schools. These are subject rankings rather than holistic.



The top schools in selected categories are:



Business: Harvard, Stanford

Education: Vanderbilt

Engineering: MIT

Law: Yale

Medical Research: Harvard

Medical Primary Care: University of Washington, Seattle

Biological Sciences: Stanford

Chemistry: Caltech

Computer Science: Carnegie-Mellon, MIT, Stanford

Earth Sciences: Caltech, MIT

Mathematics: MIT

Physics: Caltech, MIT, Berkeley

Statistics: Stanford

Economics: Harvard, Princeton, Chicago, Stanford

Library and Information Sciences: University of Illinois: Urbana-Champaign, University of North Carolina: Chapel Hill

English: Berkeley

Psychology: Stanford, Berkeley

History: Princeton, Stanford, Berkeley, Yale

Public Affairs: Syracuse

Fine Arts: Rhode Island School of Design

Wednesday, May 18, 2011

The QS Life Sciences Ranking Continued

Looking at the scores for the three indicators, academic survey, employer survey and citations per paper, we find the situation is similar to that of the engineering rankings released last month. There is a reasonably high correlation between the scores for the two surveys:

Medicine                     .720
Biological Sciences      .747
Psychology                  .570

The correlations between the score for citations per paper and the academic survey are low but still significant:
Medicine                          .290
Biological Sciences           .177
Psychology                       .217

The correlations between the citations indicator and the employer survey are low or very low and insignificant:
Medicine                               .129
Biological Sciences                    .015
Psychology                            -.027
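For anyone who wants to replicate this sort of check, the figures above are plain Pearson correlations computed over the published indicator scores for the ranked universities. A minimal sketch (the score lists below are placeholders, not the actual QS data):

```python
from scipy.stats import pearsonr

# Placeholder indicator scores for the same six universities.
academic_survey = [100.0, 98.7, 97.2, 95.1, 93.4, 90.8]
citations_per_paper = [91.3, 99.8, 84.6, 88.0, 79.5, 83.1]

r, p = pearsonr(academic_survey, citations_per_paper)
print(f"r = {r:.3f}, p = {p:.3f}")  # whether r is significant depends on n
```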


Looking at the top five universities for each indicator, there are no surprises as far as the surveys are concerned, but some of the universities in the top five for citations do cause some eyebrow raising. Arizona State University? University of Cincinnati? Tokyo Metropolitan University? Perhaps these are hitherto unnoticed pockets of excellence of the Alexandrian kind?

Top Five in Medicine

Academic Survey

1.    Harvard
2.    Cambridge
3.    Oxford
4.    Stanford
5.    Yale

Employer Survey

1.     Harvard
2.     Cambridge
3.     Oxford
4.     MIT
5.     Stanford

Citations per Paper 

1.    MIT
2.    Rockefeller University
3.    Caltech
4.    The University of Texas M. D. Anderson Cancer Center
5.     Harvard


Top Five in Biological Sciences

Academic Survey

1.    Cambridge
2.    Harvard
3.    UC Berkeley
4.    Oxford
5.    MIT

Employer Survey


1.  Harvard
2.  Cambridge
3.  MIT
4.  Oxford
5.  Stanford

Citations per Paper

1.  Arizona State University
2.   Tokyo Metropolitan University
3.   MIT
4.   Rockefeller University
5.   Harvard

Top Five in Psychology

Academic Survey

1.    Harvard
2.   Stanford
3.    UC Berkeley
4.    Cambridge
5.    Oxford

Employer Survey 

1.     Cambridge
2.     Harvard
3.     Oxford
4.     Stanford
5.     UC Berkeley

Citations per Paper

1.     UC Irvine
2.     Emory
3.     University of Cincinnati
4.     Princeton
5.     Dartmouth College

Thursday, February 06, 2014

The Best Universities for Research

It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings starting in August. We have had young university rankings, best student cities, most international universities, and BRICS rankings.

Something is missing though, a ranking of top universities for research. So to assuage the pent-up demand, here are the top 20 universities for research according to six different ranking indicators. There is considerable variation, with only two universities, Harvard and Stanford, appearing in every list.

First the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five year period.

1.   Harvard
2.   Tokyo
3.   Toronto
4.   Tsinghua
5.   Sao Paulo
6.   Michigan Ann Arbor
7.   Johns Hopkins
8.   UCLA
9.   Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia

Next we have the normalized impact scores from Scimago, which measure citations to research publications taking account of field. This might be considered a measure of the quality of research rather than quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers.

1.   MIT
2.   Harvard
3.   University of California San Francisco
4=  Stanford
4=  Princeton
6.   Duke
7.   Rice
8.   Chicago
9=  Columbia
9=  University of California Berkeley
9=  University of California Santa Cruz
12.  University Of California Santa Barbara
13.  Boston University
14= Johns Hopkins
14= University of Pennsylvania
16.  University of California San Diego
17= UCLA
17= University of Washington
17= Washington University in St Louis
20.  Oxford

The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field so medical schools and technological institutes can do very well.

1.   Weizmann Institute of Science
2.   Caltech
3.   Rockefeller University
4.   Harvard
5.   Stanford
6.   Gwangju Institute of Science and Technology
7.   UCLA
8.   University of California San Francisco
9.   Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell

The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low impact scores. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, providing that total publications do not rise too much.
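The core of a field-and-year normalised indicator can be sketched in a few lines: divide each paper's citation count by the world average for its field and publication year, then average the ratios. This is only the general shape, ignoring THE's 250-field granularity and the regional modification:

```python
def normalized_impact(papers, world_avg):
    """papers: list of dicts with 'citations', 'field', and 'year'.
    world_avg: maps (field, year) to mean citations per paper worldwide.
    Returns the average ratio of actual to expected citations."""
    ratios = [p["citations"] / world_avg[(p["field"], p["year"])] for p in papers]
    return sum(ratios) / len(ratios)

papers = [{"citations": 12, "field": "physics", "year": 2012},
          {"citations": 3, "field": "history", "year": 2012}]
world_avg = {("physics", 2012): 8.0, ("history", 2012): 1.5}
print(normalized_impact(papers, world_avg))  # 1.75, i.e. 75% above world average
```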

1=  MIT
1=  Tokyo Metropolitan University
3=  University of California Santa Cruz
3=  Rice
5.   Caltech
6.   Princeton
7.   University of California Santa Barbara
8.   University of California Berkeley
9=  Harvard
9=  Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14.  University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19.  Washington University in St Louis
20.  Boston College

The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year so this indicator could become less stable.

1.   Harvard
2.   Stanford
3.   MIT
4.   University of California Berkeley
5.   Princeton
6.   Michigan Ann Arbor
7.   University of California San Diego
8.   Yale
9.   University of Pennsylvania
10.   UCLA
11=  Caltech
11=  Columbia
13.   University of Washington
14.   Cornell
15.   Cambridge
16.   University of California San Francisco
17.   Chicago
18.   University of Wisconsin Madison
19.   University of Minnesota Twin Cities
20.   Oxford


Finally, the MNCS indicator from the Leiden Ranking, which is the mean number of field-normalized citations per paper. It is possible for a few widely cited papers in the right discipline to have a disproportionate effect. The high placing for Gottingen results from a single computer science paper whose citation is required for intellectual property reasons.

1.    MIT
2.    Gottingen
3.    Princeton
4.    Caltech
5.    Stanford
6.    Rice
7.    University of California Santa Barbara
8.    University of California Berkeley
9     Harvard
10   University of California Santa Cruz
11.  EPF Lausanne
12.  Yale
13   University of California San Francisco
14.  Chicago
15.  University of California San Diego
16.  Northwestern
17.  University of  Colorado Boulder
18.  Columbia
19.  University of Texas Austin
20.  UCLA




Thursday, April 03, 2025

The Decline of American Universities: The View From Leiden, Ankara and Madrid


There has been a lot of talk recently about the crisis or crises of American universities. Certainly, if we look at the deteriorating financial situation, the thuggish behavior of demonstrators at Ivy League schools or big state universities, scandals about admissions, or fraudulent research then, yes, American universities do seem to be in a very bad way.

However, financial problems, violent extremism, corruption, and research fraud can be found almost everywhere. Is there a way to compare large numbers of institutions across international frontiers? There is no perfect mode of assessment, but global rankings can tell us quite a bit about the health or sickness of higher education and research.

When Americans think about university rankings, it is usually America’s Best Colleges published for more than four decades by US News (USN) that comes to mind. In the rest of the world, global rankings are more significant. The leader in public approval, if we mean governments, university leaders, and the media, is clearly the Times Higher Education (THE) World University Rankings. These rankings are characterised by bizarrely implausible results, sometimes dismissed as outliers or quirky statistics. In the last few years – sorry to keep repeating this -- we have seen Anglia Ruskin University and Babol Noshirvani University of Technology leading the world for research impact, Macau University of Science and Technology and the University of Macau superstars for internationalisation, Anadolu University and Makerere University in the global top ten for knowledge transfer. No matter, as long as the composite top fifty scores look reasonable from a traditional perspective and the usual heroes, Harvard, MIT, Oxford, are at the top or not too far away.

QS, another British company, was once THE’s data supplier but has pursued an independent path since 2010. Its rankings are more sensible than THE's, but it also seems to have an undue regard for the old Western elite. In its recent world subject rankings, Harvard was first in the world for all five broad subjects except Engineering and Technology, where the crown went to MIT, and Oxford was second in all but one.

These two, along with the Shanghai Rankings by virtue of their age, and occasionally the US News Best Global Universities, because of the fame of their national rankings, constitute the NBA of the ranking world. They are cited endlessly by the global media and provide lists for the appointment of external examiners and editorial boards and for recruitment, promotion, and admissions and even data for the immigration policies of the UK, Hong Kong, and the Netherlands.

However, there are other rankings based on publicly accessible data, transparent methodologies, and consistent procedures. They are largely ignored by those with power and influence, but they tell a coherent and factual story. They are published by universities or research centers with limited budgets and small but well-qualified research teams.

I will take three: Leiden Ranking, produced by the Centre for Science and Technology Studies (CWTS) at Leiden University, the Netherlands, University Ranking by Academic Performance (URAP) by the Informatics Institute at the Middle East Technical University in Ankara, and the SCImago Institutions Rankings (SIR) published by the SCImago Lab in Spain, which has links with the Spanish National Research Council and Spanish universities.

Leiden Ranking

Let’s start by taking a look at Leiden Ranking. The publishers decline to construct any composite or combined ranking, which limits its popular appeal. The default metric, which appears when you land on the list page, is just the number of articles and reviews in core journals in the Web of Science database. Back in 2006-2009, Harvard was in first place here, and other US universities filled up the upper levels of the ranking. The University of Michigan was third, and the University of California Los Angeles (UCLA) was fifth. Chinese universities were lagging behind. Zhejiang University in Hangzhou was 16th, and Tsinghua University in Beijing 32nd.

Fast forward to publications between 2019 and 2022, and Zhejiang has overtaken Harvard and pushed it into second place. The top twenty now includes several Chinese universities, some now world-famous, but others, such as Central South University or Jilin University, scarcely known in the West.

Much of this decline is due to China's advance at the expense of US schools, but that is not the whole story. UCLA has now fallen behind Toronto, São Paulo, Seoul National University, Oxford, University College London, Melbourne, Tokyo, and Copenhagen.

You could say that is just quantity, not quality, and maybe we should be looking at high-impact publications. In that case, we should look at the publications in the top 10% by citations, where Zhejiang is still ahead of Harvard. It is only when we reach the top 1% that Harvard still has a lead, and one wonders how long that will last.

That is just the number of publications. Academics tend to judge scientific quality by the number of citations that a work receives. Leiden Ranking no longer ranks universities by citations, perhaps with good reason, but does provide data in the individual profiles. Here we see Harvard’s citations per paper score rising from 13.31 in 2006-2009 to 15.71 in 2019-2022, while Zhejiang’s rises from 3.38 to 11.43. So, Harvard is still ahead for citations, but the gap is closing rapidly and will probably be gone in three or four years.

 

URAP

Turning to the URAP, which is based on a bundle of research metrics, Harvard was first in the combined rankings back in 2013-2014, and the best-performing Chinese institution was Peking University, in 51st place. Now, in the recently published 2024-2025 rankings, Harvard is still first, but Peking is now tenth, and Zhejiang and Tsinghua have also entered the top ten.

Other elite American universities have fallen: Berkeley from 5th to 54th, Yale from 18th to 38th, Boston University from 58th to 151st, Dartmouth from 333rd to 481st.

The relative and absolute decline of the American elite is even clearer if we look at certain key areas. In the ranking for Information and Computing Sciences, the top ten are all located in Mainland China and Singapore, with Tsinghua at the top. Harvard is 35th.

Some American universities are doing much better here than Harvard. MIT, which I suppose will soon be known as the Tsinghua of the West, is 12th, and Carnegie Mellon is 15th.

In Engineering the top 25 universities are all located in Mainland China, Hong Kong, or Singapore. The best American school is again MIT in 37th place, while Harvard languishes in 71st.

 

SCImago

These rankings are quite distinctive in that they have a section for Innovation, which comprises metrics related to patents, and for Societal Factors, which is a mixed bag containing data about altmetrics, gender, impact on policy, web presence, and the UN Sustainable Development Goals. It also includes non-university organisations such as hospitals, companies, non-profits, and government agencies.

When these rankings started in 2009, and before societal factors were included, Harvard was in second place after France's National Scientific Research Center (CNRS). MIT and UCLA were both in the top ten, and the best-performing Chinese university was Tsinghua, in 80th place, while Zhejiang and Peking lagged way behind at 124th and 176th, respectively.

In the latest 2025 rankings, Harvard has slipped to fourth place behind the Chinese Academy of Sciences, the Chinese Ministry of Education, and CNRS. Tsinghua, Zhejiang, and Peking are all in the top twenty, and MIT, UCLA, and the North Carolina schools have all fallen.

Looking at Computer Science, the world leader is the Chinese Academy of Sciences. The best university is Tsinghua, in fourth place. Then there are some multinational and American companies and more Chinese universities before arriving at Stanford in the 24th slot. Harvard is 64th.

In the next post, we will look at the causes of all this.




Thursday, June 19, 2014

The New Highly Cited Researchers List

Citations have become a standard feature of global university rankings, although they are measured in very different ways. Since 2003 the Shanghai Academic Ranking of World Universities has used the list of highly cited researchers published by Thomson Reuters (TR), who have now prepared a new list of about 3,500 names to supplement the old one which has 7,000 plus.

The new list got off to a bad start in 2013 because the preliminary list was based on a faulty procedure and because of problems with the assigning of papers to fields or subfields. This led to ARWU having to repeat the 2012 scores for their highly cited researchers indicator in their 2013 rankings.

The list contains a number of researchers who appear more than once. Just looking at the number of Harvard researchers for a few minutes, I have noticed that David M Sabatini, primary affiliation MIT with secondary affiliations at Broad Institute Harvard and MIT, is listed for Biology and Biochemistry and also for Molecular Biology and Genetics.

Eric S Lander, primary affiliations with Broad Institute Harvard and MIT and secondary affiliations with MIT and Harvard, is listed three times, for Biology and Biochemistry, Clinical Medicine and Molecular Biology and Genetics.

Frank B Hu, primary affiliation with Harvard and secondary affiliation with King Abdulaziz University, Saudi Arabia, is listed under Agricultural Sciences, Clinical Medicine and Molecular Biology and Genetics.

This no doubt represents the reality of scientific research in which a single researcher might well excel in two or more closely related fields but if ARWU are just going to count the number of researchers in the new list there will be distortions if some are counted more than once.

The new list refers to achievements over the period 2002-12. Unlike the old list, which just counted the number of citations, the new one is based on normalisation by field -- 21 in this case -- and by year of publication. In other words, it is not the number of citations that matters but the numbers in relation to the world average for field and year of citation.

TR acknowledge that there is a problem resulting from the growing number of massively cited, multi-authored papers and reviews, especially in the subfields of Particle and High-Energy Physics. To deal with this issue they have excluded from the analysis papers in Physics with more than thirty institutional addresses.

I do not know if TR are planning on doing this for their data for the Times Higher Education World University Rankings. If they are, places like Panjab University are in for a nasty shock.

Another noticeable thing about the new lists is the large number of secondary affiliations. In many cases the joint affiliations seem quite legitimate. For example, there are many researchers in subjects such as Biology and Biochemistry with affiliation to an Ivy League school and a nearby hospital or research institute. On the other hand, King Abdulaziz University in Jeddah has 150 secondary affiliations. Whether Thomson Reuters or ARWU will be able to determine that these represent a genuine association is questionable.

The publication of the new lists is further evidence that citations can be used to measure very different things. It would be unwise for any ranking organisation to use only one citations based indicator or only one database.





Thursday, September 18, 2014

QS World University Rankings 2014



Publisher

QS (Quacquarelli Symonds)



Scope

Global. 701+ universities.


Top Ten


Place   University
1       MIT
2=      Cambridge
2=      Imperial College London
4       Harvard
5       Oxford
6       University College London
7       Stanford
8       California Institute of Technology (Caltech)
9       Princeton
10      Yale



Countries with Universities in the Top Hundred


Country        Number of Universities
USA            28
UK             19
Australia      8
Netherlands    7
Canada         5
Switzerland    4
Japan          4
Germany        3
China          3
Korea          3
Hong Kong      3
Denmark        2
Singapore      2
France         2
Sweden         2
Ireland        1
Taiwan         1
Finland        1
Belgium        1
New Zealand    1



Top Ranked in Region


North America: MIT
Africa: University of Cape Town
Europe: Cambridge, Imperial College London
Latin America: Universidade de Sao Paulo
Asia: National University of Singapore
Central and Eastern Europe: Lomonosov Moscow State University
Arab World: King Fahd University of Petroleum and Minerals
Middle East: Hebrew University of Jerusalem



Noise Index

In the top 20, this year's QS world rankings are less volatile than the previous edition but more so than the THE rankings or Shanghai ARWU. The top 20 universities in 2013 rose or fell an average of 1.45 places. The most remarkable change was the rise of Imperial College and Cambridge to second place behind MIT and ahead of Harvard.


Ranking                                                   Average place change of universities in the top 20
QS World Rankings 2013-2014                               1.45
QS World Rankings 2012-2013                               1.70
ARWU 2013-2014                                            0.65
Webometrics 2013-2014                                     4.25
Center for World University Ranking (Jeddah) 2013-2014    0.90
THE World Rankings 2012-2013                              1.20


Looking at the top 100 universities, the QS rankings are little different from last year. The average university in the top 100 moved up or down 3.94 places compared to 3.97 between 2012 and 2013. These rankings are more reliable than this year's ARWU, which was affected by the new lists of highly cited researchers, and last year's THE rankings.

Ranking                                                   Average place change of universities in the top 100
QS World Rankings 2013-2014                               3.94
QS World Rankings 2012-2013                               3.97
ARWU 2013-2014                                            4.92
Webometrics 2013-2014                                     12.08
Center for World University Ranking (Jeddah) 2013-2014    10.59
THE World Rankings 2012-2013                              5.36
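The noise index used in these tables is nothing more than the mean absolute change in position for the universities in the earlier edition's top N. A minimal sketch (how universities that drop out of the published range should be handled is my assumption; here they are simply required to appear in both years):

```python
def noise_index(prev_rank, curr_rank, top_n=20):
    """Mean absolute rank change for universities in the earlier top N.
    prev_rank and curr_rank map university names to rank positions."""
    cohort = [u for u, r in prev_rank.items() if r <= top_n and u in curr_rank]
    return sum(abs(curr_rank[u] - prev_rank[u]) for u in cohort) / len(cohort)

prev = {"MIT": 1, "Cambridge": 2, "Harvard": 3}
curr = {"MIT": 1, "Cambridge": 2, "Harvard": 4}
print(round(noise_index(prev, curr, top_n=3), 2))  # 0.33
```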




Methodology (from topuniversities)

1. Academic reputation (40%)

Academic reputation is measured using a global survey, in which academics are asked to identify the institutions where they believe the best work is currently taking place within their field of expertise.
For the 2014/15 edition, the rankings draw on almost 63,700 responses from academics worldwide, collated over three years. Only participants’ most recent responses are used, and they cannot vote for their own institution. Regional weightings are applied to counter any discrepancies in response rates.
The advantage of this indicator is that it gives a more equal weighting to different discipline areas than research citation counts. Whereas citation rates are far higher in subjects like biomedical sciences than they are in English literature, for example, the academic reputation survey weights responses from academics in different fields equally.
It also gives students a sense of the consensus of opinion among those who are by definition experts. Academics may not be well positioned to comment on teaching standards at other institutions, but it is well within their remit to have a view on where the most significant research is currently taking place within their field.

2. Employer reputation (10%)

The employer reputation indicator is also based on a global survey, taking in almost 28,800 responses for the 2014/15 edition. The survey asks employers to identify the universities they perceive as producing the best graduates. This indicator is unique among international university rankings.
The purpose of the employer survey is to give students a better sense of how universities are viewed in the job market. A higher weighting is given to votes for universities that come from outside of their own country, so it’s especially useful in helping prospective students to identify universities with a reputation that extends beyond their national borders. 

3. Student-to-faculty ratio (20%)

This is a simple measure of the number of academic staff employed relative to the number of students enrolled. In the absence of an international standard by which to measure teaching quality, it provides an insight into the universities that are best equipped to provide small class sizes and a good level of individual supervision.

4. Citations per faculty (20%)

This indicator aims to assess universities’ research output. A ‘citation’ means a piece of research being cited (referred to) within another piece of research. Generally, the more often a piece of research is cited by others, the more influential it is. So the more highly cited research papers a university publishes, the stronger its research output is considered.
QS collects this information using Scopus, the world’s largest database of research abstracts and citations. The latest five complete years of data are used, and the total citation count is assessed in relation to the number of academic faculty members at the university, so that larger institutions don’t have an unfair advantage.

5 & 6. International faculty ratio (5%) and international student ratio (5%)

The last two indicators aim to assess how successful a university has been in attracting students and faculty members from other nations. This is based on the proportion of international students and faculty members in relation to overall numbers. Each of these contributes 5% to the overall ranking results.
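Taken together, the overall QS score is a weighted sum of the six indicator scores. A minimal sketch, assuming the published weightings are applied directly to the 0-100 indicator scores (the example figures are placeholders, not actual QS data):

```python
QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def overall_score(scores):
    """Weighted sum of the six 0-100 indicator scores."""
    return sum(QS_WEIGHTS[name] * scores[name] for name in QS_WEIGHTS)

example = {"academic_reputation": 100, "employer_reputation": 100,
           "faculty_student_ratio": 100, "citations_per_faculty": 99.8,
           "international_faculty": 86.4, "international_students": 92.4}
print(round(overall_score(example), 2))  # 98.9
```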

Tuesday, July 15, 2014

Another New Highly Cited Researchers List

Thomson Reuters have published another document, The World's Most Influential Scientific Minds, which contains the most highly cited researchers for the period 2002-13. This one includes only the primary affiliation of the researchers, not the secondary ones. If the Shanghai ARWU rankings, due in August, use this list rather than the one published previously, they will save themselves a lot of embarrassment.

Over at arxiv, Lutz Bornmann and Johann Bauer have produced a ranking of the leading institutions according to the number of highly cited researchers' primary affiliation. Here are their top ten universities, with government agencies and independent research centres omitted.

1.  University of California (all campuses)
2.  Harvard
3.  Stanford
4.  University of Texas (all campuses)
5.  University of Oxford
6.  Duke University
7.  MIT
8.  University of Michigan (all campuses)
9.  Northwestern University 
10. Princeton

Compared to the old list, used for the Highly Cited indicator in the first Shanghai rankings in 2003, Oxford and Northwestern are doing better and MIT and Princeton somewhat worse.

Bornmann and Bauer have also ranked universities according to the number of primary and secondary affiliations, counting each recorded affiliation as a fraction. The top ten are:

1.  University of California (all campuses)
2.  Harvard
3.  King Abdulaziz University, Jeddah, Saudi Arabia
4.  Stanford
5.  University of Texas 
6.  MIT
7.  Oxford
8.  University of Michigan
9.  University of Washington
10.  Duke

The paper concludes:

"To counteract attempts at manipulation, ARWU should only consider primary 

institutions of highly cited researchers. "




Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year. The US News global rankings announced last year can be filtered for subject. The methods are different and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings, Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents, but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.

These rankings continued to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings China is now dominant in Engineering subjects. Chinese universities lead in fifteen subjects although Harvard, MIT and Northwestern University lead for seven subjects. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.