
Wednesday, September 12, 2012

What happened to MIT and Cambridge?

QS now has a new world number one university. Massachusetts Institute of Technology (MIT) has replaced Cambridge and overtaken Harvard.

Unfortunately, this change probably means very little.

Overall the change was very slight. MIT rose from 99.21 to 100 while Cambridge fell from 100 to 99.8.

There was no change in the two surveys that account for half of the weighting. MIT and Cambridge both scored 100 for the academic and the employer surveys in 2011 and in 2012.

On the citations per faculty indicator Cambridge did quite a bit better this year, rising from 92.7 to 97, while MIT fell slightly from 99.6 to 99.3. This could mean that, compared to front-runner Caltech, Cambridge produced more articles, had its articles cited more often, or reduced its faculty numbers, or some combination of the three.

For faculty student ratio, Cambridge fell slightly while MIT's score remained the same. For international students both fell slightly.

What made the difference was the international faculty indicator. Cambridge's score went from 98.4 to 98.2 while MIT's rose from 50 to 86.4, which means 1.82 more points in the total ranking, more than enough to overcome Cambridge's improvement in citations and pull slightly ahead.
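
A quick arithmetic check, assuming the published 5 per cent weighting for international faculty: 0.05 x (86.4 - 50) = 1.82 points on the overall score.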

Having done some rapid switching between the ranking scores and university statistics, I would estimate that a score of 50 represents about 15% international faculty and a score of 86 about 30%.

It is most unlikely that MIT has in one year recruited about 150 international faculty while getting rid of a similar number of American faculty. We would surely have heard about it. After all, even the allocation of office space at MIT makes national headlines. Even more so if they had boosted the total number of faculty.

International faculty is a notoriously difficult statistic for data collectors. "International" could mean anything from getting a degree abroad to being a temporary visiting scholar. QS are quite clear that they mean current national status but this may not always reach the branch campuses, institutes, departments and programs where data is born before starting the long painful journey to the world rankings.

I suspect that what happened in the case of MIT is that somebody somewhere told somebody somewhere that permanent residents should be counted as international or that faculty who forgot to fill out a form were moved into the international category or something like that.

All this draws attention to what may have been a major mistake by QS, that is configuring the surveys so that a large number of universities are squashed together at the top. For the academic survey, there are 11 universities with a score of 100 and another 17 with a score of 99 to 99.9. Consequently, differentiating between universities at the top depends largely on data about students and faculty submitted by institutions themselves. Even if they are totally scrupulous about finding and disseminating data, there are all sorts of things that can cause problems at each stage of the process.

I have not heard any official reaction yet from MIT. I believe that there are some people there who are quite good at counting things, so maybe there will be a comment or an explanation soon.

Thursday, January 12, 2012

The end of the university as we know it?

MIT has already been putting its course materials online for anyone to access free of charge. Now they are going a step further.

"MIT today announced the launch of an online learning initiative internally called “MITx.” MITx will offer a portfolio of MIT courses through an online interactive learning platform that will:
  • organize and present course material to enable students to learn at their own pace
  • feature interactivity, online laboratories and student-to-student communication
  • allow for the individual assessment of any student’s work and allow students who demonstrate their mastery of subjects to earn a certificate of completion awarded by MITx
  • operate on an open-source, scalable software infrastructure in order to make it continuously improving and readily available to other educational institutions.
MIT expects that this learning platform will enhance the educational experience of its on-campus students, offering them online tools that supplement and enrich their classroom and laboratory experiences. MIT also expects that MITx will eventually host a virtual community of millions of learners around the world."

There are a lot of questions that come to mind. Will students be assessed according to the same standards as conventional MIT students? If someone accumulates sufficient certificates of completion, will they be entitled to an MITx degree? What will happen if employers and graduate schools start accepting MITx certificates as equivalent to standard academic credentials? If so, will MIT be able to resist the temptation to start charging hefty fees for a certificate?

MIT may, perhaps unwittingly, have started a process that will end with universities becoming something very different.

Monday, August 15, 2011

Press release from Shanghai

Here is the press release from Shanghai Jiao Tong University giving more details about this year's rankings.

Monday, August 15, 2011
Shanghai, People's Republic of China
The Center for World-Class Universities of Shanghai Jiao Tong University released today the 2011 Academic Ranking of World Universities (ARWU), marking its 9th consecutive year of measuring the performance of top universities worldwide.
Harvard University tops the 2011 list; other Top 10 universities are: Stanford, MIT, Berkeley, Cambridge, Caltech, Princeton, Columbia, Chicago and Oxford. In Continental Europe, ETH Zurich (23rd) in Switzerland takes first place, followed by Paris-Sud (40th) and Pierre and Marie Curie (41st) in France. The best ranked universities in Asia are University of Tokyo (21st) and Kyoto University (24th) in Japan.
Three universities are ranked among Top 100 for the first time in the history of ARWU: University of Geneva (73rd), University of Queensland (88th) and University of Frankfurt (100th). As a result, the number of Top 100 universities in Switzerland, Australia and Germany increases to 4, 4 and 6 respectively.
Ten universities enter the Top 500 for the first time; among them, University of Malaya in Malaysia and University of Zagreb in Croatia enable their home countries to be represented, together with 40 other countries, in the 2011 ARWU list.
Progress of universities in Middle East countries is remarkable. King Saud University in Saudi Arabia first appears in Top 300; King Fahd University of Petroleum & Minerals in Saudi Arabia, Istanbul University in Turkey and University of Teheran in Iran move up in Top 400 for the first time; Cairo University in Egypt is back to Top 500 after five years of staggering outside.
The number of Chinese universities in Top 500 increases to 35 in 2011, with National Taiwan University, Chinese University of Hong Kong, and Tsinghua University ranked among Top 200.
The Center for World-Class Universities of Shanghai Jiao Tong University also released the 2011 Academic Ranking of World Universities by Broad Subject Fields (ARWU-FIELD) and 2011 Academic Ranking of World Universities by Subject Field (ARWU-SUBJECT). Top 100 universities in five broad subject fields and in five selected subject fields are listed, where the best five universities are:
Natural Sciences and Mathematics – Harvard, Berkeley, Princeton, Caltech and Cambridge
Engineering/Technology and Computer Sciences – MIT, Stanford, Berkeley, UIUC and Georgia Tech
Life and Agriculture Sciences – Harvard, MIT, UC San Francisco, Cambridge and Washington (Seattle)
Clinical Medicine and Pharmacy – Harvard, UC San Francisco, Washington (Seattle), Johns Hopkins and Columbia
Social Sciences – Harvard, Chicago, MIT, Berkeley and Columbia
Mathematics – Princeton, Harvard, Berkeley, Stanford and Cambridge
Physics – MIT, Harvard, Caltech, Princeton and Berkeley
Chemistry – Harvard, Berkeley, Stanford, Cambridge and ETH Zurich
Computer Science – Stanford, MIT, Berkeley, Princeton and Harvard
Economics/Business – Harvard, Chicago, MIT, Berkeley and Columbia
The complete lists and detailed methodologies can be found at the Academic Ranking of World Universities website at
Academic Ranking of World Universities (ARWU): Starting from 2003, ARWU has been presenting the world top 500 universities annually based on a set of objective indicators and third-party data. ARWU has been recognized as the precursor of global university rankings and the most trustworthy list. ARWU uses six objective indicators to rank world universities, including the number of alumni and staff winning Nobel Prizes and Fields Medals, number of highly cited researchers selected by Thomson Scientific, number of articles published in journals of Nature and Science, number of articles indexed in Science Citation Index - Expanded and Social Sciences Citation Index, and per capita performance with respect to the size of an institution. More than 1000 universities are actually ranked by ARWU every year and the best 500 are published.
Center for World-Class Universities of Shanghai Jiao Tong University (CWCU): CWCU has been focusing on the study of world-class universities for many years, published the first Chinese-language book titled world-class universities and co-published the first English book titled world-class universities with European Centre for Higher Education of UNESCO. CWCU initiated the "International Conference on World-Class Universities" in 2005 and organizes the conference every second year, which attracts a large number of participants from all major countries. CWCU endeavors to build databases of major research universities in the world and clearinghouse of literature on world-class universities, and provide consultation for governments and universities.
Contact: Dr. Ying CHENG at

Saturday, April 02, 2011

Best Grad Schools

The US News Graduate School Rankings were published on March 15th. Here are the top universities in various subject areas.

Business: Stanford

Education: Vanderbilt

Engineering:  MIT

Law:  Yale

Medical: Harvard

Biology: Stanford

Chemistry: Caltech, MIT, UC Berkeley

Computer Science: Carnegie-Mellon, MIT, Stanford, UC Berkeley

Earth Sciences: Caltech, MIT

Mathematics: MIT

Physics: Caltech, Harvard, MIT, Stanford

Statistics: Stanford

Library and Information Studies: Illinois at Urbana-Champaign

Criminology: Maryland -- College Park

Economics: Harvard, MIT, Princeton, Chicago

English: UC Berkeley

History: Princeton

Political Science: Harvard, Princeton, Stanford

Psychology: Stanford, UC Berkeley

Sociology: UC Berkeley

Public Affairs: Syracuse

Fine Arts: Rhode Island School of Design

Thursday, June 16, 2011

The QS Arts and Humanities Rankings

See here for the complete rankings.

Here are the top five in each indicator of the QS subject rankings: the academic survey, the employer survey and citations per paper.

There is nothing surprising about the leaders in the two surveys. But the citations indicator is another matter. Perhaps QS has followed Times Higher in uncovering "clear pockets of excellence". Would any specialists out there like to comment on Newcastle University (the English one, not the Australian) and Durham as first for history -- something to do with proximity to Hadrian's Wall? What about Brown for Philosophy, Stellenbosch for Geography and Area Studies and Padua for linguistics?

English Language and Literature
Academic survey
1.  Harvard
2.  Oxford
3.  Cambridge
4.  UC Berkeley
5.  Yale

Employer Survey
1.  Oxford
2.  Cambridge
3.  Harvard
4.  MIT
5.  UC Los Angeles

No ranking for citations

Modern Languages
Academic Survey
1.  Harvard
2.  UC Berkeley
3.  Oxford
4.  Cambridge
5.  Cornell

Employer Survey
1.  Harvard
2.  Oxford
3.  Cambridge
4.  MIT
5.  Stanford

No rankings for citations

History
Academic Survey
1.  Harvard
2.  Cambridge
3.  Oxford
4.  Yale
5.  UC Berkeley

Employer Survey
1. Oxford
2.  Harvard
3.  Cambridge
4.  University of Pennsylvania
5. Yale

Citations per Paper
1=  Newcastle (UK)
1=  Durham
3.   Liverpool
4.   George Washington
5.   University of Washington

Philosophy
Academic Survey
1.  Oxford
2.  Harvard
3.  Cambridge
4.  UC Berkeley
5.  Princeton

Employer Survey
1.  Cambridge
2.  Harvard
3.  Oxford
4.  MIT
5.  UC Berkeley

Citations per Paper
1.  Brown
2.  Melbourne
3.  MIT
4=  Rutgers
4=  Zurich

Geography and Area Studies
Academic survey
1.  UC Berkeley
2.  Cambridge
3.  Oxford
4.  Harvard
5.  Tokyo

Employer Survey
1.  Harvard
2.  Cambridge
3.  Oxford
4.  MIT
5.  UC Berkeley

Citations per Paper
1.  Stellenbosch
2. Lancaster
3.  Durham
4.  Queen Mary London
5.  University of Kansas

Linguistics
Academic Survey
1.  Cambridge
2.  Oxford
3.  Harvard
4.  UC Berkeley
5.  Stanford

Employer Survey
1.  Harvard
2.  Oxford
3.  MIT
4.  UC Berkeley
5.  Melbourne

Citations per Paper
1.  Padua
2.  Boston University
3.  York University (UK)
4.  Princeton
5.  Harvard

Sunday, September 11, 2016

Waiting for the THE world rankings

The world, having recovered from the shocks of the Shanghai, QS and RUR rankings, now waits for the THE world rankings, especially the research impact indicator measured by field normalised citations.

It might be helpful to show the top 5 universities for this criterion in each year since 2010-11.

2010-11
1. Caltech
2. MIT
3. Princeton
4. Alexandria University
5. UC Santa Cruz

2011-12
1. Princeton
2. MIT
3. Caltech
4. UC Santa Barbara
5. Rice University

2012-13
1. Rice University
2. National Research Nuclear University MEPhI
3. MIT
4. UC Santa Cruz
5. Princeton

2013-14
1. MIT
2. Tokyo Metropolitan University
3. Rice University
4. UC Santa Cruz
5. Caltech

2014-15
1. MIT
2. UC Santa Cruz
3. Tokyo Metropolitan University
4. Rice University
5. Caltech

2015-16
1. St George's, University of London
2. Stanford University
3. UC Santa Cruz
4. Caltech
5. Harvard

Notice that no university has been in the top five for citations in every year.

Last year THE introduced some changes to this indicator, one of which was to exclude papers with more than 1000 authors from the citation count. This, along with a dilution of the regional modification that gave a bonus to universities in low scoring countries, had a devastating effect on some universities in France, Korea, Japan, Morocco, Chile and Turkey.

The citations indicator has always been an embarrassment to THE, throwing up a number of improbable front runners aka previously undiscovered pockets of excellence. Last year they introduced some reforms but not enough. It would be a good idea for THE to get rid of the regional modification altogether, to introduce full scale fractional counting, to reduce the weighting assigned to citations, to exclude self-citations and secondary affiliations and to include more than one measure of research impact and research quality.

Excluding the papers, mainly in particle physics, with 1,000 plus "authors" meant avoiding the bizarre situation where a contributor to a single paper with 2,000 authors and 2,000 citations would get the same credit as 1,000 authors writing a thousand papers each of which had been cited twice.

But this measure also meant that some of the most significant scientific activity of the century would not be counted in the rankings. The best solution would have been fractional counting, distributing the citations among all of the institutions or contributors, and in fact THE did this for their pilot African rankings at the University of Johannesburg.

Now, THE have announced a change for this year's rankings. According to their data chief Duncan Ross:

" Last year we excluded a small number of papers with more than 1,000 authors. I won’t rehearse the arguments for their exclusion here, but we said at the time that we would try to identify a way to re-include them that would prevent the distorting effect that they had on the overall metric for a few universities.

This year they are included – although they will be treated differently from other papers. Every university with researchers who author a kilo-author paper will receive at least 5 per cent credit for the paper – rising proportionally to the number of authors that the university has.
This is the first time that we have used a proportional measure in our citations score, and we will be monitoring it with interest.

We’re also pleased that this year the calculation of the Times Higher Education World University Rankings has been subject to independent audit by professional services firm PricewaterhouseCoopers (PwC). "
This could have perverse consequences. If an institution has one contributor to a 1,000-author paper with 2,000 citations then that author will bring the university the full 2,000 citations. But if there are 1,001 authors then he or she would bring in only 100 citations, the 5 per cent floor.
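
A rough sketch of the rule as quoted above, in Python (the function is mine, and full counting for papers at or below 1,000 authors is an assumption based on last year's cut-off):

    def citation_credit(citations, total_authors, university_authors=1):
        # Papers with 1,000 or fewer authors are counted in full;
        # kilo-author papers give each contributing university at least
        # 5 per cent credit, rising with its share of the author list.
        if total_authors <= 1000:
            return float(citations)
        share = max(0.05, university_authors / total_authors)
        return citations * share

    print(citation_credit(2000, 1000))  # 2000.0 -- full credit
    print(citation_credit(2000, 1001))  # 100.0 -- the 5 per cent floor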

It is possible that we will see a cluster of papers with 998, 999, 1000 authors as institutions remove their researchers from the author lists or project leaders start capping the number of contributors.

This could be a way of finding out if research intensive universities really do care about the THE rankings.

Similarly, QS now excludes papers with more than ten contributing institutions. If researchers are concerned about the QS rankings they will ensure that the number of institutions does not go above ten. Let's see if we start getting large numbers of papers with ten institutions but none or few with 11, 12, 13 and so on.

I am wondering why THE would bother introducing this relatively small change. Wouldn't it make more sense to introduce a lot of small changes all at once and get the resulting volatility over and done with?

I wonder if this has something to do with the THE world academic summit being held at Berkeley on 26-28 September in cooperation with UC Berkeley. Last year Berkeley fell from 8th to 13th in the THE world rankings. Since it is a contributor to several multi-contributor papers it is possible that the partial re-inclusion of hyper-papers will help the university back into the top ten.

Thursday, February 06, 2014

The Best Universities for Research

It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings starting in August. We have had young university rankings, best student cities, most international universities and BRICS rankings.

Something is missing, though: a ranking of top universities for research. So to assuage the pent-up demand, here are the top 20 universities for research according to six different ranking indicators. There is considerable variation, with only two universities, Harvard and Stanford, appearing in every list.

First the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five year period.

1.   Harvard
2.   Tokyo
3.   Toronto
4.   Tsinghua
5.   Sao Paulo
6.   Michigan Ann Arbor
7.   Johns Hopkins
8.   UCLA
9.   Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia

Next we have the normalized impact scores from Scimago, which measure citations to research publications taking account of field. This might be considered a measure of the quality of research rather than quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers.

1.   MIT
2.   Harvard
3.   University of California San Francisco
4=  Stanford
4=  Princeton
6.   Duke
7.   Rice
8.   Chicago
9=  Columbia
9=  University of California Berkeley
9=  University of California Santa Cruz
12.  University Of California Santa Barbara
13.  Boston University
14= Johns Hopkins
14= University of Pennsylvania
16.  University of California San Diego
17= UCLA
17= University of Washington
17= Washington University of St Louis
20.  Oxford

The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field so medical schools and technological institutes can do very well.

1.   Weizmann Institute of Science
2.   Caltech
3.   Rockefeller University
4.   Harvard
5.   Stanford
6.   Gwangju Institute of Science and Technology
7.   UCLA
8.   University of California San Francisco
9.   Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell

The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low impact scores. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, provided that total publications do not rise too much.

1=  MIT
1=  Tokyo Metropolitan University
3=  University of California Santa Cruz
3=  Rice
5.   Caltech
6.   Princeton
7.   University of California Santa Barbara
8.   University of California Berkeley
9=  Harvard
9=  Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14.  University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19.  Washington University of St Louis
20.  Boston College

The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year so this indicator could become less stable.

1.   Harvard
2.   Stanford
3.   MIT
4.   University of California Berkeley
5.   Princeton
6.   Michigan Ann Arbor
7.   University of California San Diego
8.   Yale
9.   University of Pennsylvania
10.   UCLA
11=  Caltech
11=  Columbia
13.   University of Washington
14.   Cornell
15.   Cambridge
16.   University of California San Francisco
17.   Chicago
18.   University of Wisconsin Madison
19.   University of Minnesota Twin Cities
20.   Oxford

Finally, there is the MNCS indicator from the Leiden Ranking, which is the number of field normalized citations per paper. It is possible for a few widely cited papers in the right discipline to have a disproportionate effect. The high placing for Gottingen results from a single computer science paper the citation of which is required for intellectual property reasons.
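
For reference, the indicator amounts to the following (my transcription): MNCS = (1/n) Σ (c_i / e_i), where c_i is the citation count of publication i, e_i is the world average citation count for publications of the same field and year, and n is the number of publications. With a small n, one paper with a huge c_i/e_i ratio can dominate the average, which is how the Gottingen case arises.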

1.    MIT
2.    Gottingen
3.    Princeton
4.    Caltech
5.    Stanford
6.    Rice
7.    University of California Santa Barbara
8.    University of California Berkeley
9.    Harvard
10.  University of California Santa Cruz
11.  EPF Lausanne
12.  Yale
13.  University of California San Francisco
14.  Chicago
15.  University of California San Diego
16.  Northwestern
17.  University of Colorado Boulder
18.  Columbia
19.  University of Texas Austin
20.  UCLA

Thursday, September 13, 2012


Here is a comment from MIT News. There is nothing about whether MIT did in fact recruit a load of new international faculty between 2011 and 2012.

For the first time, MIT has been ranked as the world’s top university in the QS World University Rankings. The No. 1 ranking moves the Institute up two spots from its third-place ranking last year; two years ago, MIT was ranked fifth.

The full 2012-13 rankings — published by Quacquarelli Symonds, an organization specializing in education and study abroad — can be found on the QS website. QS rankings are based on research quality, graduate employment, teaching quality, and an assessment of the global diversity of faculty and students.

MIT was also ranked the world’s top university in 11 of 28 disciplines ranked by QS, including all five in the “engineering and technology” category: computer science, chemical engineering, civil engineering, electrical engineering and mechanical engineering.

QS also ranked the Institute as the world’s best university in chemistry, economics and econometrics, linguistics, materials science, mathematics, and physics and astronomy.
The Institute ranked among the top five institutions worldwide in another five QS disciplines: accounting and finance (2), biological sciences (2), statistics and operational research (3), environmental sciences (4) and communication and media studies (5).

Rounding out the top five universities in the QS ranking were the University of Cambridge, Harvard University, University College London and the University of Oxford.

Wednesday, May 18, 2011

The QS Life Sciences Ranking Continued

Looking at the scores for the three indicators, academic survey, employer survey and citations per paper, we find the situation is similar to that of the engineering rankings released last month. There is a reasonably high correlation between the scores for the two surveys:

Medicine                     .720
Biological Sciences      .747
Psychology                  .570

The correlations between the score for citations per paper and the academic survey are low but still significant:
Medicine                          .290
Biological Sciences           .177
Psychology                       .217

The correlations between the indicator citations and the employer survey are low or very low and insignificant:
Medicine                               .129
Biological Sciences                .015 
Psychology                           -.027
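
For anyone who wants to reproduce this kind of check, a minimal sketch in Python (the scores below are invented for illustration; the real exercise would use the published indicator scores for all ranked universities):

    from scipy.stats import pearsonr

    # Invented indicator scores for six universities (illustration only)
    employer_survey = [100.0, 96.3, 91.7, 85.2, 78.9, 70.4]
    citations_paper = [62.1, 88.4, 70.3, 91.0, 55.6, 60.2]

    r, p = pearsonr(employer_survey, citations_paper)
    print(f"r = {r:.3f}, p = {p:.3f}")  # insignificant if p > 0.05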

Looking at the top five universities for each indicator, there are no surprises as far as the surveys are concerned but some of the universities in the top five for citations do cause some eyebrow-raising. Arizona State University? University of Cincinnati? Tokyo Metropolitan University? Perhaps these are hitherto unnoticed pockets of excellence of the Alexandrian kind?

Top Five in Medicine

Academic Survey

1.    Harvard
2.    Cambridge
3.    Oxford
4.    Stanford
5.    Yale

Employer Survey

1.     Harvard
2.     Cambridge
3.     Oxford
4.     MIT
5.     Stanford

Citations per Paper 

1.    MIT
2.    Rockefeller University
3.    Caltech
4.    The University of Texas M. D. Anderson Cancer Center
5.     Harvard

Top Five in Biological Sciences

Academic Survey

1.    Cambridge
2.    Harvard
3.    UC Berkeley
4.    Oxford
5.    MIT

Employer Survey

1.  Harvard
2.  Cambridge
3.  MIT
4.  Oxford
5.  Stanford

Citations per Paper

1.  Arizona State University
2.   Tokyo Metropolitan University
3.   MIT
4.   Rockefeller University
5.   Harvard

Top Five in Psychology

Academic Survey

1.    Harvard
2.   Stanford
3.    UC Berkeley
4.    Cambridge
5.    Oxford

Employer Survey 

1.     Cambridge
2.     Harvard
3.     Oxford
4.     Stanford
5.     UC Berkeley

Citations per Paper

1.     UC Irvine
2.     Emory
3.     University of Cincinnati
4.     Princeton
5.     Dartmouth College

Wednesday, April 21, 2010

Graduate School Rankings

US News has released its annual ranking of American graduate schools. These are subject rankings rather than holistic ones.

The top schools in selected categories are:

Business: Harvard, Stanford

Education: Vanderbilt

Engineering: MIT

Law: Yale

Medical Research: Harvard

Medical Primary Care: University of Washington, Seattle

Biological Sciences: Stanford

Chemistry: Caltech

Computer Science: Carnegie-Mellon, MIT, Stanford

Earth Sciences: Caltech, MIT

Mathematics: MIT

Physics: Caltech, MIT, Berkeley


Economics: Harvard, Princeton, Chicago, Stanford

Library and Information Sciences: University of Illinois: Urbana-Champaign, University of North Carolina: Chapel Hill

English: Berkeley

Psychology: Stanford, Berkeley

History: Princeton, Stanford, Berkeley, Yale

Public Affairs: Syracuse

Fine Arts: Rhode Island School of Design

Friday, June 20, 2014

The New Highly Cited Researchers List

Citations have become a standard feature of global university rankings, although they are measured in very different ways. Since 2003 the Shanghai Academic Ranking of World Universities has used the list of highly cited researchers published by Thomson Reuters (TR), who have now prepared a new list of about 3,500 names to supplement the old one which has 7,000 plus.

The new list got off to a bad start in 2013 because the preliminary list was based on a faulty procedure and because of problems with the assigning of papers to fields or subfields. This led to ARWU having to repeat the 2012 scores for their highly cited researchers indicator in their 2013 rankings.

The list contains a number of researchers who appear more than once. Just looking through the Harvard researchers for a few minutes, I have noticed that David M Sabatini, primary affiliation MIT with secondary affiliations at Broad Institute Harvard and MIT, is listed for Biology and Biochemistry and also for Molecular Biology and Genetics.

Eric S Lander, primary affiliations with Broad Institute Harvard and MIT and secondary affiliations with MIT and Harvard, is listed three times: for Biology and Biochemistry, Clinical Medicine and Molecular Biology and Genetics.

Frank B Hu, primary affiliation with Harvard and secondary affiliation with King Abdulaziz University, Saudi Arabia, is listed under Agricultural Sciences, Clinical Medicine and Molecular Biology and Genetics.

This no doubt represents the reality of scientific research, in which a single researcher might well excel in two or more closely related fields. But if ARWU are just going to count the number of researchers in the new list, there will be distortions if some are counted more than once.

The new list refers to achievements over the period 2002-12. Unlike the old list, which just counted the number of citations, the new one is based on normalisation by field -- 21 in this case -- and by year of publication. In other words, it is not the number of citations that matters but the numbers in relation to the world average for the field and year of publication.

TR acknowledge that there is a problem resulting from the growing number of massively cited, multi-authored papers and reviews, especially in the subfields of Particle and High-Energy Physics. To deal with this issue they have excluded from the analysis papers in Physics with more than thirty institutional addresses.

I do not know if TR are planning on doing this for their data for the Times Higher Education World University Rankings. If they are, places like Panjab University are in for a nasty shock.

Another noticeable thing about the new lists is the large number of secondary affiliations. In many cases the joint affiliations seem quite legitimate. For example, there are many researchers in subjects such as Biology and Biochemistry with affiliation to an Ivy League school and a nearby hospital or research institute. On the other hand, King Abdulaziz University in Jeddah has 150 secondary affiliations. Whether Thomson Reuters or ARWU will be able to determine that these represent a genuine association is questionable.

The publication of the new lists is further evidence that citations can be used to measure very different things. It would be unwise for any ranking organisation to use only one citations based indicator or only one database.

Monday, January 11, 2016

Diversity Makes You Brighter ... if You're a Latino Stockpicker in Texas or Chinese in Singapore

Nearly everybody, or at least those who run the western mainstream media, agrees that some things are sacred. Unfortunately, this is not always obvious to the uncredentialled who from time to time need to be beaten about their empty heads with the "findings" of "studies".

So we find that academic papers often with small or completely inappropriate samples, minimal effect sizes, marginal significance levels, dubious data collection procedures, unreproduced results or implausible assumptions are published in top flight journals, cited all over the Internet or even showcased in the pages of the "quality" or mass market press.

For example, anyone with any sort of mind knows that the environment is the only thing that determines intelligence.

So in 2009 we had an article in the Journal of Neuroscience that supposedly proves that a stimulating environment will not only make its beneficiaries more intelligent but also the children of the experimental subjects.

A headline in the Daily Mail proclaimed that "Mothers who enjoyed a stimulating childhood 'have brainier babies'".

The first sentence of the report claims that "[a] mother's childhood experiences may influence not only her own brain development but also that of her sons and daughters, a study suggests."

Wonderful. This could, of course, be an argument for allowing programs like Head Start to run for another three decades so that their effects would show up in the next generation. Then the next sentence gives the game away.

"Researchers in the US found that a stimulating environment early in life improved the memory of female mice with a genetic learning defect."

Notice that the experiment involved mice and not humans or any other mammal bigger than a ferret, that it improved memory and nothing else, and that the subjects had a genetic learning defect.

Still, that did not stop the MIT Technology Review from reporting Moshe Szyf of McGill University as saying “[i]f the findings can be conveyed to human, it means that girls’ education is important not just to their generation but to the next one”.

All of this, if confirmed, would be a serious blow against modern evolutionary theory. The MIT Technology Review got it right when it spoke about a comeback for Lamarckianism. But if there is anything scientists should have learnt over the last few decades it is that an experiment that appears to overthrow current theory, not to mention common sense and observation, is often flawed in some way. Confronted with evidence in 2011 that neutrinos were travelling faster than light, physicists with CERN reviewed their experimental procedures until they found that the apparent theory busting observation was caused by a loose fibre optic cable.

If a study had shown that a stimulating environment had a negative effect on the subjects or on the next generation or that it was stimulation for fathers that made the difference, would it have been cited in the Daily Mail or the MIT Technology Review? Would it even have been published in the Journal of Neuroscience? Wouldn't everybody have been looking for the equivalent of a loose cable?

A related idea that has reached the status of unassailable truth is that the famous academic achievement gap between Asians and Whites, and African Americans and Hispanics, could be eradicated by some sort of environmental manipulation such as spending money, providing safe spaces or laptops,  boosting self esteem or fine tuning teaching methods.

A few years ago Science, the apex of scientific research, published a paper by Geoffrey L. Cohen, Julio Garcia, Nancy Apfel and Allison Master that claimed a few minutes writing an essay affirming students' values (the control group wrote about somebody else's values) would start a process leading to an improvement in their relative academic performance. This applied only to low-achieving African American students.

I suspect that anyone with any sort of experience of secondary school classrooms would be surprised by the claim that such a brief exercise could have such a disproportionate impact.

The authors in their conclusion say:

"Finally, our apparently disproportionate results rested on an obvious precondition: the existence in the school of adequate material, social, and psychological resources and support to permit and sustain positive academic outcomes. Students must also have had the skills to perform significantly better. What appear to be small or brief events in isolation may in reality be the last element required to set in motion a process whose other necessary conditions already lay, not fully realised, in the situation."

In other words the experiment would not work unless there were "adequate material, social, and psychological resources and support" in the school, and unless students "have had the skills to perform significantly better".

Is it possible that a school with all those resources, support and skills might also be one where students, mentors, teachers or classmates might just somehow leak who was in the experimental and who was in the control group?

Perhaps the experiment really is valid. If so we can expect to see millions of US secondary school students and perhaps university students writing their self affirmation essays and watch the achievement gap wither away.

In 2012, this study made the top 20 of studies that Psychfiledrawer would like to see reproduced, along with studies that showed that participants were more likely to give up trying to solve a puzzle if they ate radishes than if they ate cookies, that anxiety reducing interventions boost exam scores, that music training raises IQ, and, of course, Rosenthal and Jacobson's famous study showing that teacher expectations can change students' IQ.

Geoffrey Cohen has provided a short list of studies that he claims replicate his findings. I suspect that only someone already convinced of the reality of self affirmation would be impressed.

Another variant of the environmental determinism creed is that diversity (racial or maybe gender although certainly not intellectual or ideological) is a wonderful thing that enriches the lives of everybody. There are powerful economic motives for universities to believe this and so we find that a succession of dubious studies are showcased as though they are the last and definitive word on the topic.

The latest such study is by Sheen S. Levine, David Stark and others and was the basis for an op ed in the New York Times (NYT).

The background is that the US Supreme Court back in 2003 had decided that universities could not admit students on the basis of race but they could try to recruit more minority students because having large numbers of a minority group would be good for everybody. Now the court is revisiting the issue and asking whether racial preferences can be justified by the benefits they supposedly provide for everyone.

Levine and Stark in their NYT piece claim that they can and refer to a study that they published with four other authors in the Proceedings of the National Academy of Sciences. Essentially, this involved an experiment in simulated stock trading and it was found that homogeneous "markets" in Singapore and Kingsville, Texas, (ethnically Chinese and Latino respectively) were less accurate in pricing stocks than those that were ethnically diverse with participants from minority groups (Indian and Malay in Singapore, non-Hispanic White, Black and Asian in Texas).

They argue that:

"racial and ethnic diversity matter for learning, the core purpose of a university. Increasing diversity is not only a way to let the historically disadvantaged into college, but also to promote sharper thinking for everyone.

Our research provides such evidence. Diversity improves the way people think. By disrupting conformity, racial and ethnic diversity prompts people to scrutinize facts, think more deeply and develop their own opinions. Our findings show that such diversity actually benefits everyone, minorities and majority alike."

From this very specific exercise the authors conclude that diversity is beneficial for American universities which are surely not comparable to a simulated stock market.

Frankly, if this is the best they can do to justify diversity then it looks as though affirmative action in US education is doomed.

Looking at the original paper also suggests that quite different conclusions could be drawn. It is true that in each country the diverse market was more accurate than the homogeneous one (Chinese in Singapore, Latino in Texas) but the homogeneous Singapore market was more accurate than the diverse Texas market (see fig. 2) and very much more accurate than the homogeneous Texas market. Notice that this difference is obscured by the way the data is presented.

There is a moral case for affirmative action provided that it is limited to the descendants of the enslaved and the dispossessed but it is wasting everybody's time to cherry-pick studies like these to support questionable empirical claims and to stretch their generalisability well beyond reasonable limits.

Tuesday, May 12, 2015

The Geography of Excellence: the Importance of Weighting

So finally, the 2015 QS subject rankings were published. It seems that the first attempt was postponed when the original methodology produced implausible fluctuations, probably resulting from the volatility that is inevitable when there are a small number of data points -- citations and survey responses -- outside the top 50 for certain subjects.

QS have done some tweaking, some of it aimed at smoothing out the fluctuations in the responses to their academic and employer surveys.

These rankings look a bit different from the World University Rankings. Cambridge has the most top ten placings (31), followed by Oxford and Stanford (29 each), Harvard (28), Berkeley (26) and MIT (16).

But in the world rankings MIT is in first place, Cambridge second, Imperial College London third, Harvard fourth and Oxford and University College London joint fifth.

The subject rankings use two indicators from the world rankings, the academic survey and the employer survey, but not internationalisation, student-faculty ratio or citations per faculty. They add two indicators, citations per paper and the h-index.

The result is that the London colleges do less well in the subject rankings since they do not benefit from their large numbers of international students and faculty. Caltech, Princeton and Yale also do relatively badly, probably because the new rankings do not take account of their favourable student-faculty ratios.

The lesson of this is that if weighting is not everything, it is definitely very important.

Below is a list of universities ordered by the number of top five placings. There are signs of the Asian advance -- Peking, Hong Kong and the National University of Singapore -- but it is an East Asian advance.

Europe is there too but it is Cold Europe -- Switzerland, Netherlands and Sweden -- not the Mediterranean.

Rank  University                            Country       Number of Top Five Places
1     Harvard                               USA           26
4     Stanford                              USA           17
5=    UC Berkeley                           USA           16
7     London School of Economics            UK            7
8=    University College London             UK            3
8=    ETH Zurich                            Switzerland   3
10=   New York University                   USA           2
10=   Yale                                  USA           2
10=   Delft University of Technology        Netherlands   2
10=   National University of Singapore      Singapore     2
10=   UC Los Angeles                        USA           2
10=   UC Davis                              USA           2
10=   Cornell                               USA           2
10=   Wisconsin - Madison                   USA           2
10=   Imperial College London               UK            2
20=   University of Southern California     USA           1
20=   Pratt Institute, New York             USA           1
20=   Rhode Island School of Design         USA           1
20=   Parsons: the New School for Design    USA           1
20=   Royal College of Art London           UK            1
20=   Sciences Po                           France        1
20=   University of Pennsylvania            USA           1
20=   London Business School                UK            1
20=   Royal Veterinary College London       UK            1
20=   UC San Francisco                      USA           1
20=   Johns Hopkins                         USA           1
20=   KU Leuven                             Belgium       1
20=   Hong Kong                             Hong Kong     1
20=   Karolinska Institute                  Sweden        1
20=   Carnegie Mellon University            USA           1
20=   Georgia Institute of Technology       USA           1

Thursday, September 18, 2014

QS World University Rankings 2014


QS (Quacquarelli Symonds)


Global. 701+ universities.

Top Ten

2=  Imperial College London
6.   University College London
8.   California Institute of Technology (Caltech)

Countries with Universities in the Top Hundred

Country          Number of Universities
Netherlands      7
Hong Kong        3
New Zealand      1

Top Ranked in Region

North America                 MIT
Africa                        University of Cape Town
Europe                        Imperial College London
Latin America                 Universidade de Sao Paulo
Asia                          National University of Singapore
Central and Eastern Europe    Lomonosov Moscow State University
Arab World                    King Fahd University of Petroleum and Minerals
Middle East                   Hebrew University of Jerusalem

Noise Index

In the top 20, this year's QS world rankings are less volatile than the previous edition but more so than the THE rankings or Shanghai ARWU. The top 20 universities in 2013 rose or fell an average of 1.45 places. The most remarkable change was the rise of Imperial College and Cambridge to second place behind MIT and ahead of Harvard.

Ranking                                        Average Place Change of Universities in the Top 20
QS World Rankings 2013-2014                    1.45
QS World Rankings 2012-2013                    1.70
ARWU 2013-2014                                 0.65
Webometrics 2013-2014                          4.25
Center for World University Ranking (Jeddah)
THE World Rankings 2012-2013                   1.20

Looking at the top 100 universities, the QS rankings are little different from last year. The average university in the top 100 moved up or down 3.94 places compared to 3.97 between 2012 and 2013. These rankings are more reliable than this year's ARWU, which was affected by the new lists of highly cited researchers, and last year's THE rankings.

Ranking                                        Average Place Change of Universities in the Top 100
QS World Rankings 2013-2014                    3.94
QS World Rankings 2012-2013                    3.97
ARWU 2013-2014                                 4.92
Webometrics 2013-2014                          12.08
Center for World University Ranking (Jeddah)
THE World Rankings 2012-2013                   5.36
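
The noise index used here is just the mean absolute change in position from one year to the next. A minimal sketch in Python, assuming two dicts mapping university names to ranks in consecutive years (the function and data layout are mine):

    def noise_index(prev_ranks, curr_ranks, top_n=20):
        # Mean absolute place change for universities in the top_n of
        # the earlier year; assumes each of them appears in both years.
        top = [u for u, r in prev_ranks.items() if r <= top_n]
        return sum(abs(curr_ranks[u] - prev_ranks[u]) for u in top) / len(top)

    # Applied to the published QS tables for 2013 and 2014, this gives
    # 1.45 for the top 20 and 3.94 for the top 100.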

Methodology (from topuniversities)

1. Academic reputation (40%)

Academic reputation is measured using a global survey, in which academics are asked to identify the institutions where they believe the best work is currently taking place within their field of expertise.
For the 2014/15 edition, the rankings draw on almost 63,700 responses from academics worldwide, collated over three years. Only participants’ most recent responses are used, and they cannot vote for their own institution. Regional weightings are applied to counter any discrepancies in response rates.
The advantage of this indicator is that it gives a more equal weighting to different discipline areas than research citation counts. Whereas citation rates are far higher in subjects like biomedical sciences than they are in English literature, for example, the academic reputation survey weights responses from academics in different fields equally.
It also gives students a sense of the consensus of opinion among those who are by definition experts. Academics may not be well positioned to comment on teaching standards at other institutions, but it is well within their remit to have a view on where the most significant research is currently taking place within their field.

2. Employer reputation (10%)

The employer reputation indicator is also based on a global survey, taking in almost 28,800 responses for the 2014/15 edition. The survey asks employers to identify the universities they perceive as producing the best graduates. This indicator is unique among international university rankings.
The purpose of the employer survey is to give students a better sense of how universities are viewed in the job market. A higher weighting is given to votes for universities that come from outside of their own country, so it’s especially useful in helping prospective students to identify universities with a reputation that extends beyond their national borders. 

3. Student-to-faculty ratio (20%)

This is a simple measure of the number of academic staff employed relative to the number of students enrolled. In the absence of an international standard by which to measure teaching quality, it provides an insight into the universities that are best equipped to provide small class sizes and a good level of individual supervision.

4. Citations per faculty (20%)

This indicator aims to assess universities’ research output. A ‘citation’ means a piece of research being cited (referred to) within another piece of research. Generally, the more often a piece of research is cited by others, the more influential it is. So the more highly cited research papers a university publishes, the stronger its research output is considered.
QS collects this information using Scopus, the world’s largest database of research abstracts and citations. The latest five complete years of data are used, and the total citation count is assessed in relation to the number of academic faculty members at the university, so that larger institutions don’t have an unfair advantage.

5 & 6. International faculty ratio (5%) and international student ratio (5%)

The last two indicators aim to assess how successful a university has been in attracting students and faculty members from other nations. This is based on the proportion of international students and faculty members in relation to overall numbers. Each of these contributes 5% to the overall ranking results.
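
Putting the weights together, the overall QS score is a weighted sum of the six indicator scores. A minimal sketch (the example scores are invented; QS's own scaling of raw data to 0-100 indicator scores is not reproduced here):

    # Weights from the methodology above; the example scores are made up
    weights = {"academic": 0.40, "employer": 0.10, "student_faculty": 0.20,
               "citations_faculty": 0.20, "intl_faculty": 0.05, "intl_students": 0.05}
    scores = {"academic": 100.0, "employer": 100.0, "student_faculty": 99.0,
              "citations_faculty": 99.3, "intl_faculty": 86.4, "intl_students": 92.0}

    overall = sum(weights[k] * scores[k] for k in weights)
    print(round(overall, 2))  # 98.58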

Tuesday, July 15, 2014

Another New Highly Cited Researchers List

Thomson Reuters have published another document, The World's Most Influential Scientific Minds, which contains the most highly cited researchers for the period 2002-13. This one includes only the primary affiliation of the researchers, not the secondary ones. If the Shanghai ARWU rankings, due in August, use this list rather than the one published previously, they will save themselves a lot of embarrassment.

Over at arxiv, Lutz Bornmann and Johann Bauer have produced a ranking of the leading institutions according to the number of highly cited researchers' primary affiliation. Here are their top ten universities, with government agencies and independent research centres omitted.

1.  University of California (all campuses)
2.  Harvard
3.  Stanford
4.  University of Texas (all campuses)
5.  University of Oxford
6.  Duke University
7.  MIT
8.  University of Michigan (all campuses)
9.  Northwestern University 
10. Princeton

Compared to the old list, used for the Highly Cited indicator in the first Shanghai rankings in 2003, Oxford and Northwestern are doing better and MIT and Princeton somewhat worse.

Bornmann and Bauer have also ranked universities according to the number of primary and secondary affiliations, counting each recorded affiliation as a fraction. The top ten are:

1.  University of California (all campuses)
2.  Harvard
3.  King Abdulaziz University, Jeddah, Saudi Arabia
4.  Stanford
5.  University of Texas 
6.  MIT
7.  Oxford
8.  University of Michigan
9.  University of Washington
10.  Duke

The paper concludes:

"To counteract attempts at manipulation, ARWU should only consider primary 

institutions of highly cited researchers. "

Saturday, April 20, 2013

The Leiden Ranking

The Leiden ranking for 2013 is out. This is produced by the Centre for Science and Technology Studies (CWTS) at Leiden University and represents pretty much the state of the art in assessing research publications and citations.

A variety of indicators are presented with several different settings but no overall winner is declared, which means that these rankings are not going to get the publicity given to QS and Times Higher Education.

Here are the top universities, using the default settings provided by CWTS.

Total Publications: Harvard
Citations per Paper: MIT
Normalised Citations per Paper: MIT
Quality of Publications: MIT

There are also indicators for international and industrial collaboration that I hope to discuss later.

It is also noticeable that high flyers in the Times Higher Education citations indicator, Alexandria University, Moscow Engineering Physics Institute (MEPhI), Hong Kong Baptist University, Royal Holloway, do not figure at all in the Leiden Ranking. What happened to them?

How could MEPhI, equal first in the world for research influence according to THE and Thomson Reuters, fail to even show up in the normalised citation indicator in the Leiden Ranking?

Firstly, Leiden have collected data for the top 500 universities in the world according to number of publications in the Web of Science. That would have been sufficient to keep these institutions out of the rankings.

In addition, Leiden use fractionalised counting as a default setting so that the impact of multiple-author publications is divided by the number of university addresses. This would drastically reduce the impact of publications like the Review of Particle Physics.
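
A sketch of the difference this makes (the numbers are hypothetical):

    def full_count(citations, n_addresses):
        # Full counting: every listed university gets the whole impact.
        return citations

    def fractional_count(citations, n_addresses):
        # Leiden's default: impact is split across the address list.
        return citations / n_addresses

    # A 2,000-citation review with 100 university addresses:
    print(full_count(2000, 100))        # 2000 per university
    print(fractional_count(2000, 100))  # 20.0 per university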

Also, by field Leiden mean five broad subject groups whereas Thomson Reuters appears to use a larger number (21 if they use the same system as they do for highly cited researchers). There is accordingly more chance of anomalous cases having a great influence in the THE rankings.

THE and Thomson Reuters would do well to look at the multi-authored, and most probably soon to be multi-cited, papers that were published in 2012 and look at the universities that could do well in 2014 if the methodology remains unchanged.

Monday, August 18, 2008

The Shanghai Rankings 2008

Shanghai Jiao Tong University (SJTU) has just released their rankings for 2008. Compared to the THE-QS rankings, public response, especially in Asia and Australia, has been slight. This is largely because ascent and descent within the Shanghai index are minimal, a tribute to its reliability. In contrast, the THE-QS rankings, with their changes in methodology and frequent errors, arouse almost as much interest as a country's performance in the Olympics.

Still, it is instructive to check how well various universities do on the different components of the Shanghai rankings.

The current top ten are as follows:

1. Harvard
2. Stanford
3. Berkeley
4. Cambridge
5. MIT
6. Caltech
7. Columbia
8. Princeton
9. Chicago
10. Oxford

The Shanghai index includes two categories based on Nobel prizes and Fields medals. These measure the quality of research that might have been produced decades ago. Looking at the other criteria gives a rather different picture of current research.

It is interesting to see what happens to these ten if we rank them according to SJTU's PUB category, the total number of articles indexed in the Science Citation Index-Expanded (SCIE) and Social Science Citation Index (SSCI) in 2007. The SSCI gets a double weighting.

Harvard remains at number 1

Stanford goes down to number 8

Berkeley goes down to 11

Cambridge goes down to 23

MIT is down at 34

Caltech tumbles to 86

Columbia is down just a bit at 10

Princeton crashes to 120

Chicago falls to 72

Oxford goes down to 18

If this category represents current research output then it looks as though some American universities and Oxbridge have entered a period of decline. Of course, Caltech and MIT may suffer from the PUB category including social science research but would that explain why Princeton and Chicago are now apparently producing a relatively small amount of research?

The top ten for PUB is:

1. Harvard

2. Tokyo

3. Toronto

4. University of Michigan


6. University of Washington

7. Stanford

8. Kyoto

9. Columbia

10. Berkeley

Tuesday, April 01, 2014

Comparing the THE and QS Reputation Rankings

This year's Times Higher Education (THE) Reputation Rankings were a bit boring, at least at the top, and that is just what they should be.

The top ten are almost the same as last year. Harvard is still first and MIT is second. Tokyo has dropped out of the top ten to 11th place and has been replaced by Caltech. Stanford is up three places and is now third. Cambridge and Oxford are both down one place. Further down, there is some churning but it is difficult to see any clear and consistent trends, although the media have done their best to find stories: UK universities falling or sliding or slipping, no Indian or Irish or African universities in the top 100.

These rankings may be more interesting for who is not there than for who is. There are some notable absentees from the top 100. Last year Tokyo Metropolitan University was, according to THE and data providers Thomson Reuters (TR), first in the world, along with MIT, for research impact. Yet it fails to appear in the top 100 in a reputation survey in which research has a two thirds weighting. Rice University, joint first in the world for research impact with Moscow State Engineering Physics Institute in 2012, is also absent. How is this possible? Am I missing something?

In general, the THE-TR reputation survey, the data collection for which was contracted out to the pollsters Ipsos Mori CT, appears to be quite rigorous and reliable. Survey forms were sent out to a clearly defined group, researchers with papers in the ISI indexes. THE claim that this means that their respondents must therefore be active producers of academic research. That is stretching it a bit. Getting your name on an article published in a reputable journal might mean a high degree of academic competence or it could just mean having some sort of influence over the research process. I have heard a report about an Asian university where researchers were urged to put their heads of department on the list of co-authors. Still, on balance it seems that the respondents to the THE survey are mostly from a stable group, namely those who have usually made some sort of contribution to a research paper of sufficient merit to be included in an academic journal.

TR also appear to have used a systematic approach in sending out the survey forms. When the first survey was being prepared in 2010 they announced that the forms would be emailed in proportion to the number of researchers recorded by UNESCO in 2007. It is not clear whether this procedure has been followed strictly over the last four years. Oceania, presumably Australia and New Zealand, accounts for a very large share of responses this year, 10%, although TR reported in 2010 that UNESCO found only 2.1% of the world's researchers in that region.
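Here is a minimal sketch of what a UNESCO-proportional mailout would imply. The regional shares are invented for illustration, except for the 2.1% Oceania figure quoted above.

```python
# Illustrative regional shares of the world's researchers. Only the 2.1%
# Oceania figure comes from the text; the other shares are invented.
shares = {
    "Asia": 0.41,
    "Europe": 0.28,
    "Americas": 0.26,
    "Africa": 0.029,
    "Oceania": 0.021,
}

def allocate(total_forms: int, shares: dict) -> dict:
    """Distribute survey forms in proportion to each region's researcher share."""
    return {region: round(total_forms * share) for region, share in shares.items()}

print(allocate(100_000, shares)["Oceania"])  # 2100 forms, nowhere near a 10% share
```

On an allocation like this, Oceania should account for roughly 2% of the forms sent out, so a 10% share of responses points either to an unusually high response rate there or to a departure from the announced procedure.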

The number of responses received appears reasonably large, although it has declined recently. In 2013 TR collected 10,536 responses, considerably fewer than the 16,639 collected in 2012. Again, it is not clear what happened.

The number of responses from the various subject areas has changed somewhat. Since 2012 the proportion from the social sciences has risen from 19% to 22%, as has that from engineering and technology, while the share from the life sciences has gone from 16% to 22%.

QS do not publish a separate reputation ranking, but it is possible to filter their ranking scores to find out how universities performed on the academic survey.

The QS approach is less systematic. They started out using the subscription lists of World Scientific, a Singapore-based academic publishing company with links to Imperial College London. Then they added respondents from Mardev, a publisher of academic lists, to beef up the number of names in the humanities. Since then the balance has shifted, with more names coming from Mardev and some topping up from World Scientific. QS have also added a sign-up facility through which people can apply to receive survey forms; it was suspended in April 2013 but has recently been revived. They have also asked universities to submit lists of potential respondents and asked respondents to suggest further names. The exact number of responses coming from each of these sources is not known.

Over the last few years QS have made their survey rather more rigorous. First, respondents were barred from voting for the universities where they were currently employed. Then they were restricted to one response per computer, and universities were forbidden to solicit votes or to instruct staff whom to vote for or against. Most recently, universities have been told not to promote any form of participation in the surveys.

In addition to these methodological changes, the proportion of responses from different countries has changed significantly since 2007, with a large increase from Latin America, especially Brazil and Mexico, the USA and the larger European countries, and a fall from India, China and the Asia-Pacific region. All of this means that it is very difficult to figure out whether the rise or fall of a university reflects a change in methodology, a change in the distribution of responses, or a genuine shift in international reputation.

Comparing the THE-TR and QS surveys, there is some overlap at the top. The top five are the same in both, although in a different order: Harvard, MIT, Stanford, Oxford and Cambridge.

After that, we find that the QS academic survey favours universities in Asia-Pacific and Latin America. Tokyo is seventh according to QS, but THE-TR have it in 11th place. Peking is 19th for QS and 41st for THE-TR. Sao Paulo is 51st in the QS indicator but in the 81-90 band in the THE-TR rankings. The Autonomous National University of Mexico (UNAM) is not even in THE-TR's top 100, but QS put it 48th.

On the other hand, Caltech, Moscow State University, Seoul National University and Middle East Technical University do much better with THE-TR than with QS.

I suspect that the QS survey is tapping a younger, less experienced pool of respondents from less regarded universities and from countries with high aspirations but, so far, limited achievements.

Monday, May 29, 2017

The View from Leiden

Ranking experts are constantly warning about the grim fate that awaits the universities of the West if they are not provided with all the money that they want and given complete freedom to hire staff and recruit students from anywhere that they want. If this does not happen they will be swamped by those famously international Asian universities dripping with funds from indulgent patrons.

The threat, if we are to believe the prominent rankers of Times Higher Education (THE), QS and Shanghai Ranking Consultancy, is always looming but somehow never quite arrives. The best Asian performer in the THE world rankings is the National University of Singapore (NUS) in 24th place, followed by Peking University in 29th. The QS World University Rankings have NUS 12th, Nanyang Technological University 13th and Tsinghua University 24th. The Academic Ranking of World Universities published in Shanghai puts the University of Tokyo in 20th place and Peking University in 71st.

These rankings are all, in one way or another, significantly biased towards Western European and North American institutions and against Asia. THE has three separate indicators that measure income, adding up to a combined weighting of 10.75%. Both QS and THE have reputation surveys. ARWU gives a 30% weighting to Nobel and Fields award winners, some of the awards dating back several decades.
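To make the income point concrete, an overall ranking score is essentially a weighted sum of indicator scores. In this toy version only the 10.75% combined income weighting comes from the text above; the other weights and all of the scores are invented for illustration.

```python
# Toy THE-style composite score. Only the 10.75% income weighting is from
# the text above; the remaining weights and all scores are invented.
weights = {"income": 0.1075, "reputation": 0.33, "citations": 0.30, "other": 0.2625}
scores  = {"income": 95.0, "reputation": 60.0, "citations": 80.0, "other": 70.0}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 100%
overall = sum(weights[k] * scores[k] for k in weights)
print(round(overall, 2))  # 72.39
```

Even a modest income weighting hands a few guaranteed points to wealthy institutions before a single paper or survey response is counted.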

Let's take a look at a set of rankings that is technically excellent, namely the Leiden Ranking. The producers do not provide an overall score. Instead it is possible to create a variety of rankings: total publications, publications by subject group, and publications in the top 50%, 10% and 1% of journals. Users can also choose between fractional and absolute counting (see the sketch below) and change the minimum publication threshold.
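Fractional counting divides the credit for a co-authored paper among the contributing institutions, while absolute (full) counting gives each institution one whole credit per paper. Here is a minimal sketch with invented papers; splitting credit equally among institutions is a simplification of Leiden's author-level method.

```python
from collections import defaultdict

# Invented papers: each entry lists the institutions of the co-authors.
papers = [
    ["Harvard", "Toronto"],
    ["Zhejiang"],
    ["Harvard", "Zhejiang", "Toronto"],
]

full = defaultdict(float)        # absolute counting: one credit per institution
fractional = defaultdict(float)  # fractional counting: credit split equally

for paper in papers:
    institutions = set(paper)
    for inst in institutions:
        full[inst] += 1
        fractional[inst] += 1 / len(institutions)

print(dict(full))        # Harvard 2.0, Toronto 2.0, Zhejiang 2.0
print(dict(fractional))  # Harvard ~0.83, Toronto ~0.83, Zhejiang ~1.33
```

Under fractional counting a university that publishes alone keeps full credit for its papers, so the choice of counting method can reorder institutions with different collaboration habits.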

Here is the top ten, using the default settings: publications 2012-15, fractional counting, and a minimum threshold of 100 papers. Rankings for publications in 2006-09 are in brackets.

1. Harvard (1)
2. Toronto (2)
3. Zhejiang (14)
4. Michigan (3)
5. Shanghai Jiao Tong (37)
6. Johns Hopkins (5)
7. Sao Paulo (8)
8. Stanford (9)
9. Seoul National University (23)
10. Tokyo (4)

Tsinghua University is 11th, up from 32nd in 2006-09 and Peking University is 15th, up from 54th. What is interesting about this is not just that East Asian universities are moving into the highest level of research universities but how rapidly they are doing so.

No doubt there are many who will say that this is a matter of quantity and that what really counts is not the number of papers but their reception by other researchers. There is something to this. If we look at publications in the top 1% of journals (by frequency of citation), the top ten includes six US universities headed by Harvard, three British and one Canadian.

Tsinghua is 28th, Zhejiang 50th, Peking 62nd, Shanghai Jiao Tong 80th and Seoul National University 85th. Right now publication in the most reputed journals is dominated by English-speaking universities. But in the last few years Chinese and Korean universities have advanced rapidly: Peking from 119th to 62nd, Zhejiang from 118th to 50th, Shanghai Jiao Tong from 112th to 80th, Tsinghua from 101st to 28th and Seoul National University from 107th to 85th.

It seems that in a few years East Asia will dominate the elite journals and will take the lead for quality as well as quantity.

Moving on to the subject group rankings, Tsinghua University is in first place for mathematics and computer science. The top ten consists of nine Chinese universities and one Singaporean university. The best US performer is MIT in 16th place; the best British, Imperial College London in 48th.

When we look at the top 1% of journals, Tsinghua is still on top, although MIT moves up to 4th place and Stanford is 5th.

The Asian tsunami has already arrived. East Asian universities, mainly Chinese and Chinese diaspora institutions, are dominant or becoming dominant in the STEM subjects, leaving the humanities and social sciences to the US.

There will of course be debate about what happened. Maybe money had something to do with it. But it also seems that Western universities are becoming much less selective about student admissions and faculty appointments. If you admit students who write #BlackLivesMatter 100 times on their application forms, or impose ideological tests for faculty appointment and promotion, you may have succeeded in imposing political uniformity, but you will have serious problems trying to compete with the Gaokao-hardened students and researchers of Chinese universities.