It was a bit of a surprise when US News & World Report (USNWR) announced that they were going global, but perhaps it shouldn't have been. USNWR has been ranking American colleges since the early 1980s, making even the Shanghai Centre for World-Class Universities or QS look like novices. Also, with the advancing globalisation of higher education and research, there is now a market for comparisons of US universities and their international competitors.
The Best Global Universities rankings are research-based, except for two indicators, each with a 5% weighting, that count PhD degrees. They are also heavily citation-oriented, with a huge 42.5% weighting going to citations. However, the US News staff have used their common sense and included four measures of citations: normalized citation impact, total citations, number of highly cited papers and percentage of highly cited papers.
The result is that many of the high fliers in this year's THE rankings are absent. Bogazici University in Turkey, 14th best in Asia according to THE, is nowhere to be seen. Neither is Federico Santa Maria Technical University in Chile, according to THE the second best in Latin America, nor Panjab University, supposedly the second best in India.
The reason for this contrast is simply that THE and Thomson Reuters rewarded these institutions for a handful of physics papers with hundreds of participating institutions, using a very inappropriate methodology and giving it a 30% weighting. USNWR have trimmed the comparable indicator to 10%, and so the high fliers have been grounded.
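As a rough illustration of why the weighting matters, here is a minimal sketch with entirely hypothetical indicator scores (not USNWR's or THE's actual data), showing how cutting a normalized-citation indicator from a 30% to a 10% weighting pulls an outlier back towards the pack:

```python
# Hypothetical two-indicator composite, scores out of 100.
# An institution with a spectacular citation score but weak scores elsewhere
# loses most of its advantage when the citation weighting is trimmed.

def composite(citation_score, other_score, citation_weight):
    return citation_weight * citation_score + (1 - citation_weight) * other_score

outlier = (99.0, 30.0)          # hypothetical (citations, everything else)
research_uni = (80.0, 85.0)     # hypothetical (citations, everything else)

for weight in (0.30, 0.10):     # THE-style 30% versus USNWR-style 10%
    print(f"citation weight {weight:.0%}: "
          f"outlier {composite(*outlier, weight):.1f}, "
          f"research university {composite(*research_uni, weight):.1f}")
# At 30% the citation score carries the outlier to 50.7; at 10% it falls to 36.9,
# while the broad-based research university barely moves (83.5 to 84.5).
```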
Friday, October 31, 2014
Friday, October 17, 2014
The university rankings business gets bigger and bigger
US News is going global. There are three different Arab/MENA rankings on the way. Now QS is getting ready for further growth. This is from Education Investor.
Posted on: 16/10/2014
Exclusive: QS seeks £10m investment
The university rankings provider QS is looking to sell a £10 million stake in its business, EducationInvestor understands.
According to its website, QS runs websites and events that connect graduates and employers. But it is best known for its World University Rankings, which it claims are “the most widely read university comparison of their kind”.
Three sources close to the matter said a deal was on the table, and one said that first round bids had already been submitted. QS wants to raise the cash “half to buy out an existing shareholding and half to use as growth capital”.
However, Nunzio Quacquarelli, managing director and majority shareholder of QS, told EducationInvestor that the firm was “looking at all options, both debt and possibly structured finance”.
“We are looking for some external funding to support our rapid growth. Our vision is to be a leading information company in the higher education sector with global ambitions and [with this funding] we aim to continue on this path.”
QS operates in over 70 countries, and has more than 200 staff and 1,200 clients. Its valuation hasn’t been publicised, but the firm is understood to have an ebitda of £3.3 million and revenue of £19.8 million.
According to one source, the deal is expected to complete later in the fourth quarter.
Tuesday, October 14, 2014
Shanghai without the Awards
Updated. The link to the site is here.
The Center for World-Class Universities (CWU) at Shanghai Jiao Tong University has produced an interesting new ranking by removing the Alumni and Awards indicators from its Academic Ranking of World Universities. These indicators have been criticised for allowing western universities to live off their intellectual capital and ignoring the rise of newcomers in Asia.
So what would ARWU look like without the Nobel and Fields awards?
At the very top things are the same. Harvard is still first and Stanford second. But Cambridge goes down and Oxford goes up.
Universities that would benefit significantly from deleting these indicators include Michigan, rising from 23rd to 13th, Pennsylvania State University, from 58th to 35th, the University of Florida, Tsinghua University, Alberta, Peking, Sao Paulo, Tel Aviv, Zhejiang and Scuola Normale Pisa, which would rise to the 201-300 band.
CWU have calculated the ratio between places in ARWU and the Alternative Ranking. The higher the score the greater the benefit from the Awards and Alumni indicators. The biggest gainers from Nobel and Fields laureates are Princeton, Moscow State University and Paris Sud (11).
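A minimal sketch of that calculation, with made-up placings and assuming the benefit measure is simply the alternative-ranking place divided by the ARWU place (so that values above 1 indicate a university lifted by the award indicators):

```python
# Hypothetical placings; the real figures come from CWU's published tables.
placings = {
    "University A (several laureates)": {"arwu": 10, "alternative": 25},
    "University B (no laureates)":      {"arwu": 23, "alternative": 13},
}

for name, p in placings.items():
    # Ratio > 1: the Awards and Alumni indicators flatter this university.
    ratio = p["alternative"] / p["arwu"]
    print(f"{name}: benefit ratio = {ratio:.2f}")
```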
The countries that have benefited most from these indicators are the USA, France and Germany.
It looks as though the ARWU has favoured the Ivy League, continental European universities and Cambridge at the expense of American public universities and the rising stars of Asia.
Saturday, October 11, 2014
Another Global Ranking
Just when you thought you could stop reading about rankings.
For several years US News & World Report (USNWR), publisher of America's Best Colleges, simply repackaged the QS World University Rankings, putting its own stamp on them for the American public.
Now USNWR has announced that it is going into the global rankings business. It seems that this time it will produce completely new rankings that have nothing to do with the Times Higher Education (THE) rankings. There will also be regional, country and subject rankings.
The data, however, will come from Thomson Reuters (TR), who are also the data providers for the THE world rankings and for two of the indicators, Highly Cited Researchers and Publications, in the Shanghai ARWU rankings. It is definitely unhealthy if TR are going to supply the data, or some of it, for three out of four well-known world rankings.
Bob Morse says that the new rankings will be "powered by Thomson Reuters InCites™ research analytics solutions". Does this mean that universities that do not join InCites will not be ranked? Will universities be allowed to opt in or opt out? Will all data come from TR? Will the survey be shared with THE or will there be another one?
Tuesday, October 07, 2014
The Times Higher Education World University Rankings
Publisher
Times Higher Education
Scope
Global. Data provided for 400 universities. Over 800 ranked.
Top Ten
Place | University |
---|---|
1 | California Institute of Technology (Caltech) |
2 | Harvard University |
3 | Oxford University |
4 | Stanford University |
5 | Cambridge University |
6 | Massachusetts Institute of Technology (MIT) |
7 | Princeton University |
8 | University of California Berkeley |
9= | Imperial College London |
9= | Yale University |
Countries with Universities in the Top Hundred
Country | Number of Universities |
---|---|
USA | 45 |
UK | 11 |
Germany | 6 |
Netherlands | 6 |
Australia | 5 |
Canada | 4 |
Switzerland | 3 |
Sweden | 3 |
South Korea | 3 |
Japan | 2 |
Singapore | 2 |
Hong Kong | 2 |
China | 2 |
France | 2 |
Belgium | 2 |
Italy | 1 |
Turkey | 1 |
Top Ranked in Region
Region | University |
---|---|
North America | California Institute of Technology (Caltech) |
Africa | University of Cape Town |
Europe | Oxford University |
Latin America | Universidade de Sao Paulo |
Asia | University of Tokyo |
Central and Eastern Europe | Lomonosov Moscow State University |
Arab World | Cadi Ayyad University, Marrakech |
Middle East | Middle East Technical University |
Oceania | University of Melbourne |
Noise Index
In the top 20, this year's THE world rankings are less volatile than the previous edition and this year's QS rankings. They are still slightly less stable than the Shanghai rankings.
Ranking | Average Place Change of Universities in the top 20 |
---|---|
THE World Rankings 2013-2014 | 0.70 |
THE World Rankings 2012-2013 | 1.20 |
QS World Rankings 2013-2014 | 1.45 |
ARWU 2013-2014 | 0.65 |
Webometrics 2013-2014 | 4.25 |
Center for World University Rankings (Jeddah) 2013-2014 | 0.90 |
Looking at the top 100 universities, the THE rankings are more stable than last year. The average university in the top 100 in 2013 rose or fell 4.34 places. The QS rankings are now more stable than the THE or Shanghai rankings.
Ranking | Average Place Change of Universities in the top 100 |
---|---|
THE World Rankings 2013-2014 | 4.34 |
THE World Rankings 2012-2013 | 5.36 |
QS World Rankings 2013-2014 | 3.94 |
ARWU 2013-2014 | 4.92 |
Webometrics 2013-2014 | 12.08 |
Center for World University Rankings (Jeddah) 2013-2014 | 10.59 |
Note: universities falling out of the top 100 are treated as though they fell to 101st position.
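For anyone who wants to reproduce these figures, here is a minimal sketch of the calculation, assuming the index is simply the mean absolute change in position between two editions, with universities dropping out of the published group treated as falling to one place below the cutoff:

```python
# Average place change between two editions of a ranking.
# Positions are 1 = top; `last_year` and `this_year` map university -> position.

def average_place_change(last_year, this_year, cutoff=100):
    changes = []
    for university, old_position in last_year.items():
        if old_position > cutoff:
            continue  # follow only last year's top-`cutoff` universities
        # A university that drops out of the top `cutoff` is treated
        # as having fallen to position cutoff + 1.
        new_position = this_year.get(university, cutoff + 1)
        changes.append(abs(new_position - old_position))
    return sum(changes) / len(changes)

# Toy example with three universities (not real data):
last = {"Uni X": 1, "Uni Y": 50, "Uni Z": 98}
this = {"Uni X": 2, "Uni Y": 45}            # Uni Z has dropped out of the top 100
print(average_place_change(last, this))     # (1 + 5 + 3) / 3 = 3.0
```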
Saturday, October 04, 2014
How to win citations and rise in the rankings
A large part of the academic world has either been congratulating itself on performing well in the latest Times Higher Education (THE) world rankings, the data for which is provided by Thomson Reuters (TR), or complaining that only large injections of public money will keep their universities from falling into the great pit of the unranked.
Some, however, have been baffled by some of the placings reported by THE this year. Federico Santa Maria Technical University in Chile is allegedly the fourth best university in Latin America, Scuola Normale Superiore di Pisa the best in Italy and Turkish universities are apparently the rising stars of the academic world.
When there is a university that appears to be punching above its weight, the cause often turns out to be the citations indicator.
Scuola Normale Superiore di Pisa is 63rd in the world with an overall score of 61.9 but a citations score of 96.4.
Royal Holloway, University of London is 118th in the world with an overall score of 53 but a citations score of 98.9.
The University of California Santa Cruz is top of the world for citations, with an overall score of 53.7 and 100 for citations.
Bogazici University is 139th in the world with an overall score of 51.1 and a citations score of 96.8.
Federico Santa Maria Technical University in Valparaiso is in the 251-275 band, so the total score is not given, although it would be easy enough to work out. It has a score of 99.7 for citations.
So what is going on?
The problem lies with various aspects of Thomson Reuters' methodology.
First they use field normalisation. That means that they do not simply count the number of citations but compare the number of citations in 250 fields with the world average in each field. Not only that, but they compare each year in which the paper is cited with the world average of citations for that year.
The rationale for this is that the number of citations and the rapidity with which papers are cited vary from field to field. A paper reporting a cure for cancer or the discovery of a new particle will be cited hundreds of times within weeks. A paper in philosophy, economics or history may languish for years before anyone takes notice. John Muth's work on rational expectations was hardly noticed or cited for years before eventually starting a revolution in economic theory. So universities should be compared to the average for fields and years. Otherwise, those that are strong in the humanities and social sciences will be penalised.
Up to a point this is not a bad idea, but it does assume that all disciplines are equally valuable and demanding. If the world has decided that it will fund medical research or astrophysics, and support journals and pay researchers to read and cite other researchers' papers, rather than media studies or education, then this is perhaps something rankers and data collectors should take account of.
In any case, by normalising for so many fields and then throwing normalisation by year into the mix, TR increase the likelihood of statistical anomalies. If someone can get a few dozen citations within a couple of years after publication in a field where citations, especially early ones, average below one a year then this could give an enormous boost to a university's citation score. That is precisely what happened with Alexandria University in 2010. Methodological tweaking has mitigated the risk to some extent but not completely. A university could also get a big boost by getting credit, no matter how undeserved, for a breakthrough paper or a review that is widely cited.
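A minimal sketch of how field and year normalisation works, and of why a low baseline can inflate a score, using entirely hypothetical baselines and citation counts rather than TR's actual figures:

```python
# Field- and year-normalised citation impact: each paper's citations are divided
# by the world-average citations for papers in the same field and year,
# and the resulting ratios are averaged.

def normalised_impact(papers, world_baselines):
    ratios = [p["citations"] / world_baselines[(p["field"], p["year"])]
              for p in papers]
    return sum(ratios) / len(ratios)

# Hypothetical world baselines (average citations per paper, by field and year).
baselines = {("particle physics", 2012): 25.0,
             ("philosophy", 2012): 0.8}

papers = [{"field": "philosophy", "year": 2012, "citations": 40},
          {"field": "particle physics", "year": 2012, "citations": 200}]

# A single philosophy paper with a few dozen citations, in a field where the
# baseline is below one citation, scores 50 times the world average, far more
# than a heavily cited physics paper at 8 times the world average.
print(normalised_impact(papers, baselines))
for p in papers:
    print(p["field"], p["citations"] / baselines[(p["field"], p["year"])])
```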
So let's take a look at some of the influential universities in the 2014 THE rankings. Scuola Normale Superiore di Pisa (SNSP) is a small research-intensive institution that might not even meet the criteria to be ranked by TR. Its output is modest, 2,407 publications in the Web of Science Core Collection between 2009 and 2013, although for a small institution that is quite good.
One of those publications is 'Observation of a new boson...' in Physics Letters B in September 2012, which has been cited 1,631 times.
The paper has 2,896 "authors", whom I counted by looking for semicolons in the "find" box, affiliated to 228 institutions. Five of them are from SNSP.
To put it crudely, SNSP is making an "authorship" contribution of 0.17% to the paper but getting 100% of the citation credit, as does every other contributor. Perhaps its researchers are playing a leading role in the Large Hadron Collider project, or perhaps it has made a disproportionate financial contribution, but TR provide no reason to think so.
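The arithmetic is easy to check (the author and affiliation counts are those given above; the proportional allocation is shown only for contrast, not as anyone's actual method):

```python
# Full counting versus a credit proportional to authorship.
total_authors   = 2896   # authors listed on 'Observation of a new boson...'
snsp_authors    = 5      # authors affiliated to SNSP
total_citations = 1631

authorship_share = snsp_authors / total_authors
print(f"authorship share: {authorship_share:.2%}")          # about 0.17%

print("full counting credit:", total_citations)             # all 1,631 citations
print("proportional credit:", round(total_citations * authorship_share, 1))  # about 2.8
```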
The University of the Andes, supposedly the second best university in Latin America, is also a contributor to this publication, as is Panjab University, supposedly the second best institution in the Indian subcontinent.
Meanwhile, Royal Holloway, University of London has contributed to 'Observation of a new particle...' in the same journal and issue. This has received 1,734 citations and involved 2,932 authors from 267 institutions, along with Tokyo Metropolitan University, Federico Santa Maria Technical University, Middle East Technical University and Bogazici University.
The University of California Santa Cruz is one of 119 institutions that contributed to the 'Review of particle physics...' 2010 which has been cited 3,739 times to date. Like all the other contributors it gets full credit for all those citations.
It is not just the number of citations that boosts citation impact scores but also their occurrence within a year or two of publication so that the number of citations is much greater than the average for that field and those years.
The proliferation of papers with hundreds of authors is not confined to physics. There are several examples from medicine and genetics as well.
At this point the question arises: why not divide the citations for each paper among its authors? This is an option available in the Leiden Ranking, so it should not be beyond TR's technical capabilities.
Or why not stop counting multi-authored publications when they exceed a certain quota of authors? This is exactly what TR did earlier this year when collecting data for its new highly cited researchers lists. Physics papers with more than 30 institutional affiliations were omitted, a very sensible procedure that should have been applied across the board.
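Either remedy is simple to express. A minimal sketch, assuming fractional counting divides each paper's citations by the number of contributing institutions, and the cap simply excludes papers above an affiliation threshold (30, as TR used for the highly cited researchers lists):

```python
# Two ways to tame mega-authored papers when crediting citations to one institution.

def fractional_credit(papers):
    # Each institution receives citations divided by the number of contributing institutions.
    return sum(p["citations"] / p["institutions"] for p in papers)

def capped_credit(papers, max_institutions=30):
    # Papers with more affiliations than the cap are excluded altogether.
    return sum(p["citations"] for p in papers if p["institutions"] <= max_institutions)

papers = [
    {"citations": 1631, "institutions": 228},  # an LHC-style collaboration paper
    {"citations": 40,   "institutions": 3},    # an ordinary paper
]

print(fractional_credit(papers))   # 1631/228 + 40/3, roughly 20.5
print(capped_credit(papers))       # 40: the collaboration paper is dropped
```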
So basically, one route to success in the rankings is to get into a multi-collaborator, mega-cited project.
But that is not enough in itself. There are hundreds of universities contributing to these publications. But not all of them reap such disproportionate benefits. It is important not to publish too much. A dozen LHC papers will do wonders if you publish 400 or 500 papers a year. Four thousand a year and it will make little difference. One reason for the success of otherwise obscure institutions is that the number of papers by which the citations are divided is small.
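A back-of-the-envelope illustration of the dilution effect, with hypothetical figures:

```python
# A dozen mega-cited papers lift the average far more at a small institution.
mega_papers, mega_citations_each = 12, 1500
ordinary_citations_each = 5

for total_papers in (500, 4000):
    ordinary_papers = total_papers - mega_papers
    mean_citations = (mega_papers * mega_citations_each
                      + ordinary_papers * ordinary_citations_each) / total_papers
    print(total_papers, "papers a year -> mean citations per paper:",
          round(mean_citations, 1))
# 500 papers a year:  about 40.9 citations per paper
# 4000 papers a year: about 9.5
```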
So why on earth are TR using a method that produces such laughable results? Let's face it, if any other ranker put SNS Pisa, Federico Santa Maria or Bogazici at the top of its flagship indicator we would go deaf from the chorus of academic tut-tutting.
TR, I suspect, are doing this because the method is identical, or nearly so, to that used for their InCites system for evaluating individual academics within institutions, which appears to be very lucrative, and they do not want the expense and inconvenience of recalculating data.
Also perhaps, TR have become so enamoured of the complexity and sophistication of their operations that they really do think that they have actually discovered pockets of excellence in unlikely places that nobody else has the skill or the resources to even notice.
But we have not finished. There is one more element in TR's distinctive methodology: the regional modification introduced in 2011.
This means that the normalised citation impact score of the university is divided by the square root of the impact score of the country in which it is located. A university located in a low scoring country will get a bonus that will be greater the lower the country's impact score. This would clearly be an advantage to countries like Chile, India and Turkey.
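A minimal sketch of the modification as described above, with hypothetical impact scores:

```python
import math

def regional_modification(university_impact, country_impact):
    # The university's normalised impact is divided by the square root of its
    # country's impact score, so universities in low-scoring countries get a boost.
    return university_impact / math.sqrt(country_impact)

# Hypothetical normalised impact scores (world average = 1.0).
print(regional_modification(1.2, 1.3))   # strong country: about 1.05
print(regional_modification(1.2, 0.5))   # weak country:   about 1.70
```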
Every year there are more multi-authored, multi-cited papers. It would not be surprising if university presidents start scanning the author lists of publications like the Review of Particle Physics, send out recruitment letters and get ready for ranking stardom.
Friday, October 03, 2014
Scuola Normale Superiore di Pisa: Are they dancing in the streets?
There is a lot of coverage of that huge pocket of excellence in Italy at ROARS: Return on Academic Research. I am still trying to make sense of the Google translation.
The university with more research influence than.....
With apologies to the Sydney Morning Herald which had a headline about Caltech.
I don't know if the people of Valparaiso are aware that they are home to a world-class university, but if they do find out this might be a nice headline.
Federico Santa Maria Technical University. More research influence than Princeton, Stanford, Harvard, Oxford, Cambridge, Yale, Duke, Bogazici, Colorado School of Mines.... (insert as you wish.)
Thursday, October 02, 2014
Which universities have the greatest research influence?
Times Higher Education (THE) claims that its Citations: Research Influence indicator, prepared by Thomson Reuters (TR), is the flagship of its World University Rankings. It is strange, then, that the magazine has never published a research influence ranking, although that ought to be just as interesting as its Young Universities Ranking, Reputation Rankings or gender index.
So let's have a look at the top 25 universities in the world this year, ranked for research influence as measured by Thomson Reuters' field- and year-normalised citations.
Santa Cruz and Tokyo Metropolitan have the same impact as MIT. Federico Santa Maria Technical University is ahead of Princeton. Florida Institute of Technology beats Harvard. Bogazici University and Scuola Normale Superiore do better than Oxford and Cambridge.
Are they serious?
Apparently. There will be an explanation in the next post. Meanwhile go and check if you don't believe me. And let me know if there's any dancing in the streets of Valparaiso, Pisa, Golden or Istanbul.
Rank and Score for Citations: Research Influence 2014-15 THE World Rankings
Rank | University | Score |
---|---|---|
1= | University of California Santa Cruz | 100 |
1= | MIT | 100 |
1= | Tokyo Metropolitan University | 100 |
4 | Rice University | 99.9 |
5= | Caltech | 99.7 |
5= | Federico Santa Maria Technical University, Chile | 99.7 |
7 | Princeton University | 99.6 |
8= | Florida Institute of Technology | 99.2 |
8= | University of California Santa Barbara | 99.2 |
10= | Stanford University | 99.1 |
10= | University of California Berkeley | 99.1 |
12= | Harvard University | 98.9 |
12= | Royal Holloway University of London | 98.9 |
14 | University of Colorado Boulder | 97.4 |
15 | University of Chicago | 97.3 |
16= | Washington University in St Louis | 97.1 |
16= | Colorado School of Mines | 97.1 |
18 | Northwestern University | 96.9 |
19 | Bogazici University, Turkey | 96.8 |
20 | Duke University | 96.6 |
21= | Scuola Normale Superiore Pisa, Italy | 96.4 |
21= | University of California San Diego | 96.4 |
23 | Boston College | 95.9 |
24 | Oxford University | 95.5 |
25= | Brandeis University | 95.3 |
25= | UCLA | 95.3 |