I am feeling a bit embarrassed. In a recent post I wrote about the Shanghai Rankings (ARWU) being a bit boring (which is good) because university ranks usually do not change very much. But then I noticed that a couple of Australian universities did very well in the latest rankings. One of them, the Australian National University (ANU), has risen a spectacular (for ARWU) 31 places over last year. The Financial Review says that "[u]niversity scientific research has boosted the position of two Australian universities in a global ranking of higher education providers."
The ranking is ARWU and the rise in the ranking is linked to the economic contribution of Australian universities, especially those in the Group of Eight.
So how well did Australian universities do? The top performer, as in previous years, is the University of Melbourne, which went up a spot to 38th place. Two other universities went up a lot in a very un-Shanghainese way: ANU, already mentioned, from 69th to 38th place, and the University of Sydney from 83rd to 68th.
The University of Queensland was unchanged in 55th place while Monash fell from 78th to 91st and the University of Western Australia from 91st to 93rd.
How did ANU and Sydney do it? The ANU scores for Nobel and Fields awards were unchanged. Publications were up a bit and papers in Nature and Science down a bit.
What made the difference was the score for highly cited researchers, derived from lists kept by Clarivate Analytics, which rose from 15.4 to 23.5, a difference of 8.1 or, after weighting, 1.62 points of the overall score. The difference in total scores between 2017 and 2018 was 1.9 so those highly cited researchers made up most of the difference.
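For the arithmetically inclined, here is a minimal sketch of that calculation; the 20 per cent weighting for the highly cited researchers (HiCi) indicator comes from ARWU's published methodology, and the scores are those quoted above.

```python
# Back-of-the-envelope check: contribution of the HiCi indicator to ANU's
# overall ARWU score change. The 20% weighting is ARWU's published weight
# for the highly cited researchers indicator.
hici_2017, hici_2018 = 15.4, 23.5
HICI_WEIGHT = 0.20

weighted_gain = (hici_2018 - hici_2017) * HICI_WEIGHT
total_gain = 1.9  # change in ANU's overall score between 2017 and 2018

print(f"HiCi contribution: {weighted_gain:.2f} of {total_gain} points "
      f"({weighted_gain / total_gain:.0%} of the rise)")
# HiCi contribution: 1.62 of 1.9 points (85% of the rise)
```

In other words, roughly 85 per cent of ANU's overall gain can be traced to that single indicator.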
In 2016 ANU had two researchers in the list, which was used for the 2017 rankings. One was also on the 2017 list, used in 2018. In 2017 there were six ANU highly cited researchers: one from the previous year, one who had moved from MIT, and four long-serving ANU researchers.
Let's be clear. ANU has not been handing out unusual contracts or poaching from other institutions. It has grown its own researchers and should be congratulated.
But using an indicator where a single researcher can lift a top 100 university seven or eight places is an invitation to perverse consequences. ARWU should consider whether it is time to explore other measures of research impact.
The improved scores for the University of Sydney resulted from an increase between 2016 and 2017 in the number of articles published in the Science Citation Index Expanded and the Social Science Citation Index.
Monday, June 04, 2018
Ranking rankings: Crass materialism
As the number and scope of university rankings increase, it is time to start thinking about how to rank the rankers.
Indicators for global rankings might include number of universities ranked (Webometrics in 1st place), number of indicators (Round University Ranking), bias, and stability.
There could also be an indicator for crass materialism. Here is a candidate for first place: CNBC quotes a report from Wealth-X (supposedly downloadable, good luck) and lists the top ten universities for billionaire alumni, all of them in the USA. Apparently the full ranking also includes universities outside the US.
1. Harvard
2. Stanford
3. Pennsylvania
4. Columbia
5. MIT
6. Cornell
7. Yale
8= Southern California
8= Chicago
10. Michigan
Thursday, May 31, 2018
Where did the top data scientists study?
The website efinancialcareers has a list of the top twenty data scientists in finance and banking. This looks like a subjective list, and another writer might come up with a different set of experts. Even so, it is quite interesting.
Their degrees are mainly in things like engineering, computer science and maths. There is only one each in business, economics and finance.
The institutions where they studied are:
Stanford (three)
University College London (three)
Institut Polytechnique de Grenoble
Oxford
Leonard Stern School of Business, New York University
University of Mexico
Université Paris Dauphine
École Polytechnique
Rensselaer Polytechnic Institute (RPI)
California State University
Indian Institute of Science
Johns Hopkins University
Institute of Management Development and Research, India
University of Illinois
University of Pittsburgh
Indian Institute of Technology.
Harvard, MIT and Cambridge are absent but there are three Indian Institutes, three French schools and some non-Ivy US places like RPI and the Universities of Pittsburgh and Illinois.
Friday, May 11, 2018
Ranking Insights from Russia
The ranking industry is expanding and new rankings appear all the time. Most global rankings measure research publications and citations. Others try to add to the mix indicators that might have something to do with teaching and learning. There is now a ranking that tries to capture various third missions.
The Round University Rankings published in Russia are in the tradition of holistic rankings. They give a 40% weighting to research, 40% to teaching, 10% to international diversity and 10% to financial sustainability. Each group contains five equally weighted indicators. The data is derived from Clarivate Analytics, which also contributes to the US News Best Global Universities Rankings.
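As an illustration of how such a scheme combines scores, here is a minimal sketch; the group weights are RUR's published ones, but the indicator scores are invented for the example.

```python
# RUR-style aggregation: four groups weighted 40/40/10/10, each group being
# the mean of five equally weighted indicators. Indicator scores are invented.
groups = {
    "teaching":                 (0.40, [70, 65, 80, 55, 60]),
    "research":                 (0.40, [75, 50, 85, 60, 70]),
    "international_diversity":  (0.10, [40, 45, 50, 35, 55]),
    "financial_sustainability": (0.10, [60, 70, 65, 50, 55]),
}

overall = sum(weight * sum(scores) / len(scores)
              for weight, scores in groups.values())
print(f"overall score: {overall:.1f}")  # 64.1
```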
These rankings are similar to the THE rankings in that they attempt to assess quality rather than quantity but they have 20 indicators instead of 13 and assign sensible weightings. Unfortunately, they receive only a fraction of the attention given to the THE rankings.
They are, however, very valuable since they dig deeper into the data than other global rankings. They also show that there is a downside to measures of quality and that data submitted directly by institutions should be treated with caution and perhaps scepticism.
Here are the top universities for each of the RUR indicators.
Teaching
Academic staff per students: VIB (Flemish Institute of Biotechnology), Belgium
Academic staff per bachelor degrees awarded: University of Valladolid, Spain
Doctoral degrees per academic staff: Kurdistan University of Medical Science, Iran
Doctoral degrees per bachelor degrees awarded: Jawaharlal Nehru University, India
World teaching reputation: Harvard University, USA.
Research
Citations per academic and research staff: Harvard
Doctoral degrees per admitted PhD: Al Farabi Kazakh National University
Normalised citation impact: Rockefeller University, USA
Share of international co-authored papers: Free University of Berlin
World research reputation: Harvard.
International diversity
Share of international academic staff: American University of Sharjah, UAE
Share of international students: American University of Sharjah
Share of international co-authored papers: Innopolis University, Russia
Reputation outside region: Voronezh State Technical University, Russia
International Level: EPF Lausanne, Switzerland.
Financial sustainability
Institutional income per academic staff: Universidade Federal do Ceará, Brazil
Institutional income per student: Rockefeller University
Papers per research income: Novosibirsk State University of Economics and Management, Russia
Research income per academic and research staff: Istanbul Technical University, Turkey
Research income per institutional income: A C Camargo Cancer Center, Brazil.
There are some surprising results here. The most obvious is Voronezh State Technical University, which is first for reputation outside its region (Asia, Europe and so on), even though its overall scores for reputation and for international diversity are very low. The other top universities for this metric are just what you would expect: Harvard, MIT, Stanford, Oxford and so on. I wonder whether there is some sort of bug in the survey procedure, perhaps something like the university's supporters being assigned to Asia and therefore counted as out of region. The university is also in second place in the world for papers per research income despite very low scores for the other research indicators.
There are other oddities, such as Novosibirsk State University of Economics and Management placed first for papers per research income and Universidade Federal do Ceará first for institutional income per academic staff. These may result from anomalies in the procedures for reporting and analysing data, possibly including problems in collecting data on income and staff.
It also seems that medical schools and specialist or predominantly postgraduate institutions such as Rockefeller University, the Kurdistan University of Medical Science, Jawaharlal Nehru University and VIB have a big advantage with these indicators since they tend to have favourable faculty-student ratios, sometimes boosted by large numbers of clinical and research-only staff, and a large proportion of doctoral students.
Jawaharlal Nehru University is a mainly postgraduate university, so a high placing for doctoral degrees per bachelor degrees awarded is not unexpected, although I am surprised that it is ahead of Yale and Princeton. I must admit that the third place here for the University of Baghdad needs some explanation.
The indicator doctoral degrees per admitted PhD might identify universities that do a good job of selection and training and get large numbers of doctoral candidates through the system. Or perhaps it identifies universities where doctoral programmes are so lacking in rigour that nearly everybody can get their degree once admitted. The top ten of this indicator includes De Montfort University, Shakarim University, Kingston University, and the University of Westminster, none of which are famous for research excellence across the range of disciplines.
Measures of international diversity have become a staple of global rankings since they are fairly easy to collect. The problem is that international orientation may have something to do with quality, but it may also simply be a necessary attribute of being in a small country next to larger countries with the same or similar language and culture. The top ten for the international student indicator includes the Central European University and the American University of Sharjah. For international faculty it includes the University of Macau and Qatar University.
To conclude, these indicators suggest that self-submitted institutional data should be used sparingly and that data from third-party sources may be preferable. Also, while ranking by quality instead of quantity is sometimes advisable, it also means that anomalies and outliers are more likely to appear.
Saturday, December 16, 2017
Measuring graduate employability: two rankings
Global university rankings are now well into their second decade. Since 2003, when the first Shanghai rankings appeared, there has been a steady growth of global and regional rankings. At the moment most global rankings are of two kinds, those that focus entirely or almost entirely on research and those such as the Russian Round Rankings, Times Higher Education (THE) and Quacquarelli Symonds (QS) that claim to also measure teaching, learning or graduate quality in some way, although even those are biased towards research when you scratch the surface a little.
The ranking industry has become adept at measuring research productivity and quality in various ways. But the assessment of undergraduate teaching and learning is another matter.
Several ranking organisations use faculty-student ratio as a proxy for quality of teaching, which in turn is assumed to have some connection with something that happens to students during their programmes. THE also count institutional income, research income and income from industry, again assuming that there is a significant association with academic excellence. Indicators like these are usually based on data supplied by the institutions themselves. For examples of problems here see an article by Alex Usher and a reply by Phil Baty.
An attempt to get at student quality is provided by the CWUR rankings now based in UAE, counting alumni who win international awards or who are CEOs of major companies. But obviously this is relevant only for a very small number of universities. A new pilot ranking from Moscow also counts international awards.
The only attempt to measure student quality by the well-known rankers that is relevant to most institutions is the survey of employers in the QS world and regional rankings. There are some obvious difficulties here. QS gets respondents from a variety of channels and this may allow some universities to influence the survey. In recent years some Latin American universities have done much better on this indicator than on any other.
THE now publish a global employability ranking which is conducted by two European firms, Trendence and Emerging. This is based on two surveys of recruiters in Argentina, Australia, Austria, Brazil, Canada, China, Germany, France, India, Israel, Italy, Japan, Mexico, Netherlands, Singapore, Spain, South Africa, South Korea, Turkey, UAE, UK, and USA. There were two panels with a total of over 6,000 respondents.
A global survey that does not include Chile, Sweden, Egypt, Nigeria, Saudi Arabia, Russia, Pakistan, Indonesia, Bangladesh, Poland, Malaysia or Taiwan can hardly claim to be representative of international employers. This limited representation may explain some oddities of the rankings such as the high places of the American University of Dubai and the National Autonomous University of Mexico.
The first five places in these rankings are quite similar to the THE world rankings: Caltech, Harvard, Columbia, MIT, Cambridge. But there are some significant differences after that and some substantial changes since last year. Here Columbia, 14th in the world rankings, is in third place, up from 12th last year. Boston University is 6th here but 70th in the world rankings. Tokyo Institute of Technology in 19th place is in the 251-300 band in the world rankings. CentraleSupélec is 41st but in the world 401-500 group.
These rankings are useful only for a small minority of universities, stakeholders and students. Only 150 schools are ranked and only a small proportion of the world's employers consulted.
QS have also released their global employability rankings with 500 universities. These combine the employer reputation survey used in their world rankings with other indicators: alumni outcomes, based on lists of high achievers; partnership with employers, that is, research collaboration noted in the Scopus database; employer-student connections, that is, employers actively present on campus; and graduate employment rate. There seems to be a close association, at least at the top, between overall scores, employer reputation and alumni outcomes. Overall the top three are Stanford, UCLA and Harvard. For employer reputation they are Cambridge, Oxford and Harvard, and for alumni outcomes Harvard, Stanford and Oxford.
The other indicators are a different matter. For employer-student connections the top three are Huazhong University of Science and Technology, Arizona State University, and New York University. In fact seven out of the top ten on this measure are Chinese. For graduate employment rate they are Politecnico di Torino, Moscow State Institute of International Relations, and Sungkyunkwan University, and for partnership with employers Stanford, Surrey and Politecnico di Milano. When the front runners in indicators are so different, one has to wonder about their validity.
There are some very substantial differences in the ranks given to various universities in these rankings. Caltech is first in the Emerging-Trendence rankings and 73rd in QS. Hong Kong University of Science and Technology is 12th in Emerging-Trendence but not ranked at all by QS. The University of Sydney is 4th in QS and 48th in Emerging-Trendence. The American University of Dubai is in QS's 301-500 band but 138th for Emerging-Trendence.
The rankings published by THE could be of some value to those students contemplating careers with the leading companies in the richest countries.
The QS rankings may be more helpful for those students or stakeholders looking at universities outside the very top of the global elite. Even so QS have ranked only a fraction of the world's universities.
It still seems that the way forward in the assessment of graduate outcomes and employability is through standardised testing along the lines of AHELO or the Collegiate Learning Assessment.
Friday, November 17, 2017
Another global ranking?
In response to a suggestion by Hee Kim Poh of Nanyang Technological University, I have had a look at the Worldwide Professional University Rankings, which appear to be linked to "Global World Communicator" and the "International Council of Scientists" and may be based in Latvia.
There is a methodology page but it does not include essential information. One indicator is "number of publications to number of academic staff" but there is nothing about how either of these is calculated or where the data comes from. There is a reference to a survey of members of the International Council of Scientists but nothing about the wording of the survey, the date of the survey, the distribution of respondents or the response rate.
Anyway, here is the introduction to the methodology:
"The methodology of the professional ranking of universities is based on comparing universities and professional evaluation by level of proposed training programs (degrees), availability and completeness of information on activities of a university, its capacity and reputation on a national and international levels. Main task is to determine parameters and ratios needed to assess quality of the learning process and obtained specialist knowledge. Professional formalized ranking system based on a mathematical calculation of the relation of parameters of the learning process characterizing quality of education and learning environment. Professional evaluation criteria are developed and ranking is carried out by experts of the highest professional qualification in relevant fields - professors of universities, specialists of the highest level of education, who have enough experience in teaching and scientific activities. Professional rating of universities consists of three components.. "
The top five universities are 1. Caltech, 2. Harvard, 3. MIT, 4. Stanford, 5. ETH Zurich.
Without further information, I do not think that this ranking is worth further attention.
http://www.cicerobook.com/en/ranks
Tuesday, September 05, 2017
Highlights from THE citations indicator
The latest THE world rankings were published yesterday. As always, the most interesting part is the field- and year-normalised citations indicator that supposedly measures research impact.
Over the last few years, an array of implausible places has zoomed into the top ranks of this metric, sometimes disappearing as rapidly as they arrived.
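For readers new to the metric, here is a rough sketch of the generic idea behind field- and year-normalisation; THE's actual implementation (via Elsevier's Scopus data) is more elaborate, and the numbers here are invented.

```python
# Generic field- and year-normalised citation impact: each paper's citation
# count is divided by the world average for papers of the same field and
# year, and the ratios are averaged. Values above 1.0 are above world average.
papers = [
    # (citations, world average for the same field and year) - invented
    (12, 10.0),
    (3, 4.0),
    (50, 8.0),  # one highly cited paper can dominate a small portfolio
]

impact = sum(cites / world_avg for cites, world_avg in papers) / len(papers)
print(f"normalised citation impact: {impact:.2f}")  # 2.73
```

With only a handful of papers, a single outlier drags the average far above 1.0, which is how small institutions keep outscoring the research giants on this indicator.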
The first place for citations this year goes to MIT. I don't think anyone would find that very controversial.
Here are some of the institutions that feature in the top 100 of THE's most important indicator, which has a weighting of 30 per cent.
2nd St. George's, University of London
3rd= University of California Santa Cruz, ahead of Berkeley and UCLA
6th = Brandeis University, equal to Harvard
11th= Anglia Ruskin University, UK, equal to Chicago
14th= Babol Noshirvani University of Technology, Iran, equal to Oxford
16th= Oregon Health and Science University
31st King Abdulaziz University, Saudi Arabia
34th= Brighton and Sussex Medical School, UK, equal to Edinburgh
44th Vita-Salute San Raffaele University, Italy, ahead of the University of Michigan
45th= Ulsan National Institute of Science and Technology, best in South Korea
58th= University of Kiel, best in Germany and equal to King's College London
67th= University of Iceland
77th= University of Luxembourg, equal to University of Amsterdam
Thursday, August 24, 2017
Guest Post by Pablo Achard
This post is by Pablo Achard of the University of Geneva. It refers to the Shanghai subject rankings. However, the problem of outliers in subject and regional rankings is one that affects all the well-known rankings and will probably become more important over the next few years.
How a single article is worth 60 places
We can't repeat it enough: an indicator is bad when a small variation in the input is overly amplified in the output. This is the case when indicators are based on very few events.
I recently came across this issue (again) with Shanghai's subject ranking of universities. The universities of Geneva and Lausanne (Switzerland) share the same School of Pharmacy and a huge share of published articles in this discipline are signed under the name of both institutions. But in the "Pharmacy and pharmaceutical sciences" ranking, one is ranked between the 101st and 150th position while the other is 40th. Where does this difference come from?
Comparing the scores obtained under each category gives a clue:

| Indicator | Geneva | Lausanne | Weight in the final score |
|---|---|---|---|
| PUB | 46 | 44.3 | 1 |
| CNCI | 63.2 | 65.6 | 1 |
| IC | 83.6 | 79.5 | 0.2 |
| TOP | 0 | 40.8 | 1 |
| AWARD | 0 | 0 | 1 |
| Weighted sum | 125.9 | 166.6 | |
So the main difference between the two institutions is the score in "TOP". Actually, the difference in the weighted sum (40.7) is almost equal to the value of this score (40.8). If Geneva and Lausanne had the same TOP score, they would be 40th and 41st.
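The weighted sums in the table can be reproduced directly; here is a minimal sketch using the scores and weights shown above.

```python
# Reproduce the weighted sums from the table above.
weights  = {"PUB": 1.0,  "CNCI": 1.0,  "IC": 0.2,  "TOP": 1.0,  "AWARD": 1.0}
geneva   = {"PUB": 46.0, "CNCI": 63.2, "IC": 83.6, "TOP": 0.0,  "AWARD": 0.0}
lausanne = {"PUB": 44.3, "CNCI": 65.6, "IC": 79.5, "TOP": 40.8, "AWARD": 0.0}

def weighted_sum(scores):
    return sum(weights[name] * value for name, value in scores.items())

print(round(weighted_sum(geneva), 1), round(weighted_sum(lausanne), 1))
# 125.9 166.6 -- the 40.7 gap is almost exactly Lausanne's TOP score of 40.8
```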
Surprisingly, a look at other institutions for that TOP indicator shows only 5 different values: 0, 40.8, 57.7, 70.7 and 100. According to the methodology page of the ranking, "TOP is the number of papers published in Top Journals in an Academic Subject for an institution during the period of 2011-2015. Top Journals are identified through ShanghaiRanking's Academic Excellence Survey […] The list of the top journals can be found here […] Only papers of 'Article' type are considered."
Looking deeper, there is just one journal in this list for Pharmacy: NATURE REVIEWS DRUG DISCOVERY. As its name indicates, this recognized journal mainly publishes 'reviews'. A search on Web of Knowledge shows that in the period 2011-2015, only 63 'articles' were published in this journal. That means a small variation in the input is overly amplified.
I searched for several institutions and rapidly found this rule: Harvard published 4 articles during these five years and got a score of 100; MIT published 3 articles and got a score of 70.7; 10 institutions published 2 articles and got a 57.7; and finally about 50 institutions published 1 article and got a 40.8.
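That observed pattern can be written down as a simple lookup; this is only the empirical mapping described above, not ShanghaiRanking's actual scaling formula, which is not published.

```python
# Observed TOP scores for article counts in the single listed Pharmacy
# journal, 2011-2015. Empirical pattern only; the real formula is not public.
observed_top_score = {0: 0.0, 1: 40.8, 2: 57.7, 3: 70.7, 4: 100.0}

for articles in range(5):
    print(f"{articles} article(s) -> TOP score {observed_top_score[articles]}")
# The biggest single step is from 0 to 1 article (+40.8 points): exactly the
# gap that separates Geneva from Lausanne.
```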
I still don't get why this score is so nonlinear. But Lausanne published one single article in NATURE REVIEWS DRUG DISCOVERY and Geneva none (they published 'reviews' and 'letters' but no 'articles'), and that small difference led to a gap of at least 60 places between the two institutions.
This is of course just one example of what happens too often: rankers want to publish sub-rankings and end up with indicators where outliers can't be absorbed into large distributions. One article, one prize or one co-author in a large and productive collaboration all of a sudden makes very large differences in final scores and ranks.
Thursday, August 03, 2017
America's Top Colleges: 2017 Rankings
America's Top Colleges is published by Forbes business magazine. It is an unabashed assessment of institutions from the viewpoint of the student as investor. The metrics are post-graduate success, debt, student experience, graduation rate and academic success.
The top three colleges are Harvard, Stanford and Yale.
The top three liberal arts colleges are Pomona, Claremont McKenna and Williams.
The top three low debt private colleges are College of the Ozarks, Berea College and Princeton.
The top three STEM colleges are MIT, Caltech and Harvey Mudd College.
Wednesday, July 19, 2017
Comments on an Article by Brian Leiter
Global university rankings are now nearly a decade and a half old. The Shanghai rankings (Academic Ranking of World Universities or ARWU) began in 2003, followed a year later by Webometrics and the THES-QS rankings which, after an unpleasant divorce, became the Times Higher Education (THE) and the Quacquarelli Symonds (QS) world rankings. Since then the number of rankings with a variety of audiences and methodologies has expanded.
We now have several research-based rankings, University Ranking by Academic Performance (URAP) from Turkey, the National Taiwan University Rankings, Best Global Universities from US News, Leiden Ranking, as well as rankings that include some attempt to assess and compare something other than research, the Round University Rankings from Russia and U-Multirank from the European Union. And, of course, we also have subject rankings, regional rankings, even age group rankings.
It is interesting that some of these rankings have developed beyond the original founders of global rankings. Leiden Ranking is now the gold standard for the analysis of publications and citations. The Russian rankings use the same Web of Science database that THE did until 2014 and have 12 of the 13 indicators used by THE, plus another eight, in a more sensible and transparent arrangement. However, both of these receive only a fraction of the attention given to the THE rankings.
The research rankings from Turkey and Taiwan are similar to the Shanghai rankings but without the elderly or long departed Fields and Nobel award winners and with a more coherent methodology. U-Multirank is almost alone in trying to get at things that might be of interest to prospective undergraduate students.
It is regrettable that an article by Professor Brian Leiter of the University of Chicago in the Chronicle of Higher Education, 'Academic Ethics: To Rank or Not to Rank', ignores such developments and mentions only the original "Big Three": Shanghai, QS and THE. This is perhaps forgivable, since the establishment media, including THE and the Chronicle, and leading state and academic bureaucrats have until recently paid very little attention to innovative developments in university ranking. Leiter attacks the QS rankings and proposes that they should be boycotted while trying to improve the THE rankings.
It is a little odd that Leiter should be so caustic, not entirely without justification, about QS while apparently being unaware of similar or greater problems with THE.
He begins by saying that QS stands for "quirky silliness". I would not disagree with that although in recent years QS has been getting less silly. I have been as sarcastic as anyone about the failings of QS: see here and here for an amusing commentary.
But the suggestion that QS is uniquely bad in contrast to THE is way off the target. There are many issues with the QS methodology, especially with its employer and academic surveys, and it has often announced placings that seem very questionable, such as Nanyang Technological University (NTU) ahead of Princeton and Yale or the University of Buenos Aires in the world top 100, largely as a result of a suspiciously good performance in the survey indicators. The oddities of the QS rankings are, however, no worse than some of the absurdities that THE has served up in their world and regional rankings. We have had places like Cadi Ayyad University in Marrakesh, Morocco, Middle East Technical University in Turkey, Federico Santa Maria Technical University in Chile, Alexandria University in Egypt and Veltech University in India rise to ludicrously high places, sometimes just for a year or two, as the result of a few papers or even a single highly cited author.
I am not entirely persuaded that NTU deserves its top 12 placing in the QS rankings. You can see here QS's unconvincing reply to a question that I provided. QS claims that NTU's excellence is shown by its success in attracting foreign faculty, students and collaborators, but when you are in a country where people show their passports to drive to the dentist, being international is no great accomplishment. Even so, it is evidently world class as far as engineering and computer science are concerned and it is not impossible that it could reach an undisputed overall top ten or twenty ranking in the next decade.
While the THE top ten or twenty or even fifty looks quite reasonable, apart from Oxford in first place, there are many anomalies as soon as we start breaking the rankings apart by country or indicator, and THE has pushed some very weird data in recent years. Look at these places supposed to be regional or international centres of across-the-board research excellence as measured by citations: St George's, University of London, Brandeis University, the Free University of Bozen-Bolzano, King Abdulaziz University, the University of Iceland, Veltech University. If QS is silly, what are we to call a ranking where Anglia Ruskin University is supposed to have a greater research impact than Chicago, Cambridge or Tsinghua?
Leiter starts his article by pointing out that the QS academic survey is largely driven by the geographical distribution of its respondents and by the halo effect. This is very probably true, and to that I would add that a lot of the responses to academic surveys of this kind are likely driven by simple self-interest, academics voting for their alma mater or current employer. QS does not allow respondents to vote for the latter but they can vote for the former and also vote for grant providers or collaborators.
He says that "QS does not, however, disclose the geographic distribution of its survey respondents, so the extent of the distorting effect cannot be determined". This is not true of the overall survey. QS does in fact give very detailed figures about the origin of its respondents and there is good evidence here of probable distorting effects. There are, for example, more responses from Taiwan than from Mainland China, and almost as many from Malaysia as from Russia. QS does not, however, go down to subject level when listing geographic distribution.
He then refers to the case of University College Cork (UCC) asking faculty to solicit friends in other institutions to vote for UCC. This is definitely a bad practice, but it was in violation of QS guidelines and QS have investigated. I do not know what came of the investigation but it is worth noting that the message would not have been an issue if it had referred to the THE survey.
On balance, I would agree that THE's survey methodology is less dubious than QS's and less likely to be influenced by energetic PR campaigns. It would certainly be a good idea if the weighting of the QS survey were reduced and if there were more rigorous screening and classification of potential respondents.
But I think we also have to bear in mind that QS does prohibit respondents from voting for their own universities and it does average results out over a five-year period (formerly three years).
It is interesting that while THE does not usually combine and average survey results, it did so in the 2016-17 world rankings, combining the 2015 and 2016 survey results. This was, I suspect, probably because of a substantial drop in 2016 in the percentage of respondents from the arts and humanities that would, if unadjusted, have caused a serious problem for UK universities, especially those in the Russell Group.
Leiter then goes on to condemn QS for its dubious business practices. He reports that THE dropped QS because of its dubious practices. That is what THE says but it is widely rumoured within the rankings industry that THE was also interested in the financial advantages of a direct partnership with Thomson Reuters rather than getting data from QS.
He also refers to QS's hosting a series of "World Class events" where world university leaders pay $950 for "seminar, dinners, coffee breaks" and "learn best practice for branding and marketing your institution through case studies and expert knowledge", and the QS Stars plan where universities pay to be audited by QS in return for stars that they can use for promotion and advertising. I would add to his criticism that the Stars program has apparently undergone a typical "grade inflation", with the number of five-star universities increasing all the time.
Also, QS offers specific consulting services and it has a large number of clients from around the world, although there are many more from Australia and Indonesia than from Canada and the US. Of the three from the US, one is MIT, which has been number one in the QS world rankings since 2012, a position it probably achieved after a change in the way in which faculty were classified.
It would, however, be misleading to suggest that THE is any better in this respect. Since 2014 it has launched a serious and unapologetic "monetisation of data" program. There are events such as the forthcoming world "academic summit" where for 1,199 GBP (standard university) or 2,200 GBP (corporate), delegates can get "Exclusive insight into the 2017 Times Higher Education World University Rankings at the official launch and rankings masterclass", plus a "prestigious gala dinner, drinks reception and other networking events". THE also provides a variety of benchmarking and performance analysis services, branding, advertising and reputation management campaigns and a range of silver and gold profiles, including adverts and sponsored supplements. THE's data clients include some illustrious names like the National University of Singapore and Trinity College Dublin plus some less well-known places such as Federico Santa Maria Technical University, Orebro University, King Abdulaziz University, National Research Nuclear University MEPhI Moscow, and Charles Darwin University.
Among THE's activities are regional events that promise "partnership opportunities for global thought leaders" and where rankings like "the WUR are presented at these events with our award-winning data team on hand to explain them, allowing institutions better understanding of their findings".
At some of these summits the rankings presented are trimmed and tweaked and somehow the hosts emerge in a favourable light. In February 2015, for example, THE held a Middle East and North Africa (MENA) summit that included a "snapshot ranking" that put Texas A and M University Qatar, a branch campus that offers nothing but engineering courses, in first place and Qatar University in fourth. The ranking consisted of precisely one indicator out of the 13 that make up THE's world university rankings: field- and year-normalised citations. United Arab Emirates University (UAEU) was 11th and the American University of Sharjah in the UAE 14th.
The next MENA summit was held in January 2016 in Al Ain in the UAE. There was no snapshot this time and the methodology for the MENA rankings included the 13 indicators of THE's world rankings. Host country universities were now in fifth (UAEU) and eighth place (American University of Sharjah). Texas A and M Qatar was not ranked and Qatar University fell to sixth place.
Something similar happened in Africa. In 2015, THE went to the University of Johannesburg for a summit that brought together "outstanding global thought leaders from industry, government, higher education and research" and which unveiled THE's Africa ranking, based on citations (with the innovation of fractional counting), that put the host university in ninth place and the University of Ghana in twelfth.
In 2016 the show moved on to the University of Ghana where another ranking was produced based on all the 13 world ranking indicators. This time the University of Johannesburg did not take part and the University of Ghana went from 12th place to 7th.
I may have missed something but so far I do not see any sign of THE Africa or MENA summits planned for 2017. If so, then African and MENA university leaders are to be congratulated for a very healthy scepticism.
To be fair, THE does not seem to have done any methodological tweaking for this year's Asian, Asia Pacific and Latin American rankings.
Leiter concludes that American academics should boycott the QS survey but not THE's and that they should lobby THE to improve its survey practices. That, I suspect, is pretty much a nonstarter. QS has never had much of a presence in the US anyway and THE is unlikely to change significantly as long as its commercial dominance goes unchallenged and as long as scholars and administrators fail to see through its PR wizardry. It would be better for everybody to start looking beyond the "Big Three" rankings.
Sunday, June 18, 2017
Comparing the THE and QS Academic Reputation Surveys
Times Higher Education (THE) has just published its 2017 reputation rankings which include 100 universities. These are based on a survey distributed between January and March of this year and will be included, after standardisation, in the 2017-18 (or 2018) World University Rankings scheduled for publication in a few months. In the forthcoming world rankings the reputation survey will be divided into two metrics in the research and teaching indicator groups, with a combined weighting of 33 percent. The survey asked about research and postgraduate teaching but since the correlation between these two questions is very high there is effectively only one indicator.
The QS world rankings released last week included scores derived from two surveys, one of academics with a 40% weighting and one of employers with 10%. The academic survey was concerned only with research.
The methodology of the THE survey is relatively simple. The respondents are drawn from the database of researchers with publications in Scopus-indexed journals, in other words those who get to be listed as corresponding author. THE claims that this makes them experienced senior researchers, although in many parts of the world being a member or leader of a research team often has more to do with politics than merit.
In contrast, the QS methodology has changed quite a lot over the last few years. It began with scouring the mailing lists of World Scientific, a Singapore-based academic publisher with links to Imperial College London, then adding various other channels including lists supplied by institutions and sign-up facilities for potential respondents. The result is a survey that appears more inclusive than THE's, with more respondents from outside the elite, but one whose validity may be rather suspect.
The THE ranking found that there were six super-brand universities that stood out from everyone else, Harvard, MIT, Stanford, Cambridge, Oxford, and Berkeley. There was a big gap between Berkeley and number seven Princeton and then the long smooth slope continues.
After that, the ranking is dominated by English speaking universities, with the USA contributing 42, the UK 10, Canada 3 and Australia 3. East Asia and the Chinese diaspora (Hong Kong, Taiwan and Singapore) are fairly well represented, while South and Central Asia, the Middle East and Africa are absent.
For any survey a great deal depends on how the forms are distributed. Last year, the THE survey had a lot more responses from the social sciences, including economics and business studies, and fewer from the arts and the humanities, and that contributed to some Asian universities rising and some British ones falling.
Such falls are typically attributed in the education establishment media to anxiety about the looming horrors of Brexit, the vicious snatching of research funds and the rising tide of hostility to international students.
This year British universities did a bit better in the THE reputation ranking, with five going up, three staying put and three going down. No doubt we will soon hear about the invigorating effects of Brexit and the benefits of austerity. Perhaps also it might have something to do with the number of survey responses from the arts and humanities going up from 9% to 12.5%, something that would surely benefit UK universities.
The QS reputation indicator has the same universities in the top six but not in quite the same order: Cambridge, fourth in THE, is second in the QS indicator. After that it starts looking very different. Number seven is the University of Tokyo, which THE puts in 11th place for academic reputation. Other Asian universities do much better in the QS indicator: the National University of Singapore is 11th (27th in THE), Nanyang Technological University Singapore is 50th (THE 81-90 band), Peking University is 14th (THE 17th) and Chulalongkorn University Thailand is 99th (not in the THE top 100).
It is noticeable that Latin American universities such as the University of Sao Paulo, the University of Buenos Aires and the Pontifical Catholic University of Chile get a higher placing in the QS indicator than they do in the THE ranking, as do some Southern European universities such as Barcelona, Sapienza and Bologna.
The THE reputation ranking gives us a snapshot of the current views of the world's academic elite and probably underestimates the rising universities of Greater China and Korea. QS cast their nets further and have probably caught a few of tomorrow's world class institutions although I suspect that the Latin American high fliers, apart from Sao Paulo, are very overrated.
The QS world rankings released last week included scores derived from two surveys, one of academics with a 40% weighting and one of employers with 10%. The academic survey was concerned only with research.
The methodology of the THE survey is relatively simple. The respondents are drawn from the database of researchers with publications in Scopus indexed journals, in other words those who get to be listed as corresponding author. THE claims that this makes them experienced senior researchers although in many parts of the world being a member or leader of a research team often has more to do with politics than merit.
In contrast, the QS methodology has changed quite a lot over the last few years. It began with scouring the mailing lists of World Scientific, a Singapore based academic publisher with links to Imperial College London, then adding various other channels including lists supplied by institutions and sign up facilities for potential respondents. The result is a survey that appears more inclusive than THE's with more respondents from outside the elite but one whose validity may be rather suspect.
The THE ranking found that there were six super-brand universities that stood out from everyone else, Harvard, MIT, Stanford, Cambridge, Oxford, and Berkeley. There was a big gap between Berkeley and number seven Princeton and then the long smooth slope continues.
After that, the ranking is dominated by English speaking universities, with the USA contributing 42, the UK 10, Canada 3 and Australia 3. East Asia and the Chinese diaspora (Hong Kong, Taiwan and Singapore) are fairly well represented, while South and Central Asia, the Middle East and Africa are absent.
For any survey a great deal depends on how the forms are distributed. Last year, the THE survey had a lot more responses from the social sciences, including economics and business studies, and fewer from the arts and the humanities, and that contributed to some Asian universities rising and some British ones falling.
Such falls are typically attributed in the education establishment media to anxiety about the looming horrors of Brexit, the vicious snatching of research funds and the rising tide of hostility to international students.
This year British universities did a bit better in the THE reputation ranking, with five going up, three staying put and three going down. No doubt we will soon hear about the invigorating effects of Brexit and the benefits of austerity. Perhaps it might also have something to do with the share of survey responses from the arts and humanities rising from 9% to 12.5%, something that would surely benefit UK universities.
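To see why the distribution of responses matters, here is a minimal sketch of how raising the arts and humanities share of a survey from 9% to 12.5% can lift the weighted score of a university that is strong in those fields. This is a toy model, not THE's actual aggregation; everything except the two shares is invented, including the grouping of disciplines.

```python
# Toy illustration (not THE's actual method): how shifting the share of
# survey responses between discipline groups moves a weighted average.
# All scores below are invented for the example.

# A hypothetical university's reputation score within each group (0-100).
scores = {"arts_humanities": 80, "social_sciences": 55, "stem": 60}

def overall(weights, scores):
    """Weighted average of per-discipline scores (weights sum to 1)."""
    return sum(weights[d] * scores[d] for d in scores)

# Arts and humanities at 9% of responses vs 12.5%, with the remainder
# split evenly between the other groups in both cases.
old_weights = {"arts_humanities": 0.090, "social_sciences": 0.455, "stem": 0.455}
new_weights = {"arts_humanities": 0.125, "social_sciences": 0.4375, "stem": 0.4375}

print(overall(old_weights, scores))  # 59.525
print(overall(new_weights, scores))  # 60.3125
```

A shift of a few tenths of a point looks trivial, but in a tightly packed reputation table it is easily enough to move a university several places.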
The QS reputation indicator has the same universities in the top six but not in quite the same order: Cambridge, fourth in THE, is second in the QS indicator. After that it starts looking very different. Number seven is the University of Tokyo, which THE puts in 11th place for academic reputation. Other Asian universities also do much better in the QS indicator: the National University of Singapore is 11th (27th in THE), Nanyang Technological University Singapore is 50th (THE 81-90 band), Peking University is 14th (THE 17th) and Chulalongkorn University Thailand is 99th (not in the THE top 100).
It is noticeable that Latin American universities such as the University of Sao Paulo, the University of Buenos Aires and the Pontifical Catholic University of Chile get a higher placing in the QS indicator than they do in the THE ranking, as do some Southern European universities such as Barcelona, Sapienza and Bologna.
The THE reputation ranking gives us a snapshot of the current views of the world's academic elite and probably underestimates the rising universities of Greater China and Korea. QS cast their nets further and have probably caught a few of tomorrow's world-class institutions, although I suspect that the Latin American high fliers, apart from Sao Paulo, are considerably overrated.
Monday, May 29, 2017
Ten Universities with a Surprisingly Large Research Impact
Every so often newspapers produce lists of universities that excel in or are noteworthy for something. Here is a list of ten universities that, according to Times Higher Education (THE), have achieved remarkable success in the world of global research. In a time of austerity, when the wells of patronage are running dry, they should be an example to us all: they have achieved a massive global research impact, measured by field-normalised citations, despite limited funding, minimal reputations and few or very few publications. The source is the THE World and Asian rankings citations indicator; a sketch of how field normalisation can produce such results follows the list.
1. First on the list is Alexandria University in Egypt, 4th in the world with a near-perfect score for research impact in 2010-11.
2. In the same year Hong Kong Baptist University was tenth for research impact, ahead of the University of Chicago and the University of Hong Kong.
3. In 2011-12 Royal Holloway, University of London, was in 12th place, ahead of any other British or European institution.
4. The National Research Nuclear University MEPhI, in Moscow, a specialist institution, was top of the table for citations in 2012-13.
5. In 2013-14 and 2014-15 Tokyo Metropolitan University had a perfect score of 100 for citations, a distinction shared only with MIT.
6. In 2014-15 Federico Santa Maria Technical University was sixth in the world for research impact and first in Latin America with a near-perfect score of 99.7.
7. In the same year Bogazici University in Turkey reached the top twenty for research impact.
8. St George's, University of London, was the top institution in the world for research impact in 2016-17.
9. In that year Anglia Ruskin University, a former art school, was tenth for this metric, equal to Oxford and well ahead of the other university in Cambridge.
10. Last year's THE Asian rankings saw Vel Tech University in Chennai achieve the highest impact of any Asian university.
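Most of these anomalies come from the way field normalisation interacts with small publication counts. Here is a minimal sketch of the core idea, with invented numbers; THE's actual procedure is more elaborate (hundreds of field-and-year cells and, in some years, a country adjustment), so this is only an illustration.

```python
# A minimal sketch of field-normalised citation impact: each paper's
# citations are divided by the world average for papers in the same
# field and year, so 1.0 means "world average impact".
# All numbers here are invented for illustration.

# Hypothetical papers from one small university: (field, year, citations)
papers = [
    ("clinical medicine", 2014, 400),  # one hyper-cited paper
    ("mathematics", 2014, 2),
    ("physics", 2015, 4),
]

# Hypothetical world-average citations per paper by (field, year)
world_avg = {
    ("clinical medicine", 2014): 20.0,
    ("mathematics", 2014): 4.0,
    ("physics", 2015): 8.0,
}

normalised = [cites / world_avg[(field, year)] for field, year, cites in papers]
impact = sum(normalised) / len(normalised)
print(round(impact, 2))  # (20.0 + 0.5 + 0.5) / 3 = 7.0, seven times world average
```

With only a handful of papers in the count, a single hyper-cited paper or author is enough to push the average far above the world benchmark, which is how institutions with very few publications can end up ahead of Harvard and Oxford.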
Sunday, May 28, 2017
The View from Leiden
Ranking experts are constantly warning about the grim fate that awaits the universities of the West if they are not given all the money they want and complete freedom to hire staff and recruit students from wherever they please. If this does not happen, they will be swamped by those famously international Asian universities dripping with funds from indulgent patrons.
The threat, if we are to believe the prominent rankers of Times Higher Education (THE), QS and Shanghai Ranking Consultancy, is always looming but somehow never quite arrives. The best Asian performer in the THE world rankings is the National University of Singapore (NUS) in 24th place followed by Peking University in 29th. The QS World University Rankings have NUS 12th, Nanyang Technological University 13th and Tsinghua University 24th. The Academic Ranking of World Universities published in Shanghai puts the University of Tokyo in 20th place and Peking University in 71st.
These rankings are in one way or another significantly biased towards Western European and North American institutions and against Asia. THE has three separate indicators that measure income, adding up to a combined weighting of 10.75%. Both QS and THE have reputation surveys. ARWU gives a 30% weighting to Nobel and Fields award winners, some of them from several decades ago.
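To see how much a heavily weighted legacy indicator can matter, here is a toy calculation. The 30% weighting for awards is from the ARWU methodology; the grouping of the remaining indicators is my simplification, and the indicator scores themselves are invented.

```python
# Toy illustration of a weighted composite ranking score.
# ARWU's awards indicators (alumni and staff Nobel/Fields) carry 30%;
# the other groupings here are simplified and all scores hypothetical.

def composite(scores, weights):
    """Weighted sum of 0-100 indicator scores; weights sum to 1."""
    return sum(scores[k] * weights[k] for k in weights)

weights = {"awards": 0.30, "research_output": 0.60, "per_capita": 0.10}

# Two hypothetical universities with identical current research output,
# differing only in decades-old prizes.
rising_asian_u = {"awards": 0, "research_output": 70, "per_capita": 60}
legacy_western_u = {"awards": 50, "research_output": 70, "per_capita": 60}

print(composite(rising_asian_u, weights))    # 48.0
print(composite(legacy_western_u, weights))  # 63.0 -- 15 points from old prizes
```

On identical current output, a fifteen-point gap from prizes won generations ago is more than enough to keep a rising university far down the table.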
Let's take a look at a set of rankings that is technically excellent, namely the Leiden Ranking. The producers do not provide an overall score. Instead it is possible to create a variety of rankings: total publications, publications by subject group, publications in the top 50%, 10% and 1% of journals. Users can also select fractional or absolute counting and change the minimum publication threshold.
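For readers unfamiliar with the counting options, here is a minimal sketch of the difference between full and fractional counting, using invented papers. Leiden actually fractionalises at the level of authors rather than institutions, so this simplification only shows the basic idea.

```python
# A minimal sketch of full vs fractional counting of publications.
# Full counting: every collaborating university gets credit 1 per paper.
# Fractional counting: the single credit is split among collaborators,
# so heavily collaborative papers count for less. (Leiden splits by
# authors rather than institutions; this version just shows the idea.)

from collections import defaultdict

# Hypothetical papers, each listed as the set of universities involved.
papers = [
    {"Harvard", "Toronto"},
    {"Zhejiang"},
    {"Harvard", "Zhejiang", "Tokyo"},
]

full = defaultdict(float)
fractional = defaultdict(float)
for unis in papers:
    for uni in unis:
        full[uni] += 1.0
        fractional[uni] += 1.0 / len(unis)

print(dict(full))        # Harvard: 2.0, Zhejiang: 2.0, Toronto: 1.0, Tokyo: 1.0
print(dict(fractional))  # Harvard: ~0.83, Zhejiang: ~1.33, Toronto: 0.5, Tokyo: ~0.33
```

Fractional counting tends to favour universities that produce a lot of single-institution work, which is one reason the choice of setting can reshuffle the table.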
Here is the top ten, using the default settings: publications 2012-15, fractional counting, minimum threshold of 100 papers. Ranks for the 2006-09 period are in brackets.
1. Harvard (1)
2. Toronto (2)
3. Zhejiang (14)
4. Michigan (3)
5. Shanghai Jiao Tong (37)
6. Johns Hopkins (5)
7. Sao Paulo (8)
8. Stanford (9)
9. Seoul National University (23)
10. Tokyo (4).
Tsinghua University is 11th, up from 32nd in 2006-09 and Peking University is 15th, up from 54th. What is interesting about this is not just that East Asian universities are moving into the highest level of research universities but how rapidly they are doing so.
No doubt there are many who will say that this is a matter of quantity and that what really counts is not the number of papers but their reception by other researchers. There is something to this. If we look at publications in the top 1% of journals (by frequency of citation), the top ten include six US universities headed by Harvard, three British and one Canadian.
Tsinghua is 28th, Zhejiang 50th, Peking 62nd, Shanghai Jiao Tong 80th and Seoul National University 85th. Right now it looks as though publication in the most reputed journals is dominated by English-speaking universities. But in the last few years Chinese and Korean universities have advanced rapidly: Peking from 119th to 62nd, Zhejiang from 118th to 50th, Shanghai Jiao Tong from 112th to 80th, Tsinghua from 101st to 28th and Seoul National University from 107th to 85th.
It seems that in a few years East Asia will dominate the elite journals and will take the lead for quality as well as quantity.
Moving on to subject group rankings, Tsinghua University is in first place for mathematics and computer sciences. The top ten consists of nine Chinese and one Singaporean university. The best US performer is MIT in 16th place, the best British Imperial College London in 48th.
When we look at the top 1% of journals, Tsinghua is still on top, although MIT moves up to 4th place and Stanford is 5th.
The Asian tsunami has already arrived. East Asian universities, mainly Chinese and Chinese diaspora, are dominant or becoming dominant in the STEM subjects, leaving the humanities and social sciences to the US.
There will of course be debate about what happened. Maybe money had something to do with it. But it also seems that Western universities are becoming much less selective about student admissions and faculty appointments. If you admit students who write #BlackLivesMatter 100 times on their application forms, or impose ideological tests for faculty appointment and promotion, you may succeed in imposing political uniformity, but you will have serious problems trying to compete with the Gaokao-hardened students and researchers of Chinese universities.