Monday, May 29, 2017

Every so often newspapers produce lists of universities that excel in, or are noteworthy for, something. Here is a list of ten universities that, according to Times Higher Education (THE), have achieved remarkable success in the world of global research. In a time of austerity, when the wells of patronage are running dry, they should be an example to us all: they have achieved massive global research impact, measured by field-normalised citations, despite limited funding, minimal reputations and few or very few publications. The source is the citations indicator of the THE World and Asian rankings. (A sketch of how the normalisation arithmetic works follows the list.)
1. First on the list is Alexandria University in Egypt, 4th in the world with a near-perfect score for research impact in 2010-11.
2. In the same year Hong Kong Baptist University was tenth for research impact, ahead of the University of Chicago and the University of Hong Kong.
3. In 2011-12 Royal Holloway, University of London, was in 12th place, ahead of any other British or European institution.
4. The National Research Nuclear University MEPhI, in Moscow, a specialist institution, was top of the table for citations in 2012-13.
5. In 2013-14 and 2014-15 Tokyo Metropolitan University had a perfect score of 100 for citations, a distinction shared only with MIT.
6. In 2014-15 Federico Santa Maria Technical University was sixth in the world for research impact and first in Latin America with a near-perfect score of 99.7.
7. In the same year Bogazici University in Turkey reached the top twenty for research impact.
8. St George's, University of London, was the top institution in the world for research impact in 2016-17.
9. In that year Anglia Ruskin University, a former art school, was tenth for this metric, equal to Oxford and well ahead of the other university in Cambridge.
10. Last year's THE Asian rankings saw Vel Tech University in Chennai achieve the highest impact of any Asian university.
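How does a university with few publications top a citations table? Field normalisation divides each paper's citations by the world average for its field and year, then averages over the institution's (possibly tiny) output. Here is a minimal sketch of that arithmetic, with invented numbers; it illustrates the general approach, not THE's exact procedure.

```python
# A hypothetical portfolio: (field, citations per paper). One multi-author
# megapaper in a small portfolio can dominate the institutional average.
papers = [
    ("physics", 4500),  # e.g. a heavily cited collaboration paper
    ("physics", 12),
    ("history", 3),
]

# Assumed world baselines: average citations per paper in each field.
world_average = {"physics": 10.0, "history": 1.5}

def field_normalised_impact(papers, world_average):
    """Mean of citations / field baseline over an institution's papers."""
    ratios = [cites / world_average[field] for field, cites in papers]
    return sum(ratios) / len(ratios)

print(round(field_normalised_impact(papers, world_average), 2))  # 151.07
```

With only three papers, the single 4500-citation outlier produces an impact score of roughly 151 times the world average, which is exactly the mechanism behind several of the success stories above.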
Sunday, May 28, 2017
The View from Leiden
Ranking experts are constantly warning about the grim fate that awaits the universities of the West if they are not given all the money they want and complete freedom to hire staff and recruit students from wherever they choose. If this does not happen, they will be swamped by those famously international Asian universities dripping with funds from indulgent patrons.
The threat, if we are to believe the prominent rankers of Times Higher Education (THE), QS and Shanghai Ranking Consultancy, is always looming but somehow never quite arrives. The best Asian performer in the THE world rankings is the National University of Singapore (NUS) in 24th place followed by Peking University in 29th. The QS World University Rankings have NUS 12th, Nanyang Technological University 13th and Tsinghua University 24th. The Academic Ranking of World Universities published in Shanghai puts the University of Tokyo in 20th place and Peking University in 71st.
These rankings are in one way or another significantly biased towards Western European and North American institutions and against Asia. THE has three separate indicators that measure income, adding up to a combined weighting of 10.75%. Both QS and THE have reputation surveys. ARWU gives a 30% weighting to Nobel and Fields award winners, some of whom received their awards several decades ago.
Let's take a look at a set of rankings that is technically excellent, namely the Leiden Ranking. The producers do not provide an overall score. Instead, it is possible to create a variety of rankings: total publications, publications by subject group, and publications in the top 50%, 10% and 1% of journals. Users can also select fractional or absolute counting and change the minimum publication threshold.
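To see what difference the counting method makes, here is a small sketch with invented author lists (not CWTS's actual code): under full counting every affiliated university receives a credit of one per paper, while fractional counting splits the credit according to each university's share of the authors.

```python
# Each paper maps affiliated universities to their number of authors.
papers = [
    {"Harvard": 2, "Toronto": 1},   # 2 of 3 authors at Harvard
    {"Harvard": 1, "Zhejiang": 3},  # 1 of 4 authors at Harvard
]

def full_counts(papers):
    """Full (absolute) counting: one credit per paper per university."""
    counts = {}
    for paper in papers:
        for uni in paper:
            counts[uni] = counts.get(uni, 0) + 1
    return counts

def fractional_counts(papers):
    """Fractional counting: credit split by share of authors."""
    counts = {}
    for paper in papers:
        total = sum(paper.values())
        for uni, n in paper.items():
            counts[uni] = counts.get(uni, 0) + n / total
    return counts

print(full_counts(papers))        # {'Harvard': 2, 'Toronto': 1, 'Zhejiang': 1}
print(fractional_counts(papers))  # Harvard ~0.92, Toronto ~0.33, Zhejiang 0.75
```

Fractional counting matters because it damps the effect of huge multi-institution collaborations, which under full counting give every participant complete credit for the same paper.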
Here is the top ten using the default settings: publications 2012-15, fractional counting, a minimum threshold of 100 papers. Ranks for 2006-09 are in brackets.
1. Harvard (1)
2. Toronto (2)
3. Zhejiang (14)
4. Michigan (3)
5. Shanghai Jiao Tong (37)
6. Johns Hopkins (5)
7. Sao Paulo (8)
8. Stanford (9)
9. Seoul National University (23)
10. Tokyo (4)
Tsinghua University is 11th, up from 32nd in 2006-09, and Peking University is 15th, up from 54th. What is interesting here is not just that East Asian universities are moving into the top tier of research universities but how rapidly they are doing so.
No doubt there are many who will say that this is a matter of quantity and that what really counts is not the number of papers but their reception by other researchers. There is something to this. If we look at publications in the top 1% of journals (by frequency of citation), the top ten include six US universities headed by Harvard, three British and one Canadian.
Tsinghua is 28th, Zhejiang 50th, Peking 62nd, Shanghai Jiao Tong 80th, and Seoul National University 85th. Right now publication in the most reputed journals is dominated by English-speaking universities. But in the last few years Chinese and Korean universities have advanced rapidly: Peking from 119th to 62nd, Zhejiang from 118th to 50th, Shanghai Jiao Tong from 112th to 80th, Tsinghua from 101st to 28th, and Seoul National University from 107th to 85th.
It seems that in a few years East Asia will dominate the elite journals and will take the lead for quality as well as quantity.
Moving on to the subject group rankings, Tsinghua University is in first place for mathematics and computer sciences. The top ten consists of nine Chinese universities and one Singaporean. The best US performer is MIT in 16th place; the best British, Imperial College London, is 48th.
When we look at the top 1% of journals, Tsinghua is still on top, although MIT moves up to 4th place and Stanford to 5th.
The Asian tsunami has already arrived. East Asian universities, mainly Chinese and Chinese-diaspora institutions, are dominant or becoming dominant in the STEM subjects, leaving the humanities and social sciences to the US.
There will of course be debate about what happened. Maybe money had something to do with it. But it also seems that Western universities are becoming much less selective about student admissions and faculty appointments. If you admit students who write #BlackLivesMatter 100 times on their application forms, or impose ideological tests for faculty appointment and promotion, you may succeed in imposing political uniformity, but you will have serious problems trying to compete with the Gaokao-hardened students and researchers of Chinese universities.
Monday, May 22, 2017
Arab University Rankings: Another Snapshot from Times Higher Education
Times Higher Education (THE) has produced a "snapshot" ranking of Arab universities extracted from its World University Rankings. There has been no change in the indicators or their weighting. Only 28 universities are included, which raises questions about how suitable THE's methodology is for regions like the Middle East and North Africa.
This is an improvement over a remarkable MENA snapshot that THE did in 2015, which put Texas A&M University Qatar in first place by virtue of one half-time faculty member who was listed as a contributor to a multi-author, multi-cited CERN paper.
The top five universities this time are King Abdulaziz University (KAU), Saudi Arabia; King Fahd University of Petroleum and Minerals, Saudi Arabia; King Saud University, Saudi Arabia; Khalifa University of Science, Technology and Research, UAE; and Qatar University.
The top three places are held by Saudi institutions. So how did they do it? According to an article by a THE editor for the World Economic Forum it was all due to money and internationalisation.
Up to a point that is correct. The sad story of Trinity College Dublin's botched data submission gives a rough sense of the leverage involved: an extra 5 million euro in reported total income (with proportionate increases for research income and income from industry) can lift a middling university a place in the overall rankings.
But does that explain KAU in top place? It did get a high score, 92.1, for international orientation, but five other Arab universities did better. For teaching it was third, for industry income third, and for research seventh. What actually made the difference was the citations indicator: KAU scored 93.3, far ahead of the next contender, Jordan University of Science and Technology with 50.2.
KAU's research impact is, according to THE, second in Asia only to the shooting star of India, Vel Tech University, whose single self-citing prodigy supposedly had a greater impact than the whole of any other Asian university. KAU's citations score was the result of the massive recruitment of adjunct faculty, 40 at the last count, who list KAU as a second affiliation. How much time they put in at KAU is uncertain, but the Shanghai rankings calculated that highly cited researchers spent an average of 16% of their time at the university of their second affiliation.
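As an illustration of what that 16% figure implies, here is a hypothetical sketch that splits credit for highly cited researchers between first and second affiliations in proportion to reported time. The 84/16 split follows the Shanghai survey figure; the roster is invented, and THE's citations indicator applies no such discount.

```python
# Split credit for highly cited researchers in proportion to time spent:
# ~84% at the primary affiliation, ~16% at the secondary (per the Shanghai
# survey figure quoted above).
PRIMARY_SHARE, SECONDARY_SHARE = 0.84, 0.16

# Invented roster: how each researcher is affiliated to this university.
roster = [
    ("researcher_a", "primary"),
    ("researcher_b", "secondary"),
    ("researcher_c", "secondary"),
]

credit = sum(PRIMARY_SHARE if rank == "primary" else SECONDARY_SHARE
             for _, rank in roster)
print(round(credit, 2))  # 1.16
```

On that basis, forty secondary affiliates would count as only 6.4 full-time researchers, yet in an undiscounted citations indicator their papers count in full.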
It is bad enough that THE puts so much emphasis on income and internationalisation in its methodology, encouraging the diversion of resources from things like primary and pre-school education, adult literacy and alternatives to oil exports. To encourage universities to rise in the rankings by hiring adjunct faculty whose contribution is uncertain is downright irresponsible. It would be a good thing if this snapshot were ignored.
Wednesday, May 10, 2017
Ranking article: The building of weak expertise: the work of global university rankers
Miguel Antonio Lim, The building of weak expertise: the work of global university rankers
University rankers are the subject of much criticism, and yet they remain influential in the field of higher education. Drawing from a two-year field study of university ranking organizations, interviews with key correspondents in the sector, and an analysis of related documents, I introduce the concept of weak expertise. This kind of expertise is the result of a constantly negotiated balance between the relevance, reliability, and robustness of rankers’ data and their relationships with their key readers and audiences. Building this expertise entails collecting robust data, presenting it in ways that are relevant to audiences, and engaging with critics. I show how one ranking organization, the Times Higher Education (THE), sought to maintain its legitimacy in the face of opposition from important stakeholders and how it sought to introduce a new “Innovation and Impact” ranking. The paper analyzes the strategies, methods, and particular practices that university rankers undertake to legitimate their knowledge—and is the first work to do so using insights gathered alongside the operations of one of the ranking agencies as well as from the rankings’ conference circuit. Rather than assuming that all of these trust-building mechanisms have solidified the hold of the THE over its audience, they can be seen as signs of a constant struggle for influence over a skeptical audience.
Higher Education, 13 April 2017
DOI: 10.1007/s10734-017-0147-8
https://link.springer.com/article/10.1007%2Fs10734-017-0147-8
What good are international faculty?
In the previous post I did a bit of fiddling around with correlations and found that UK universities' scores for the international student indicator in the QS world rankings did not correlate very much with beneficial outcomes for students, such as employability and course completion. They did, however, correlate quite well with spending per student.
That would suggest that British universities want lots of international students because it is good for their finances.
What about international faculty?
Comparing the scores for various outcomes with the QS international faculty score shows that in most cases correlation is low and statistically insignificant. This includes course satisfaction and satisfaction with teaching (Guardian University League Tables), and completion and student satisfaction (THE TEF simulation).
There is, however, one metric that is positively, albeit modestly, and significantly associated with international faculty, and that is the Research Excellence Framework (REF) score (from the Complete University Guide): .284 (2-tailed sig. .043; N = 51).
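For anyone who wants to reproduce this kind of check, a minimal sketch follows. The numbers are invented stand-ins; the real inputs are the QS international faculty scores and REF scores from the published tables.

```python
from scipy.stats import pearsonr

# Invented stand-in scores for a handful of universities.
intl_faculty = [95, 80, 60, 75, 40, 55, 85, 30]           # QS international faculty
ref_quality  = [3.2, 3.0, 2.6, 2.9, 2.4, 2.7, 3.1, 2.3]   # REF grade-point average

r, p = pearsonr(intl_faculty, ref_quality)  # p is two-tailed by default
print(f"r = {r:.3f}, two-tailed p = {p:.3f}")
```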
So it seems that international students are valued for the money they bring with them and international faculty for boosting research quality.
Caveat: this applies to highly ranked universities in the UK. How far it is true of other places or even less prestigious British institutions remains to be seen.
Monday, May 01, 2017
What good are international students?
There has been a big fuss in the UK about the status of international students. Many in the higher education industry are upset that the government insists on including these students in the overall total of immigrants, which might lead at some point to a reduction in their numbers. Leading twitterati have erupted in anger. Phil Baty of THE has called the government's decision "bizarre, bonkers & deeply depressing" and even invoked the name of the arch-demon Enoch Powell in support.
So are international students a benefit to British universities? I have just done a quick correlation of the scores for the international students indicator in the QS World University Rankings to see whether there is any link between positive outcomes for students and the number of international students.
This is of course only suggestive. QS provides scores for only the 51 UK universities included in its world top 500, and the situation might be different in other countries. Another caveat is that international students might provide net economic benefits for surrounding communities, although that is far from settled.
Here are the correlations with the QS international student score (significance 2 tailed and N in brackets).
Value added: .182 (.206; 50). From the Guardian rankings 2016; compares entry qualifications with degrees awarded.
Career: .102 (.480; 50). Graduate-level employment or postgraduate study six months after graduation, also from the Guardian rankings. The correlation with the graduate destinations indicator in the Times Higher Education TEF simulation, which is based on the same data, is even lower, .018, and turns negative after benchmarking, -.172.
Students completing degrees: .128 (.376; 50). From the TEF simulation. Again, the correlation turns negative after benchmarking.
QS employer reputation survey: .234 (.140; 41). From the 2016 world rankings.
So the number of international students has a slight and statistically insignificant relationship with the quality of teaching and learning as measured by value added, graduate employability, course completion and reputation with employers. Why then are universities so desperate to get as many as possible?
This, I think, is the answer. The correlation between the QS international students indicator and spending per student, as measured by the Guardian ranking, is .414 (.003; 50), which is very significant considering the noise generated in comparisons of this sort. Of course, correlation does not equal causation, but it seems a reasonable hypothesis that it is the money brought by international students that makes them so attractive to British universities.
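Reported pairs like .414 (.003; 50) can be sanity-checked from r and N alone via the standard t transformation, as in this sketch:

```python
from math import sqrt
from scipy.stats import t

def p_from_r(r, n):
    """Two-tailed p-value for a Pearson correlation r with n observations."""
    df = n - 2
    t_stat = r * sqrt(df) / sqrt(1 - r ** 2)
    return 2 * t.sf(abs(t_stat), df)

print(round(p_from_r(0.414, 50), 3))  # 0.003, matching the reported value
```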