Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Thursday, May 31, 2018
Where did the top data scientists study?
The website efinancialcareers has a list of the top twenty data scientists in finance and banking. This looks like a subjective list, and another writer might come up with a different set of experts. Even so, it is quite interesting.
Their degrees are mainly in things like engineering, computer science and maths. There is only one each in business, economics and finance.
The institutions where they studied are:
Stanford (three)
University College London (three)
Institut Polytechnique de Grenoble
Oxford
Leonard Stern School of Business, New York University
University of Mexico
Université Paris-Dauphine
École Polytechnique
Rensselaer Polytechnic Institute (RPI)
California State University
Indian Institute of Science
Johns Hopkins University
Institute of Management Development and Research, India
University of Illinois
University of Pittsburgh
Indian Institute of Technology.
Harvard, MIT and Cambridge are absent but there are three Indian Institutes, three French schools and some non-Ivy US places like RPI and the Universities of Pittsburgh and Illinois.
Wednesday, May 30, 2018
Why are US universities doing so well in the THE reputation rankings?
For the last couple of years the higher education media has tried to present any blip in the fortunes of UK universities as one of the malign effects of Brexit, whose toxic rays are unlimited by space, time or logic. Similarly, if anything unpleasant happens to US institutions, it is often linked to the evil spell of the great orange devil, who is scaring away international students, preventing the recruitment of the scientific elites of the world, or even being insufficiently credulous of the latest settled science.
So what is the explanation for the remarkable renaissance of US higher education apparently revealed by the THE reputation survey published today?
Is Trump working his magic to make American colleges great again?
UCLA is up four places, Carnegie Mellon seven, Cornell six, University of Washington six, Pennsylvania three. In contrast, several European and Asian institutions have fallen, University College London and the University of Kyoto by two places, Munich by seven, and Moscow State University by three.
In the previous post I noted that this year's survey had seen an increased response from engineering and computer science and a reduced one from the social sciences and the arts and humanities. As expected, LSE has tumbled five places and Oxford has fallen one place. Surprisingly, Caltech has fallen as well.
Some schools that are strong in engineering, such as Nanyang Technological University and Georgia Institute of Technology, have done well but I do not know if that is a full explanation for the success of US universities.
I suspect that US administrators have learned that influencing reputation is easier than maintaining scientific and intellectual standards and that a gap is emerging between perceptions and actual achievements.
It will be interesting to see if these results are confirmed by the reputation indicators included in the QS, Best Global Universities, and the Round University Rankings.
Saturday, May 26, 2018
The THE reputation rankings
THE have just released details of their reputation rankings, which will be published on May 30th, just ahead, no doubt coincidentally, of the QS World University Rankings.
The number of responses has gone down a bit, from 10,566 last year to 10,162, possibly reflecting growing survey fatigue among academics.
In surveys of this kind the distribution of responses is crucial. The more responses from engineers the better for universities in Asia. The more from scholars in the humanities the better for Western Europe. I have noted in a previous blog that the fortunes of Oxford in this ranking are tied to the percentage of responses from the arts and humanities.
This year there have been modest or small reductions in the percentage of responses from the clinical and health sciences, the life sciences, the social sciences, education and psychology and large ones for business and economics and the arts and humanities.
The number of responses in engineering and computer science has increased considerably.
It is likely that this year places like Caltech and Nanyang Technological University will do better while Oxford and LSE will suffer. It will be interesting to see if THE claim that this is all the fault of Brexit, an anti-feminist reaction to Oxford's appointment of a female vice-chancellor, or government Scrooges turning off the funding tap.
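To make the arithmetic concrete, here is a minimal sketch (with invented numbers, not THE's actual method) of how a shift in the subject mix of respondents can move a university's reputation score even if nobody changes their opinion of it. The shares loosely echo the engineering, computer science, and arts percentages in the table below.

    # Toy model: overall score = sum over subjects of
    # (share of respondents in subject) x (fraction nominating the university).
    # All numbers are invented for illustration.
    subject_mix = {
        "2017": {"engineering": 0.127, "computer science": 0.042, "arts": 0.125},
        "2018": {"engineering": 0.181, "computer science": 0.104, "arts": 0.075},
    }
    # Nomination rates for an imaginary engineering-heavy university.
    nomination_rate = {"engineering": 0.10, "computer science": 0.08, "arts": 0.01}

    for year, mix in subject_mix.items():
        score = sum(share * nomination_rate[s] for s, share in mix.items())
        print(year, round(score, 4))   # 2017: 0.0173, 2018: 0.0272

The same university polls noticeably better in 2018 purely because of who was asked.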
Distribution of survey responses by subject:

Subject                  2017 %   2018 %
Physical science         14.6     15.6
Clinical and health      14.5     13.2
Life sciences            13.3     12.8
Business and economics   13.1     9.0
Engineering              12.7     18.1
Arts and humanities      12.5     7.5
Social sciences          8.9      7.6
Computer science         4.2      10.4
Education                2.6      2.5
Psychology               2.6      2.3
Law                      0.9      1.0

Distribution of survey responses by region:

Region           2017 %   2018 %
North America    22       22
Asia Pacific     33       32
Western Europe   25       26
Eastern Europe   11       11
Latin America    5        5
Middle East      3        3
Africa           2        2
Friday, May 18, 2018
Getting ready for the next World's Smartest Rankings
As the world waits for the coming round of global rankings -- will Harvard still be number one in the Shanghai rankings? -- I am starting to update my list of smart rankings.
One of my favorites was the 'Campus Squirrel Listings'. A candidate for inclusion in the next edition is 'The Top 10 Colleges for Dog Lovers'.
Number one in the USA is Stephens College in Columbia, Missouri.
Monday, May 14, 2018
Alberta ousts Savitribai Phule Pune in latest edition of innovative ranking
The Fortunate 500 rankings use a sophisticated and innovative methodology to rank global universities. The 2018 edition puts the University of Alberta in first place in the world, displacing Savitribai Phule Pune University, which has mysteriously dropped out of the top 500.
The top British university is the University of Reading and number one in the USA is Caldwell University.
None of these universities have made any official comment.
These rankings have received almost no interest in the international media. Perhaps the rankers should start announcing the results at a prestigious summit in a spectacular setting along with networking brunches and masterclasses. I have selected a venue for them using an equally sophisticated methodology: North Korea. Perhaps the summit could be combined with the groundbreaking ceremony for the Pyongyang Trump Tower.
Friday, May 11, 2018
Ranking Insights from Russia
The ranking industry is expanding and new rankings appear all the time. Most global rankings measure research publications and citations. Others try to add to the mix indicators that might have something to do with teaching and learning. There is now a ranking that tries to capture various third missions.
The Round University Rankings published in Russia are in the tradition of holistic rankings. They give a 40% weighting to research, 40% to teaching, 10% to international diversity and 10% to financial sustainability. Each group contains five equally weighted indicators. The data is derived from Clarivate Analytics, which also contributes to the US News Best Global Universities Rankings.
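As a rough sketch of how such a weighting scheme aggregates (indicator scores invented; RUR's actual scaling may differ), each individual indicator effectively carries 8% or 2% of the total:

    # 40/40/10/10 groups, five equally weighted indicators per group.
    group_weights = {"teaching": 0.40, "research": 0.40,
                     "international diversity": 0.10,
                     "financial sustainability": 0.10}
    indicator_scores = {   # invented placeholder scores
        "teaching": [72, 65, 80, 58, 90],
        "research": [88, 70, 75, 66, 82],
        "international diversity": [40, 55, 61, 47, 52],
        "financial sustainability": [63, 71, 59, 68, 74],
    }
    overall = sum(weight * sum(indicator_scores[group]) / 5
                  for group, weight in group_weights.items())
    print(round(overall, 2))   # 71.48; each indicator carries weight/5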
These rankings are similar to the THE rankings in that they attempt to assess quality rather than quantity but they have 20 indicators instead of 13 and assign sensible weightings. Unfortunately, they receive only a fraction of the attention given to the THE rankings.
They are, however, very valuable since they dig deeper into the data than other global rankings. They also show that there is a downside to measures of quality and that data submitted directly by institutions should be treated with caution and perhaps scepticism.
Here are the top universities for each of the RUR indicators.
Teaching
Academic staff per students: VIB (Flemish Institute of Biotechnology), Belgium
Academic staff per bachelor degrees awarded: University of Valladolid, Spain
Doctoral degrees per academic staff: Kurdistan University of Medical Science, Iran
Doctoral degrees per bachelor degrees awarded: Jawaharlal Nehru University, India
World teaching reputation: Harvard University, USA.
Research
Citations per academic and research staff: Harvard
Doctoral degrees per admitted PhD: Al Farabi Kazakh National University
Normalised citation impact: Rockefeller University, USA
Share of international co-authored papers: Free University of Berlin
World research reputation: Harvard.
International diversity
Share of international academic staff: American University of Sharjah, UAE
Share of international students: American University of Sharjah
Share of international co-authored papers: Innopolis University, Russia
Reputation outside region: Voronezh State Technical University, Russia
International Level: EPF Lausanne, Switzerland.
Financial sustainability:
Institutional income per academic staff: Universidade Federal do Ceará, Brazil
Institutional income per student: Rockefeller University
Papers per research income: Novosibirsk State University of Economics and Management, Russia
Research income per academic and research staff: Istanbul Technical University, Turkey
Research income per institutional income: A C Camargo Cancer Center, Brazil.
There are some surprising results here. The most obvious is Voronezh State Technical University, which is first for reputation outside its region (Asia, Europe and so on), even though its overall scores for reputation and for international diversity are very low. The other top universities for this metric are just what you would expect, Harvard, MIT, Stanford, Oxford and so on. I wonder whether there is some sort of bug in the survey procedure, perhaps something like the university's supporters being assigned to Asia and therefore out of region. The university is also in second place in the world for papers per research income despite very low scores for the other research indicators.
There are other oddities, such as Novosibirsk State University of Economics and Management placed first for papers per research income and Universidade Federal do Ceará first for institutional income per academic staff. These may result from anomalies in the procedures for reporting and analysing data, possibly including problems in collecting data on income and staff.
It also seems that medical schools and specialist or predominantly postgraduate institutions such as Rockefeller University, the Kurdistan University of Medical Science, Jawaharlal Nehru University and VIB have a big advantage with these indicators since they tend to have favourable faculty-student ratios, sometimes boosted by large numbers of clinical and research-only staff, and a large proportion of doctoral students.
Jawaharlal Nehru University is a mainly postgraduate university so a high placing for academic staff per bachelor degrees awarded is not unexpected although I am surprised that it is ahead of Yale and Princeton. I must admit that the third place here for the University of Baghdad needs some explanation.
The indicator doctoral degrees per admitted PhD might identify universities that do a good job of selection and training and get large numbers of doctoral candidates through the system. Or perhaps it identifies universities where doctoral programmes are so lacking in rigour that nearly everybody can get their degree once admitted. The top ten of this indicator includes De Montfort University, Shakarim University, Kingston University, and the University of Westminster, none of which are famous for research excellence across the range of disciplines.
Measures of international diversity have become a staple of global rankings since they are fairly easy to collect. The problem is that international orientation may have something to do with quality, but it may also simply be a necessary attribute of being in a small country next to larger countries with the same or similar language and culture. The top ten for the international student indicator includes the Central European University and the American University of Sharjah. For international faculty it includes the University of Macau and Qatar University.
To conclude, these indicators suggest that self-submitted institutional data should be used sparingly and that data from third-party sources may be preferable. Also, while ranking by quality instead of quantity is sometimes advisable, it also means that anomalies and outliers are more likely to appear.
Tuesday, May 01, 2018
World Top 20 Project
The International Rankings Expert Group (IREG) has produced an inventory of international rankings that is testimony to the enormous interest in comparing and classifying universities around the world.
In addition to those rankings that were included there are several "also rans", rankings that were not counted because they included only one indicator, had been published only once, or provided insufficient information about methodology.
One of these is the World Top 20 Project whose Executive Director and founder is Albert N Mitchell II. The website claims to rank 500 universities according to seven criteria and to use data from institutional databases and educational publications to construct eight regional rankings. The scores are then compared with those from the US News Best Global Universities, the THE World University Rankings, the QS World University Rankings, and the Center for World University Rankings to select the global top twenty.
The top five universities in each region are listed. Most seem quite sensible -- Cape Town is first in Africa, Tokyo in Asia, Harvard in North America and Cambridge in Europe -- but there are no Mexican universities in the top five in Central America.
It is interesting that this site has included CWUR rather than the Shanghai ARWU in the big four world rankings. Could this be the start of a new trend?
There is no information about ranks or scores for the various indicators or details about the sources of data. It is also difficult to see how information about things like career development facilities, disability access and low-income outreach could be collected from the universities mentioned.
Unless further information about sources and methods appears, it seems that there is no need to discuss these rankings any further.
Friday, April 13, 2018
At last. A Ranking With Cambridge at the Bottom
Cambridge usually does well in national and global rankings. The most recent ARWU from Shanghai puts it in third place, and although it does less well in other rankings it always seems to be in the top twenty. It has suffered at the hands of the citations indicator in the THE world rankings, which seems to think that Anglia Ruskin University, formerly the Cambridgeshire College of Arts and Technology, has a greater global research impact, but nobody takes that seriously.
So it is a surprise to find an article in the Guardian about a ranking from the Higher Education Policy Institute (HEPI) in the UK that actually puts Cambridge at the bottom and the University of Hull at the top. Near the bottom are others in the Russell Group: Oxford, Bristol and LSE.
At the top we find Edge Hill, Cardiff Metropolitan and, of course, Anglia Ruskin Universities.
The ranking was part of a report written for HEPI by Iain Martin, vice-chancellor of Anglia Ruskin University, that supposedly rates universities for fair access, that is, having a student intake that mirrors society as a whole. It compares the percentage of school leavers in each local authority area who participate in higher education with the percentage admitted by specific universities. Universities rank highly if they draw students from areas where relatively few school leavers go to university. The rationale is the claim that learning outcomes are improved when people of diverse backgrounds study together.
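A minimal sketch of that comparison as I read it, with invented participation rates:

    # For each university, average the HE participation rate of the local
    # authority areas its students come from; a lower average means the
    # intake is drawn from areas where few school leavers go to university.
    area_participation = {"low area": 0.20, "mid area": 0.40, "high area": 0.65}
    intake = {
        "University X": ["low area", "low area", "mid area"],
        "University Y": ["high area", "high area", "mid area"],
    }
    for university, areas in intake.items():
        avg = sum(area_participation[a] for a in areas) / len(areas)
        print(university, round(avg, 2))   # X: 0.27, Y: 0.57

On this measure University X, drawing mainly on low-participation areas, ranks as the "fairer" of the two.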
It is noticeable that there are several Scottish universities clustered at the bottom even though Scotland has a free tuition policy (not for the English, of course) that was supposed to guarantee fair access.
This ranking looks like an inversion of the ranking of UK universities according to average entry tariff, i.e. 'A' level grades, and a similar inversion of most global rankings based on research or reputation.
Cambridge and other Russell Group universities have been under increasing pressure to relax entry standards and indiscriminately recruit more low income students and those from historically unrepresented groups. It seems that they are slowly giving way to the pressure and that as academic standards erode they will be gradually eclipsed by the rising universities of East Asia.
Saturday, March 24, 2018
Ranking Arab Universities
The Middle East and North Africa (MENA) region has been slower than some others to jump on the rankings train but it seems to be making up for lost time. In addition to the standard world rankings there are now MENA (or Arab world or region) university rankings from Quacquarelli Symonds (QS), Times Higher Education (THE), US News (USN) and Webometrics.
Taking methodologies developed to rank elite western universities and applying them to regions with different traditions, resources and priorities is no easy task. For most Arab universities, research is of little significance and attaining international prominence is something that only a few places can reasonably hope for. But there is still a need to differentiate among those institutions that are focussed largely on teaching.
Alex Usher of HESA has spoken of the difficulty of using metrics based on research, expenditure, and student quality. I agree that institutional data is not very helpful here. However, measures of social influence such as those in the Webometrics and QS Arab rankings, and peer and employer surveys, used by USN and QS, might be useful in assessing the teaching quality, or at least the perceived quality, of these universities.
If rankings are to be of any use in the MENA region, then they will have to find ways of comparing selectivity, student quality and social impact. There is little point in forcing regional universities into the Procrustean bed of global indicators designed to make fine distinctions within the Russell Group or the Ivy League.
This is pretty much what THE have done with the 2018 edition of their Arab World Rankings, which is simply extracted from their world rankings published in 2017. These rankings are very research orientated and include measures of income, doctoral degrees and internationalisation. They also give a disproportionate weighting to citations, supposedly a measure of research impact or research quality.
Here are the top five in the recent editions of the various Arab Region/MENA rankings.
THE
1. King Abdulaziz University, Saudi Arabia
2. Khalifa University, UAE
3. Qatar University
4. Jordan University of Science and Technology
5. United Arab Emirates University (UAEU)
QS
1. American University of Beirut, Lebanon
2. King Fahd University of Petroleum and Minerals, Saudi Arabia
3. King Saud University, Saudi Arabia
4. King Abdulaziz University
5. United Arab Emirates University
USN
1. King Saud University
2. King Abdulaziz University
3. King Abdullah University of Science and Technology (KAUST), Saudi Arabia
4. Cairo University, Egypt
5. American University of Beirut
Webometrics
1. King Saud University
2. King Abdulaziz University
3. King Abdullah University of Science and Technology
4. Cairo University
5. American University of Beirut
Webometrics and USN are identical for the first six places. It is only when we reach seventh place that they diverge: UAEU in Webometrics and Ain Shams, Egypt, in the USN rankings. Webometrics measures web activity with a substantial research output indicator while USN is mainly about research with some weighting for reputation.
The list of top universities in QS, which uses Webometrics data as one indicator, is quite similar. QS does not rank research institutions such as KAUST, which is in third place in the Webometrics and USN rankings, but otherwise it is not too different from the other two.
The THE rankings have a disproportionate weighting for research impact, supposedly measured by field- and year-normalised citations. Officially it is 30%, but in fact it is much higher because of the regional modification that gives a big bonus to universities in countries with a low citation impact score.
For example, KAU's score for citations amounts to nearly 60% of its total score. Other universities in THE's top twenty have citation scores higher, sometimes much higher, than their research scores.
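A hedged sketch of the arithmetic, assuming, as THE has described, that a university's citation score is divided by the square root of its country's average citation score (all numbers invented):

    import math

    def regionally_modified(university_score, country_average):
        # Dividing by a square root less than 1 inflates the score.
        return university_score / math.sqrt(country_average)

    # A modest raw score in a low-impact country (average 0.36 of world level)...
    print(round(regionally_modified(45, 0.36), 1))   # 75.0
    # ...overtakes a stronger raw score in a country at the world average.
    print(round(regionally_modified(60, 1.0), 1))    # 60.0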
In effect, the THE Arab rankings are mostly about citations, very often in a limited range of disciplines. They can be easily, sometimes accidentally, gamed and can lead to perverse consequences, such as recruiting highly cited researchers or searching for citation-rich projects that have little relevance to the region or country.
Friday, March 23, 2018
More evidence of the rise of China
A regular story in the ranking world is the rise of Asia, usually as a warning to stingy Western governments who fail to give their universities the money that they desperately need to be world-class.
Sometimes the rise of Asia turns out to be nothing more than a methodological tweaking or a bug that allows minor fluctuations to be amplified. Asia often turns out to be just East Asia or sometimes even just Shanghai and Peking. But it still remains true that China, followed perhaps by South Korea, Taiwan, Singapore and Hong Kong, is steadily becoming a scientific superpower and that the USA and Europe are entering a period of relative decline.
This blog has already noted that China has overtaken the West in supercomputing power and in the total output of scientific publications.
David Goldman of Asia Times, writing in Breitbart, has reported another sign of the rise of China: the number of doctorates awarded in STEM subjects is now well ahead of the number in the USA. And we should remember that many of the US doctorates go to Chinese nationals or people of Chinese descent who may or may not remain in the US.
“What I’m concerned about is the fact that China is testing a railgun mounted on a navy ship before the United States is and that China has the biggest quantum computing facility in the world about to open,” said Goldman. “It probably has more advanced research in quantum communications than we have, and they’re graduating twice as many doctorates in STEM fields than we are. That’s what really frightens me.”
There are, of course, some areas where US researchers reign supreme such as gaming research and gender, queer and trans studies. But I suspect that is not something that will help the US win the coming trade wars or any other sort of war.
Monday, March 12, 2018
Anglia Ruskin University sued for awarding Mickey Mouse degrees
Pok Wong, or Fiona Pok Wong, a graduate of Anglia Ruskin University (ARU) in Cambridge, wants 60,000 pounds for breach of contract, fraudulent misrepresentation, and false imprisonment after a protest at the graduation ceremony.
ARU has appeared in this blog before following its spectacular performance in the research impact indicator in the THE world rankings. It has had the common sense to keep quiet about this rather quirky result.
Ms Wong has claimed that her degree in International Business Strategy was just a "Mickey Mouse" degree and that the teaching was of poor quality with one lecturer coming late and leaving early and sometimes even telling the students to self study in the library. She is reported to claim that "since graduating ... it has been proven that the degree ... does not play a role to help secure a rewarding job with good prospects."
It seems that in 2013 she had a job as a Financial Planner with AIA International so her degree from ARU did not leave her totally unemployable. Between 2013 and 2016 she studied for Graduate Diplomas in Law and Paralegal Legal Practice at BPP University College of Professional Studies, which does not appear in the national UK rankings but is ranked 5,499th in the world by Webometrics.
I doubt that the suit will succeed. It is of course regrettable if ARU has been lax about its teaching quality but whether that has much to do with Ms Wong not getting the job she thinks she deserves is debatable. ARU is not among the elite universities of England and its score for graduate employment is particularly bad. It is not a selective university so the question arises why Ms Wong did not apply to a better university with a better reputation.
The university would be justified if it pointed out that publishing photos proclaiming "ARU sucks" may not be the best way of selling yourself to potential employers.
If she does succeed it would be a disastrous precedent for British universities who would be vulnerable to every graduate who failed to get suitable employment or any employment at all.
But the affair should be a warning to all universities to be careful about the claims they make in advertising their products. Prospective students should also take a critical look at the data in all the indicators in all the rankings before handing over their tuition fees.
Sunday, March 11, 2018
Salaries, rankings, academic quality, racism, sexism, and heightism at Rensselaer Polytechnic Institute
From time to time the question of the salaries of university administrators resurfaces. Last August the issue of the salary of the yacht- and Bentley-owning vice-chancellor of the University of Bolton in the UK received national prominence. His salary of GBP 260,500, including pension contributions and healthcare benefits, seemed to have little relationship to the quality of the university, which was not included in the QS and THE world rankings and managed a rank of 1,846 in Webometrics and 2,106 in University Ranking by Academic Performance (URAP). A poll in the local newspaper showed 93% of respondents opposed to the increase.
A previous post in this blog reported that vice-chancellors' salaries in the UK had no statistically significant relationship to student satisfaction, although average faculty salaries and the number of faculty with teaching qualifications did.
This issue has cropped up in the US, where it has been noted that the highest paid university president is Shirley Ann Jackson of the Rensselaer Polytechnic Institute (RPI).
She has come under fire for being overpaid, autocratic and allowing RPI to go into academic decline. Her supporters have argued that her critics are guilty of residual racism, sexism and even heightism. A letter in the Troy Times Union from David Hershberg uses the Times Higher Education (THE) world rankings to chastise Jackson:
"RPI was always in the top 5 of undergraduate engineering schools. Now it's No. 30 in U.S. News and World Report's latest rankings. Despite the continued loss of stature of my alma mater, the school's president, Shirley Ann Jackson, is the highest paid president of a university by far and on some 10 other boards that supplement her $7 million salary and other compensation. This is RPI's rankings the last eight years in the Times Higher Education World University Rankings: 2011, 104; 2012, 144; 2013, 174; 2014, 181; 2015, 226-250; 2016, 251-300; 2017, 251-300; and 2018, 301-350. Further, U.S. News & World Report has RPI at No. 434 globally and No. 195 engineering school. This warrants a change at the top. This is what matters, not gender or race."
It seems that for some people in the USA international rankings, especially THE's, have become the measure of university excellence.
First, it must be said that the THE World University Rankings are not a good measure of university quality. These rankings have seen dramatic rises and falls in recent years. Between 2014-15 and 2015-16, for example, Middle East Technical University (METU) in Ankara fell from 85th place to the 501-600 band while many French, Japanese, Korean and other Turkish universities fell dozens of places. This had nothing to do with the quality of the universities and everything to do with methodological changes, especially to the citations indicator.
The verdict of the US News America's Best Colleges is simple. RPI was 42nd in 2007 and it is 42nd in the 2018 rankings, although apparently alumni giving has gone down.
Comparing data from US News in 2007 and 2015, RPI is more selective with more applicants of whom a smaller proportion are admitted. SAT scores are higher and more students come from the top 10% of their high school. There are more women and more international and out of state students.
The school may, however, have become less equitable. The percentage of Black students has fallen from 4% to 2% and that of students needing financial aid from 70% to 65%.
As a national university with an undergraduate teaching mission RPI is certainly not declining in any sense although it may be less welcoming for poor and Black students and it is definitely becoming more expensive for everybody.
The international rankings, especially those based on research, tell a different story. RPI is slipping everywhere: from 243 in 2014 to 301 in 2017 in the CWUR rankings, from 589 in 2010-11 to 618 in 2017 in URAP, from 341 in 2013 to 390 in 2017 in Nature Index, from 128 in 2010 to 193 in 2017 in the Round University Rankings.
In the Shanghai rankings, RPI fell from the 151-200 band to the 501-600, partly because of the loss of a couple of highly cited researchers and the declining value of a Nobel-winning alumnus.
RPI's fall in the global rankings is largely a reflection of the general decline of the US and the rise of China, which has overtaken the US in research output and supercomputing. But there is more. In the indicator that measures research quality in the CWTS Leiden Ranking, the percentage of papers among the 10% most cited, RPI has fallen from 23rd in 2011-12 to 194th in 2017.
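For reference, a minimal sketch of how a PP(top 10%)-style indicator is computed, with invented citation counts and an invented cutoff:

    # Share of a university's papers among the 10% most cited in their
    # field and year.
    citations_per_paper = [120, 3, 45, 0, 8, 60, 2, 15, 90, 4]
    top_decile_cutoff = 50   # hypothetical field/year threshold
    in_top = sum(c >= top_decile_cutoff for c in citations_per_paper)
    print(f"PP(top 10%) = {in_top / len(citations_per_paper):.0%}")   # 30%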
It seems that RPI is holding its own or a bit more as an American teaching university. Whether that is worth the biggest salary in the country is for others to argue about. But it is definitely losing out to international competition as far as research quality is concerned. That, however, is an American problem and RPI's difficulties are hardly unique.
Thursday, March 08, 2018
Rankings and the financialisation of higher education
University rankings are now being used for purposes that would have been inconceivable a decade ago. The latest is supporting the large scale borrowing of money by UK universities.
The Financial Times has an interesting article by Thomas Hale about the growing financialisation of British higher education. He reports that some universities such as Portsmouth, Bristol, Cardiff and Oxford are resorting to capital markets for financing supposedly because of declining government support.
The University of Portsmouth has borrowed GBP 100 million from two North American institutional investors. Lloyds was the placement agent and PricewaterhouseCoopers (PwC) the advisor.
"The money will be spent on the first phase of “estate development”. It is expected to involve a number of buildings, including an indoor sports facility, the extension of a lecture hall, and a flagship “teaching and learning building”."
It seems that this is just part of a larger trend.
"The private placement market – by definition, more opaque than its public counterpart — is a particularly attractive option for universities, and a popular target of investment for US pension and insurance money seeking long-term projects. Lloyds estimates that more than £3bn has been borrowed by UK universities since 2016 on capital markets, with around half of that coming via private placements.
The market is small by the standards of capital markets, but significant in relation to the overall size of the country’s higher education sector, which has a total annual income of close to £30bn, according to the Higher Education Funding Council for England. "
The press release explicitly referred to Portsmouth as being first in the UK for boosting graduate salaries, by which is meant earning above expectations based on things like social background and exam results. That could reflect credit on the university although a cynic might wonder whether that is just because expectations were very low to start off with. In addition, the university is ranked 37th among UK universities in the Guardian University Guide and in the top 100 in the Times Higher Education (THE) Young Universities Rankings.
If millions of pounds have been advanced in part because of a 98th place in the THE young universities rankings, that might not be a wise decision. These rankings are quite credible for the top 20 or 30, but go down a bit further and in 74th place is Veltech University in India, which has a perfect score for research impact based entirely on the publications of exactly one serial self-citer.
The profile of the University of Portsmouth shows a fairly high score for citations and a low one for research, which is often a sign that its position has little to do with research excellence and more to do with getting into high-citation, multi-author astrophysics and medical projects. That does appear to be the case with Portsmouth and it could mean that the university's place in the young university rankings is precarious since it could be undermined by methodological changes or by the departure of a few highly cited researchers.
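One could screen for that pattern mechanically. A rough heuristic, not anyone's official method, with invented scores:

    # Flag profiles where the citations score dwarfs the research score,
    # the pattern associated here with a few citation-rich mega-projects.
    profiles = {
        "University A": {"research": 18.5, "citations": 72.0},
        "University B": {"research": 55.0, "citations": 61.0},
    }
    for name, p in profiles.items():
        if p["citations"] > 2 * p["research"]:
            print(name, "- citations score far exceeds research score")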
The role of PwC as advisor is interesting since that company is also charged with auditing the THE world rankings.
Tuesday, February 27, 2018
Are the rankings biased?
Louise Richardson, vice-chancellor of the University of Oxford, has published an article in the Financial Times proclaiming that British universities are a national asset and that their researchers deserve the same adulation as athletes and actors.
"Listening to the public discourse one could be forgiven for thinking that the British higher education system is a failure. It is not. It is the envy of the world."
That is an unfortunate phrase. It used to be asserted that the National Health Service was the envy of the world.
She cites as evidence for university excellence the Times Higher Education World University Rankings which have three British universities in the world's top ten and twelve in the top one hundred. These rankings also, although she does not mention it here, put Oxford in first place.
There are now, according to IREG, 21 global university rankings. One wonders why a world-class scholar and head of a world-class university would choose rankings that regularly produce absurdities such as Anglia Ruskin University ahead of Oxford for research impact and Babol Noshirvani University of Technology its equal.
But perhaps it is not really surprising, since of those rankings THE is the only one to put Oxford in first place. In the others it ranges from third place in the URAP rankings published in Ankara to seventh in the Shanghai Rankings (ARWU), Webometrics (WEB) and the Round University Ranking (RUR) from Russia.
That leads to the question of how far the rankings are biased in favor of universities in their own countries.
Below is a quick and simple comparison of how top universities perform in rankings published in the countries where they are located and in other rankings.
I have looked at the rank of the top-scoring home country university in each of eleven global rankings and then at how well that university does in the other rankings. The table below gives the overall rank of each "national flagship" in the most recent eleven global university rankings. The rank in the home country ranking is marked with an asterisk.
We can see that Oxford does better in the Times Higher Education (THE) world rankings, where it is first, than in the others, where its rank ranges from 3rd to 7th. Similarly, Cambridge is the best performing UK university in the QS rankings, where it is 5th. It is 4th in the Center for World University Rankings (CWUR), now published in the UAE, and 3rd in ARWU. In the other rankings it does less well.
ARWU, the US News Best Global Universities (BGU), Scimago (SCI), Webometrics (WEB), URAP, the National Taiwan University Rankings (NTU), and RUR do not seem to be biased in favour of their country's flagship universities. For example, URAP ranks Middle East Technical University (METU) 532nd which is lower than five other rankings and higher than three.
CWUR used to be published from Jeddah in Saudi Arabia but has now moved to the Emirates so I count the whole Arabian peninsula as its home. The top home university is therefore King Saud University (KSU), which is ranked 560th, worse than in any other ranking except for THE.
The GreenMetric Rankings, produced by Universitas Indonesia (UI), have that university in 23rd place, which is very much better than any other.
It looks like THE, GreenMetric and, to a lesser extent QS, are biased towards their top home country institutions.
This only refers to the best universities and we might get a different result by looking at all the ranked universities.
There is a paper by Chris Claassen that does this, although it covers fewer rankings.
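The comparison behind the table can be summarised mechanically. A sketch using a few rows from the table below (banded ranks reduced to midpoints, NR entries dropped):

    from statistics import median

    # Home rank first, then ranks in the other rankings.
    flagships = {
        "Oxford (THE)": (1, [7, 6, 5, 6, 7, 3, 5, 7, 5, 6]),
        "Harvard (BGU)": (1, [6, 1, 3, 1, 1, 1, 1, 1, 1]),
        "UI (GreenMetric)": (23, [700, 277, 632, 888, 1548]),
    }
    for name, (home_rank, other_ranks) in flagships.items():
        print(name, "home:", home_rank, "median elsewhere:", median(other_ranks))

Oxford's home rank of 1 against a median of 6 elsewhere, and UI's 23 against a median of 700, are the sort of gaps that suggest home bias; Harvard's 1 almost everywhere does not.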
"Listening to the public discourse one could be forgiven for thinking that the British higher education system is a failure. It is not. It is the envy of the world."
That is an unfortunate phrase. It used to be asserted that the National Health Service was the envy of the world.
She cites as evidence for university excellence the Times Higher Education World University Rankings which have three British universities in the world's top ten and twelve in the top one hundred. These rankings also, although she does not mention it here, put Oxford in first place.
There are now, according to IREG, 21 global university rankings. One wonders why a world-class scholar and head of a world-class university would choose rankings that regularly produce absurdities such as Anglia Ruskin University ahead of Oxford for research impact and Babol Noshirvani University of Technology its equal.
But perhaps it is not really surprising since of those rankings THE is the only one to put Oxford in first place. In the others it ranges from third place in the URAP rankings published in Ankara to seventh in the Shanghai Rankings (ARWU), Webometrics (WEB) and Round University Ranking (RUR) from Russia
That leads to the question of how far the rankings are biased in favor of universities in their own countries.
Below is a quick and simple comparison of how top universities perform in rankings published in the countries where they located and in other rankings.
I have looked at the rank of the top scoring home country university in each of eleven global rankings and then at how well that university does in the other rankings. The table below gives the overall rank of each "national flagship" in the most recent eleven global university rankings. The rank in the home country rankings is in red.
We can see that Oxford does better in the Times Higher Education (THE) world rankings where it is first than in the others where its rank ranges from 3rd to 7th. Similarly, Cambridge is the best performing UK university in the QS rankings where it is 4th. It is also 4th in the Center for World University Rankings (CWUR), now published in the UAE, and 3rd in ARWU. In the other rankings it does less well.
ARWU, the US News Best Global Universities (BGU), Scimago (SCI), Webometrics (WEB), URAP, the National Taiwan University Rankings (NTU), and RUR do not seem to be biased in favour of their country's flagship universities. For example, URAP ranks Middle East Technical University (METU) 532nd which is lower than five other rankings and higher than three.
CWUR used to be published from Jeddah in Saudi Arabia but has now moved to the Emirates so I count the whole Arabian peninsula as its home. The top home university is therefore King Saud University (KSU), which is ranked 560th, worse than in any other ranking except for THE.
The GreenMetric Rankings, produced by Universitas Indonesia (UI), have that university in 23rd place, which is very much better than any other.
It looks like THE, GreenMetric and, to a lesser extent QS, are biased towards their top home country institutions.
This only refers to the best universities and we might get different result looking at all the ranked universities.
There is a paper by Chris Claassen that does this although it covers fewer rankings.
                THE       ARWU      QS        BGU   SCI    WEB    URAP   NTU       RUR    CWUR   GM
Oxford          1*        7         6         5     6      7      3      5         7      5      6
Tsinghua        35        48*       25        64    8      45     25     34        75     65     NR
Cambridge       4         3         5*        7     16     11     9      12        9      4      NR
Harvard         6         1         3         1*    1      1      1      1         1      1      NR
Barcelona       201-250   201-300   156       81    151*   138*   46     64        212    103    180
METU            601-800   701-800   471-480   314   489    521    532*   601-700   407    498    NR
NTU             195       151-200   76        166   342    85     100    114*      107    52     92
Lomonosov MSU   188       93        95        267   342    235    194    236       145*   97     NR
KSU             501-600   101-150   221       377   NR     424    192    318       460    560*   NR
UI              600-800   NR        277       NR    632    888    1548   NR        NR     NR     23*

(* = rank in the ranking published in the university's home country; NR = not ranked)