Saturday, September 28, 2019

My previous post has just been republished by University World News. Comments here are welcome.
Thursday, September 26, 2019
Trinity College Dublin: Time to forget about THE?
Global rankings, especially THE's, have been very useful to British universities, at least to those sitting at the apex of the system. If a Russell Group university falls in the rankings then it is the fault of impending Brexit and/or the terrible austerity inflicted on the nation's research prowess. If it rises then this is cause for congratulation but with a hint of foreboding. How can they keep advancing with Ebenezer Scrooge controlling the treasury and the bitter winds of Brexit howling at the door? The universities have reciprocated by not inquiring about how THE constructs its rankings, particularly the citations and industry income indicators.
Across the Irish Sea, universities have for the most part also been loyal to THE. Trinity College Dublin (TCD) has continued to submit data to THE and also to QS and has passively accepted the results of the rankings even when they show the institution going down and down. The steady decline is usually blamed on the meanness of the Irish government and its failure to provide sufficient funds.
I have dealt with Trinity's misfortunes here, here, here, and here. In 2015 TCD fell seven places in the QS world rankings and 22 in THE's. In contrast, it had been rising in the Shanghai ARWU rankings since 2004 and in the Round University Rankings (RUR) since 2010, although everybody pretended not to notice this.
This year history has repeated itself. TCD has fallen in the THE world rankings from 120th place to 164th. Again this is supposedly the fault of the Irish state's failure to provide enough money.
But we get a very different picture when we look at the Shanghai Rankings. TCD has risen from 167th place to 154th, getting close to the 101-150 band. Leaving aside the Nobel and Fields Awards, Trinity has gained 6.6 points for highly cited researchers, 1.4 for publications, and 1.4 for productivity per capita. It has, however, fallen 0.6 for papers in Nature and Science.
Looking at RUR, TCD has risen from 75th to 57th for Research and from 35th to 29th for international diversity. It has fallen slightly for financial sustainability, from 191st to 197th, and more substantially for Teaching, from 275th to 335th, mainly because of a fall in the number of academic staff.
It seems perverse for TCD to keep on about its decline in the THE rankings when it can point to a steady rise in the Shanghai rankings, which are not perfect but are certainly more stable, consistent, and realistic than THE's.
Does THE really want to be judged by rankings that apparently think that Anadolu University is best for Innovation, Luxembourg for International Orientation and Aswan for research impact measured by Citations?
But if TCD really insist on sticking with THE then I suggest that they recruit a few researchers taking part in the Global Burden of Disease Study, funded by the Bill and Melinda Gates Foundation.*
They should also think about amalgamating with the Royal College of Surgeons.
*assuming no methodological change
Tuesday, September 17, 2019
Going Up and Up Down Under: the Case of the University of Canberra
It is a fact almost universally ignored that when a university suddenly rises or falls many places in the global rankings the cause is not transformative leadership, inclusive excellence, team work, or strategic planning but nearly always a defect or a change in the rankers' methodology.
Let's take a look at the fortunes of the University of Canberra (UC) which THE world rankings now have in the world's top 200 universities and Australia's top ten. This is a remarkable achievement since the university did not appear in these rankings until 2015-16 when it was placed in the 500-600 band with very modest scores of 18.4 for teaching, 19.3 for research, 29.8 for citations, which is supposed to measure research impact, 36.2 for industry income, and 54.6 for international outlook.
Just four years later the indicator scores are 25.2 for teaching, 31.1 for research, 99.2 for citations, 38.6 for industry income, and 86.9 for international outlook.
The increase in the overall score over four years, calculated with different weightings for the indicators, was composed of 20.8 points for citations and 6.3 for the other four indicators combined. Without those 20.8 points Canberra would be in the 601-800 band.
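To see where that 20.8 comes from, here is a minimal back-of-the-envelope check in Python, assuming the 30% weighting that THE publishes for the citations indicator (the weighting is my assumption from THE's published methodology, not a figure taken from the university's results):

```python
# Rough check: how much of UC's overall THE score change comes from
# citations alone, assuming the published 30% weighting for that indicator.
citations_2016 = 29.8    # citations score in the 2015-16 edition (quoted above)
citations_2020 = 99.2    # citations score in the latest edition (quoted above)
citations_weight = 0.30  # assumed weighting from THE's published methodology

contribution = (citations_2020 - citations_2016) * citations_weight
print(f"Citations alone contribute about {contribution:.1f} points")  # ~20.8
```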
I will look at where that massive citation score came from in a moment.
It seems that the Australian media is reporting on this superficially impressive performance with little or no scepticism and without noting how different it is from the other global rankings.
The university has issued a statement quoting vice-chancellor Professor Deep Saini as saying that the "result confirms the steady strengthening of the quality at the University of Canberra, thanks to the outstanding work of our research, teaching and professional staff" and that the "increase in citation impact is indicative of the quality of research undertaken at the university, coupled with a rapid growth in influence and reach, and has positioned the university as amongst the best in the world."
The Canberra Times reports that the vice-chancellor has said that part of the improvement was the result of a talent acquisition campaign while noting that many faculty were complaining about pressure and excessive workloads.
Leigh Sullivan, DVC for research and innovation, has a piece in the Campus Morning Mail that hints at reservations about UC's apparent success, which is "a direct result of its Research Foundation Plan (2013-2017)" and of "a strong emphasis on providing strategic support for research excellence in a few select research areas where UC has strong capability." He notes that when the citation scores of research stars are excluded there has still been a significant increase in citations, and warns that what goes up can go down and that performance can be affected by changes in the ranking methodology.
The website riotact quotes the vice-chancellor citing the citation score as evidence of improved research quality and calling for more funding for universities: the "government has to really think and look hard at how well we support our universities. That's not to say it badly supports us, it's that the university sector deserves to be on the radar of our government as a major national asset."
The impressive ascent of UC is unique to THE. No serious ranking puts it in the top 200 or anywhere near. In the current Shanghai Rankings it is in the 601-700 band and has been falling for the last two years. In Webometrics it is 730th in the world and 947th for Excellence, that is publications in the 10% most cited in 25 disciplines. In University Ranking by Academic Performance it is 899th and in the CWUR Rankings it doesn't even make the top 1,000.
Round University Ranking and Leiden Ranking do not rank UC at all.
Apart from THE, UC does best in the QS rankings, where it is 484th in the world and 26th in Australia.
So how could UC perform so brilliantly in THE rankings when nobody else has recognised that brilliance? What does THE know that nobody else does? Actually, it does not perform brilliantly in the THE rankings, just in the citations indicator which is supposed to measure research influence or research impact.
This year UC has a score of 99.2 which puts it in the top twenty for citations just behind Nova Southeastern University in Florida and Cankaya University in Turkey and ahead of Harvard, Princeton and Oxford. The top university this year is Aswan University in Egypt replacing Babol Noshirvani University of Technology in Iran.
No, THE is not copying the interesting methodology of the Fortunate 500. This is the result of an absurd methodology that THE is unable or unwilling for some reason to change.
THE has a self-inflicted problem with a small number of papers that have hundreds or thousands of "authors" and collect thousands of citations. Some of these come from CERN projects, and THE has dealt with them by using a modified form of fractional counting for papers with more than a thousand authors. That has removed the privilege of institutions that contribute to CERN projects but has replaced it with the privilege of those that contribute to the Global Burden of Disease Study (GBDS), whose papers tend to have hundreds but not thousands of contributors and sometimes receive over a thousand citations. As a result, places like Tokyo Metropolitan University, National Research University MEPhI, and Royal Holloway London have been replaced as citation superstars by St George's London, Brighton and Sussex Medical School, and Oregon Health and Science University.
It would be a simple matter to apply fractional counting to all papers, dividing each paper's citations by its number of authors (a sketch of the arithmetic follows below). After all, Leiden Ranking and the Nature Index manage to do it, but THE has for some reason chosen not to follow suit.
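Here is a minimal sketch of the difference, using invented paper records rather than real Scopus data; the point is simply that under full counting a single multi-author paper can dominate an institution's citation tally, while fractional counting shares the credit out:

```python
# Hypothetical illustration of full vs fractional counting of citations.
# The paper records below are invented; only the arithmetic is the point.

papers = [
    {"citations": 1500, "authors": 700},  # a GBDS-style paper with hundreds of authors
    {"citations": 40,   "authors": 4},    # an ordinary paper
    {"citations": 12,   "authors": 2},    # another ordinary paper
]

full = sum(p["citations"] for p in papers)
fractional = sum(p["citations"] / p["authors"] for p in papers)

print(f"Full counting:       {full} citations")            # 1552
print(f"Fractional counting: {fractional:.1f} citations")  # about 18.1
```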
The problem is compounded by counting self-citations, by hyper-normalisation so that the chances of hitting the jackpot with an unusually highly cited paper are increased, and by the country bonus that boosts the scores for universities by virtue of their location in low scoring countries.
And so to UC's apparent success this year. This is entirely the result of its citation score, which is entirely dependent on THE's methodology.
Between 2014 and 2018 UC had 3,825 articles in the Scopus database, of which 27 were linked to the GBDS, which is funded by the Bill and Melinda Gates Foundation. Those 27 articles, each with hundreds of contributors, have received 18,431 citations, all of which are credited in full to UC and to every other contributing institution. UC's total number of citations is 53,929, so those 27 articles account for over a third of them. Their impact might be even greater if they were cited disproportionately soon after publication.
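The share is easy to verify from the figures above:

```python
# Share of UC's 2014-2018 citations coming from the 27 GBDS-linked articles
gbds_citations = 18_431
total_citations = 53_929

share = gbds_citations / total_citations
print(f"{share:.1%} of citations come from 27 of 3,825 articles")  # about 34.2%
```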
UC has of course improved its citation performance even without those articles but it is clear that they have made an outsize contribution. UC is not alone here. Many universities in the top 100 for citations in the THE world rankings owe their status to the GBDS: Anglia Ruskin, Reykjavik, Aswan, Indian Institute of Technology Ropar, the University of Peradeniya, Desarrollo, Pontifical Javeriana and so on.
There is absolutely nothing wrong with the GBDS nor with UC encouraging researchers to take part. The problem lies with THE and its reluctance to repair an indicator that produces serious distortions and is an embarrassment to those universities who apparently look to the THE rankings to validate their status.
Monday, September 16, 2019
What should universities do about organised cheating?
Every so often the world of higher education is swept by a big panic about systemic and widespread cheating. The latest instance is concern about contract cheating or essay mills that provide bespoke essays or papers for students.
It seems that the Australian government will introduce legislation to penalise the supply or advertising of cheating services to students. There are already laws in several American states and there have been calls for the UK to follow suit.
There is perhaps a bit of hypocrisy here. If universities in Europe, Australia, and North America admit more and more students who lack the cognitive or language skills to do the required work, and if they choose to use assessment methods that are vulnerable to deception and dishonesty, such as unsupervised essays and group projects, then cheating is close to inevitable.
On the supply side there appear to be large numbers of people around the world, without decent academic jobs or jobs of any sort, who are capable of producing academic work of a high standard, sometimes worth an A grade or a first. The Internet has made it possible for lazy or incompetent students to link up with competent writers.
The Daily Mail has reported that Kenya hosts a medium-sized industry with students and academics slaving away to churn out essays for British and American students. This is no doubt a hugely exploitative business, but consider the consequences of shutting down the essay mills. Many educated Kenyans would suffer financially. Many students would drop out, resort to other forms of cheating, or demand more support, counselling, and transitional or foundation programmes.
If universities are serious about the scourge of essay mills they need to work on both the supply and the demand side. They might start by inviting the essay writers in Kenya to apply for scholarships for undergraduate or postgraduate courses, or for posts in EAP departments.
On the demand side the solution seems to be simple. Stop admitting students because they show leadership ability, have overcome adversity, will make the department look like Britain, America or the world, or will help craft an interesting class; admit them because they have demonstrated an ability to do the necessary work.
https://www.studyinternational.com/news/australia-essay-mills-contract-cheating-penalty-law/
Saturday, September 14, 2019
Are UK universities facing a terrible catastrophe?
A repeated theme of mainstream media reporting on university rankings (nearly always QS or THE) is that Brexit has inflicted, is inflicting, or is surely going to inflict great damage on British education and the universities because they will not get any research grants from the European Union or be able to network with their continental peers.
The latest of these dire warnings can be found in a recent edition of the Guardian, which is the voice of the British progressive establishment. Marja Makarow claims that Swiss science was forced "into exile" after the 2014 referendum on immigration controls. Following this, Switzerland supposedly entered a period of isolation without access to Horizon 2020 or grants from the European Research Council and with a declining reputation and a loss of international collaboration and networks. This will happen to British research and universities if Brexit is allowed to happen.
But has Swiss research suffered? A quick tour of some relevant rankings suggests that it has not. The European Research Ranking, which measures research funding and networking in Europe, has two Swiss universities in the top ten. The Universitas 21 systems rankings put Switzerland in third place for output, up from sixth in 2013, and first for connectivity.
The Leiden Ranking shows that EPF Lausanne and ETH Zurich have fallen for total publications between 2011-14 and 2014-17 but both have risen for publications in the top 10% of journals, a measure of research quality.
The Round University Rankings show that EPF and ETH have both improved for research since 2013 and both have improved their world research reputation.
So it looks as though Switzerland has not really suffered very much, if at all. Perhaps Brexit, if it ever happens, will turn out to be something less than the cataclysm that is feared, or hoped for.
Saturday, September 07, 2019
Finer and finer rankings prove anything you want
If you take a single metric from a single ranking and do a bit of slicing by country, region, subject, field and/or age there is a good chance that you can prove almost anything, for example that the University of the Philippines is a world beater for medical research. Here is another example from the Financial Times.
An article by John O'Hagan, Emeritus Professor at Trinity College Dublin, claims that German universities are doing well for research impact in the QS economics world rankings. Supposedly, "no German university appears in the top 50 economics departments in the world using the overall QS rankings. However, when just research impact is used, the picture changes dramatically, with three German universities, Bonn, Mannheim and Munich, in the top 50, all above Cambridge and Oxford on this ranking."
This is a response to Frederick Studemann's claim that German universities are about to move up the rankings. O'Hagan is saying that this is already happening.
I am not sure what this is about. I had a look at the most recent QS economics rankings and found that in fact Mannheim is in the top fifty overall for that subject. The QS subject rankings do not have a research impact indicator. They have academic reputation, citations per paper, and h-index, which might be considered proxies for research impact, but for none of these are all three universities in the top fifty. Two of the three are in the top fifty for academic reputation, one for citations per paper, and two for h-index.
So it seems that the article isn't referring to the QS economics subject ranking. Maybe it is the overall ranking that Professor O'Hagan is thinking of? There are no German universities in the overall top fifty there, but there are also none in the top fifty for the citations per faculty indicator.
I will assume that the article is based on an actual ranking somewhere, maybe an earlier edition of the QS subject rankings or the THE world rankings or from one of the many spin-offs.
But it seems a stretch to talk about German universities moving up the rankings just because they did well on one metric in one of the 40-plus international rankings in one year.