Saturday, September 14, 2019
A repeated theme of mainstream media reporting on university rankings (nearly always QS or THE) is that Brexit has inflicted, is inflicting, or will surely inflict great damage on British education and universities, because they will no longer get research grants from the European Union or be able to network with their continental peers.
The latest of these dire warnings comes in a recent edition of the Guardian, the voice of the British progressive establishment. Marja Makarow claims that Swiss science was forced "into exile" after the 2014 referendum on immigration controls: Switzerland supposedly entered a period of isolation, without access to Horizon 2020 or European Research Council grants, with a declining reputation, and with a loss of international collaboration and networks. The implication is that the same will happen to British research and universities if Brexit goes ahead.
But has Swiss research suffered? A quick tour of some relevant rankings suggests that it has not. The European Research Ranking, which measures research funding and networking in Europe, has two Swiss universities in the top ten. The Universitas 21 ranking of national systems puts Switzerland in third place for output, up from sixth in 2013, and first for connectivity.
The Leiden Ranking shows that EPF Lausanne and ETH Zurich have both fallen for total publications between 2011-14 and 2014-17, but both have risen for the proportion of publications among the top 10% most cited, a measure of research quality.
The Round University Rankings show that EPF and ETH have both improved for research since 2013 and both have improved their world research reputation.
So it looks as though Switzerland has not really suffered very much, if at all. Perhaps Brexit, if it ever happens, will turn out to be something less than the cataclysm that is feared, or hoped for.
Saturday, September 07, 2019
Finer and finer rankings prove anything you want
If you take a single metric from a single ranking and do a bit of slicing by country, region, subject, field and/or age, there is a good chance that you can prove almost anything, for example that the University of the Philippines is a world-beater for medical research. Here is another example, from the Financial Times.
An article by John O'Hagan, Emeritus Professor at Trinity College Dublin, claims that German universities are doing well for research impact in the QS economics world rankings. Supposedly, "no German university appears in the top 50 economics departments in the world using the overall QS rankings. However, when just research impact is used, the picture changes dramatically, with three German universities, Bonn, Mannheim and Munich, in the top 50, all above Cambridge and Oxford on this ranking."
This is a response to Frederick Studemann's claim that German universities are about to move up the rankings. O'Hagan is saying that it is already happening.
I am not sure what this is about. I had a look at the most recent QS economics rankings and found that in fact Mannheim is in the top fifty overall for that subject. The QS subject rankings do not have a research impact indicator. They have academic reputation, citations per paper, and h-index, which might be considered proxies for research impact, but for none of these are the three universities in the top fifty. Two of the three universities are in the top fifty for academic research reputation, one for citations per paper and two for h-index.
So it seems that the article isn't referring to the QS economics subject ranking. Maybe it is the overall ranking that Professor O'Hagan is thinking of? There are no German universities in the overall top fifty there, and there are also none in the top fifty of the citations per faculty indicator.
I will assume that the article is based on an actual ranking somewhere, maybe an earlier edition of the QS subject rankings or the THE world rankings or from one of the many spin-offs.
But it seems a stretch to talk about German universities moving up the rankings just because they did well in one metric in one of the 40 plus international rankings in one year.
Saturday, August 24, 2019
Seven modest suggestions for Times Higher
The latest fabulous dynamic exciting trusted prestigious sophisticated etc etc Times Higher Education (THE) world academic summit is coming.
The most interesting, or at least the most amusing, event will probably be the revelation of the citations indicator which supposedly measures research impact. Over the last few years this metric has discovered a series of unexpected world-class research universities: Alexandria University, Tokyo Metropolitan University, Anglia Ruskin University, the University of Reykjavik, St. George's London, Babol Noshirvani University of Technology, Brighton and Sussex Medical School. THE once called this their flagship indicator but oddly enough they don't seem to have got round to releasing it as a standalone ranking.
But looking at the big picture, THE doesn't appear to have suffered much, if at all, from the absurdity of the indicator. The great and the good of the academic world continue to swarm to THE summits where they bask in the glow of the charts and tables that confirm their superiority.
THE have hinted that this summit will see big reforms to the rankings, especially the citations indicator. That would certainly improve the rankings' credibility, although it might also make them less interesting.
I have discussed THE's citation problems here, here, here, and here. So, for one last time I hope, here are the main flaws; we will see whether THE fixes them.
1. A 30% weighting for any single indicator is far too high. It would be much better to reduce it to 10 or 20%.
2. Using only one method to measure citations is not a good idea. Take a look at the Leiden Ranking and play around with the settings and parameters. You will see that you can get very different results with just a bit of tweaking. It is necessary to use a variety of metrics to get a broad picture of research quality, impact and influence.
3. THE have a regional modification or country bonus that divides a university's research impact score by the square root of the average score of the country where it is located. The effect is to increase the score of every university except those in the top-ranking country, with the increase being greater for those with worse research records. This applies to half of the indicator and is supposed to compensate for some researchers lacking access to international networks. For some reason this was never a problem for the publications, income or international indicators. Removing the bonus would do a lot to make the metric more credible (a rough worked example is sketched after this list).
4. The indicator is over-normalized. Impact scores are benchmarked to the world average for over three hundred fields plus year of publication. The more fields, the greater the chance that a university can benefit from an anomalous paper that receives an unusually high number of citations (see the second sketch after this list). It would help if THE reduced the number of fields, although that seems unlikely.
5. Unless a paper has over a thousand authors, THE treat every single contributor as receiving every single citation; above that number they use fractional counting. The result is that the THE rankings privilege medical institutions such as St George's and the Brighton and Sussex Medical School that take part in multi-author projects such as the Global Burden of Disease study (the third sketch after this list illustrates the difference). All-round fractional counting would seem the obvious answer, although it might add a bit to costs.
6. Self-citation has become an issue recently. THE have said several times that it doesn't make very much difference. That may be true, but there have been occasions when a single serial self-citer has made a university like Alexandria or Veltech soar into the research stratosphere, and that could happen again.
7. A lot of researchers are adding additional affiliations to their names when they publish. Those secondary, tertiary, and sometimes further affiliations are counted by rankers as though they were primary affiliations. It would make sense to count only primary affiliations, as ARWU does with highly cited researchers.
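To make point 3 concrete, here is a minimal sketch, in Python, of how a country bonus of this kind behaves. It is not THE's actual code or data: the scores are invented and assumed to sit on a 0-1 scale on which the top-ranking country averages close to 1.

```python
# Illustrative sketch only: a "regional modification" that divides half of a
# university's citation score by the square root of its country's average
# score. All numbers are invented and assumed to lie between 0 and 1.

def regional_modification(university_score: float, country_average: float) -> float:
    """Return the adjusted score: half is untouched, half is divided by the
    square root of the country's average citation score."""
    boosted_half = university_score / (country_average ** 0.5)
    return 0.5 * university_score + 0.5 * boosted_half

# The same raw score of 0.40 gets very different adjustments depending on the
# country average: the weaker the country, the bigger the bonus.
print(regional_modification(0.40, 0.25))  # 0.60 in a country averaging 0.25
print(regional_modification(0.40, 0.81))  # about 0.42 in a country averaging 0.81
print(regional_modification(0.40, 1.00))  # 0.40, no bonus in the top country
```

The exact scaling and the 0-1 assumption are mine, but the mechanism is the point: the adjustment rewards universities in countries with weak averages and leaves those in the top country untouched.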
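Point 4 can be illustrated in the same spirit. The sketch below uses invented field averages and citation counts to show how benchmarking against the world average for each field and year lets a single anomalously cited paper in a lightly cited field drag a university's whole normalized score upwards.

```python
# Illustrative sketch only: field-normalized impact with invented numbers.
# Each paper's citations are divided by the world average for its field and
# year, and the results are averaged across the university's papers.

world_average = {"parasitology 2016": 4.0, "particle physics 2016": 25.0}

papers = [
    ("parasitology 2016", 2),
    ("parasitology 2016", 3),
    ("parasitology 2016", 400),   # one anomalous, heavily cited paper
    ("particle physics 2016", 30),
]

normalized = [cites / world_average[field] for field, cites in papers]
print(round(sum(normalized) / len(normalized), 2))   # about 25.6 with the outlier

without_outlier = normalized[:2] + normalized[3:]
print(round(sum(without_outlier) / len(without_outlier), 2))  # about 0.82 without it
```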
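Finally, point 5: the difference between full counting and fractional counting for a hypothetical mega-author paper. Again the numbers are invented; only the thousand-author threshold comes from THE's stated methodology.

```python
# Illustrative sketch only: crediting citations for a paper with many
# contributing institutions under full versus fractional counting.

def full_counting(citations: int, n_institutions: int) -> float:
    # Every contributing institution is credited with all of the citations.
    return float(citations)

def fractional_counting(citations: int, n_institutions: int) -> float:
    # Each contributing institution is credited with its share only.
    return citations / n_institutions

# A Global Burden of Disease-style paper with 700 contributing institutions
# and 3,000 citations: full counting banks 3,000 citations for each of them,
# fractional counting banks about 4.3.
print(full_counting(3000, 700))                  # 3000.0
print(round(fractional_counting(3000, 700), 1))  # 4.3
```

A small medical school with a handful of such papers gains enormously under full counting, which is the mechanism described above.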