Tuesday, August 13, 2019

University of the Philippines beats Oxford, Cambridge, Yale, Harvard, Tsinghua, Peking etc etc

Rankings can do some good sometimes. They can also do a lot of harm and that harm is multiplied when they are sliced more and more thinly to produce rankings by age, by size, by mission, by region, by indicator, by subject. When this happens minor defects in the overall rankings are amplified.

That would not be so bad if universities, political leaders and the media were to treat the tables and the graphs with a healthy scepticism. Unfortunately, they treat the rankings, especially THE, with obsequious deference as long as they are provided with occasional bits of publicity fodder.

Recently, the Philippine media have proclaimed that the University of the Philippines (UP) has beaten Harvard, Oxford and Stanford for health research citations. It was seventh in the THE Clinical, Pre-clinical and Health category, behind Tokyo Metropolitan University, Auckland University of Technology, Metropolitan Autonomous University Mexico, Jordan University of Science and Technology, University of Canberra and Anglia Ruskin University.

The Inquirer is very helpful and provides an explanation from the Philippine Council for Health Research and Development that citation scores “indicate the number of times a research has been cited in other research outputs” and that the score "serves as an indicator of the impact or influence of a research project which other researchers use as reference from which they can build on succeeding breakthroughs or innovations.” 

Fair enough, but how can UP, which has a miserable score of 13.4 for research in the same subject ranking, have such a massive research influence? How can it have an extremely low output of papers, a poor reputation for research, and very little funding and still be a world beater for research impact?

In fact it has nothing to do with UP, nothing to do with everyone working as a team, decisive leadership or recruiting international talent.

It is the result of a bizarre and ludicrous methodology. First, THE does not use fractional counting for papers with fewer than a thousand authors. UP, along with many other universities, has taken part in the Global Burden of Disease project funded by the Bill and Melinda Gates Foundation. This has produced a succession of papers, many of them in the Lancet, with hundreds of contributing institutions and researchers, whose names are all listed as authors, and hundreds or thousands of citations. As long as the number of authors does not reach 1,000, each author is counted as though he or she were the recipient of all the citations. So UP gets credit for a massive number of citations, which is then divided by a relatively small number of papers.
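The arithmetic behind this distortion can be sketched with hypothetical numbers (the figures below are illustrative, not THE's actual data): under full counting, a single mega-paper with hundreds of listed institutions hands every one of them its entire citation count.

```python
# Illustrative sketch: full counting vs fractional counting of citations.
# Each paper is (citations, number of contributing institutions).

def full_counting(papers):
    """Each listed institution receives ALL of a paper's citations."""
    total = sum(cites for cites, n_institutions in papers)
    return total / len(papers)

def fractional_counting(papers):
    """Each institution receives citations divided by the number of
    contributing institutions (the Leiden Ranking approach)."""
    total = sum(cites / n_institutions for cites, n_institutions in papers)
    return total / len(papers)

# Hypothetical small university: 9 ordinary papers with 5 citations each,
# plus one Global Burden of Disease-style paper with 3,000 citations
# shared among 300 contributing institutions.
papers = [(5, 1)] * 9 + [(3000, 300)]

print(full_counting(papers))        # 304.5 citations per paper
print(fractional_counting(papers))  # 5.5 citations per paper
```

One citation-rich collaborative paper is enough to multiply the citations-per-paper figure more than fifty-fold, which is exactly the anomaly the ranking rewards.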

Why not just use fractional counting, dividing the citations among the contributors or the institutions, as the Leiden Ranking does? Probably because it might add a little to costs, and perhaps because THE doesn't like to admit it made a mistake.

Then we have the country bonus, or "regional modification", applied to half the indicator, which increases the scores of universities in countries with low overall citation impact.
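A rough sketch of how such a bonus works, under the commonly cited description of THE's regional modification (division of the country-adjusted half by the square root of the national average; the exact formula and all numbers here are assumptions for illustration):

```python
import math

def citation_indicator(raw_score, country_avg):
    """Half raw score, half country-adjusted score. Assumed adjustment:
    division by the square root of the national average citation impact,
    as commonly described in analyses of THE's regional modification."""
    adjusted = raw_score / math.sqrt(country_avg)
    return 0.5 * raw_score + 0.5 * adjusted

# Hypothetical: the same raw citation impact of 60, once in a low-impact
# country (national average 0.25) and once in a high-impact one (1.0).
print(citation_indicator(60, 0.25))  # 90.0 -- score boosted by the bonus
print(citation_indicator(60, 1.0))   # 60.0 -- no change
```

Being "surrounded by low scoring universities" thus becomes an advantage in itself: the identical research record scores half again as high in the low-impact country.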

The result of all this is that UP, surrounded by low scoring universities, not producing very much research but with a role in a citation rich mega project, gets a score for this indicator that puts it ahead of the Ivy League, the Group of Eight and the leading universities of East Asia.

If nobody took this seriously, then no great harm would be done. Unfortunately it seems that large numbers of academics, bureaucrats and journalists do take the THE rankings very seriously or pretend to do so in public. 

And so committee addicts get bonuses and promotions, talented researchers spend their days in unending ranking-inspired transformational seminars, funds go to the mediocre and the sub-mediocre, students and stakeholders base their careers on misleading data, and the problems of higher education are covered up or ignored.



3 comments:

Anonymous said...

I never trust the British rankings: QS and THE. The British rankings are just their advertisement tool to promote their own universities.



Unknown said...

University rankings are a joke. Some league tables include the nationality of academics as a metric: employing academics from around the world automatically earns you a higher score!

It is totally silly. The only way to find out how prestigious a university is, is to look at the type of students who apply.