Some parts of the world seem to be increasingly sceptical of international rankings, or at least those produced by Times Higher Education (THE). MENA (Middle East and North Africa) and Africa did not seem very enthusiastic about THE's snapshot or pilot rankings, and many Latin American universities have chosen not to participate in the world and regional rankings.
India also seems to be suspicious of the rankings. An article by Vyasa Shastri in the e-paper livemint details some of the ways in which universities might attempt to manipulate rankings to their advantage.
It is well worth reading, although I have one quibble. The article refers to King Abdulaziz University recruiting faculty who list the university as their secondary affiliation when publishing papers (there are now 41 of them). The original idea was to get top marks in the Shanghai Ranking's highly cited researchers indicator. The article correctly notes that the Shanghai rankings no longer count secondary affiliations, but such affiliations can still help in the Nature and Science and publications indicators, and in the citations and publications metrics of other rankings.
Also, other Saudi universities do not recruit secondary-affiliation researchers in large numbers: there are only four for the rest of Saudi Arabia. I notice, though, that there are now quite a few listing Chinese and Australian universities, including five for the University of Melbourne.
Sunday, September 17, 2017
Last word, I hope, on Babol Noshirvani University of Technology
If you type 'Babol University of Technology' rather than 'Babol Noshirvani University of Technology' into the Scopus search box, then the university does have enough publications to meet THE's criteria for inclusion in the world rankings.
So it seems that it was those highly cited researchers in engineering that propelled the university into the research impact stratosphere. That, and a rather eccentric methodology.
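Anyone wanting to repeat this check programmatically could do something along the following lines. This is only a rough sketch: it assumes access to Elsevier's Scopus Search API with a valid API key, and the query strings are my own guesses at the two name variants, not THE's actual selection procedure.

```python
# Rough sketch: counting Scopus-indexed documents (2012-2016) for the two
# affiliation name variants mentioned in the post. Assumes Elsevier's Scopus
# Search API and a valid API key; not THE's actual methodology.
import requests

API_KEY = "YOUR-ELSEVIER-API-KEY"  # placeholder
SEARCH_URL = "https://api.elsevier.com/content/search/scopus"

def count_documents(affiliation_name):
    """Return the total number of Scopus documents for an affiliation, 2012-2016."""
    query = f'AFFIL("{affiliation_name}") AND PUBYEAR > 2011 AND PUBYEAR < 2017'
    response = requests.get(
        SEARCH_URL,
        params={"query": query, "count": 1},
        headers={"X-ELS-APIKey": API_KEY, "Accept": "application/json"},
    )
    response.raise_for_status()
    return int(response.json()["search-results"]["opensearch:totalResults"])

for name in ("Babol Noshirvani University of Technology",
             "Babol University of Technology"):
    print(name, count_documents(name))
```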
Saturday, September 09, 2017
More on Babol Noshirvani University of Technology
The previous post asked how Babol Noshirvani University of Technology in Iran did so well in the latest THE rankings. Part of the answer is that it has two highly cited researchers in engineering, Davood Domiri Ganji and Mohsen Sheikholeslami. I see no reason to question the quality of their research.
But I still have a couple of questions. First, THE say that they exclude universities whose research output is less than 1,000 articles between 2012 and 2016. But checking with Scopus indicates that the university had 468 articles over that period, or 591 documents of all kinds including conference papers, book chapters and reviews, which is well below the threshold for inclusion. Is it possible that THE have included Babol University of Medical Sciences in the count of publications or citations?
Those documents have been cited a total of 2,601 times, which is respectable but not quite on a scale that would rival Oxford and Chicago. It is possible that one or more of those articles have, for some reason, received an unusually large number of citations compared with the world average and that this has distorted the indicator score. If so, then we have yet another example of a defective methodology producing absurd results.
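To see how a handful of very highly cited papers can dominate a small publication count once citations are normalised, here is a toy illustration. The figures are invented for the sake of the arithmetic, and this is not THE's actual calculation, which works with field- and year-specific world averages and further adjustments.

```python
# Toy illustration (not THE's actual method): with normalised citation scores,
# each paper is compared with a world average, so a few extremely highly cited
# papers in a small output can lift the institutional average dramatically.
# All numbers below are invented.
world_average = 10.0  # hypothetical world average citations per paper

# A small output: 460 papers cited near the world average,
# plus a few papers from highly cited researchers.
ordinary_papers = [9] * 460
star_papers = [400, 350, 300, 250, 250, 200, 150, 100]
papers = ordinary_papers + star_papers

normalised_scores = [c / world_average for c in papers]
institution_score = sum(normalised_scores) / len(normalised_scores)

print(f"Papers: {len(papers)}")
print(f"Mean normalised citation impact: {institution_score:.2f}")
# Roughly 1.3 here, against 0.9 without the star papers: a small denominator
# lets a few outliers move the institutional average a long way.
```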
Friday, September 08, 2017
Why did Babol Noshirvani University of Technology do so well in the THE rankings?
The THE world rankings and their regional offshoots have always been a source of entertainment mixed with a little bewilderment. Every year a succession of improbable places jumps into the upper reaches of the citations indicator, which is supposed to measure global research impact. Usually it is possible to tell what happened. Often it is because of participation in a massive international physics project, although not so much over the last couple of years, contribution to a global medical or genetics survey, or even assiduous self-citation.
However, after checking with Scopus and the Web of Science, I still cannot see exactly how Babol Noshirvani University of Technology got into 14th place for this metric, equal to Oxford and ahead of Yale and Johns Hopkins, in the latest world rankings, and into the 301-350 band overall, well ahead of every other Iranian university.
Can anybody help with an explanation?
Tuesday, September 05, 2017
Highlights from THE citations indicator
The latest THE world rankings were published yesterday. As always, the most interesting part is the field- and year-normalised citations indicator that supposedly measures research impact.
Over the last few years, an array of implausible places have zoomed into the top ranks of this metric, sometimes disappearing as rapidly as they arrived.
The first place for citations this year goes to MIT. I don't think anyone would find that very controversial.
Here are some of the institutions that feature in the top 100 of THE's most important indicator, which has a weighting of 30 per cent.
2nd St. George's, University of London
3rd= University of California Santa Cruz, ahead of Berkeley and UCLA
6th= Brandeis University, equal to Harvard
11th= Anglia Ruskin University, UK, equal to Chicago
14th= Babol Noshirvani University of Technology, Iran, equal to Oxford
16th= Oregon Health and Science University
31st King Abdulaziz University, Saudi Arabia
34th= Brighton and Sussex Medical School, UK, equal to Edinburgh
44th Vita-Salute San Raffaele University, Italy, ahead of the University of Michigan
45th= Ulsan National Institute of Science and Technology, best in South Korea
58th= University of Kiel, best in Germany and equal to King's College London
67th= University of Iceland
77th= University of Luxembourg, equal to University of Amsterdam