Saturday, April 13, 2019

Do we really need a global impact ranking?

Sixteen years ago there was just one international university ranking, the Shanghai Academic Ranking of World Universities (ARWU). Since then rankings have proliferated. We have world rankings, regional rankings, subject rankings, business school rankings, young university rankings, employability rankings, systems rankings, and rankings of the best student cities.

As if this wasn't enough, there is now a "global impact" ranking published by Times Higher Education (THE). This was announced with a big dose of breathless hyperbole, as though it were something revolutionary and unprecedented. Not quite. Before THE's ranking there was the GreenMetric ranking published by Universitas Indonesia, which measured universities' contribution to sustainability through indicators such as water, waste, transportation, and education and research.

THE was doing something more specific and perhaps more ambitious: measuring adherence to the Sustainable Development Goals (SDGs) proclaimed by the UN. Universities could submit data on eleven of the seventeen goals, and a minimum of four were counted for the overall ranking, with one of them, partnership for the goals, being mandatory.
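For those who like to see the arithmetic, an overall score under such a scheme might be assembled roughly as in the sketch below. The weights are invented purely for illustration and are not THE's published figures; the only thing taken from the description above is that the mandatory partnership goal plus a university's best three other goals count.

```python
# Illustrative sketch of an SDG-based overall score: the mandatory
# "partnership for the goals" score (SDG 17) plus the best three of a
# university's other submitted SDG scores. The weights are assumptions.

MANDATORY_WEIGHT = 0.25   # assumed weight for SDG 17
OTHER_WEIGHT = 0.25       # assumed weight for each of the best three other goals

def overall_impact_score(sdg_scores):
    """sdg_scores maps an SDG number (1-17) to a 0-100 score."""
    partnership = sdg_scores[17]                   # the mandatory goal
    others = [score for goal, score in sdg_scores.items() if goal != 17]
    best_three = sorted(others, reverse=True)[:3]  # only the best three count
    return MANDATORY_WEIGHT * partnership + OTHER_WEIGHT * sum(best_three)

print(overall_impact_score({17: 70.0, 3: 80.0, 4: 65.0, 5: 90.0, 11: 40.0}))  # 76.25
```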

The two rankings have attracted different respondents, so perhaps they are complementary rather than competitive. The GreenMetric rankings include 66 universities from Indonesia, 18 from Malaysia and 61 from the USA, compared with 7, 9 and 31 respectively in the THE impact rankings. On the other hand, the THE rankings have many more universities from Australia and the UK. It is noticeable that China is almost entirely absent from both (2 universities in GreenMetric and 3 in THE's).

But is there really any point in a global impact ranking? Some universities in the West seem to be doing a fairly decent job of producing research in the natural sciences, although no doubt much of it is mediocre or worse, and there is also a lot of politically correct nonsense being produced in the humanities and social sciences. They have been far less successful in teaching undergraduates and providing them with the skills required by employers and by professional and graduate schools. It is surely debatable whether universities should be concerned about the UN sustainable development goals before they have figured out how to fulfil their teaching mission.

Similarly, rankers have become quite adept at measuring and comparing research output and quality. There are several technically competent rankings that look at research from different viewpoints: the Shanghai ARWU, which counts long-dead Nobel laureates and Fields medallists; the National Taiwan University ranking, which counts publications over an eleven-year period; Scimago, which includes patents; URAP, which covers 2,500 institutions; and the US News Best Global Universities, which includes books and conference proceedings.

The THE world ranking is probably the least useful of the research-dominant rankings. It gives a 30% weighting to research, which is assessed by three indicators: reputation, publications per staff, and research income per staff. An improvement in the score for research could result from an improved reputation for research, a reduction in the number of academic staff, an increase in the number of publications, an increase in research funding, or a combination of some or all of these. Students and stakeholders who want to know exactly why the research prowess of a university is rising or falling will not find THE very helpful.
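To see why this is so opaque, here is a rough sketch of how such a pillar score might be put together. The sub-weights are assumptions for the sake of illustration, not THE's exact internal weightings; the point is simply that quite different universities can end up with identical pillar scores for entirely different reasons.

```python
# Sketch of a composite research pillar: three very different inputs are
# folded into a single score, so the same number can reflect reputation,
# staff cuts, or extra funding. Sub-weights are assumed for illustration.

def research_pillar(reputation, papers_per_staff, income_per_staff,
                    weights=(0.6, 0.2, 0.2)):
    """All inputs are assumed to be pre-scaled to a 0-100 range."""
    w_rep, w_pub, w_inc = weights
    return w_rep * reputation + w_pub * papers_per_staff + w_inc * income_per_staff

# Two quite different universities land on the same pillar score:
print(research_pillar(reputation=50, papers_per_staff=80, income_per_staff=40))  # 54.0
print(research_pillar(reputation=60, papers_per_staff=30, income_per_staff=60))  # 54.0
```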

The THE world and regional rankings also have a citations indicator derived from normalised citation impact. Citations are benchmarked against documents in 300+ fields, five document types and five years of publications. Further, citations to documents with fewer than a thousand authors are not fractionalised. Further again, self-citations are allowed. And again, there is a regional modification, or country bonus, applied to half of the indicator, dividing a university's impact score by the square root of the score of the country in which it is located. This means that every university, except those in the country with the highest score, goes up, some a bit and some a lot.
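Spelled out as a calculation, the country bonus described above works roughly as follows. This is a minimal sketch based only on the description in this post, with scores assumed to run from 0 to 1 and the top-scoring country fixed at 1; the actual rescaling THE applies is more elaborate.

```python
import math

# Half of the citations indicator uses the raw field-normalised impact
# score; the other half divides that score by the square root of the
# university's country score. With the top country at 1, everyone else
# gets a boost. Scores and scaling here are assumptions for illustration.

def blended_citation_score(raw_impact, country_score):
    adjusted = raw_impact / math.sqrt(country_score)
    return 0.5 * raw_impact + 0.5 * adjusted

# A university in a low-scoring country gets a substantial lift...
print(blended_citation_score(raw_impact=0.4, country_score=0.25))  # 0.6
# ...while one in the top-scoring country keeps its raw score.
print(blended_citation_score(raw_impact=0.4, country_score=1.0))   # 0.4
```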

The result of all this is a bit of a mess. Over the last few years we have seen institutions that should never have been there rise to glory at the top of the citations indicator, usually because they have succeeded in combining a small number of publications with participation in a mega-project with hundreds of authors and affiliated universities and thousands of citations. Top universities for research impact in the 2018-19 world rankings include Babol Noshirvani University of Technology, the University of Reykjavik, the Brighton and Sussex Medical School and Anglia Ruskin University.
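A toy calculation shows how easily this can happen when a hugely cited collaboration paper, with hundreds of authors but below the thousand-author threshold for fractionalisation, is credited in full to every affiliated institution. The figures are invented purely to illustrate the arithmetic.

```python
# A small university with modestly cited papers of its own joins one
# mega-project whose paper has hundreds of authors (so it is not
# fractionalised) and thousands of citations. The citations-per-paper
# average soars. All figures are invented for illustration.

ordinary_papers = 120
ordinary_citations = 240        # a modest 2 citations per paper
mega_paper_citations = 6000     # one hugely cited collaboration paper

average_without = ordinary_citations / ordinary_papers
average_with = (ordinary_citations + mega_paper_citations) / (ordinary_papers + 1)

print(f"Citations per paper without the mega-project: {average_without:.1f}")  # 2.0
print(f"Citations per paper with the mega-project:    {average_with:.1f}")     # 51.6
```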

There is something disturbing about university leaders queuing up to bask in the approval of an organisation that seems to think that Babol Noshirvani University of Technology has a greater research influence than anywhere else in the world. It is quite surprising that a ranking organisation that cannot publish a plausible list of influential research universities should have the nerve to start talking about measuring global impact.

Most rankers have done better at evaluating research than THE. At least they have not produced anything as ridiculous as the normalised citations indicator. Teaching, especially undergraduate teaching, is another matter. Attempts to capture the quality of university teaching have been far from successful. Rankers have tried to measure inputs such as income or faculty resources, or have conducted surveys, but these are at best very indirect indicators. It seems strange that they should now turn their attention to various third missions.

Of course, research and teaching are not the only things that universities do. But until international ranking organisations have worked out how to effectively compare universities on the quality of learning and teaching or graduate employability, it seems premature to start trying to measure anything else.

It is likely, though, that many universities will welcome the latest THE initiative. Many Western universities, faced with declining standards and funding and with competition from the East, will seize the opportunity to find something where they can get high scores that will help with branding and promotion.