Saturday, April 13, 2019

Do we really need a global impact ranking?

Sixteen years ago there was just one international university ranking, the Shanghai Academic Ranking of World Universities (ARWU). Since then rankings have proliferated. We have world rankings, regional rankings, subject rankings, business school rankings, young university rankings, employability rankings, systems rankings, and best student cities.

As if this wasn't enough, there is now a "global impact" ranking published by Times Higher Education (THE). This was announced with a big dose of breathless hyperbole, as though it were something revolutionary and unprecedented. Not quite. Before THE's ranking there was the GreenMetric ranking published by Universitas Indonesia, which measures universities' contribution to sustainability through indicators such as water, waste, transportation, and education and research.

THE was doing something more specific and perhaps more ambitious: measuring adherence to the Sustainable Development Goals proclaimed by the UN. Universities could submit data on eleven of the seventeen goals, and a minimum of four were counted for the overall rankings, with one, partnership for the goals, being mandatory.

The two rankings have attracted different respondents, so perhaps they are complementary rather than competitive. The GreenMetric rankings include 66 universities from Indonesia, 18 from Malaysia and 61 from the USA, compared to 7, 9 and 31 respectively in the THE impact rankings. On the other hand, the THE rankings have many more universities from Australia and the UK. It is noticeable that China is almost entirely absent from both (2 universities in GreenMetric and 3 in THE's).

But is there really any point in a global impact ranking? Some universities in the West seem to be doing a fairly decent job of producing research in the natural sciences, although no doubt much of it is mediocre or worse, and there is also a lot of politically correct nonsense being produced in the humanities and social sciences. They have been far less successful in teaching undergraduates and providing them with the skills required by employers and professional and graduate schools. It is surely debatable whether universities should be concerned about the UN Sustainable Development Goals before they have figured out how to fulfill their teaching mission.

Similarly, rankers have become quite adept at measuring and comparing research output and quality. There are several technically competent rankings that look at research from different viewpoints: the Shanghai ARWU, which counts long-dead Nobel and Fields laureates; the National Taiwan University ranking, which counts publications over an eleven-year period; Scimago, which includes patents; URAP, with 2,500 institutions; and the US News Best Global Universities, which includes books and conferences.

The THE world ranking is probably the least useful of the research-dominant rankings. It gives a 30% weighting to research, which is assessed by three indicators: reputation, publications per staff and research income per staff. An improvement in the score for research could result from an improved reputation for research, a reduction in the number of academic staff, an increase in the number of publications, an increase in research funding, or a combination of some or all of these. Students and stakeholders who want to know exactly why the research prowess of a university is rising or falling will not find THE very helpful.
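The staff-denominator effect described above can be sketched in a few lines of Python. The within-pillar weights and the raw (unstandardised) combination are assumptions for illustration only; THE standardises each indicator before weighting and does not publish its formula in this form.

```python
def research_pillar_score(reputation, publications, income, staff,
                          w_rep=0.6, w_pub=0.2, w_inc=0.2):
    """Illustrative sketch only: the weights and the raw combination
    are assumptions, not THE's actual method."""
    # Two of the three indicators are ratios with staff headcount in
    # the denominator, so reporting a smaller staff count raises the
    # score even when reputation, output and income are unchanged.
    pubs_per_staff = publications / staff
    income_per_staff = income / staff
    return (w_rep * reputation
            + w_pub * pubs_per_staff
            + w_inc * income_per_staff)
```

With identical reputation, publications and income, halving the reported staff count doubles both per-staff ratios and so lifts the pillar score, which is exactly the ambiguity complained of above.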

The THE world and regional rankings also have a citations indicator derived from normalised citation impact. Citations are benchmarked against documents in more than 300 fields, five document types and five years of publications. Further, citations to documents with fewer than a thousand authors are not fractionalised. Further again, self-citations are allowed. And again, there is a regional modification or country bonus applied to half of the indicator, dividing a university's impact score by the square root of the score of the country in which it is located. This means that every university except those in the country with the highest score goes up, some a bit and some a lot.
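The arithmetic of the country bonus can be made concrete with a short sketch. The normalisation against the best-scoring country and the 50/50 blend are inferred from the description above; the exact published methodology may differ.

```python
import math

def country_bonus(raw_score, country_score, best_country_score):
    """Sketch of the regional modification as described in the text;
    not a reproduction of THE's exact formula."""
    # Normalise the country's score against the strongest country, so
    # the divisor is 1 there and less than 1 everywhere else.
    divisor = math.sqrt(country_score / best_country_score)
    boosted = raw_score / divisor
    # The modification applies to only half of the indicator, so blend
    # the raw and boosted scores equally.
    return 0.5 * raw_score + 0.5 * boosted
```

A university scoring 50 in a country whose citation score is a quarter of the leader's ends up at 75, while an identical university in the leading country stays at 50, which is why every country except the strongest one gains.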

The result of all this is a bit of a mess. Over the last few years we have seen institutions rise to glory at the top of the citations indicator that should never have been there, usually because they have succeeded in combining a small number of publications with participation in a mega-project with hundreds of authors and affiliated universities and thousands of citations. Top universities for research impact in the 2018-19 world rankings include Babol Noshirvani University of Technology, the University of Reykjavik, the Brighton and Sussex Medical School and Anglia Ruskin University.

There is something disturbing about university leaders queuing up to bask in the approval of an organisation that seems to think that Babol Noshirvani University of Technology has a greater research influence than anywhere else in the world. The idea that a ranking organisation that cannot publish a plausible list of influential research universities should have the nerve to start talking about measuring global impact is quite surprising.

Most rankers have done better at evaluating research than THE; at least they have not produced indicators as ridiculous as the normalised citations indicator. Teaching, especially undergraduate teaching, is another matter. Attempts to capture the quality of university teaching have been far from successful. Rankers have tried to measure inputs such as income or faculty resources, or have conducted surveys, but these are at best very indirect indicators. It seems strange that they should now turn their attention to various third missions.

Of course, research and teaching are not the only things that universities do. But until international ranking organisations have worked out how to effectively compare universities for the quality of learning and teaching or graduate employability, it seems premature to start trying to measure anything else.

It is likely, though, that many universities will welcome the latest THE initiative. Western universities faced with declining standards and funding, and with competition from the East, will be glad of the chance to find something where they can score highly and which will help with branding and promotion.


Where is the real educational capital of the world?

Here is another example of how rankings, especially those produced by Times Higher Education (THE), are used to mislead the public.

The London Post has announced that London is the Higher Educational Capital of the World for 2019. Support for this claim is provided by four London universities appearing in the top 40 of the THE World University Rankings which, unsurprisingly, have been welcomed by London Mayor Sadiq Khan.

In addition, THE has Oxford and Cambridge as first and second in the world in their overall rankings and QS has declared London to be the Best Student City.

THE is not the only global ranking. There are now several others and none of them have Oxford in first place. Most of them give the top spot to Harvard, although in the QS world rankings it is MIT and in the GreenMetric rankings Wageningen.

Also, if we look at the number of universities in the top 50 of the Shanghai rankings, we cannot see London as the undisputed HE capital of the world. Using this simple criterion it would be New York, with three: Columbia, New York University and Rockefeller.

Then come Boston, Paris, Chicago and London with two each.



Saturday, April 06, 2019

Resources alone may not be enough

Universitas 21 has just published its annual ranking of higher education systems. There are four criteria, each containing several metrics: Resources, Connectivity, Environment and Output.
The ranking has received a reasonable amount of media coverage although not as much as THE or QS.

A comparison of the ranks for the Resources indicator, comprising five measures of expenditure, and for Output, which includes research, citations, performance on rankings, graduation rates and enrolments, produces some interesting insights. There are countries such as Denmark and Switzerland that do well for both. China, Israel and some European countries seem to be very good at getting a high output from the resources available. There are others, including Turkey, Brazil, Saudi Arabia, and Malaysia, that appear to have adequate or more than adequate resources but whose rank for output is not so high. 

These are of course limited indicators and it could perhaps just be a matter of time before the resources produce the desired results. The time for panic or celebration may not have arrived yet. Even so, it does seem that some countries or cultures are able to make better use of their resources than others.

The table below orders countries according to the difference between their ranks for resources and for output. Ireland is 20 places higher for output than it is for resources. India is seven places lower.
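The gap in the table is simple arithmetic: subtract a country's output rank from its resources rank, so a positive number means the country places higher for output than its resources rank would suggest. The ranks in the sketch below are hypothetical, chosen only to reproduce Ireland's stated 20-place gap, not the actual U21 figures.

```python
def rank_gap(resources_rank, output_rank):
    # Positive: the country ranks better (a lower number) for output
    # than for resources; negative: resources outrun output.
    return resources_rank - output_rank

# Hypothetical illustration: a country ranked 25th for resources and
# 5th for output shows a gap of +20, like Ireland in the table below.
```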

The relatively poor performance for Singapore is surprising given that country's reputation for all round excellence. Possibly there is a point where expenditure on higher education runs into diminishing or even negative returns.



Country           Rank difference
China             +20
Ireland           +20
Russia            +18
Greece            +16
Hungary           +14
Italy             +14
UK                +11
Israel            +10
Slovenia          +10
South Korea       +10
Australia         +8
USA               +8
Spain             +7
Taiwan            +4
Bulgaria          +3
Germany           +3
Iran              +3
Netherlands       +3
Japan             +3
Czech Republic    +2
Belgium           +1
Croatia           +1
Romania           +1
Thailand          +1
Finland           0
France            0
Indonesia         0
New Zealand       0
Canada            -1
Denmark           -1
Portugal          -1
Argentina         -2
Norway            -2
Poland            -2
South Africa      -2
Switzerland       -2
Hong Kong         -4
Ukraine           -5
Sweden            -5
India             -7
Chile             -9
Singapore         -9
Austria           -11
Mexico            -13
Serbia            -13
Slovakia          -14
Turkey            -14
Brazil            -16
Saudi Arabia      -25
Malaysia          -28