Sunday, May 15, 2016

The THE reputation rankings: Much ado about not very much

Every so often, especially in North America and Western Europe, there is a panic about the impact of government policies on higher education, usually the failure to provide as much money as universities want, or sometimes as many overseas students as they need to fill lecture halls or cover budget deficits. Global university rankings have a lot to do with the onset and spread of these panics.

True to form, the British "quality" media have been getting into a tizzy over the latest edition of the Times Higher Education (THE) world reputation ranking. According to Javier Espinoza, education editor of the Telegraph, top UK universities have been under pressure to admit minority and state school students and have also had difficulty in recruiting foreign students. This has somehow caused them to forget about doing research or teaching the most able students. It seems that academics from countries around the world, where such problems are of course unknown, are reacting by withholding their votes from British universities when responding to the THE survey and transferring their approval to the rising stars of Asia.

This supposedly has caused UK institutions to slide down the rankings and two of them, Bristol and Durham, have even dropped out of the top 100 altogether into the great dark pit of the unranked.

The Guardian notes that Oxford and Cambridge are falling and are now only just in the world's top five, while the Independent quotes Phil Baty as saying that "our evidence - from six massive global surveys over six years, including the views of more than 80,000 scholars - proves the balance of power in higher education and research is slowly shifting from the West to the East".

This, it would seem, is all because of cuts in funding and restrictions on the entry of overseas students and faculty.

All this is rather implausible. First of all, these are reputation rankings. They refer only to one indicator, which accounts for 33 percent of the World University Rankings that will appear later this year. It is not certain that the other indicators will go in the same direction.

Secondly, these rankings have not been standardised as they will be when included in the world rankings, which means that the huge gap between the Big Six -- Harvard, MIT, Berkeley, Stanford, Oxford and Cambridge -- and the rest is laid bare, as it will not be in the autumn, and so we can get a rough idea of how many academics were voting for each university. A crude guess is that by around 50th place the number of votes will be about five hundred, and fewer still by 100th place.

This means that below the 50th mark a shift in the opinion of a few dozen respondents could easily push a university up or down into a new band, or even into or out of the top 100.
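The sensitivity claim above can be sketched with a little arithmetic. The figures here are the author's crude guesses, not official THE data: if a university near 50th place draws roughly five hundred votes, then "a few dozen" respondents changing their minds is a sizeable fraction of its total.

```python
# Illustrative only: vote counts are the author's crude guesses, not THE data.
votes_at_rank_50 = 500   # guessed vote total for a university near 50th place
swing = 36               # "a few dozen" respondents switching their support

# A few dozen votes is a large relative change at this level,
# easily enough to move a university into a different band.
relative_change = swing / votes_at_rank_50
print(f"{relative_change:.0%}")  # prints 7%
```

By contrast, the same swing against Harvard's many thousands of votes would barely register, which is why the bands at the top of the table are so much more stable.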

Another thing we should remember is that the expertise of the researchers in the Scopus database, from which respondents are drawn, is exaggerated. The qualification for receiving a survey form is being the corresponding author of a publication listed in the Scopus database. There is much anecdotal evidence that in some places winning research grants or getting the corresponding author slot has more to do with politics than with merit. The THE survey is better than QS's, which allows anyone with an academic email address to take part, but it does not guarantee that every respondent is an unbiased and senior researcher.

We should also note that, unlike the US News and QS survey indicators, THE takes no measures to damp down year-to-year fluctuations. Nor does it do anything to prevent academics from supporting their own universities in the survey.

So, do we really need to get excited about a few dozen "senior researchers" withdrawing their support from British universities?

The credibility of these rankings is further undermined by apparent changes in the distribution of responses by subject group. According to the methodology page in Times Higher Education for 2015, 16% of the responses were from the arts and humanities and 19% were from the social sciences, which in that year included business studies and economics. This year, according to the THE methodology page, 9% of the responses were from the arts and humanities, 15% were from the social sciences and 13% were from business and economics, adding up to 28%.

In other words, the responses from the arts and humanities have apparently fallen by seven percentage points, or around 700 responses, and the combined responses from the social sciences and business and economics have apparently risen by nine percentage points, or about 900 responses.
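The arithmetic behind those estimates is simple. The sketch below assumes a pool of roughly 10,000 survey responses per year, which is the figure the percentages-to-responses conversion above implies, not an official THE count.

```python
# Assumption: roughly 10,000 responses per year, implied by the article's
# conversion of percentage points into response counts; not an official figure.
TOTAL_RESPONSES = 10_000

# Subject-group shares reported on the THE methodology pages.
arts_2015_pct, arts_2016_pct = 16, 9        # arts and humanities
soc_2015_pct = 19                           # social sciences (incl. business/economics)
soc_2016_pct = 15 + 13                      # social sciences + business and economics

# Convert percentage-point shifts into approximate response counts.
arts_drop = (arts_2015_pct - arts_2016_pct) * TOTAL_RESPONSES // 100
soc_rise = (soc_2016_pct - soc_2015_pct) * TOTAL_RESPONSES // 100
print(arts_drop, soc_rise)  # prints 700 900
```

A swing on that scale, hundreds of respondents moving between subject groups, dwarfs the few dozen votes that separate universities in the lower bands.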

If these numbers are accurate, then there has been a very substantial shift among survey respondents from the arts and humanities to the social sciences (inclusive of business and economics), and it is possible that this could be sufficient to cause the recorded decline in the reputation scores of British universities, which usually do much better in the arts and humanities than in the social sciences.

In the subject group tables of the THE 2015-16 World University Rankings, Durham, for example, was 28th for the arts and humanities and 36th for the social sciences. Exeter was 71st for the arts and humanities and 81st for the social sciences.

At the same time some of those rising Asian universities were definitely stronger in the social sciences than in the humanities: Peking was 52nd for social sciences and 84th for arts and humanities, Hong Kong 39th for social sciences and 44th for arts and humanities, Nanyang Technological University 95th for social sciences and outside the top 100 universities for the arts and humanities.

It is possible that such a symmetrical change could be the result of changes in the way disciplines are classified or even a simple transposition of data. So far, THE have given no indication that this was the case.

It is interesting that an exception to the narrative of British decline is the London Business School, which has risen from the 91-100 band to 81-90.

The general claim that the views of 80,000 academics over six years are evidence of a shift from west to east is also somewhat tenuous. There have been several changes in the collection and organisation of data over the last few years that could affect the outcomes of the reputation survey.

Between 2010-2011 and 2016 the percentage of responses from the social sciences (which originally included business and economics) rose from 19% to 28%, counting the now separately reported business and economics responses together with the social sciences. The shares for clinical and health sciences and the life sciences have fallen somewhat, while there has been a slight rise for the arts and humanities, with a large spike in 2015.

The number of responses from the Asia Pacific region and the Middle East has risen from 25% to 36%, while those from the Americas (North and Latin) have fallen from 44% to 25%. The number of languages in which the survey is administered has increased from eight in 2011 to fifteen this year.

The source of respondents has shifted from the Thomson Reuters Web of Science to Scopus, which includes more publications in languages other than English.

The value of these changes is not disputed here, but they should make everybody very cautious about using the reputation rankings to make large claims about what is happening to British universities or what the causes of their problems are.
