
Monday, July 03, 2017

Proving anything you want from rankings

It seems that university rankings can be used to prove almost anything that journalists want to prove.

Ever since the Brexit referendum experts and pundits of various kinds have been muttering about the dread disease that is undermining or about to undermine the research prowess of British universities. The malignity of Brexit is so great that it can send its evil rays back from the future.

Last year, as several British universities tumbled down the Quacquarelli Symonds (QS) world rankings, the Independent claimed that “[p]ost-Brexit uncertainty and long-term funding issues have seen storm clouds gather over UK higher education in this year’s QS World University Rankings”.

It is difficult to figure out how anxiety about a vote that took place on June 23rd 2016 could affect a ranking based on institutional data for 2014 and bibliometric data from the previous five years.

It is just about possible that some academics or employers might have woken up on June 24th to see that their intellectual inferiors had joined the orcs to raze the ivory towers of Baggins University and Bree Poly and then rushed to send a late response to the QS opinion survey. But QS, to their credit, have taken steps to deal with that sort of thing by averaging out survey responses over a period of five years.

European and American universities have been complaining for a long time that they do not get enough money from the state and that their performance in the global rankings is undermined because they do not get enough international students or researchers. That is a bit more plausible. After all, income does account for three separate indicators in the Times Higher Education (THE) world rankings so reduced income would obviously cause universities to fall a bit. The scandal over Trinity College Dublin’s botched rankings data submission showed precisely how much a given increase in reported total income (with research and industry income in a constant proportion) means for the THE world rankings. International metrics account for 10% of the QS rankings and 7.5% of the THE world rankings. Whether a decline in income or the number of international students has a direct effect or indeed any effect at all on research output or the quality of teaching is quite another matter.
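For those who want to see the arithmetic, here is a minimal sketch of how much a fall in one low-weighted indicator can move a composite ranking score. The weightings are THE's published ones from this period; the indicator scores themselves are invented for illustration:

```python
# THE world ranking pillar weightings (2016 era); indicator scores invented.
weights = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def composite(scores):
    """Weighted sum of indicator scores on a 0-100 scale."""
    return sum(weights[k] * scores[k] for k in weights)

before = {"teaching": 80, "research": 75, "citations": 85,
          "international_outlook": 90, "industry_income": 60}
# Hypothetical 20-point drop in the international outlook pillar alone
after = dict(before, international_outlook=70)

print(round(composite(before) - composite(after), 2))  # 1.5
```

A 20-point collapse in a 7.5% indicator shifts the overall score by only 1.5 points, which is why income and internationalisation effects, real as they may be, are easily swamped by methodological noise.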

The problem with claims like this is that the QS and THE rankings are very blunt instruments that should not be used to make year by year analyses or to influence government or university policy. There have been several changes in methodology, there are fluctuations in the distribution of survey responses by region and subject and the average scores for indicators may go up and down as the number of participants changes. All of these mean that it is very unwise to make extravagant assertions about university quality based on what happens in those rankings.

Before making any claim based on ranking changes, it would be a good idea to wait a few years until the impact of any methodological change has passed through the system.

Another variation in this genre is the recent claim in the Daily Telegraph that “British universities are slipping down the world rankings, with experts blaming the decline on pressure to admit more disadvantaged students.”

Among the experts is Alan Smithers of the University of Buckingham who is reported as saying “universities are no longer free to take their own decisions and recruit the most talented students which would ensure top positions in league tables”.

There is certainly good evidence that British university courses are becoming much less rigorous. Every year reports come in about declining standards everywhere. The latest is the proposal at Oxford to let students sit take-home exams instead of timed exams.

But it is unlikely that this could show up in the QS or THE rankings. None of the global rankings has a metric that measures the attributes of graduates except perhaps the QS employers survey. It is probable that a decline in the cognitive skills of admitted undergraduate students would eventually trickle up to the qualities of research students and then to the output and quality of research but that is not something that could happen in a single year especially when there is so much noise generated by methodological changes.

The cold reality is that university rankings can tell us some things about universities and how they change over perhaps half a decade, and some metrics are better than others, but it is an exercise in futility to use overall rankings, or indicators subject to methodological tweaking, to argue about how political or economic changes are affecting western universities.

The latest improbable claim about rankings is that Oxford’s achieving parity with Cambridge in the THE reputation rankings was the result of a positive image created by appointing its first female Vice-Chancellor.

Phil Baty, THE’s editor, is reported as saying that ‘Oxford University’s move to appoint its first female Vice Chancellor sent a “symbolic” wave around the world which created a positive image for the institution among academics.’

There is a bit of a problem here. Louise Richardson was appointed Vice-Chancellor in January 2016. The polling for the 2016 THE reputation rankings took place between January and March 2016. One would expect that if the appointment of Richardson had any effect on academic opinion at all, it would be in those months. That certainly seems more likely than an impact delayed by more than a year. If the appointment did affect the reputation rankings, then the effect was apparently negative, for Oxford’s score fell massively from 80.4 in 2015 to 67.6 in 2016 (compared to 100 for Harvard in both years).

So, did Oxford suffer in 2016 because spiteful curmudgeons were infuriated by an upstart intruding into the dreaming spires?

The collapse of Oxford in the 2016 reputation rankings and its slight recovery in 2017 almost certainly had nothing to do with the new Vice-Chancellor.

Take a look at the table below. Oxford’s reputation score tracks the percentage of THE survey responses from the arts and humanities. It goes up when there are more respondents from those subjects and goes down when there are fewer. This is the case for British universities in general and also for Cambridge except for this year.

The general trend since 2011 has been for the gap between Cambridge and Oxford to narrow steadily. That trend began before Oxford acquired a new Vice-Chancellor, although it accelerated and finally erased the gap this year.

What is unusual about this year’s reputation ranking is not that Oxford recovered as the number of arts and humanities respondents increased but that Cambridge continued to fall.

I wonder if it has something to do with Cambridge’s “disastrous” performance in the THE research impact (citations) indicator in recent years. In the 2014-15 world rankings Cambridge was 28th, behind places like Federico Santa Maria Technical University and Bogazici University. In 2015-16 it was 27th, behind St Petersburg Polytechnic University. But a greater humiliation came in the 2016-17 rankings. Cambridge fell to 31st in the world for research impact. Even worse, it was well behind Anglia Ruskin University, a former art school. For research impact Cambridge University wasn’t the best university in Europe or England. It wasn’t even the best in Cambridge, at least if you trusted the sophisticated THE rankings.

Rankings are not entirely worthless and if they did not exist no doubt they would somehow be invented. But it is doing nobody any good to use them to promote the special interests of university bureaucrats and insecure senior academics.

Table: Scores in THE reputation rankings

Year   Oxford   Cambridge   Gap    % responses, arts and humanities
2011   68.6     80.7        12.1   --
2012   71.2     80.7        9.5    7%
2013   73.0     81.3        8.3    10.5%
2014   67.8     74.3        6.5    9%
2015   80.4     84.3        3.9    16%
2016   67.6     72.2        4.6    9%
2017   69.1     69.1        0.0    12.5%




Tuesday, June 19, 2018

Are the US and the UK really making a comeback?

The latest QS World University Rankings and the THE World Reputation Rankings have just been published. The latter will feed into the forthcoming world rankings, where the two reputation indicators, research and postgraduate teaching, will account for 33 per cent of the total weighting.

The THE reputation rankings include only 100 universities. QS is now ranking close to 1,000 universities and provides scores for 500 of them including academic reputation and employer reputation.

The publication of these rankings has led to claims that British and American universities are performing well again after a period of stress and difficulty. In recent years we have heard a great deal about the rise of Asia and the decline of the West. Now it seems that THE and QS are telling us that things are beginning to change.

The rise of Asia has perhaps been overblown but if Asia is narrowly defined as Northeast Asia and Greater China then there is definitely something going on. Take a look at the record of Zhejiang University in the Leiden Ranking publications indicator. In 2006-9 Harvard produced a total of 27,422 papers and Zhejiang University 11,173. In the period 2013-16 the numbers were 33,045 for Harvard and 20,876 for Zhejiang. In seven years Zhejiang has gone from 41% of Harvard's score to 63%. It is not impossible that Zhejiang will reach parity within two decades.
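The arithmetic behind those percentages is straightforward; here is a quick sketch using the Leiden figures quoted above:

```python
# Paper counts as quoted from the Leiden Ranking publications indicator
harvard  = {"2006-09": 27422, "2013-16": 33045}
zhejiang = {"2006-09": 11173, "2013-16": 20876}

for period in harvard:
    ratio = zhejiang[period] / harvard[period]
    print(f"{period}: Zhejiang at {ratio:.0%} of Harvard's output")
# prints 41% for 2006-09 and 63% for 2013-16
```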

We are talking about quantity here. Reaching parity for research of the highest quality and the greatest impact will take longer but here too it seems likely that within a generation universities like Peking, Zhejiang, Fudan, KAIST and the National University of Singapore will catch up with and perhaps surpass the Ivy League, the Russell Group and the Group of Eight.

The scientific advance of China and its neighbours is confirmed by data from a variety of sources, including the deployment of supercomputers, the use of robots, and, just recently, the Chinese Academy of Sciences holding its place at the top of the Nature Index.

There are caveats. Plagiarism is a serious problem and the efficiency of Chinese research culture is undermined by cronyism and political conformity. But these are problems that are endemic, and perhaps worse, in Western universities.

So it might seem surprising that the two recent world rankings show that American and British universities are rising again. 

But perhaps it should not be too surprising. QS and THE emphasise reputation surveys, which have a weighting of 50% in the QS world rankings and 33% in THE's. There are signs that British and American universities and others in the Anglosphere are learning the reputation management game while universities in Asia are not so interested.

Take a look at the top fifty universities in the QS academic reputation indicator, which is supposed to identify the best universities for research. The countries represented are:
US 20
UK 7
Australia 5
Canada 3
Japan 2
Singapore 2
China 2
Germany 2.

There is one each for Switzerland, Hong Kong, South Korea, Mexico, Taiwan, France and Brazil.

The top fifty universities in the QS citations per faculty indicator, a measure of research excellence, are located in:
USA 20
China 4
Switzerland 4
Netherlands 3
India  2
Korea 2
Israel 2
Hong Kong 2
Australia 2.

There is one each from Saudi Arabia, Italy, Germany, UK, Sweden, Taiwan, Singapore and Belgium.

Measuring citations is a notoriously tricky business and probably some of the high flyers in the reputation charts are genuine local heroes little known to the rest of the world. There is also now a lot of professional advice available about reputation management for those with cash to spare. Even so it is striking that British, Australian, and Canadian universities do relatively well on reputation in the QS rankings while China, Switzerland, the Netherlands, India and Israel do relatively well for citations.

For leading British universities the mismatch is very substantial. According to the 2018-19 QS world rankings, Cambridge is 2nd for academic reputation but 71st for citations; Manchester is 33rd and 221st, King's College London 47th and 159th, Edinburgh 24th and 181st. It is not surprising that British universities should perform well in rankings with a 40% weighting for academic reputation.

The THE reputation rankings have produced some good results for several US universities:

UCLA has risen from 13th to 9th
Cornell from 23rd to 18th
University of Washington from 34th to 28th
University of Illinois Urbana-Champaign from 36th to 32nd
Carnegie Mellon from 37th to 30th
Georgia Institute of Technology from 48th to 44th.
Some of this is probably the result of a change in the distribution of survey responses. I have already pointed out that the fate of Oxford in the THE survey rankings is tied to the percentages of responses from the arts and humanities. THE have reported that their survey this year had an increased number of responses from computer science and engineering and a reduced number from the social sciences and the humanities. Sure enough, Oxford has slipped slightly while LSE has fallen five places. 

The shift to computer science and engineering in the THE survey might explain the improved reputation of Georgia Tech and Carnegie Mellon. There is, I suspect, something else going on, and that is the growing obsession of some American universities with reputation management, public relations and rankings, including the hiring of professional consultants.

In contrast, Asian universities have not done so well in the THE reputation rankings.

University of Tokyo has fallen from 11th to 13th place
University of Kyoto from 25th to 27th
Osaka University from 51st to 81st
Tsinghua University is unchanged in 14th
Peking University unchanged in 17th
Zhejiang University has fallen from the 51-60 band to 71-80
University of Hong Kong has fallen from 39th to 40th.

All but one of the US universities have fallen in the latest Nature Index, UCLA by 3.1%, University of Washington 1.7%, University of Illinois Urbana-Champaign 12%, Carnegie Mellon 4.8%, Georgia Tech 0.9%.

All but one of the Asian universities have risen in the Nature Index, Tokyo by 9.2%, Kyoto 15.1%, Tsinghua 9.5%, Peking 0.9%, Zhejiang 9.8%, Hong Kong 25.3%.

It looks as though Western and Asian universities are diverging. The former are focussed on branding, reputation, relaxing admission criteria, and the search for diversity. They are increasingly engaged with, or even obsessed with, the rankings.

Asian universities, especially in Greater China and Korea, are less concerned with rankings and public relations and more with academic excellence and research output and impact. 

As the university systems diverge it seems that two different sets of rankings are emerging to cater for the academic aspirations of different countries.












Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year. The US News global rankings announced last year can be filtered for subject. The methods are different and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings, Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents, but with different weightings. For example, Teaching has a weighting of 28% for the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.

These rankings continued to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings China is now dominant in Engineering subjects. Chinese universities lead in fifteen subjects although Harvard, MIT and Northwestern University lead for seven subjects. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems are:

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University.

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley.

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University.

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.

 

 

 

 

 

Sunday, April 07, 2024

What happens to those who leave THE?

Times Higher Education (THE) appears to be getting rather worried about leading universities such as Rhodes University, University of Zurich, Utrecht University, and some of the Indian Institutes of Technology boycotting its World University Rankings (WUR) and not submitting data.

Thriving Rankings?

We have seen articles about how the THE rankings are thriving, indeed growing explosively. Now, THE has published a piece about the sad fate that awaits the universities that drop out of the WUR or their Impact Rankings. 

Declining Universities?

An article by two THE data specialists reports that 611 universities that remained in the THE world rankings from 2018 to 2023 retained, on average, a stable rank in the THE reputation ranking. The 16 who dropped out saw a significant decline in their reputation ranks, as did 75 who are described as never being in the WUR.

The last category is a bit perplexing. According to Webometrics, there are over 30,000 higher education institutions in the world, while Alex Usher of HESA puts the number at nearly 90,000. So, I assume that THE is counting only those that received at least a minimum number of votes in their reputation ranking.

We are not told who the 75 never-inners or the 16 defectors are, although some, such as six Indian Institutes of Technology, are well known, so it is difficult to check THE's claims. However, it is likely that an institution that boycotted the THE WUR would also discourage its faculty from participating in the THE academic survey, which would automatically tend to reduce the reputation scores since THE allows self-voting.

Also, we do not know if there have been changes in the weighting for country and subject and how that might modify the raw survey responses. A few years ago, I noticed that Oxford's academic reputation fluctuated with the percentage of survey responses from the humanities. It is possible that adjustments like that might affect the reputation scores of the leavers. 

The opacity of THE's methodology and the intricacies of its data processing system mean that we cannot be sure about THE's claim that departure from the world rankings would have a negative impact. In addition, there is always the possibility that universities on a downward trend might be more likely to pull out because their leaders are concerned about their rankings, so the withdrawal is a result, not the cause of the decline. 

We should also remember that reputation scores are not everything. If a decline in reputation was accompanied by an improvement in other metrics, it could be a worthwhile trade.

What happened to the IITs in the THE WUR?

Fortunately, we can check THE's claims by looking at a group of institutions from the same country and with the same subject orientation. In the 2019-20 world rankings, twelve Indian Institutes of Technology were ranked. Then, six -- Bombay, Madras, Delhi, Kanpur, Kharagpur, Roorkee --  withdrew from the WUR, and six -- Ropar, Indore, Gandhinagar, Guwahati, Hyderabad, Bhubaneswar --  remained, although two of these withdrew later. 

So, let's see what happened to them. First, look at the overall ranks in the WUR itself and then in Leiden Ranking, the Shanghai Rankings (ARWU), and Webometrics.

Looking at WUR, it seems that if there are penalties for leaving THE, the penalties for remaining could be more serious. 

Among the IITs in the 2020 rankings, Ropar led in the 301-350 band, followed by Indore in the 351-400 band. Neither of them is as reputable in India as senior IITs such as Bombay and Madras, and they had those ranks because of remarkable citation scores, although they did much less well on the other pillars. This anomaly was part of the reason for the six leavers to depart.

Fast-forward to the 2024 WUR. IIT Ropar has fallen dramatically to 1001-1200; Indore, which had fallen from 351-400 to 601-800 in 2023, has opted out; and Gandhinagar has fallen from 501-600 to 801-1000. Bhubaneswar, which was in the 601-800 band in the 2020 WUR, fell to 1001-1200 in 2022 and 2023 and was absent in 2024. Guwahati and Hyderabad remained in the 601-800 band.

Frankly, it looks like staying in the THE WUR is not always a good idea. Maybe their THE reputation improved but four of the original remaining IITs suffered serious declines.

IITs in Other Rankings

Now, let's examine the IITs' performance in other rankings. First, the total publications metric in Leiden Ranking. Between 2019 and 2023, four of the six early leavers rose, and two fell. The late leavers, Hyderabad and Indore, were absent in 2019 and were ranked in the 900s in 2023. Remainer Guwahati rose from 536th in 2019 to 439th in 2023.

For Webometrics, between 2019 and 2024, all 12 IITs went up except for Bombay.

Finally, let's check the overall scores in the QS WUR. Between 2021 and 2024, four of the six leavers went up, and two went down. Of the others, Guwahati went up, and Hyderabad went down.

So, looking at overall ranking scores, it seems unlikely that boycotting THE causes any great harm, if any. On the other hand, if THE is tweaking its methodology or something happens to a productive researcher, staying could lead to an embarrassing decline.

IITs' Academic Reputation Scores

Next, here are some academic reputation surveys. The US News Best Global Universities is not as helpful as it could be since it does not provide data from previous editions, and the Wayback Machine doesn't seem to work very well. However, the Global Research Reputation metric in the most recent edition is instructive.

The six escapees had an average rank of 272, ranging from 163 for Bombay to 477 for Roorkee.

The remainers' ranks ranged from 702 for Guwahati to 1710 for Bhubaneswar. Ropar was not ranked at all. So, leaving THE does not appear to have done the IITs any harm in this metric.

Turning to the QS WUR academic reputation metric, the rank in the academic survey for the leavers ranges from 141 for Bombay to 500 for Roorkee. They have all improved since 2022. The best performing remainer is Guwahati in 523rd place.  Ropar and Gandhinagar are not ranked at all. Bhubaneswar, Indore and Hyderabad are all at 601+.  

Now for Round University Ranking's reputation ranking. Four of the six original leavers were there in 2019. Three fell by 2023 and Delhi rose. Two, Bombay and Roorkee, were absent in 2019 and present in 2023.

This might be considered evidence that leaving THE leads to a loss of reputation. But five of the original remainers are not ranked in these rankings, and Guwahati is there in 2023 with a rank of 417, well below that of the six leavers. 

There is then scant evidence that leaving WUR damaged the academic reputations of those IITs that joined the initial boycott, and their overall rankings scores have generally improved.

On the other hand, for IITs Ropar and Bhubaneswar, remaining proved disastrous.

IITs and Employer Reputation

In the latest GEURS employer rankings, published by Emerging, the French consulting firm, there are four exiting IITs in the top 250, Delhi, Bombay, Kharagpur, and Madras, and no remainers.

In the QS WUR Employer Reputation indicator, the boycotters all perform well. Bombay is 69th and Delhi is 80th. Of the six original remainers two, Ropar and Gandhinagar, were not ranked by QS in their 2024 WUR. Three were ranked 601 or below, and Guwahati was 381st, ahead of Roorkee in 421st place.

Conclusion

Looking at the IITs, there seems to be little downside to boycotting THE WUR, and there could be some risk in staying, especially for institutions that have over-invested in specific metrics. It is possible that the IITs are atypical, but so far there seems little reason to fear leaving the THE WUR. A study of the consequences of boycotting the THE Impact Rankings is being prepared.






Saturday, September 24, 2016

The THE World University Rankings: Arguably the Most Amusing League Table in the World

If ever somebody does get round to doing a ranking of university rankings and if entertainment value is an indicator the Times Higher Education (THE) World University Rankings (WUR) stand a good chance of being at the top.

The latest global rankings contain many items that academics would be advised not to read in public places lest they embarrass the family by sniggering to themselves in Starbucks or Nandos.

THE would, for example, have us believe that St. George's, University of London is the top university in the world for research impact as measured by citations. This institution specialises in medicine, biomedical science and healthcare sciences. It does not do research in the physical sciences, the social sciences, or the arts and humanities and makes no claim that it does. To suggest that it is the best in the world across the range of scientific and academic research is ridiculous.

There are several other universities with scores for citations that are disproportionately higher than their research scores, a sure sign that the THE citations indicator is generating absurdity.  They include Brandeis, the Free University of Bozen-Bolzano, Clark University, King Abdulaziz University, Anglia Ruskin University, the University of Iceland, and Orebro University, Sweden.

In some cases, it is obvious what has happened. King Abdulaziz University has been gaming the rankings by recruiting large numbers of adjunct faculty whose main function appears to be listing the university as a secondary affiliation in order to collect a share of the credit for publications and citations. The Shanghai rankers have stopped counting secondary affiliations for their highly cited researchers indicator but KAU is still racking up the points in other indicators and other rankings.

The contention that Anglia Ruskin University is tenth in the world for research impact, equal to Oxford, Princeton, and UC Santa Barbara, and just above the University of Chicago, will no doubt be met with donnish smirks at the high tables of that other place in Cambridge, 31st for citations, although there will probably be less amusement about Oxford being crowned best university in the world.

Anglia Ruskin's output of research is not very high, about a thirtieth of Chicago's according to the Web of Science Core Collection. Its faculty does, however, include one professor who is a frequent contributor to global medical studies with a large number of authors, although never more than a thousand, and hundreds of citations a year. Single-handedly he has propelled the university into the research stratosphere, since the rest of the university has been generating few citations (there's nothing wrong with that: it's not that sort of place) and so the number of papers by which the normalised citations are divided is very low.
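The mechanism is easy to reproduce with a toy calculation. This is only an illustrative sketch of a field-normalised citation average, with invented numbers, not THE's actual method or data, but it shows how a single hyper-cited paper can dominate when the divisor, the number of papers, is small:

```python
def normalized_impact(paper_citations, world_average=10.0):
    """Toy field-normalised citation impact: the mean, over papers, of
    each paper's citations divided by a world average for its field."""
    return sum(c / world_average for c in paper_citations) / len(paper_citations)

# Small institution: 20 modestly cited papers plus one mega-collaboration paper
small = [8] * 20 + [5000]
# Large institution: 600 solidly cited papers
large = [15] * 600

print(round(normalized_impact(small), 1))  # 24.6: the outlier paper dominates
print(normalized_impact(large))            # 1.5: strong but unspectacular
```

On this arithmetic, the small institution scores more than sixteen times higher for "research impact" than the large one, which is essentially the Anglia Ruskin story.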

The THE citations methodology is badly flawed. That university heads give any credence to rankings that include such ludicrous results is sad testimony to the decadence of the modern academy.

There are also many universities that have moved up or down by a disproportionate number of places. These include:

Peking University rising from 42nd to 29th
University of Maryland at College Park rising from 117th to 67th
Purdue University rising from 113th to 70th
Chinese University of Hong Kong rising from 138th to 76th
RWTH Aachen rising from 110th to 78th
Korea Advanced Institute of Science and Technology rising from 148th to 89th


Vanderbilt University falling from 87th to 108th
University of Copenhagen falling from 82nd to 120th
Scuola Normale Pisa falling from 112th to 137th
University of Cape Town falling from 120th to 148th
Royal Holloway, University of London falling from 129th to 173rd
Lomonosov Moscow State University falling from 161st to 188th


The point cannot be stressed too strongly: universities are large and complex organisations. Short of major restructuring, they do not change sufficiently in 12 months or less to produce movements such as these. Such instability can only come from significant methodological changes, or from the entry into the rankings of universities with attributes different from those of the established ones, which shifts the means from which standardised scores are derived.
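The effect of new entrants on standardised scores can be shown with a toy example (invented numbers): a university's z-score rises even though its raw score is unchanged, simply because low-scoring newcomers drag down the cohort mean:

```python
# Illustrative sketch with invented numbers: a university's standardised (z)
# score changes when new entrants shift the mean and spread of the cohort,
# even though its own raw score is identical in both years.

import statistics

def z_score(value, cohort):
    """Standardised score of one raw value against a cohort of raw scores."""
    return (value - statistics.mean(cohort)) / statistics.pstdev(cohort)

established = [40, 50, 55, 60, 70]   # last year's cohort raw scores
newcomers   = [10, 12, 15, 18, 20]   # low-scoring new entrants

print(round(z_score(60, established), 2))              # 0.5
print(round(z_score(60, established + newcomers), 2))  # 1.17 -- same raw score, higher z
```

Nothing about the university with a raw score of 60 has changed; only the composition of the comparison group has.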

There have in fact been significant changes to the methodology this year, although perhaps not as substantial as those of 2015. First, books and book chapters are now included in the count of publications and citations, an innovation pioneered by US News in its Best Global Universities rankings. Almost certainly this has helped English-speaking universities with a comparative advantage in the humanities and social sciences, although THE's practice of bundling indicators together makes it impossible to say exactly how much. It would also work to the disadvantage of institutions such as Caltech that are comparatively less strong in the arts and humanities.

Second, THE have applied a modest version of fractional counting to papers with more than a thousand authors; last year such papers were not counted at all. This means that universities that have participated in mega-papers such as those associated with the Large Hadron Collider will get some credit for citations of those papers, although not as much as they did in 2014 and before. This has almost certainly helped a number of Asian universities that have participated in such projects but have a generally modest research output. It might also have benefitted some universities in California such as UC Berkeley.
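As a rough illustration of fractional counting (my own sketch, not THE's published formula; the 5% minimum share per institution for thousand-author papers is an assumption here):

```python
# Hedged sketch of fractional counting for multi-author papers.
# The threshold and the 5% floor are assumptions for illustration,
# not a reproduction of THE's exact weighting scheme.

def credit(citations, n_authors, threshold=1000, floor=0.05):
    """Full credit up to the author threshold; a floored fractional share above it."""
    if n_authors <= threshold:
        return citations
    return citations * max(1.0 / n_authors, floor)

print(credit(400, 50))    # ordinary paper: full 400 citations
print(credit(400, 3000))  # LHC-style mega-paper: 5% floor gives 20.0
```

Under full counting a 3,000-author paper would hand each institution all 400 citations; under this fractional scheme each gets only a small share, which is why the change matters so much to LHC participants.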

Third, THE have combined the results of the academic reputation survey conducted earlier this year with that used in the 2015-16 rankings. Averaging reputation surveys is a sensible idea, already adopted by QS and US News in their global rankings, but one that THE has avoided until now.

This year's survey saw a very large reduction in the number of responses from researchers in the arts and humanities and, for reasons unexplained, a very large increase in responses from business studies and the social sciences (now counted separately, but combined in 2015).

Had the responses for 2016 alone been counted, there might have been serious consequences for UK universities, which are relatively strong in the humanities, and a boost for East Asian universities, which are relatively strong in business studies. Combining the two surveys has limited the damage to British universities and slowed the rise of Asia to media-acceptable proportions.

One possible consequence of these changes is that UC Berkeley, eighth in 2014-15 and thirteenth in 2015-16, is now, as predicted here, back in the top ten. Berkeley is host for the forthcoming THE world summit, although that is no doubt entirely coincidental.

The overall top place has been taken by Oxford to the great joy of the vice-chancellor who is said to be "thrilled" by the news.

I do not want to be unfair to Oxford, but the idea that it is superior to Harvard, Princeton, Caltech or MIT is nonsense. Its strong performance in the THE WUR is in large measure due to these tables' over-emphasis on reputation, income and a very flawed citations indicator. Its rise above Caltech to first place is almost certainly a result of this year's methodological changes.

Let's look at Oxford's standing in other rankings. The Round University Ranking (RUR) uses Thomson Reuters data just like THE did until two years ago. It has 12 of the indicators employed by THE and eight additional ones.

Overall, Oxford was 10th, up from 17th in 2010. In the teaching group of five indicators Oxford is in 28th place; within that group its best performance was for teaching reputation (6th) and its worst for academic staff per bachelor's degrees (203rd).

In the research group it was 20th, with places ranging from 6th for research reputation to 206th for doctoral degrees per admitted PhD student. It was 5th for International Diversity and 12th for Financial Sustainability.

The Shanghai ARWU rankings have Oxford in 7th place and Webometrics in 10th (9th for Google Scholar Citations).

THE is said to be trusted by the great and the good of the academic world. The latest example is the Norwegian government including performance in the THE WUR as a criterion for overseas study grants. That trust seems largely misplaced. When the vice-chancellor of Oxford University is thrilled by a ranking that puts the university on a par for research impact with Anglia Ruskin then one really wonders about the quality of university leadership.

To conclude my latest exercise in malice and cynicism (thank you, ROARS), here is a game to amuse international academics.

Ask your friends which university in their country is the leader for research impact and then tell them who THE thinks it is.

Here are THE's research champions, according to the citations indicator:

Argentina: National University of the South
Australia: Charles Darwin University
Brazil: Universidade Federal do ABC (ABC refers to its location, not the courses offered)
Canada: University of British Columbia
China: University of Science and Technology of China
France: Paris Diderot University: Paris 7
Germany: Ulm University
Ireland: Royal College of Surgeons
Japan: Toyota Technological Institute
Italy: Free University of Bozen-Bolzano
Russia: ITMO University
Turkey: Atilim University
United Kingdom: St George's, University of London.