
Friday, December 16, 2016

A new Super-University for Ireland?

University rankings have become extremely influential over the last few years. This is not entirely a bad thing. The initial publication of the Shanghai rankings in 2003, for example, exposed the pretensions of many European universities, revealing just how far behind they had fallen in scientific research. It also showed China how far it had to go to achieve scientific parity with the West.

Unfortunately, rankings have also had malign effects. The THE and QS world rankings have acquired a degree of respect, trust, even reverence that may not be entirely deserved. Both introduced significant methodological changes in 2015, and THE made further changes in 2016. The consequence is that there have been some remarkable rises and falls within the rankings that have received a great deal of publicity but have little to do with any real change in quality.

In addition, both QS and THE have increased the number of ranked universities, which can affect the mean scores for indicators from which the processed scores given to the public are derived. Both have surveys that can be biased and subjective. Both are unbalanced: QS with a 50% weighting for its academic and employer surveys, and THE with field- and year-normalised citations plus a partial regional modification, carrying an official weighting of 30% (the modification means that everybody except the top scorer gets a bonus for citations). The remarkable rise of Anglia Ruskin University to parity with Oxford and Princeton in this year's THE research impact (citations) indicator, and the high placing of the Pontifical Catholic University of Chile and the National University of Colombia in QS's employer survey, are evidence that these rankings continue to be implausible and unstable. To make higher education policy dependent on their fluctuations is very unwise.

This is particularly true of the two leading Irish universities, Trinity College Dublin (TCD) and University College Dublin (UCD), which have in fact been advancing in the Round University Rankings, produced by a Russian organisation, and in ShanghaiRanking's Academic Ranking of World Universities. These two global rankings have methodologies that are generally stable and transparent.

I pointed out in 2015 that TCD had been steadily rising in the Shanghai ARWU since 2004, especially in the Publications indicator (papers in the Science Citation Index-Expanded and the Social Science Citation Index) and in PCP (productivity per capita, that is, the combined indicator scores divided by the number of faculty). This year, to repeat an earlier post, TCD's publication score again went up very slightly, from 31 to 31.1 (27.1 in 2004), and the PCP quite significantly, from 19 to 20.8 (13.9 in 2004), compared with top scores of 100 for Harvard and Caltech respectively.

UCD has also continued to do well in the Shanghai rankings, with the publications score rising this year from 34.1 to 34.2 (27.3 in 2004) and PCP from 18.0 to 18.1 (8.1 in 2004).
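
For readers unfamiliar with the indicator, here is a minimal sketch of the PCP arithmetic. The indicator weights follow ARWU's published scheme, but the component scores and the staff figure are invented for illustration, not TCD's or UCD's actual data.

    # Sketch of ARWU's PCP (per capita performance) indicator: the weighted
    # scores of the other five indicators are divided by the number of
    # full-time equivalent academic staff, then rescaled so that the top
    # university scores 100. All figures below are invented.
    scores  = {"Alumni": 15.0, "Award": 10.0, "HiCi": 12.0, "NS": 14.0, "PUB": 31.1}
    weights = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20, "NS": 0.20, "PUB": 0.20}
    fte_staff = 1_500                      # hypothetical academic staff count
    raw = sum(scores[k] * weights[k] for k in scores) / fte_staff
    top_raw = 0.02                         # hypothetical figure for the top scorer
    print(round(100 * raw / top_raw, 1))   # PCP score on ARWU's 0-100 scale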

The Shanghai rankings are, of course, famous for not counting the arts and humanities and for not trying to measure anything related to teaching. The RUR rankings from Russia are based on Thomson Reuters data, which THE also used until two years ago, and they do include publications in the humanities as well as teaching-related metrics. They have 12 of the 13 indicators in the THE World University Rankings, plus eight others, but with more sensible weightings, for example 8% instead of 30% for field-normalised citations.

The RUR rankings show that TCD rose from 174th overall in 2010 to 102nd in 2016 (193rd to 67th for research). UCD rose from 213th overall to 195th (157th to 69th for research), although some Irish universities, such as NUI Galway, NUI Maynooth, University College Cork, and Dublin City University, have fallen.

It is thoroughly disingenuous for Irish academics to claim that academic standards are declining because of a lack of funds. Perhaps standards will decline in the future. But so far everything suggests that the two leading Irish universities are making steady progress, especially in research.

The fall of UCD in this year's THE rankings, TCD's fall in 2015, and the fall of both in the QS rankings mean very little. When there are such large methodological changes it is pointless to discuss how to improve in the rankings. Methodological changes can be made and unmade, and universities made and unmade with them, as the Middle East Technical University found in 2015 when it fell from 85th place in the THE world rankings to below 501st.

The Irish Times of November 8th had an article by Philip O'Kane proposing that Irish universities should combine in some ways to boost their position in the global rankings.

He suggested that:
“The only feasible course of action for Ireland to avert continued sinking in the world rankings is to create a new “International University of Ireland”.

This could be a world-class research university that consists exclusively of the internationally-visible parts of all our existing institutions, and to do so at marginal cost using joint academic appointments, joint facilities and joint student registration, in a highly flexible and dynamic manner.

Those parts that are not internationally visible would be excluded from this International University of Ireland.”

It sounds as if he is proposing that universities maintain their separate identities for some purposes but present a united front internationally. A similar idea was proposed in India a while ago but was quickly shot down by Phil Baty of THE. It is most unlikely that universities could separate out data on faculty, students, income, and publications for their internationally visible parts and send it to the rankers.

The idea of a full merger is more practical but could be pointless or even counter-productive. In 2012 a group of experts headed by Frans van Vught suggested that UCD and TCD be merged to become a single world-class university.

The ironic thing about this idea is that a merger would help with the Shanghai rankings, which university bosses are studiously pretending do not exist, but would be of little or no use with the rankings that the bureaucrats and politicians do care about.

The Shanghai rankings are known for being as much about quantity as quality. A merger of TCD and UCD would produce a significant gain for the combined university in publications, papers in Nature and Science, and highly cited researchers. It would do no good for Nobel and Fields awards, since Trinity has two now and UCD none, so the new institution would still have only two (ShanghaiRanking does not count the Peace and Literature prizes). Overall, it is likely that the new Irish super-university would rise about a dozen places in the Shanghai rankings, perhaps even getting into the top 150 (TCD is currently 162nd).

But it would probably not help with the rankings that university heads are so excited about. Many of the indicators in the QS and THE rankings are scaled in some way. A merger would add together the citations of TCD and UCD, for instance, but QS divides citations by the number of faculty, which would also be combined. You could combine the incomes of TCD and UCD, but then the combined income would be divided by the combined staff numbers.
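
A quick arithmetic sketch of the point (all figures below are invented): a merged per-capita score always falls between the two partners' scores, so a merger can never lift the indicator above the better partner's figure.

    # Why a merger cannot lift a per-capita indicator such as QS's
    # citations per faculty. All figures are invented.
    tcd_citations, tcd_faculty = 60_000, 1_500
    ucd_citations, ucd_faculty = 70_000, 1_900
    print(tcd_citations / tcd_faculty)   # 40.0
    print(ucd_citations / ucd_faculty)   # ~36.8
    merged = (tcd_citations + ucd_citations) / (tcd_faculty + ucd_faculty)
    print(merged)                        # ~38.2, between the two, above neither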

The only place where a merger would make any difference is the survey indicators, 50% in QS and 33% in THE, but the problem here is that the reputation of a new University of Dublin or Ireland or whatever it is called is likely to be inferior to that of TCD and UCD for some years to come. There are places, for example France and Russia, where merging universities is a sensible way of pooling the strengths of a multitude of small specialist schools and research centres. But for Ireland there is little point if the idea is to get ahead in the QS and THE rankings.


It would make more sense for Irish universities to focus on the Shanghai rankings where, if present trends continue, TCD will catch up with Harvard in about 240 years, although by then the peaks of the intellectual world will probably be in Seoul, Shanghai, Moscow, Warsaw and Tallinn.

Saturday, October 08, 2016

Will North Korea Engage with the Rankings?

Kim Jong-un has declared that Kim Il-sung University must become a world-class institution. No doubt there will be chuckles at Oxford, Anglia Ruskin University, the University of Iceland and the Free University of Bozen-Bolzano, but it could be surprisingly easy if being world class means getting a high place in the rankings. After all, quite a few places now appearing in the various global and regional tables would have seemed just as surprising a few years ago.

First, I should mention that there already is a ranking in which Kim Il-sung University is listed: a ranking of international influence as measured by Google's ranking of search results, in which the institution is 254th.

Here is my plan for North Korea to become world class in just a few years.

1. Offer adjunct professorships to 150 researchers and ask them to put the university as a secondary affiliation. Maybe they can come and visit Pyongyang sometimes, but that is not really necessary. In a little while they will be producing 150 papers or more a year with the university's name on them, eventually reaching one thousand over a five-year period, the threshold for inclusion in the THE world rankings.

2. Make sure that one or two of those adjunct professors are involved in multi-author projects with large numbers of citations (but keep the author count below 1,000). Medicine is probably a better bet than physics at the moment. This will get a good score in the THE citations indicator.

3. Make sure that research funds to the university go through something with the word 'industry' in it. That way the university will go to the top of the THE Industry Income: Innovation indicator.

4. Don't forget the other rankings. Give the university a boost in the QS world rankings by drafting lots of research assistants who will count in the student-faculty ratio indicator.

5. Start a branch campus somewhere and get a high score in the international indicators that nearly everybody has nowadays. If the branch is in the USA, go for Princeton Review's top party school.

6. Send a few hundred closely supervised graduate students abroad and tell them they know what to do for the QS reputation survey. When they come back as faculty with a co-authored article or two tell them they know what to do for the THE survey.

7. When Kim Il-sung University is a rising star of the university world, try hosting a summit to rise even higher. Better make sure that hotel is finished though.

Saturday, September 24, 2016

The THE World University Rankings: Arguably the Most Amusing League Table in the World

If ever somebody does get round to doing a ranking of university rankings, and if entertainment value is an indicator, the Times Higher Education (THE) World University Rankings (WUR) stand a good chance of being at the top.

The latest global rankings contain many items that academics would be advised not to read in public places lest they embarrass the family by sniggering to themselves in Starbucks or Nandos.

THE would, for example, have us believe that St. George's, University of London is the top university in the world for research impact as measured by citations. This institution specialises in medicine, biomedical science and healthcare sciences. It does not do research in the physical sciences, the social sciences, or the arts and humanities and makes no claim that it does. To suggest that it is the best in the world across the range of scientific and academic research is ridiculous.

There are several other universities with scores for citations that are disproportionately higher than their research scores, a sure sign that the THE citations indicator is generating absurdity.  They include Brandeis, the Free University of Bozen-Bolzano, Clark University, King Abdulaziz University, Anglia Ruskin University, the University of Iceland, and Orebro University, Sweden.

In some cases, it is obvious what has happened. King Abdulaziz University has been gaming the rankings by recruiting large numbers of adjunct faculty whose main function appears to be listing the university as a secondary affiliation in order to collect a share of the credit for publications and citations. The Shanghai rankers have stopped counting secondary affiliations for their highly cited researchers indicator, but KAU is still racking up the points in other indicators and other rankings.

The contention that Anglia Ruskin University is tenth in the world for research impact, equal to Oxford, Princeton, and UC Santa Barbara, and just above the University of Chicago, will no doubt be met with donnish smirks at the high tables of that other place in Cambridge, itself 31st for citations, although there will probably be less amusement about Oxford being crowned best university in the world.

Anglia Ruskin's output of research is not very high, about a thirtieth of Chicago's according to the Web of Science Core Collection. Its faculty does, however, include one professor who is a frequent contributor to global medical studies with large numbers of authors, although never more than a thousand, and hundreds of citations a year. Single-handedly he has propelled the university into the research stratosphere, since the rest of the university generates few citations (there's nothing wrong with that: it's not that sort of place) and so the number of papers by which the normalised citations are divided is very low.
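
A sketch of the arithmetic (all figures invented): because an averaged citation-impact indicator divides total normalised impact by the number of papers, a handful of mega-cited papers in a small portfolio can dominate the result.

    # Invented figures showing how a small paper count amplifies a few
    # highly cited papers in a mean-normalised citation indicator.
    def mean_impact(impacts):
        return sum(impacts) / len(impacts)

    big_university   = [1.0] * 10_000              # world-average impact throughout
    small_university = [0.5] * 300 + [100.0] * 10  # ordinary papers plus 10 hits
    print(mean_impact(big_university))    # 1.0
    print(mean_impact(small_university))  # ~3.7, "better" than the giant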

The THE citations methodology is badly flawed. That university heads give any credence to rankings that include such ludicrous results is sad testimony to the decadence of the modern academy.

There are also many universities that have moved up or down by a disproportionate number of places. These include:

Peking University rising from 42nd to 29th
University of Maryland at College Park rising from 117th to 67th
Purdue University rising from 113th to 70th
Chinese University of Hong Kong rising from 138th to 76th
RWTH Aachen rising from 110th to 78th
Korean Advanced Institute of Science and Technology rising from 148th to 89th


Vanderbilt University falling from 87th to 108th
University of Copenhagen falling from 82nd to 120th
Scuola Normale Pisa falling from 112th to 137th
University of Cape Town falling from 120th to 148th
Royal Holloway, University of London falling from 129th to 173rd
Lomonosov Moscow State University falling from 161st to 188th


The point cannot be stressed too strongly that universities are large and complex organisations. Short of major restructuring, they do not change sufficiently in 12 months or less to produce movements such as these. The only way such instability could occur is through the entry into the rankings of universities with attributes different from the established ones, which changes the means from which standardised scores are derived, or through significant methodological changes.
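
A sketch of the first mechanism (scores invented): if raw indicator values are converted to z-scores, adding weaker new entrants lowers the mean and lifts every incumbent's standardised score even though nothing real has changed.

    # Invented raw indicator values showing how new entrants shift z-scores.
    import statistics

    def z_of_leader(values):
        mean, sd = statistics.mean(values), statistics.stdev(values)
        return (max(values) - mean) / sd

    incumbents = [80, 60, 50, 40, 30]
    print(round(z_of_leader(incumbents), 2))   # 1.46
    expanded = incumbents + [10, 8, 5, 4, 2]   # weaker newcomers join the ranking
    print(round(z_of_leader(expanded), 2))     # 1.85, with identical data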

There have in fact been significant changes to the methodology this year, although perhaps not as substantial as in 2015. First, books and book chapters are included in the count of publications and citations, an innovation pioneered by US News in its Best Global Universities. Almost certainly this has helped English-speaking universities with a comparative advantage in the humanities and social sciences, although THE's practice of bundling indicators together makes it impossible to say exactly how much. It would also work to the disadvantage of institutions such as Caltech that are comparatively less strong in the arts and humanities.

Second, THE have used a modest version of fractional counting for papers with more than a thousand authors. Last year they were not counted at all. This means that universities that have participated in mega-papers such as those associated with the Large Hadron Collider will get some credit for citations of those papers although not as much as they did in 2014 and before. This has almost certainly helped a number of Asian universities that have participated in such projects but have a generally modest research output. It might have benefitted some universities in California such as UC Berkeley.

Third, THE have combined the results of the academic reputation survey conducted earlier this year with that used in the 2015-16 rankings. Averaging reputation surveys is a sensible idea, already adopted by QS and US News in their global rankings, but one that THE has avoided until now.

This year's survey saw a very large reduction in the number of responses from researchers in the arts and humanities and a very large increase, for reasons unexplained, in the number of responses from business studies and the social sciences, separated now but combined in 2015.

Had the responses for 2016 alone been counted there might have been serious consequences for UK universities, relatively strong in the humanities, and a boost for East Asian universities, relatively strong in business studies. Combining the two surveys would have limited the damage to British universities and slowed down the rise of Asia to media-acceptable proportions.

One possible consequence of these changes is that UC Berkeley, eighth in 2014-15 and thirteenth in 2015-16, is now, as predicted here,  back in the top ten. Berkeley is host for the forthcoming THE world summit although that is no doubt entirely coincidental.

The overall top place has been taken by Oxford to the great joy of the vice-chancellor who is said to be "thrilled" by the news.

I do not want to be unfair to Oxford, but the idea that it is superior to Harvard, Princeton, Caltech or MIT is nonsense. Its strong performance in the THE WUR is in large measure due to the over-emphasis in these tables on reputation, income and a very flawed citations indicator. Its rise above Caltech to first place is almost certainly a result of this year's methodological changes.

Let's look at Oxford's standing in other rankings. The Round University Ranking (RUR) uses Thomson Reuters data just like THE did until two years ago. It has 12 of the indicators employed by THE and eight additional ones.

Overall, Oxford was 10th, up from 17th in 2010. In the teaching group of five indicators Oxford is in 28th place. Within that group its best performance was teaching reputation (6th) and its worst academic staff per bachelor degrees awarded (203rd).

In Research it was 20th, with places ranging from 6th for research reputation to 206th for doctoral degrees per admitted PhD. It was 5th for International Diversity and 12th for Financial Sustainability.

The Shanghai ARWU rankings have Oxford in 7th place and Webometrics in 10th (9th for Google Scholar Citations).

THE is said to be trusted by the great and the good of the academic world. The latest example is the Norwegian government including performance in the THE WUR as a criterion for overseas study grants. That trust seems largely misplaced. When the vice-chancellor of Oxford University is thrilled by a ranking that puts the university on a par for research impact with Anglia Ruskin then one really wonders about the quality of university leadership.

To conclude my latest exercise in malice and cynicism (thank you, ROARS), here is a game to amuse international academics.

Ask your friends which university in their country is the leader for research impact and then tell them who THE thinks it is.

Here are THE's research champions, according to the citations indicator:

Argentina: National University of the South
Australia: Charles Darwin University
Brazil: Universidade Federal do ABC (ABC refers to its location, not the courses offered)
Canada: University of British Columbia
China: University of Science and Technology of China
France: Paris Diderot University: Paris 7
Germany: Ulm University
Ireland: Royal College of Surgeons
Japan: Toyota Technological Institute
Italy: Free University of Bozen-Bolzano
Russia: ITMO University
Turkey: Atilim University
United Kingdom: St George's, University of London.



Monday, September 19, 2016

Update on previous post

The reputation data used by THE in the 2016 world rankings, for which the world is breathlessly waiting, is that used in their reputation rankings released last May and collected between January and March.

Therefore, the distribution of responses from disciplinary groups this year was 9% from the arts and humanities, 15% from the social sciences, and 13% from business (28% for the last two combined). In 2015 it was 16% from the arts and humanities and 19% from the social sciences (which then included business).

Since UK universities are relatively strong in the humanities and Asian universities relatively strong in business studies, the result was a shift in the reputation rankings away from the UK and towards Asia. Oxford fell from 3rd (score 80.4) to 5th (score 69.1) in the reputation rankings, and Bristol and Durham dropped out of the top 100, while Tsinghua University rose from 26th place to 18th, Peking University from 32nd to 21st, and Seoul National University from the 51-60 band to 45th.

In the forthcoming world rankings British universities (although threatened by Brexit) ought to do better because of the inclusion of books in the publications and citations indicators, and certain Asian universities, though by no means all, may do better because their citations for mega-projects will be partially restored.

Notice that THE have also said that this year they will combine the reputation scores for 2015 and 2016, something they have not done before. Presumably this will reduce the fall of UK universities in the reputation survey. Combined with the inclusion of books in the database, this may mean that UK universities do not fall this year and may even go up a bit (ATBB).

Thursday, September 15, 2016

Some predictions for the THE rankings and summit

Here are my predictions for the THE rankings on the 21st and the academic summit on the 26th-28th.

  • Donald Trump will not be invited to give a keynote address.
  • The decline of US public universities will be blamed on government spending cuts.
  • British universities will be found to be in mortal danger from Brexit and visa controls.
  • Phil Baty will give a rankings "masterclass" but will have to apologise to feminists because he couldn't think of anything else to call it.
  • The words 'prestige' and 'prestigious' will be used more times than in the novel by Christopher Priest or the film by Christopher Nolan.
  • The counting of books will help British universities, especially Oxford and Cambridge, but they will still be threatened by Brexit.
  • The partial reinclusion of citations of papers with 1,000+ authors, mainly in physics, will lead to a modest recovery of some universities in France, Korea, Japan and Turkey. The rise of Asia will resume.
  • Since the host city or university of THE summits somehow manages to get in the top ten, Berkeley will recover from last year's fall to 13th place. 
  • Last year the percentage of survey responses from the arts and humanities fell to 9% from 16%. I suspect that this year the fall might be reversed and that the reason THE are combining the reputation survey results for this year and 2015 is to reduce the swing back to UK universities, which are suffering because of visa controls and Brexit.
  • At least one of the above will be wrong.




Wednesday, September 07, 2016

The shadow of Brexit falls across the land


The western chattering and scribbling classes sometimes like to reflect on their superiority to the pre-scientific attitudes of the local peasantry: astrology, nationalism, religion and things like that. But it seems that the credentialled elite of Britain are now in the grip of a great fear of an all-pervading spirit called Brexit, whose malign power is unlimited in time and space.

Thus the Independent tells us that university rankings (QS in this case) show that "post Brexit uncertainty and long-term funding issues" have hit UK higher education.

The Guardian implies that Brexit has something to do with the decline of British universities in the rankings without actually saying so.

"British universities have taken a tumble in the latest international rankings, as concern persists about the potential impact of Brexit on the country’s higher education sector. "

Many British universities have fallen in the QS rankings this year, but the idea that Brexit has anything to do with it is nonsense. The Brexit vote was on June 23rd, well after QS's deadlines for submitting respondents for the reputation surveys and for updating institutional data. The citations indicator refers to the period 2011-2015.

The belief that rankings reveal the dire effects of funding cuts and immigration restrictions is somewhat more plausible but fundamentally untenable.

Certainly, British universities have taken some blows in the QS rankings this year. Of the 18 universities in the top 100 in 2015, two are in the same place this year, two have risen, and 14 have fallen. This is associated with a general decline in performance in the academic reputation indicator, which accounts for 40% of the overall score.

Of those 18 universities, three (Oxford, Cambridge and Edinburgh) hold the same rank in the academic reputation indicator, one (King's College London) has risen, and fourteen are down.

The idea that the reputation of British universities is suffering because survey respondents have heard that the UK government is cutting spending or tightening up on visa regulations is based on some unlikely assumptions about how researchers go about completing reputation surveys.

Do researchers really base their assessment of research quality on media headlines, often inaccurate and alarmist? Or do they make an honest assessment of performance over the last few years or even decades? Or do they vote according to their self interest, nominating their almae matres or former employers?

I suspect that the decline of British universities in the QS reputation indicator has little to do with perceptions about British universities and a lot more to do with growing sophistication about and interest in rankings in the rest of the world, particularly in East Asia and maybe parts of continental Europe.






Thursday, August 11, 2016

Value Added Ranking


There has been a lot of talk about ranking universities by factors other than the usual mix of contributions to research and innovation, reputation surveys and inputs such as spending, teaching resources or student quality.

The emerging idea is that universities should be assessed according to their ability to teach students or to inculcate desirable skills or attributes.

Much of this is powered by the growing awareness that American and European secondary schools are failing to produce sufficient numbers of students with the ability to undertake and complete anything that could realistically be called a university education. It is unlikely that this is the fault of the schools. The unavoidable verdict of recent research is that the problem with schools has very little to do with institutional racism, a lack of grit, resilience or the current X factor, or the failure to adopt Finnish, Chinese or Singaporean teaching methods. It is simply that students entering the school system are on average less intelligent than they were, and those leaving are consequently also less intelligent.

There is now a market for rankings that will measure the quality of universities not by their resources, wealth or research output but by their ability to add value to students and to prepare them for employment or to enable them to complete their courses.

This could, however, lead to massively perverse consequences. If universities are assessed according to the percentage of entrants who graduate within a certain period, or according to their graduates' employability, then there could be a temptation to dilute graduation requirements.

Nevertheless, the idea of adding value is clearly becoming more popular. It can be seen in the attempt to introduce a national rating system in the US and in the UK's proposal to use the Teaching Excellence Framework (TEF) to rank universities.

One UK ranking that includes a value-added measure is the Guardian University Guide. This includes eight indicators, three of which measure student satisfaction. Other indicators are staff-student ratio and spending per student. There is also a measure of student outcomes (graduate-level employment or entry into a postgraduate course within six months), a measure of student quality (A-level qualifications on entry), and a measure of value added (the difference between students' entry-level exam results and their eventual degree results).

It is therefore possible to get a rough idea of what factors might actually produce positive student outcomes.

The overall ranking for 2015-16 starts by being quite conventional, with the top three places going to Cambridge, Oxford and St Andrews. Some might be surprised by Exeter in 9th place and Loughborough in 11th, ahead of LSE and UCL.

Measuring student quality by exam scores produces unsurprising results at the top. Cambridge is first followed by Oxford and Imperial. For staff student ratio the top three are UCL, Oxford and SOAS and for spending per student Oxford, Cambridge and the University of the Arts London.

For student satisfaction with courses, Bath, Keele and UEA are in the lead while Oxford is 5th and Cambridge 12th. It's when we look at the Value Added that we find some really unusual results. The top three are Gloucester, Edinburgh and Abertay.

After plugging the indicator scores into an SPSS file, we can calculate the correlations between the desired outcome, that is graduate-level employment or postgraduate study, and a variety of possible associated factors; a short code sketch of the calculation follows the list below.

Here in descending order are the correlations with career prospects:

average entry tariff .820
student staff ratio .647
spending per student .569
satisfaction with course  .559
satisfaction with teaching   .531
value added .335
satisfaction with feedback -.171.
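
These correlations are easy to reproduce in any statistics package; here is a minimal sketch in Python with pandas, where the file name and column names are hypothetical stand-ins for the Guardian's published tables.

    # Sketch of the correlation exercise. 'guardian.csv' and the column
    # names below are hypothetical, not the Guardian's actual labels.
    import pandas as pd

    df = pd.read_csv("guardian.csv")
    predictors = ["entry_tariff", "staff_student_ratio", "spend_per_student",
                  "course_satisfaction", "teaching_satisfaction",
                  "value_added", "feedback_satisfaction"]
    # Pearson correlation of each predictor with the outcome column
    print(df[predictors].corrwith(df["career_prospects"])
            .sort_values(ascending=False))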

It would seem that if you want to know which university is best for career prospects, then the most important piece of data is the average academic ability of the students. The staff-student ratio and money spent are also significant, as is satisfaction with courses and teaching.

The correlation between value added and career prospects is much less and rather modest.

The universities were divided into thirds according to average entry tariff. In the top third of universities there was a strong correlation between career prospects and average entry level tariff, .628, and a modest one with spending, .355. Nothing else was associated with career success.

In the middle third the factor most associated with career prospects was course satisfaction, .498, followed by average entry tariff, .449, staff student ratio, .436, and satisfaction with teaching, .362. Satisfaction with feedback and value added were insignificant.

However, for the least selective third of universities the picture was rather different. The factor most strongly associated with career success was satisfaction with feedback, .493, followed by value added, .479, course satisfaction, .470, satisfaction with teaching, .439, and average entry tariff, .401. The relationship with spending and staff-student ratio was insignificant.
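
The split into thirds can be sketched the same way, continuing the hypothetical DataFrame from the previous sketch.

    # Divide universities into thirds by entry tariff and correlate within
    # each third; reuses the hypothetical df and predictors defined above.
    df["third"] = pd.qcut(df["entry_tariff"], 3,
                          labels=["bottom", "middle", "top"])
    for name, group in df.groupby("third"):
        r = group[predictors].corrwith(group["career_prospects"])
        print(name, r.round(3).to_dict())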

The evidence of the Guardian rankings is that value added would be of interest only to students at or applying to the least selective third of UK universities. For the rest it is of no importance. It is debatable whether it is worth making it the centre of a new set of rankings.












Sunday, May 15, 2016

The THE reputation rankings: Much ado about not very much

Every so often, especially in North America and Western Europe, there is a panic about the impact of government policies on higher education, usually the failure to provide as much money as universities want, or sometimes as many overseas students as they need to fill lecture halls or cover budget deficits. Global university rankings have a lot to do with the onset and spread of these panics.

True to form, the British  "quality" media have been getting into a tizzy over the latest edition of the Times Higher Education (THE) world reputation ranking. According to Javier Espinoza, education editor of the Telegraph, top UK universities have been under pressure to admit minority and state school students and have also had difficulty in recruiting foreign students. This has somehow caused them to forget about doing research or teaching the most able students. It seems that academics from countries around the world, where such problems are of course unknown, are reacting by withholding their votes from British universities when responding to the THE survey and transferring their approval to the rising stars of Asia.

This supposedly has caused UK institutions to slide down the rankings and two of them, Bristol and Durham, have even dropped out of the top 100 altogether into the great dark pit of the unranked.

The Guardian notes that Oxford and Cambridge are falling and are now only just in the world's top five, while the Independent quotes Phil Baty as saying that "our evidence - from six massive global surveys over six years, including the views of more than 80,000 scholars - proves the balance of power in higher education and research is slowly shifting from the West to the East".

This, it would seem, is all because of cuts in funding and restrictions on the entry of overseas students and faculty.

All this is rather implausible. First of all, these are reputation rankings. They refer to just one indicator, which accounts for 33 percent of the World University Rankings that will appear later this year. It is not certain that the other indicators will move in the same direction.

Secondly, these rankings have not been standardised as they will be when included in the world rankings. This means that the huge gap between the Big Six (Harvard, MIT, Berkeley, Stanford, Oxford and Cambridge) and the rest is laid bare, as it will not be in the autumn, and so we can get a rough idea of how many academics were voting for each university. A crude guess is that when we get down to around 50th place the number of votes is around five hundred, and even fewer when we reach 100th place.

This means that below the 50 mark a shift in the opinion of a few dozen respondents could easily push a university up or down into a new band or even into or out of the top 100.

Another thing we should remember is that the expertise of the researchers in the Scopus database, from which respondents are drawn, is  exaggerated. The qualification for receiving a survey form is being the corresponding author of a publication listed in the Scopus database. There is much anecdotal evidence that in some places winning research grants or getting the corresponding author slot has more to do with politics than with merit. The THE survey is better than QS's, which allows anyone with an academic email address to take part, but it does not guarantee that every respondent is an unbiased and senior researcher.

We should also note that, unlike the US News and QS survey indicators, THE takes no measures to damp down year to year fluctuations. Nor does it do anything to prevent academics from supporting their own universities in the survey.

So, do we really need to get excited about a few dozen "senior researchers" withdrawing their support from British universities?

The credibility of these rankings is further undermined by apparent changes in the distribution of responses by subject group. According to the methodology page in Times Higher Education for 2015, 16% of the responses were from the arts and humanities and 19% from the social sciences, which in that year included business studies and economics. This year, according to the THE methodology page, 9% of the responses were from the arts and humanities, 15% from the social sciences, and 13% from business and economics, adding up to 28%.

In other words, the responses from the arts and humanities have apparently fallen by 7 percentage points, or around 700 responses (implying roughly 10,000 responses in all), and the combined responses from the social sciences and business and economics have apparently risen by nine points, or about 900 responses.

If these numbers are accurate then there has been among survey respondents a very substantial shift from the arts and humanities to the social sciences (inclusive of business and economics), and it is possible that this could be sufficient to cause the recorded decline in the reputation scores of British universities, which usually do much better in the arts and humanities than in the social sciences.

In the subject group tables of the THE 2015-16 World University Rankings, Durham, for example, was 28th for the arts and humanities and 36th for the social sciences. Exeter was 71st for the arts and humanities and 81st for the social sciences.

At the same time, some of those rising Asian universities were definitely stronger in the social sciences than in the humanities: Peking was 52nd for the social sciences and 84th for the arts and humanities, Hong Kong 39th for the social sciences and 44th for the arts and humanities, and Nanyang Technological University 95th for the social sciences and outside the top 100 for the arts and humanities.

It is possible that such a symmetrical change could be the result of changes in the way disciplines are classified or even a simple transposition of data. So far, THE have given no indication that this was the case.

It is interesting that an exception to the narrative of British decline is the London Business School, which has risen from the 91-100 band to 81-90.

The general claim that the views of 80,000 academics over six years are evidence of a shift from west to east is also somewhat tenuous. There have been several changes in the collection and organisation of data over the last few years that could affect the outcomes of the reputation survey.

Between 2010-2011 and 2016 the percentage of responses from the social sciences (originally including business and economics) has risen from 19% to 28% (social sciences plus business and economics, now counted separately, combined). Those for the clinical and health sciences and the life sciences have fallen somewhat, while there has been a slight rise for the arts and humanities, with a large spike in 2015.

The proportion of responses from the Asia Pacific region and the Middle East has risen from 25% to 36%, while those from the Americas (North and Latin) have fallen from 44% to 25%. The number of languages in which the survey is administered has increased from eight in 2011 to fifteen this year.

The source of respondents has shifted from the Thomson Reuters Web of Science to Scopus, which includes more publications from languages other than English.

The value of these changes is not disputed here, but they should make everybody very cautious about using the reputation rankings to make large claims about what is happening to British universities or about the causes of their problems.




Monday, April 18, 2016

Round University Rankings


The latest Round University Rankings have been released by the Russian company RUR Rankings Agency. These are essentially holistic rankings that attempt to go beyond the measurement of research output and quality. There are twenty indicators, although some of them, such as Teaching Reputation, International Teaching Reputation and Research Reputation, or International Students and International Bachelors, are so similar that the additional information they provide is limited.

Basically these rankings cover much the same ground as the Times Higher Education (THE) World University Rankings. The income from industry indicator is not included but there are an additional eight indicators. The data is taken from Thomson Reuters' Global Institutional Profiles Project (GIPP) which was used by THE for their rankings from 2010 to 2014.

Unlike THE, which lumps its indicators together into groups, the RUR lists the scores for each indicator separately in the university profiles. In addition, the rankings provide data for seven continuous years, from 2010 to 2016. This provides an unusual opportunity to examine in detail the development of universities over a period of seven years, as measured by 20 indicators. This is not the case with other rankings, which have fewer indicators or have changed their methodology.

It should be noted that participation in the GIPP is voluntary and therefore the universities in each edition could be different. For example, in 2015, 100 universities dropped out of the project and 62 joined.

It is, however, possible to examine a number of claims that have been made about changes in university quality over the last few years. I will take a look at these in the next few posts.

For the moment, here are the top five in the overall rankings and the dimension rankings.

Overall
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Chicago


Teaching
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Duke

Research
1.   Caltech
2.   Harvard
3.   Stanford
4.   Northwestern University
5.   Erasmus University Rotterdam

International Diversity
1.   EPF Lausanne
2.   Imperial College London
3.   National University of Singapore
4.   University College London
5.   Oxford

Financial Sustainability
1.   Caltech
2.   Harvard
3.   Scuola Normale Superiore Pisa
4.   Pohang University of Science and Technology
5.   Karolinska Institute

Unfortunately these rankings have received little or no recognition outside Russia. Here are some examples of the coverage within Russia.


MIPT entered the top four universities in Russia according to the Round University Ranking

Russian Universities in the lead in terms of growth in the international ranking of Round University Ranking

TSU [Tomsk State University]  has entered the 100 best universities for the quality of teaching

[St Petersburg]

Russian universities to top intl rankings by 2020 – Education Minister Livanov to RT


Friday, November 13, 2015

Are global rankings losing their credibility? (from WONK HE)


Originally published in WONK HE 27/10/2015




The international university ranking scene has become increasingly complex, confusing and controversial. It also seems that the big name brands are having problems balancing popularity with reliability and validity. All this is apparent from the events of the last two months, which have seen the publication of several major rankings.
The first phase of the 2015 global ranking season ended with the publication of the US News's (USN) Best Global Universities. We have already seen the 2015 editions of the big three brand names, the Academic Ranking of World Universities (ARWU) produced by the Centre for World-Class Universities at Shanghai Jiao Tong University, the Quacquarelli Symonds (QS) World University Rankings and the Times Higher Education (THE) World University Rankings. Now a series of spin-offs has begun.
In addition, a Russian organisation, Round University Ranking (RUR), has produced another set of league tables. Apart from a news item on the website of the International Ranking Expert Group these rankings have received almost no attention outside Russia, Eastern Europe and the CIS. This is very unfortunate since they do almost everything that the other rankings do and contain information that the others do not.
One sign of the growing complexity of the ranking scene is that USN, QS, ARWU and THE are producing a variety of by-products, including rankings of new universities, subject rankings, best cities for students, reputation rankings and regional rankings, with no doubt more to come. They are also assessing more universities than ever before. THE used to take pride in ranking only a small elite group of world universities. Now they talk about being open and inclusive and have ranked 800 universities this year, as did QS, while USN has expanded from 500 to 750 universities. Only the Shanghai rankers have remained content with a mere 500 universities in their general rankings.
Academic Ranking of World Universities (ARWU)
All three of the brand name rankings have faced issues of credibility. The Shanghai ARWU has had a problem with the massive recruitment of adjunct faculty by King Abdulaziz University (KAU) in Jeddah. This was initially aimed at the highly cited researchers indicator in the ARWU, which simply counts the number of researchers affiliated to universities, no matter whether their affiliation has lasted an academic lifetime or began the day before ARWU did the counting. The Shanghai rankers deftly dealt with this issue by simply not counting secondary affiliations in the new lists of highly cited researchers supplied by Thomson Reuters in 2014.
That, however, did not resolve the problem entirely. Those researchers have not stopped putting KAU as a secondary affiliation and even if they no longer affected the highly cited researchers indicator they could still help a lot with publications and papers in Nature and Science, both of which are counted in the ARWU. These part-timers – and some may not even be that – have already ensured that KAU, according to ARWU, is the top university in the world for publications in mathematics.
The issue of secondary affiliation is one that is likely to become a serious headache for rankers, academic publishers and databases in the next few years. Already, undergraduate teaching in American universities is dominated by a huge reserve army of adjuncts. It is not impossible that in the near future some universities may find it very easy to offer minimal part-time contracts to talented researchers in return for listing as an affiliation and then see a dramatic improvement in ranking performance.
ARWU’s problem with the highly cited researchers coincided with Thomson Reuters producing a new list and announcing that the old one would no longer be updated. Last year, Shanghai combined the old and new lists and this produced substantial changes for some universities. This year they continued with the two lists and there was relatively little movement in this indicator or in the overall rankings. But next year they will drop the old list altogether and just use the new one and there will be further volatility. ARWU have, however, listed the number of highly cited researchers in the old and new lists so most universities should be aware of what is coming.
Quacquarelli Symonds (QS) World University Rankings
The Quacquarelli Symonds (QS) World University Rankings have been regarded with disdain by many British and American academics although they do garner some respect in Asia and Latin America. Much of the criticism has been directed at the academic reputation survey which is complex, opaque and, judging from QS’s regular anti-gaming measures, susceptible to influence from universities. There have also been complaints about the staff student ratio indicator being a poor proxy for teaching quality and the bias of the citations per faculty indicator towards medicine and against engineering, the social sciences and the arts and humanities.
QS have decided to reform their citations indicator by treating the five large subject groups as contributing equally to the indicator score. In addition, QS omitted papers, most of them in physics, with very large numbers of listed authors, and averaged responses to the surveys over a period of five years in an attempt to make the rankings less volatile.
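A simplified sketch of what equal subject-group weighting means in practice (the real QS procedure involves more steps; all figures here are invented):

    # Simplified sketch of QS-style faculty-area normalisation: each of the
    # five subject groups contributes equally to the citations score, so
    # citation-heavy fields such as medicine can no longer dominate.
    # All figures are invented.
    area_scores = {"arts": 40, "engineering": 70, "life_sciences": 55,
                   "natural_sciences": 90, "social_sciences": 35}
    overall = sum(area_scores.values()) / len(area_scores)  # each area = 20%
    print(overall)   # 58.0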
The result of all this was that some universities rose and others fell. Imperial College London went from 2nd to 8th while the London School of Economics rose from 71st to 35th. In Italy, the Polytechnics of Milan and Turin got a big boost while venerable universities suffered dramatic relegation. Two Indian institutions moved into the top two hundred; some Irish universities, such as Trinity College Dublin, University College Dublin and University College Cork, went down, while others, such as the National University of Ireland Galway and the University of Limerick, went up.
There has always been a considerable amount of noise in these rankings resulting in part from small fluctuations in the employer and academic surveys. In the latest rankings these combined with methodological changes to produce some interesting fluctuations. Overall the general pattern was that universities that emphasise the social sciences, the humanities and engineering have improved at the expense of those that are strong in physics and medicine.
Perhaps the most remarkable of this year’s changes was the rise of two Singaporean universities, the National University of Singapore (NUS) and Nanyang Technological University (NTU), to 12th and 13th place respectively, a change that has met with some scepticism even in Singapore. They are now above Yale, EPF Lausanne and King’s College London. While the changes to the citations component were significant, another important reason for the rise of these two universities was their continuing remarkable performance in the academic and employer surveys. NUS is in the top ten in the world for academic reputation and employer reputation with a perfect score of 100, presumably rounded up, in each. NTU is 52nd for the academic survey and 39th for employer with scores in the nineties for both.
Introducing a moderate degree of field normalisation was probably a smart move. QS were able to reduce the distortion resulting from the database’s bias to medical research without risking the multiplication of strange results that have plagued the THE citations indicator. They have not, however, attempted to reform the reputation surveys which continue to have a combined 50% weighting and until they do so these rankings are unlikely to achieve full recognition from the international academic community.
Times Higher Education (THE) World University Rankings
The latest THE world rankings were published on September 30th and, like QS, THE have done some tweaking of their methodology. They had broken with Thomson Reuters at the end of 2014 and started using data from Scopus, while doing the analysis and processing in-house. They were able to analyse many more papers and citations and to conduct a more representative survey of research and postgraduate supervision. In addition, they omitted papers with very large numbers of authors and citations and reduced the impact of the "regional modification".
Consequently there was a large dose of volatility. The results were so different from those of 2014 that they seemed to reflect an entirely new system. THE did, to their credit, do the decent thing and state that direct comparisons should not be made to previous years. That, however, did not stop scores of universities and countries around the world from announcing their success. Those that had suffered have for the most part kept quiet.
There were some remarkable changes. At the very top, Oxford and Cambridge surged ahead of Harvard which fell to sixth place. University College Dublin, in contrast to the QS rankings, rose as did Twente and Moscow State, the Karolinska Institute and ETH Zurich.
On the other hand, many universities in France, Korea, Japan and Turkey suffered dramatic falls. Some of those universities had been participants in the CERN projects and so had benefitted in 2014 from the huge number of citations derived from their papers. Some were small and produced few papers so those citations were divided by a small number of papers. Some were located in countries that performed poorly and so got help from a “regional modification” (the citation impact score of the university is divided by the square root of the average citation impact score of the whole country). Such places suffered badly from this year’s changes.
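The regional modification described in the parenthesis above can be written in a line (figures invented):

    # THE's "regional modification" as described above: a university's
    # citation impact is divided by the square root of its country's
    # average impact. Figures are invented.
    import math

    university_impact = 0.9
    country_average = 0.49   # a weakly performing country
    print(university_impact / math.sqrt(country_average))   # ~1.29, a large bonus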
It is a relief that THE have finally done something about the citations indicator and it would be excellent if they continued with further reforms such as fractional counting, reducing the indicator’s overall weighting, not counting self-citations and secondary affiliations and getting rid of the regional modification altogether.
Unfortunately, if the current round of reforms represent an improvement, and on balance they probably do, then the very different results of 2014 and before, call into question THE’s repeated claims to be trusted, robust and sophisticated. If the University of Twente deserves to be in the top 150 this year then the 2014 rankings which had them outside the top 200 could not possibly be valid. If the Korean Advanced Institute of Science and Technology (KAIST) fell 66 places then either the 2015 rankings or those of 2014 were inaccurate, or they both were. Unless there is some sort of major restructuring such as an amalgamation of specialist schools or the shedding of inconvenient junior colleges or branch campuses, large organisations like universities simply do not and cannot change that much over the course of 12 months or less.
It would have been more honest, although probably not commercially feasible, for THE to declare that they were starting with a completely new set of rankings and to renounce the 2009-14 rankings in the way that they had disowned the rankings produced in cooperation with QS between 2004 and 2008. THE seem to be trying to trade on the basis of their trusted methodology while selling results suggesting that that methodology is far from trustworthy. They are of course doing just what a business has to do. But that is no reason why university administrators and academic experts should be so tolerant of such a dubious product.
These rankings also contain quite a few small or specialised institutions that would appear to be on the borderline of a reasonable definition of an "independent university with a broad range of subjects": Scuola Normale Superiore di Pisa and Scuola Superiore Sant'Anna, both part of the University of Pisa system, Charité-Universitätsmedizin Berlin, an affiliate of two universities, St George's, University of London, a medical school, Copenhagen Business School, Rush University, the academic branch of a private hospital in Chicago, the Royal College of Surgeons in Ireland, and the National Research Nuclear University (MEPhI) in Moscow, specialising in physics. Even if THE have not been too loose about who is included, the high scores achieved by such narrowly focussed institutions call the validity of the rankings into question.
Round University Rankings
In general the THE rankings have received a broad and respectful response from the international media and university managers, and criticism has largely been confined to outsiders and specialists. This is in marked contrast to the rankings released by a Russian organisation early in September. These are based entirely on data supplied by Thomson Reuters, THE's data provider and analyst until last year. They contain a total of 20 indicators, including 12 of the 13 in the THE rankings. Unlike THE, RUR do not bundle indicators together in groups, so it is possible to tell exactly why universities are performing well or badly.
The RUR rankings are not elegantly presented, but the content is more transparent than THE's, more comprehensive than QS's, and apparently less volatile than either. It is a strong indictment of the international higher education establishment that these rankings are ignored while THE's are followed so avidly.
Best Global Universities
The second edition of the US News’s Best Global Universities was published at the beginning of October. The US News is best known for the ranking of American colleges and universities and it has been cautious about venturing into the global arena. These rankings are fairly similar to the Shanghai ARWU, containing only research indicators and making no pretence to measure teaching or graduate quality. The methodology avoids some elementary mistakes. It does not give too much weight to any one indicator, with none getting more than 12.5%, and measures citations in three different ways. For eight indicators log manipulation was done before the calculation of z-scores to eliminate outliers and statistical anomalies.
This year US News went a little way towards reducing the rankers’ obsession with citations by including conferences and books in the list of criteria.
Since they do not include any non-research indicators these rankings are essentially competing with the Shanghai ARWU and it is possible that they may eventually become the first choice for internationally mobile graduate students.
But at the moment it seems that the traditional media and the higher education establishment have lost none of their fascination with the snakes and ladders game of THE and QS.