Sunday, June 18, 2017

Comparing the THE and QS Academic Reputation Surveys

Times Higher Education (THE) has just published its 2017 reputation rankings which include 100 universities. These are based on a survey distributed between January and March of this year and will be included, after standardisation, in the 2017-18 (or 2018) World University Rankings scheduled for publication in a few months. In the forthcoming world rankings the reputation survey will be divided into two metrics in the research and teaching indicator groups, with a combined weighting of 33 percent. The survey asked about research and postgraduate teaching but since the correlation between these two questions is very high there is effectively only one indicator.
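THE says the survey results are standardised before they enter the world rankings. It does not publish the exact procedure, but a common approach to this kind of standardisation is the z-score, which expresses each university's raw vote count relative to the mean and spread of the whole field. A minimal sketch, purely illustrative (the vote counts are made up and this is an assumption about the method, not THE's documented formula):

```python
import statistics

def standardise(scores):
    """Convert raw reputation vote counts to z-scores, so each
    university is measured in standard deviations from the mean."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [(s - mean) / sd for s in scores]

# Hypothetical raw vote counts for five universities
raw = [1200, 300, 150, 90, 60]
z = standardise(raw)
```

One effect of this kind of transformation is to compress the huge raw gap between the super-brands and everyone else into a more manageable scale.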

The QS world rankings released last week included scores derived from two surveys, one of academics with a 40% weighting and one of employers with 10%. The academic survey was concerned only with research.

The methodology of the THE survey is relatively simple. Respondents are drawn from the database of researchers with publications in Scopus-indexed journals, in other words those who get to be listed as corresponding authors. THE claims that this makes them experienced senior researchers, although in many parts of the world being a member or leader of a research team often has more to do with politics than merit.

In contrast, the QS methodology has changed quite a lot over the last few years. It began by scouring the mailing lists of World Scientific, a Singapore-based academic publisher with links to Imperial College London, then added various other channels, including lists supplied by institutions and sign-up facilities for potential respondents. The result is a survey that appears more inclusive than THE's, with more respondents from outside the elite, but one whose validity may be rather suspect.

The THE ranking found that there were six super-brand universities that stood out from everyone else: Harvard, MIT, Stanford, Cambridge, Oxford, and Berkeley. There was a big gap between Berkeley and number seven, Princeton, after which the scores decline in a long smooth slope.

After that, the ranking is dominated by English speaking universities, with the USA contributing 42, the UK 10, Canada 3 and Australia 3.  East Asia and the Chinese diaspora (Hong Kong, Taiwan and Singapore) are fairly well represented, while South and Central Asia, the Middle East and Africa are absent.

For any survey a great deal depends on how the forms are distributed. Last year, the THE survey had a lot more responses from the social sciences, including economics and business studies, and fewer from the arts and the humanities, and that contributed to some Asian universities rising and some British ones falling.

Such falls are typically attributed in the education establishment media to anxiety about the looming horrors of Brexit, the vicious snatching of research funds and the rising tide of hostility to international students.

This year British universities did a bit better in the THE reputation ranking, with five going up, three staying put and three going down. No doubt we will soon hear about the invigorating effects of Brexit and the benefits of austerity. Perhaps it might also have something to do with the share of survey responses from the arts and humanities going up from 9% to 12.5%, something that would surely benefit UK universities.

The QS reputation indicator has the same universities in the top six, but not in quite the same order: Cambridge, fourth in THE, is second in the QS indicator. After that it starts looking very different. Number seven is the University of Tokyo, which THE puts in 11th place for academic reputation. Other Asian universities also do much better in the QS indicator: the National University of Singapore is 11th (27th in THE), Nanyang Technological University Singapore is 50th (THE 81-90 band), Peking University is 14th (THE 17th) and Chulalongkorn University Thailand is 99th (not in the THE top 100).

It is noticeable that Latin American universities such as the University of Sao Paulo, the University of Buenos Aires and the Pontifical Catholic University of Chile get a higher placing in the QS indicator than they do in the THE ranking, as do some Southern European universities such as Barcelona, Sapienza and Bologna.

The THE reputation ranking gives us a snapshot of the current views of the world's academic elite and probably underestimates the rising universities of Greater China and Korea. QS cast their nets further and have probably caught a few of tomorrow's world class institutions although I suspect that the Latin American high fliers, apart from Sao Paulo, are very overrated.




Thursday, June 15, 2017

The Abuse and Use of Rankings

International university rankings have become a substantial industry since the first appearance of the Shanghai rankings (Academic Ranking of World Universities or ARWU) back in 2003. The various rankings are now watched closely by governments and media, and for some students they play a significant role in choosing universities. They have become a factor in national higher education policies and an important element in the race to enter and dominate the lucrative transnational higher education market. In Malaysia a local newspaper, Utusan Malaysia, recently devoted a full page to the latest QS world rankings, including a half page of congratulations from the Malaysian Qualification Agency to nine universities that are part of a state-backed export drive.

Reaction to international rankings often goes to one of two extremes, either outright rejection or uncritical praise, sometimes descending into grovelling flattery that would make Uriah Heep ashamed ("the revered QS rankings", Phil Baty "a thought leader"). The problem with the first, which is certainly very understandable, is that it is unrealistic. If every international ranking suddenly stopped publication we would just have, as we did before, an informal ranking system based largely on reputation, stereotypes and prejudice.

On the other hand, many academics and bureaucrats find rankings very useful. It is striking that university administrators, the media and national governments have been so tolerant of some of the absurdities that Times Higher Education (THE) has announced in recent years. Recently, THE's Asian rankings had Veltech University as the third best university in India and the best in Asia for research impact, the result of exactly one researcher assiduously citing himself. This passed almost unnoticed in the Indian press and seems to have aroused no great interest among Indian academics apart from a couple of blog posts. Equally, when Universiti Tunku Abdul Rahman (UTAR), a private Malaysian university, was declared to be the second best university in the country and the best for research impact, on the strength of a single researcher's participation in a high-profile global medical project, there was no apparent response from anyone.

International rankings have also become a weapon in the drive by universities to maintain or increase their access to public funds. British and Irish universities often complain that their fall in the rankings is all the fault of the government for not providing enough money. Almost any result in the better known rankings can be used to prop up the narrative of western universities starved of funds and of international researchers and students.

Neither of these two views is really valid. Rankings can tell us a great deal about the way that higher education and research are going. The early Shanghai rankings indicated that China was a long way behind the West and that research in continental Europe was inferior to that in the USA. A recent analysis by the Nature Index shows that American research is declining and that the decline is concentrated in diverse, Democrat-voting states such as California, Massachusetts, Illinois and New York.

But if university rankings are useful, they are not equally so, and neither are the various indicators from which they are constructed.

Ranking indicators that rely on self-submitted information should be mistrusted. Even if everybody concerned is fanatically honest, there are many ways in which data can be manipulated, massaged, refined, defined and redefined, analysed and distorted as it makes its way from branch campuses, affiliated colleges and research institutes through central administration to the number-munching programs of the rankers.

Then of course there are the questionable validation processes within the ranking organisations. There was a much-publicised case concerning Trinity College Dublin, where for two years in a row the rankers missed an error of several orders of magnitude in the data submitted for three income indicators.

Any metric that measures inputs rather than outputs should be approached with caution, including THE's measures of income, which amount to a total weighting of 10.75%. THE and QS both have indicators that count staff resources. It is interesting to have this sort of information, but there is no guarantee that having loads of money or staff will lead to quality, whether of research, teaching or anything else.

Reputation survey data is also problematic. It is obviously subjective, although that is not necessarily a bad thing, and everything depends on the distribution of responses between countries, disciplines, subjects and levels of seniority. Take a look at the latest QS rankings and the percentages of respondents from various countries.

Canada has 3.5% of survey respondents and China has 1.7%.
Australia has 4% and Russia 4.2%.
Kazakhstan has 2.1% and India 2.3%.

There ought to be a sensible middle road between rejecting rankings altogether and passively accepting the errors, anomalies and biases of the popular rankers.

Universities and governments should abide by a self-denying ordinance and reject ranking results that challenge common sense or contradict accepted national rankings. I remember a few years ago someone at Duke University saying that they were puzzled why the THES-QS rankings put the school in first place for faculty-student ratio when this contradicted data in the US News rankings. Few, if any, major universities or higher education ministers seem to have done anything like this lately.

It would also be a good idea if universities and governments stopped looking at rankings holistically and started setting targets according to specific indicators. High-flying research universities could refer to the Leiden Ranking, the Nature Index or the Nature and Science publications indicator in ARWU. New universities could target a place in the Excellence indicator of the Webometrics rankings, which lists 5,777 institutions as having some sort of research presence.

As for the teaching mission, the most directly relevant indicators are the QS employer survey in the world rankings, the QS Graduate Employability Rankings, and the Global University Employability Ranking published by THE.


Governments and universities would be advised not to get too excited about a strong performance in the rankings. What the rankings have given, the rankings can take away.


            

Monday, May 29, 2017

Ten Universities with a Surprisingly Large Research Impact

Every so often newspapers produce lists of universities that excel in or are noteworthy for something. Here is a list of ten universities that, according to Times Higher Education (THE), have achieved remarkable success in the world of global research. In a time of austerity when the wells of patronage are running dry, they should be an example to us all: they have achieved a massive global research impact, measured by field-normalised citations, despite limited funding, minimal reputations and few or very few publications. The source is the THE World and Asian rankings citations indicator.

1. First on the list is Alexandria University in Egypt, 4th in the world with a near-perfect score for research impact in 2010-11.

2. In the same year Hong Kong Baptist University was tenth for research impact, ahead of the University of Chicago and the University of Hong Kong.

3. In 2011-12 Royal Holloway, University of London, was in 12th place, ahead of any other British or European institution.

4. The National Research Nuclear University MEPhI, in Moscow, a specialist institution, was top of the table for citations in 2012-13.

5. In 2013-14 and 2014-15 Tokyo Metropolitan University had a perfect score of 100 for citations, a distinction shared only with MIT.

6. In 2014-15 Federico Santa Maria Technical University was sixth in the world for research impact and first in Latin America with a near perfect score of 99.7.

7. In the same year Bogazici University in Turkey reached the top twenty for research impact.

8. St George's, University of London, was the top institution in the world for research impact in 2016-17.

9. In that year Anglia Ruskin University, a former art school, was tenth for this metric, equal to Oxford and well ahead of the other university in Cambridge.

10. Last year's THE Asian rankings saw Vel Tech University in Chennai achieve the highest impact of any Asian university. 

Sunday, May 28, 2017

The View from Leiden

Ranking experts are constantly warning about the grim fate that awaits the universities of the West if they are not provided with all the money that they want and given complete freedom to hire staff and recruit students from anywhere that they want. If this does not happen they will be swamped by those famously international Asian universities dripping with funds from indulgent patrons.

The threat, if we are to believe the prominent rankers of Times Higher Education (THE), QS and Shanghai Ranking Consultancy, is always looming but somehow never quite arrives. The best Asian performer in the THE world rankings is the National University of Singapore (NUS) in 24th place, followed by Peking University in 29th. The QS World University Rankings have NUS 12th, Nanyang Technological University 13th and Tsinghua University 24th. The Academic Ranking of World Universities published in Shanghai puts the University of Tokyo in 20th place and Peking University in 71st.

These rankings are in one way or another significantly biased towards Western European and North American institutions and against Asia. THE has three separate indicators that measure income, adding up to a combined weighting of 10.75%. Both QS and THE have reputation surveys. ARWU gives a 30% weighting to Nobel and Fields award winners, some of whom won several decades ago.

Let's take a look at a set of rankings that is technically excellent, namely the Leiden Ranking. The producers do not provide an overall score. Instead it is possible to create a variety of rankings: total publications, publications by subject group, publications in the top 50%, 10% and 1% of journals. Users can also select fractional or absolute counting and change the minimum threshold for the number of publications.

Here is the top ten, using the default settings: publications 2012-15, fractional counting, minimum threshold of 100 papers. Positions for publications in 2006-09 are in brackets.

1. Harvard (1)
2. Toronto (2)
3. Zhejiang (14)
4. Michigan (3)
5. Shanghai Jiao Tong (37)
6. Johns Hopkins (5)
7. Sao Paulo (8)
8. Stanford (9)
9. Seoul National University (23)
10. Tokyo (4)

Tsinghua University is 11th, up from 32nd in 2006-09 and Peking University is 15th, up from 54th. What is interesting about this is not just that East Asian universities are moving into the highest level of research universities but how rapidly they are doing so.
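The fractional counting used in these Leiden tables divides each paper's credit among the institutions that contributed to it, so a paper shared by three universities adds a third of a point to each rather than a full point to all. A minimal sketch of the idea (the institution names and paper lists are hypothetical, and this simplifies Leiden's actual author-level procedure):

```python
def fractional_counts(papers):
    """Each paper carries a total credit of 1.0, split equally
    among the institutions listed on it."""
    totals = {}
    for institutions in papers:
        share = 1.0 / len(institutions)
        for inst in institutions:
            totals[inst] = totals.get(inst, 0.0) + share
    return totals

# Hypothetical papers: each entry lists the contributing institutions
papers = [
    ["Harvard", "Toronto"],            # each gets 0.5
    ["Zhejiang"],                      # sole institution gets 1.0
    ["Harvard", "Zhejiang", "Tokyo"],  # each gets 1/3
]
counts = fractional_counts(papers)
```

The alternative, absolute (full) counting, would give every listed institution a whole point per paper, which flatters universities with many international collaborations.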

No doubt there are many who will say that this is a matter of quantity and that what really counts is not the number of papers but their reception by other researchers. There is something to this. If we look at publications in the top 1% of journals (by frequency of citation), the top ten includes six US universities headed by Harvard, three British and one Canadian.

Tsinghua is 28th, Zhejiang 50th, Peking 62nd, Shanghai Jiao Tong 80th and Seoul National University 85th. Right now it looks like publication in the most reputed journals is dominated by English-speaking universities. But in the last few years Chinese and Korean universities have advanced rapidly: Peking from 119th to 62nd, Zhejiang 118th to 50th, Shanghai Jiao Tong 112th to 80th, Tsinghua 101st to 28th, Seoul National University 107th to 85th.

It seems that in a few years East Asia will dominate the elite journals and will take the lead for quality as well as quantity.

Moving on to subject group rankings, Tsinghua University is in first place for mathematics and computer sciences. The top ten consists of nine Chinese and one Singaporean university. The best US performer is MIT in 16th place, the best British Imperial College London in 48th.

When we look at the top 1 % of journals, Tsinghua is still on top, although MIT moves up to 4th place and Stanford is 5th. 

The Asian tsunami has already arrived. East Asian universities, mainly Chinese and Chinese diaspora, are dominant or becoming dominant in the STEM subjects, leaving the humanities and social sciences to the US.

There will of course be debate about what happened. Maybe money had something to do with it. But it also seems that western universities are becoming much less selective about student admissions and faculty appointments. If you admit students who write #BlackLivesMatter 100 times on their application forms or impose ideological tests for faculty appointment and promotion, you may have succeeded in imposing political uniformity, but you will have serious problems trying to compete with the Gaokao-hardened students and researchers of Chinese universities.

Monday, May 22, 2017

Arab University Rankings: Another Snapshot from Times Higher Education

Times Higher Education (THE) has produced a "snapshot" ranking of Arab universities extracted from its World University Rankings. There has been no change in the indicators or their weighting. Only 28 universities are included which raises questions about how suitable THE's methodology is for regions like the Middle East and North Africa.

This is an improvement over a remarkable MENA snapshot that THE did in 2015, which put Texas A&M University Qatar in first place by virtue of one half-time faculty member who was listed as a contributor to a multi-author, much-cited CERN paper.

The top five universities this time are King Abdulaziz University (KAU), Saudi Arabia, King Fahad University of Petroleum and Minerals, Saudi Arabia, King Saud University, Saudi Arabia, Khalifa University of Science, Technology and Research, UAE, and Qatar University.

The top three places are held by Saudi institutions. So how did they do it? According to an article by a THE editor for the World Economic Forum it was all due to money and internationalisation. 

Up to a point that is correct. The sad story of Trinity College Dublin's botched data submission gives a rough sense of the exchange rate: an extra 5 million Euro or so in reported total income (with proportionate increases for research income and income from industry) can move a university in the middle reaches of the table up a place in the overall rankings.

But does that explain KAU in top place? It did get a high score, 92.1, for international orientation, but five other Arab universities did better. For teaching it was third, for industry income third, and for research seventh. What actually made the difference was citations: KAU had a score of 93.3, far ahead of the next contender, Jordan University of Science and Technology with 50.2.

KAU's research impact is, according to THE, second in Asia only to the shooting star of India, Vel Tech University, whose single self-citing prodigy supposedly had a greater impact than the whole of any other Asian university. KAU's citations score was the result of the massive recruitment of adjunct faculty, 40 at the last count, who list KAU as a second affiliation. How much time they put in at KAU is uncertain, but the Shanghai rankers have calculated that highly cited researchers spend an average of 16% of their time at the university of their second affiliation.

It is bad enough that THE puts so much emphasis on income and internationalisation in its methodology, promoting the diversion of resources from things like primary and pre-school education, adult literacy and alternatives to oil exports. To encourage universities to rise in the rankings by hiring adjunct faculty whose contribution is uncertain is very irresponsible. It would be a good thing if this snapshot were ignored.



Wednesday, May 10, 2017

Ranking article: The building of weak expertise: the work of global university rankers

Miguel Antonio Lim, The building of weak expertise: the work of global university rankers

University rankers are the subject of much criticism, and yet they remain influential in the field of higher education. Drawing from a two-year field study of university ranking organizations, interviews with key correspondents in the sector, and an analysis of related documents, I introduce the concept of weak expertise. This kind of expertise is the result of a constantly negotiated balance between the relevance, reliability, and robustness of rankers’ data and their relationships with their key readers and audiences. Building this expertise entails collecting robust data, presenting it in ways that are relevant to audiences, and engaging with critics. I show how one ranking organization, the Times Higher Education (THE), sought to maintain its legitimacy in the face of opposition from important stakeholders and how it sought to introduce a new “Innovation and Impact” ranking. The paper analyzes the strategies, methods, and particular practices that university rankers undertake to legitimate their knowledge—and is the first work to do so using insights gathered alongside the operations of one of the ranking agencies as well as from the rankings’ conference circuit. Rather than assuming that all of these trust-building mechanisms have solidified the hold of the THE over its audience, they can be seen as signs of a constant struggle for influence over a skeptical audience.

Higher Education 13 April 2017
DOI: 10.1007/s10734-017-0147-8

https://link.springer.com/article/10.1007%2Fs10734-017-0147-8

What good are international faculty?

In the previous post I did a bit of fiddling around with correlations and found that UK universities' scores for the international students indicator in the QS world rankings did not correlate much with beneficial outcomes for students such as employability and course completion. They did, however, correlate quite well with spending per student.

That would suggest that British universities want lots of international students because it is good for their finances.

What about international faculty?

Comparing the scores for various outcomes with the QS international faculty score shows that in most cases the correlation is low and statistically insignificant. This includes course satisfaction and satisfaction with teaching (Guardian University League Tables), and completion and student satisfaction (THE TEF simulation).

There is, however, one metric that is positively, albeit modestly, and significantly associated with international faculty, and that is the Research Excellence Framework (REF) score (from the Complete University Guide): .284 (sig. 2-tailed .043, N 51).
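For readers who want to replicate this kind of exercise, the coefficient reported here is a standard Pearson r. A minimal pure-Python sketch of the computation (the example scores are invented for illustration, not the actual QS or REF data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical international-faculty scores and REF scores for five universities
intl_faculty = [90, 70, 60, 40, 20]
ref_scores = [3.2, 3.0, 2.9, 2.8, 2.5]
r = pearson_r(intl_faculty, ref_scores)
```

In practice a spreadsheet or a statistics package will also report the two-tailed significance; the sketch above only gives the coefficient itself.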

So it seems that international students are valued for the money they bring with them and international faculty for boosting research quality.

Caveat: this applies to highly ranked universities in the UK. How far it is true of other places or even less prestigious British institutions remains to be seen.


Monday, May 01, 2017

What good are international students?

There has been a big fuss in the UK about the status of international students. Many in the higher education industry are upset that the government insists on including these students in the overall total of immigrants, which might lead at some point to a reduction in their numbers. Leading twitterati have erupted in anger. Phil Baty of THE has called the government's decision "bizarre, bonkers & deeply depressing" and even invoked the name of the arch-demon Enoch Powell in support.

So are international students a benefit to British universities? I have just done a quick correlation of the scores for the international students indicator in the QS World University Rankings to see whether there is any link between positive outcomes for students and the number of international students.

This is of course only suggestive. QS provides international student scores for only 51 of the UK universities included in its world top 500, and the situation might be different for other countries. Another caveat is that international students might provide net economic benefits for surrounding communities, although that is far from settled.

Here are the correlations with the QS international student score (significance 2 tailed and N in brackets).

Value added .182 (.206; 50)    From the Guardian Rankings 2016. Compares entry qualifications with degrees awarded.

Career .102 (.480; 50). Graduate-level employment or postgraduate study six months after graduation, also from the Guardian rankings. The correlation with the graduate destinations indicator in the Times Higher Education TEF simulation, based on the same data, is even lower, .018, and turns negative after benchmarking, -.172.

Students completing degrees .128 (.376; 50). From the TEF simulation. Again, the correlation turns negative after benchmarking.

QS employer reputation survey .234 (.140; 41). From the 2016 world rankings.

So the number of international students has a slight and statistically insignificant relationship with the quality of teaching and learning as measured by value added, graduate employability, course completion and reputation with employers. Why then are universities so desperate to get as many as possible?

This, I think, is the answer. The correlation between the QS international students indicator and spending per student, as measured by the Guardian ranking, is .414 (.003; 50), which is very significant considering the noise generated in comparisons of this sort. Of course, correlation does not equal causation, but it seems a reasonable hypothesis that it is the money brought by international students that makes them so attractive to British universities.
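The significance figure quoted for that correlation can be sanity-checked from r and N alone, using the standard t statistic for a correlation coefficient, t = r√((n−2)/(1−r²)), with n−2 degrees of freedom. A quick sketch in pure Python (no stats library needed):

```python
import math

def t_statistic(r, n):
    """t statistic for testing whether a Pearson correlation of r,
    computed from n pairs, differs from zero (n - 2 degrees of freedom)."""
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# r = .414 with N = 50, as reported for spending per student
t = t_statistic(0.414, 50)
# t comes out at about 3.15 on 48 degrees of freedom, which corresponds
# to a two-tailed p of roughly .003, matching the figure quoted above
```

The same check applied to the weaker correlations listed earlier in the post yields t values well below 2, which is why they fail to reach significance.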