Saturday, July 13, 2019

Singapore and the Rankings Again

As far as rankings are concerned, Singapore has been a great success story, at least according to the Big Two (THE and QS).

In the latest QS world rankings the two major Singaporean universities, the National University of Singapore (NUS) and Nanyang Technological University (NTU), have done extremely well, both of them reaching the eleventh spot. Predictably, the mainstream local media have praised the universities, quoting QS spokespersons and university representatives on how the results are due to hiring talented faculty and the superiority of the national secondary education system.

There is some scepticism in Singapore about the rankings. The finance magazine Dollars and Sense has just published an article by Sim Kang Heong that questions the latest performance by Singapore's universities, including the relatively poor showing by Singapore Management University.

The author is aware of the existence of other rankings but names only two (ARWU and THE), and then presents a list of indicators from all the QS rankings, including regional and specialist tables, as though they were part of the World University Rankings.

The piece argues that "it doesn't take someone with a PhD to see some of the glaring biases and flaws in the current way QS does its global university rankings."

It is helpful that someone is taking a sceptical view of Singapore's QS ranking performance but disappointing that there is no specific reference to how NUS and NTU fared in other rankings.

Last February I published a post showing the ranks of these two institutions in the global rankings. Here they are again:

THE: 23rd and 51st
Shanghai ARWU: 85th and 96th
RUR: 50th and 73rd
Leiden (publications): 34th and 66th

These are definitely top 100 universities by any standard. Clearly though, the QS rankings rate them much more highly than anybody else. 


Thursday, July 11, 2019

India and the QS rankings

The impact of university rankings is mixed. They have often had very negative consequences for faculty, especially junior faculty, who in many places have been coerced into attending pointless seminars and workshops and churning out unread papers or book chapters in order to meet arbitrary and unrealistic targets or performance indicators.

But sometimes they have their uses. They have shown the weakness of several university systems. In particular, the global rankings have demonstrated convincingly that Indian higher education consists of a few islands of excellence in a sea of sub-mediocrity. The contrast with China, where many universities are now counted as world class, is stark, and it is unlikely that the gap can be closed with a few waves of the policy wand or by spraying cash around.

The response of academic and political leaders is not encouraging. There have been moves to give universities more autonomy, to increase funding, to engage with the rankings. But there is little sign that India is ready to acknowledge the underlying problems of the absence of a serious research culture or a secondary school system that seems unable to prepare students for tertiary education.

Indian educational and political leaders have lately become very concerned about the international standing of the country's universities. Unfortunately, their understanding of how the rankings actually work seems limited. This is not unusual. The qualities needed to climb the slippery ladder of academic politics are not those of a successful researcher or someone able to analyse the opportunities and the flaws of global rankings. 

Recently there was a meeting between the Indian Minister for Human Resource Development (HRD) and the heads of the Indian Institutes of Technology Bombay and Delhi and the Indian Institute of Science Bangalore.

According to a local media report, officials said that the reputation indicators in the QS international rankings contribute to Indian universities' poor ranking performance, as they are "an area where the Indian universities lose out the maximum number of marks - due to the absence of Indian representation at QS panel."
The IIT Bombay director is quoted as saying "there are not enough participants in the UK or the US to rate Indian universities." 

This shows ignorance of QS's methodology. QS now collects responses through several channels, including lists submitted by universities and a facility where individual researchers and employers can sign up to join the survey. In 2019, out of 83,877 academic survey responses collected over five years, 2.6% (roughly 2,200 responses) were from academics with an Indian affiliation, fewer than from Russia, South Korea, Australia or Malaysia but more than from China or Germany. This does not include responses from Indian academics at British, North American or Australian institutions. A similar proportion of responses to the QS employer survey were from India.

If there are not enough Indian participants in the QS survey then this might well be the fault of Indian universities themselves. QS allows universities to nominate up to 400 potential survey participants. I do not know if they have taken full advantage of this or whether those nominated have actually voted for Indian institutions. 

It is possible that India could do better in the rankings by increasing its participation in the QS surveys to the level of Malaysia, but it is totally inaccurate to suggest that there are no Indians in the current QS surveys.

If Indian universities are going to rise in the rankings, they need to start by understanding how the rankings actually work and creating informed and realistic strategies.


Thursday, July 04, 2019

Comparing National Rankings: USA and China


America's Best Colleges
The US News America's Best Colleges (ABC) is very much the Grand Old Man of university rankings. Its chief data analyst has been described as the most powerful man in America, although that is perhaps a bit exaggerated. These rankings have had a major role in defining excellence in American higher education, and they may have contributed to US intellectual and scientific dominance in the last two decades of the twentieth century.

But they are changing. This year's edition has introduced two new measures of "social mobility": the number of Pell Grant (low-income) students and the comparative performance of those students. There is suspicion that this is an attempt to reward universities for the recruitment and graduation of certain favoured groups, including African Americans and Hispanics, and perhaps recent immigrants from the Global South. Income is used as a proxy for race since current affirmative action policies at Harvard and other places are under legal attack.

It should be noted that success is defined as graduation within a six-year period, something that can easily be achieved by extra tuition, lots of collaborative projects, credit for classroom discussions, effort and persistence, holding instructors responsible for student failure, innovative methods of assessment, contextualised grading and so on.

The new ABC gives the Pell Grant metrics a 5% weighting and has also increased the weighting for graduation rate performance, which compares actual student outcomes with those predicted from their social and academic attributes, from 7.5% to 8%. So a total of 13% now in effect goes to social engineering. A good chunk of the rankings is thus based on the dubious proposition that universities can and should reduce or eliminate the achievement gap between various groups.

To make room for these metrics the acceptance rate indicator has been eliminated, and the weightings for standardised test scores, high school rank, counsellor reviews and six year graduation rate have been reduced.

Getting rid of the acceptance rate metric is probably not a bad idea since it had the unfortunate effect of encouraging universities to maximise the number of rejected applications, which produced income for the universities but imposed a financial burden on applicants.

The rankings now assign nearly a one-third weighting to student quality: 22% to graduation and retention rates and 10% to standardised tests and high school rank.

It seems that US News is moving from ranking universities by the academic ability of their students to ranking them by the number and relative success of low-income and "minority" students.

The latest ranking shows the effect of these changes. The very top is little changed but further down there are significant shifts. William and Mary is down. Howard University, a predominantly African American institution, is up as are the campuses of the University of California system.

ABC also has another 30% for resources (faculty 20% and financial 10%), 20% for reputation (15% peer and 5% high school counsellors), and 5% for alumni donations.

Shanghai Best Chinese University Rankings

The Shanghai Best Chinese University Ranking (BCUR) is a recent initiative, although ShanghaiRanking has been doing global rankings since 2003. It is quite different from the US News rankings.

For student outcomes Shanghai assigns a weighting of 10% to graduate employment and does not bother with graduation rates. As noted, ABC gives 22% for student outcomes (six year graduation rate and first year retention rate). 


Shanghai gives a 30% weighting for the dreaded Gaokao, the national university entrance exam, compared to 10% for high school class rank and SAT/ACT scores in ABC.

With regard to inputs, Shanghai allocates just 5% for alumni donations, compared to 30% in the ABC for class size, faculty salary, faculty highest degrees, student-faculty ratio, full-time faculty and financial resources.

That 5% is the only thing in Shanghai that might be relevant to reputation, while ABC has a full 20% for reputation among peers and counsellors.

Shanghai also has a 40% allocation for research, 10% for "social service", which comprises research income from industry and income from technology transfer, and 5% for international students. ABC has no equivalent to these, although US News publishes separate rankings of postgraduate programmes.

To compare the two, ABC is heavy on inputs, student graduation and retention, reputation, and social engineering. The last will probably become more important over the next few years.
BCUR, in contrast, emphasises student ability as measured by a famously rigorous entrance exam, student employment, research, links with industry, and internationalisation.

It seems that in the coming years excellence in higher education will be defined very differently. An elite US university will be one that is well endowed with money and human resources, makes sure that most of its students graduate one way or another, ensures that the ethnic and gender composition of its faculty and student body matches that of America or the world, and has a good reputation among peers and the media.

An elite Chinese university will be one that produces employed and employable graduates, admits students with high levels of academic skills, has close ties with industry, and has a faculty that produces a high volume of excellent research.