Thursday, October 18, 2018

How many indicators do university rankings need?

The number of indicators used in international university rankings varies a lot. At one extreme we have the Russian Round University Rankings (RUR), which have 20 indicators. At the other, Nature Index and Reuters Top 100 Innovative Universities have just one.

In general, the more information provided by rankings the more helpful they are. If, however, the indicators produce very similar results then their value will be limited. The research and postgraduate teaching surveys in the THE world rankings and the RUR correlate so highly that they are in effect measuring the same thing.

There is probably an optimum number of indicators for a ranking, perhaps higher for general than for research-only rankings, above which no further information is provided.

A paper by Guleda Dogan of Hacettepe University, Ankara, looks at the indicators in three university rankings: the Shanghai Academic Ranking of World Universities (ARWU), the National Taiwan University Rankings (NTU) and University Ranking by Academic Performance (URAP). It finds that there is a very high degree of internal similarity:


"Results of the analyses show that the intra-indicators used in ARWU, NTU and URAP are highly similar and that they can be grouped according to their similarities. The authors also examined the effect of similar indicators on 2015 overall ranking lists for these three rankings. NTU and URAP are affected least from the omitted similar indicators, which means it is possible for these two rankings to create very similar overall ranking lists to the existing overall ranking using fewer indicators."






Wednesday, October 10, 2018

The link between rankings and standardised testing

The big hole in current international university rankings is the absence of anything that effectively measures the quality of graduates. Some rankings use staff-student ratio or income as a proxy for the provision of resources, on the assumption that the more money spent or the more teachers deployed, the better the quality of teaching. QS has an employer survey that asks about the universities from which employers like to recruit, but that has many problems.

There is a lot of evidence that university graduates are valued to a large extent because they are seen as intelligent, conscientious and, depending on place and field, open-minded or conformist. A metric that correlates with these attributes would be helpful in assessing and comparing universities.

A recent article in The Conversation by Jonathon Wai suggests that the US News America's Best Colleges rankings are highly regarded partly because they measure the academic ability of admitted students, which correlates very highly with that of graduates.

The popularity of these rankings is based on their capturing the average ability of students as measured by the SAT or ACT. Wai reports on a paper, written in collaboration with Matt Brown and Christopher Chabris and published in the Journal of Intelligence, which finds a very large correlation between the average SAT or ACT scores of students and overall scores in America's Best Colleges: .982 for national universities and .890 for liberal arts colleges.

The correlation with the THE/WSJ US college rankings is lower but still substantial, .787, as is that with the THE World University Rankings, .659.

It seems that employers and professional schools expect universities to certify the intelligence of their graduates. The value of standardised tests such as the ACT, SAT, GRE, LSAT and GMAT, which correlate highly with one another, is that they are a fairly robust proxy for general intelligence or general mental ability. Rankings could be valuable if they provided a clue to the ability of graduates.

It is, however, a shame that the authors should support their argument by referring only to one global ranking, the THE world rankings. There are now quite a few international rankings that are as good as or better than the THE tables.

I have therefore calculated the correlations between the average SAT/ACT scores of 30 colleges and universities in the USA and their scores in various global rankings. 

The source for student scores is a supplement to the article by Wai et al, from which I have taken the top 30 ranked by SAT/ACT. I have used those rankings listed in the IREG inventory that provide numerical scores and not just ranks. The GreenMetric was not used since only one US school out of these thirty, Washington University in St Louis, took part in that ranking. I used two indicators from Leiden Ranking, which does not give a composite score: total publications and the percentage of papers in the top 1% of journals.
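
The correlation exercise described above can be sketched in a few lines of Python. The figures below are illustrative placeholders, not the actual SAT/ACT averages or ranking scores:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical figures for five institutions: average SAT score of
# admitted students and overall score in some ranking.
sat_scores = [1540, 1510, 1480, 1450, 1400]
ranking_scores = [98.5, 95.0, 91.2, 88.0, 80.1]

r = pearson_r(sat_scores, ranking_scores)
print(round(r, 3))
```

Significance is then assessed against a t-distribution; for samples of around 22 institutions, a correlation above roughly .42 is significant at the 0.05 level.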

It is interesting that there are many liberal arts colleges in the US that are not included in the international rankings. Prospective undergraduates looking for a college in the USA would do well to look beyond the global rankings. Harvey Mudd College, for example, is highly selective and its graduates much sought after but it does not appear in any of the rankings below.

The results are interesting. The indicator that correlates most strongly with student ability is Leiden Ranking's percentage of papers in the top 1% of journals. Next is CWUR, which does explicitly claim to measure graduate quality. The US News world rankings and the Shanghai rankings, which only include research indicators, also do well.

We are looking at just 30 US institutions here. There might be different results if we looked at other countries or a broader range of US schools.

So, it seems that if you want to look at the ability of students or graduates, an international ranking based on research is as good as or better than one that tries to measure teaching excellence with the blunt instruments currently available.



Rank  Ranking                                        Country      Correlation  Significance  N
1     Leiden Ranking: papers in top 10% of journals  Netherlands  .65          .001*         22
2     Center for World University Rankings           UAE          .59          .003*         22
3     US News Best Global Universities               USA          .58          .004*         22
4     Shanghai ARWU                                  China        .57          .004*         24
5     Round University Rankings                      Russia       .55          .008*         22
6     THE World University Rankings                  UK           .51          .014*         22
7     QS World University Rankings                   UK           .49          .025*         21
8     University Ranking by Academic Performance     Turkey       .48          .015*         25
9     Nature Index Fractional Count                  USA          .45          .039*         21
10    National Taiwan University Ranking             Taiwan       .32          .147          22
11    Leiden Ranking: total publications             Netherlands  .21          .342          22

* significant at the 0.05 level

Wednesday, October 03, 2018

Ulster University: no need to back down

Ulster University seems to have retracted the claim on its website that it is in the top 3% of universities in the world. Elsewhere it has apparently claimed to be in the top 2%.

According to the Belfast Telegraph

"Ulster University stated that it 'is in the top 3% of universities in the world' when it came 501-600th in the 1,103 Times Higher Education World University Rankings in 2018, putting them in the mid-tier of those rankings," a Which? spokesperson said.
"Further, in a website document "Ulster's 50 Facts Worth Knowing" Ulster University included the claim 'Top 2% of universities in the world (QS World Rankings)'."
The spokesman continued: "Which? University is particularly concerned about these comparative claims, given it is less than one year since the Advertising Standards Authority (ASA) issued its comprehensive advice in November 2017 to universities about such claims and its rulings which upheld complaints about comparative claims made in university advertising.
"Where universities are making comparative claims about their relative performance, these should be verifiable and provided with sufficient qualifying information to ensure that prospective students are not misled."

It is unfortunate that Which? should use the THE World University Rankings as the only arbiter of excellence. There are now at least 17 global rankings plus a variety of specialist and niche international rankings. Some of these are just as comprehensive, valid or informative as the THE rankings, if not more so. The THE world rankings are also known (to some people anyway) for their whimsical assessment of the research impact of world universities. This year THE has included Babol Noshirvani University of Technology, the University of Reykjavik, Brighton and Sussex Medical School, Anglia Ruskin University, the University of Desarrollo and the University of Canberra as leaders of the global research scene.

Which? has a point. The university is presumably thinking of the 28,077 universities listed in the Webometrics rankings: if it were in the top 600 of those, it would be in the top 2.14% of world universities.

There is a fallacy here. Many universities that could do better than Ulster do not bother to submit data to THE. If they did they might well forge ahead.

But I see nothing wrong with the university noting that it is ranked 689th in the current Webometrics ranking, which includes web data and a measure of research output, out of 28,077 for which data is available and therefore is in the top 2.45% of the Webometrics ranking.

Ulster is also 777th for the Webometrics research excellence indicator, the number of papers in the top 10% most cited, which would put it in the top 2.77% of universities in the world.
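
The arithmetic behind these percentile claims is straightforward: divide the rank by the number of institutions listed. A quick check of the two Webometrics figures above:

```python
def top_percent(rank, total):
    """Percentage of ranked institutions at or above the given rank."""
    return 100 * rank / total

# Webometrics: Ulster's overall rank and research excellence rank,
# out of 28,077 listed universities.
print(round(top_percent(689, 28077), 2))   # 2.45
print(round(top_percent(777, 28077), 2))   # 2.77
```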

A claim to be in the top 3% of universities would be legitimate provided the ranking and indicator are stated.