Thursday, October 11, 2018

The link between rankings and standardised testing

The big hole in current international university rankings is the absence of anything that effectively measures the quality of graduates. Some rankings use staff-student ratio or income as a proxy for the provision of resources, on the assumption that the more money spent or the more teachers deployed, the better the quality of teaching. QS has an employer survey that asks which universities employers like to recruit from, but that measure has many problems.

There is a lot of evidence that university graduates are valued to a large extent because they are seen as intelligent, conscientious and, depending on place and field, open-minded or conformist. A metric that correlates with these attributes would be helpful in assessing and comparing universities.

A recent article in The Conversation by Jonathan Wai suggests that the US News America's Best Colleges rankings are highly regarded partly because they measure the academic ability of admitted students, which correlates very highly with that of graduates.

The popularity of these rankings rests on their capturing the average ability of students as measured by the SAT or ACT. Wai reports on a paper he wrote with Matt Brown and Christopher Chabris in the Journal of Intelligence, which finds a very high correlation between students' average SAT or ACT scores and overall scores in America's Best Colleges: .982 for national universities and .890 for liberal arts colleges.

The correlation with the THE/WSJ US college rankings is lower but still very substantial (.787), as is that with the THE World University Rankings (.659).

It seems that employers and professional schools expect universities to certify the intelligence of their graduates. The value of standardised tests such as the ACT, SAT, GRE, LSAT and GMAT, which correlate highly with one another, is that they are a fairly robust proxy for general intelligence or general mental ability. Rankings could be valuable if they provided a clue to the ability of graduates.

It is, however, a shame that the authors support their argument by referring to only one global ranking, the THE world rankings. There are now quite a few international rankings that are as good as or better than the THE tables.

I have therefore calculated the correlations between the average SAT/ACT scores of 30 colleges and universities in the USA and their scores in various global rankings. 

The source for student scores is a supplement to the article by Wai et al., from which I have taken the top 30 ranked by SAT/ACT. I have used those rankings listed in the IREG inventory that provide numerical scores and not just ranks. The GreenMetric was not used since only one US school out of these thirty, Washington University in St Louis, took part in that ranking. I used two indicators from the Leiden Ranking, which does not give a composite score: total publications and the percentage of papers in the top 1% of journals.
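The calculation described here is a straightforward Pearson correlation between each school's average SAT/ACT score and its score in a given ranking. A minimal sketch of that computation, in pure Python, is below; the numbers are illustrative placeholders, not the actual data from the Wai et al. supplement.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance (unnormalised) and the two standard-deviation terms.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: average SAT score of admitted students at six schools,
# and the same schools' scores in one global ranking (made up for illustration).
sat = [1540, 1520, 1500, 1470, 1440, 1400]
ranking_score = [96.0, 91.5, 93.0, 85.2, 80.0, 76.4]

print(round(pearson_r(sat, ranking_score), 2))
```

A statistics package such as scipy (`scipy.stats.pearsonr`) would also return the two-tailed p-value used for the significance column in the table below.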

It is interesting that there are many liberal arts colleges in the US that are not included in the international rankings. Prospective undergraduates looking for a college in the USA would do well to look beyond the global rankings. Harvey Mudd College, for example, is highly selective and its graduates are much sought after, but it does not appear in any of the rankings below.

The results are interesting. The indicator that correlates most strongly with student ability is the Leiden Ranking's percentage of papers in the top journals. Next is CWUR, which does explicitly claim to measure graduate quality. The US News world rankings and the Shanghai rankings, which include only research indicators, also do well.

We are looking at just 30 US institutions here. There might be different results if we looked at other countries or a broader range of US schools.

So, it seems that if you want to look at the ability of students or graduates, an international ranking based on research is as good as or better than one that tries to measure teaching excellence with the blunt instruments currently available.



Rank  Ranking                                         Country      Correlation  Significance  N
1     Leiden Ranking: papers in top 10% of journals   Netherlands  .65          .001*         22
2     Center for World University Rankings            UAE          .59          .003*         22
3     US News Best Global Universities                USA          .58          .004*         22
4     Shanghai ARWU                                   China        .57          .004*         24
5     Round University Rankings                       Russia       .55          .008*         22
6     THE World University Rankings                   UK           .51          .014*         22
7     QS World University Rankings                    UK           .49          .025*         21
8     University Ranking by Academic Performance      Turkey       .48          .015*         25
9     Nature Index Fractional Count                   USA          .45          .039*         21
10    National Taiwan University                      Taiwan       .32          .147          22
11    Leiden: total publications                      Netherlands  .21          .342          22

* significant at the 0.05 level
