Wednesday, October 10, 2018

The link between rankings and standardised testing

The big hole in current international university rankings is the absence of anything that effectively measures the quality of graduates. Some rankings use staff-student ratio or income as a proxy for the provision of resources, on the assumption that the more money spent or the more teachers deployed, the better the quality of teaching. QS has an employer survey that asks about the universities from which employers like to recruit, but that has many problems.

There is a lot of evidence that university graduates are valued to a large extent because they are seen as intelligent, conscientious and, depending on place and field, open-minded or conformist. A metric that correlates with these attributes would be helpful in assessing and comparing universities.

A recent article in The Conversation by Jonathan Wai suggests that the US News America's Best Colleges rankings are highly regarded partly because they measure the academic ability of admitted students, which correlates very highly with that of graduates.

The popularity of these rankings is based on their capturing the average ability of students as measured by the SAT or ACT. Wai draws on a paper that he wrote with Matt Brown and Christopher Chabris in the Journal of Intelligence, which finds a very high correlation between the average SAT or ACT scores of students and overall scores in America's Best Colleges: .982 for national universities and .890 for liberal arts colleges.

The correlation with the THE/WSJ US college rankings is lower but still very substantial, .787, as is that with the THE World University Rankings, .659.

It seems that employers and professional schools expect universities to certify the intelligence of their graduates. The value of standardised tests such as the ACT, SAT, GRE, LSAT and GMAT, which correlate highly with one another, is that they are a fairly robust proxy for general intelligence or general mental ability. Rankings could be valuable if they provided a clue to the ability of graduates.

It is, however, a shame that the authors support their argument by referring to only one global ranking, the THE world rankings. There are now quite a few international rankings that are as good as or better than the THE tables.

I have therefore calculated the correlations between the average SAT/ACT scores of 30 colleges and universities in the USA and their scores in various global rankings. 

The source for student scores is a supplement to the article by Wai et al., from which I have taken the top 30 institutions ranked by SAT/ACT. I have used those rankings listed in the IREG inventory that provide numerical scores and not just ranks. The GreenMetric was not used since only one US school out of these thirty, Washington University in St Louis, took part in that ranking. I used two indicators from the Leiden Ranking, which does not give a composite score: total publications and the percentage of papers in the top 1% of journals.
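The statistic involved is a plain Pearson correlation between average admission-test scores and ranking scores. A minimal pure-Python sketch of the calculation, using invented numbers rather than the actual figures from the supplement:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data for five institutions, for illustration only
sat_act = [1560, 1545, 1530, 1510, 1500]   # average SAT/ACT scores
ranking = [97.2, 94.8, 91.0, 88.5, 86.1]   # ranking overall scores

print(round(pearson(sat_act, ranking), 2))  # → 0.99
```

In practice scipy.stats.pearsonr does the same job and also returns the significance level reported in the table below.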

It is interesting that there are many liberal arts colleges in the US that are not included in the international rankings. Prospective undergraduates looking for a college in the USA would do well to look beyond the global rankings. Harvey Mudd College, for example, is highly selective and its graduates are much sought after, but it does not appear in any of the rankings below.

The results are interesting. The indicator that correlates most strongly with student ability is the Leiden Ranking's percentage of papers in the top 1% of journals. Next is CWUR, which does explicitly claim to measure graduate quality. The US News world rankings and the Shanghai rankings, which only include research indicators, also do well.

We are looking at just 30 US institutions here. There might be different results if we looked at other countries or a broader range of US schools.

So, it seems that if you want to look at the ability of students or graduates, an international ranking based on research is as good as or better than one that tries to measure teaching excellence with the blunt instruments currently available.



Rank | Ranking | Address | Correlation | Significance | N
1 | Leiden Ranking: papers in top 10% of journals | Netherlands | .65 | .001* | 22
2 | Center for World University Ranking | UAE | .59 | .003* | 22
3 | US News Best Global Universities | USA | .58 | .004* | 22
4 | Shanghai ARWU | China | .57 | .004* | 24
5 | Round University Rankings | Russia | .55 | .008* | 22
6 | THE World University Rankings | UK | .51 | .014* | 22
7 | QS World University Rankings | UK | .49 | .025* | 21
8 | University Ranking by Academic Performance | Turkey | .48 | .015* | 25
9 | Nature Index Fractional Count | USA | .45 | .039* | 21
10 | National Taiwan University | Taiwan | .32 | .147 | 22
11 | Leiden: total publications | Netherlands | .21 | .342 | 22

*significant at 0.05 level

Wednesday, October 03, 2018

Ulster University: no need to back down

Ulster University seems to have retracted the claim on its website that it is in the top 3% of universities in the world. Elsewhere it has apparently claimed to be in the top 2%.

According to the Belfast Telegraph

"Ulster University stated that it 'is in the top 3% of universities in the world' when it came 501-600th in the 1,103 Times Higher Education World University Rankings in 2018, putting them in the mid-tier of those rankings," a Which? spokesperson said.
"Further, in a website document "Ulster's 50 Facts Worth Knowing" Ulster University included the claim 'Top 2% of universities in the world (QS World Rankings)'."
The spokesman continued: "Which? University is particularly concerned about these comparative claims, given it is less than one year since the Advertising Standards Authority (ASA) issued its comprehensive advice in November 2017 to universities about such claims and its rulings which upheld complaints about comparative claims made in university advertising.
"Where universities are making comparative claims about their relative performance, these should be verifiable and provided with sufficient qualifying information to ensure that prospective students are not misled."

It is unfortunate that Which? should use the THE World University Rankings as the only arbiter of excellence. There are now at least 17 global rankings plus a variety of specialist and niche international rankings. Some of these are just as comprehensive, valid or informative as the THE rankings, if not more so. The THE world rankings are also known (to some people anyway) for their whimsical assessment of the research impact of world universities. This year THE has included Babol Noshirvani University of Technology, the University of Reykjavik, Brighton and Sussex Medical School, Anglia Ruskin University, the Universidad del Desarrollo and the University of Canberra as leaders of the global research scene.

Which? has a point, but the university is presumably thinking of the 28,077 universities listed in the Webometrics rankings: if it were in the top 600 of the THE rankings, it would be in the top 2.14% of world universities.

There is a fallacy here. Many universities that could do better than Ulster do not bother to submit data to THE. If they did they might well forge ahead.

But I see nothing wrong with the university noting that it is ranked 689th out of the 28,077 institutions in the current Webometrics ranking, which includes web data and a measure of research output, and is therefore in the top 2.45% of the Webometrics ranking.

Ulster is also 777th for the Webometrics research excellence indicator, the number of papers in the top 10% most cited, which would put it in the top 2.77% of universities in the world.
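The arithmetic behind these "top X%" claims is just the rank divided by the number of institutions ranked. A trivial sketch:

```python
def top_percent(rank, total):
    """Convert a rank out of `total` ranked institutions into a 'top X%' figure."""
    return 100 * rank / total

WEBOMETRICS_TOTAL = 28077  # institutions listed in Webometrics

print(round(top_percent(689, WEBOMETRICS_TOTAL), 2))  # overall rank → 2.45
print(round(top_percent(777, WEBOMETRICS_TOTAL), 2))  # research excellence rank → 2.77
```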

A claim to be in the top 3% of universities would be legitimate provided the ranking and indicator are stated.







Monday, October 01, 2018

Here we go again: the THE citations indicator

The latest THE world rankings have just been announced. For most of the indicators there are few surprises. There are more universities from Japan in the rankings. Oxford is first, followed by Cambridge. The USA contributes the largest number of top universities. China rises steadily. India, as usual, is a disappointment.

But, as in previous years, the most interesting thing is the citations indicator, which is supposed to measure research influence. Once again this has produced some very interesting results. 

Here are some of the universities in the top 100. 

Babol Noshirvani University of Technology: the most influential university in the world for research
Brighton and Sussex Medical School: most influential in Europe
Brandeis University: most influential in the USA
Reykjavik University
St George's, University of London: fallen a bit, probably because of Brexit
King Abdulaziz University: top university for research influence in the Middle East and Asia 
Anglia Ruskin University
Jordan University of Science and Technology
Vita-Salute San Raffaele University
Ulsan National Institute for Science and Technology: top in Asia ex Middle East
University of Canberra: best in Australia
Universidad del Desarrollo: best in Latin America
McMaster University: best in Canada
Université de Versailles Saint-Quentin-en-Yvelines: best in France
Teikyo University: best in Japan

There are signs that THE are considering reforming this indicator. If that does happen, the rankings will be more valid but much less entertaining.






Sunday, September 30, 2018

Rankings and Higher Education Policy

Two examples of how the need to perform well in the rankings is shaping national research and higher education policy.

From the Irish Examiner

"Ireland must apply for membership of the world-renowned European Organisation for Nuclear Research (Cern) in order to combat the effect of Brexit and boost university rankings.
That is according to Cork senator Colm Burke as the campaign to join Cern gains momentum, after Ireland recently became a member of the European Space Observatory."

From Times Higher Education


"France’s programme of university mergers is paying off, improving the research performance and international visibility of its top providers, according to the Times Higher Education World University Rankings 2019.
Paris Sciences et Lettres – PSL Research University Paris, a 2010 merger of numerous institutions, climbed 31 places to 41st this year, becoming the first French university to feature in the top 50 best universities since 2011. PSL made its debut in the global table last year.
Its teaching and research scores improved, driven by increased global visibility and votes in the academic teaching and research reputation surveys.
Meanwhile, Sorbonne University, which was founded in January this year following the merger of Pierre and Marie Curie University and Paris-Sorbonne University, has joined the list at 73rd place – making it the highest-ranked newcomer in the table."












https://www.irishexaminer.com/breakingnews/business/cern-membership-vital-for-irish-universities-872312.html

Thursday, September 20, 2018

Philosophy Department Will Ignore GRE Scores

The philosophy department at the University of Pennsylvania has taken a step away from fairness and objectivity in university admissions. It will no longer look at the GRE scores of applicants to its graduate programme. 

The department is good but not great. It is ranked 27th in the Leiter Report rankings and in the 101-150 band in the QS world subject rankings.

So how will students be selected without GRE scores? It seems it will be by letters of recommendation, undergraduate GPA, writing samples and admission statements.

Letters of recommendation have very little validity. The value of undergraduate grades has eroded in recent years and very likely will continue to do so. Admission essays and diversity statements say little about academic ability and a lot about political conformism.

The reasons for the move are not convincing. Paying for the GRE is supposed to be a burden on low-income students. But the cost is much less than Penn's exorbitant tuition fees. It is also claimed that the GRE and other standardised tests do not predict performance in graduate school. In fact they are a reasonably good predictor of academic success, although they should not be used by themselves.

Then there is the claim that the GRE "sometimes" underpredicts the performance of minorities and women. No doubt it sometimes does but then presumably sometimes it does not. Unless there is evidence that the underprediction is significant and that it is greater than that of other indicators this claim is meaningless.

What will be the result of this? The department will be able to admit students who "do not test well" but who can get good grades, something that is becoming less difficult at US colleges, or who can persuade letter writers at reputable schools that they will do well.

It is likely that more departments across the US will follow Penn's lead. American graduate programmes will slowly become less rigorous and less able to compete with the rising universities of Asia.








Sunday, September 09, 2018

Ranking Global Rankings: Information


Another indicator for ranking global rankings might be the amount of information that they contain. Here are 17 global rankings in the IREG Inventory ranked according to the number of indicators or groups of indicators for which scores or ranks are given. The median and the mode are both six.

The number for U-Multirank is perhaps misleading since data is not provided for all universities. 



 Number of indicators or indicator groups with scores or ranks

Rank | Ranking | Address of publisher | Number of indicators
1 | U-Multirank | Germany | 112
2 | | Russia | 20
3 | | Netherlands | 19
4 | | USA | 13
5 | | Taiwan | 8
6 | | UAE | 7
7= | | UK | 6
7= | | China | 6
7= | | Indonesia | 6
7= | URAP University Ranking by Academic Performance | Turkey | 6
11 | | UK | 5
12 | | Spain | 4
13 | | Spain | 3
14 | | UK | 2
15= | | France | 1
15= | Reuters Top 100 Innovative Universities | USA | 1
15= | | Australia | 1



Monday, September 03, 2018

Ranking Global Rankings: Inclusion

The number of international university rankings continues to grow and it is becoming harder to keep track of them. Earlier this year IREG published an inventory of international rankings that included 17 global rankings. Here are those rankings in order of the number of institutions that they rank in the most recent edition.

Webometrics is the clear winner, followed by uniRank and SCImago. There are, of course, other indicators to think about and some of these will be covered later.






Number of Institutions ranked

Rank | Ranking | Address of publisher | Number ranked
1 | Webometrics | Spain | 28,077
2 | uniRank | Australia | 13,146
3 | SCImago | Spain | 5,637
4 | URAP University Ranking by Academic Performance | Turkey | 2,500
5 | U-Multirank | Germany | 1,500
6 | | USA | 1,250
7 | THE World University Rankings | UK | 1,000+
8= | Shanghai Ranking ARWU | China | 1,000
8= | CWUR University Rankings | UAE | 1,000
10 | QS World University Rankings | UK | 916
11 | CWTS Leiden Ranking | Netherlands | 903
12 | | Taiwan | 800
13 | | Russia | 783
14 | UI GreenMetric Ranking | Indonesia | 619
15 | | UK | 500
16 | | France | 150
17 | Reuters Top 100 Innovative Universities | USA | 100


Sunday, September 02, 2018

Ranking US Rankings

Forbes Magazine has an article by Willard Dix that ranks US ranking sites. The ranking is informal, without specified indicators, but the author does give us an idea of what he thinks a good ranking should do.

Here are the top five of thirteen:
1.  US News: America's Best Colleges
2.  Money magazine: Best Colleges Ranking
3.  Forbes: America's Top Colleges
4.  Kiplinger's Best College Values
5.  Washington Monthly: College Guide and Rankings.

Reading through the comments it is possible to get an idea of the criteria for a good ranking: it should contain a lot of information; it should be comprehensive and include a large number of institutions; it should provide data that helps prospective students and stakeholders; it should have been published for several years; if it uses surveys, they should have a lot of respondents; and it should have face validity (a list with a "revolutionary algorithm" that puts non-Ivy places at the top comes 13th here).





Friday, August 24, 2018

Why is Australia doing well in the Shanghai rankings?

I am feeling a bit embarrassed. In a recent post I wrote about the Shanghai Rankings (ARWU) being a bit boring (which is good) because university ranks usually do not change very much. But then I noticed that a couple of Australian universities did very well in the latest rankings. One of them, the Australian National University (ANU), has risen a spectacular (for ARWU) 31 places over last year. The Financial Review says that "[u]niversity scientific research has boosted the position of two Australian universities in a global ranking of higher education providers." 

The ranking is ARWU and the rise in the ranking is linked to the economic contribution of Australian universities, especially those in the Group of Eight.

So how well did Australian universities do? The top performer, as in previous years, is the University of Melbourne, which went up a spot to 38th place. Two other universities went up a lot in a very un-Shanghainese way: ANU, already mentioned, from 69th to 38th place, and the University of Sydney from 83rd to 68th.

The University of Queensland was unchanged in 55th place, while Monash fell from 78th to 91st and the University of Western Australia from 91st to 93rd.

How did ANU and Sydney do it? The ANU scores for Nobel and Fields awards were unchanged. Publications were up a bit and papers in Nature and Science were down a bit.

What made the difference was the score for highly cited researchers, derived from lists kept by Clarivate Analytics, which rose from 15.4 to 23.5, a difference of 8.1 or, after weighting, 1.62 points of the overall score. The difference in total scores between 2017 and 2018 was 1.9, so those highly cited researchers accounted for most of the improvement.
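The arithmetic can be checked in a few lines, assuming the standard ARWU weighting of 20% for the highly cited researchers (HiCi) indicator:

```python
# ANU's HiCi indicator scores as reported above
hici_2017, hici_2018 = 15.4, 23.5
HICI_WEIGHT = 0.20  # assumed ARWU weighting for the HiCi indicator

raw_gain = hici_2018 - hici_2017        # 8.1 indicator points
weighted_gain = raw_gain * HICI_WEIGHT  # contribution to the overall score

total_gain = 1.9  # ANU's change in overall score, 2017 to 2018

print(round(weighted_gain, 2))               # → 1.62
print(round(weighted_gain / total_gain, 2))  # share of the total rise → 0.85
```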

In 2016 ANU had two researchers in the list, which was used for the 2017 rankings. One was also on the 2017 list, used in 2018. In 2017 there were six ANU highly cited researchers: one from the previous year, one who had moved from MIT, and four long-serving ANU researchers.

Let's be clear. ANU has not been handing out unusual contracts or poaching from other institutions. It has grown its own researchers and should be congratulated.

But using an indicator where a single researcher can lift a top 100 university seven or eight places is an invitation to perverse consequences. ARWU should consider whether it is time to explore other measures of research impact.

The improved scores for the University of Sydney resulted from an increase between 2016 and 2017 in the number of articles published in the Science Citation Index Expanded and the Social Science Citation Index.