Sunday, September 26, 2021

What is a University Really For?

Louise Richardson, Vice-Chancellor of the University of Oxford, has seen fit to enlighten us about the true purpose of a university. It is, it seems, to inculcate appropriate deference to the class of certified experts.

Professor Richardson remarked at the latest Times Higher Education (THE) academic summit that she was embarrassed that "we" had educated the Conservative politician Michael Gove who said, while talking about Brexit, that people had had enough of experts.

So now we know what universities are really about. Not critical discussion, cutting-edge research, skepticism, or the disinterested pursuit of truth, but teaching respect for experts.

A few years ago I wrote a post suggesting we were now in a world where the expertise of the accredited experts was declining along with public deference. I referred to the failure of political scientists to predict the nomination of Trump, the election of Trump, the rise of Leicester City, and the Brexit vote. It looks like respect for experts has continued to decline, not entirely without reason.

Professor Richardson thinks that Gove's disdain for the Brexit experts is cause for embarrassment. While it is too early for the real effects of Brexit to become clear, it is as yet far from obvious that it has been an unmitigated disaster. It is, moreover, a little ironic that the remark was made at the latest THE academic summit, where the annual world rankings were announced. Richardson remarked that she was delighted that her university was once again ranked number one.

The irony is that the THE world rankings are probably the least expert of the global rankings although they are apparently the most prestigious at least among those institutions that are known for being prestigious.

Let's have another look at THE's Citations Indicator, which is supposed to measure research quality or impact and accounts for nearly a third of the total weighting. (Regular readers of this blog can skim or skip the next few lines.) Here are the top five from this year's rankings.

1.   University of Cape Coast

2.   Duy Tan University

3.   An-Najah National University

4.   Aswan University

5.   Brighton and Sussex Medical School

This is not an academic version of the imaginary football league tables that nine-year-old children used to construct. Nor is it the result of massive cheating by the universities concerned. It is quite simply the outcome of a hopelessly flawed system. THE, or rather its data analysts, appear to be aware of the inadequacies of this indicator but somehow meaningful reform keeps getting postponed. One day historians will search the THE archives to find the causes of this inability to take very simple and obvious measures to produce a sensible and credible ranking. I suspect that the people in control of THE policy are averse to anything that might involve any distraction from the priority of monetising as much data as possible. Nor is there any compelling reason for a rush to reform when universities like Oxford are unconcerned about the inadequacies of the current system.

Here are the top five for income from industry which is supposed to have something to do with innovation.

1.   Asia University Taiwan

2.   Istanbul Technical University

3.   Khalifa University

4.   Korea Advanced Institute of Science and Technology (KAIST)

5.   LMU Munich

This is a bit better. It is not implausible that KAIST or LMU Munich is a world leader for innovation. But in general, this indicator is also inadequate for any purpose other than providing fodder for publicity. See a scathing review by Alex Usher.

Would any tutor or examiner at Oxford give any credit to a student who thought that Ghana, Vietnam, and Palestine were centers of international research impact? These universities are doing a remarkable job of teaching in many respects, but that is not what THE is ostensibly giving them credit for.

In addition, the THE world rankings fail to meet satisfactory standards of basic validity. Looking at the indicator scores for the top 200 universities in the most recent world rankings, we can see that the correlation between research and teaching is 0.92. In effect, these are not two distinct metrics: they are measuring essentially the same thing. A quick look at the methodology suggests that what they are really comparing is income (total institutional income for teaching, research income for research), reputation (the opinion surveys for research and teaching), and investment in doctoral programmes.

On the other hand, the citations indicator does not correlate significantly with research or teaching and correlates negatively with industry income.

One can hardly blame THE for wanting to make as much money as possible. But surely we can expect something better from supposedly elite institutions that claim to value intellectual and scientific excellence. If Oxford and its peers wish to restore public confidence in the experts, there is no better way than telling THE that they will not submit data until it produces something a little less embarrassing.




Wednesday, August 25, 2021

THE World University Rankings: Indicator Correlations

I was going to wait until next week to do this, but the latest edition of the THE world rankings is about to be published, and it may come with a new methodology.

The current THE methodology is based on five indicators or indicator groups: Teaching (5 indicators), Research (3 indicators), Citations, Income from Industry, International Outlook (3 indicators).

Looking at the analysis of 1526 cases (using PSPP), we can see that the correlation between Teaching and Research is very high, .89, and fairly good between those two and Citations. Teaching and Research both include reputation surveys, of teaching and of research respectively, which have been shown to yield very similar results. Also, Teaching includes Institutional Income and Research includes Research Income, which are likely to be closely related.

The Citations indicator has a moderate correlation with Teaching and Research, as noted, and also with International Outlook.

The correlations between Industry Income and Teaching and Research are moderate and those with Citations and International Outlook are low, .20 and .18 respectively. The Industry Income indicator is close to worthless since the definition of income is apparently interpreted in several different ways and may have little relation to financial reality. International Outlook correlates modestly with the other indicators except for Industry Income.

It seems there is little point in distinguishing between the Teaching and Research indicators since they are both influenced by income, reputation, and large doctoral programmes. The Industry Income indicator has little validity and will probably, with very good reason, be removed from the THE rankings.
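The kind of pairwise analysis described above can be sketched in a few lines. This is a minimal illustration, not the actual PSPP run: the indicator names follow THE's categories, but the scores below are invented for demonstration (the real analysis used 1526 cases).

```python
# Minimal sketch of the pairwise correlation analysis described above.
# The indicator names follow THE's pillars, but the scores are made up
# for illustration; the real analysis was run in PSPP on 1526 cases.
from scipy import stats

# Hypothetical indicator scores for six universities (illustrative only)
scores = {
    "teaching":  [90.1, 85.3, 70.2, 65.8, 50.4, 45.9],
    "research":  [92.4, 88.0, 72.5, 60.1, 55.3, 40.2],
    "citations": [75.0, 99.8, 60.3, 97.2, 45.1, 88.6],
    "industry":  [68.2, 55.4, 90.1, 47.3, 62.0, 39.8],
}

names = list(scores)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        # pearsonr returns the coefficient and a two-tailed p-value
        r, p = stats.pearsonr(scores[a], scores[b])
        print(f"{a:>10} vs {b:<10} r = {r:+.2f}  p = {p:.3f}")
```

With real data one would load the full indicator table instead of a hand-typed dictionary, but the computation is the same as PSPP's CORRELATION command with /PRINT = TWOTAIL.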


CORRELATIONS

CORRELATION
/VARIABLES = teaching research citations industry international weightedtotal
/PRINT = TWOTAIL SIG.
Correlations (all 1526 ranked universities)

                teaching   research   citations   industry   international   weighted total
teaching          1.00       .89        .51         .45          .38             .83
research           .89      1.00        .59         .53          .54             .90
citations          .51       .59       1.00         .20          .57             .87
industry           .45       .53        .20        1.00          .18             .42
international      .38       .54        .57         .18         1.00             .65
weighted total     .83       .90        .87         .42          .65            1.00

All correlations are significant (two-tailed p < .001); N = 1526 for every pair.


Most people are probably more concerned with distinctions among the world's elite or would-be elite universities. Turning to the top 200 of the THE rankings, the correlation between Teaching and Research is again very high, suggesting that these are measuring virtually the same thing.

The Citations indicator has a low correlation with International Outlook, a low and insignificant correlation with Teaching and Research, and a negative and insignificant correlation with Industry Income. 

Industry Income has low correlations with Research and Teaching, and negative correlations with Citations and International Outlook.

It would seem that THE world rankings are not helpful for evaluating the quality of the global elite. A new methodology will be most welcome.


CORRELATIONS

CORRELATION
/VARIABLES = teaching research citations industry international weightedtotal
/PRINT = TWOTAIL SIG.
Correlations (top 200 universities)

                teaching   research   citations   industry   international   weighted total
teaching          1.00       .90        .02         .23         -.11             .89
research           .90      1.00        .06         .28          .05             .92
citations          .02       .06       1.00        -.30          .22             .39
industry           .23       .28       -.30        1.00         -.10             .17
international     -.11       .05        .22        -.10         1.00             .17
weighted total     .89       .92        .39         .17          .17            1.00

N = 200 for every pair. The Teaching-Citations (p = .768), Research-Citations (p = .411), Teaching-International (p = .114), Research-International (p = .471), and Industry-International (p = .149) correlations are not significant at the .05 level; the rest are.





Monday, August 23, 2021

Shanghai Rankings: Correlations Between Indicators

This is, I hope, the first of  a series. Maybe THE and QS next week.

If we want to compare the utility of university rankings, one attribute to consider is internal consistency. Here, the correlations between the various indicators can tell us a lot. If the correlation between a pair of indicators is 0.90 or above, we can assume that these indicators are essentially measuring the same thing.

On the other hand, if there is no correlation, or one that is low, insignificant, or even negative, we might have doubts about the validity of one or both of the indicators. It is reasonable to expect that if a university scores well on one metric it will do well on others, provided they both represent highly valued attributes. A university producing high-quality research or collecting large numbers of citations should also score well for reputation. If it does not, there might be a methodological problem somewhere.

So, we can assume that if the indicators are valid and are not measuring the same thing, the correlation between them will probably be somewhere between 0.5 and 0.9.
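This rule of thumb can be written down directly. The thresholds below follow the text; the function name and the labels are mine, not part of any ranking methodology.

```python
# A sketch of the rule of thumb above: interpreting the Pearson
# correlation between two ranking indicators. Thresholds follow the
# text; the function name and labels are my own shorthand.
def interpret_correlation(r: float) -> str:
    if r >= 0.90:
        return "redundant"    # essentially measuring the same thing
    if r >= 0.50:
        return "consistent"   # distinct but mutually supporting indicators
    return "doubtful"         # low, zero, or negative: a validity question

# Examples using figures quoted elsewhere in these posts
print(interpret_correlation(0.92))   # Teaching vs Research, THE top 200
print(interpret_correlation(0.79))   # Highly Cited vs Nature & Science, ARWU
print(interpret_correlation(-0.30))  # Citations vs Industry Income, THE top 200
```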

Let's have a look at the Shanghai ARWU for 2019. The indicator scores were extracted and analysed using PSPP. (It is very difficult to analyse the 2020 edition because of a recent change in presentation.) These rankings have six indicators: alumni winning Nobel Prizes and Fields Medals, staff winning those awards, papers in Nature and Science, highly cited researchers, publications indexed in the Web of Science, and per capita productivity.

Looking at all 1000 institutions in the Shanghai Rankings, Alumni, Awards, and Nature and Science all correlate well with each other. Highly Cited Researchers correlates well with Nature and Science and Publications, but less so with Alumni and Awards. Nature and Science correlates well with all the other indicators.

The Publications indicator does not correlate well with Alumni and Awards. This is to be expected since Publications refers to 2018 while the Alumni and Awards indicators go back several decades.

Overall, the correlations are quite good although there is a noticeable divergence between Publications and Alumni and Awards, which cover very different time periods. 

CORRELATIONS

CORRELATION
/VARIABLES = alumni awards highlycited naturescience publications pcp finaltotal
/PRINT = TWOTAIL NOSIG.
Correlations (all 1000 ranked universities)

                  alumni   awards   highly cited   Nature & Science   publications   pcp   final total
alumni             1.00      .78         .51             .72               .45        .63       .76
awards              .78     1.00         .57             .75               .44        .67       .82
highly cited        .51      .57        1.00             .79               .72        .64       .87
Nature & Science    .72      .75         .79            1.00               .69        .73       .93
publications        .45      .44         .72             .69              1.00        .50       .81
pcp                 .63      .67         .64             .73               .50       1.00       .78
final total         .76      .82         .87             .93               .81        .78      1.00

All correlations are significant (two-tailed p < .001). N = 1000 for every pair except those involving Nature and Science, where N = 992.



Most observers of ARWU and other global rankings are interested in the top levels, where elite schools and national flagships jostle for dominance. Analysing correlations among indicators for the top 200 in ARWU, there are high correlations between Alumni, Awards, Nature and Science, and Productivity per Capita, ranging from .62 to .79.

There is also a high correlation of .72 between Nature and Science and Highly Cited Researchers. It is, however, noticeable that Publications correlates only modestly with Highly Cited Researchers (.57) and very weakly with Productivity per Capita (.12), Alumni (.21), and Awards (.14).

It seems that, especially among the top 200 places, a big gap is opening between the old traditional elite of Oxbridge, the Ivy League, and the like, who continue to get credit for long-dead Nobel laureates, and the new rising stars of Asia and Europe, who are surging ahead in Web of Science papers and beginning to produce or recruit superstar researchers.




Correlations (top 200 universities)

                  alumni   awards   highly cited   Nature & Science   publications   pcp   final total
alumni             1.00      .79         .36             .69               .21        .62       .78
awards              .79     1.00         .44             .74               .14        .67       .84
highly cited        .36      .44        1.00             .72               .57        .49       .78
Nature & Science    .69      .74         .72            1.00               .44        .65       .92
publications        .21      .14         .57             .44              1.00        .12       .55
pcp                 .62      .67         .49             .65               .12       1.00       .72
final total         .78      .84         .78             .92               .55        .72      1.00

N = 200 for every pair except those involving Nature and Science, where N = 199. All correlations are significant at the .05 level except Publications-PCP (p = .083).