Wednesday, August 25, 2021

THE World University Rankings: Indicator Correlations

I was going to wait until next week to do this, but the publication of the latest edition of the THE world rankings is approaching and there may be a new methodology.

The current THE methodology is based on five indicators or indicator groups: Teaching (five indicators), Research (three indicators), Citations, Industry Income, and International Outlook (three indicators).
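For reference, the published weightings are Teaching 30%, Research 30%, Citations 30%, International Outlook 7.5%, and Industry Income 2.5%, so the weighted total analysed below is, in effect, the following sum. A minimal sketch in Python; the column names are mine, not THE's.

import pandas as pd

# Published THE pillar weightings: Teaching 30%, Research 30%,
# Citations 30%, International Outlook 7.5%, Industry Income 2.5%.
WEIGHTS = {"teaching": 0.30, "research": 0.30, "citations": 0.30,
           "international": 0.075, "industry": 0.025}

def weighted_total(scores: pd.DataFrame) -> pd.Series:
    # Weighted sum of the five pillar scores (each on a 0-100 scale).
    return sum(scores[col] * w for col, w in WEIGHTS.items())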

Looking at the analysis of 1526 cases (using PSPP), we can see that the correlation between Teaching and Research is very high, .89, and moderate between those two and Citations. Teaching and Research both include reputation surveys of teaching and research, which have been shown to yield very similar results. Also, Teaching includes Institutional Income and Research includes Research Income, two measures that are likely to be closely related.

The Citations indicator has a moderate correlation with Teaching and Research, as noted, and also with International Outlook.

The correlations between Industry Income and both Teaching and Research are moderate, while those with Citations and International Outlook are low, .20 and .18 respectively. The Industry Income indicator is close to worthless, since the definition of income is apparently interpreted in several different ways and may have little relation to financial reality. International Outlook correlates modestly with the other indicators except for Industry Income.

It seems there is little point in distinguishing between the Teaching and Research indicators since they are both influenced by income, reputation, and large doctoral programmes. The Industry Income indicator has little validity and will probably, with very good reason, be removed from the THE rankings.
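The PSPP syntax and output follow. For readers who prefer Python, a roughly equivalent sketch using pandas, assuming the indicator scores have been extracted to a hypothetical CSV with one row per university:

import pandas as pd

# Hypothetical file of extracted indicator scores, one row per university.
df = pd.read_csv("the_wur_2021_indicators.csv")
cols = ["teaching", "research", "citations",
        "industry", "international", "weightedtotal"]
# Pearson correlation matrix, the equivalent of PSPP's
# CORRELATION /VARIABLES = ... /PRINT = TWOTAIL SIG.
print(df[cols].corr(method="pearson").round(2))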


CORRELATIONS

CORRELATION
/VARIABLES = teaching research citations industry international weightedtotal
/PRINT = TWOTAIL SIG.
Correlations (Pearson r; N = 1526 for every pair; all correlations significant at p < .001, two-tailed)

                 teaching  research  citations  industry  international  weightedtotal
teaching             1.00       .89        .51       .45            .38             .83
research              .89      1.00        .59       .53            .54             .90
citations             .51       .59       1.00       .20            .57             .87
industry              .45       .53        .20      1.00            .18             .42
international         .38       .54        .57       .18           1.00             .65
weightedtotal         .83       .90        .87       .42            .65            1.00


Most people are probably more concerned with distinctions among the world's elite or would-be elite universities. Turning to the top 200 of the THE rankings, the correlation between Teaching and Research is again very high, .90, suggesting that these are measuring virtually the same thing.

The Citations indicator has a low correlation with International Outlook (.22), low and insignificant correlations with Teaching and Research (.02 and .06), and a negative correlation with Industry Income (-.30).

Industry Income has low correlations with Research and Teaching (.28 and .23), a negative correlation with Citations (-.30), and a negative and insignificant correlation with International Outlook (-.10).

It would seem that the THE world rankings are not helpful for evaluating the quality of the global elite. A new methodology will be most welcome.
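Since the argument here turns on statistical significance as well as the size of the correlations, here is a sketch of how the two-tailed p-values in the table below could be reproduced, reusing the hypothetical CSV from the sketch above:

from itertools import combinations
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("the_wur_2021_indicators.csv")  # hypothetical extract
top200 = df.head(200)  # assumes rows are sorted by overall rank
cols = ["teaching", "research", "citations",
        "industry", "international", "weightedtotal"]
# Pearson r and two-tailed p-value for every pair of indicators.
for a, b in combinations(cols, 2):
    r, p = pearsonr(top200[a], top200[b])
    print(f"{a} vs {b}: r = {r:+.2f}, p = {p:.3f}")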


CORRELATIONS

CORRELATION
/VARIABLES = teaching research citations industry international weightedtotal
/PRINT = TWOTAIL SIG.
Correlations, top 200 (Pearson r, two-tailed p in parentheses; N = 200 for every pair)

               teaching     research     citations    industry     international  weightedtotal
teaching       1.00         .90 (.000)   .02 (.768)   .23 (.001)   -.11 (.114)    .89 (.000)
research       .90 (.000)   1.00         .06 (.411)   .28 (.000)    .05 (.471)    .92 (.000)
citations      .02 (.768)   .06 (.411)   1.00         -.30 (.000)   .22 (.001)    .39 (.000)
industry       .23 (.001)   .28 (.000)   -.30 (.000)  1.00         -.10 (.149)    .17 (.014)
international  -.11 (.114)  .05 (.471)   .22 (.001)   -.10 (.149)   1.00          .17 (.017)
weightedtotal  .89 (.000)   .92 (.000)   .39 (.000)   .17 (.014)    .17 (.017)    1.00





Monday, August 23, 2021

Shanghai Rankings: Correlations Between Indicators

This is, I hope, the first of a series. Maybe THE and QS next week.

If we want to compare the utility of university rankings, one attribute to consider is internal consistency. Here, the correlations between the various indicators can tell us a lot. If the correlation between a pair of indicators is 0.90 or above, we can assume that those indicators are essentially measuring the same thing.

On the other hand, if there is no correlation, or one that is low, insignificant, or even negative, we might have doubts about the validity of one or both of the indicators. It is reasonable to expect that if a university scores well for one metric it will do well for others, provided both represent highly valued attributes. A university producing high-quality research or collecting large numbers of citations should also score well for reputation. If it does not, there might be a methodological problem somewhere.

So, we can assume that if the indicators are valid and are not measuring the same thing, the correlations between them will probably fall somewhere between 0.5 and 0.9.
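That rule of thumb is easy to mechanise. A sketch that takes any correlation matrix, such as those below, and flags the pairs falling outside the 0.5 to 0.9 band:

import pandas as pd

def flag_pairs(corr: pd.DataFrame) -> None:
    # Apply the rule of thumb above: r >= .90 suggests redundancy,
    # r < .50 invites doubt about one or both indicators.
    cols = list(corr.columns)
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            r = corr.loc[a, b]
            if r >= 0.90:
                print(f"{a} / {b}: r = {r:.2f} -- probably the same thing")
            elif r < 0.50:
                print(f"{a} / {b}: r = {r:.2f} -- validity in doubt")

# Usage: flag_pairs(df[cols].corr())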

Let's have a look at the Shanghai ARWU for 2019. The indicator scores were extracted and analysed using PSPP. (It is very difficult to analyse the 2020 edition because of a recent change in presentation.) These rankings have six indicators: alumni winning Nobel Prizes and Fields Medals, faculty winning those awards, papers in Nature and Science, highly cited researchers, publications in the Web of Science, and productivity per capita.

Looking at all 1000 institutions in the Shanghai Rankings, Alumni, Awards, and Nature and Science all correlate well with each other. Highly Cited Researchers correlates well with Nature and Science and Publications but less so with Alumni and Awards. Nature and Science correlates well with all the other indicators.

The Publications indicator does not correlate well with Alumni and Awards. This is to be expected since Publications refers to 2018 while the Alumni and Awards indicators go back several decades.

Overall, the correlations are quite good, although there is a noticeable divergence between Publications and Alumni and Awards, which cover very different time periods.
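One practical note on the output below: eight of the 1000 institutions lack a Nature and Science score, so correlations involving that indicator are computed over 992 cases. pandas handles this the same way, by pairwise deletion; a sketch, again with a hypothetical CSV of extracted scores:

import pandas as pd

arwu = pd.read_csv("arwu_2019_indicators.csv")  # hypothetical extract
cols = ["alumni", "awards", "highlycited", "naturescience",
        "publications", "pcp", "finaltotal"]
# corr() computes each pairwise correlation over the rows where both
# columns are present, hence N = 992 for naturescience pairs.
print(arwu[cols].corr().round(2))
# The pairwise Ns themselves:
present = arwu[cols].notna().astype(int)
print(present.T @ present)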

CORRELATIONS

CORRELATION
/VARIABLES = alumni awards highlycited naturescience publications pcp finaltotal
/PRINT = TWOTAIL NOSIG.
Correlations (Pearson r; N = 1000, except pairs involving naturescience, where N = 992; all correlations significant at p < .001, two-tailed)

               alumni  awards  highlycited  naturescience  publications   pcp  finaltotal
alumni           1.00     .78          .51            .72           .45   .63         .76
awards            .78    1.00          .57            .75           .44   .67         .82
highlycited       .51     .57         1.00            .79           .72   .64         .87
naturescience     .72     .75          .79           1.00           .69   .73         .93
publications      .45     .44          .72            .69          1.00   .50         .81
pcp               .63     .67          .64            .73           .50  1.00         .78
finaltotal        .76     .82          .87            .93           .81   .78        1.00



Most observers of ARWU and other global rankings are interested in the top levels, where elite schools and national flagships jostle for dominance. Analysing correlations among indicators for the top 200 in ARWU, there are high correlations between Alumni, Awards, and Nature and Science, ranging from .69 to .79, with Productivity per Capita not far behind (.62 to .67).

There is also a high correlation of .72 between Nature and Science and Highly Cited Researchers. It is, however, noticeable that the correlation between Publications and the other indicators is moderate for Highly Cited Researchers (.57) and very low for Productivity per Capita, Alumni, and Awards (.12 to .21).

It seems that, especially in the top 200 places, a big gap is opening between the old traditional elite of Oxbridge, the Ivy League and the like, who continue to get credit for long-dead Nobel laureates, and the new rising stars of Asia and Europe, who are surging ahead in WOS papers and beginning to produce or recruit superstar researchers.




Correlations, top 200 (Pearson r, two-tailed p in parentheses; N = 200, except pairs involving naturescience, where N = 199)

               alumni       awards       highlycited  naturescience  publications  pcp          finaltotal
alumni         1.00         .79 (.000)   .36 (.000)   .69 (.000)     .21 (.003)    .62 (.000)   .78 (.000)
awards         .79 (.000)   1.00         .44 (.000)   .74 (.000)     .14 (.044)    .67 (.000)   .84 (.000)
highlycited    .36 (.000)   .44 (.000)   1.00         .72 (.000)     .57 (.000)    .49 (.000)   .78 (.000)
naturescience  .69 (.000)   .74 (.000)   .72 (.000)   1.00           .44 (.000)    .65 (.000)   .92 (.000)
publications   .21 (.003)   .14 (.044)   .57 (.000)   .44 (.000)     1.00          .12 (.083)   .55 (.000)
pcp            .62 (.000)   .67 (.000)   .49 (.000)   .65 (.000)     .12 (.083)    1.00         .72 (.000)
finaltotal     .78 (.000)   .84 (.000)   .78 (.000)   .92 (.000)     .55 (.000)    .72 (.000)   1.00


Thursday, August 12, 2021

THE's Caucus Ranking


In Alice in Wonderland there is a "caucus race" in which everyone runs around frantically in different directions and eventually everyone wins a prize. Unfortunately, there are not quite enough sweets to go around as prizes, so poor Alice has to make do with her thimble, which she gives to the Dodo, who then presents it to her.

It seems that THE has come up with a caucus ranking. In the THE Impact Rankings universities expend a lot of energy, do a lot of amazing, astounding, and very different things in very different ways, and a lot of them get some sort of prize for something.

These rankings are another example of the growing complexity of the ranking scene. Global university rankings used to be simple. Shanghai Jiao Tong University started its ARWU rankings in 2003 with just 500 ranked institutions and six indicators. Since then the number of rankings has proliferated and there have been more and more spin-offs: young university, regional, business school, national, and subject rankings, and so on, with more indicators and increasingly complex and often opaque methodologies. We are getting to the point where a university is incompetent or excessively honest if it cannot find a ranking indicator, perhaps finely sliced by age, size, mission, and/or subject, in which it can scrape into the top five hundred, or at the very least the top thousand, and therefore into the top 3 or 4% in the world.

Some of the recent rankings seem redundant or pointless, going over the same old ground or making granular distinctions that are of little interest. It is no doubt nice to be acclaimed as the best young university for social science in South Asia, and maybe that can be used in advertising, but is it really necessary?

Now we have the third edition of the THE Impact Rankings. These, as THE boasts, are the only rankings to measure universities according to their commitment to the UN's Sustainable Development Goals. But that is not very original. Universitas Indonesia's GreenMetric was doing something similar several years ago, although not tied explicitly to the UN goals. It has indicators related to energy, infrastructure, climate change, water, waste, transportation, and education.

It seems a little odd that the UN should be accepted as the authority on the achievement of gender equality when its "peacekeeping" forces have repeatedly been accused of rape and sexual assault. Is the UN really the right body to lay down guidelines about health and well-being considering the dubious performance of the WHO during the pandemic crisis?

One also wonders why THE should venture into ranking contributions to sustainability when, after a decade, it has still failed to come up with a credible citations indicator, which would seem a much easier task.

It is noticeable that participation in these rankings is very uneven. There are 1,118 universities in the latest edition but only 13 from China and only 45 from the USA, of which precisely two are in California, supposedly the homeland of environmental consciousness. The higher education elite of the USA, UK, and China are largely absent. On the other hand, Iraq, Egypt, Brazil, and Iran are much better represented here than in the research-based rankings.

The top of these rankings is dominated by English-speaking universities outside the USA. The overall top twenty contains seven Australian, five British, three Canadian, and one each from Denmark, Ireland, the USA, New Zealand and Italy.

The popularity of the Impact Rankings seems linked to the current problems of many western universities. Public funding has been drying up, academic standards eroding, research output stagnating. Many universities have resorted to importing international, often Chinese, students and faculty to keep up standards, bring in tuition money, fill up postgraduate classes, and do the work of junior researchers.

The international students and researchers have left or are leaving and may not return in significant numbers, although THE "believes" that they will. This is happening as universities trying to reopen face the prospect of unprepared students, dwindling funds, and a lack of interest from employers. Eventually this will impact the position of universities in the global ranking systems. Those universities once dependent on international researchers for their reputation and ranking scores will start to suffer.

It looks as though western universities are losing interest in research and instruction in professional and academic subjects and are reinventing themselves as purveyors of transformative experiences for the children of the affluent and ambitious, guardians of the purity of cultural discourse, or saviours of the planet.

The Financial Post of Canada has published a caustic comment on the joyful proclamations by Queen's University about its ascent to fifth place in the Impact Rankings. A trustee, John Stackhouse, claimed that its success there meant that it was fulfilling "the true purpose of a university." The article observes that those "who believe the true purpose of a university is to pursue academic excellence and ensure that students who pass through its doors have the skills to build prosperous lives for themselves as productive members of their community, might differ." In the THE World University Rankings and others, Queen's is doing much less well.

The methodology of the Impact Rankings does little to inspire confidence. For each of the indicators there is a weighting of 27% for bibliometric measures, such as the amount of research on hunger, health, water, or clean energy. It is easy to see how this could be gamed. Then there is a variety of data submitted by the institutions themselves. Even if every university administrator is a sea-green incorruptible, there are many ways in which such data can be massaged or stretched.

Added to that, THE does not appear to be doing rigorous validation. Universities are not assessed on the same things, except for the Partnership for the Goals indicator. The University of Sydney, second overall this year, is ranked for clean water and sanitation, sustainable cities and communities, and life on land. Clean water and sanitation includes supporting water conservation off campus and the reuse of water across the university.

RMIT University, in third place, is ranked for decent work and economic growth, industry innovation and infrastructure, and reduced inequalities. Decent work and economic growth includes expenditure per employee and policies for ending discrimination. So, essentially, THE is trying to figure out whether Sydney is better at reusing water than RMIT is at announcing policies that are supposed to reduce discrimination. Comparing research output and impact across disciplines is, as THE ought to know, far from easy. Comparing performance in water use with discrimination policy would seem close to impossible, especially since THE does not always use objective criteria but merely examples of best practice. Evidence "is evaluated against a set of criteria and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive -- we are looking for examples that demonstrate best practice at the institutions concerned."

But it seems that a substantial number of universities will find these rankings a useful tool in their quest for income and publicity, and there will be more editions, and probably sub-rankings of one sort or another, for years to come.