Wednesday, August 21, 2024

It seems that self-affirmation isn't such a big deal

 

A few years ago, I wrote about a massively cited study in Science, supposedly a leading scientific journal, that claimed an intervention could significantly reduce the racial achievement gap in high school. The idea was that having low-achieving students write about values important to them would start a recursive process leading to an improvement in their relative academic performance. The positive effect of this self-affirmation intervention was conveniently confined to African-American students, which, I suspect, contributed to the paper's acceptance.

I was sceptical, having once taught English in classrooms, that 15 minutes of writing could have such a remarkable impact, and I wondered whether the abundance of resources, support, and skills in the school under study might have compromised the anonymity of the subjects.

Now it seems that the study was "seriously underpowered" and "always obviously wrong".
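
For anyone unfamiliar with the jargon, "underpowered" means the study had little chance of detecting a real effect of plausible size, so any significant result it did find was likely to be noise. Here is a quick sketch of the arithmetic, with purely hypothetical numbers rather than anything from the study itself:

```python
# Illustrative only: what "seriously underpowered" looks like.
# The effect size and sample size here are hypothetical, not taken
# from the study discussed above.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Chance of detecting a small effect (Cohen's d = 0.2) with 40 students
# per group at the conventional alpha = 0.05:
power = analysis.power(effect_size=0.2, nobs1=40, alpha=0.05)
print(f"Power with 40 per group: {power:.2f}")  # about 0.14

# Sample size per group needed to reach the conventional 80% power:
n = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05)
print(f"Needed per group for 80% power: {n:.0f}")  # about 394
```

A study with power of 0.14 will miss a real effect six times out of seven, and the "significant" effects it does find will tend to be exaggerated.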

How many more politically convenient studies will turn out to be wrong or perhaps even worse than wrong? 



Friday, August 02, 2024

Forget about the Euros, this is really serious

We are told that England's defeat in the UEFA final was a tragedy. Perhaps, but something else happened early in July that should have caused some debate but passed almost unnoticed: the publication of the latest edition of the CWTS Leiden Ranking.

The release of the Times Higher Education (THE) World University rankings and, to a lesser extent, the global rankings from Shanghai, QS, and US News (USN) are often met with fulsome praise from the media and government officials when national favourites rise in the rankings and lamentations when they fall, but other rankings, often much more reliable and rigorous, are largely ignored.

This is partly because the THE and QS rankings are dominated by American and British universities. Oxford, Cambridge, and Imperial College London are all in the top ten of the overall tables in both rankings. This year there was a lot of media talk about Imperial moving ahead of Cambridge and Oxford into second place in the QS rankings, just behind MIT. According to these rankings, British universities are on top of the world, and criticism from journalists or politicians would surely be churlish in the extreme.

It would, however, be a mistake to assume that the big-brand rankings are objective judges of academic merit or of any other kind. They are biased towards UK universities in a variety of obvious and subtle ways. QS, THE, and USN all include surveys of established academics, and the Shanghai Rankings count Nobel laureates and Fields medallists, some of whom are long gone or retired. THE has three metrics based on income. THE, USN, and QS give more weight to citations than to publications, loading the dice for older and better-funded researchers.

It seems that British universities have complacently accepted the verdict of these rankings and appear unwilling to consider that they might be doing anything less than perfectly. When the Sunak government proposed some vague and bland changes, the Chief Executive of the London Higher group of institutions complained that it was "beyond belief" that the government should have the King speak negatively of the "world-leading higher education and research sector."

It is perhaps time to look at another ranking, one produced by the Centre for Science and Technology Studies (CWTS) at Leiden University. This provides data on publications with various optional filters for subject group, country, period, and fractional counting. There are also rankings for international and industrial collaboration, open-access publications, and gender equity in research.

CWTS does not, however, publish overall rankings, sponsor spectacular events in prestigious settings, or offer consultations and benchmarking services for non-trivial sums. Consequently, it is usually neglected by the media, university heads, and the grandees of the world economy gathered at WEF forums and the like.

Turning to the latest edition, and starting with the default metric, publications in the Web of Science over the period 2019-2022, we see that Zhejiang University has now overtaken Harvard and moved into first place. In the next few years, it is likely that other Chinese universities such as Fudan, Peking, and Tsinghua will join Zhejiang at the peak.

But the most interesting part of the Leiden Ranking is the steady decline of British universities. Oxford is now 25th in the publications table, down from 14th in 2009-2012. That's not too bad, but it is rather different from the latest QS world rankings, where it is third, US News Best Global Universities, where it is fourth, and THE, where it is first. Oxford is well behind several Chinese universities and also behind, among others, the University of Sao Paulo, Seoul National University, and the University of Pennsylvania.

Of course, you could say that this is a crude measure of research activity and that if we look at other metrics, such as the share of publications among the top 10% and top 1% most cited, then, yes, Oxford does better. The problem is that these quality metrics are usually lagging indicators, so we can expect Oxford to start declining there too before long.

When we look at the broad subject tables for publications, there is further evidence of gradual decline. For Mathematics and Computer Science, Oxford is 63rd, behind Purdue University, Beijing University of Technology, and the University of New South Wales. In 2009-2012 it was 50th.

For Physical Sciences and Engineering, it is 72nd, behind the University of Tehran, Texas A&M, and Lomonosov Moscow State University. In 2009-2012 it was 29th.

It is 64th in Life and Earth Sciences, behind Lanzhou University, the Swedish University of Agricultural Sciences, and Colorado State University. In 2009-2012 it was 42nd.

For Biomedical and Health Sciences, it is 39th, behind Duke, University of British Columbia, and Karolinska Institutet; in 2009-2012, it was 27th.

Finally, when it comes to the Humanities and Social Sciences, Oxford remains at the top. It is fourth in the world, just as it was in 2009-2012.

A glance at some middling British institutions shows the same picture of steady relative decline. Between 2009-2012 and 2019-2022, Reading went from 489th to 719th, Liverpool from 233rd to 302nd, and Cardiff from 190th to 328th.

It is perhaps unfair to judge complex institutions by a single metric. Unfortunately, though, much of science, scholarship, and everyday life rests on assigning numbers that inevitably ignore the finer details of complex phenomena.

Such data also does not tell us the full story about teaching and learning, but there is plenty of anecdotal evidence that British universities are not doing so well there either.

It seems that the big rankings are exaggerating the merits of British higher education. It is time to take a look at some of the global rankings produced in places like the Netherlands (Leiden Ranking), Spain (SCImago), Türkiye (URAP), Georgia (RUR), and Taiwan (NTU Rankings).



Sunday, July 28, 2024

Are British Universities Really Underfunded?

I noticed an item on LinkedIn recently by Phil Baty, Chief Global Affairs Officer at Times Higher Education (THE), claiming that British universities are seriously underfunded and that their world-class achievements are endangered.

He reports that a brilliant data analyst has revealed that inflation has eroded the value of the tuition fees that UK universities are allowed to charge and that costs have dramatically increased. 
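
The arithmetic of that claim is simple enough: a fee frozen in cash terms loses value at compound inflation. A sketch with hypothetical numbers (the fee level and inflation rate here are assumptions for illustration, not figures from the analysis Baty cites):

```python
# Illustrative only: the real value of a cash-frozen tuition fee.
# The fee and the inflation rate are assumptions for the example.
fee = 9_250        # nominal annual fee, frozen in cash terms
inflation = 0.04   # assumed average annual inflation
years = 7

real_value = fee / (1 + inflation) ** years
print(f"Real value after {years} years: {real_value:,.0f}")  # about 7,029
```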

Then we have a graph from a THE data guru comparing British university performance on the 18 metrics used in the current THE world rankings with that of their international peers. The UK is well ahead of the average of the top 500 universities in the most recent world rankings on the international outlook indicators, significantly ahead for research strength, field-weighted citations, and publications, and slightly ahead for research excellence, research reputation, research influence, and patents.

However, when it comes to institutional income, research income, and industry income, British universities are apparently way behind the rest of the world. So, it seems that THE has conclusively demonstrated that UK universities are seriously short of money.

But there are a few things that need to be considered.

First, the THE income indicators are all divided by the number of academic staff. To do well on these measures, a university could have substantial income, or at least report that it did, or it could reduce the number of staff it reports.

In other words, a university that decided to spend its money recruiting teaching and/or research staff would fall in the THE rankings, while one that sacked a lot of teachers and researchers would be rewarded with a significant improvement. You might think that is a bit bonkers, but that is the unintended consequence of the THE methodology. I do not know which applies to British universities in general or in particular, but it would be interesting to see a breakdown of the data.
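
To make the arithmetic concrete, here is a minimal sketch with invented figures; no real university's numbers are used:

```python
# Illustrative only: how a per-staff income indicator responds to staff
# cuts. All figures are invented for the example.

def income_per_staff(income_millions: float, academic_staff: int) -> float:
    """The kind of per-capita normalisation used by THE's income indicators."""
    return income_millions / academic_staff

before = income_per_staff(600, 3000)  # 0.20m per staff member
after = income_per_staff(600, 2000)   # 0.30m after shedding 1,000 staff

print(f"Before cuts: {before:.2f}m per head")
print(f"After cuts:  {after:.2f}m per head ({after / before - 1:.0%} higher)")
```

The income has not changed by a penny, but the indicator improves by 50% simply because there are fewer staff to divide it by.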

Also, remember that the income indicators are based on data submitted by the institutions themselves. It would be unwise to assume that these are completely valid and accurate. A few years ago, Alex Usher of Higher Education Strategy Associates published an article showing that there were some serious problems with THE's industry income indicator. I am not sure whether it has improved since then.

Also, we should note that 55 UK universities are in the current THE world top 500. According to Webometrics, there are 31,657 universities worldwide and 355 in the UK. THE is, in effect, claiming that the top 15.49% of British universities, according to THE's criteria, are underfunded compared to the top 1.58% of world universities in general. 
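
The percentages follow directly from those counts:

```python
# Reproducing the shares from the Webometrics counts quoted above.
uk_in_top_500 = 55
uk_universities = 355
world_in_top_500 = 500
world_universities = 31_657

print(f"UK share:    {uk_in_top_500 / uk_universities:.2%}")        # 15.49%
print(f"World share: {world_in_top_500 / world_universities:.2%}")  # 1.58%
```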

Before signing off, I should note that the graph is instructive in another way: it shows that the rankings are massively biased toward British universities. Consider the weighting given to the various metrics.

The International Outlook pillar has a weighting of 7.5%; research quality, which is essentially citations, 30%; teaching and research reputation, 33%; and publications per staff, 5.5%. These are all criteria on which British higher education does better than the world average.

In contrast, the three income metrics, where UK universities do badly, are given weightings of just 2.5%, 5.5%, and 2% respectively, a total of 10%.

If THE decided to shift some of its weighting from reputation to income, or to doctoral education, where the UK sector also does badly, the rank of British universities would fall very noticeably.
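
A stylised calculation shows the mechanism. Everything below is invented for illustration: a hypothetical university scores well on the heavily weighted metrics and poorly on income, and then ten points of weighting are moved from reputation to income.

```python
# Illustrative only: the effect of shifting weight from reputation to
# income for a university with a UK-style profile. Scores are invented;
# the weights are simplified from the figures quoted above, with the
# leftover metrics lumped together as "other".

def weighted_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    return sum(scores[k] * weights[k] for k in weights)

scores = {"reputation": 90, "research_quality": 85, "international": 95,
          "publications": 80, "income": 40, "other": 70}

current = {"reputation": 0.33, "research_quality": 0.30, "international": 0.075,
           "publications": 0.055, "income": 0.10, "other": 0.14}

# Move ten points of weighting from reputation to the income metrics.
shifted = dict(current, reputation=0.23, income=0.20)

print(f"Current weights: {weighted_score(scores, current):.1f}")  # 80.5
print(f"Shifted weights: {weighted_score(scores, shifted):.1f}")  # 75.5
# Ten points of weight move from a metric scored at 90 to one scored
# at 40, so the overall score drops by 0.10 * (90 - 40) = 5 points.
```

A five-point drop in the overall score could easily translate into a fall of many places in the crowded upper reaches of the table.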