Thursday, August 29, 2024

China vs the West: Snow’s ‘two cultures’ theory goes global

 


Published today in University World News

In 1959, C P Snow, a British scientist, civil servant and novelist, created a stir with a lecture, “The Two Cultures and the Scientific Revolution”. The two cultures were led by natural scientists and literary intellectuals.

There was no doubt about where Snow stood with regard to the cultures. Scientists, he said, had “the future in their bones”, and he was disdainful of those who were ignorant of the basic laws of physics.

He believed that Britain’s stagnation after the Second World War was the result of the domination of public life by humanities graduates and the marginalisation of natural scientists.

Snow’s lecture was met with an equally famous ad hominem blast from the Cambridge literary critic, F R Leavis, which probably did Snow more good than harm. Leavis may, however, have had a prescient point when he talked about how science had destroyed the organic communities of the pre-industrial world.

At the time, his nostalgia was largely misplaced. Those who lived in the villages and farms of England had little reluctance about moving, as my forebears did, to the cotton mills of Derbyshire and the coal mines of South Wales. But looking at a world where every human instinct has become digital media fodder, Leavis might have been onto something.

It now looks as though something like Snow’s two cultures is emerging at the global level, with centres in China on one side and in North America and Western Europe on the other.

Sunday, August 25, 2024

India and the THE Impact Rankings


The World Economic Forum (WEF), supposedly the voice of the global economic and political elites, recently published an article by Phil Baty, Chief Global Affairs Officer of Times Higher Education (THE), about Indian universities and their apparent progress towards world-class status, shown by their participation and performance in the THE Impact Rankings, which measure universities’ contributions to the UN’s Sustainable Development Goals (SDGs).

This is misleading and irresponsible. Participation, or even a high score, in the Impact Rankings, whether overall or for specific indicators, has little, if anything, to do with the ability of universities to provide instruction in academic and professional subjects or to pursue research, scholarship, and innovation. Indeed, it is difficult to see how many of the criteria used in the Impact Rankings are relevant to attaining the SDGs.

The article begins by quoting Philip Altbach, who said in 2012 that India was a world-class country without world-class universities. That in itself is an interesting comment. If a country can be world-class without world-class universities, then one wonders if such universities are really essential.

There is a bit of bait and switch here. Whatever Altbach meant by world-class in 2012, I doubt that he was referring to performance in meeting the UN’s SDGs.

Baty goes on to claim that Indian universities are improving, as shown by the number submitting data for the THE Impact Rankings, which assess universities' contribution to the SDGs: 125, compared with 100 from Türkiye and 96 from Pakistan, out of a total of 2,152 universities around the world.

That sounds impressive. However, submissions to the Impact Rankings and other THE products are voluntary, as THE often points out. There is no real merit in filling out the forms, except perhaps demonstrating a desire to be ranked for something.

In any case, according to the uniRank site, there are 890 higher education institutions in India, 174 in Türkiye, and 176 in Pakistan. That means that the participation rate is about 14% for India, 57% for Türkiye, and 55% for Pakistan. India's participation in THE Impact Rankings is less than that of Pakistan and Türkiye, and in previous years, it has been much less than that of countries like Algeria, Iran, and Iraq.
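
As a minimal sketch of the arithmetic behind those percentages (the submission counts are the ones quoted from the article, the institution counts are from uniRank, and the rounding is mine):

```python
# Participation rates: THE Impact Rankings submissions divided by the
# uniRank count of higher education institutions in each country.
submissions = {"India": 125, "Türkiye": 100, "Pakistan": 96}
institutions = {"India": 890, "Türkiye": 174, "Pakistan": 176}

for country, submitted in submissions.items():
    rate = submitted / institutions[country] * 100
    print(f"{country}: {submitted}/{institutions[country]} = about {rate:.0f}%")
# Prints roughly: India 14%, Türkiye 57%, Pakistan 55%
```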

Nor does gaining a high score in the Impact Rankings tell us very much. Universities are ranked on their four best scores. Many universities simply submit data for five or six goals and just ignore the others, for which their actual contribution might well be zero or negative.
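
A hypothetical sketch of how a "best four of the submitted goals" rule, as described above, can reward selective submission; the scores and labels below are invented, and THE's actual weighting scheme is more elaborate than this:

```python
# Hypothetical illustration of scoring a university on its four best submitted
# SDG scores. The numbers are invented, not THE's actual methodology in detail.
def best_four_average(submitted_scores):
    """Average of the four highest scores among the SDGs a university chose to submit."""
    top_four = sorted(submitted_scores.values(), reverse=True)[:4]
    return sum(top_four) / len(top_four)

# A university submitting only its five strongest goals...
selective = {"SDG3": 92.0, "SDG6": 88.0, "SDG7": 85.0, "SDG4": 80.0, "SDG17": 78.0}
# ...outscores one reporting across eight goals, weak performances included.
broad = {f"SDG{i}": s for i, s in enumerate([90, 85, 70, 55, 40, 35, 30, 25], start=1)}

print(best_four_average(selective))  # 86.25
print(best_four_average(broad))      # 75.0
```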

These rankings rely heavily on data submitted by universities. Even if everybody concerned with the collection, transfer, and processing of information is totally honest and competent, there are often immense obstacles to data curation confronting universities in the Middle East, Africa, and Latin America. These rankings may be, in effect, little more than a measure of the ambitions of university leaders and the efficiency of their data analysts.

Moreover, much of the progress toward these goals is measured not by hard, verifiable data but by targets, programs, initiatives, partnerships, facilities, policies, measures, and projects that are subject to an opaque and, one suspects, sometimes arbitrary validation process.

Also, do the criteria measure progress toward the goals? Does producing graduates in law, civil enforcement, and related fields really contribute to peace, justice, and strong institutions? Does a large number of graduates qualified to teach say much about the quality of education?

It might be commendable that a minority of Indian universities, albeit proportionately fewer than in many other countries, have signed up for these rankings and that a few have done well for one or two of the SDGs. It is helpful to know that JSS Academy of Higher Education and Research is apparently a world beater for good health and well-being, Shoolini University of Biotechnology and Management for clean water and sanitation, and Saveetha Institute of Medical and Technical Sciences for affordable and clean energy, but does this really compensate for the pervasive perceived mediocrity of Indian higher education?

The validity of the Impact Rankings can be checked by comparing them with the UI GreenMetric Rankings, which have measured universities' commitment to environmental sustainability since 2010. Some of the indicators here, such as Energy and Climate Change and Water, are similar, although not identical, to those in the Impact Rankings, but there is almost no overlap between the best-performing universities in the two rankings. No doubt THE would say their rankings are more sophisticated but still, even the least cynical observer might wonder a bit.
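
The kind of cross-check described here can be made concrete by taking the top fifty from each ranking on a nominally similar indicator and counting how many names appear in both lists. A minimal sketch, with placeholder names rather than the actual Impact Rankings or GreenMetric results:

```python
# Sketch of an overlap check between the top performers of two rankings.
# The university names below are placeholders for illustration only.
def top_n_overlap(ranking_a, ranking_b, n=50):
    """Share of ranking_a's top n that also appears in ranking_b's top n."""
    shared = set(ranking_a[:n]) & set(ranking_b[:n])
    return len(shared) / n

impact_top = ["University A", "University B", "University C"]       # placeholder list
greenmetric_top = ["University D", "University B", "University E"]  # placeholder list
print(top_n_overlap(impact_top, greenmetric_top, n=3))  # prints 0.333..., one shared name in three
```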

The reality is that Indian universities have consistently underperformed in the various global rankings, and this is, on balance, a fairly accurate picture. It is probable that current reforms will bring widespread change, but that is still something on the horizon.

Here, THE has not been helpful. Over the last few years, it has repeatedly exaggerated the achievements of a few Indian institutions that have risen in its world or regional rankings, often due to the dysfunctional citations indicator. These include Panjab University, Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, JSS Academy of Higher Education and Research, and the Indian Institutes of Technology Ropar and Indore. This has caused resentment among leading Indian institutions, which are perplexed by such relatively marginal places zooming ahead of the highly reputable Indian Institutes of Technology of Bombay, Madras, and Delhi.

The article ignores the boycott of the THE World University Rankings by the leading Indian Institutes of Technology (IITs), which stems partly from the rankings' opacity: all the metrics are now bundled into pillars, so it is next to impossible to figure out what is causing movement in the rankings without paying THE for consultation and benchmarking.

Indian universities have not performed well in global rankings. In the Shanghai Rankings, the best performer is the Indian Institute of Science in the 401-500 band, down from 301-400 in 2023. In the CWTS Leiden Ranking, the leading university is the Indian Institute of Technology Kharagpur in 284th place. Compared to China, Japan, and South Korea, India’s performance is rather tepid. The occasional show of excellence with regard to one or two of the SDGs is hardly sufficient compensation.

The current reforms may put Indian research and higher education on track, but India’s problems go deeper than that. There is widespread evidence that the country is lagging far behind in primary and secondary education, and ultimately, that will matter much more than the exploits of universities on the way to meeting sustainability goals.

 

Wednesday, August 21, 2024

It seems that self-affirmation isn't such a big deal

 

A few years ago, I wrote about a massively cited study in Science, supposedly a leading scientific journal, which claimed that a brief intervention had significantly reduced the racial achievement gap in high school. The idea was that having low-achieving students write about values important to themselves would start a recursive process leading to an improvement in their relative academic performance. The positive effect of this self-affirmation intervention was conveniently confined to African-American students, which, I suspect, contributed to the paper's acceptance.

Having once taught English in classrooms, I was sceptical that 15 minutes of writing could have such a remarkable impact, and I wondered whether the abundance of resources, support, and skills in the school under study might have compromised the anonymity of the subjects.

Now it seems that the study was "seriously underpowered" and "always obviously wrong".

How many more politically convenient studies will turn out to be wrong or perhaps even worse than wrong? 



Friday, August 02, 2024

Forget about the Euros, this is really serious

We are told that the failure at the UEFA final was a tragedy for England. Perhaps, but something else happened early in July that should have caused some debate but passed almost unnoticed, namely the publication of the latest edition of the CWTS Leiden Ranking.

The release of the Times Higher Education (THE) World University Rankings and, to a lesser extent, of the global rankings from Shanghai, QS, and US News (USN) is often met with fulsome praise from the media and government officials when national favourites rise in the rankings and lamentations when they fall, but other rankings, often much more reliable and rigorous, are largely ignored.

This is partly because the THE and QS rankings are dominated by American and British universities. Oxford, Cambridge, and Imperial College London are all in the top ten of the overall tables in these rankings. This year there was a lot of media talk about Imperial moving ahead of Cambridge and Oxford into second place in the QS rankings, just behind MIT. According to these rankings, British universities are on top of the world, and criticism from journalists or politicians would surely be churlish in the extreme.

It would, however, be a mistake to assume that the big brand rankings are objective judges of academic or any other kind of merit. They are biased towards UK universities in a variety of obvious and subtle ways. QS, THE, and USN all include surveys of established academics, and the Shanghai Rankings count Nobel and Fields award winners, some of whom are long gone or retired. THE has three metrics based on income. THE, USN, and QS give more weight to citations than to publications, loading the dice for older and better-funded researchers.

It seems that British universities have complacently accepted the verdict of these rankings and appear unwilling to consider that they are doing anything less than perfectly. When the Sunak government proposed some vague and bland changes, the Chief Executive of the London Higher Group of Institutions complained that it was "beyond belief" that the government should have the King speak negatively of the "world-leading higher education and research sector."

It is perhaps time to look at another ranking, one produced by the Centre for Science and Technology Studies (CWTS) at Leiden University. This provides data on publications with various optional filters for subject group, country, period, and fractional counting. There are also rankings for international and industrial collaboration, open-access publications, and gender equity in research.
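
For readers unfamiliar with fractional counting, the rough idea is that a paper co-authored across several institutions is shared out among them rather than counted in full for each. A minimal sketch, with an invented publication list and an equal split per institution (a simplification of the address-based weighting CWTS describes):

```python
# Minimal sketch of full versus fractional counting of publications.
# Each paper contributes 1/k to each of the k institutions listed on it.
from collections import defaultdict

publications = [  # invented examples
    {"title": "Paper 1", "institutions": ["Oxford", "Zhejiang"]},
    {"title": "Paper 2", "institutions": ["Oxford"]},
    {"title": "Paper 3", "institutions": ["Oxford", "Harvard", "Zhejiang"]},
]

full_counts = defaultdict(float)
fractional_counts = defaultdict(float)
for pub in publications:
    k = len(pub["institutions"])
    for inst in pub["institutions"]:
        full_counts[inst] += 1            # full counting: each listed institution gets 1
        fractional_counts[inst] += 1 / k  # fractional counting: the paper is shared out

print(dict(full_counts))        # {'Oxford': 3.0, 'Zhejiang': 2.0, 'Harvard': 1.0}
print(dict(fractional_counts))  # {'Oxford': ~1.83, 'Zhejiang': ~0.83, 'Harvard': ~0.33}
```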

CWTS does not, however, publish overall rankings, sponsor spectacular events in prestigious settings, or offer consultations and benchmarking services for non-trivial sums. Consequently, it is usually neglected by the media, university heads, and the grandees of the world economy gathered at WEF forums and the like.

Turning to the latest edition and starting with the default metric, publications in the Web of Science over the period 2019-2022, we see that Zhejiang University has now overtaken Harvard and moved into first place. In the next few years, it is likely that other Chinese universities like Fudan, Peking, and Tsinghua will join Zhejiang at the peak.

But the most interesting part of the Leiden Ranking is the steady decline of British universities. Oxford is now 25th in the publications table, down from 14th in 2009-2012. That's not too bad, but it is rather different from the latest QS world rankings, where it is third, the US News Best Global Universities, where it is fourth, and THE, where it is first. Oxford is well behind several Chinese universities and also behind, among others, the University of Sao Paulo, Seoul National University, and the University of Pennsylvania.

Of course, you could say that this is a crude measure of research activity and that if we look at other metrics, such as the share of publications among the top 10% and the top 1% most cited, then, yes, Oxford does better. The problem is that such quality metrics are usually lagging indicators, so we can expect Oxford to start declining there too before long.
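
A rough sketch of why such indicators lag: a "top 10%" measure counts the share of a university's papers whose citations clear the global top-decile threshold for their field and year, and citations take years to accumulate, so recent output barely registers. The thresholds and counts below are invented:

```python
# Rough sketch of a "top 10% publications" style indicator: the share of papers
# whose citation counts clear an assumed field- and year-specific threshold.
papers = [  # invented citation counts and thresholds
    {"citations": 120, "top10_threshold": 45},  # clears the bar
    {"citations": 30,  "top10_threshold": 45},
    {"citations": 50,  "top10_threshold": 40},  # clears the bar
    {"citations": 5,   "top10_threshold": 45},
]

in_top10 = sum(p["citations"] >= p["top10_threshold"] for p in papers)
print(f"Share in top 10%: {in_top10 / len(papers):.0%}")  # Share in top 10%: 50%
```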

When we look at the broad subject tables for publications, there is further evidence of gradual decline. For Mathematics and Computer Science, Oxford is 63rd, behind Purdue University, Beijing University of Technology, and the University of New South Wales. In 2009-2012 it was 50th.

For Physical Sciences and Engineering, it is 72nd, behind the University of Tehran, Texas A & M, and Lomonosov Moscow State University. In 2009-2012 it was 29th.

It is 64th in Life and Earth Sciences, behind Lanzhou University, the Swedish University of Agricultural Science, and Colorado State University. In 2009-2012 it was 42nd. 

For Biomedical and Health Sciences, it is 39th, behind Duke, University of British Columbia, and Karolinska Institutet; in 2009-2012, it was 27th.

Finally, when it comes to the Humanities and Social Sciences, Oxford remains at the top. It is fourth in the world, just as it was in 2009-2012.

A glance at some middling British institutions shows the same picture of steady relative decline. Between 2009-2012 and 2019-2022 Reading went from 489th to 719th, Liverpool from 233rd to 302nd, and Cardiff from 190th to 328th. 

It is perhaps unfair to judge complex institutions on a single metric. Unfortunately, most science, scholarship, and everyday life depend on assigning numbers that may ignore the fine details of complex phenomena.

Also, such data does not tell us the full story about teaching and learning, but there is plenty of anecdotal evidence that British universities are not doing so well there either.

It seems that the big rankings are exaggerating the merits of British higher education. It is time to take a look at some of the global rankings produced in places like the Netherlands (Leiden Ranking), Spain (SCImago), Türkiye (URAP), Georgia (RUR), and Taiwan (NTU rankings).