Friday, May 11, 2018

Ranking Insights from Russia

The ranking industry is expanding and new rankings appear all the time. Most global rankings measure research publications and citations. Others try to add indicators to the mix that might have something to do with teaching and learning. There is now a ranking that tries to capture various third missions.

The Round University Rankings, published in Russia, are in the tradition of holistic rankings. They give a 40% weighting to research, 40% to teaching, 10% to international diversity and 10% to financial sustainability. Each group contains five equally weighted indicators. The data are derived from Clarivate Analytics, which also supplies data for the US News Best Global Universities rankings.
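To make the arithmetic concrete, here is a minimal Python sketch of how such a weighting scheme combines indicator scores. The group weights are those RUR publishes; the assumption that a group score is the simple mean of its five indicators, and all the indicator values, are invented for illustration.

GROUP_WEIGHTS = {
    "teaching": 0.40,
    "research": 0.40,
    "international_diversity": 0.10,
    "financial_sustainability": 0.10,
}

def composite_score(scores_by_group):
    """Weighted sum of group scores; each group score is the mean of its five indicators."""
    total = 0.0
    for group, weight in GROUP_WEIGHTS.items():
        indicators = scores_by_group[group]
        assert len(indicators) == 5  # each RUR group has five equally weighted indicators
        total += weight * sum(indicators) / len(indicators)
    return total

# A hypothetical university, indicator scores on a 0-100 scale:
example = {
    "teaching": [72, 65, 80, 58, 90],                  # mean 73.0
    "research": [85, 70, 88, 60, 75],                  # mean 75.6
    "international_diversity": [40, 55, 62, 30, 45],   # mean 46.4
    "financial_sustainability": [50, 66, 71, 48, 59],  # mean 58.8
}
print(round(composite_score(example), 1))  # 70.0

Since the five indicators in a group carry equal weight, this is the same as giving each research or teaching indicator 8% of the total and each diversity or sustainability indicator 2%.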

These rankings are similar to the THE rankings in that they attempt to assess quality rather than quantity, but they have 20 indicators instead of 13 and assign sensible weightings. Unfortunately, they receive only a fraction of the attention given to the THE rankings.

They are, however, very valuable since they dig deeper into the data than other global rankings. They also show that there is a downside to measures of quality and that data submitted directly by institutions should be treated with caution and perhaps scepticism.

Here are the top universities for each of the RUR indicators.

Teaching
Academic staff per students: VIB (Flemish Institute of Biotechnology), Belgium
Academic staff per bachelor degrees awarded: University of Valladolid, Spain
Doctoral degrees per academic staff: Kurdistan University of Medical Science, Iran
Doctoral degrees per bachelor degrees awarded: Jawaharlal Nehru University, India
World teaching reputation: Harvard University, USA.

Research
Citations per academic and research staff: Harvard
Doctoral degrees per admitted PhD: Al Farabi Kazakh National University
Normalised citation impact: Rockefeller University, USA
Share of international co-authored papers: Free University of Berlin
World research reputation: Harvard.

International diversity
Share of international academic staff: American University of Sharjah, UAE
Share of international students: American University of Sharjah
Share of international co-authored papers: Innopolis University, Russia
Reputation outside region: Voronezh State Technical University, Russia
International Level: EPF Lausanne, Switzerland.

Financial sustainability:
Institutional income per academic staff: Universidade Federal do Ceará, Brazil
Institutional income per student: Rockefeller University
Papers per research income: Novosibirsk State University of Economics and Management, Russia
Research income per academic and research staff: Istanbul Technical University, Turkey
Research income per institutional income: A C Camargo Cancer Center, Brazil.

There are some surprising results here. The most obvious is Voronezh State Technical University, which is first for reputation outside its region (Asia, Europe and so on), even though its overall scores for reputation and for international diversity are very low. The other top universities for this metric are just what you would expect: Harvard, MIT, Stanford, Oxford and so on. I wonder whether there is some sort of bug in the survey procedure, perhaps something like the university's supporters being assigned to Asia and therefore counted as out of region. The university is also in second place in the world for papers per research income despite very low scores for the other research indicators.

There are other oddities, such as Novosibirsk State University of Economics and Management placed first for papers per research income and Universidade Federal do Ceará first for institutional income per academic staff. These may result from anomalies in the procedures for reporting and analysing data, possibly including problems in collecting data on income and staff.

It also seems that medical schools and specialist or predominantly postgraduate institutions such as Rockefeller University, the Kurdistan University of Medical Science, Jawaharlal Nehru University and VIB have a big advantage on these indicators, since they tend to have favourable faculty-student ratios, sometimes boosted by large numbers of clinical and research-only staff, and a large proportion of doctoral students.

Jawaharlal Nehru University is a mainly postgraduate university, so a high placing for academic staff per bachelor degrees awarded is not unexpected, although I am surprised that it is ahead of Yale and Princeton. I must admit that the third place here for the University of Baghdad needs some explanation.

The indicator doctoral degrees per admitted PhD might identify universities that do a good job of selection and training and get large numbers of doctoral candidates through the system. Or perhaps it identifies universities where doctoral programmes are so lacking in rigour that nearly everybody admitted can get a degree. The top ten for this indicator includes De Montfort University, Shakarim University, Kingston University, and the University of Westminster, none of which are famous for research excellence across the range of disciplines.

Measures of international diversity have become a staple of global rankings since the data are fairly easy to collect. The problem is that international orientation may have something to do with quality, but it may also simply be a necessary attribute of being in a small country next to larger countries with the same or similar language and culture. The top ten for the international students indicator includes the Central European University and the American University of Sharjah. For international faculty it includes the University of Macau and Qatar University.

To conclude, these indicators suggest that self-submitted institutional data should be used sparingly and that data from third-party sources may be preferable. Also, while ranking by quality instead of quantity is sometimes advisable, it also means that anomalies and outliers are more likely to appear.

Tuesday, May 01, 2018

World Top 20 Project

The International Rankings Expert Group (IREG) has produced an inventory of international rankings that is testimony to the enormous interest in comparing and classifying universities around the world.

In addition to the rankings that were included, there are several "also-rans": rankings that were not counted because they included only one indicator, had been published only once, or provided insufficient information about methodology.

One of these is the World Top 20 Project, whose founder and Executive Director is Albert N Mitchell II. The website claims to rank 500 universities according to seven criteria and to use data from institutional databases and educational publications to construct eight regional rankings. The scores are then compared with those from the US News Best Global Universities, the THE World University Rankings, the QS World University Rankings, and the Center for World University Rankings to select the global top twenty.

The top five universities in each region are listed. Most seem quite sensible -- Cape Town is first in Africa, Tokyo in Asia, Harvard in North America and Cambridge in Europe -- but there are no Mexican universities in the top five in Central America.

It is interesting that this site has included CWUR rather than the Shanghai ARWU in the big four world rankings. Could this be the start of a new trend?

There is no information about ranks or scores for the various indicators or details about the sources of data. It is also difficult to see how information about things like career development facilities, disability access and low-income outreach could be collected from the universities mentioned. 

Unless further information about sources and methods appears, it seems that there is no need to discuss these rankings any further.

Friday, April 13, 2018

At last. A Ranking With Cambridge at the Bottom

Cambridge usually does well in national and global rankings. The most recent ARWU from Shanghai puts it in third place, and although it does less well in other rankings it always seems to be in the top twenty. It has suffered at the hands of the citations indicator in the THE world rankings, which seems to think that Anglia Ruskin University, formerly the Cambridgeshire College of Arts and Technology, has a greater global research impact, but nobody takes that seriously.

So it is a surprise to find an article in the Guardian about a ranking from the Higher Education Policy Institute (HEPI) in the UK that actually puts Cambridge at the bottom and the University of Hull at the top. Near the bottom are other members of the Russell Group: Oxford, Bristol and LSE.

At the top we find Edge Hill, Cardiff Metropolitan and, of course, Anglia Ruskin Universities.

The ranking was part of a report written for HEPI by Iain Martin, vice-chancellor of Anglia Ruskin University, that supposedly rates universities for fair access, that is, having a student intake that mirrors society as a whole. It compares the percentage of school leavers in each local authority area who participate in higher education with the percentage admitted by specific universities. Universities rank highly if they draw students from areas where relatively few school leavers go to university. The rationale is the claim that learning outcomes are improved when people of diverse backgrounds study together.
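The Guardian article does not give the exact formula, but an intake-weighted participation index along the lines described might look like the Python sketch below; the area names, participation rates and intake shares are all invented, and HEPI's actual calculation may differ.

def intake_weighted_participation(intake_shares, area_participation):
    """Mean HE participation rate of the areas a university recruits from,
    weighted by the share of its intake drawn from each area."""
    assert abs(sum(intake_shares.values()) - 1.0) < 1e-9
    return sum(share * area_participation[area]
               for area, share in intake_shares.items())

# Invented higher education participation rates for three local authority areas:
area_participation = {"Area A": 0.20, "Area B": 0.45, "Area C": 0.60}

# Two hypothetical universities with different recruitment patterns:
widening = {"Area A": 0.6, "Area B": 0.3, "Area C": 0.1}   # mostly low-participation areas
selective = {"Area A": 0.1, "Area B": 0.2, "Area C": 0.7}  # mostly high-participation areas

print(round(intake_weighted_participation(widening, area_participation), 3))   # 0.315, ranks high
print(round(intake_weighted_participation(selective, area_participation), 3))  # 0.53, ranks low

On a measure like this the first university, drawing most of its intake from areas where few school leavers enter higher education, would sit near the top of the table, and the second near the bottom.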

It is noticeable that there are several Scottish universities clustered at the bottom, even though Scotland has a free tuition policy (not for the English, of course) that was supposed to guarantee fair access.

This ranking looks like an inversion of the ranking of UK universities according to average entry tariff, i.e. A-level grades, and a similar inversion of most global rankings based on research or reputation.

Cambridge and other Russell Group universities have been under increasing pressure to relax entry standards and indiscriminately recruit more low-income students and those from historically underrepresented groups. It seems that they are slowly giving way to the pressure, and that as academic standards erode they will be gradually eclipsed by the rising universities of East Asia.

Saturday, March 24, 2018

Ranking Arab Universities

The Middle East and North Africa (MENA) region has been slower than some others to jump on the rankings train but it seems to be making up for lost time. In addition to the standard world rankings there are now MENA (or Arab world or region) university rankings from Quacquarelli Symonds (QS), Times Higher Education (THE), US News (USN) and Webometrics.

Taking methodologies developed to rank elite western universities and applying them to regions with different traditions, resources and priorities is no easy task. For most Arab universities, research is of little significance and attaining international prominence is something that only a few places can reasonably hope for. But there is still a need to differentiate among those institutions that are focussed largely on teaching.

Alex Usher of HESA has spoken of the difficulty of using metrics based on research, expenditure, and student quality. I agree that institutional data is not very helpful here. However, measures of social influence such as those in the Webometrics and QS Arab rankings, and peer and employer surveys, used by USN and QS, might be useful in assessing the teaching quality, or at least the perceived quality, of these universities.

If rankings are to be of any use in the MENA region, then they will have to find ways of comparing selectivity, student quality and social impact. There is little point in forcing regional universities into the Procrustean bed of global indicators designed to make fine distinctions within the Russell Group or the Ivy League.

This is pretty much what THE have done with the 2018 edition of their Arab World Rankings, which are simply extracted from their world rankings published in 2017. These rankings are very research-orientated and include measures of income, doctoral degrees and internationalisation. They also give a disproportionate weighting to citations, supposedly a measure of research impact or research quality.

Here are the top five in the recent editions of the various Arab Region/MENA rankings.

THE
1.   King Abdulaziz University, Saudi Arabia
2.   Khalifa University, UAE
3.   Qatar University
4.   Jordan University of Science and Technology
5.   United Arab Emirates University (UAEU)

QS
1.    American University of Beirut, Lebanon
2.    King Fahd University of Petroleum and Minerals, Saudi Arabia
3.    King Saud University, Saudi Arabia
4.    King Abdulaziz University
5.    United Arab Emirates University

USN
1.    King Saud University
2.    King Abdulaziz University
3.    King Abdullah University of Science and Technology (KAUST), Saudi Arabia
4.    Cairo University, Egypt
5.    American University of Beirut

Webometrics
1.    King Saud University
2.    King Abdulaziz University
3.    King Abdullah University of Science and Technology
4.    Cairo University
5.    American University of Beirut

Webometrics and USN are identical for the first six places. It is only when we reach seventh place that they diverge: UAEU in Webometrics and Ain Shams, Egypt, in the USN rankings. Webometrics mainly measures web activity, with a substantial research output indicator, while USN is mainly about research with some weighting for reputation.

The list of top universities in QS, which uses Webometrics data as one indicator, is quite similar. QS does not rank specialised research institutions such as KAUST, in third place in the Webometrics and USN rankings, but otherwise it is not too different from the other two.

The THE rankings have a disproportionate weighting for research impact, supposedly measured by field- and year-normalised citations. Officially it is 30%, but in fact it is much higher because of the regional modification, which gives a big bonus to universities in countries with a low citation impact score.

For example, the citations score of King Abdulaziz University (KAU) amounts to nearly 60% of its total score. Other universities in THE's top twenty have citation scores higher, sometimes much higher, than their research scores.
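To see how a 30% nominal weighting can end up supplying well over half of a total score, here is a rough Python sketch of the arithmetic. The square-root country adjustment and the 50/50 blend reflect my reading of THE's published methodology; all the numbers are invented.

import math

def modified_citation_score(raw, country_mean, blend=0.5):
    """Blend the raw citation score with a country-adjusted version.
    Dividing by the square root of a low country average inflates the score."""
    adjusted = raw / math.sqrt(country_mean / 100.0)
    return blend * adjusted + (1 - blend) * raw

raw = 60.0           # hypothetical raw citation score on a 0-100 scale
country_mean = 25.0  # hypothetical low national average
print(modified_citation_score(raw, country_mean))  # 90.0: sqrt(0.25) = 0.5, so 0.5*120 + 0.5*60

# Citations then contribute 0.3 * 90 = 27 points. If the remaining indicators
# (70% weight) average only 25, they add 17.5 points, so citations supply
# 27 / 44.5, roughly 61% of the total, much as with KAU above.

The lower the national average, the bigger the boost, which is why universities in low-output countries can post citation scores far above their research scores.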

In effect, the THE Arab rankings are mostly about citations, very often in a limited range of disciplines. They can be easily, sometimes accidentally, gamed and can lead to perverse consequences, such as recruiting highly cited researchers or searching for citation-rich projects that have little relevance to the region or country.