Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Tuesday, March 10, 2020
University of California Riverside: Is it America's Fastest Rising University?
Not really.
It seems that one major function of rankings is to cover up the decline or stagnation of major western universities. An example is the University of California Riverside (UCR), a leading public institution.
Recently there was a tweet and a web page extolling UCR as "America's fastest rising university" on the strength of its ascent in four different rankings: Forbes 80 places in two years, US News 33 places in two years, Times Higher Education (THE) 83 places in two years and the Center for World University Rankings (CWUR) 41 places in one year.
This surprised me a bit because I was under the impression that the University of California system had been declining in the research-based rankings and that UCR had not done so well in the THE world rankings. So I had a quick look. In 2019-20 UCR was in the 251-300 band in the THE world rankings. Two years before that it was 198th and in 2010-11 it was 117th. I have little trust in THE but that is evidence of a serious decline by any standards.
But it seems that the tweeter was thinking about the THE/Wall Street Journal US Teaching Rankings. In 2019-20 Riverside was 189th there, in 2018-19 212th, in 2017-18 272nd, and in 2016-17 it was 364th.
That is definitely a substantial rise and it is rather impressive considering that it occurred while UCR was falling in THE's world rankings. Most of the rise occurred in the outcomes "pillar" and probably was a result of the introduction in 2018 of a new indicator that measured student debt.
UCR's rise had nothing to do with resources, environment or engagement. It is not possible to disentangle the components within THE's four pillars but it is a very plausible hypothesis that a good part of UCR's success is the result of a methodological change that introduced an element where this university was especially strong.
Another ranking where UCR does well is the Center for World University Rankings (CWUR), now published in the UAE. Last year UCR was 204th, in 2018-19 245th, and in 2017-18 218th.
The fall and rise of UCR over two years tracks the weighting given to research. When the weighting was 25%, UCR's position was 218th. When the research weighting was increased to 70%, UCR fell to 245th. When it was reduced to 50%, UCR rose to 204th. So here again UCR's rank depends on methodological tweaking: it does well when the weighting for research is reduced and less well when it is increased.
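To see how much a single weighting change can matter, here is a small illustrative sketch with made-up numbers. These are not CWUR's actual formula or UCR's actual scores; the point is simply that a university that is weaker on research and stronger elsewhere moves up and down a composite purely as the research weight changes, with no change in the underlying indicators.

```python
# Hypothetical illustration (not CWUR's real methodology): a composite score
# built from a research score and a combined non-research score, with the
# weights summing to 1. Both input scores are invented for this sketch.

def composite(research_score, other_score, research_weight):
    """Weighted composite of a research score and a non-research score."""
    return research_weight * research_score + (1 - research_weight) * other_score

# Assumed profile of a UCR-like university: weaker on research,
# stronger on the other indicators.
research, other = 55.0, 80.0

# The three research weightings mentioned above: 25%, then 70%, then 50%.
for w in (0.25, 0.70, 0.50):
    print(f"research weight {w:.0%}: composite = {composite(research, other, w):.1f}")
```

The composite falls as the research weight rises and recovers when it is cut back, mirroring the fall-and-rise pattern described above.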
I assume that the US News (USN) rankings refer to America's Best Colleges, where UCR does very well. It was 91st last year among national universities and 1st for social mobility.
The rise of UCR in these rankings is also the result of methodological changes. In 2019 USN began to shift away from academic excellence to social mobility, which basically means admitting and graduating larger numbers of recognised and protected groups. The acceptance rate criterion has been scrapped and metrics related to social mobility such as the graduation rates of low income students have been introduced. UCR has not necessarily improved: what has happened is that the ranking has put more weight on things that it is good at and less on those where it performs less well.
The Forbes ranking also refers to changes announced in 2017 that "better align this list with what FORBES values most: entrepreneurship, success, impact and the consumer experience." It is likely that these had a favourable impact on UCR's performance here as well.
When it comes to international research-based rankings over several years, the story is a different one of steady decline. Starting with the total publications indicator of the CWTS Leiden Ranking, UCR went from 270th in 2006-09 to 392nd in 2014-17. Much of this was due to the general decline of American universities, but even within the US group there was a decline from 88th to 93rd.
The decline is starker if we look at the most rigorous expression of quality, the proportion of publications among the top 1% most frequently cited. In the same period UCR fell from 12th to 130th worldwide and from 11th to 59th in the USA.
Turning to the Scimago Institution Rankings which include patents and altmetrics, Riverside fell from 151st in 2011 to 228th in 2019. Among US institutions it fell from 70th in 2011 to 85th in 2019.
That is the situation with regard to research-based rankings. Moving on to rankings that include things related to teaching, UCR fell from 271st in the QS world rankings in 2017 to 454th in 2020. In the Round University Rankings it fell from 197th in 2010 to 231st in 2016 and then stopped participating from 2017 onward.
It seems fairly clear that UCR has been declining in indicators relating to research, especially research of the highest quality. It also performs poorly in those rankings that combine teaching with research and internationalisation metrics. Exactly why must wait for another post, but I strongly suspect that the underlying reason is the declining ability of incoming students and the retreat from meritocracy in graduate school admissions and faculty appointments and promotions.
Friday, February 28, 2020
Polish Universities in International Rankings
My short article on Polish universities and international rankings has just been published by Forum Akademickie. The article in Polish can be accessed here. Translation and editing by Piotr Kieracinski. The full journal issue is here.
Here is the English version.
Richard Holmes
Polish Universities and International Rankings
A Brief History of International University Rankings
After a false start with the Asiaweek rankings of 1999-2000, international university rankings took off in 2003 with the Academic Ranking of World Universities (ARWU), published by Shanghai Jiao Tong University and then by the Shanghai Ranking Consultancy.
In 2004 two new rankings appeared: the Ranking Web of Universities, better known as Webometrics, which originally measured only web activity, and the Times Higher Education Supplement (THES) – Quacquarelli Symonds (QS) World University Rankings, which emphasised research and also included faculty resources and internationalisation indicators.
Since then the number of rankings, metrics and data points has increased, prompting ranking consultant Daniel Guhr to talk about “vast data lakes”. Rankings have become more complex and sophisticated and often use statistical techniques such as standardisation and field normalisation.
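As a rough illustration of those two techniques, here is a minimal sketch with invented numbers. The helper functions and the field averages are my own assumptions for the example, not any ranker's actual method: standardisation rescales an indicator to mean 0 and standard deviation 1, while field normalisation divides each paper's citations by the average for its field so that disciplines with different citation habits become comparable.

```python
# Minimal sketch of z-score standardisation and field normalisation,
# using made-up numbers; not the formula of any particular ranking.

from statistics import mean, pstdev

def standardise(scores):
    """Convert raw indicator scores to z-scores (mean 0, std dev 1)."""
    m, s = mean(scores), pstdev(scores)
    return [(x - m) / s for x in scores]

def field_normalise(papers, field_means):
    """Divide each paper's citation count by its field's average, so a
    paper cited exactly at the field average scores 1.0."""
    return [citations / field_means[field] for citations, field in papers]

raw = [10, 20, 30, 40]                     # raw scores on some indicator
papers = [(40, "medicine"), (8, "maths")]  # (citations, field)
averages = {"medicine": 40.0, "maths": 4.0}

print(standardise(raw))                   # mean-centred, unit-variance scores
print(field_normalise(papers, averages))  # the maths paper, though less cited, scores higher
```

The second print shows why field normalisation matters: the medicine paper with 40 citations is merely average for its field (score 1.0), while the maths paper with 8 citations is twice its field average (score 2.0).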
In addition to global rankings, specialist rankings of regions, subjects, and business schools have appeared. International rankings continue to have a bias towards research, but some try to find a way of capturing data that might be relevant to teaching and learning or to university third missions such as sustainability, gender equity and open access. They have also become significant in shaping national higher education policy and institutional strategy.
Although the media usually talk about the big four rankings, or sometimes the big three or big two, there are now many more. The IREG Inventory of International Rankings includes 17 global rankings in addition to regional and specialised rankings and various spin-offs. Since the publication of the inventory more global rankings have appeared, and no doubt there are more to come.
Media Perceptions of Rankings
It is unfortunate that the media and public perception of global rankings has little relation to reality. The Times Higher Education (THE) World University Rankings are by far the most prestigious but they have serious defects. They lack transparency, with eleven indicators bundled into three broad groups. They rely on subjective surveys and questionable data submitted by institutions. They are unbalanced, with a 30% weighting going to a citations indicator that can be influenced by a handful of papers in a multi-author international project and that has elevated a succession of little-known places to world research leadership. These include the University of Reykjavik, Babol Noshirvani University of Technology, Aswan University, and Anglia Ruskin University.
The problems of the THE world rankings are illustrated by the fate of a leading Polish university in recent editions. In the 2014-15 rankings the University of Warsaw was ranked 301-350 but in 2015-16 it fell to 501-600. This was entirely the result of a dramatic fall in the score for citations, and that in turn was entirely the result of a methodological change. In 2015 THE stopped counting citations to papers with over a thousand “authors”. This badly affected the University of Warsaw which, along with Warsaw University of Technology, had been contributing to the Large Hadron Collider project, a producer of many such papers. The University of Warsaw’s decline in the rankings had nothing to do with any defect of its own. It was simply the result of THE’s tweaking.
Although they receive little attention from the media, there are now several global rankings published by universities and research councils that include more universities, cover a broader range of indicators, and are technically as good as or better than the big four. These include the National Taiwan University Rankings, the University Ranking by Academic Performance published by Middle East Technical University, the Scimago Institution Rankings, and the CWTS Leiden Ranking.
Polish Universities in Global Rankings
Turning to the current position of Polish universities in international rankings, there is a great deal of variation. There are 14 in the THE rankings, with 4 in the top 1000, but 410 in the Webometrics rankings, of which 10 are in the top 1000. The ranking with the best representation of Polish universities is Scimago, with 54 in the top 1000.
Of the “big four” rankings -- THE, QS, Shanghai, US News -- the best for analysing the current standing of the International Visibility Project (IntVP) universities is the US News Best Global Universities (BGU). THE and QS are unbalanced, with too much emphasis on a single indicator, citations and the academic survey respectively. The Shanghai rankings include Nobel and Fields awards, some of which are several decades old. It should be noted that BGU is an entirely research-based ranking.
The list below indicates the world rank of Polish universities in the latest US News BGU:
University of Warsaw 286
Jagiellonian University 343
Warsaw University of Technology 611
AGH University of Science and Technology 635
Adam Mickiewicz University 799
University of Wroclaw 833
Medical University of Wroclaw 926
Wroclaw University of Science and Technology 961
Nicolaus Copernicus University 984
Medical University of Gdansk 995
Medical University of Warsaw 1033
University of Silesia 1082
University of Gdansk 1096
University of Lodz 1119
Gdansk University of Technology 1148
Poznan University of Technology 1148
Lodz University of Technology 1194
Medical University of Lodz 1203
Warsaw University of Life Sciences 1221
Pomeranian Medical University 1303
Poznan University of Medical Sciences 1312
Silesian University of Technology 1351
Krakow University of Technology 1363
University of Warmia and Mazury 1363
Medical University of Silesia 1399
Medical University of Lublin 1414
University of Rzeszow 1430
Czestochowa University of Technology 1445
Poznan University of Life Sciences 1457
Wroclaw University of Life and Environmental Sciences 1465
Agricultural University of Lublin: no overall rank; 214 for agriculture
Medical University of Bialystok: no overall rank; 680 for clinical medicine
One thing that emerges from this list is that the Polish university system suffers a serious handicap in this ranking and in others because of the existence of independent specialist universities of technology, business and medicine. Consolidation of small specialist institutions could bring about significant improvements, as has recently happened in France.
There is also some variation in the rank of the best-performing Polish universities. In most rankings the top-scoring Polish university is in the 300s or 400s. There are, however, some exceptions. The default indicator in the Leiden Ranking, total publications, has Jagiellonian University at 247, GreenMetric has Adam Mickiewicz University at 160, and the Moscow Three Missions University Rankings put the University of Warsaw at 113. On the other hand, no Polish university gets higher than the 601-800 band in the THE world rankings.
The various rankings have very different methodologies and indicators. THE, for example, includes income in three indicators. The QS rankings give a combined weighting of 50% to reputation surveys. Scimago counts patents, and the Center for World University Rankings (CWUR), now based in the Arab Gulf, the achievements of alumni. It would be a good idea to look carefully at the content and format of all the rankings before using them for evaluation or benchmarking.
Polish Universities: Strengths and Weaknesses
Poland has certain advantages with regard to international rankings. It has an excellent secondary school system, as shown by above-average performance in PISA and other international standardised tests. Current data indicate that it has adequate teaching resources, shown by statistics for staff-student ratio. It has cultural and economic links to the East, with the Anglosphere, and within the EU that are likely in the future to produce fruitful research partnerships and networks.
On the other hand, the evidence of current rankings is that Polish universities are relatively underfunded and that doctoral education is still relatively limited. Their global reputation for research is not very high, although their regional reputation is better.
One exception to the limited international visibility of Polish universities is a recent British film, The Last Passenger, in which a hijacked train is saved by a few heroes, one of whom has an engineering degree from Gdansk University of Technology.
Poland and the Rankings
It would be unwise for Poland, or indeed any country, to focus on a single ranking. Some rankings have changed their methodology and will probably continue to do so, and this might lead to unexpected rises or falls. THE has announced that there will be a new 3.0 version of the world rankings towards the end of this year.
Any university or university system wishing to engage with the rankings should be aware that they often favour certain types of institutions. The Shanghai rankings, for example, privilege medical research and totally ignore the arts and humanities. Scimago includes data about patents, which gives technological universities an advantage. THE’s current methodology gives a massive privilege to participants in multi-contributor projects. QS uses several channels to obtain respondents for its academic and employer surveys. One of these is a list of potential respondents provided by the universities, which are very likely to nominate those who will support them in the surveys.
Here are some guidelines for Polish universities as they seek to establish and extend their international presence.
First, improving international visibility will take time. Quick fixes such as recruiting highly cited adjunct faculty or taking part in high-profile projects may be counterproductive, especially if there is an unannounced change in methodology.
Second, before launching a campaign to rise in the global rankings, some universities might consider regional or specialist rankings first, such as the THE Europe Teaching Rankings, the QS graduate employability ranking, the Indonesian GreenMetric rankings, or business school rankings.
Third, universities should also consider the cost of taking part in the rankings. US News, QS and THE require universities to submit data, and this can be time-consuming, especially for THE, which asks for data in ten subjects. Many universities seem to need three or four staff dedicated to rankings.
Fourth, it would be wise to monitor all international rankings for data that can be used for internal evaluation or publicity.
Fifth, universities should match their profiles, strengths and weaknesses with the methodology of specific rankings. Universities with strengths in medical research might perform well in the Shanghai rankings.
Sixth, it is not a good idea to focus exclusively on any single ranking or to make any one ranking the standard of excellence.
Sunday, January 19, 2020
Boycotting the Shanghai Rankings?
John Fitzgerald of Swinburne University of Technology has written an article in the Journal of Political Risk arguing that Western universities should boycott the Shanghai rankings, that is, neither participate in nor refer to them, in order to show opposition to the current authoritarian trend in Chinese higher education.
There seems a bit of selective indignation at work here. China is hardly the only country in the world with authoritarian governments, ideologically docile universities, or crackdowns on dissidents. Nearly everywhere in Africa, most of the Middle East, Russia, much of Eastern Europe, and perhaps India would seem as guilty as China, if not more so.
American and other Western universities themselves are in danger of becoming one party institutions based on an obsessive hatred of Trump or Brexit, a pervasive cult of "diversity", political tests for admission, appointment and promotion, and periodic media or physical attacks on dissenters or those who associate with dissenters.
Perhaps academics should boycott the THE or other rankings to protest the treatment by Cambridge University of Noah Carl or Jordan Peterson?
One way of resisting the wave of repression, according to Professor Fitzgerald, is to "no longer reference the ARWU rankings or participate in the Shanghai Jiaotong rankings process which risks spreading the Chinese Communist Party's university model globally. Universities that continue to participate or reference the Shanghai rankings should be tasked by their faculty and alumni to explain why they are failing to uphold the principles of free inquiry and institutional autonomy as fiercely as Xi Jinping is undermining them."
It is hard to see what Fitzgerald means by not participating in the Shanghai rankings. The Academic Ranking of World Universities (ARWU) uses publicly available data from western sources, the Web of Science, Nature, Science, the Clarivate Analytics list of Highly Cited Researchers, and Nobel and Fields awards. Universities cannot avoid participating in them. They can denounce and condemn the rankings until their faces turn bright purple but they cannot opt out. They are ranked by ARWU whether they like it or not.
As for referencing, presumably citing the Shanghai rankings or celebrating university achievements there, Fitzgerald's proposals would seem self defeating. The rankings actually understate the achievements of leading Chinese universities. In the latest ARWU Tsinghua University and Peking University are ranked 43rd and 53rd. The QS World University Rankings puts them 16th and 22nd and the THE world rankings 23rd and 24th.
If anyone wanted to protest the rise of Chinese universities they should turn to the QS and THE rankings where they do well because of reputation, income (THE), and publications in high status journals. It is also possible to opt out of the THE rankings simply by not submitting data.
If oppressive policies did affect the quality of research produced by Chinese universities this would be more likely to show up in the Shanghai rankings through the departure of highly cited researchers or declining submissions to Nature or Science than in the THE or QS rankings where a decline would be obscured if reputation scores continued to hold steady.
Fitzgerald's proposals are pointless and self defeating and ascribe a greater influence to rankings than they actually have.