Friday, October 25, 2019

Using Webometrics to Rank University Systems

Recently there has been some interest in ranking higher education systems in addition to institutions or departments. See here and here. But both of these efforts, from Universitas 21 and QS, rank only 50 countries.

The Webometrics rankings attempt to cover every university in the world, along with anything that might conceivably claim to be a university, institute or college. The indicators comprise web activity and research output, so there is enough data here to create a simple and comprehensive ranking of countries. Below is a list of countries and territories ranked according to the world rank of their highest ranked university. If the Webometrics methodology remains unchanged it will be updated twice a year.
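The procedure behind the table is simple enough to sketch in a few lines of Python: take each university's world rank, keep the best (lowest) rank per country, and sort countries by it. The sample rows below are made up for illustration; the real list comes from the full Webometrics table.

```python
# Illustrative input: (university, country, Webometrics world rank).
# These rows are invented examples, not the actual Webometrics data.
universities = [
    ("Harvard University", "USA", 1),
    ("Stanford University", "USA", 2),
    ("University of Oxford", "UK", 7),
    ("University of Toronto", "Canada", 19),
]

# Keep the best (lowest) world rank seen for each country.
best = {}
for name, country, world_rank in universities:
    if country not in best or world_rank < best[country]:
        best[country] = world_rank

# Order countries by the world rank of their highest-ranked university.
ranking = sorted(best.items(), key=lambda kv: kv[1])
for position, (country, world_rank) in enumerate(ranking, start=1):
    print(position, country, world_rank)
```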

The table is not very surprising overall, but it is worth noting that the leading Asian countries are already in the top ten and that Brazil and Mexico are not too far behind. The performance of the Arab countries is not very impressive, even for those rich in oil.

It's a safe bet that the highest ranked Chinese university will rise steadily over the next few years followed by South Korea and Singapore, but probably not Hong Kong and Australia.






Rank | Country | World rank of highest-ranked university
1 | USA | 1
2 | UK | 7
3 | Canada | 19
4 | Switzerland | 32
5 | China | 33
6 | Hong Kong | 45
7 | Australia | 46
8 | Singapore | 50
9 | Netherlands | 63
10 | Japan | 69
11 | Brazil | 74
12 | Denmark | 76
13 | Belgium | 78
14 | Finland | 87
15 | Norway | 93
16 | Germany | 97
17 | Sweden | 106
18 | Taiwan | 111
19 | South Korea | 116
20 | Italy | 120
21 | Spain | 133
22 | Mexico | 141
23 | Austria | 150
24 | New Zealand | 153
25 | Israel | 157
26 | Czech Republic | 204
27 | Portugal | 208
28 | Greece | 224
29 | Russia | 226
30 | Argentina | 228
31 | Ireland | 230
32 | South Africa | 274
33 | France | 292
34 | Chile | 323
35 | Malaysia | 352
36 | Argentina | 372
37 | Poland | 388
38 | Saudi Arabia | 415
39 | Iran | 417
40 | Estonia | 440
41 | Serbia | 464
42 | India | 471
43 | Turkey | 475
44 | Thailand | 513
45 | Iceland | 533
46 | Hungary | 563
47 | Egypt | 602
48 | Colombia | 614
49 | Croatia | 619
50 | Luxembourg | 631
51 | Puerto Rico | 649
52 | Belarus | 684
53 | Cyprus | 700
54 | Macau | 720
55 | Slovakia | 732
56 | Lithuania | 750
57 | Indonesia | 771
58 | Costa Rica | 844
59 | Malta | 866
60 | Romania | 881
61 | Bulgaria | 934
62 | Jamaica | 953
63 | Qatar | 958
64 | Peru | 971
65 | Kenya | 987
66 | Vietnam | 1013
67 | Slovenia | 1103
68 | Latvia | 1106
69 | Uganda | 1129
70 | Jordan | 1149
71 | UAE | 1158
72 | Philippines | 1199
73 | Ghana | 1209
74 | Nigeria | 1233
75 | Pakistan | 1269
76 | Ethiopia | 1314
77 | Oman | 1346
78 | Georgia | 1423
79 | Morocco | 1515
80 | North Macedonia | 1569
81 | Venezuela | 1593
82 | Ecuador | 1638
83 | Palestine | 1646
84 | Bosnia | 1669
85 | Kazakhstan | 1793
86 | Trinidad | 1794
87 | Iraq | 1804
88 | Brunei | 1829
89 | Fiji | 1831
90 | Bangladesh | 1895
91 | Tanzania | 1913
92 | Ukraine | 1977
93 | Sri Lanka | 1981
94 | Zimbabwe | 2014
95 | Algeria | 2061
96 | Cuba | 2134
97 | Bahrain | 2161
98 | Kuwait | 2200
99 | Mozambique | 2280
100 | Paraguay | 2297
101 | Mauritius | 2422
102 | Guatemala | 2458
103 | Uruguay | 2499
104 | Botswana | 2583
105 | Grenada | 2583
106 | Armenia | 2643
107 | Liechtenstein | 2761
108 | Montenegro | 2878
109 | Guam | 2900
110 | Sudan | 2936
111 | Bolivia | 2960
112 | Mongolia | 2962
113 | Benin | 2980
114 | Malawi | 3001
115 | Zambia | 3001
116 | Senegal | 3008
117 | Moldova | 3151
118 | Tunisia | 3198
119 | Rwanda | 3220
120 | Nepal | 3243
121 | Namibia | 3316
122 | Panama | 3391
123 | Cameroon | 3527
124 | Barbados | 3538
125 | Azerbaijan | 3573
126 | US Virgin Islands | 3579
127 | Syria | 3593
128 | Burkina Faso | 3634
129 | Dominica | 3679
130 | Honduras | 3892
131 | Uzbekistan | 4017
132 | Libya | 4040
133 | Yemen | 4126
134 | Faroe Islands | 4368
135 | Madagascar | 4372
136 | Togo | 4392
137 | Eswatini | 4428
138 | Laos | 4431
139 | Nicaragua | 4458
140 | El Salvador | 4542
141 | Kyrgyzstan | 4554
142 | French Polynesia | 4640
143 | Albania | 4735
144 | Monaco | 4842
145 | Dominican Republic | 4903
146 | Cambodia | 5060
147 | San Marino | 5107
148 | Papua New Guinea | 5205
149 | Greenland | 5378
150 | Afghanistan | 5676
151 | Lesotho | 5872
152 | Antigua | 6040
153 | Guyana | 6149
154 | Ivory Coast | 6306
155 | Anguilla | 6374
156 | Suriname | 6641
157 | Democratic Republic of the Congo | 7033
158 | American Samoa | 7213
159 | Myanmar | 7221
160 | Belize | 7497
161 | Micronesia | 7962
162 | Haiti | 8082
163 | Angola | 8091
164 | Bhutan | 8159
165 | Niger | 8384
166 | Sierra Leone | 8560
167 | Somalia | 10154
168 | St Kitts & Nevis | 10527
169 | Cape Verde | 10685
170 | Andorra | 10772
171 | Gambia | 11020
172 | Seychelles | 11235
173 | South Sudan | 12329
174 | Cayman Islands | 13011
175 | Samoa | 13132
176 | Bermuda | 13431
177 | British Virgin Islands | 13694
178 | Maldives | 13864
179 | Palau | 13864
180 | St Lucia | 13981
181 | Tajikistan | 14180
182 | Djibouti | 14186
183 | Central African Republic | 14433
184 | Northern Marianas | 14444
185 | Marshall Islands | 15827
186 | Gabon | 16002
187 | Aruba | 16347
188 | Solomon Islands | 17867
189 | Montserrat | 18103
190 | East Timor | 18433
191 | Guinea | 18588
192 | French Guiana | 18703
193 | Liberia | 19463
194 | Isle of Man | 20029
195 | Mali | 20172
196 | Mauritania | 22144
197 | Equatorial Guinea | 23382
198 | Niue | 23892
199 | Eritrea | 24481
200 | Turks & Caicos Islands | 27918





Saturday, September 28, 2019

Rankings case shows need to reform citations indicator

My previous post has just been republished by University World News.

Comments here are welcome.

Thursday, September 26, 2019

Trinity College Dublin: Time to forget about THE?

Global rankings, especially THE's, have been very useful to British universities, at least to those sitting at the apex of the system. If a Russell Group university falls in the rankings then it is the fault of impending Brexit and/or the terrible austerity inflicted on the nation's research prowess. If it rises then this is cause for congratulation, but with a hint of foreboding: how can they keep advancing with Ebenezer Scrooge controlling the treasury and the bitter winds of Brexit howling at the door? The universities have reciprocated by not inquiring about how THE constructs its rankings, particularly the citations and industry income indicators.

Across the Irish Sea, universities have for the most part also been loyal to THE. Trinity College Dublin (TCD) has continued to submit data to THE and also to QS and has passively accepted the results of the rankings even when they show the institution going down and down. The steady decline is usually blamed on the meanness of the Irish government and its failure to provide sufficient funds.

I have dealt with Trinity's misfortunes here, here, here, and here. In 2015 TCD fell seven places in the QS world rankings and 22 in THE's. In contrast, it had been rising in the Shanghai ARWU rankings since 2004 and in the Round University Rankings (RUR) since 2010, although everybody pretended not to notice.

This year history repeats itself. TCD has fallen in the THE world rankings from 120th place to 164th. Again, this is supposedly the fault of the Irish state for not providing enough money.

But we get a very different picture when we look at the Shanghai Rankings. TCD has risen from 167th place to 154th, getting close to the 101-150 band. Leaving aside the Nobel and Fields Awards, Trinity has gained 6.6 points for highly cited researchers, 1.4 for publications, and 1.4 for productivity per capita. It has, however, fallen 0.6 for papers in Nature and Science.

Looking at RUR, TCD has risen from 75th to 57th for Research and 35th to 29th for international diversity. It has fallen slightly for financial sustainability from 191st to 197th and for Teaching from 275th to 335th, mainly because of a fall in the number of academic staff.

It seems perverse for TCD to keep lamenting its decline in the THE rankings when it can point to a steady rise in the Shanghai rankings, which are not perfect but are certainly more stable, consistent and realistic than THE's.

Does THE really want to be judged by rankings that apparently think that Anadolu University is best for Innovation, Luxembourg for International Orientation and Aswan for research impact measured by Citations?

But if TCD really insists on sticking with THE then I suggest that it recruit a few researchers taking part in the Global Burden of Disease Study, funded by the Bill and Melinda Gates Foundation.*

They should also think about amalgamating with the Royal College of Surgeons.

*assuming no methodological change




Tuesday, September 17, 2019

Seven Modest Suggestions for Times Higher Education


My post of the 24th of August has been republished in Arabia Higher Education







Going Up and Up Down Under: the Case of the University of Canberra

It is a fact almost universally ignored that when a university suddenly rises or falls many places in the global rankings the cause is not transformative leadership, inclusive excellence, team work, or strategic planning but nearly always a defect or a change in the rankers' methodology.

Let's take a look at the fortunes of the University of Canberra (UC), which the THE world rankings now place in the world's top 200 universities and Australia's top ten. This is a remarkable achievement since the university did not appear in these rankings until 2015-16, when it was placed in the 500-600 band with very modest scores of 18.4 for teaching, 19.3 for research, 29.8 for citations, which is supposed to measure research impact, 36.2 for industry income, and 54.6 for international outlook.

Just four years later the indicator scores are 25.2 for teaching, 31.1 for research, 99.2 for citations, 38.6 for industry income, and 86.9 for international outlook.

The increase in the overall score over four years, calculated with different weightings for the indicators, was composed of 20.8 points for citations and 6.3 for the other four indicators combined. Without those 20.8 points Canberra would be in the 601-800 band.
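The citations figure can be checked with a back-of-envelope calculation, using THE's published 30 per cent weighting for the citations indicator (the indicator scores are those quoted above; the weighting is THE's, not mine):

```python
# UC's citations indicator scores in THE's 2015-16 and 2019-20 world rankings.
citations_2015_16 = 29.8
citations_2019_20 = 99.2

# THE weights the citations indicator at 30% of the overall score.
citations_weight = 0.30

# Contribution of the citations rise to the overall score.
contribution = (citations_2019_20 - citations_2015_16) * citations_weight
print(round(contribution, 1))  # 20.8
```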

I will look at where that massive citation score came from in a moment. 

It seems that the Australian media is reporting on this superficially impressive performance with little or no scepticism and without noting how different it is from the other global rankings. 

The university has issued a statement quoting vice-chancellor Professor Deep Saini as saying that the "result confirms the steady strengthening of the quality at the University of Canberra, thanks to the outstanding work of our research, teaching and professional staff" and that the "increase in citation impact is indicative of the quality of research undertaken at the university, coupled with a rapid growth in influence and reach, and has positioned the university as amongst the best in the world."

The Canberra Times reports that the vice-chancellor has said that part of the improvement was the result of a talent acquisition campaign, while noting that many faculty were complaining about pressure and excessive workloads.

Leigh Sullivan, DVC for research and innovation, has a piece in the Campus Morning Mail that hints at reservations about UC's apparent success, which is "a direct result of its Research Foundation Plan (2013-2017)" and "a strong emphasis on providing strategic support for research excellence in a few select research areas where UC has strong capability." He notes that even when the citation scores of research stars are excluded there has still been a significant increase in citations, and warns that what goes up can come down and that performance can be affected by changes in the ranking methodology.

The website riotact quotes the vice-chancellor citing the citation score as evidence of improved research quality and calling for more funding for universities: the "government has to really think and look hard at how well we support our universities. That's not to say it badly supports us, it's that the university sector deserves to be on the radar of our government as a major national asset."

The impressive ascent of UC is unique to THE. No serious ranking puts it in the top 200 or anywhere near. In the current Shanghai Rankings it is in the 601-700 band and has been falling for the last two years. In Webometrics it is 730th in the world and 947th for Excellence, that is, publications among the 10% most cited in 25 disciplines. In University Ranking by Academic Performance it is 899th, and in the CWUR Rankings it doesn't even make the top 1,000.

Round University Ranking and Leiden Ranking do not rank UC at all.

Apart from THE, UC does best in the QS rankings, where it is 484th in the world and 26th in Australia.

So how could UC perform so brilliantly in THE rankings when nobody else has recognised that brilliance? What does THE know that nobody else does? Actually, it does not perform brilliantly in the THE rankings, just in the citations indicator which is supposed to measure research influence or research impact.

This year UC has a score of 99.2 which puts it in the top twenty for citations just behind Nova Southeastern University in Florida and Cankaya University in Turkey and ahead of Harvard, Princeton and Oxford. The top university this year is Aswan University in Egypt replacing Babol Noshirvani University of Technology in Iran. 

No, THE is not copying the interesting methodology of the Fortunate 500. This is the result of an absurd methodology that THE is unable or unwilling for some reason to change.

THE has a self-inflicted problem with a small number of papers that have hundreds or thousands of "authors" and collect thousands of citations. Some of these come from the CERN project, and THE has dealt with them by using a modified form of fractional counting for papers with more than a thousand authors. That has removed the privilege of institutions that contribute to CERN projects but replaced it with the privilege of those that contribute to the Global Burden of Disease Study (GBDS), whose papers tend to have hundreds but not thousands of contributors and sometimes receive over a thousand citations. As a result, places like Tokyo Metropolitan University, National Research University MEPhI and Royal Holloway London have been replaced as citation superstars by St George's London, Brighton and Sussex Medical School, and Oregon Health and Science University.

It would be a simple matter to apply fractional counting to all papers, dividing the number of citations by the number of authors. After all, Leiden Ranking and Nature Index manage to do it, but THE for some reason has chosen not to follow suit.
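The difference between full and fractional counting is easy to see with a toy example (the paper figures below are invented for illustration; this is a sketch of the general idea, not of THE's or Leiden's actual pipeline):

```python
# Two hypothetical papers: a GBDS-style paper with hundreds of "authors"
# and an ordinary four-author paper.
papers = [
    {"citations": 1000, "authors": 500},  # multi-author consortium paper
    {"citations": 40, "authors": 4},      # ordinary paper
]

# Full counting: every author's institution gets all the citations.
full_count = sum(p["citations"] for p in papers)

# Fractional counting: citations are divided by the number of authors,
# so the consortium paper contributes 1000/500 = 2 rather than 1000.
fractional = sum(p["citations"] / p["authors"] for p in papers)

print(full_count, fractional)  # 1040 12.0
```

Under full counting the consortium paper dominates completely; under fractional counting the ordinary paper contributes the larger share.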

The problem is compounded by counting self-citations, by hyper-normalisation so that the chances of hitting the jackpot with an unusually highly cited paper are increased, and by the country bonus that boosts the scores for universities by virtue of their location in low scoring countries. 

And so to UC's apparent success this year. This is entirely the result of its citation score, which in turn is entirely dependent on THE's methodology.

Between 2014 and 2018 UC had 3,825 articles in the Scopus database, of which 27 were linked to the GBDS, which is funded by the Bill and Melinda Gates Foundation. Those 27 articles, each with hundreds of contributors, have received 18,431 citations, all of which are credited to UC and its contributors. The total number of citations is 53,929, so those 27 articles accounted for over a third of UC's citations. Their impact might be even greater if they were cited disproportionately soon after publication.
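The "over a third" figure follows directly from the citation counts quoted above:

```python
# Citation counts for UC in Scopus, 2014-2018, as quoted above.
gbds_citations = 18_431   # citations to the 27 GBDS-linked articles
total_citations = 53_929  # all citations to UC's 3,825 articles

share = gbds_citations / total_citations
print(f"{share:.1%}")  # 34.2%
```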

UC has of course improved its citation performance even without those articles but it is clear that they have made an outsize contribution. UC is not alone here. Many universities in the top 100 for citations in the THE world rankings owe their status to the GBDS: Anglia Ruskin, Reykjavik, Aswan, Indian Institute of Technology Ropar, the University of Peradeniya, Desarrollo, Pontifical Javeriana and so on.

There is absolutely nothing wrong with the GBDS nor with UC encouraging researchers to take part. The problem lies with THE and its reluctance to repair an indicator that produces serious distortions and is an embarrassment to those universities who apparently look to the THE rankings to validate their status.

Monday, September 16, 2019

What should universities do about organised cheating?

Every so often the world of higher education is swept by a big panic about systemic and widespread cheating. The latest instance is concern about contract cheating or essay mills that provide bespoke essays or papers for students.

It seems that the Australian government will introduce legislation to penalise the supply or advertising of cheating services to students. There are already laws in several American states and there have been calls for the UK to follow suit.

There is perhaps a bit of hypocrisy here. If universities in Europe, Australia and North America admit more and more students who lack the cognitive or language skills to do the required work, and if they choose to use assessment methods that are vulnerable to deception and dishonesty, such as unsupervised essays and group projects, then cheating is close to inevitable.

On the supply side there appear to be large numbers of people around the world, without decent academic jobs or jobs of any sort, who are capable of producing academic work of a high standard, sometimes worth an A grade or a first. The Internet has made it possible for lazy or incompetent students to link up with competent writers.

The Daily Mail has reported that Kenya hosts a medium-sized industry with students and academics slaving away to churn out essays for British and American students. This is no doubt a hugely exploitative business, but consider the consequences of shutting down the essay mills. Many educated Kenyans are going to suffer financially. Many students will drop out, resort to other forms of cheating, or demand more support, counselling, and transitional or foundation programmes.

If universities are serious about the scourge of essay mills they need to work on both the supply and the demand side. They might start by inviting the essay writers in Kenya to apply for scholarships for undergraduate or postgraduate courses, or for posts in EAP departments.

On the demand side the solution seems simple. Stop admitting students because they show leadership ability, have overcome adversity, will make the department look like Britain, America or the world, or will help craft an interesting class, and admit them because they have demonstrated an ability to do the necessary work.


https://www.studyinternational.com/news/australia-essay-mills-contract-cheating-penalty-law/

Saturday, September 14, 2019

Are UK universities facing a terrible catastrophe?

A repeated theme of mainstream media reporting on university rankings (nearly always QS or THE) is that Brexit has inflicted, is inflicting, or surely will inflict great damage on British education and universities, because they will no longer get research grants from the European Union or be able to network with their continental peers.

The latest of these dire warnings can be found in a recent edition of the Guardian, which is the voice of the British progressive establishment. Marja Makarow claims that Swiss science was forced "into exile" after the 2014 referendum on immigration controls. Following this, Switzerland supposedly entered a period of isolation without access to Horizon 2020 or grants from the European Research Council and with a declining reputation and a loss of international collaboration and networks. This will happen to British research and universities if Brexit is allowed to happen.

But has Swiss research suffered? A quick tour of some relevant rankings suggests that it has not. The European Research Ranking which measures research funding and networking in Europe has two Swiss universities in the top ten. The Universitas 21 systems rankings put Switzerland in third place for output, up from sixth in 2013, and first for connectivity.

The Leiden Ranking shows that EPF Lausanne and ETH Zurich both fell for total publications between 2011-14 and 2014-17, but both rose for publications among the 10% most cited, a measure of research quality.

The Round University Rankings show that EPF and ETH have both improved for research since 2013 and both have improved their world research reputation.

So it looks as though Switzerland has not really suffered very much, if at all. Perhaps Brexit, if it ever happens, will turn out to be something less than the cataclysm that is feared, or hoped for.




Saturday, September 07, 2019

Finer and finer rankings prove anything you want

If you take a single metric from a single ranking and do a bit of slicing by country, region, subject, field and/or age there is a good chance that you can prove almost anything, for example that the University of the Philippines is a world beater for medical research. Here is another example from the Financial Times.

An article by John O'Hagan, Emeritus Professor at Trinity College Dublin, claims that German universities are doing well for research impact in the QS economics world rankings. Supposedly, "no German university appears in the top 50 economics departments in the world using the overall QS rankings. However, when just research impact is used, the picture changes dramatically, with three German universities, Bonn, Mannheim and Munich, in the top 50, all above Cambridge and Oxford on this ranking."

This is a response to Frederick Studemann's claim that German universities are about to move up the rankings. O'Hagan is saying that it is already happening.

I am not sure what this is about. I had a look at the most recent QS economics rankings and found that in fact Mannheim is in the top fifty overall for that subject. The QS subject rankings do not have a research impact indicator. They have academic reputation, citations per paper, and h-index, which might be considered proxies for research impact, but for none of these are the three universities in the top fifty. Two of the three universities are in the top fifty for academic research reputation, one for citations per paper and two for h-index.

So it seems that the article isn't referring to the QS economics subject ranking. Maybe it is the overall ranking that Professor O'Hagan is thinking of? There are no German universities in the overall top fifty there, and none in the top fifty for the citations per faculty indicator.

I will assume that the article is based on an actual ranking somewhere, maybe an earlier edition of the QS subject rankings or the THE world rankings or from one of the many spin-offs. 

But it seems a stretch to talk about German universities moving up the rankings just because they did well in one metric in one of the 40-plus international rankings in one year.