Tuesday, February 27, 2018

Are the rankings biased?

Louise Richardson, vice-chancellor of the University of Oxford, has published an article in the Financial Times proclaiming that British universities are a national asset and that their researchers deserve the same adulation as athletes and actors.

"Listening to the public discourse one could be forgiven for thinking that the British higher education system is a failure. It is not. It is the envy of the world."

That is an unfortunate phrase. It used to be asserted that the National Health Service was the envy of the world.

She cites as evidence for university excellence the Times Higher Education World University Rankings which have three British universities in the world's top ten and twelve in the top one hundred. These rankings also, although she does not mention it here, put Oxford in first place.

There are now, according to IREG, 21 global university rankings. One wonders why a world-class scholar and head of a world-class university would choose rankings that regularly produce absurdities such as Anglia Ruskin University ahead of Oxford for research impact and Babol Noshirvani University of Technology as its equal.

But perhaps it is not really surprising, since of those rankings THE is the only one to put Oxford in first place. In the others its rank ranges from third in the URAP rankings, published in Ankara, to seventh in the Shanghai Rankings (ARWU), Webometrics (WEB) and the Round University Ranking (RUR) from Russia.

That leads to the question of how far the rankings are biased in favor of universities in their own countries.

Below is a quick and simple comparison of how top universities perform in rankings published in the countries where they are located and in other rankings.

I have looked at the rank of the top scoring home country university in each of eleven global rankings and then at how well that university does in the other rankings. The table below gives the overall rank of each "national flagship" in the most recent eleven global university rankings. The rank in the home country ranking is marked with an asterisk.

We can see that Oxford does better in the Times Higher Education (THE) world rankings, where it is first, than in the others, where its rank ranges from 3rd to 7th. Similarly, Cambridge is the best performing UK university in the QS rankings, where it is 5th. It is 4th in the Center for World University Rankings (CWUR), now published in the UAE, and 3rd in ARWU. In the other rankings it does less well.

ARWU, the US News Best Global Universities (BGU), Scimago (SCI), Webometrics (WEB), URAP, the National Taiwan University Rankings (NTU), and RUR do not seem to be biased in favour of their country's flagship universities. For example, URAP ranks Middle East Technical University (METU) 532nd, which is worse than its rank in six other rankings and better than in three.

CWUR used to be published from Jeddah in Saudi Arabia but has now moved to the Emirates, so I count the whole Arabian peninsula as its home. The top home university is therefore King Saud University (KSU), which is ranked 560th, worse than in any other ranking except THE.

The GreenMetric Rankings, produced by Universitas Indonesia (UI), have that university in 23rd place, very much better than in any other ranking.

It looks like THE, GreenMetric and, to a lesser extent QS, are biased towards their top home country institutions.

This only refers to the best universities and we might get different results by looking at all the ranked universities.

There is a paper by Chris Claassen that does this although it covers fewer rankings.



University    | THE     | ARWU    | QS      | BGU | SCI  | WEB  | URAP | NTU     | RUR  | CWUR | GM
Oxford        | 1*      | 7       | 6       | 5   | 6    | 7    | 3    | 5       | 7    | 5    | 6
Tsinghua      | 35      | 48*     | 25      | 64  | 8    | 45   | 25   | 34      | 75   | 65   | NR
Cambridge     | 4       | 3       | 5*      | 7   | 16   | 11   | 9    | 12      | 9    | 4    | NR
Harvard       | 6       | 1       | 3       | 1*  | 1    | 1    | 1    | 1       | 1    | 1    | NR
Barcelona     | 201-250 | 201-300 | 156     | 81  | 151* | 138* | 46   | 64      | 212  | 103  | 180
METU          | 601-800 | 701-800 | 471-480 | 314 | 489  | 521  | 532* | 601-700 | 407  | 498  | NR
NTU           | 195     | 151-200 | 76      | 166 | 342  | 85   | 100  | 114*    | 107  | 52   | 92
Lomonosov MSU | 188     | 93      | 95      | 267 | 342  | 235  | 194  | 236     | 145* | 97   | NR
KSU           | 501-600 | 101-150 | 221     | 377 | NR   | 424  | 192  | 318     | 460  | 560* | NR
UI            | 600-800 | NR      | 277     | NR  | 632  | 888  | 1548 | NR      | NR   | NR   | 23*

* = rank in the ranking published in the university's home country (or region, in the case of CWUR).
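For readers who want to check the home-country effect themselves, here is a minimal sketch of the comparison (my own, not part of the original analysis). It contrasts a flagship's rank in its home ranking with its median rank everywhere else, using only the single-number ranks copied from the table; band ranks such as "201-250" and NR entries are omitted.

```python
# Home rank versus median rank elsewhere, using values from the table above.
from statistics import median

oxford = {"THE": 1, "ARWU": 7, "QS": 6, "BGU": 5, "SCI": 6, "WEB": 7,
          "URAP": 3, "NTU": 5, "RUR": 7, "CWUR": 5, "GM": 6}
cambridge = {"THE": 4, "ARWU": 3, "QS": 5, "BGU": 7, "SCI": 16, "WEB": 11,
             "URAP": 9, "NTU": 12, "RUR": 9, "CWUR": 4}

def home_vs_rest(ranks, home):
    """Return the home-country rank and the median rank in the other rankings."""
    rest = [r for name, r in ranks.items() if name != home]
    return ranks[home], median(rest)

print(home_vs_rest(oxford, "THE"))   # (1, 6.0) -- a clear home advantage
print(home_vs_rest(cambridge, "QS")) # (5, 9)   -- a smaller one
```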

Tuesday, February 20, 2018

Is Erdogan Destroying Turkish Universities?


An article by Andrew Wilks in The National claims that the position of Turkish universities in the Times Higher Education (THE) world rankings, especially that of Middle East Technical University (METU), has been declining as a result of the crackdown by President Erdogan following the unsuccessful coup of July 2016.

He claims that Turkish universities are now sliding down the international rankings because of the decline of academic freedom, the dismissal or emigration of many academics, and a decline in the country's academic reputation.


'Turkish universities were once seen as a benchmark of the country’s progress, steadily climbing international rankings to compete with the world’s elite.
But since the introduction of emergency powers following a failed coup against President Recep Tayyip Erdogan in July 2016, the government’s grip on academic freedom has tightened.
A slide in the nation's academic reputation is now indisputable. Three years ago, six Turkish institutions [actually five] were in the Times Higher Education’s global top 300. Ankara's Middle East Technical University was ranked 85th. Now, with Oxford and Cambridge leading the standings, no Turkish university sits in the top 300.
Experts say at least part of the reason is that since the coup attempt more than 5,800 academics have been dismissed from their jobs. Mr Erdogan has also increased his leeway in selecting university rectors.
Gulcin Ozkan, formerly of Middle East Technical University but now teaching economics at York University in Britain, said the wave of dismissals and arrests has "forced some of the best brains out of the country".'
I have no great regard for Erdogan but in this case he is entirely innocent.

There has been a massive decline in METU's position in the THE rankings since 2014 but that is entirely the fault of THE's methodology. 

In the world rankings of 2014-15, published in 2014, METU was 85th in the world, with a whopping score of 92.0 for citations, which carries an official weighting of 30%. That score was the result of METU's participation in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations. In 2014 THE counted every single contributor as receiving all of the citations. Added to this was a regional modification that boosted the scores of universities located in countries with a low citation impact score.

In 2015, THE revamped its methodology by not counting the citations to these mega-papers and by applying the regional modification to only half of the research impact score.

As a result, in the 2015-16 rankings METU crashed to the 501-600 band, with a score for citations of only 28.8. Other Turkish universities had also been involved in the LHC project and benefited from the citations bonus and they too plummeted. There was now only one Turkish university in the THE top 300.

The exalted position of METU in the THE 2014-15 rankings was the result of THE's odd methodology and its spectacular tumble was the result of changing that methodology. In other popular rankings METU seems to be slipping a bit, but it never goes as high as in THE in 2014 or as low as in THE in 2015.

In the QS world rankings for 2014-15 METU was in the 401-410 band; by 2017-18 it had fallen to 471-480.

The Russian Round University Rankings have it at 375th in 2014 and 407th in 2017. The US News Best Global Universities placed it 314th last year.

Erdogan had nothing to do with it.

Friday, February 16, 2018

It's happened: China overtakes USA in scientific research

Last November I noted that the USA was barely managing to hold onto its lead over China in scientific research as measured by articles in the Scopus database. At the time, there were 346,425 articles with a Chinese affiliation and 352,275 with a US affiliation for 2017.

As of today, there are 395,597 Chinese and 406,200 US articles dated 2017.

For 2018 so far, the numbers are 53,941 Chinese and 49,428 US.

There are other document types listed in Scopus, and the situation may change over the course of the year.

Also, the United States still has a much smaller population than China, so it maintains its lead in per capita research production. For the moment.
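As a rough illustration of the per capita point, the sketch below divides the 2017 article counts above by approximate populations. The population figures are my own assumptions, not from Scopus.

```python
# Back-of-the-envelope per capita comparison. Article counts are from the post;
# populations (in millions) are approximate 2017 figures.
articles_2017 = {"China": 395_597, "USA": 406_200}
population_m = {"China": 1390, "USA": 325}

for country, articles in articles_2017.items():
    print(f"{country}: {articles / population_m[country]:.0f} articles per million people")
# China: ~285 per million; USA: ~1250 per million.
```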

Saturday, February 10, 2018

Influence of Rankings on State Policy: India

In case you are wondering why the Indian media get so excited about the THE and QS rankings and not about those that are just as good or better, such as the Leiden Ranking, RUR or the Nature Index, see this from the University Grants Commission.

Note that it says "any time" and that only the Big Three rankings count for getting Assistant Professor jobs.


"NEW DELHI:  University Grants Commission (UGC) has come up with, UGC Regulations 2018, which exempts PhD candidates from having NET qualification for direct recruitment to Assistant Professor post. This new draft regulation is known as Minimum Qualifications for Appointment of Teachers and Other Academic Staff in Universities and Colleges and Measures for the Maintenance of Standards in Higher Education. Further the Commission has also listed 'Ph.D degree from a university/ institution with a ranking in top 500 in the World University ranking (at any time) by Quacquarelli Symonds (QS), the Times Higher Education (THE) and Academic Ranking of World Universities (ARWU) of the Shanghai Jiao Tong University (Shanghai),' as one of the criteria for Assistant Professor appointment."


Thursday, February 08, 2018

Playing the Rankings Game in Pakistan

This article by Pervez Hoodbhoy from October 2016 is worth reading:

"A recently released report by Thomson-Reuters, a Canada based multinational media firm, says, “In the last decade, Pakistan’s scientific research productivity has increased by more than 4 times, from approximately 2000 articles per year in 2006 to more than 9000 articles in 2015. During this time, the number of Highly Cited Papers (HCPs) featuring Pakistan based authors increased tenfold from 9 articles in 2006 to 98 in 2015.”
This puts Pakistan well ahead of Brazil, Russia, India, and China in terms of HCPs. As the reader surely knows, every citation is an acknowledgement by other researchers of important research or useful new findings. The more citations a researcher earns, the more impact he/she is supposed to have had upon that field. Research evaluations, through multiple pathways, count for 50-70 percent of a university’s ranking (if not more).
If Thomson-Reuters has it right, then Pakistanis should be overjoyed. India has been beaten hollow. Better still, two of the world’s supposedly most advanced countries–Russia and China–are way behind. This steroid propelled growth means Pakistan will overtake America in just a decade or two.
But just a little analysis shows something is amiss. Surely a four-fold increase in scientific productivity must have some obvious manifestations. Does one see science laboratories in Pakistani universities four times busier? Are there four times as many seminars presenting new results? Does one hear animated discussions on scientific topics four times more frequently?
Nothing’s visible. Academic activity on Pakistani campuses might be unchanged or perhaps even less today, but is certainly not higher than ten years ago. So where–and why–are the authors of the HCP’s hiding? Could it be that these hugely prolific researchers are too bashful to present their results in departmental seminars or public lectures? The answer is not too difficult to guess."




Should Pakistan Celebrate the Latest THE Asian Rankings?


This is an updated and revised version of a post from a few days ago.


There appears to be no end to the craze for university rankings. The media in many parts of the world show almost as much interest in global university rankings as in the Olympics or the World Cup. Rankings are now used to set requirements for immigration, to choose research collaborators, external examiners and international partners, and for marketing, public relations and recruitment.

Pakistan has not escaped the craze although it was perhaps a bit slower than some other places. Recently, we have seen headlines announcing that ten Pakistani universities are included in the latest Times Higher Education (THE) Asian rankings and highlighting the achievement of Quaid-i-Azam University (QAU) in Islamabad reaching the top 100.

Rankings are unavoidable and sometimes they have beneficial results. The first publication of the research-based Shanghai rankings in 2003, for example, was a salutary shock to continental European universities and a clear demonstration of how far China had to go to catch up with the West in the natural sciences. But rankings do need to be treated with caution especially when ranking metrics are badly and obviously flawed.

THE note that there are now ten Pakistani universities in the Asian rankings and one, QAU, in 79th place, which would appear to be evidence of academic progress.

Unfortunately, Pakistani universities, especially QAU, do very much better in the THE rankings than in others. QAU is in the 401-500 band in the THE world rankings, which use the same indicators as the Asian rankings. But in the QS World University Rankings it is in the 650-700 band. It does not even appear among the 800 universities ranked in the Shanghai rankings, the 903 in the Leiden Ranking, or the 763 in the Russian Round University Rankings. In the University Ranking by Academic Performance, published in Ankara, it is 605th; in the Center for World University Rankings list, 870th.

How can we explain QAU’s success in the THE world and Asian rankings, a success so much greater than in any other ranking? It is in large part the result of a flawed methodology.

Take a look at the scores that QAU got in the THE rankings. In all cases the top scoring university gets 100.

For Teaching, which combines five indicators, it was 25.7, which is not very good. For International Outlook it was 42.1. Since QAU has very few international staff or students, this mediocre score is very probably the result of a high score for international collaboration.

For research income from industry it was 31.8. This is probably an estimate since exactly the same score is given for four other Pakistani universities.

Now we come to something very odd. QAU’s research score was 1.3. It was the lowest of the 350 universities in the Asian rankings, very much lower than the next worst, Ibaraki University in Japan with 6.6. The research score is composed of research reputation, publications per faculty and research income per faculty. This probably means that QAU’s score for research reputation was zero or close to zero.

In contrast, QAU’s score of 81.2 for research impact measured by citations is among the best in Asia. Indeed, in this respect it would appear to be truly world class with a better score than Monash University, the Chinese University of Hong Kong, the University of Bologna or the University of Nottingham.

How is it possible that QAU could be 7th in Asia for research impact but 350th for research?

The answer is that THE’s research impact indicator is extremely misleading. It does not simply count citations: it normalises them across more than 300 fields, five years of publication and up to six years of citations. This means that a few highly cited papers in a strategic discipline at a strategic time can have a disproportionate effect on the impact score, especially if the total number of papers is low.
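A toy illustration, not THE's actual algorithm, of how a single mega-paper can dominate a mean-based citation measure when the total number of papers is low; all the figures are invented.

```python
# Mean citations per paper: a small portfolio with one LHC-style mega-paper
# versus a large portfolio of solidly cited papers.
def mean_citations(citation_counts):
    return sum(citation_counts) / len(citation_counts)

small_university = [2, 5, 1, 3, 4094]   # four ordinary papers + one mega-paper
large_university = [10] * 1000          # a thousand papers, ten citations each

print(mean_citations(small_university))  # 821.0 -- driven almost entirely by one paper
print(mean_citations(large_university))  # 10.0
```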

Added to this is THE’s regional modification, which means that the citation impact score of a university is divided by the square root of the score of the whole country in which the university is located. The scores of universities in the top scoring country remain the same while those of all the others go up; the lower the country's score, the bigger the increase. The effect is to give a big boost to countries like Pakistan. THE used to apply this bonus to all of the citations indicator but now applies it to only 50%.
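Here is a minimal sketch of that arithmetic as described above; the country scores are invented for illustration.

```python
# Regional modification as described in the text: divide a university's
# citation impact by the square root of its country's score.
from math import sqrt

def adjusted(university_score, country_score):
    return university_score / sqrt(country_score)

print(adjusted(0.40, 1.00))  # 0.40 -- top-scoring country: no change
print(adjusted(0.40, 0.25))  # 0.80 -- low-scoring country: the score doubles

# Applying the bonus to only 50% of the indicator, as THE now does:
print(0.5 * 0.40 + 0.5 * adjusted(0.40, 0.25))  # 0.60
```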

Then we have to consider how THE deals with mega-papers, mainly in physics and medicine, those with hundreds or even thousands of authors and hundreds or thousands of citations.

Until the world rankings of 2015-16, THE treated every single author of such papers as though he or she were the sole author. Then it stopped counting citations to these papers altogether, and in 2016-17 it reintroduced them with each contributing institution credited with a minimum of 5% of the citations.

The effect of the citations metric has been to make a mockery of the THE Asian and world rankings. A succession of unlikely places has been propelled to the top of the indicator because of contributions to mega-papers or because of a few or even a single prolific author combined with a low overall number of papers. We have seen Alexandria University, Anglia Ruskin University, Moscow State Engineering Physics Institute, Tokyo Metropolitan University rise to the top of this indicator. In last year’s Asian rankings, Veltech University in India appeared to be first for research impact.

QAU has been involved in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations, and has provided authors for several papers. One 2012 paper derived from this project received 4,094 citations, so that QAU would be credited with 205 citations just for this paper.
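A minimal sketch of the minimum-credit rule mentioned above, which reproduces the figure of roughly 205 citations. The number of contributing institutions is hypothetical, and THE's exact formula may differ.

```python
# Hypothetical reconstruction: each contributing institution receives its
# proportional share of a mega-paper's citations, floored at five per cent.
def credited(citations, n_institutions, floor=0.05):
    return citations * max(1 / n_institutions, floor)

print(credited(4094, 500))  # 204.7 -- the 5% floor applies, i.e. about 205
print(credited(100, 4))     # 25.0  -- an ordinary proportional share
```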

In addition to this, QAU employs an extremely productive mathematician, Tasawar Hayat, who is among the world’s elite of researchers in Clarivate Analytics' list of Highly Cited Researchers, where his primary affiliation is King Abdulaziz University in Saudi Arabia and QAU is his secondary affiliation. Professor Hayat is extremely prolific: in 2017 alone, he was author or co-author of 384 scientific documents (articles, reviews, notes and so on).

There is nothing wrong with QAU taking part in the LHC project, and I am not able to comment on the quality of Professor Hayat's research. It should, however, be understood that if Professor Hayat left QAU, or QAU withdrew from the LHC project, or THE changed its methodology, then QAU could suffer a dramatic fall in the rankings similar to those suffered by some Japanese, Turkish and Korean universities in recent years. This is an achievement built on desperately weak foundations.

It would be very unwise to use these rankings as evidence for the excellence of QAU or any other university.

Tuesday, February 06, 2018

Rising Stars of Asian research

Times Higher Education (THE) has just announced the latest edition of its Asian rankings. Since the indicators are the same as the world rankings with adjusted weightings there was absolutely no suspense about who would be top. In case anybody still doesn't know it was the National University of Singapore.

The really interesting part of the rankings is the citations indicator, field- and year-normalised, based on Scopus, with fractional counting only for papers with more than 1,000 authors.

Here are some of the superstars of Asian research. On the left is the citations rank and the score for citations. On the right in brackets is the score for Research, comprising research reputation, publications per faculty and research income. To achieve a score in the seventies, eighties or nineties for citations with minimal research reputation, very few publications and limited funding is remarkable.

1st. 99.1. Babol Noshirvani University of Technology (15.3)
2nd. 92.0. King Abdulaziz University (92.3)
3rd. 93.1. Ulsan National Institute of Science and Technology (37.8)
7th. 81.2. Quaid-i-Azam University (1.3)
13th. 74.5. Fujita Health University (9.4)
16th. 72.5. Central China Normal University (11.3)






Free speech rankings from Spiked

The magazine Spiked is descended from Living Marxism although some think it is now more libertarian than socialist. It has just published the latest edition of its free speech university rankings.

These are not actually rankings but a classification or rating, since they just divide UK universities into three groups. They have been subjected to mockery, some of it perhaps justified on technical grounds, from sections of the academic blogosphere, including WONKHE. This is, however, such an important topic that any sort of publicity has to be welcomed.

Universities are divided into three categories: 

RED: "A students’ union, university or institution that is hostile to free speech and free expression, mandating explicit restrictions on speech, including, but not limited to, bans on specific ideologies, political affiliations, beliefs, books, speakers or words."

AMBER: "A students’ union, university or institution that chills free speech and free expression through restricting vague and subjective types of speech, such as ‘offensive’ or ‘insulting’ speech, or requiring burdensome vetting procedures for events, speakers, posters or publications. Many policies in this category might not explicitly limit speech, but have the potential to be used to that end, due to purposefully vague or careless wording."

GREEN: "A students’ union, university or institution that, as far as we are aware, places no significant restrictions on free speech and expression – other than where such speech or expression is unlawful."

The roll of honour in the green category includes exactly seven universities, none of them in the Russell Group: Anglia Ruskin, Buckingham, Hertfordshire, Robert Gordon, Trinity St David, West of Scotland, and Winchester.


Interesting data from Webometrics

The Webometrics rankings perform the invaluable function of ranking 27,000-plus universities, or entities claiming to be universities, around the world. In addition, their Excellence indicator identifies those institutions, 5,776 this year, with any claim to involvement in research.

Consequently, it has often been used in unofficial national rankings in countries, especially in Africa, where very few places can make it into the top 500 or 1,000 universities included in the better known international rankings.

However, there seems to be a universal law that when a ranking becomes significant it will have unintended and perverse consequences. In the UK we have seen massive inflation in the number of first and upper second class degrees, partly because this is an element in popular national rankings. Sophisticated campaigns can also produce significant gains in the QS academic opinion survey, which has a 40% weighting, and a few hundred strategic citations can boost the most unlikely universities in the research impact indicator of THE world and regional rankings.

Webometrics also has an indicator that seems to be susceptible to bad practices. This is "Presence", the number of pages in the main web domain, including subdomains and file types such as rich files, which has a 5% weighting. Apparently this can be easily manipulated. Unlike other rankings, Webometrics does not attempt to ignore this but has highlighted it in several recent tweets, which is helpful since it indicates who might be manipulating the variable. It is possible that there has been a misunderstanding of the Webometrics guidelines, an error somewhere, or perhaps some totally valid and innocent explanation. If the latter is the case I will be happy to publish a statement.

Here is a selection of universities with their world rank in the Webometrics Presence indicator. The overall rank is in brackets.

4.  University of Nairobi, Kenya (874)

5.  Masaryk University in Brno, Czechia (433)

9.  Federal University of Santa Catarina, Brazil (439)

15.  Charles University in Prague (203)

17.  University of Costa Rica (885)

20.  University of the West Indies St Augustine (1792)

32.  National University of Honduras (3777)

40.  Mahidol University, Thailand (548)

55.  Universitas Muhammadiyah Surakarta, Indonesia  (6394)




Wednesday, January 24, 2018

Fake Rankings from Nigeria?

Although the Webometrics rankings, based mainly on web activities, receive little attention from the good and the great among the world's university administrators, they do serve the important function of providing some sort of assessment of over 20,000 universities or entities that claim to be universities. They get to places where the market leaders, Shanghai Ranking, THE and QS, cannot go.

As a result, the media in several African countries have from time to time published local rankings based on Webometrics that do not appear all that different from what would be expected from a ranking based on research or reputation.

For example, the current top five Nigerian universities in Webometrics are:

1. University of Ibadan
2. Covenant University
3. Obafemi Awolowo University
4. University of Nigeria
5. University of Lagos.

The Nigerian press have in the last few years announced the results of rankings supposedly produced by the country's national university commission. In 2016 Nigerian Scholars reported that the NUC had produced a ranking with the top five being:

1. University of Ibadan
2. University of Lagos
3. University of Benin
4. Obafemi Awolowo University
5. Ahmadu Bello University.

Now we have this, published in The Nation. Professor Adamu Abubakar Abdulrasheed, Executive Secretary of the NUC, has announced that the rankings attributed to the NUC were fake and that the commission had not published any rankings for several years.

This is a bit strange. Does it mean that nobody on the commission noticed that fake rankings were being published in its name until now? There may be more to the story.

For the moment, it looks as though Nigeria and other countries in Africa may have to continue relying on Webometrics.





Saturday, January 20, 2018

What use is a big endowment?

Quite a lot. But not as much as you might expect.

The website THEBESTSCHOOLS has just published a list of the world's 100 wealthiest universities, as measured by the value of their endowments. As expected, it is dominated by US institutions, with Harvard in first place. There are also three universities from Canada and two each from the UK, Australia, Japan, Singapore and Saudi Arabia.

There are of course other elements in university funding, but it is worth looking at how this ranking compares with others. The top five are familiar to any rankings observer: Harvard, with an endowment of US$34.5 billion, followed by Yale, the University of Texas system, Stanford and Princeton. Then there is a surprise: King Abdullah University of Science and Technology in Saudi Arabia in sixth place with an endowment of $20 billion.

Some of the wealthy universities also do well in other rankings. Stanford, in fourth place here, is second in the overall Shanghai rankings and seventh for publications, and fifth in the  Leiden Ranking default publications indicator. It does even better in the QS employer survey indicator, where it is ranked second.

There are, however, several places that are very wealthy but just don't get anywhere in the global rankings. Williams College, the University of Richmond, Pomona College, Wellesley College, Smith College, and Grinnell College are not even given a value in the QS employment indicator, or the Leiden or Shanghai publication indicators. They may of course do well in some other respects: the University of Richmond is reported by the Princeton Review to be second in the US for internships.

On the other hand, some less affluent universities do surprisingly well; some California schools seem to be among the best performers. Caltech is 47th here but 9th in the Shanghai rankings, where it has always been first in the productivity per capita indicator. Berkeley is 65th here and fifth in Shanghai. The University of California San Francisco, a medical school, is 90th here and 21st in Shanghai.

Overall there is an association between endowment value and research output or reputation among employers that is definitely positive but rather modest. The correlation between endowment and the Shanghai publication score is 0.38, between endowment and the number of publications 2012-15 (in the Leiden Ranking) 0.46, and between endowment and the QS employer survey score 0.40. The relationship would certainly be stronger if we corrected for restriction of range.
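For anyone who wants to reproduce this kind of figure, here is a minimal sketch; the endowment and publication numbers are invented placeholders, not the dataset used above.

```python
# Pearson correlation between endowment and publication output.
# All figures below are invented for illustration.
import numpy as np

endowment_bn = np.array([34.5, 25.4, 22.0, 18.3, 2.1, 1.5])      # US$ billions
publications = np.array([12000, 9500, 11000, 7000, 4000, 5200])  # paper counts

r = np.corrcoef(endowment_bn, publications)[0, 1]
print(f"Pearson r = {r:.2f}")
```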

Having a lot of money helps a university produce research and build up a reputation for excellence but it is certainly not the only factor involved.

Here is the top ten in a ranking of the 100 universities by papers (Leiden Ranking) per billion dollars of endowment; a computational sketch follows the list.

1. University of Toronto
2. University of British Columbia
3. McGill University
4. University of California San Francisco
5. University of Melbourne
6. Rutgers University
7. UCLA
8. University of Florida
9. University of California Berkeley
10. University of Sydney.
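
Here is a minimal sketch of how such a value-for-money table can be computed, with invented figures for two universities.

```python
# Papers per billion dollars of endowment; both sets of numbers are invented.
universities = {
    "University of Toronto": {"papers": 15000, "endowment_bn": 1.8},
    "Harvard": {"papers": 14000, "endowment_bn": 34.5},
}

ranked = sorted(universities.items(),
                key=lambda kv: kv[1]["papers"] / kv[1]["endowment_bn"],
                reverse=True)
for name, d in ranked:
    print(f"{name}: {d['papers'] / d['endowment_bn']:.0f} papers per $bn")
# Toronto: ~8333; Harvard: ~406 -- hence the value-for-money gap.
```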

When it comes to research value for money it looks as though Australian and Canadian universities and US state institutions are doing rather better than the Ivy League or Oxbridge.

Ranking News: Chinese Think Tank Ranking

From the China Daily

The Global Think Tank Research Center affiliated with Zhejiang University of Technology has released a ranking of domestic university think tanks.

The first three places go to the National Academy of Development and Strategy at Renmin University of China, the National School of Development at Peking University, and the National Conditions Institute at Tsinghua University.

Wednesday, January 17, 2018

Ranking News: US State K-12 Rankings

Education Week has produced a ranking of states according to three criteria: Chance for Success, School Finance and K-12 Achievement. Overall, the top state is Massachusetts, which is also first for Chance for Success and K-12 Achievement. Pennsylvania is top for School Finance. Overall the worst performing state is Nevada, while New Mexico is worst for Chance for Success, Idaho for School Finance, and Mississippi for K-12 Achievement.

California is an interesting case. Overall it is below average and gets a grade of C-. For K-12 its grade is D+. The state has some of the best universities in the world. Typically three or four of them will be found in the top ten of any global ranking. So why is the performance of primary and secondary schools so poor? Could it be that Education Week has identified the future of California's tertiary sector?






Thursday, January 11, 2018

Ranking News: US News online program rankings

U.S. News Releases 2018 Best Online Programs Rankings

Ranking news: Jordan cancels classification of universities

The Higher Education Accreditation Commission of Jordan has cancelled its proposed classification of universities. Apparently, academics were opposed because it was based on international rankings and ignored "the reality of the universities and the damage to their reputation".


Source

Jordan Times

Friday, December 29, 2017

Getting ready for next year's university rankings


More on Japan and the Rankings

The Japan Times recently published an article by Takamitsu Sawa, President and Distinguished Professor at Shiga University, discussing the apparent decline of Japan's universities in the global rankings.

He notes that in 2014 there were five Japanese universities in the top 200 of the Times Higher Education (THE) world rankings but only two in 2016. He attributes Japan's poor performance to the bias of the citations indicator towards English language publications and the inability or reluctance of Japanese academics to write in English. Professor Sawa seems to be under the impression that THE does not count research papers that are not written in English, which is incorrect. It is, however, true that the failure of Japanese scholars to write in English prevents their universities from doing better in the rankings. He also blames lack of funding from the government and the Euro-American bias of the THE reputation survey.

The most noticeable thing about this article is that the author talks about exactly one table, the THE World University Rankings. This is unfortunately very common, especially among Asian academics. There are now over a dozen global rankings of varying quality and some of them tell a different, and perhaps more accurate, story than THE's. For example, there are several well known international rankings in which there are more Japanese universities in the world top 200 than there are in THE's.

There are currently two in the THE top 200 but seven in the Shanghai Academic Ranking of World Universities (ARWU), ten in the QS World University Rankings, ten in the Russian Round University Rankings, seven in the CWTS Leiden Ranking total publications indicator and ten in the Nature Index.

Let's now take a look at the University of Tokyo (Todai), the country's best known university, and its position in these rankings. Currently it is 46th in the world in THE, but in ARWU it is 23rd, in QS 28th, in the Leiden Ranking tenth for publications, and tenth in the Nature Index. RUR puts the university in 43rd place, still a little better than THE. It is very odd that Professor Sawa should focus on the ranking that puts Japanese universities in the worst possible light and ignore the others.

As noted in an earlier post, Tokyo's tumble in the THE rankings came suddenly in 2015 when THE made some drastic changes in its methodology, including switching to Scopus as data supplier, excluding papers with large numbers of authors such as those derived from the CERN projects, and applying a country adjustment to half instead of all the citations indicator. Then in 2016 THE made further changes for its Asian rankings that further lowered the scores of Japanese universities.

It is true that the scores of leading Japanese universities in most rankings have drifted downwards over the last few years, but this is a relative trend caused mainly by the rise of a few Chinese and Korean universities. Japan's weakest point, as indicated by the RUR and THE rankings, is internationalisation. These rankings show that the major Japanese universities still have strong reputations for postgraduate teaching and research, while the Nature Index and the Leiden Ranking point to an excellent performance in research in the natural sciences at the highest levels.

Nobody should rely on a single ranking and changes caused mainly by methodological tweaking should be taken with a large bucket of salt.




Tuesday, December 19, 2017

Rankings Calendar

The US News Online Program Rankings will be published on January 9th, 2018


Saturday, December 16, 2017

Rankings in Hong Kong

My previous post on the City University of Hong Kong has been republished in the Hong Kong Standard.

So far I can find no reference to anyone asking about the City University of Hong Kong's submission of student data to THE or data about faculty numbers for any Hong Kong university.

I also noticed that the Hong Kong University of Science and Technology is not on the list of 500 universities in the QS Employability Rankings although it is 12th in the one published in THE. Is there a dot here? 



Measuring graduate employability: two rankings

Global university rankings are now well into their second decade. Since 2003, when the first Shanghai rankings appeared, there has been a steady growth of global and regional rankings. At the moment most global rankings are of two kinds: those that focus entirely or almost entirely on research, and those, such as the Russian Round University Rankings, Times Higher Education (THE) and Quacquarelli Symonds (QS), that claim also to measure teaching, learning or graduate quality in some way, although even these are biased towards research when you scratch the surface a little.

The ranking industry has become adept at measuring research productivity and quality in various ways. But the assessment of undergraduate teaching and learning is another matter.

Several ranking organisations use faculty-student ratio as a proxy for quality of teaching, which in turn is assumed to have some connection with something that happens to students during their programmes. THE also count institutional income, research income and income from industry, again assuming that there is a significant association with academic excellence. Indicators like these are usually based on data supplied by institutions. For examples of problems here see an article by Alex Usher and a reply by Phil Baty.

An attempt to get at student quality is provided by the CWUR rankings, now based in the UAE, which count alumni who win international awards or who are CEOs of major companies. But obviously this is relevant only for a very small number of universities. A new pilot ranking from Moscow also counts international awards.

The only attempt by the well known rankers to measure student quality that is relevant to most institutions is the survey of employers in the QS world and regional rankings. There are some obvious difficulties here. QS gets respondents from a variety of channels and this may allow some universities to influence the survey. In recent years some Latin American universities have done much better on this indicator than on any other.

THE now publish a global employability ranking which is conducted by two European firms, Trendence and Emerging. This is based on two surveys of recruiters in Argentina, Australia, Austria, Brazil, Canada, China, Germany, France, India, Israel, Italy, Japan, Mexico, Netherlands, Singapore, Spain, South Africa, South Korea, Turkey, UAE, UK, and USA. There were two panels with a total of over 6,000 respondents.

A global survey that does not include Chile, Sweden, Egypt, Nigeria, Saudi Arabia, Russia, Pakistan, Indonesia, Bangladesh, Poland, Malaysia or Taiwan can hardly claim to be representative of international employers. This limited representation may explain some oddities of the rankings, such as the high places of the American University of Dubai and the National Autonomous University of Mexico.

The first five places in these rankings are quite similar to the THE world rankings: Caltech, Harvard, Columbia, MIT, Cambridge. But there are some significant differences after that and some substantial changes since last year. Here Columbia, 14th in the world rankings, is in third place, up from 12th last year. Boston University is 6th here but 70th in the world rankings. Tokyo Institute of Technology, in 19th place, is in the 251-300 band in the world rankings. CentraleSupelec is 41st here but in the 401-500 group in the world rankings.

These rankings are useful only for a small minority of universities, stakeholders and students. Only 150 schools are ranked and only a small proportion of the world's employers consulted.

QS have also released their global employability rankings, with 500 universities. These combine the employer reputation survey used in their world rankings with other indicators: alumni outcomes, based on lists of high achievers; partnerships with employers, that is, research collaboration noted in the Scopus database; employer-student connections, that is, employers actively present on campus; and graduate employment rate. There seems to be a close association, at least at the top, between overall scores, employer reputation and alumni outcomes. Overall the top three are Stanford, UCLA and Harvard. For employer reputation they are Cambridge, Oxford and Harvard, and for alumni outcomes Harvard, Stanford and Oxford.

The other indicators are a different matter. For employer-student connections the top three are Huazhong University of Science and Technology, Arizona State University, and New York University; in fact seven of the top ten on this measure are Chinese. For graduate employment rate they are Politecnico di Torino, Moscow State Institute of International Relations, and Sungkyunkwan University, and for partnerships with employers Stanford, Surrey and Politecnico di Milano. When the front runners in the indicators are so different one has to wonder about their validity.

There are some very substantial differences in the ranks given to various universities in these rankings. Caltech is first in the Emerging-Trendence rankings and 73rd in QS. Hong Kong University of Science and Technology is 12th in Emerging-Trendence but not ranked at all by QS. The University of Sydney is 4th in QS and 48th in Emerging-Trendence. The American University of Dubai is in QS's 301-500 band but 138th for Emerging-Trendence.

The rankings published by THE could be of some value to students contemplating careers with the leading companies in the richest countries.

The QS rankings may be more helpful for those students or stakeholders looking at universities outside the very top of the global elite. Even so QS have ranked only a fraction of the world's universities.

It still seems that the way forward in the assessment of graduate outcomes and employability is through standardised testing along the lines of AHELO or the Collegiate Learning Assessment.




Monday, December 11, 2017

Rankings Calendar

The Times Higher Education (THE) Asian Universities Summit will be held at the Southern University of Science and Technology, Shenzhen, China, 5th-7th February, 2018. The 2018 THE Asian university rankings will be announced there.