Thursday, February 08, 2018

Should Pakistan Celebrate the Latest THE Asian Rankings?


This is an updated and revised version of a post from a few days ago.


There appears to be no end to the craze for university rankings. The media in many parts of the world show almost as much interest in global university rankings as in the Olympics or the World Cup. Rankings are now used to set requirements for immigration, to choose research collaborators, external examiners and international partners, and for marketing, public relations and recruitment.

Pakistan has not escaped the craze although it was perhaps a bit slower than some other places. Recently, we have seen headlines announcing that ten Pakistani universities are included in the latest Times Higher Education (THE) Asian rankings and highlighting the achievement of Quaid-i-Azam University (QAU) in Islamabad reaching the top 100.

Rankings are unavoidable and sometimes they have beneficial results. The first publication of the research-based Shanghai rankings in 2003, for example, was a salutary shock to continental European universities and a clear demonstration of how far China had to go to catch up with the West in the natural sciences. But rankings need to be treated with caution, especially when their metrics are obviously and seriously flawed.

THE notes that there are now ten Pakistani universities in the Asian rankings and one, QAU, in 79th place, which would appear to be evidence of academic progress.

Unfortunately, Pakistani universities, especially QAU, do very much better in the THE rankings than in others. QAU is in the 401-500 band in the THE world rankings, which use the same indicators as the Asian rankings. But in the QS World University Rankings it is in the 650-700 band. It does not appear at all among the 800 universities ranked by the Shanghai rankings, the 903 in the Leiden Ranking, or the 763 in the Russian Round University Rankings. In the University Ranking by Academic Performance, published in Ankara, it is 605th, and in the Center for World University Rankings list it is 870th.

How can we explain QAU's success in the THE world and Asian rankings, a success so much greater than in any other ranking? It is in large part the result of a flawed methodology.

Take a look at the scores that QAU got in the THE rankings. In all cases the top scoring university gets 100.

For Teaching, a combination of five indicators, the score was 25.7, which is not very good. For International Outlook it was 42.1. Since QAU has very few international staff or students, this mediocre score is very probably the result of a high score for international collaboration.

For research income from industry it was 31.8. This is probably an estimate since exactly the same score is given for four other Pakistani universities.

Now we come to something very odd. QAU's research score was 1.3, the lowest of the 350 universities in the Asian rankings and very much lower than the next worst, Ibaraki University in Japan with 6.6. The research score is composed of research reputation, publications per faculty and research income per faculty. This probably means that QAU's score for research reputation was zero or close to zero.

In contrast, QAU’s score of 81.2 for research impact measured by citations is among the best in Asia. Indeed, in this respect it would appear to be truly world class with a better score than Monash University, the Chinese University of Hong Kong, the University of Bologna or the University of Nottingham.

How is it possible that QAU could be 7th in Asia for research impact but 350th for research?

The answer is that THE's research impact indicator is extremely misleading. It does not simply count citations: it normalises them across more than 300 fields, five years of publication and up to six years of citations. This means that a few highly cited papers in a strategic discipline at a strategic time can have a disproportionate effect on the impact score, especially if the total number of papers is low.

Added to this is THE's regional modification, which divides a university's citation impact score by the square root of the score of the whole country in which the university is located. The scores of universities in the top-scoring country stay the same while everyone else's go up, and the lower the country's score the bigger the boost. The effect is to give a big lift to countries like Pakistan. THE used to apply this adjustment to the whole citations indicator but now applies it to only 50%.
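To make these two effects concrete, here is a minimal sketch with invented numbers and deliberately simplified formulas; THE's actual procedure is more elaborate and not fully public, but the arithmetic shows how a single heavily cited paper can dominate a small publication count and how the square-root adjustment rewards universities in low-scoring countries.

```python
# Simplified sketch of field normalisation and the regional modification.
# All figures and function names are hypothetical, not THE's own data.
import math

def field_normalised_impact(papers):
    """Average of each paper's citations divided by the world average
    for its field and year (a crude version of field normalisation)."""
    return sum(cites / world_avg for cites, world_avg in papers) / len(papers)

# A small output with one mega-cited paper dominates the average.
small_university = [(4094, 20.0)] + [(2, 20.0)] * 49    # 50 papers, one outlier
large_university = [(40, 20.0)] * 2000                   # 2,000 solidly cited papers

print(field_normalised_impact(small_university))  # ~4.19, driven by the outlier
print(field_normalised_impact(large_university))  # 2.0

def regional_modification(university_score, country_score):
    """Divide a university's citation score by the square root of its
    country's score, boosting institutions in low-scoring countries."""
    return university_score / math.sqrt(country_score)

print(regional_modification(0.5, 1.0))   # 0.5 in the top-scoring country
print(regional_modification(0.5, 0.25))  # 1.0 in a country scoring 0.25
```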

Then we have to consider how THE deals with mega-papers, mainly in physics and medicine, with hundreds or even thousands of authors and hundreds or thousands of citations.

Until the world rankings of 2015-16, THE treated every single author of such papers as though he or she were the sole author. It then stopped counting citations to these papers altogether, and from 2016-17 it has credited each contributing institution with a minimum of 5% of the citations.

The effect of the citations metric has been to make a mockery of the THE Asian and world rankings. A succession of unlikely places has been propelled to the top of the indicator because of contributions to mega-papers, or because of a few prolific authors, or even a single one, combined with a low overall number of papers. We have seen Alexandria University, Anglia Ruskin University, Moscow State Engineering Physics Institute and Tokyo Metropolitan University rise to the top of this indicator. In last year's Asian rankings, Veltech University in India appeared to be first for research impact.

QAU has been involved in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations, and has provided authors for several papers. One 2012 paper derived from this project received 4094 citations, so that QAU would be credited with about 205 citations, 5% of the total, just for this paper.
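The arithmetic behind that figure is simple. The sketch below assumes the plain reading of the 5% minimum credit described above, and the figure for QAU's remaining citations is invented; it is only meant to show how one mega-paper can swamp everything else when overall output is small.

```python
# One mega-paper under the (assumed) 5% minimum credit rule.
mega_paper_citations = 4094
credited = 0.05 * mega_paper_citations        # about 205 citations from one paper
other_citations = 150                         # hypothetical total for all other papers
share = credited / (credited + other_citations)
print(round(credited), f"{share:.0%}")        # 205, roughly 58% of the total
```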

In addition, QAU employs an extremely productive mathematician, Tasawar Hayat, who appears in Clarivate Analytics' list of Highly Cited Researchers with King Abdulaziz University in Saudi Arabia as his primary affiliation and QAU as his secondary affiliation. Professor Hayat is remarkably prolific: in 2017 alone he was author or co-author of 384 scientific documents, articles, reviews, notes and so on.

There is nothing wrong with QAU taking part in the LHC project, and I am unable to comment on the quality of Professor Hayat's research. It should, however, be understood that if Professor Hayat left QAU, or QAU withdrew from the LHC project, or THE changed its methodology, then QAU could suffer a dramatic fall in the rankings similar to those suffered by some Japanese, Turkish and Korean universities in recent years. This is an achievement built on desperately weak foundations.

It would be very unwise to use these rankings as evidence for the excellence of QAU or any other university.

Tuesday, February 06, 2018

Rising Stars of Asian research

Times Higher Education (THE) has just announced the latest edition of its Asian rankings. Since the indicators are the same as those of the world rankings, with adjusted weightings, there was absolutely no suspense about who would be top. In case anybody still doesn't know, it was the National University of Singapore.

The really interesting part of the rankings is the citations indicator, field- and year-normalised, based on Scopus, with fractional counting only for papers with more than 1,000 authors.

Here are some of the superstars of Asian research. On the left is the citations rank and the score for citations. On the right, in brackets, is the score for Research, which comprises research reputation, publications per faculty and research income. To achieve a score in the seventies, eighties or nineties for citations with minimal research reputation, very few publications and limited funding is remarkable.

1st. 99.1. Babol Noshirvani University of Technology (15.3)
2nd. 92.0. King Abdulaziz University (92.3)
3rd. 93.1. Ulsan National Institute of Science and Technology (37.8)
7th. 81.2. Quaid-i-Azam University (1.3)
13th. 74.5. Fujita Health University (9.4)
16th. 72.5. Central China Normal University (11.3)






Free speech rankings from Spiked

The magazine Spiked is descended from Living Marxism, although some think it is now more libertarian than socialist. It has just published the latest edition of its free speech university rankings.

These are not actually rankings but a classification or rating, since they simply divide UK universities into three groups. They have been mocked by sections of the academic blogosphere, including WONKHE, and some of that mockery may be justified on technical grounds. This is, however, such an important topic that any sort of publicity has to be welcomed.

Universities are divided into three categories: 

RED: "A students’ union, university or institution that is hostile to free speech and free expression, mandating explicit restrictions on speech, including, but not limited to, bans on specific ideologies, political affiliations, beliefs, books, speakers or words."

AMBER: "A students’ union, university or institution that chills free speech and free expression through restricting vague and subjective types of speech, such as ‘offensive’ or ‘insulting’ speech, or requiring burdensome vetting procedures for events, speakers, posters or publications. Many policies in this category might not explicitly limit speech, but have the potential to be used to that end, due to purposefully vague or careless wording."

GREEN: "A students’ union, university or institution that, as far as we are aware, places no significant restrictions on free speech and expression – other than where such speech or expression is unlawful."

The roll of honour in the green category includes exactly seven universities, none of them in the Russell Group: Anglia Ruskin, Buckingham, Hertfordshire, Robert Gordon, Trinity St David, West of Scotland, and Winchester.


Interesting data from Webometrics

The Webometrics rankings perform the invaluable function of ranking 27,000-plus universities, or entities claiming to be universities, around the world. Their Excellence indicator also identifies those institutions, 5,776 this year, with any claim to involvement in research.

Consequently, it has often been used in unofficial national rankings in countries, especially in Africa, where very few places can make it into the top 500 or 1,000 universities included in the better known international rankings.

However, there seems to be a universal law that when a ranking becomes significant it will have unintended and perverse consequences. In the UK we have seen massive inflation in the number of first and upper second class degrees, partly because this is an element in popular national rankings. Sophisticated campaigns can also produce significant gains in the QS academic opinion survey, which has a 40% weighting, and a few hundred strategic citations can boost the most unlikely universities in the research impact indicator of the THE world and regional rankings.

Webometrics also has an indicator that seems to be susceptible to bad practices. This is "Presence", the number of pages in the main web domain, including subdomains and file types such as rich files, with a 5% weighting. Apparently this can be easily manipulated. Unlike other rankings, Webometrics does not attempt to ignore this but has highlighted it in several recent tweets, which is helpful since it indicates who might be manipulating the variable. It is possible that there has been a misunderstanding of the Webometrics guidelines, an error somewhere, or perhaps some totally valid and innocent explanation. If so, I will be happy to publish a statement.

Here is a selection of universities with their world rank in the Webometrics Presence indicator. The overall rank is in brackets.

4.  University of Nairobi, Kenya (874)

5.  Masaryk University in Brno, Czechia (433)

9.  Federal University of Santa Catarina, Brazil  (439)

15.  Charles University in Prague (203)

17.  University of Costa Rica (885)

20.  University of the West Indies St Augustine (1792)

32.  National University of Honduras (3777)

40.  Mahidol University, Thailand (548)

55.  Universitas Muhammadiyah Surakarta, Indonesia  (6394)




Wednesday, January 24, 2018

Fake Rankings from Nigeria?

Although the Webometrics rankings, based mainly on web activities, receive little attention from the good and the great among the world's university administrators, they do serve the important function of providing some sort of assessment of over 20,000 universities or entities that claim to be universities. They get to places where the market leaders, Shanghai Ranking, THE and QS, cannot go.

As a result, the media in several African countries have from time to time published local rankings based on Webometrics that do not appear all that different from what would be expected from a ranking based on research or reputation.

For example, the current top five Nigerian universities in Webometrics are:

1. University of Ibadan
2. Covenant University
3. Obafemi Awolowo University
4. University of Nigeria
5. University of Lagos.

The Nigerian press have in the last few years announced the results of rankings supposedly produced by the country's National Universities Commission (NUC). In 2016 Nigerian Scholars reported that the NUC had produced a ranking with the top five being:

1. University of Ibadan
2. University of Lagos
3. University of Benin
4. Obafemi Awolowo University
5. Ahmadu Bello University.

Now we have this report in The Nation: Professor Adamu Abubakar Abdulrasheed, Executive Secretary of the NUC, has announced that the rankings attributed to the NUC were fake and that the commission had not published any ranking for several years.

This is a bit strange. Does that mean that nobody on the commission noticed that fake rankings were being published in its name until now? There may be more to the story.

For the moment, it looks as though Nigeria and other countries in Africa may have to continue relying on Webometrics.





Saturday, January 20, 2018

What use is a big endowment?






Quite a lot. But not as much as you might expect.

The website THEBESTSCHOOLS has just published a list of the world's 100 wealthiest universities, as measured by the value of their endowments. As expected, it is dominated by US institutions, with Harvard in first place. There are also three universities from Canada and two each from the UK, Australia, Japan, Singapore and Saudi Arabia.

There are of course other elements in university funding, but it is worth looking at how this ranking compares with others. The top five are familiar to any rankings observer: Harvard, with an endowment of US$34.5 billion, followed by Yale, the University of Texas system, Stanford and Princeton. Then there is a surprise, King Abdullah University of Science and Technology in Saudi Arabia in sixth place with an endowment of 20 billion.

Some of the wealthy universities also do well in other rankings. Stanford, in fourth place here, is second in the overall Shanghai rankings and seventh for publications, as well as fifth in the Leiden Ranking default publications indicator. It does even better in the QS employer survey indicator, where it is ranked second.

There are, however, several places that are very wealthy but just don't register in the global rankings. Williams College, the University of Richmond, Pomona College, Wellesley College, Smith College and Grinnell College are not even given a value in the QS employer survey indicator or the Leiden or Shanghai publication indicators. They may of course do well in other respects: the University of Richmond is reported by the Princeton Review to be second in the US for internships.

On the other hand, some less affluent universities do surprisingly well. Some California schools in particular seem to be among the best performers relative to their wealth. Caltech is 47th here but 9th in the Shanghai rankings, where it has always been first in the productivity per capita indicator. Berkeley is 65th here and fifth in Shanghai. The University of California San Francisco, a medical school, is 90th here and 21st in Shanghai.

Overall there is an association between endowment value and research output or reputation among employers that is definitely positive but rather modest. The correlation between endowment and Shanghai publication score is 0.38, between endowment and number of publications 2012-15 (in the Leiden Ranking) 0.46, and between endowment and the QS employer survey score 0.40. The relationship would certainly be higher if we corrected for restriction of range.
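For anyone who wants to reproduce this kind of comparison, the sketch below shows the calculation on made-up figures; the endowments and paper counts are placeholders, not the data behind the coefficients quoted above.

```python
# Pearson correlation between endowment and publication output,
# computed on invented figures purely to illustrate the method.
from statistics import correlation  # Python 3.10+

endowment_bn = [34.5, 25.4, 20.0, 4.5, 2.1, 1.2]        # hypothetical US$ billions
publications = [28000, 22000, 3000, 15000, 9000, 6000]  # hypothetical paper counts

print(round(correlation(endowment_bn, publications), 2))
```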

Having a lot of money helps a university produce research and build up a reputation for excellence but it is certainly not the only factor involved.

Here is the top ten in a ranking of the 100 universities by papers (Leiden Ranking) per billion dollars of endowment; a sketch of the calculation follows the list.

1. University of Toronto
2. University of British Columbia
3. McGill University
4. University of California San Francisco
5. University of Melbourne
6. Rutgers University
7. UCLA
8. University of Florida
9. University of California Berkeley
10. University of Sydney.
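The ratio itself is trivial to compute once the two lists are matched up. The figures below are invented, not the actual THEBESTSCHOOLS endowments or Leiden paper counts.

```python
# Papers per billion dollars of endowment, on hypothetical figures.
universities = {
    "University A": {"papers": 14000, "endowment_bn": 2.4},
    "University B": {"papers": 9000, "endowment_bn": 34.5},
}
ranked = sorted(universities.items(),
                key=lambda kv: kv[1]["papers"] / kv[1]["endowment_bn"],
                reverse=True)
for name, d in ranked:
    print(name, round(d["papers"] / d["endowment_bn"]), "papers per US$ billion")
```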

When it comes to research value for money it looks as though Australian and Canadian universities and US state institutions are doing rather better than the Ivy League or Oxbridge.














Ranking News: Chinese Think Tank Ranking

From the China Daily

The Global Think Tank Research Center affiliated with Zhejiang University of Technology has released a ranking of domestic university think tanks.

The first three places go to the National Academy of Development and Strategy at Renmin University of China, the National School of Development at Peking University, and the National Conditions Institute at Tsinghua University.

Wednesday, January 17, 2018

Ranking News: US State K-12 Rankings

Education Week has produced a ranking of states according to three criteria: Chance for Success, School Finance and K-12 Achievement. Overall, the top state is Massachusetts, which is also first for Chance for Success and K-12 Achievement. Pennsylvania is top for School Finance. Overall the worst performing state is Nevada, while New Mexico is worst for Chance for Success, Idaho for School Finance, and Mississippi for K-12 Achievement.

California is an interesting case. Overall it is below average and gets a grade of C-. For K-12 its grade is D+. The state has some of the best universities in the world. Typically three or four of them will be found in the top ten of any global ranking. So why is the performance of primary and secondary schools so poor? Could it be that Education Week has identified the future of California's tertiary sector?






Thursday, January 11, 2018

Ranking News: US News online program rankings

U.S. News Releases 2018 Best Online Programs Rankings

Ranking news: Jordan cancels classification of universities

The Higher Education Accreditation Commission of Jordan has cancelled its proposed classification of universities. Apparently, academics were opposed because it was based on international rankings and ignored "the reality of the universities and the damage to their reputation".


Source

Jordan Times