
Friday, November 13, 2015

Are global rankings losing their credibility? (from Wonkhe)


Originally published in Wonkhe, 27/10/2015


Richard is an academic and expert on university rankings. He writes in depth on rankings at his blog: University Rankings Watch.


The international university ranking scene has become increasingly complex, confusing and controversial. It also seems that the big name brands are having problems balancing popularity with reliability and validity. All this is apparent from the events of the last two months which have seen the publication of several major rankings.
The first phase of the 2015 global ranking season ended with the publication of the US News's (USN) Best Global Universities. We have already seen the 2015 editions of the big three brand names, the Academic Ranking of World Universities (ARWU) produced by the Centre for World-Class Universities at Shanghai Jiao Tong University, the Quacquarelli Symonds (QS) World University Rankings and the Times Higher Education (THE) World University Rankings. Now a series of spin-offs has begun.
In addition, a Russian organisation, Round University Ranking (RUR), has produced another set of league tables. Apart from a news item on the website of the International Ranking Expert Group these rankings have received almost no attention outside Russia, Eastern Europe and the CIS. This is very unfortunate since they do almost everything that the other rankings do and contain information that the others do not.
One sign of the growing complexity of the ranking scene is that USN, QS, ARWU and THE are producing a variety of by-products, including rankings of new universities, subject rankings, best cities for students, reputation rankings and regional rankings, with no doubt more to come. They are also assessing more universities than ever before. THE used to take pride in ranking only a small elite group of world universities. Now they are talking about being open and inclusive and have ranked 800 universities this year, as did QS, while USN has expanded from 500 to 750 universities. Only the Shanghai rankers have remained content with a mere 500 universities in their general rankings.
Academic Ranking of World Universities (ARWU)
All three of the brand name rankings have faced issues of credibility. The Shanghai ARWU has had a problem with the massive recruitment of adjunct faculty by King Abdulaziz University (KAU) in Jeddah. This was initially aimed at the highly cited researchers indicator in the ARWU, which simply counts the number of researchers affiliated to universities, regardless of whether the affiliation has lasted an academic lifetime or began the day before ARWU did the counting. The Shanghai rankers deftly dealt with this issue by simply not counting secondary affiliations in the new lists of highly cited researchers supplied by Thomson Reuters in 2014.
That, however, did not resolve the problem entirely. Those researchers have not stopped putting KAU as a secondary affiliation and even if they no longer affected the highly cited researchers indicator they could still help a lot with publications and papers in Nature and Science, both of which are counted in the ARWU. These part-timers – and some may not even be that – have already ensured that KAU, according to ARWU, is the top university in the world for publications in mathematics.
The issue of secondary affiliation is one that is likely to become a serious headache for rankers, academic publishers and databases in the next few years. Already, undergraduate teaching in American universities is dominated by a huge reserve army of adjuncts. It is not impossible that in the near future some universities may find it very easy to offer minimal part-time contracts to talented researchers in return for listing as an affiliation and then see a dramatic improvement in ranking performance.
ARWU’s problem with the highly cited researchers coincided with Thomson Reuters producing a new list and announcing that the old one would no longer be updated. Last year, Shanghai combined the old and new lists and this produced substantial changes for some universities. This year they continued with the two lists and there was relatively little movement in this indicator or in the overall rankings. But next year they will drop the old list altogether and just use the new one and there will be further volatility. ARWU have, however, listed the number of highly cited researchers in the old and new lists so most universities should be aware of what is coming.
Quacquarelli Symonds (QS) World University Rankings
The Quacquarelli Symonds (QS) World University Rankings have been regarded with disdain by many British and American academics although they do garner some respect in Asia and Latin America. Much of the criticism has been directed at the academic reputation survey which is complex, opaque and, judging from QS’s regular anti-gaming measures, susceptible to influence from universities. There have also been complaints about the staff student ratio indicator being a poor proxy for teaching quality and the bias of the citations per faculty indicator towards medicine and against engineering, the social sciences and the arts and humanities.
QS have decided to reform their citations indicator by treating the five large subject groups as contributing equally to the indicator score. In addition, QS omitted papers, most of them in physics, with a very large number of listed authors and averaged responses to the surveys over a period of five years in an attempt to make the rankings less volatile.
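The effect of that reform can be sketched in a few lines. The faculty-area names and scores below are invented for illustration; the point is simply that averaging equally across areas stops a high-volume field from dominating the indicator.

```python
# QS's reform treats the five large faculty areas as equal contributors to
# the citations indicator, instead of letting citation-heavy fields such as
# medicine dominate. Area names and scores are invented for illustration.
FACULTY_AREAS = ["arts", "engineering", "life_sciences",
                 "natural_sciences", "social_sciences"]

def citations_score(per_area_scores):
    # equal weight per faculty area in which the university has output
    areas = [per_area_scores[a] for a in FACULTY_AREAS if a in per_area_scores]
    return sum(areas) / len(areas)

# A medicine-heavy university no longer wins just because the database
# over-represents medical papers:
med_heavy = {"life_sciences": 95, "natural_sciences": 60, "engineering": 40,
             "social_sciences": 30, "arts": 20}
balanced = {"life_sciences": 55, "natural_sciences": 55, "engineering": 55,
            "social_sciences": 55, "arts": 55}

print(citations_score(med_heavy))  # 49.0
print(citations_score(balanced))   # 55.0
```

Under the old volume-driven approach the medicine-heavy profile would have pulled far ahead; with equal area weights the balanced university comes out on top, which is consistent with the shifts described below.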
The result of all this was that some universities rose and others fell. Imperial College London went from 2nd to 8th while the London School of Economics rose from 71st to 35th. In Italy, the Polytechnics of Milan and Turin got a big boost while venerable universities suffered dramatic relegation. Two Indian institutions moved into the top two hundred; some Irish universities, such as Trinity College Dublin, University College Dublin and University College Cork, went down, while others, such as National University of Ireland Galway and the University of Limerick, went up.
There has always been a considerable amount of noise in these rankings resulting in part from small fluctuations in the employer and academic surveys. In the latest rankings these combined with methodological changes to produce some interesting fluctuations. Overall the general pattern was that universities that emphasise the social sciences, the humanities and engineering have improved at the expense of those that are strong in physics and medicine.
Perhaps the most remarkable of this year’s changes was the rise of two Singaporean universities, the National University of Singapore (NUS) and Nanyang Technological University (NTU), to 12th and 13th place respectively, a change that has met with some scepticism even in Singapore. They are now above Yale, EPF Lausanne and King’s College London. While the changes to the citations component were significant, another important reason for the rise of these two universities was their continuing remarkable performance in the academic and employer surveys. NUS is in the top ten in the world for academic reputation and employer reputation with a perfect score of 100, presumably rounded up, in each. NTU is 52nd for the academic survey and 39th for employer with scores in the nineties for both.
Introducing a moderate degree of field normalisation was probably a smart move. QS were able to reduce the distortion resulting from the database’s bias to medical research without risking the multiplication of strange results that have plagued the THE citations indicator. They have not, however, attempted to reform the reputation surveys which continue to have a combined 50% weighting and until they do so these rankings are unlikely to achieve full recognition from the international academic community.
Times Higher Education (THE) World University Rankings
The latest THE world rankings were published on September 30th and, like QS, THE have done some tweaking of their methodology. They had broken with Thomson Reuters at the end of 2014 and started using data from Scopus, while doing the analysis and processing in-house. They were able to analyse many more papers and citations and conduct a more representative survey of research and postgraduate supervision. In addition they omitted multi-author and multi-cited papers and reduced the impact of the "regional modification".
Consequently there was a large dose of volatility. The results were so different from those of 2014 that they seemed to reflect an entirely new system. THE did, to their credit, do the decent thing and state that direct comparisons should not be made to previous years. That, however, did not stop scores of universities and countries around the world from announcing their success. Those that had suffered have for the most part kept quiet.
There were some remarkable changes. At the very top, Oxford and Cambridge surged ahead of Harvard which fell to sixth place. University College Dublin, in contrast to the QS rankings, rose as did Twente and Moscow State, the Karolinska Institute and ETH Zurich.
On the other hand, many universities in France, Korea, Japan and Turkey suffered dramatic falls. Some of those universities had been participants in the CERN projects and so had benefitted in 2014 from the huge number of citations derived from their papers. Some were small and produced few papers so those citations were divided by a small number of papers. Some were located in countries that performed poorly and so got help from a “regional modification” (the citation impact score of the university is divided by the square root of the average citation impact score of the whole country). Such places suffered badly from this year’s changes.
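The arithmetic behind this is easy to sketch. The figures below are invented, but they show how a handful of hugely cited collaboration papers, divided by a small paper count and then boosted by the country adjustment, can launch a small institution past the research giants:

```python
import math

def regional_modification(impact, country_average):
    # THE's adjustment: divide a university's citation impact score by the
    # square root of its country's average citation impact score
    return impact / math.sqrt(country_average)

# Invented numbers: a small institution whose few papers include hugely
# cited CERN collaboration papers, versus a large comprehensive university.
small = 50_000 / 200      # 250.0 citations per paper
large = 400_000 / 20_000  # 20.0 citations per paper

# A weak national average (0.25) doubles the small university's score,
# while a strong one (1.0) leaves the large university's score unchanged.
small_adjusted = regional_modification(small, 0.25)  # 500.0
large_adjusted = regional_modification(large, 1.0)   # 20.0
```

When THE reduced the impact of this modification in 2015, precisely such inflated scores deflated, which fits the dramatic falls described above.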
It is a relief that THE have finally done something about the citations indicator and it would be excellent if they continued with further reforms such as fractional counting, reducing the indicator’s overall weighting, not counting self-citations and secondary affiliations and getting rid of the regional modification altogether.
Unfortunately, if the current round of reforms represents an improvement, and on balance it probably does, then the very different results of 2014 and before call into question THE's repeated claims to be trusted, robust and sophisticated. If the University of Twente deserves to be in the top 150 this year then the 2014 rankings which had them outside the top 200 could not possibly be valid. If the Korean Advanced Institute of Science and Technology (KAIST) fell 66 places then either the 2015 rankings or those of 2014 were inaccurate, or they both were. Unless there is some sort of major restructuring such as an amalgamation of specialist schools or the shedding of inconvenient junior colleges or branch campuses, large organisations like universities simply do not and cannot change that much over the course of 12 months or less.
It would have been more honest, although probably not commercially feasible, for THE to declare that they were starting with a completely new set of rankings and to renounce the 2009-14 rankings in the way that they had disowned the rankings produced in cooperation with QS between 2004 and 2008. THE seem to be trying to trade on the basis of their trusted methodology while selling results suggesting that that methodology is far from trustworthy. They are of course doing just what a business has to do. But that is no reason why university administrators and academic experts should be so tolerant of such a dubious product.
These rankings also contain quite a few small or specialised institutions that would appear to be on the borderline of a reasonable definition of an "independent university with a broad range of subjects": Scuola Normale Superiore di Pisa and Scuola Superiore Sant'Anna, both part of the University of Pisa system, Charité-Universitätsmedizin Berlin, an affiliate of two universities, St George's, University of London, a medical school, Copenhagen Business School, Rush University, the academic branch of a private hospital in Chicago, the Royal College of Surgeons in Ireland, and the National Research Nuclear University (MEPhI) in Moscow, specialising in physics. Even if THE have not been too loose about who is included, the high scores achieved by such narrowly focussed institutions call the validity of the rankings into question.
Round University Rankings
In general the THE rankings have received a broad and respectful response from the international media and university managers, and criticism has largely been confined to outsiders and specialists. This is in marked contrast to the rankings released by a Russian organisation early in September. These are based entirely on data supplied by Thomson Reuters, THE's data provider and analyst until last year. They contain a total of 20 indicators, including 12 out of the 13 in the THE rankings. Unlike THE, RUR do not bundle indicators together in groups, so it is possible to tell exactly why universities are performing well or badly.
The RUR rankings are not elegantly presented but the content is more transparent than THE, more comprehensive than QS, and apparently less volatile than either. It is a strong indictment of the international higher education establishment that these rankings are ignored while THE’s are followed so avidly.
Best Global Universities
The second edition of the US News's Best Global Universities was published at the beginning of October. US News is best known for its ranking of American colleges and universities and has been cautious about venturing into the global arena. These rankings are fairly similar to the Shanghai ARWU, containing only research indicators and making no pretence to measure teaching or graduate quality. The methodology avoids some elementary mistakes. It does not give too much weight to any one indicator, with none getting more than 12.5%, and it measures citations in three different ways. For eight indicators a log transformation was applied before the calculation of z-scores to reduce the effect of outliers and statistical anomalies.
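The point of the log transformation can be shown with a small sketch using invented indicator values: without it, one extreme outlier compresses everyone else into a narrow band of z-scores.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def z_scores(xs):
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

# A heavily skewed indicator such as total citations (invented values,
# listed in ascending order, with one extreme outlier at the end):
raw = [100, 120, 150, 200, 250, 300, 400, 10_000]

plain = z_scores(raw)
logged = z_scores([math.log(x) for x in raw])

# Without the log, the outlier dominates the scale and the other seven
# universities are squeezed into a narrow, nearly indistinguishable band:
spread_plain = plain[-2] - plain[0]    # spread of the non-outliers, raw
spread_logged = logged[-2] - logged[0] # the same spread after the log
```

After the log transformation the non-outliers are spread out over a much wider range of z-scores, so differences among ordinary universities are no longer swamped by one exceptional case.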
This year US News went a little way towards reducing the rankers’ obsession with citations by including conferences and books in the list of criteria.
Since they do not include any non-research indicators these rankings are essentially competing with the Shanghai ARWU and it is possible that they may eventually become the first choice for internationally mobile graduate students.
But at the moment it seems that the traditional media and higher education establishment have lost none of their fascination with the snakes and ladders game of THE and QS.


Saturday, December 16, 2017

Measuring graduate employability; two rankings

Global university rankings are now well into their second decade. Since 2003, when the first Shanghai rankings appeared, there has been a steady growth of global and regional rankings. At the moment most global rankings are of two kinds, those that focus entirely or almost entirely on research and those such as the Russian Round Rankings, Times Higher Education (THE) and Quacquarelli Symonds (QS) that claim to also measure teaching, learning or graduate quality in some way, although even those are biased towards research when you scratch the surface a little.

The ranking industry has become adept at measuring research productivity and quality in various ways. But the assessment of undergraduate teaching and learning is another matter.

Several ranking organisations use faculty student ratio as a proxy for quality of teaching, which in turn is assumed to have some connection with something that happens to students during their programmes. THE also count institutional income, research income and income from industry, again assuming that there is a significant association with academic excellence. Indicators like these are usually based on data supplied by institutions. For examples of problems here see an article by Alex Usher and a reply by Phil Baty.

An attempt to get at student quality is provided by the CWUR rankings, now based in the UAE, which count alumni who win international awards or who are CEOs of major companies. But obviously this is relevant only for a very small number of universities. A new pilot ranking from Moscow also counts international awards.

The only attempt to measure student quality by the well-known rankers that is relevant to most institutions is the survey of employers in the QS world and regional rankings. There are some obvious difficulties here. QS gets respondents from a variety of channels and this may allow some universities to influence the survey. In recent years some Latin American universities have done much better on this indicator than on any other.

THE now publish a global employability ranking which is conducted by two European firms, Trendence and Emerging. This is based on two surveys of recruiters in Argentina, Australia, Austria, Brazil, Canada, China, Germany, France, India, Israel, Italy, Japan, Mexico, Netherlands, Singapore, Spain, South Africa, South Korea, Turkey, UAE, UK, and USA. There were two panels with a total of over 6,000 respondents.

A global survey that does not include Chile, Sweden, Egypt, Nigeria, Saudi Arabia, Russia, Pakistan, Indonesia, Bangladesh, Poland, Malaysia or Taiwan can hardly claim to be representative of international employers. This limited representation may explain some oddities of the rankings such as the high places of the American University of Dubai and the National Autonomous University of Mexico.

The first five places in these rankings are quite similar to the THE world rankings: Caltech, Harvard, Columbia, MIT, Cambridge. But there are some significant differences after that, and some substantial changes since last year. Here Columbia, 14th in the world rankings, is in third place, up from 12th last year. Boston University is 6th here but 70th in the world rankings. Tokyo Institute of Technology, in 19th place, is in the 251-300 band in the world rankings. CentraleSupelec is 41st here but in the 401-500 group in the world rankings.

These rankings are useful only for a small minority of universities, stakeholders and students. Only 150 schools are ranked and only a small proportion of the world's employers consulted.

QS have also released their global employability rankings with 500 universities. These combine the employer reputation survey used in their world rankings with other indicators: alumni outcomes, based on lists of high achievers; partnerships with employers, that is, research collaboration noted in the Scopus database; employer-student connections, that is, employers actively present on campus; and graduate employment rate. There seems to be a close association, at least at the top, between overall scores, employer reputation and alumni outcomes. Overall the top three are Stanford, UCLA and Harvard. For employer reputation they are Cambridge, Oxford and Harvard, and for alumni outcomes Harvard, Stanford and Oxford.

The other indicators are a different matter. For employer-student connections the top three are Huazhong University of Science and Technology, Arizona State University, and New York University; in fact seven of the top ten on this measure are Chinese. For graduate employment rate they are Politecnico di Torino, Moscow State Institute of International Relations, and Sungkyunkwan University, and for partnerships with employers Stanford, Surrey and Politecnico di Milano. When the front runners on these indicators are so different, one has to wonder about their validity.

There are some very substantial differences in the ranks given to various universities in these rankings. Caltech is first in the Emerging-Trendence rankings and 73rd in QS. Hong Kong University of Science and Technology is 12th in Emerging-Trendence but not ranked at all by QS. The University of Sydney is 4th in QS and 48th in Emerging-Trendence. The American University of Dubai is in QS's 301-500 band but 138th for Emerging-Trendence.

The rankings published by THE could be of some value to students contemplating careers with the leading companies in the richest countries.

The QS rankings may be more helpful for those students or stakeholders looking at universities outside the very top of the global elite. Even so QS have ranked only a fraction of the world's universities.

It still seems that the way forward in the assessment of graduate outcomes and employability is through standardised testing along the lines of AHELO or the Collegiate Learning Assessment.




Saturday, May 26, 2018

The THE reputation rankings

THE have just released details of their reputation rankings, which will be published on May 30th, just ahead, no doubt coincidentally, of the QS World University Rankings.

The number of responses has gone down a bit, from 10,566 last year to 10,162, possibly reflecting growing survey fatigue among academics.

In surveys of this kind the distribution of responses is crucial. The more responses from engineers, the better for universities in Asia; the more from scholars in the humanities, the better for Western Europe. I have noted in a previous blog that the fortunes of Oxford in this ranking are tied to the percentage of responses from the arts and humanities.

This year there have been modest reductions in the percentage of responses from the clinical and health sciences, the life sciences, the social sciences, education and psychology, and large ones for business and economics and the arts and humanities.

The number of responses in engineering and computer science has increased considerably.

It is likely that this year places like Caltech and Nanyang Technological University will do better while Oxford and LSE will suffer. It will be interesting to see if THE claim that this is all the fault of Brexit, an anti-feminist reaction to Oxford's appointment of a female vice-chancellor or government Scrooges turning off the funding tap.

         

                         2017 %   2018 %
Physical sciences         14.6     15.6
Clinical and health       14.5     13.2
Life sciences             13.3     12.8
Business and economics    13.1      9.0
Engineering               12.7     18.1
Arts and humanities       12.5      7.5
Social sciences            8.9      7.6
Computer science           4.2     10.4
Education                  2.6      2.5
Psychology                 2.6      2.3
Law                        0.9      1.0



                  2017 %   2018 %
North America        22       22
Asia Pacific         33       32
Western Europe       25       26
Eastern Europe       11       11
Latin America         5        5
Middle East           3        3
Africa                2        2
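The mechanism suggested above can be sketched with two invented universities and the arts/engineering shares from the table: when survey votes are combined according to the field mix of respondents, a university strong in the arts loses ground as the arts' share of responses shrinks, even though nobody's opinion of it has changed. All strengths below are hypothetical.

```python
# Hypothetical field-level reputation strengths (0-100 scale, invented)
# for an arts-strong and an engineering-strong university.
strengths = {
    "arts_humanities": {"Oxbridge-like": 95, "Tech-institute-like": 30},
    "engineering":     {"Oxbridge-like": 60, "Tech-institute-like": 95},
}

def blended_score(university, field_shares):
    # weight each field's strength by that field's share of survey responses
    total = sum(field_shares.values())
    return sum(share / total * strengths[field][university]
               for field, share in field_shares.items())

# Arts and engineering shares of responses, from the table above (%):
mix_2017 = {"arts_humanities": 12.5, "engineering": 12.7}
mix_2018 = {"arts_humanities": 7.5, "engineering": 18.1}

for u in ("Oxbridge-like", "Tech-institute-like"):
    print(u, round(blended_score(u, mix_2017), 1),
          round(blended_score(u, mix_2018), 1))
```

With these invented strengths, the arts-strong university's blended score falls and the engineering-strong one's rises between the two mixes, which is the pattern predicted for Oxford and Caltech above.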


Wednesday, November 01, 2006

The Best Universities for Biomedicine?

THES has published a list of the world's 100 best universities for biomedicine. This is based, like the other subject rankings, on peer review. Here are the top twenty according to the THES reviewers.

1. Cambridge
2. Harvard
3. Oxford
4. Imperial College London
5. Stanford
6. Johns Hopkins
7. Melbourne
8. Beijing (Peking)
9. National University of Singapore
10. Berkeley
11. Yale
12. Tokyo
13. MIT
14. University of California at San Diego
15. Edinburgh
16. University College London
17. Kyoto
18. Toronto
19. Monash
20. Sydney

Here are the top twenty according to citations per paper, a measure of the quality of research.


1. MIT
2. Caltech
3. Princeton
4. Berkeley
5. Stanford
6. Harvard
7. Oxford
8. University of California at San Diego
9. Cambridge
10. Yale
11. Washington (St Louis)
12. Johns Hopkins
13. ETH Zurich
14. Duke
15. Dundee
16. University of Washington
17. Chicago
18. Vanderbilt
19. Columbia
20. UCLA

The two lists are quite different. Here are the positions according to citations per paper of some of the universities that were in the top twenty for the peer review:

University College London -- 24
Edinburgh -- 25
Imperial College London -- 28
Tokyo -- 34
Toronto -- 35
Kyoto -- 36
Monash -- 52
Melbourne -- 58
Sydney -- 67
National University of Singapore -- 74
Beijing -- 78=

Again, there is a consistent pattern of British, Australian and East Asian universities doing dramatically better in the peer review than in citations per paper. How did they acquire such a remarkable reputation if their research was of such undistinguished quality? Did they acquire a reputation for producing a large quantity of mediocre research?
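The disagreement can be quantified. Re-ranking the eleven universities listed above within their own subset and computing a Spearman rank correlation between peer-review order and citations-per-paper order gives a figure close to zero:

```python
# Positions quoted above: THES peer-review rank (from the top-twenty list)
# and citations-per-paper rank for the same eleven universities.
peer_rank = {"Imperial": 4, "Melbourne": 7, "Beijing": 8, "NUS": 9,
             "Tokyo": 12, "Edinburgh": 15, "UCL": 16, "Kyoto": 17,
             "Toronto": 18, "Monash": 19, "Sydney": 20}
cpp_rank = {"UCL": 24, "Edinburgh": 25, "Imperial": 28, "Tokyo": 34,
            "Toronto": 35, "Kyoto": 36, "Monash": 52, "Melbourne": 58,
            "Sydney": 67, "NUS": 74, "Beijing": 78}

def rerank(positions):
    # convert arbitrary positions into ranks 1..n within this subset
    ordered = sorted(positions, key=positions.get)
    return {k: i + 1 for i, k in enumerate(ordered)}

def spearman(a, b):
    # classic Spearman formula on the within-subset ranks (no ties here)
    ra, rb = rerank(a), rerank(b)
    n = len(ra)
    d2 = sum((ra[k] - rb[k]) ** 2 for k in ra)
    return 1 - 6 * d2 / (n * (n * n - 1))

rho = spearman(peer_rank, cpp_rank)
```

For this group rho comes out essentially at zero: within these eleven, peer-review order tells you almost nothing about citation impact, which is the puzzle the question above points at.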

Notice that Cambridge, with the top score for peer review, produces research of a quality inferior, according to QS's data, to that of eight universities, seven of which are in the US and four in California.

There are also 23 universities that produced insufficient papers to be counted by the consultants. Thirteen are in Asia, five in Australia and New Zealand, four in Europe and one in the US. How did they acquire such a remarkable reputation while producing so little research? Was the little research they did of a high quality?

Monday, April 18, 2016

Round University Rankings


The latest Round University Rankings have been released by the Russian company, RUR Rankings Agency. These are essentially holistic rankings that attempt to go beyond the measurement of research output and quality. There are twenty indicators, although some of them, such as Teaching Reputation, International Teaching Reputation and Research Reputation, or International Students and International Bachelors, are so similar that the information they provide is limited.

Basically these rankings cover much the same ground as the Times Higher Education (THE) World University Rankings. The income from industry indicator is not included but there are an additional eight indicators. The data is taken from Thomson Reuters' Global Institutional Profiles Project (GIPP) which was used by THE for their rankings from 2010 to 2014.

Unlike THE, which lumps its indicators together into groups, the scores in the RUR are listed separately in the profiles. In addition, the rankings provide data for seven consecutive years, from 2010 to 2016. This provides an unusual opportunity to examine in detail the development of universities over that period, as measured by 20 indicators. This is not the case with other rankings, which have fewer indicators or have changed their methodology.

It should be noted that participation in the GIPP is voluntary and therefore the universities in each edition could be different. For example, in 2015 100 universities dropped out of the project and 62 joined.

It is, however, possible to examine a number of claims that have been made about changes in university quality over the last few years. I will take a look at these in the next few posts.

For the moment, here are the top five in the overall rankings and the dimension rankings.

Overall
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Chicago


Teaching
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Duke

Research
1.   Caltech
2.   Harvard
3.   Stanford
4.   Northwestern University
5.   Erasmus University Rotterdam

International Diversity
1.   EPF Lausanne
2.   Imperial College London
3.   National University of Singapore
4.   University College London
5.   Oxford

Financial Sustainability
1.   Caltech
2.   Harvard
3.   Scuola Normale Superiore Pisa
4.   Pohang University of Science and Technology
5.   Karolinska Institute

Unfortunately these rankings have received little or no recognition outside Russia. Here are some examples of the coverage they have received there.


MIPT entered the top four universities in Russia according to the Round University Ranking

Russian Universities in the lead in terms of growth in the international ranking of Round University Ranking

TSU [Tomsk State University]  has entered the 100 best universities for the quality of teaching

[St Petersburg]

Russian universities to top intl rankings by 2020 – Education Minister Livanov to RT


Thursday, November 20, 2014

The Times [Higher Education rankings] they are a-changing

Maybe I'll get my five minutes of fame for being first with a Dylan quotation. I was a bit slow because, unlike Jonah Lehrer, I wanted to check that the quotation actually exists.

Times Higher Education (THE) have announced that they will be introducing reforms to their World University Rankings and ending their partnership with media and data giant, Thomson Reuters (TR).

Exactly why is not stated. It could be rooted in financial disagreement. Maybe THE feels betrayed because TR let US News use the reputation survey for their new Best Global Universities rankings. Perhaps THE got fed up with explaining why places like Bogazici University, Federico Santa Maria Technical University and Royal Holloway were world beaters for research impact, outshining Yale, Oxford and Cambridge.

The reputation survey will now be administered by THE itself in cooperation with Elsevier and will make use of the Scopus database. Institutional data will be collected from universities, the Scopus database and the SciVal analysis tool by a new THE team.

The coming months will reveal what THE have in store but for now this is a list of recommendations. No doubt there will be many more from all sorts of people.

Display each indicator separately instead of lumping them together into Teaching, Research and International Outlook. It is impossible to work out exactly what is causing a rise or fall in the rankings unless they are separated.

Try to find some way of reducing the volatility of the reputation survey. US News do this by using a five year average and QS by rolling over unchanged responses for a further two years.

Consider including questions about undergraduate teaching or doing another survey to assess student satisfaction.

Reduce the weighting of the citations indicator and use more than one measure of citations to assess research quality (citations per paper), faculty quality (citations per faculty) and  research impact (total citations). Use field normalisation but sparingly and sensibly and forget about that regional modification.

Drop the Industry Income: Innovation indicator. It is unfair to liberal arts colleges and private universities and too dependent on input from institutions. Think about using patents instead.

Income is an input. Do not use unless it is to assess the efficiency of universities in producing research or graduates.

Consider dropping the international students indicator or at least reducing its weighting. It is too dependent on geography and encourages all sorts of immigration scams.

Benchmark scores against the means of a constant number of institutions. If you do not, the mean indicator scores will fluctuate from year to year causing all sorts of distortions.
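The benchmarking point can be sketched with min-max scaling, one common way rankers rescale indicator scores (figures invented): when scores are scaled against whichever cohort happens to be ranked that year, a university's published score can swing even though its raw data has not changed at all.

```python
def scaled(value, cohort):
    # min-max scaling against whichever cohort is ranked that year
    lo, hi = min(cohort), max(cohort)
    return 100 * (value - lo) / (hi - lo)

# A university's raw indicator value stays at 50 in both years (invented).
cohort_2014 = [10, 30, 50, 70, 90]
cohort_2015 = [10, 30, 50, 70, 90, 400]  # one new high-scoring entrant

score_2014 = scaled(50, cohort_2014)  # 50.0
score_2015 = scaled(50, cohort_2015)  # collapses, with no change in the
                                      # university's own performance
```

Benchmarking against the means of a constant set of institutions removes this artefact: the scale no longer moves just because newcomers join or drop out of the ranked population.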






Sunday, April 23, 2017

UTAR and the Times Higher Education Asian University Rankings


Recently, Universiti Tunku Abdul Rahman (UTAR), a private Malaysian university, welcomed what appeared to be an outstanding performance in the Times Higher Education (THE) Asian Universities Rankings, followed by a good score in the magazine’s Young University Rankings. This has been interpreted as a remarkable achievement not just for UTAR but also for Malaysian higher education in general.

In the Asian rankings, UTAR is ranked in the top 120 and second in Malaysia behind Universiti Malaya (UM) and ahead of the major research universities, Universiti Sains Malaysia, Universiti Kebangsaan Malaysia and Universiti Putra Malaysia.

This is in sharp contrast to other rankings. There is a research-based ranking published by Middle East Technical University that puts UTAR 12th in Malaysia and 589th in Asia. The Webometrics ranking, which is mainly web-based with one research indicator, has it 17th in Malaysia and 651st in Asia.

The QS rankings, known to be kind to South East Asian universities, put UTAR in the 251-300 band for Asia and 14th= in Malaysia, behind places like Taylor's University and Multimedia University and in the same band as Universiti Malaysia Perlis and Universiti Malaysia Terengganu. UTAR does not appear in the Shanghai rankings or the Russian Round University Rankings.

Clearly, THE is the odd man out among rankings in its assessment of UTAR. I suspect that if challenged a spokesperson for THE might say that this is because they measure things other than research. That is very debatable. Bahram Bekhradnia of the Higher Education Policy Institute has argued in a widely-cited report that these rankings are of little value because they are almost entirely research-orientated.

In fact, UTAR did not perform so well in the THE Asian rankings because of teaching, internationalisation or links with industry. It did not even do well in research. It did well because of an “outstanding” score for research impact and it got that score because of the combination of a single obviously talented researcher with a technically defective methodology.

Just take a look at UTAR’s scores for the various components in the THE Asian rankings. For Research UTAR got a very low score of 9.6, the lowest of the nine Malaysian universities featured in these rankings (100 represents the top score in all the indicators).

For Teaching it has a score of 15.9, also the lowest of the ranked Malaysian universities.

For International Orientation, it got a score of 33.2. This was not quite the worst in Malaysia. Universiti Teknologi MARA (UiTM), which does not admit non-bumiputra Malaysians, let alone international students, did worse.

For Industry Income UTAR’s score was 32.9, again surpassed by every Malaysian university except UiTM.

So how on earth did UTAR manage to get into the top 120 in Asia and second in Malaysia?

The answer is that it got an “excellent” score of 56.7 for Research Impact, measured by field-normalised citations, higher than every other Malaysian university, including UM, in these rankings.

That score is also higher than several major international research universities such as National Taiwan University, the Indian Institute of Technology Bombay, Kyoto University and Tel Aviv University. That alone should make the research impact score very suspicious. Also, compare the score with the low score for research which combines three metrics, research reputation, research income and publications. Somehow UTAR has managed to have a huge impact on the research world even though it receives little money for research, does not have much of a reputation for research, and does not publish very much.

The THE research impact (citations) indicator is very problematical in several ways. It regularly produces utterly absurd results such as Alexandria University in Egypt in fourth place for research impact in the world in 2010 and St George’s, University of London (a medical school), in first place last year, or Anglia Ruskin University, a former art school, equal to Oxford and well ahead of Cambridge University.

In addition, to flog a horse that should have decomposed by now, Veltech University in Chennai, India, has, according to THE, the biggest research impact in Asia and perhaps, if it qualified for the World Rankings, in the world. This was done by massive self-citation by exactly one researcher and a little bit of help from a few friends.

Second in Asia for research impact, THE would have us believe, is King Abdulaziz University of Jeddah, which has been on a recruiting spree of adjunct faculty whose duties might include visiting the university but certainly do require putting its name as a secondary affiliation on research papers.

To rely on the THE rankings as a measure of excellence is unwise. There were methodological changes in 2011, 2015 and 2016, which have contributed to universities moving up or down many places even when there has been no objective change. Middle East Technical University in Ankara, for example, fell from 85th place in 2014-15 to the 501-600 band in 2015-16 and then to the 601-800 band in 2016-17. Furthermore, adding new universities means that the average scores from which the final scores are calculated are likely to fluctuate.

In addition, THE has been known to recalibrate the weight given to its indicators in their regional rankings and this has sometimes worked to the advantage of whoever is the host of THE’s latest exciting and prestigious summit. In 2016, THE’s Asian rankings featured an increased weight for research income from industry and a reduced one for teaching and research reputation. This was to the disadvantage of Japan and to the benefit of Hong Kong where the Asian summit was held.

So, is UTAR really more influential among international researchers than Kyoto University or the National Taiwan University?

What actually happened to UTAR is that it has an outstanding medical researcher who is involved in a massive international medical project with hundreds of collaborators from hundreds of institutions that produces papers that have been cited hundreds of times and will in the next few years be cited thousands of times. One of these papers had, by my count, 720 contributors from 470 universities and research centres and has so far received 1,036 citations, 695 in 2016 alone.

There is absolutely nothing wrong with such projects, but it is ridiculous to treat every one of those 720 contributors as though they were the sole author of the paper, with credit for all the citations, which is what THE does. This could have been avoided simply by using fractional counting and dividing the number of citations by the number of authors or the number of affiliated institutions. This is an option available in the Leiden Ranking, which is the most technically expert of the various rankings. THE already does this for publications with over 1,000 contributors, but that is obviously not enough.
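The difference between the two counting methods is easy to see with the figures quoted above for the mega-authored paper (470 contributing institutions, 1,036 citations to date). This is a bare sketch of the arithmetic, not Leiden's actual implementation, which applies fractional counting at the level of individual author affiliations:

```python
# Full counting vs fractional counting of citations to a multi-institution paper.

def full_credit(citations, institutions):
    # THE's approach: every contributing institution gets the full citation count
    return citations

def fractional_credit(citations, institutions):
    # Leiden-style option: share the citations among the contributing institutions
    return citations / institutions

paper_citations = 1036
paper_institutions = 470

print(full_credit(paper_citations, paper_institutions))                   # 1036 per institution
print(round(fractional_credit(paper_citations, paper_institutions), 1))   # about 2.2 per institution
```

Under full counting, a single hyper-collaborative paper can hand a small university more citation credit than its entire remaining output; under fractional counting, its contribution shrinks to a couple of citations.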

I would not go as far as Bahram Bekhradnia and other higher education experts and suggest that universities should ignore rankings altogether. But if THE are going to continue to peddle such a questionable product, then Malaysian universities would be well advised to keep their distance. There are now several other rankings on the market that could be used for benchmarking and marketing.

It is not a good idea for UTAR to celebrate its achievement in the THE rankings. It is quite possible that the researcher concerned will one day go elsewhere or that THE will tweak its methodology again. If either happens the university will suffer from a precipitous fall in the rankings along with a decline in its public esteem. UTAR and other Malaysian universities would be wise to treat the THE rankings with a great deal of caution and scepticism.