
Saturday, March 16, 2024

THE's Big Bang Ranking

 


Another day, another ranking. 

Times Higher Education (THE) has published a "bang for the bucks" ranking.

THE is taking the scores for institutional income, research income, and income from industry and comparing them with the scores "for research, teaching, and working with industry." This, presumably, is supposed to reveal those universities that are able to process their funding efficiently and turn it into publications, citations, patents, doctorates, and survey responses.

There are some methodological issues here. It is not clear exactly how the income scores are calculated. Is it from the raw monetary data that THE collects from universities, or has it been through the THE standardization and normalization machine? Is there some sort of weighting or just an average of the three income categories? 
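Since THE does not spell out the formula, here is a minimal sketch of one plausible reading of the method, assuming the ranking simply divides the average of a university's output scores by the average of its income scores. All of the names and numbers below are invented for illustration; this is a guess at the calculation, not THE's published procedure.

```python
# A guess at a "bang for the buck" calculation: the average of the output
# scores (teaching, research, industry engagement) divided by the average
# of the three income scores. Universities and scores are hypothetical.

def bang_for_buck(output_scores, income_scores):
    mean_output = sum(output_scores) / len(output_scores)
    mean_income = sum(income_scores) / len(income_scores)
    return mean_output / mean_income

# (teaching, research, industry) output scores and
# (institutional, research, industry) income scores
universities = {
    "Alpha University": ((70.0, 80.0, 60.0), (90.0, 85.0, 70.0)),
    "Beta University":  ((40.0, 55.0, 30.0), (20.0, 25.0, 15.0)),
}

for name, (outputs, incomes) in universities.items():
    print(f"{name}: {bang_for_buck(outputs, incomes):.2f}")
# Alpha University: 0.86 -- strong outputs but high reported income
# Beta University:  2.08 -- weaker outputs, yet "efficient" because income is low
```

Note that on any reading along these lines, a university's ratio improves whenever its reported income falls, a point that becomes relevant below.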

Also, there is a chart that suggests that all the scores are counted except for the financial metrics, but the text implies that the international pillar is not counted as part of the bang that THE purports to measure.

Another issue is that the financial data in the THE rankings refers to the year two years before the date of publication. However, citation and publication data are from a five- or six-year period before the ranking is published. In effect, THE is claiming that their favored schools have a remarkable ability to send money back in time to the years when research proposals were written, papers published, and citations recorded.

THE lists ten countries as good bang producers, starting with the UK and including Pakistan and Egypt. It does not list China, South Korea, Canada, or Australia, which should make us a little suspicious.

Then, looking at the list of twenty universities with the biggest bangs, we see a few familiar names, including Brighton and Sussex Medical School, Babol Noshirvani University of Technology, and Vita-Salute San Raffaele University, that have appeared in this blog before because they received remarkably high scores for citations and consequently did well in the overall rankings. Some, including Quaid-i-Azam University, COMSATS University, Auckland University of Technology, Government College University Faisalabad, and University College London, have contributed to citation-rich multi-contributor papers from the Global Burden of Disease Studies or the Large Hadron Collider project. Others, such as Shoolini University of Biotechnology and Management Sciences and Malaviya National Institute of Technology, have scores for research quality that are disproportionate to those for research environment or teaching. It looks as though a lot of THE's Big Bang simply consists of getting masses of citations.

It is also possible that universities might obtain a good bang-for-the-buck score by underreporting their income, perhaps accidentally, which would help here although not in conventional rankings. This has happened to Trinity College Dublin and probably to Harvard, although the latter case went unnoticed by almost everyone. The very high scores for Sorbonne University and Université Paris Cité probably result from the special features of the French funding system.

I suspect quite a few institutions will take this ranking seriously, or pretend to, and use it as a pretext to try to obtain more largesse from increasingly impoverished states.

It would seem that THE is engaged in a public relations exercise for upmarket British, and perhaps US and continental, universities. These are doing all sorts of amazing, brilliant, and exciting things for which they receive insufficient funds from cheapskate governments. Just imagine what they could do if they got as much money as Chinese universities do.



Wednesday, February 28, 2024

Comments on the THE Reputation Rankings

Times Higher Education (THE) has announced the latest edition of its reputation ranking. The scores for this ranking will be included in the forthcoming World University Rankings and THE's other tables, where they will have a significant or very significant effect. In the Japan University Rankings they will get an 8% weighting, and in the Arab University Rankings 41%. Why THE gives such a large weight to reputation in the Arab rankings is a bit puzzling.

The ranking is based on a survey of researchers "who have published in academic journals, have been cited by other researchers and who have been published within the last five years," presumably in journals indexed in  Scopus.

Until 2022 the survey was run by Elsevier; since then it has been brought in-house.

The top of the survey tells us little that is new. Harvard is first, followed by the rest of the six big global brands: MIT, Stanford, Oxford, Cambridge, and Berkeley. Leading Chinese universities are edging closer to the top ten.

For most countries or regions, the rank order is uncontroversial: Melbourne is the most prestigious university in Australia, Toronto in Canada, Technical University of Munich in Germany, and a greyed-out Lomonosov Moscow State University in Russia. However, there is one region where the results are a little eyebrow-raising. 

As THE has been keen to point out, there has been a remarkable improvement in the scores for some universities in the Arab region. This in itself is not surprising. Arab nations in recent years have invested massive amounts of money in education and research, recruited international researchers, and begun to rise in the research-based rankings such as Shanghai and Leiden. It is to be expected that some of these universities should start to do well in reputation surveys.

What is surprising is which Arab universities have now appeared in the THE reputation ranking. Cairo University, the American University of Beirut, Qatar University, United Arab Emirates University, KAUST, and King Abdulaziz University have achieved some success in various rankings, but they do not make the top 200 here.

Instead, we have nine universities: the American University in the Middle East, Prince Mohammed Bin Fahd University, Imam Mohammed Ibn Saud Islamic University, Qassim University, Abu Dhabi University,  Zayed University, Al Ain University, Lebanese University, and Beirut Arab University. These are all excellent and well-funded institutions by any standards, but it is hard to see why they should be considered to be among the world's top 200 research-orientated universities.

None of these universities makes it into the top 1,000 of the Webometrics ranking or the RUR reputation rankings. A few are found in the US News Best Global Universities, but none get anywhere near the top 200 for world or regional reputation. They do appear in the QS world rankings but always with a low score for the academic survey.

THE accepts that survey support for these universities comes disproportionately from within the region, in marked contrast to US institutions, and claims that Arab universities have established a regional reputation but have yet to sell themselves to the rest of the world.

That may be so, but again, there are several Arab universities that have established international reputations. Cairo University is in the top 200 in the QS academic survey and the RUR reputation ranking, and the American University of Beirut is ranked 42nd for regional research reputation by USN. They are, however, absent from the THE reputation ranking.

When a ranking produces results that are at odds with other rankings and with accessible bibliometric data, then a bit of explanation is needed.


  




Saturday, December 09, 2023

Global Subject Rankings: The Case of Computer Science

Three ranking agencies have recently released the latest editions of their subject rankings: Times Higher Education, Shanghai Ranking, and Round University Rankings.  

QS, URAP, and National Taiwan University also published subject rankings earlier in the year, and the US News global rankings announced last year can be filtered by subject. The methods are different, and consequently the results are also rather different. It is instructive to focus on the results for a specific field, computer science, and on two universities, Oxford and Tsinghua. Note that the scope of the rankings is sometimes different.

 

1.   Times Higher Education has published rankings of eleven broad subjects using the same indicators as in their world rankings (Teaching, Research Environment, Research Quality, International Outlook, and Industry: Income and Patents) but with different weightings. For example, Teaching has a weighting of 28% in the Engineering rankings and Industry: Income and Patents 8%, while for Arts and Humanities the weightings are 37.5% and 3% respectively.

These rankings continue to be led by the traditional Anglo-American elite. Harvard is in first place for three subjects, Stanford, MIT, and Oxford in two each, and Berkeley and Caltech in one each.

The top five for Computer Science are:

1.    University of Oxford

2.    Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.    ETH Zurich.

Tsinghua is 13th.

 

2.   The Shanghai subject rankings are based on these metrics: influential journal publications, category normalised citation impact, international collaboration, papers in Top Journals or Top Conferences, and faculty winning significant academic awards.

According to these rankings, China is now dominant in Engineering: Chinese universities lead in fifteen Engineering subjects, while Harvard, MIT, and Northwestern University lead in seven. The Natural Sciences, Medical Sciences, and Social Sciences are still largely the preserve of American and European universities.

Excellence in the Life Sciences appears to be divided between the USA and China. The top positions in Biology, Human Biology, Agriculture, and Veterinary Science are held respectively by Harvard, University of California San Francisco, Northwest Agriculture and Forestry University, and Nanjing Agricultural University.

The top five for Computer Science and Engineering are:

1.    Massachusetts Institute of Technology

2.    Stanford University

3.    Tsinghua University

4.    Carnegie Mellon University

5.    University of California Berkeley.

Oxford is 9th.

 

3.  The Round University Rankings (RUR), now published from Tbilisi, Georgia, are derived from 20 metrics grouped in four clusters: Teaching, Research, International Diversity, and Financial Sustainability. The same methodology is used for rankings in six broad fields. Here, Harvard is in first place for Medical Sciences, Social Sciences, and Technical Sciences, Caltech for Life Sciences, and the University of Pennsylvania for Humanities.

RUR’s narrow subject rankings, published for the first time, use different criteria related to publications and citations: Number of Papers, Number of Citations, Citations per Paper, Number of Citing Papers, and Number of Highly Cited Papers. In these rankings, first place goes to twelve universities in the USA, eight in Mainland China, three in Singapore, and one each in Hong Kong, France, and the UK.

 The top five for Computer Science are:

1.    National University of Singapore

2.    Nanyang Technological University

3.    Massachusetts Institute of Technology

4.    Huazhong University of Science and Technology

5.    University of Electronic Science and Technology of China.

Tsinghua is 10th.  Oxford is 47th.

 

4.   The QS World University Rankings by Subject are based on five indicators: Academic reputation, Employer reputation, Research citations per paper, H-index and International research network.  At the top they are mostly led by the usual suspects, MIT, Harvard, Stanford, Oxford, and Cambridge.

The top five for Computer Science and Information Systems are:

1.    Massachusetts Institute of Technology

2.    Carnegie Mellon University

3.    Stanford University

4.    University of California Berkeley

5.    University of Oxford.

Tsinghua is 15th.

 

5.   University Ranking by Academic Performance (URAP) is produced by a research group at the Middle East Technical University, Ankara, and is based on publications, citations, and international collaboration. Last July it published rankings of 78 subjects.  

 The top five for Information and Computing Sciences were:

1.    Tsinghua University

2.    University of Electronic Science and Technology of China

3.   Nanyang Technological University

4.   National University of Singapore

5.   Xidian University

Oxford is 19th.

 

6.    The US News Best Global Universities can be filtered by subject. They are based on publications, citations and research reputation.

The top five for Computer Science in 2022 were:

1.   Tsinghua University

2.   Stanford University

3.    Massachusetts Institute of Technology

4.    Carnegie Mellon University

5.   University of California Berkeley

Oxford was 11th.

 

7.    The National Taiwan University Rankings are based on articles, citations, highly cited papers, and H-index.

The top five for Computer Science are:

1.    Nanyang Technological University

2.    Tsinghua University

3.    University of Electronic Science and Technology of China

4.   National University of Singapore

5.    Xidian University

Oxford is 111th.

 

So, Tsinghua is ahead of Oxford for computer science and related fields in the Shanghai Rankings, the Round University Rankings, URAP, the US News Best Global Universities, and the National Taiwan University Rankings. These rankings are entirely or mainly based on research publications and citations. Oxford is ahead of Tsinghua in both the QS and THE subject rankings. The contrast between the THE and the Taiwan rankings is especially striking.

 

 

 

 

 

Saturday, March 18, 2023

SCImago Innovation Rankings: The East-West Gap Gets Wider

The decline of western academic research becomes more apparent every time a ranking with a stable and moderately accurate methodology is published. This will not be obvious if one just looks at the top ten, or even the top fifty, of the better known rankings. Harvard, Stanford, and MIT are usually still there at the top and Oxford and Cambridge are cruising along in the top twenty or the top thirty.

But take away the metrics that measure inherited intellectual capital such as the Nobel and Fields laureates in the Shanghai rankings or the reputation surveys in the QS, THE, and US world rankings, and the dominance of the West appears ever more precarious. This is confirmed if we turn from overall rankings to subject and field tables.

Take a look at the most recent edition of the CWTS Leiden Ranking, which is highly reputed among researchers although much less so among the media. For sheer number of publications overall, Harvard still holds the lead although Zhejiang, Shanghai Jiao Tong and Tsinghua are closing in and there are more Chinese schools in the top 30.  Chinese dominance is reduced if we move to the top 10% of journals but it may be just a matter of time before China takes the lead there as well. 

But click to physical sciences and engineering. The top 19 places are held by Mainland Chinese universities with the University of Tokyo coming in at 20.  MIT is there at 33, Texas A & M at 55 and Purdue 62. Again the Chinese presence is diluted, probably just for the moment, if we switch to the top 10% or 1% of journals.  

Turning to developments in applied research, the shift to China and away from the West appears even greater.

The SCImago Institutions Rankings are rather distinctive. In addition to the standard measures of research activity, there are metrics for innovation and societal impact, and the rankings cover government agencies, hospitals, research centres, and companies as well as universities.

The innovation rankings combine three measures of patent activity. Patents are problematic for comparing universities but they can establish broad long-term trends. 

Here are the top 10 for Innovation in 2009:

1.   Centre National de la Recherche Scientifique

2.   Harvard University 

3.   National Institutes of Health, USA

4.   Stanford University 

5.   Massachusetts Institute of Technology

6.   Institut National de la Santé et de la Recherche Médicale

7.   Johns Hopkins University 

8.   University of California Los Angeles

9.   Howard Hughes Medical Institute 

10.  University of Tokyo.

And here they are for 2023:

1.   Chinese Academy of Sciences 

2.   State Grid Corporation of China  

3.   Ministry of Education PRC

4.   DeepMind

5.   Ionis Pharmaceuticals

6.   Google Inc, USA

7.   Alphabet Inc 

8.  Tsinghua University

9.   Huawei Technologies Co Ltd

10.  Google International LLC.

What happened to the high-flying universities of 2009? Harvard is in 57th place, MIT in 60th, Stanford 127th, Johns Hopkins 365th, and Tokyo in 485th.

It seems that the torch of innovation has left the hands of American, European, and Japanese universities and research centres and has been passed to multinational, Chinese, and American companies and research bodies, plus a few Chinese universities. I am not sure where the loyalties of the multinational institutions lie, if indeed they have any at all.




Saturday, March 04, 2023

US News and the Law Schools

There has always been a tension between the claim by commercial rankers that they provide insights and data for students and other stakeholders and the need to keep on the good side of those institutions that can provide them with status, credibility, and perhaps even lucrative consultancies.

A recent example is Yale, Harvard, Berkeley and other leading law schools declaring that they will "boycott", "leave", "shun", or "withdraw from" the US News (USN) law school rankings. USN has announced that it will make some concessions to the schools although it seems that, for some of them at least, this will not be enough. It is possible that this revolt of the elites will spread to other US institutions and other rankings. Already Harvard Medical School has declared that it will follow suit and boycott the medical school rankings.

At first sight, it would seem that the law schools are performing an act of extraordinary generosity or self-denial. Yale has held first place since the rankings began, and the others who have joined the supposed boycott seem to be mainly from the upper levels of the ranking, while those who have not seem to be mostly from the middle and lower levels. To "abandon" a ranking that has served the law school elite so well for many years is a bit odd, to say the least.

But Yale and the others are not actually leaving or withdrawing from the rankings. That is something they cannot do. The data used by US News is mostly from public sources, and where it is supplied by the schools it can be replaced by public data. The point of the exercise seems to be to persuade US News to revise its methodology so that it conforms to the direction in which Yale Law School and other institutions want to go.

We can be sure that the schools have a good idea how they will fare if the current methodology continues and what is likely to happen if there are changes. It is now standard practice in the business to model how institutions will be affected by possible tweaks in ranking methodology.

So what was Yale demanding? It wanted  fellowships to be given the same weighting as graduate legal jobs. This would appear reasonable on the surface but it seems that the fellowships will be under the control of Yale and therefore this would add a metric dependent on the ability to fund such fellowships. Yale also wanted debt-forgiveness programmes to be counted in the rankings. Again this is something dependent on the schools having enough money to spare.

For a long time the top law schools have been in the business of supplying bright and conscientious young graduates. Employers have been happy to pay substantial salaries to the graduates of the famous "top 14" schools since they appear more intelligent and productive than those from run-of-the-mill institutions.

The top law schools have been able to do this by rigorous selection procedures, including standardised tests and college grades. Basically, they have selected for intelligence and conscientiousness, and perhaps for a certain amount of agreeableness and conformity. There is some deception here, perhaps including self-deception. Yale and the rest of the elite claim that they are doing something remarkable by producing outstanding legal talent, but in fact they are just recruiting the new students with the greatest potential, or at least they did until quite recently.

If schools cannot select for such attributes then they will have problems convincing future employers that their graduates do in fact possess them. If that happens, the law school graduate premium will erode, and future lawyers will be reluctant to go into never-ending debt to enter a career that is increasingly precarious and unrewarding.

The law schools, along with American universities in general, are also voicing their discontent with reliance on standardised tests for admission and their inclusion as ranking indicators. The rationale for this is that the tests supposedly discourage universities from offering aid according to need and favour those who can afford expensive test prep courses.

Sometimes this is expanded into the argument that, since there is a relationship between test scores and wealth, wealth is the only thing that tests measure, and so they cannot measure anything else that might be related to academic ability.

The problem here is that standardised tests do have a substantial relationship with intelligence, although not as much as they used to, which in turn has a strong relationship with academic and career success. Dropping the tests means that schools will have to rely on high school and college grades, which have been increasingly diluted over the last few decades, or on recommendations, interviews, and personal essays, which have little or no predictive validity and can be easily prepped or gamed.

It appears that American academia is retreating from its mission of producing highly intelligent and productive graduates and has embraced the goal of socialisation into the currently dominant ideology. Students will be admitted and graduated, and faculty recruited, according to their doctrinal conformity and their declared identity.

USN has gone some way to meeting the demands of the schools but that will probably not be enough. Already there are calls to have a completely new ranking system or to do away with rankings altogether.


 




Saturday, February 25, 2023

Global Trends in Innovation: Evidence from SCImago

We are now approaching the third decade of global university rankings. They have had a mixed impact. The original Shanghai rankings published in 2003 were a salutary shock for universities in continental Europe and contributed to a wave of restructuring and excellence initiatives. On the other hand, rankings with unstable and unreliable methodologies are of little use to anyone except for the public relations departments of wealthy Western universities. 

In contrast, the SCImago Institutions Rankings, published by a Spanish research organisation, with thousands of universities, hospitals, research institutes, companies and other organisations, can be quite informative, especially the Innovation and Societal Rankings.

The Innovation Rankings, which are based on three measures of patent citations and applications, included 4,019 organisations of various kinds in 2009. The top spot was held by the Centre National de la Recherche Scientifique in France, followed by Harvard, the National Institutes of Health in the USA, Stanford, and MIT.

Altogether the top 20 in 2009 consisted of ten universities (nine American plus the University of Tokyo) and ten non-university organisations: three American, two German, two French, two multinational, and the Chinese Academy of Sciences (CAS) in 14th place.

Fast forward to 2022 and we now have 8,084 institutions. First place now goes to CAS, followed by the State Grid Corporation of China, DeepMind Technologies, a British AI firm, the Chinese Ministry of Education, and Samsung Corp.

Now the top twenty includes exactly two universities, Tsinghua in 14th place and Harvard in 20th. The rest are companies, health organisations, and government agencies. The nationalities assigned by SCImago across the top twenty are: multinational eight, USA six, China four, and the UK and South Korea one each.

What about those high-flying US universities of 2009? Stanford has fallen from 4th place to 67th, the University of Michigan from 13th to 249th, and the University of Washington from 16th to 234th.

The relative -- and probably now absolute as well -- decline of American academic research has been well documented. It seems that the situation is even more dire for the innovative capability of US universities. But the technological torch is passing not only to Chinese universities and research centres but also to US and multinational corporations.



Saturday, February 04, 2023

Aggregate Ranking from BlueSky Thinking

 

In recent years there have been attempts to construct rankings that combine several global rankings. The University of New South Wales has produced an aggregate ranking based on the “Big Three” rankings, the Shanghai ARWU, Times Higher Education (THE), and QS. AppliedHE of Singapore has produced a ranking that combines these three plus Leiden Ranking and Webometrics.

The latest aggregate ranking is from BlueSky Thinking, a website devoted to research and insights in higher education. This aggregates the big three rankings plus the Best Global Universities published by US News.

There are some noticeable differences between the rankings. The University of California Berkeley is fourth in the US News rankings but 27th in QS. The National University of Singapore is 11th in QS but 71st in ARWU.

The top of the aggregate ranking is unremarkable. Harvard leads followed by Stanford, MIT, Cambridge, and Oxford.

There have been some significant changes over the last five years, with universities in Mainland China, Hong Kong, France, Germany, Saudi Arabia, and Australia recording significant improvements, while a number of US institutions, including Ohio State University, Boston University and the University of Minnesota, have fallen.

 

Tuesday, July 19, 2022

What's the Matter with Harvard?

When the first global ranking was published by Shanghai Jiao Tong University back in 2003, the top place was taken by Harvard. It was the same for the rankings that followed in 2004, Webometrics and the THES-QS World University Rankings. Indeed, at that time any international ranking that did not put Harvard at the top would have been regarded as faulty.

Is Harvard Declining?

But since then Harvard has been dethroned by a few rankings. Now MIT leads in the QS world rankings, while Oxford is first in THE's  and the Chinese Academy of Sciences in Nature Index. Recently Caltech deposed Harvard at the top of the Round University Rankings, now published in Georgia.

It is difficult to get excited about Oxford leading Harvard in the THE rankings. A table that purports to show Macau University of Science and Technology as the world's most international university, Asia University Taiwan as the most innovative, and An Najah National University as the best for research impact need not be taken too seriously.

Losing out to MIT in the QS world rankings probably does not mean very much either. Harvard is at a serious disadvantage here for international students and international faculty.

Harvard and Leiden Ranking

On the other hand, the performance of Harvard in CWTS Leiden Ranking, which is generally respected in the global research community,  might tell us that something is going on. Take a look at the total number of publications for the period 2017-20 (using the default settings and parameters). There we can see Harvard at the top with 35,050 publications followed by Zhejiang and Shanghai Jiao Tong Universities.

But it is rather different for publications in the broad subject fields. Harvard is still in the lead for Biomedical Sciences and for Social Sciences and Humanities. For Mathematics and Computer Science, however, the top twenty consists entirely of Mainland Chinese universities. The best non-Mainland institution is Nanyang Technological University in Singapore. Harvard is 128th.

You could ask whether this is just a matter of quantity rather than quality. So let's turn to another Leiden indicator, the percentage of publications in the top 10% of journals for Mathematics and Computer Science. Even here China is in the lead, although somewhat precariously: Changsha University of Science and Technology tops the table, and Harvard is in fifth place.

The pattern for Physical Sciences and Engineering is similar. The top 19 for publications are Chinese with the University of Tokyo in 20th place. However, for those in the top 10% Harvard still leads. It seems then that Harvard is still ahead for upmarket publications in physics and engineering but a growing and substantial amount of  research is done by China, a few other parts of Asia, and perhaps some American outposts of scientific excellence such as MIT and Caltech.

The Rise of China

The trend seems clear. China is heading towards industrial and scientific hegemony, and eventually Peking, Tsinghua, Fudan, Zhejiang, and a few others will, if nothing changes, surpass the Ivy League, the Group of Eight, and Oxbridge, although it will take longer in the more expensive and demanding fields of research. Perhaps the opportunity will be lost in the next few years if there is another proletarian cultural revolution in China or if Western universities change course.

What Happened to Harvard's Money?

It is standard to claim that the success or failure of universities depends on the amount of money they receive. The latest edition of the annual Nature Index tables was accompanied by headlines proclaiming that China's recent success in high-impact research was the result of a long-term investment program.

Money surely had a lot to do with it, but there needs to be a bit of caution here. The higher education establishment has a clear vested interest in getting as much money from the public purse as it can and is inclined to claim that any decline in the rankings is a result of hostility to higher education.

Tracing the causes of Harvard's decline, we should consult the latest edition of the Round University Rankings, which provides ranks for 20 indicators. In 2021 Harvard was first, but this year it was second, replaced by Caltech. So what happened? Looking more closely, we see that in 2021 Harvard was 2nd for financial sustainability and in 2022 it was 357th. That suggests a catastrophic financial collapse. So maybe there has been a financial disaster over at Harvard and the media simply have not noticed bankrupt professors jumping out of their offices, Nobel laureates hawking their medals, or mendicant students wandering the streets with tin cups.

Zooming in a bit, it seems that, if the data is accurate, there has been a terrible collapse in Harvard's financial fortunes. For institutional income per academic staff Harvard's rank has gone from 21st to 891st.

Exiting sarcasm mode for a moment: it is of course impossible that there has actually been such a catastrophic fall in income. I suspect that what we have here is something similar to what happened to Trinity College Dublin a few years ago, when someone forgot the last six zeros when filling out the form for the THE world rankings.

So let me borrow a flick knife from my good friend Occam and propose that what happened to Harvard in the Round University Rankings was simply that somebody left off the zeros at the end of the institutional income number when submitting data to Clarivate Analytics, who do the statistics for RUR. I expect next year the error will be corrected, perhaps without anybody admitting that anything was wrong.

So there was no substantial reason why Harvard lost ground to Caltech in the Round Rankings this year. Still, it does say something that such a mistake could occur and that nobody in the administration noticed or had the honesty to say anything. That is perhaps symptomatic of deeper problems within American academia. We can then expect the relative decline of Harvard and the rise of Chinese universities and a few others in Asia to continue.





Sunday, June 13, 2021

The Remarkable Revival of Oxford and Cambridge


There is nearly always a theme for the publication of global rankings. Often it is the rise of Asia, or parts of it. For a while it was the malign grasp of Brexit which was crushing the life out of British research or the resilience of American science in the face of the frenzied hostility of the great orange beast. This year it seems that the latest QS world rankings are about the triumph of Oxford and other elite UK institutions and their leapfrogging their US rivals. Around the world, quite a few other places are also showcasing their splendid achievements.

In the recent QS rankings Oxford has moved up from overall fifth to second place and Cambridge from seventh to third while University College London, Imperial College London, and Edinburgh have also advanced. No doubt we will soon hear that this is because of transformative leadership, the strength that diversity brings, working together as a team or a family, although I doubt whether any actual teachers or researchers will get a bonus or a promotion for their contributions to these achievements.

But was it leadership or team spirit that pushed Oxford and Cambridge into the top five? That is very improbable. Whenever there is a big fuss about universities rising or falling significantly in the rankings in a single year it is a safe bet that it is the result of an error, the correction of an error, or a methodological flaw or tweak of some kind.

Anyway, this year's Oxbridge advances had as much to do with leadership, internationalization, or reputation as goodness had with Mae West's diamonds. They were entirely due to a remarkable rise for both places in the score for citations per faculty: Oxford from 81.3 to 96 and Cambridge from 69.2 to 92.1. There was no such change for any of the other indicators.

Normally, there are three ways in which a university can rise in QS's citations indicator. One is to increase the number of publications while maintaining the citation rate. Another is to improve the citation rate while keeping output constant. The third is to reduce the number of faculty physically or statistically.

None of these seem to have happened at Oxford and Cambridge. The number of publications and citations has been increasing but not sufficiently to cause such a big jump. Nor does there appear to have been a drastic reduction of faculty in either place.

In any case, it seems that Oxbridge is not alone in its remarkable progress this year. For citations, ETH Zurich rose from 96.4 to 99.8, the University of Melbourne from 75 to 89.7, the National University of Singapore from 72.9 to 90.6, and Michigan from 58 to 70.5. At the top levels of these rankings nearly everybody is rising, except MIT, which retains the top score of 100, and it is noticeable that the nearer a university gets to the top, the smaller the increase.

It is theoretically possible that this might be the result of a collapse in the raw citations score of front runner MIT, which would raise everybody else's scores if it still remained at the top, but there is no evidence of either a massive collapse in citations or a massive expansion of research and teaching staff.

But then, as we go to the other end of the ranking, we find universities' citations scores falling: University College Cork from 23.4 to 21.8, Universitas Gadjah Mada from 1.7 to 1.5, UCSI University Malaysia from 4.4 to 3.6, and the American University in Cairo from 5.7 to 4.2.

It seems there is a bug in the QS methodology. The indicator scores that QS publishes are not raw data but standardized scores based on standard deviations from the mean: the mean score is set at fifty and the top score at one hundred. Over the last few years the number of ranked universities has been increasing, and the new ones tend to perform less well than the established ones, especially for citations. In consequence, the mean number of citations per faculty has declined, and universities scoring above the mean therefore see their standardized scores rise. If this interpretation is incorrect, I am very willing to be corrected.
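Here is a minimal sketch of that mechanism, assuming QS standardizes raw values so that the cohort mean maps to 50 and the top score to 100. This is my reconstruction of the published description, not QS's actual procedure, and all the numbers are invented.

```python
import statistics

def qs_style_scores(values):
    """Scale raw values so the mean maps to 50 and the maximum to 100
    (an assumed reconstruction of QS's standardization)."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    z = [(v - mean) / sd for v in values]
    top = max(z)
    return [round(50 + 50 * zi / top, 1) for zi in z]

# Citations per faculty for an established cohort (invented numbers).
cohort = [120, 90, 80, 60, 50, 40]
print(qs_style_scores(cohort))
# [100.0, 67.9, 57.1, 35.7, 25.0, 14.3]

# Add newly ranked universities with weak citation records: the mean drops,
# and the same six universities now receive inflated standardized scores.
expanded = cohort + [10, 8, 5, 5, 3]
print(qs_style_scores(expanded)[:6])
# [100.0, 80.6, 74.1, 61.1, 54.7, 48.2]
```

In this toy example nothing about the original six universities has changed, yet every score below the top rises simply because weaker newcomers dragged the mean down.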

This has an impact on the relative positions of Oxbridge and the leading American universities. Oxford and Cambridge rely on their scores in the academic and employer surveys and on international faculty and students to stay in the top ten. Compared to Harvard, Stanford, and MIT, they do not perform well for quantity or quality of research. So the general inflation of citations scores gives them more of a boost than it gives the US leaders, and their total scores rise.

It is likely that Oxford and Cambridge's moment of glory will be brief, since QS will have to do some recentering in the next couple of years to prevent citation indicator scores bunching up in the high nineties. The two universities will fall again, although that will probably not be attributed to a sudden collapse of leadership or a failure to work as a team.

It will be interesting to see if any of this year's rising universities will make an announcement that they don't really deserve any praise for their illusory success in the rankings.



Sunday, November 10, 2019

When will Tsinghua Overtake Harvard?

One of the most interesting trends in higher education over the last few years is the rise of China and the relative decline of the USA.

Winston Churchill said that the empires of the future will be empires of the mind. If that is so, then this century will very likely be the age of Chinese hegemony. Chinese science is advancing faster than that of the USA on all or nearly all fronts, unless we count things like critical race theory or queer studies.

This is something that should show up in the global rankings if we track them over at least a few years. So here is a comparison of the top universities in the two countries according to indicators of research output and research quality over a decade.

Unfortunately, most international rankings are not very helpful in this respect. Few of the current ones provide data for a decade or more. QS and THE have seen frequent changes in methodology, and THE's citation indicator, although charmingly amusing, is not useful unless you think that Aswan University, Anglia Ruskin University, and the University of Peradeniya are world beaters for research impact. Two helpful rankings here are the Shanghai Academic Ranking of World Universities (ARWU) and Leiden Ranking.

Let's compare the performance of Tsinghua University and Harvard on the Shanghai Ranking's indicator of research output: papers over a one-year period, excluding arts and humanities. The published scores are derived from the square roots of the raw data, with the top scorer getting a score of 100.

In 2009 Harvard's score was 100 while Tsinghua's was 55.8. In 2019 it was 100 for Harvard and 79.5 for Tsinghua. So the gap is closing by 2.37 points a year. At that rate it would take about nine years for Tsinghua to catch up, so look out for 2028.

Of course, this is quantity not quality, so take a look at another indicator, Highly Cited Researchers. This is a moderately gamable metric, and I suspect that Shanghai might have to abandon it one day, but it captures the willingness and ability of universities to sponsor research of a high quality. In 2009 Tsinghua's score was zero compared to Harvard's 100. In 2019 it was 37.4. If everything continues at the same rate, Tsinghua will overtake Harvard in another 17 years.

Looking at the default indicator in Leiden Ranking, total publications, Tsinghua was 35% of Harvard in 2007-10 and 56% in 2014-17. Working from that, Tsinghua would achieve parity in 2029-33, in the rankings published in 2035.

Looking at a measure of research quality, publications in the top 10% of journals, Tsinghua was 15% of Harvard in 2007-10 and 34% in 2014-17. From that, Tsinghua should reach parity in 2038-42, in the rankings published in 2044, assuming Leiden is still following its current methodology.

So it looks like Tsinghua will reach parity in research output in a decade or a decade and a half, and in high-quality research in a decade and a half to two and a half decades.
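The arithmetic behind these projections is simple linear extrapolation. Here is a minimal sketch using the ARWU figures quoted above, with Harvard's normalized score pinned at 100 as the top scorer:

```python
def parity_year(target, scores, year0, year1):
    """Linearly extrapolate the year a rising score catches a fixed target.
    scores = (score at year0, score at year1)."""
    rate = (scores[1] - scores[0]) / (year1 - year0)  # points gained per year
    gap = target - scores[1]                          # gap remaining at year1
    return year1 + gap / rate

# ARWU publications indicator: Tsinghua 55.8 (2009) -> 79.5 (2019)
print(round(parity_year(100, (55.8, 79.5), 2009, 2019)))  # 2028

# ARWU Highly Cited Researchers: Tsinghua 0 (2009) -> 37.4 (2019)
print(round(parity_year(100, (0.0, 37.4), 2009, 2019)))   # 2036, ~17 years on
```

The usual caveat applies: these are straight-line projections from two data points, and they assume that nothing about either university or either methodology changes.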






Tuesday, September 17, 2019

Going Up and Up Down Under: the Case of the University of Canberra

It is a fact almost universally ignored that when a university suddenly rises or falls many places in the global rankings the cause is not transformative leadership, inclusive excellence, team work, or strategic planning but nearly always a defect or a change in the rankers' methodology.

Let's take a look at the fortunes of the University of Canberra (UC), which the THE world rankings now place in the world's top 200 universities and Australia's top ten. This is a remarkable achievement, since the university did not appear in these rankings until 2015-16, when it was placed in the 500-600 band with very modest scores of 18.4 for teaching, 19.3 for research, 29.8 for citations (which is supposed to measure research impact), 36.2 for industry income, and 54.6 for international outlook.

Just four years later the indicator scores are 25.2 for teaching, 31.1 for research, 99.2 for citations, 38.6 for industry income, and 86.9 for international outlook.

The increase in the overall score over four years, calculated with different weightings for the indicators, was composed of 20.8 points for citations and 6.3 for the other four indicators combined. Without those 20.8 points Canberra would be in the 601-800 band.

I will look at where that massive citation score came from in a moment. 

It seems that the Australian media is reporting on this superficially impressive performance with little or no scepticism and without noting how different it is from the other global rankings. 

The university has issued a statement quoting vice-chancellor Professor Deep Saini as saying that the "result confirms the steady strengthening of the quality at the University of Canberra, thanks to the outstanding work of our research, teaching and professional staff" and that the "increase in citation impact is indicative of the quality of research undertaken at the university, coupled with a rapid growth in influence and reach, and has positioned the university as amongst the best in the world."

The Canberra Times reports that the vice-chancellor has said  that part of the improvement was the result of a talent acquisition campaign while noting that many faculty were complaining about pressure and excessive workloads.

Leigh Sullivan, DVC for research and innovation, has a piece in the Campus Morning Mail that hints at reservations about UC's apparent success, which is "a direct result of its Research Foundation Plan (2013-2017)" and "a strong emphasis on providing strategic support for research excellence in a few select research areas where UC has strong capability." He notes that even when the citation scores of research stars are excluded there has still been a significant increase in citations, and warns that what goes up can come down and that performance can be affected by changes in the ranking methodology.

The website riotact quotes the vice-chancellor on the improvement in research quality, as evidenced by the citation score, and as calling for more funding for universities: the "government has to really think and look hard at how well we support our universities. That's not to say it badly supports us, it's that the university sector deserves to be on the radar of our government as a major national asset."

The impressive ascent of UC is unique to THE. No serious ranking puts it in the top 200 or anywhere near. In the current Shanghai Rankings it is in the 601-700 band and has been falling for the last two years. In Webometrics it is 730th in the world and 947th for Excellence, that is, publications among the 10% most cited in 25 disciplines. In University Ranking by Academic Performance it is 899th, and in the CWUR Rankings it doesn't even make the top 1,000.

Round University Ranking and Leiden Ranking do not rank UC at all.

Apart from THE, UC does best in the QS rankings, where it is 484th in the world and 26th in Australia.

So how could UC perform so brilliantly in THE rankings when nobody else has recognised that brilliance? What does THE know that nobody else does? Actually, it does not perform brilliantly in the THE rankings, just in the citations indicator which is supposed to measure research influence or research impact.

This year UC has a score of 99.2 which puts it in the top twenty for citations just behind Nova Southeastern University in Florida and Cankaya University in Turkey and ahead of Harvard, Princeton and Oxford. The top university this year is Aswan University in Egypt replacing Babol Noshirvani University of Technology in Iran. 

No, THE is not copying the interesting methodology of the Fortunate 500. This is the result of an absurd methodology that THE is unable or unwilling for some reason to change.

THE has a self-inflicted problem with a small number of papers that have hundreds or thousands of "authors" and collect thousands of citations. Some of these are from the CERN project, and THE has dealt with them by using a modified form of fractional counting for papers with more than a thousand authors. That has removed the privilege of institutions that contribute to CERN projects but has replaced it with the privilege of those that contribute to the Global Burden of Disease Study (GBDS), whose papers tend to have hundreds but not thousands of contributors and sometimes receive over a thousand citations. As a result, places like Tokyo Metropolitan University, National Research University MEPhI, and Royal Holloway London have been replaced as citation superstars by St George's London, Brighton and Sussex Medical School, and Oregon Health and Science University.

It would be a simple matter to apply fractional counting to all papers, dividing the number of citations by the number of authors. After all, Leiden Ranking and Nature Index manage to do it, but THE for some reason has chosen not to follow.

The problem is compounded by counting self-citations, by hyper-normalisation so that the chances of hitting the jackpot with an unusually highly cited paper are increased, and by the country bonus that boosts the scores for universities by virtue of their location in low scoring countries. 

And so to UC's apparent success this year. This is entirely the result of its citation score, which in turn is entirely dependent on THE's methodology.

Between 2014 and 2018 UC had 3,825 articles in the Scopus database, of which 27 were linked to the GBDS, which is funded by the Bill and Melinda Gates Foundation. Those 27 articles, each with hundreds of contributors, have received 18,431 citations, all of which are credited in full to UC and to each of the other contributors. UC's total number of citations is 53,929, so those 27 articles accounted for over a third of its citations. Their impact might be even greater if they were cited disproportionately soon after publication.
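A minimal sketch of the difference fractional counting would make, using the UC figures above; the figure of roughly 700 authors per GBDS paper is my own assumption, chosen purely for illustration:

```python
# University of Canberra, Scopus, 2014-2018 (figures quoted above).
total_papers = 3825
total_citations = 53929
gbds_citations = 18431  # citations to the 27 GBDS-linked articles
other_citations = total_citations - gbds_citations

# THE-style full counting: every listed author's institution receives
# all of a paper's citations (for papers with under 1,000 authors).
full = total_citations / total_papers

# Fractional counting: each GBDS paper's citations are divided among its
# authors. 700 authors per paper is a hypothetical round number.
authors_per_gbds_paper = 700
fractional = (gbds_citations / authors_per_gbds_paper
              + other_citations) / total_papers

print(f"full counting:       {full:.1f} citations per paper")        # ~14.1
print(f"fractional counting: {fractional:.1f} citations per paper")  # ~9.3
```

Under full counting the 27 GBDS papers account for over a third of UC's citations; shared out among hundreds of authors, their contribution nearly vanishes.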

UC has of course improved its citation performance even without those articles but it is clear that they have made an outsize contribution. UC is not alone here. Many universities in the top 100 for citations in the THE world rankings owe their status to the GBDS: Anglia Ruskin, Reykjavik, Aswan, Indian Institute of Technology Ropar, the University of Peradeniya, Desarrollo, Pontifical Javeriana and so on.

There is absolutely nothing wrong with the GBDS nor with UC encouraging researchers to take part. The problem lies with THE and its reluctance to repair an indicator that produces serious distortions and is an embarrassment to those universities who apparently look to the THE rankings to validate their status.

Tuesday, August 13, 2019

University of the Philippines beats Oxford, Cambridge, Yale, Harvard, Tsinghua, Peking etc etc

Rankings can do some good sometimes. They can also do a lot of harm and that harm is multiplied when they are sliced more and more thinly to produce rankings by age, by size, by mission, by region, by indicator, by subject. When this happens minor defects in the overall rankings are amplified.

That would not be so bad if universities, political leaders and the media were to treat the tables and the graphs with a healthy scepticism. Unfortunately, they treat the rankings, especially THE, with obsequious deference as long as they are provided with occasional bits of publicity fodder.

Recently, the Philippines media have proclaimed that the University of the Philippines (UP) has beaten Harvard, Oxford and Stanford for health research citations. It was seventh in the THE Clinical, Pre-clinical and Health category behind Tokyo Metropolitan University, Auckland University of Technology, Metropolitan  Autonomous University Mexico, Jordan University of Science and Technology, University of Canberra and Anglia Ruskin University.

The Inquirer is very helpful and provides an explanation from the Philippine Council for Health Research and Development that citation scores “indicate the number of times a research has been cited in other research outputs” and that the score "serves as an indicator of the impact or influence of a research project which other researchers use as reference from which they can build on succeeding breakthroughs or innovations.” 

Fair enough, but how can UP, which has a miserable score of 13.4 for research in the same subject ranking, have such a massive research influence? How can it have an extremely low output of papers, a poor reputation for research, and very little funding and still be a world beater for research impact?

It is in fact nothing to do with UP, nothing to do with everyone working as a team, decisive leadership or recruiting international talent.

It is the result of a bizarre and ludicrous methodology. First, THE does not use fractional counting for papers with fewer than a thousand authors. UP, along with many other universities, has taken part in the Global Burden of Disease project funded by the Bill and Melinda Gates Foundation. This has produced a succession of papers, many of them in the Lancet, with hundreds of contributing institutions and researchers, whose names are all listed as authors, and hundreds or thousands of citations. As long as the number of authors does not reach 1,000, each author is counted as though he or she were the recipient of all the citations. So UP gets the credit for a massive number of citations, which is divided by a relatively small number of papers.

Why not just use fractional counting, dividing the citations among the contributors or the institutions, as Leiden Ranking does? Probably because it might add a little to costs, and perhaps because THE doesn't like to admit it made a mistake.

Then we have the country bonus or regional modification, applied to half the indicator, which increases the score for universities in countries with low impact.

The result of all this is that UP, surrounded by low-scoring universities and not producing very much research but with a role in a citation-rich mega-project, gets a score for this indicator that puts it ahead of the Ivy League, the Group of Eight, and the leading universities of East Asia.

If nobody took this seriously, then no great harm would be done. Unfortunately it seems that large numbers of academics, bureaucrats and journalists do take the THE rankings very seriously or pretend to do so in public. 

And so committee addicts get bonuses and promotions, talented researchers spend their days in unending ranking-inspired transformational seminars, funds go to the mediocre and the sub-mediocre, students and stakeholders base their careers on misleading data, and the problems of higher education are covered up or ignored.



Thursday, July 04, 2019

Comparing National Rankings: USA and China


America's Best Colleges
The US News America's Best Colleges (ABC) is very much the Grand Old Man of university rankings. Its chief data analyst has been described as the most powerful man in America although that is perhaps a bit exaggerated. These rankings have had a major role in defining excellence in American higher education and they may have contributed to US intellectual and scientific dominance in the last two decades of the twentieth century.

But they are changing. This year's edition has introduced two new measures of "social mobility", namely the number of  Pell Grant (low income) students and the comparative performance of those students. There is suspicion that this is an attempt to reward universities for the recruitment and graduation of certain favoured groups, including African Americans and Hispanics, and perhaps recent immigrants from the Global South. Income is used as a proxy for race since current affirmative action policies at Harvard and other places are under legal attack. 

It should be noted that success is defined as graduation within a six-year period, and that is something that can easily be achieved by extra tuition, lots of collaborative projects, credit for classroom discussions and for effort and persistence, holding instructors responsible for student failure, innovative methods of assessment, contextualised grading, and so on.

The new ABC has given the Pell Grant metrics a 5% weighting and has also increased the weighting for graduation rate performance, which compares actual student outcomes with those predicted from students' social and academic attributes, from 7.5% to 8%. So now a total of 13% in effect goes to social engineering. A good chunk of the rankings, then, is based on the dubious proposition that universities can and should reduce or eliminate the achievement gap between various groups.

To make room for these metrics, the acceptance rate indicator has been eliminated, and the weightings for standardised test scores, high school rank, counsellor reviews, and the six-year graduation rate have been reduced.

Getting rid of the acceptance rate metric is probably not a bad idea since it had the unfortunate effect of encouraging universities to maximise the number of rejected applications, which produced income for the universities but imposed a financial burden on applicants.

The rankings now assign nearly a one-third weighting to student quality: 22% to graduation and retention rates and 10% to standardised tests and high school rank.

It seems that US News is moving from ranking universities by the academic ability of their students to ranking them by the number and relative success of low-income and "minority" students.

The latest ranking shows the effect of these changes. The very top is little changed but further down there are significant shifts. William and Mary is down. Howard University, a predominantly African American institution, is up as are the campuses of the University of California system.

ABC also has another 30% for resources (faculty 20% and financial 10%), 20% for reputation (15% peer and 5% high school counsellors), and 5% for alumni donations.

Shanghai Best Chinese University Rankings

The Shanghai Best Chinese University Ranking (BCUR) is a recent initiative, although ShanghaiRanking has been doing global rankings since 2003. It is quite different from the US News rankings.

For student outcomes, Shanghai assigns a weighting of 10% to graduate employment and does not bother with graduation rates. As noted, ABC gives 22% for student outcomes (six-year graduation rate and first-year retention rate).


Shanghai gives a 30% weighting for the dreaded Gaokao, the national university entrance exam, compared to 10% for high school class rank and SAT/ACT scores in ABC.

With regard to inputs, Shanghai allocates just 5% for alumni donations, compared to 30% in the ABC for  class size, faculty salary, faculty highest degrees, student faculty ratio, full time faculty and financial resources. 

That 5% is the only thing in Shanghai that might  be relevant to reputation while ABC has a full 20% for reputation among peers and counsellors. 

Shanghai also has a 40% allocation for research; 10% for "social service", which comprises research income from industry and income from technology transfer; and 5% for international students. ABC has no equivalent to these, although it publishes separate rankings of postgraduate programmes.

To compare the two: ABC is heavy on inputs, student graduation and retention, reputation, and social engineering. Probably the last will become more important over the next few years.
BCUR, in contrast, emphasises student ability as measured by a famously rigorous entrance exam, student employment, research, links with industry, and internationalisation.
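To make the contrast concrete, here is a hedged sketch applying the two weighting schemes, as quoted in this post, to the same imaginary institution. All the indicator scores are invented, and both real rankings normalise their data in ways not shown here.

```python
# A sketch of the contrast between the two weighting schemes described above,
# applied to the same imaginary institution. The weights follow the
# percentages quoted in this post; all indicator scores (0-100) are invented.

abc_weights = {
    "student_outcomes": 0.22, "social_mobility": 0.13, "student_inputs": 0.10,
    "resources": 0.30, "reputation": 0.20, "alumni_donations": 0.05,
}
bcur_weights = {
    "gaokao_scores": 0.30, "graduate_employment": 0.10, "research": 0.40,
    "social_service": 0.10, "international_students": 0.05, "alumni_donations": 0.05,
}

# An imaginary research-heavy university with middling graduation rates.
profile = {
    "student_outcomes": 60, "social_mobility": 40, "student_inputs": 90,
    "resources": 70, "reputation": 65, "alumni_donations": 30,
    "gaokao_scores": 90, "graduate_employment": 80, "research": 85,
    "social_service": 75, "international_students": 50,
}

for name, weights in (("ABC-style", abc_weights), ("BCUR-style", bcur_weights)):
    score = sum(w * profile[k] for k, w in weights.items())
    print(f"{name} composite: {score:.1f}")
```

With these invented scores the same institution comes out at about 63 under the ABC-style weights but over 80 under the BCUR-style weights, which illustrates how differently the two systems define excellence.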

It seems that in the coming years excellence in higher education will be defined very differently. An elite US university will be one that is well endowed with money and human resources, makes sure that most of its students graduate one way or another, ensures that the ethnic and gender composition of the faculty and student body matches that of America or the world, and has a good reputation among peers and the media.

An elite Chinese university will be one that produces employed and employable graduates, admits students with high levels of academic skills, has close ties with industry, and has a faculty that produces a high volume of excellent research.


Saturday, April 13, 2019

Where is the real educational capital of the world?

Here is another example of how rankings, especially those produced by Times Higher Education (THE), are used to mislead the public.

The London Post has announced that London is the Higher Educational Capital of the World for 2019. Support for this claim comes from four London universities appearing in the top 40 of the THE World University Rankings, a result which, unsurprisingly, has been welcomed by London Mayor Sadiq Khan.

In addition, THE has Oxford and Cambridge as first and second in the world in their overall rankings and QS has declared London to be the Best Student City.

THE is not the only global ranking. There are now several others, and none of them has Oxford in first place. Most of them give the top spot to Harvard, although in the QS world rankings it is MIT and in the GreenMetric rankings it is Wageningen.

Also, if we look at the number of universities in the top 50 of the Shanghai rankings, we cannot see London as the undisputed HE capital of the world. By this simple criterion the title would go to New York, with three: Columbia, New York University, and Rockefeller.

Then come Boston, Paris, Chicago and London with two each.



Thursday, April 04, 2019

What to do to get into the rankings?

I have been asked this question quite a few times. So finally here is an attempt to answer it.

If you represent a university that is not listed in any rankings, except uniRank and Webometrics, but you want to be, what should you do?

Where are you now?
The first thing to do is to find out where you are in the global hierarchy of universities. 

Here the Webometrics rankings are very helpful. These are now a mixture of web activity and research indicators and provide a rank for over 28,000 universities or places that might be considered universities, colleges, or academies of some sort. 

If you are ranked in the bottom half of Webometrics then frankly it would be better to concentrate on not going bankrupt and getting or staying accredited.

But if you are in the top 10,000 or so then you might  be able to think about getting somewhere in some sort of ranking.

Where do you want to be?
Nearly everybody in higher education who is not hibernating has heard of the Times Higher Education (THE) world and regional rankings. Some also know about the Quacquarelli Symonds (QS) or the Shanghai rankings. But there are now many more rankings that are just as good as, or in some cases better than, the "big three".
 
According to the IREG inventory published last year there are now at least 45 international university rankings including business school, subject, system and regional rankings, of which 17 are global rankings, and there will be more to come. This inventory provides links and some basic preliminary information about all the rankings but it already needs updating.

The methodology and public visibility of the global rankings varies enormously. So, first you have to decide what sort of university you are and what you want to be. You also need to think about exactly what you want from a ranking, whether it is fuel for the publicity machine or an accurate and valid assessment of research performance.  

If you want to be a small, high-quality, research-led institution with lavish public and private funding, something like Caltech, then the THE world rankings would probably be appropriate. They measure income in three different ways, no matter how wastefully it is spent, and most of the indicators are scaled according to the number of staff or students. They also have a citations indicator that favours research-intensive institutions like Stanford or MIT, along with some improbable places like Babol Noshirvani University of Technology, Brighton and Sussex Medical School, or Reykjavik University.

If, however, your goal is to be a large comprehensive research and teaching university, then the QS or the Russia-based Round University Rankings might be a better choice. The latter includes all but one of the THE metrics, plus another eight, all with sensible weightings.

If you are a research postgraduate-only university then you would not be eligible for the overall rankings produced by QS or THE but you could be included in the Shanghai Rankings.

Data Submission

Most rankings rely on publicly accessible information. However, these global rankings include information submitted by the ranked institutions: the QS world rankings, THE world rankings, Round University Rankings, US News Best Global Universities, U-Multirank, and UI GreenMetric. Collecting, verifying, and submitting data can be a very tiresome task, so it would be well to consider whether sufficient informed and conscientious staff are available. U-Multirank is especially demanding in the amount and quality of data required.

List of Global Rankings
Here is the list of the 17 global rankings included in the IREG inventory with comments about the kind of university that is likely to do well in them. 

CWTS Leiden Ranking
This is a research-only ranking by a group of bibliometric experts at Leiden University. There are several indicators, starting with the total number of publications, headed by Harvard followed by the University of Toronto, and ending with the percentage of publications in the top 1% of journals, headed by Rockefeller University.

CWUR World University Rankings
Now produced out of the UAE, this is an unusual and not well-known ranking that attempts to measure alumni employment and the quality of education and faculty. At the top it generally resembles more conventional rankings.

Emerging/Trendence Global University Employability Rankings
Published in but not produced by THE, these are based on a survey of employers in selected countries and rank only 150 universities.

Nature Index
A research ranking based on a very select group of journals. It also includes non-university institutions; the current leader is the Chinese Academy of Sciences. This ranking is relevant only for those universities aiming for the very top levels of research in the natural sciences.

National Taiwan University Rankings 
A research ranking based on current publications and citations and on those over an eleven-year period. It favours big universities, with the current top ten including the University of Toronto and the University of Michigan.

QS World University Rankings
If you are confident of building a local reputation then this is the ranking for you. There is a 40% weighting for academic reputation and 10% for employer reputation. Southeast Asian universities often do well in this ranking.

Webometrics
This now has two measures of web activity, one of citations and one of publications. It measures quantity rather than quality so there is a chance here for mass market institutions to excel. 

Reuters Top 100 Innovative Universities
This is definitely for the world's technological elite.

Round University Rankings
These rankings combine survey and institutional data from Clarivate's Global Institutional Profiles Project with bibliometric data from the Web of Science Core Collection. They are the most balanced and comprehensive of the general world rankings, although hardly known outside Russia.

Scimago Institution Rankings
These combine indicators of research, innovation (measured by patents), and web activity. They tend to favour larger universities that are strong in technology.

Shanghai Academic Ranking of World Universities (ARWU)
These are the oldest of the global rankings with a simple and stable methodology. They are definitely biased towards large, rich, old research universities with strengths in the natural sciences and a long history of scientific achievement.

THE World University Rankings
The most famous of the international rankings, they claim to be sophisticated, rigorous, trusted, and so on, but are biased towards UK universities. The citations indicator is hopelessly and amusingly flawed. There are a number of spin-offs that might be of interest to non-elite universities, such as regional, reputation, young university and, now, global impact rankings.

U-Multirank
Contains masses of information about things that other rankings neglect but would be helpful mainly to universities looking for students from Europe.

UI GreenMetric Ranking 
This is published by Universitas Indonesia and measures universities' contribution to environmental sustainability. Includes a lot of Southeast Asian universities but not many from North America. Useful for eco-conscious universities.

uniRank University Ranking
This is based on web popularity derived from several sources. In many parts of Africa it serves as a measure of general quality.

University Ranking by Academic Performance
A research ranking produced by the Middle East Technical University in Ankara that ranks 2,500 universities. It is little known outside Turkey but I noticed recently that it was used in a presentation at a conference in Malaysia.

US News Best Global Universities
Sometimes counted as one of the big four but hardly ever the big three, this is a balanced research ranking that includes 1,250 universities. For American universities it is a useful complement to US News' America's Best Colleges.

You will have to decide whether to take a short-term approach to rankings (recruiting staff from the Highly Cited Researchers list, admitting international students regardless of ability, sending papers to marginal journals and conferences, and signing up for citation-rich mega-projects) or to concentrate on the underlying attributes of an excellent university: admitting students and appointing and promoting faculty on the basis of their cognitive skills and academic ability, encouraging genuine and productive collaboration, and nurturing local talent.

The first may produce quick results or nice bonuses for administrators but it can leave universities at the mercy of the methodological tweaking of the rankers, as Turkish universities found out in 2015.

The latter will take years or decades to make a difference and unfortunately that may be too long for journalists and policy makers.