
Saturday, February 04, 2023

Aggregate Ranking from BlueSky Thinking

 

In recent years there have been attempts to construct rankings that combine several global rankings. The University of New South Wales has produced an aggregate ranking based on the “Big Three” rankings, the Shanghai ARWU, Times Higher Education (THE), and QS. AppliedHE of Singapore has produced a ranking that combines these three plus Leiden Ranking and Webometrics.

The latest aggregate ranking is from BlueSky Thinking, a website devoted to research and insights in higher education. This aggregates the big three rankings plus the Best Global Universities published by US News.

There are some noticeable differences between the rankings. The University of California Berkeley is fourth in the US News rankings but 27th in QS. The National University of Singapore is 11th in QS but 71st in ARWU.

The top of the aggregate ranking is unremarkable. Harvard leads followed by Stanford, MIT, Cambridge, and Oxford.

There have been some significant changes over the last five years, with universities in Mainland China, Hong Kong, France, Germany, Saudi Arabia, and Australia recording significant improvements, while a number of US institutions, including Ohio State University, Boston University and the University of Minnesota, have fallen.

 

Tuesday, July 19, 2022

What's the Matter with Harvard?

When the first global ranking was published by Shanghai Jiao Tong University back in 2003, the top place was taken by Harvard. It was the same for the rankings that followed in 2004, Webometrics and the THES-QS World University Rankings. Indeed, at that time any international ranking that did not put Harvard at the top would have been regarded as faulty.

Is Harvard Declining?

But since then Harvard has been dethroned by a few rankings. Now MIT leads in the QS world rankings, while Oxford is first in THE's and the Chinese Academy of Sciences leads the Nature Index. Recently Caltech deposed Harvard at the top of the Round University Rankings, now published in Georgia.

It is difficult to get excited about Oxford leading Harvard in the THE rankings. A table that purports to show Macau University of Science and Technology as the world's most international university, Asia University Taiwan as the most innovative, and An Najah National University as the best for research impact need not be taken too seriously.

Losing out to MIT in the QS world rankings probably does not mean very much either. Harvard is at a serious disadvantage here for international students and international faculty.

Harvard and Leiden Ranking

On the other hand, the performance of Harvard in CWTS Leiden Ranking, which is generally respected in the global research community,  might tell us that something is going on. Take a look at the total number of publications for the period 2017-20 (using the default settings and parameters). There we can see Harvard at the top with 35,050 publications followed by Zhejiang and Shanghai Jiao Tong Universities.

But it is rather different for publications in the broad subject fields. Harvard is still in the lead for Biomedical Sciences and for Social Sciences and Humanities. For Mathematics and Computer Science, however, the top twenty consists entirely of Mainland Chinese universities. The best non-Mainland institution is Nanyang Technological University in Singapore. Harvard is 128th.

You could argue that this is just a matter of quantity rather than quality. So, let's turn to another Leiden indicator, the percentage of publications in the top 10% most cited for Mathematics and Computer Science. Even here China is in the lead, although somewhat precariously: Changsha University of Science and Technology tops the table and Harvard is in fifth place.

The pattern for Physical Sciences and Engineering is similar. The top 19 for publications are Chinese with the University of Tokyo in 20th place. However, for those in the top 10% Harvard still leads. It seems then that Harvard is still ahead for upmarket publications in physics and engineering but a growing and substantial amount of  research is done by China, a few other parts of Asia, and perhaps some American outposts of scientific excellence such as MIT and Caltech.

The Rise of China

The trend seems clear. China is heading towards industrial and scientific hegemony, and eventually Peking, Tsinghua, Fudan, Zhejiang and a few others will, if nothing changes, surpass the Ivy League, the Group of Eight, and Oxbridge, although it will take longer for the more expensive and demanding fields of research. Perhaps the opportunity will be lost in the next few years if there is another proletarian cultural revolution in China or if Western universities change course.

What Happened to Harvard's Money?

It is standard to claim that the success or failure of universities is dependent on the amount of money they receive. The latest edition of the annual Nature Index tables was accompanied by headlines proclaiming that China's recent success in high-impact research was the result of a long-term investment program.

Money surely had a lot to do with it, but there needs to be a bit of caution here. The higher education establishment has a clear vested interest in getting as much money from the public purse as it can and is inclined to claim that any decline in the rankings is the result of hostility to higher education.

Tracing the causes of Harvard's decline, we should consult the latest edition of the Round University Rankings, which provides ranks for 20 indicators. In 2021 Harvard was first but this year it was second, replaced by Caltech. So what happened? Looking more closely we see that in 2021 Harvard was 2nd for financial sustainability and in 2022 it was 357th. That suggests a catastrophic financial collapse. So maybe there has been a financial disaster over at Harvard and the media simply have not noticed bankrupt professors jumping out of their offices, Nobel laureates hawking their medals, or mendicant students wandering the streets with tin cups.

Zooming in a bit, it seems that, if the data is accurate, there has been a terrible collapse in Harvard's financial fortunes. For institutional income per academic staff Harvard's rank has gone from 21st to 891st.

Exiting sarcasm mode for a moment, it is of course impossible that there has actually been such a catastrophic fall in income. I suspect that what we have here is something similar to what happened  to Trinity College Dublin  a few years ago when someone forgot the last six zeros when filling out the form for the THE world rankings.

So let me borrow a flick knife from my good friend Occam and propose that what happened to Harvard in the Round University Rankings was simply that somebody left off the zeros at the end of the institutional income number when submitting data to Clarivate Analytics, who do the statistics for RUR. I expect next year the error will be corrected, perhaps without anybody admitting that anything was wrong.

So, there was no substantial reason why Harvard lost ground to Caltech in the Round Rankings this year. Still it does say something that such a mistake could occur and that nobody in the administration noticed or had the honesty to say anything. That is perhaps symptomatic of deeper problems within American academia. We can then expect the relative decline of Harvard and the rise of Chinese universities and a few others in Asia to continue.

Sunday, June 13, 2021

The Remarkable Revival of Oxford and Cambridge


There is nearly always a theme for the publication of global rankings. Often it is the rise of Asia, or parts of it. For a while it was the malign grasp of Brexit which was crushing the life out of British research or the resilience of American science in the face of the frenzied hostility of the great orange beast. This year it seems that the latest QS world rankings are about the triumph of Oxford and other elite UK institutions and their leapfrogging their US rivals. Around the world, quite a few other places are also showcasing their splendid achievements.

In the recent QS rankings Oxford has moved up from overall fifth to second place and Cambridge from seventh to third while University College London, Imperial College London, and Edinburgh have also advanced. No doubt we will soon hear that this is because of transformative leadership, the strength that diversity brings, working together as a team or a family, although I doubt whether any actual teachers or researchers will get a bonus or a promotion for their contributions to these achievements.

But was it leadership or team spirit that pushed Oxford and Cambridge into the top five? That is very improbable. Whenever there is a big fuss about universities rising or falling significantly in the rankings in a single year it is a safe bet that it is the result of an error, the correction of an error, or a methodological flaw or tweak of some kind.

Anyway, this year's Oxbridge advances had as much to do with leadership,  internationalization, or reputation as goodness had with Mae West's diamonds. It was entirely due to a remarkable rise for both places in the score for citations per faculty, Oxford from 81.3 to 96, and Cambridge from 69.2 to 92.1. There was no such change for any of the other indicators.

Normally, there are three ways in which a university can rise in QS's citations indicator. One is to increase the number of publications while maintaining the citation rate. Another is to improve the citation rate while keeping output constant. The third is to reduce the number of faculty physically or statistically.
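As a back-of-the-envelope illustration (all the numbers here are invented, and QS's actual indicator involves further scaling), each of the three routes works on the same underlying ratio:

```python
def citations_per_faculty(citations, faculty):
    """The raw ratio behind QS's citations indicator, before any standardization."""
    return citations / faculty

base = citations_per_faculty(50_000, 2_000)          # 25.0
more_output = citations_per_faculty(60_000, 2_000)   # routes 1 and 2: more cited work
fewer_staff = citations_per_faculty(50_000, 1_600)   # route 3: fewer reported faculty
```

Routes one and two both raise the numerator (more papers at a constant citation rate, or better-cited papers at constant output), while route three shrinks the denominator; all three push the ratio up.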

None of these seem to have happened at Oxford and Cambridge. The number of publications and citations has been increasing but not sufficiently to cause such a big jump. Nor does there appear to have been a drastic reduction of faculty in either place.

In any case it seems that Oxbridge is not alone in its remarkable progress this year. For citations, ETH Zurich rose from 96.4 to 99.8, the University of Melbourne from 75 to 89.7, the National University of Singapore from 72.9 to 90.6, and Michigan from 58 to 70.5. At the top levels of these rankings nearly everybody is rising except MIT, which already has the maximum score of 100, and it is noticeable that the increases get smaller as we approach the top.

It is theoretically possible that this might be the result of a collapse in the raw citation counts of front runner MIT, which would raise everybody else's scores if it still remained at the top, but there is no evidence of either a massive collapse in citations or a massive expansion of research and teaching staff.

But then as we go to the other end of the ranking we find universities' citations scores falling: University College Cork from 23.4 to 21.8, Universitas Gadjah Mada from 1.7 to 1.5, UCSI University Malaysia from 4.4 to 3.6, and the American University in Cairo from 5.7 to 4.2.

It seems there is a bug in the QS methodology. The indicator scores that QS publishes are not raw data but standardized scores based on standard deviations from the mean: the mean score is set at fifty and the top score at one hundred. Over the last few years the number of ranked universities has been increasing, and the new ones tend to perform less well than the established ones, especially for citations. In consequence, the mean number of citations per faculty has declined, and universities scoring above the mean therefore see their standardized scores rise. If this interpretation is incorrect I'm very willing to be corrected.
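A toy sketch of that mechanism (the raw scores are invented, and the scaling here is a plain z-score capped at 100, not QS's exact procedure): adding weak newcomers drags the cohort mean down, which lifts the standardized scores of everyone above it, without anyone's raw performance changing.

```python
import statistics

def standardize(raw, centre=50, spread=10, cap=100):
    """Map raw scores to points based on standard deviations from the
    cohort mean, with the mean anchored at `centre` and a ceiling of `cap`."""
    mean = statistics.mean(raw)
    sd = statistics.pstdev(raw)
    return [min(cap, centre + spread * (x - mean) / sd) for x in raw]

old_cohort = [90, 70, 50, 30]            # established universities
new_cohort = old_cohort + [10, 10, 10]   # plus low-scoring new entrants

# The university with raw score 70 gains points without improving at all.
before = standardize(old_cohort)[1]
after = standardize(new_cohort)[1]
```

The second-placed university's standardized score rises from roughly 54 to roughly 60 purely because weaker institutions joined the table.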

This has an impact on the relative positions of Oxbridge and the leading American universities. Oxford and Cambridge rely on their scores in the academic and employer surveys and on international faculty and students to stay in the top ten. Compared to Harvard, Stanford and MIT they do not perform well for quantity or quality of research. So the general inflation of citations scores gives them more of a boost than the US leaders, and their total scores rise.

It is likely that Oxford and Cambridge's moment of glory will be brief, since QS will have to do some recentering in the next couple of years to prevent citation indicator scores bunching up in the high nineties. The two universities will fall again, although that will probably not be attributed to a sudden collapse of leadership or a failure to work as a team.

It will be interesting to see if any of this year's rising universities will make an announcement that they don't really deserve any praise for their illusory success in the rankings.



Saturday, April 13, 2019

Where is the real educational capital of the world?

Here is another example of how rankings, especially those produced by Times Higher Education (THE), are used to mislead the public.

The London Post has announced that London is the Higher Educational Capital of the World for 2019. Support for this claim is provided by four London universities appearing in the top 40 of the THE World University Rankings which, unsurprisingly, have been welcomed by London Mayor Sadiq Khan.

In addition, THE has Oxford and Cambridge as first and second in the world in their overall rankings and QS has declared London to be the Best Student City.

THE is not the only global ranking. There are now several others and none of them have Oxford in first place. Most of them give the top spot to Harvard, although in the QS world rankings it is MIT and in the GreenMetric rankings Wageningen.

Also, if we look at the number of universities in the top 50 of the Shanghai rankings we cannot see London as the undisputed HE capital of the world. Using this simple criterion it would be New York with three, Columbia, New York University and Rockefeller.

Then come Boston, Paris, Chicago and London with two each.



Thursday, April 04, 2019

What to do to get into the rankings?

I have been asked this question quite a few times. So finally here is an attempt to answer it.

If you represent a university that is not listed in any rankings, except uniRank and Webometrics, but you want to be, what should you do?

Where are you now?
The first thing to do is to find out where you are in the global hierarchy of universities. 

Here the Webometrics rankings are very helpful. These are now a mixture of web activity and research indicators and provide a rank for over 28,000 universities or places that might be considered universities, colleges, or academies of some sort. 

If you are ranked in the bottom half of Webometrics then frankly it would be better to concentrate on not going bankrupt and getting or staying accredited.

But if you are in the top 10,000 or so then you might  be able to think about getting somewhere in some sort of ranking.

Where do you want to be?
Nearly everybody in higher education who is not hibernating has heard of the Times Higher Education (THE) world and regional rankings. Some also know about the Quacquarelli Symonds (QS) or the Shanghai rankings. But there are now many more rankings that are just as good as, or in some cases better than, the "big three".
 
According to the IREG inventory published last year there are now at least 45 international university rankings including business school, subject, system and regional rankings, of which 17 are global rankings, and there will be more to come. This inventory provides links and some basic preliminary information about all the rankings but it already needs updating.

The methodology and public visibility of the global rankings varies enormously. So, first you have to decide what sort of university you are and what you want to be. You also need to think about exactly what you want from a ranking, whether it is fuel for the publicity machine or an accurate and valid assessment of research performance.  

If you want to be a small high quality research led institution with lavish public and private funding, something like Caltech, then the THE world rankings would probably be appropriate. They measure income three different ways, no matter how wastefully it is spent, and most of the indicators are scaled according to number of staff or students. They also have a citations indicator which favours research intensive institutions like Stanford or MIT along with some improbable places like Babol Noshirvani University of Technology, Brighton and Sussex Medical School or Reykjavik University.

If, however, your goal is to be a large comprehensive research and teaching university then the QS or the Russia-based Round University Rankings might be a better choice. The latter has all the metrics of the THE rankings except one plus another eight, all with sensible weightings.

If you are a research postgraduate-only university then you would not be eligible for the overall rankings produced by QS or THE but you could be included in the Shanghai Rankings.

Data Submission

Most rankings rely on publicly accessible information. However, these global rankings include information submitted by the ranked institutions: QS world rankings, THE world rankings, Round University Rankings, US News Best Global Universities, U-Multirank, and UI GreenMetric. Collecting, verifying and submitting data can be a very tiresome task, so it would be well to consider whether there are sufficient informed and conscientious staff available. U-Multirank is especially demanding in the amount and quality of data required.

List of Global Rankings
Here is the list of the 17 global rankings included in the IREG inventory with comments about the kind of university that is likely to do well in them. 

CWTS Leiden Ranking
This is a research-only ranking by a group of bibliometric experts at Leiden University. There are several indicators, starting with the total number of publications, headed by Harvard followed by the University of Toronto, and ending with the percentage of publications in the top 1% most frequently cited, headed by Rockefeller University.

CWUR World University Rankings
Now produced out of the UAE, this is an unusual and not well-known ranking that attempts to measure alumni employment and the quality of education and faculty. At the top it generally resembles more conventional rankings.

Emerging/Trendence Global University Employability Rankings
Published in but not produced by THE, these are based on a survey of employers in selected countries and rank only 150 universities.

Nature Index
A research ranking based on a very select group of journals. It also includes non-university institutions. The current leader is the Chinese Academy of Sciences. This ranking is relevant only for those universities aiming for the very top levels of research in the natural sciences.

National Taiwan University Rankings 
A research ranking of current publications and citations and those over a period of eleven years. It favours big universities with the  current top ten including the University of Toronto and the University of Michigan.

QS World University Rankings
If you are confident of building a local reputation then this is the ranking for you. There is a 40% weighting for academic reputation and 10% for employer reputation. Southeast Asian universities often do well in this ranking.

Webometrics
This now has two measures of web activity, one of citations and one of publications. It measures quantity rather than quality so there is a chance here for mass market institutions to excel. 

Reuters Top 100 Innovative Universities
This is definitely for the world's technological elite.

Round University Rankings
These rankings combine survey and institutional data from Clarivate's Global Institutional Profiles Project with bibliometric data from the Web of Science Core Collection. They are the most balanced and comprehensive of the general world rankings, although hardly known outside Russia.

Scimago Institution Rankings
These combine indicators of research, innovation measured by patents and web activity. They tend to favour larger universities that are strong in technology.

Shanghai Academic Ranking of World Universities (ARWU)
These are the oldest of the global rankings with a simple and stable methodology. They are definitely biased towards large, rich, old research universities with strengths in the natural sciences and a long history of scientific achievement.

THE World University Rankings
The most famous of the international rankings, they claim to be sophisticated, rigorous, trusted etc but are biased towards UK universities. The citations indicator is hopelessly and amusingly flawed. There are a number of spin-offs that might be of interest to non-elite universities such as regional, reputation, young universities and, now, global impact rankings.

U-Multirank
Contains masses of information about things that other rankings neglect but would be helpful mainly to universities looking for students from Europe.

UI GreenMetric Ranking 
This is published by Universitas Indonesia and measures universities' contribution to environmental sustainability. Includes a lot of Southeast Asian universities but not many from North America. Useful for eco-conscious universities.

uniRank University Ranking
This is based on web popularity derived from several sources. In many parts of Africa it serves as a measure of general quality.

University Ranking by Academic Performance
A research ranking produced by the Middle East Technical University in Ankara that ranks 2,500 universities. It is little known outside Turkey but I noticed recently that it was used in a presentation at a conference in Malaysia.

US News Best Global Universities
Sometimes counted as one of the big four but hardly ever the big three, this is a balanced research ranking that includes 1,250 universities. For American universities it is a useful complement to US News' America's Best Colleges.

You will have to decide whether to take a short-term approach to rankings, recruiting staff from the Highly Cited Researchers list, admitting international students regardless of ability, sending papers to marginal journals and conferences, and signing up for citation-rich mega projects, or to concentrate on the underlying attributes of an excellent university: admitting students and appointing and promoting faculty for their cognitive skills and academic ability, encouraging genuine and productive collaboration, and nurturing local talent.

The first may produce quick results or nice bonuses for administrators but it can leave universities at the mercy of the methodological tweaking of the rankers, as Turkish universities found out in 2015.

The latter will take years or decades to make a difference and unfortunately that may be too long for journalists and policy makers.

Thursday, November 15, 2018

THE uncovers more pockets of research excellence

I don't want to do this. I really would like to start blogging about whether rankings should measure third missions or developing metrics for teaching and learning. But I find it difficult to stay away from the THE rankings, especially the citations indicator.

I have a couple of questions. If someone can help please post a comment here. 

Do the presidents, vice-chancellors, directors, generalissimos, or whatever of universities actually look at or get somebody to look at the indicator scores of the THE world rankings and their spin-offs?

Does anyone ever wonder how a ranking that produces such imaginative and strange results for research influence, measured by citations, can command the respect and trust of those hard-headed engineers, MBAs and statisticians running the world's elite universities?

These questions are especially relevant as THE are releasing subject rankings. Here are the top universities in the world for research impact (citations) in various subjects. For computer science and engineering they refer to last year's rankings.

Clinical, pre-clinical and health: Tokyo Metropolitan University

Life Sciences: MIT

Physical sciences: Babol Noshirvani University of Technology

Psychology: Princeton University

Arts and humanities: Universite de Versailles Saint Quentin-en-Yvelines

Education: Kazan Federal University

Law: Iowa State University

Social sciences: Stanford University

Business and economics: Dartmouth College

Computer Science: Princeton University

Engineering and technology: Victoria University, Australia.

https://www.timeshighereducation.com/world-university-rankings/by-subject

Monday, October 29, 2018

Is THE going to reform its methodology?


An article by Duncan Ross in Times Higher Education (THE) suggests that the World University Rankings are due for repair and maintenance. He notes that these rankings were originally aimed at a select group of research orientated world class universities but THE is now looking at a much larger group that is likely to be less internationally orientated, less research based and more concerned with teaching.

He says that it is unlikely that there will be major changes in the methodology for the 2019-20 rankings next year but after that there may be significant adjustment.

There is a chance that  the industry income indicator, income from industry and commerce divided by the number of faculty, will be changed. This is an indirect attempt to capture innovation and is unreliable since it is based entirely on data submitted by institutions. Alex Usher of Higher Education Strategy Associates has pointed out some problems with this indicator.

Ross seems most concerned, however, with the citations indicator which at present is normalised by field, of which there are over 300, type of publication and year of publication. Universities are rated not according to the number of citations they receive but by comparison with the world average of citations to documents of a specific type in a specific field in a specific year. There are potentially over 8,000 boxes into which any single citation could be dropped for comparison.

Apart from anything else, this has resulted in a serious reduction in transparency. Checking the scores for Highly Cited Researchers or Nobel and Fields laureates in the Shanghai rankings can be done in a few minutes. Try comparing thousands of world averages with the citation scores of a university.
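The normalisation scheme described above can be sketched roughly as follows (the fields, world averages, and citation counts are all hypothetical): each paper's citations are divided by the world average for its field, year, and document type, so a single well-cited paper in a low-citation cell can dominate a small portfolio.

```python
# World average citations per (field, year, doctype) cell: hypothetical values.
WORLD_AVG = {
    ("history", 2017, "article"): 1.5,
    ("oncology", 2017, "article"): 12.0,
}

def normalized_impact(papers):
    """papers: list of (field, year, doctype, citations) tuples.
    Returns the mean ratio of each paper's citations to its cell's world average."""
    ratios = [cites / WORLD_AVG[(field, year, doctype)]
              for field, year, doctype, cites in papers]
    return sum(ratios) / len(ratios)

# Two history papers, one an outlier, beat fifty oncology papers
# that are each cited at twice the world average.
small = normalized_impact([("history", 2017, "article", 30),
                           ("history", 2017, "article", 1)])
large = normalized_impact([("oncology", 2017, "article", 24)] * 50)
```

This is why an unusually cited document at an otherwise modest institution can produce the spectacular indicator scores listed below.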

This methodology has produced a series of bizarre results, noted several times in this blog. I hope I will be forgiven for yet again listing some of the research impact superstars that THE has identified over the last few years: Alexandria University, Moscow Nuclear Research University MEPhI, Anglia Ruskin University, Brighton and Sussex Medical School, St George's University of London, Tokyo Metropolitan University, Federico Santa Maria Technical University, Florida Institute of Technology, Babol Noshirvani University of Technology, Oregon Health and Science University, Jordan University of Science and Technology, Vita-Salute San Raffaele University.

The problems of this indicator go further than just a collection of quirky anomalies. It now accords a big privilege to medical research as it once did to fundamental physics research. It offers a quick route to ranking glory by recruiting highly cited researchers in strategic fields and introduces a significant element of instability into the rankings.

So here are some suggestions for THE should it actually get round to revamping the citations indicator.

1. The number of universities around the world that do a modest amount of  research of any kind is relatively small, maybe five or six thousand. The number that can reasonably claim to have a significant global impact is much smaller, perhaps two or three hundred. Normalised citations are perhaps a reasonable way of distinguishing among the latter, but pointless or counterproductive when assessing the former. The current THE methodology might be able to tell whether  a definitive literary biography by a Yale scholar has the same impact in its field as cutting edge research in particle physics at MIT but it is of little use in assessing the relative research output of mid-level universities in South Asia or Latin America.

THE should therefore consider reducing the weighting of citations to the same as research output or lower.

2. A major cause of problems with the citations indicator is the failure to introduce complete fractional counting, that is, distributing credit for citations proportionately among authors or institutions. At the moment THE counts every author of a paper with fewer than a thousand authors as though each were the sole author of the paper. As a result, medical schools that produce papers with hundreds of authors now have a privileged position in the THE rankings, something that the use of normalisation was supposed to prevent.

THE has introduced a moderate form of fractional counting for papers with over a thousand authors but evidently this is not enough.
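The difference between full and fractional counting can be sketched like this (institution lists and citation counts invented): under full counting every institution on a hyper-authored paper banks all of its citations, while under fractional counting the credit is split.

```python
def full_counting(papers, inst):
    """Each listed institution gets full credit for a paper's citations."""
    return sum(cites for insts, cites in papers if inst in insts)

def fractional_counting(papers, inst):
    """Credit for a paper's citations is divided among its institutions."""
    return sum(cites / len(insts) for insts, cites in papers if inst in insts)

# One hyper-authored medical paper: 500 institutions, 2,000 citations.
mega_paper = [(["MedSchool"] + [f"Partner{i}" for i in range(499)], 2_000)]
```

Full counting credits "MedSchool" with all 2,000 citations; fractional counting credits it with just 4, which is why the choice matters so much for medical schools.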

It seems that some rankers do not like fractional counting because it might discourage collaboration. I would not dispute that collaboration might be a good thing, although it is often favoured by institutions that cannot do very well by themselves, but this is not sufficient reason to allow distortions like those noted above to flourish.

3. THE have a country bonus or regional modification which divides a university's citation impact score by the square root of the score of the country in which the university is located. This was supposed to compensate for the lack of funding and networks that afflicts some countries, which apparently does not affect their reputation scores or publications output. The effect of this bonus is to give some universities a boost derived not from their excellence but from the mediocrity or worse of their compatriots. THE reduced the coverage of this bonus to fifty percent of the indicator in 2015. It might well be time to get rid of it altogether.
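The regional modification as described works out like this (the scores are hypothetical, on a 0 to 1 scale): the same raw impact gets a bigger boost the weaker its home country's average.

```python
import math

def country_bonus(univ_impact, country_impact):
    """Divide a university's citation impact by the square root of its
    country's impact score, per THE's described regional modification."""
    return univ_impact / math.sqrt(country_impact)

strong_country = country_bonus(0.6, 1.0)   # no boost: stays at 0.6
weak_country = country_bonus(0.6, 0.36)    # boosted to ~1.0
```

Since 2015 the modification is applied to only half the indicator, so the published score would blend the raw and modified values, but the direction of the effect is the same.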

4. Although QS stopped counting self-citations in 2011, THE continue to do so. They have said that overall they make little difference. Perhaps, but as the rankings expand to include more and more universities it will become more likely that a self-citer or mutual-citer will propel an undistinguished school up the charts. There could be more cases like Alexandria University or Veltech University.

5. THE needs to think about what they are using citations to measure. Are they trying to assess research quality, in which case they should use citations per paper? Or are they trying to estimate overall research impact, in which case the appropriate metric would be total citations?

6. Normalisation by field and document type might be helpful for making fine distinctions among elite research universities but lower down it creates or contributes to serious problems when a single document or an unusually productive author can cause massive distortions. Three hundred plus fields may be too many and THE should think about reducing the number of fields. 

7. There has been a proliferation in recent years in the number of secondary affiliations. No doubt most of these are making a genuine contribution to the life of both or all of the universities with which they are affiliated. There is, however, a possibility of serious abuse if the practice continues. It would be greatly to THE's credit if they could find some way of omitting or reducing the weighting of secondary affiliations.

8. THE are talking about different models of excellence. Perhaps they could look at the Asiaweek rankings which had a separate table for technological universities or Maclean's with its separate rankings for doctoral/medical universities and primarily undergraduate schools. Different weightings could be given to citations for each of these categories.

Friday, August 24, 2018

Why is Australia doing well in the Shanghai rankings?

I am feeling a bit embarrassed. In a recent post I wrote about the Shanghai Rankings (ARWU) being a bit boring (which is good) because university ranks usually do not change very much. But then I noticed that a couple of Australian universities did very well in the latest rankings. One of them, the Australian National University (ANU), has risen a spectacular (for ARWU) 31 places over last year. The Financial Review says that "[u]niversity scientific research has boosted the position of two Australian universities in a global ranking of higher education providers." 

The ranking is ARWU and the rise in the ranking is linked to the economic contribution of Australian universities, especially those in the Group of Eight.

So how well did Australian universities do? The top performer, as in previous years, is the University of Melbourne, which went up a spot to 38th place. Two other universities rose a lot in a very un-Shanghainese way: ANU, already mentioned, from 69th to 38th place, and the University of Sydney from 83rd to 68th.

The University of Queensland was unchanged in 55th place, while Monash fell from 78th to 91st and the University of Western Australia from 91st to 93rd.

How did ANU and Sydney do it? The ANU scores for Nobel and Fields awards were unchanged. Publications were up a bit and papers in Nature and Science down a bit.

What made the difference was the score for highly cited researchers, derived from lists kept by Clarivate Analytics, which rose from 15.4 to 23.5, a difference of 8.1 or, after weighting, 1.62 points of the overall score. The difference in total scores between 2017 and 2018 was 1.9 so those highly cited researchers made up most of the difference.
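The arithmetic is easy to check, assuming the standard 20 per cent ARWU weighting for the highly cited researchers (HiCi) indicator:

```python
# HiCi indicator scores for ANU, as reported in the text above.
hici_2017, hici_2018 = 15.4, 23.5
hici_weight = 0.20             # ARWU weighting for the HiCi indicator

contribution = (hici_2018 - hici_2017) * hici_weight
total_change = 1.9             # change in ANU's overall score, 2017 to 2018

print(round(contribution, 2))                  # 1.62 points of overall score
print(round(contribution / total_change, 2))   # 0.85 -> most of the rise
```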

In 2016 ANU had two researchers in the list, which was used for the 2017 rankings. One was also on the 2017 list, used in 2018. In 2017 there were six ANU highly cited researchers, one from the previous year and one who had moved from MIT. The other four were long-serving ANU researchers.

Let's be clear. ANU has not been handing out unusual contracts or poaching from other institutions. It has grown its own researchers and should be congratulated.

But using an indicator where a single researcher can lift a top 100 university seven or eight places is an invitation to perverse consequences. ARWU should consider whether it is time to explore other measures of research impact.

The improved scores for the University of Sydney resulted from an increase between 2016 and 2017 in the number of articles published in the Science Citation Index Expanded and the Social Science Citation Index.

Monday, June 04, 2018

Ranking rankings: Crass materialism

As the number and scope of university rankings increase it is time to start thinking about how to rank the rankers.

Indicators for global rankings might include number of universities ranked (Webometrics in 1st place), number of indicators (Round University Ranking), bias, and stability.

There could also be an indicator for crass materialism. Here is a candidate for first place: CNBC quotes a report from Wealth-X (supposedly downloadable; good luck) and lists the top ten universities for billionaires, all of them in the USA. Apparently, the full ranking also includes universities outside the US.

1. Harvard
2. Stanford
3. Pennsylvania
4. Columbia
5. MIT
6. Cornell
7. Yale
8= Southern California
8= Chicago
10. Michigan







Thursday, May 31, 2018

Where did the top data scientists study?


The website efinancialcareers has a list of the top twenty data scientists in finance and banking. This looks like a subjective list, and another writer might come up with a different set of experts. Even so, it is quite interesting.

Their degrees are mainly in things like engineering, computer science and maths. There is only one each in business, economics and finance.

The institutions where they studied are:
Stanford (three)
University College London (three)
Institut Polytechnique de Grenoble
Oxford
Leonard Stern School of Business, New York University
University of Mexico
Universite Paris Dauphine
Ecole Polytechnique
Rensselaer Polytechnic Institute (RPI)
California State University
Indian Institute of Science
Johns Hopkins University
Institute of Management Development and Research, India
University of Illinois
University of Pittsburgh
Indian Institute of Technology.

Harvard, MIT and Cambridge are absent, but there are three Indian institutions, three French schools, and some non-Ivy US places like RPI and the Universities of Pittsburgh and Illinois.





Friday, May 11, 2018

Ranking Insights from Russia

The ranking industry is expanding, and new rankings appear all the time. Most global rankings measure research publications and citations. Others try to add indicators that might have something to do with teaching and learning. There is now a ranking that tries to capture various third missions.

The Round University Rankings (RUR), published in Russia, are in the tradition of holistic rankings. They give a 40% weighting to research, 40% to teaching, 10% to international diversity and 10% to financial sustainability. Each group contains five equally weighted indicators. The data are derived from Clarivate Analytics, which also contributes to the US News Best Global Universities Rankings.
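Given those group weights and five equally weighted indicators per group, each indicator's share of the total follows directly (a quick sketch of the structure just described):

```python
# RUR group weights; each group contains five equally weighted indicators.
groups = {
    "teaching": 0.40,
    "research": 0.40,
    "international diversity": 0.10,
    "financial sustainability": 0.10,
}

# Share of the overall score carried by a single indicator in each group.
per_indicator = {name: weight / 5 for name, weight in groups.items()}

for name, share in per_indicator.items():
    print(f"{name}: {share:.0%} per indicator")
# teaching: 8% per indicator
# research: 8% per indicator
# international diversity: 2% per indicator
# financial sustainability: 2% per indicator
```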

These rankings are similar to the THE rankings in that they attempt to assess quality rather than quantity, but they have 20 indicators instead of 13 and assign sensible weightings. Unfortunately, they receive only a fraction of the attention given to the THE rankings.

They are, however, very valuable since they dig deeper into the data than other global rankings. They also show that there is a downside to measures of quality and that data submitted directly by institutions should be treated with caution and perhaps scepticism.

Here are the top universities for each of the RUR indicators.

Teaching
Academic staff per students: VIB (Flemish Institute of Biotechnology), Belgium
Academic staff per bachelor degrees awarded: University of Valladolid, Spain
Doctoral degrees per academic staff: Kurdistan University of Medical Science, Iran
Doctoral degrees per bachelor degrees awarded:  Jawaharlal Nehru University, India
World teaching reputation: Harvard University, USA.

Research
Citations per academic and research staff: Harvard
Doctoral degrees per admitted PhD: Al Farabi Kazakh National University
Normalised citation impact: Rockefeller University, USA
Share of international co-authored papers: Free University of Berlin
World research reputation: Harvard.

International diversity
Share of international academic staff: American University of Sharjah, UAE
Share of international students: American University of Sharjah
Share of international co-authored papers: Innopolis University, Russia
Reputation outside region: Voronezh State Technical University, Russia
International Level: EPF Lausanne, Switzerland.

Financial sustainability:
Institutional income per academic staff: Universidade Federal Do Ceara, Brazil
Institutional income per student: Rockefeller University
Papers per research income: Novosibirsk State University of Economics and Management, Russia
Research income per academic and research staff: Istanbul Technical University, Turkey
Research income per institutional income: A C Camargo Cancer Center, Brazil.

There are some surprising results here. The most obvious is Voronezh State Technical University which is first for reputation outside its region (Asia, Europe and so on), even though its overall scores for reputation and for international diversity are very low. The other top universities for this metric are just what you would expect, Harvard, MIT, Stanford, Oxford and so on. I wonder whether there is some sort of bug in the survey procedure, perhaps something like the university's supporters being assigned to Asia and therefore out of region. The university is also in second place in the world for papers per research income despite very low scores for the other research indicators.

There are other oddities, such as Novosibirsk State University of Economics and Management placed first for papers per research income and Universidade Federal Do Ceara for institutional income per academic staff. These may result from anomalies in the procedures for reporting and analysing data, possibly including problems in collecting data on income and staff.

It also seems that medical schools and specialist or predominantly postgraduate institutions such as Rockefeller University, the Kurdistan University of Medical Science, Jawaharlal Nehru University and VIB have a big advantage with these indicators, since they tend to have favourable faculty-student ratios, sometimes boosted by large numbers of clinical and research-only staff, and a large proportion of doctoral students.

Jawaharlal Nehru University is a mainly postgraduate university so a high placing for academic staff per bachelor degrees awarded is not unexpected although I am surprised that it is ahead of Yale and Princeton. I must admit that the third place here for the University of Baghdad needs some explanation.

The indicator doctoral degrees per admitted PhD might identify universities that do a good job of selection and training and get large numbers of doctoral candidates through the system. Or perhaps it identifies universities where doctoral programmes are so lacking in rigour that nearly everybody can get their degree once admitted. The top ten of this indicator includes De Montfort University, Shakarim University, Kingston University, and the University of Westminster, none of which are famous for research excellence across the range of disciplines.

Measures of international diversity have become a staple of global rankings since the data are fairly easy to collect. The problem is that international orientation may have something to do with quality, but it may also simply be a necessary attribute of being in a small country next to larger countries with the same or similar language and culture. The top ten for the international students indicator includes the Central European University and the American University of Sharjah. For international faculty it includes the University of Macau and Qatar University.

To conclude, these indicators suggest that self submitted institutional data should be used sparingly and that data from third party sources may be preferable. Also, while ranking by quality instead of quantity is sometimes advisable it also means that anomalies and outliers are more likely to appear.







Saturday, December 16, 2017

Measuring graduate employability; two rankings

Global university rankings are now well into their second decade. Since 2003, when the first Shanghai rankings appeared, there has been a steady growth of global and regional rankings. At the moment most global rankings are of two kinds, those that focus entirely or almost entirely on research and those such as the Russian Round Rankings, Times Higher Education (THE) and Quacquarelli Symonds (QS) that claim to also measure teaching, learning or graduate quality in some way, although even those are biased towards research when you scratch the surface a little.

The ranking industry has become adept at measuring research productivity and quality in various ways. But the assessment of undergraduate teaching and learning is another matter.

Several ranking organisations use faculty-student ratio as a proxy for quality of teaching, which in turn is assumed to have some connection with something that happens to students during their programmes. THE also count institutional income, research income and income from industry, again assuming that there is a significant association with academic excellence. Indicators like these are usually based on data supplied by the institutions themselves. For examples of the problems here, see an article by Alex Usher and a reply by Phil Baty.

An attempt to get at student quality is provided by the CWUR rankings, now based in the UAE, which count alumni who win international awards or who are CEOs of major companies. But obviously this is relevant only for a very small number of universities. A new pilot ranking from Moscow also counts international awards.

The only attempt by the well-known rankers to measure student quality that is relevant to most institutions is the survey of employers in the QS world and regional rankings. There are some obvious difficulties here. QS gets respondents from a variety of channels, and this may allow some universities to influence the survey. In recent years some Latin American universities have done much better on this indicator than on any other.

THE now publish a global employability ranking conducted by two European firms, Trendence and Emerging. This is based on two surveys of recruiters in Argentina, Australia, Austria, Brazil, Canada, China, Germany, France, India, Israel, Italy, Japan, Mexico, Netherlands, Singapore, Spain, South Africa, South Korea, Turkey, UAE, UK, and USA, with a total of over 6,000 respondents across the two panels.

A global survey that does not include Chile, Sweden, Egypt, Nigeria, Saudi Arabia, Russia, Pakistan, Indonesia, Bangladesh, Poland, Malaysia or Taiwan can hardly claim to be representative of international employers. This limited coverage may explain some oddities of the rankings, such as the high places of the American University of Dubai and the National Autonomous University of Mexico.

The first five places in these rankings are quite similar to the THE world rankings: Caltech, Harvard, Columbia, MIT, Cambridge. But there are some significant differences after that, and some substantial changes since last year. Columbia, 14th in the world rankings, is in third place here, up from 12th last year. Boston University is 6th here but 70th in the world rankings. Tokyo Institute of Technology, in 19th place, is in the 251-300 band in the world rankings. CentraleSupelec is 41st here but in the 401-500 group in the world rankings.

These rankings are useful only for a small minority of universities, stakeholders and students. Only 150 schools are ranked and only a small proportion of the world's employers consulted.

QS have also released their global employability rankings, with 500 universities. These combine the employer reputation survey used in their world rankings with other indicators: alumni outcomes, based on lists of high achievers; partnerships with employers, that is, research collaboration recorded in the Scopus database; employer-student connections, that is, employers actively present on campus; and graduate employment rate. There seems to be a close association, at least at the top, between overall scores, employer reputation and alumni outcomes. Overall the top three are Stanford, UCLA and Harvard. For employer reputation they are Cambridge, Oxford and Harvard, and for alumni outcomes Harvard, Stanford and Oxford.

The other indicators are a different matter. For employer-student connections the top three are Huazhong University of Science and Technology, Arizona State University, and New York University; in fact seven of the top ten on this measure are Chinese. For graduate employment rate they are Politecnico di Torino, Moscow State Institute of International Relations, and Sungkyunkwan University, and for partnerships with employers Stanford, Surrey and Politecnico di Milano. When the front runners on these indicators are so different, one has to wonder about their validity.

There are some very substantial differences in the ranks given to various universities in these rankings. Caltech is first in the Emerging-Trendence rankings and 73rd in QS. Hong Kong University of Science and Technology is 12th in Emerging-Trendence but not ranked at all by QS. The University of Sydney is 4th in QS and 48th in Emerging-Trendence. The American University of Dubai is in QS's 301-500 band but 138th in Emerging-Trendence.

The rankings published by THE could be of some value to students contemplating careers with the leading companies in the richest countries.

The QS rankings may be more helpful for students or stakeholders looking at universities outside the very top of the global elite. Even so, QS have ranked only a fraction of the world's universities.

It still seems that the way forward in the assessment of graduate outcomes and employability is through standardised testing along the lines of AHELO or the Collegiate Learning Assessment.




Friday, November 17, 2017

Another global ranking?

In response to a suggestion by Hee Kim Poh of Nanyang Technological University, I have had a look at the Worldwide Professional University Rankings, which appear to be linked to "Global World Communicator" and the "International Council of Scientists" and may be based in Latvia.

There is a methodology page, but it does not include essential information. One indicator is "number of publications to number of academic staff", but there is nothing about how either of these is calculated or where the data come from. There is a reference to a survey of members of the International Council of Scientists, but nothing about the wording of the survey, its date, the distribution of respondents or the response rate.

Anyway, here is the introduction to the methodology:

"The methodology of the professional ranking of universities is based on comparing universities and professional evaluation by level of proposed training programs (degrees), availability and completeness of information on activities of a university, its capacity and reputation on a national and international levels. Main task is to determine parameters and ratios needed to assess quality of the learning process and obtained specialist knowledge. Professional formalized ranking system based on a mathematical calculation of the relation of parameters of the learning process characterizing quality of education and learning environment. Professional evaluation criteria are developed and ranking is carried out by experts of the highest professional qualification in relevant fields - professors of universities, specialists of the highest level of education, who have enough experience in teaching and scientific activities. Professional rating of universities consists of three components.. "

The top five universities are: 1. Caltech, 2. Harvard, 3. MIT, 4. Stanford, 5. ETH Zurich.

Without further information, I do not think that this ranking is worth further attention.









http://www.cicerobook.com/en/ranks

Tuesday, September 05, 2017

Highlights from THE citations indicator


The latest THE world rankings were published yesterday. As always, the most interesting part is the field- and year-normalised citations indicator that supposedly measures research impact.

Over the last few years, an array of implausible places have zoomed into the top ranks of this metric, sometimes disappearing as rapidly as they arrived.

The first place for citations this year goes to MIT. I don't think anyone would find that very controversial.

Here are some of the institutions that feature in the top 100 of THE's most important indicator, which has a weighting of 30 per cent.

2nd     St. George's, University of London
3rd=    University of California Santa Cruz, ahead of Berkeley and UCLA
6th =   Brandeis University, equal to Harvard
11th=   Anglia Ruskin University, UK, equal to Chicago
14th=   Babol Noshirvani University of Technology, Iran, equal to Oxford
16th=   Oregon Health and Science University
31st     King Abdulaziz University, Saudi Arabia
34th=   Brighton and Sussex Medical School, UK, equal to Edinburgh
44th     Vita-Salute San Raffaele University, Italy, ahead of the University of Michigan
45th=   Ulsan National Institute of Science and Technology, best in South Korea
58th=   University of Kiel, best in Germany and equal to King's College London
67th=   University of Iceland
77th=   University of Luxembourg, equal to University of Amsterdam












Thursday, August 24, 2017

Guest Post by Pablo Achard

This post is by Pablo Achard of the University of Geneva. It refers to the Shanghai subject rankings. However, the problem of outliers in subject and regional rankings affects all the well-known rankings and will probably become more important over the next few years.


How a single article is worth 60 places

We can’t repeat it enough: an indicator is bad when a small variation in the input is overly amplified in the output. This is the case when indicators are based on very few events.

I recently came across this issue (again) with Shanghai's subject ranking of universities. The universities of Geneva and Lausanne (Switzerland) share the same School of Pharmacy, and a huge share of the articles published in this discipline are signed under the names of both institutions. But in the "Pharmacy and pharmaceutical sciences" ranking, one is ranked between the 101st and 150th positions while the other is 40th. Where does this difference come from?

Comparing the scores obtained under each category gives a clue:

Indicator      Geneva   Lausanne   Weight in the final score
PUB            46       44.3       1
CNCI           63.2     65.6       1
IC             83.6     79.5       0.2
TOP            0        40.8       1
AWARD          0        0          1
Weighted sum   125.9    166.6


So the main difference between the two institutions is the score for "TOP". Actually, the difference in the weighted sums (40.7) is almost equal to the value of this score (40.8). If Geneva and Lausanne had the same TOP score, they would be 40th and 41st.
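The weighted sums can be reproduced directly from the indicator scores and weights given above:

```python
# Indicator weights in the subject ranking (IC counts 0.2, the rest 1).
weights = {"PUB": 1, "CNCI": 1, "IC": 0.2, "TOP": 1, "AWARD": 1}

geneva   = {"PUB": 46.0, "CNCI": 63.2, "IC": 83.6, "TOP": 0.0,  "AWARD": 0.0}
lausanne = {"PUB": 44.3, "CNCI": 65.6, "IC": 79.5, "TOP": 40.8, "AWARD": 0.0}

def weighted_sum(scores):
    return sum(scores[k] * weights[k] for k in weights)

print(round(weighted_sum(geneva), 1))     # 125.9
print(round(weighted_sum(lausanne), 1))   # 166.6
# The gap is almost exactly Lausanne's TOP score of 40.8:
print(round(weighted_sum(lausanne) - weighted_sum(geneva), 1))  # 40.7
```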

Surprisingly, a look at other institutions shows only five different values for that TOP indicator: 0, 40.8, 57.7, 70.7 and 100. According to the methodology page of the ranking, "TOP is the number of papers published in Top Journals in an Academic Subject for an institution during the period of 2011-2015. Top Journals are identified through ShanghaiRanking's Academic Excellence Survey [...] The list of the top journals can be found here [...] Only papers of 'Article' type are considered."
Looking deeper, there is just one journal in this list for pharmacy: NATURE REVIEWS DRUG DISCOVERY. As its name indicates, this recognised journal mainly publishes 'reviews'. A search on Web of Knowledge shows that in the period 2011-2015 only 63 'articles' were published in this journal. That means a small variation in the input is overly amplified.

I searched for several institutions and rapidly found the rule: Harvard published 4 articles during these five years and got a score of 100; MIT published 3 articles and got 70.7; ten institutions published 2 articles and got 57.7; and finally about 50 institutions published 1 article and got 40.8.

I still don’t get why this score is so unlinear. But Lausanne published one single article in NATURE REVIEWS DRUG DISCOVERY and Geneva none (they published ‘reviews’ and ‘letters’ but no ‘articles’) and that small difference led to at least a 60 places gap between the two institutions.


This is of course just one example of what happens too often: rankers want to publish sub-rankings and end up with indicators where outliers can't be absorbed into large distributions. One article, one prize or one co-author in a large and productive collaboration all of a sudden makes a very large difference in final scores and ranks.