Sunday, April 20, 2014

Should New Zealand Worry about the Rankings?

The Ministry of Education in New Zealand has just published a report by Warren Smart on the performance of the country's universities in the three best known international rankings. The report, which is unusually detailed and insightful, suggests that the eight universities -- Auckland, Otago, Canterbury, Victoria University of Wellington, Massey, Waikato, Auckland University of Technology and Lincoln  --  have a mixed record with regard to the Shanghai rankings and the Times Higher Education (THE) -- Thomson Reuters World University Rankings. Some are falling, some are stable and some are rising.

But things are a bit different when it comes to the QS World University Rankings. There the report finds a steady and general decline both overall and on nearly all of the component indicators. According to the New Zealand Herald this means that New Zealand is losing the race against Asia.

However, looking at the indicators one by one it is difficult to see any consistent and pervasive decline, whether absolute or relative.

Academic Survey

It is true that scores for the academic survey fell between 2007 and 2013 but one reason for this could be that the percentage of responses from New Zealand fell dramatically from 4.1% in 2007 to 1.2% in 2013 (see University Ranking Watch 20th February). This probably reflects the shift from a survey based on the subscription lists of World Scientific, a Singapore-based academic publishing company, to one with several sources, including a sign-up facility.

Employer survey

In 2011 QS reported that there had been an enthusiastic response to the employer opinion survey from Latin America and it was found necessary to cap the scores of several universities where there had been a disproportionate response. One consequence of this was that the overall mean for this indicator rose dramatically so that universities received much lower scores in that year for the same number of responses. QS seems to have rectified the situation so that scores for New Zealand universities -- and many others -- recovered to some extent in 2012 and 2013.
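The mechanics here are worth a sketch. Assuming a z-score style of scaling, where each university's indicator score depends on the mean and spread of everyone else's raw counts, a surge of responses elsewhere lowers the score of a university whose own responses are unchanged (the figures below are invented for illustration):

```python
import statistics

def z(x, values):
    """Standard score of x against the whole field of raw counts."""
    return (x - statistics.mean(values)) / statistics.pstdev(values)

# Before the surge: a university with 50 employer responses
# against a modest field of competitors.
before = [10, 20, 30, 40, 50]
# After: the same 50 responses, but new high scorers push the mean up.
after = [10, 20, 30, 40, 50, 90, 100, 110]

print(round(z(50, before), 2))  # 1.41  -- well above the mean
print(round(z(50, after), 2))   # -0.17 -- now below the mean
```

The same raw count drops from the top of the field to below average, which is roughly what happened to many universities' employer survey scores in 2011.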

Citations per faculty and faculty student ratio

From 2007 to 2010 or 2011 scores fell for the citations per faculty indicator but have risen since then. The report notes that "the recent improvement in the citations per faculty score by New Zealand universities had not been matched by an increase in their academic reputations score, despite the academic reputation survey being focused on perceptions of research performance."

This apparent contradiction might be reconciled by the declining number of survey respondents from New Zealand noted above. Also, we should not forget the number on the bottom. A fall in the recorded number of faculty could have the same result as an increase in citations. It is interesting that while the score for faculty student ratio for five universities -- Auckland, Canterbury, Otago, Victoria University of Wellington and Waikato -- went down from 2010 to 2012, the score for citations per faculty went up. Both changes could result from a decline in the number of faculty submitted by universities or recorded by QS. In only one case, Massey, did both scores rise. There was insufficient data for the other two universities.
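The arithmetic of the denominator effect is simple enough to sketch (the figures are invented; QS's conversion of these raw ratios into indicator scores is not shown):

```python
# Illustrative only: a fall in reported faculty pushes citations per
# faculty up and faculty per student down at the same time.

def citations_per_faculty(citations, faculty):
    return citations / faculty

def faculty_per_student(faculty, students):
    return faculty / students

# Year 1: 10,000 citations, 1,000 faculty, 20,000 students.
before = (citations_per_faculty(10_000, 1_000), faculty_per_student(1_000, 20_000))
# Year 2: same citations and students, but only 800 faculty recorded.
after = (citations_per_faculty(10_000, 800), faculty_per_student(800, 20_000))

print(before)  # (10.0, 0.05)
print(after)   # (12.5, 0.04)
```

Nothing about the university's research or teaching has changed; only the number on the bottom has.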

International Faculty and International Students

The scores for international faculty have always been high and are likely to remain so. The scores for international students have been slipping but this indicator counts for only 5% of the total weighting.

New Zealand universities might benefit from looking at the process of submission of data to QS. Have they submitted lists of potential survey respondents? Are they aware of the definitions of faculty, students, international and so on? That might be more productive than worrying about a deep malaise in the tertiary sector.

And perhaps New Zealand salt producers could send out free packets every time the media have anxiety attacks about the rankings.

Thursday, April 17, 2014

The Scimago Ibero-America Ranking

In February the SCImago Research Group published its annual Ibero-American Institutions Ranking. This is not a league table but a research tool. The default order is according to the number of publications in the Scopus database over the period 2008-2012. The top five are:

1.  Universidade de Sao Paulo

2.  Universidade de Lisboa

3.  Universidad Nacional Autonoma de Mexico

4.  Universidade Estadual Paulista Julio de Mesquita Filho

5.  Universitat de Barcelona

Friday, April 11, 2014

Why are Britain's Universities Still Failing Male Students?

I doubt that you will see a headline like that in the mainstream media.

A report from the Higher Education Funding Council for England (Hefce) has shown that students who classify themselves as White do better than Black or Asian students who get the same grades at A levels. Mixed-race students are in between. The difference persists even when universities and subjects are analysed separately. 

Aaron Kiely in the Guardian says that this "suggests that higher education institutions are somehow failing black students, which should be a national embarrassment."

He then goes on to recount a study by the National Union of Students (NUS) that indicated that Black students suffered institutional barriers that eroded their self-esteem and confidence and that seven per cent said that the university environment was racist. 

A similar conclusion was drawn by Richard Adams also in the Guardian. He quoted Rachel Wenstone of the NUS as saying that it was "a national shame that black students and students from low participation backgrounds are appearing to do worse in degree outcomes than other students even when they get the same grades at A level."

It is interesting that the Hefce report also found that female students were more likely to get a 2 (i) than male students with the same grades, although there was no difference with regard to first class degrees. Men were also more likely to fail to complete their studies.

So is anyone worrying about why men are doing less well at university?


Thursday, April 10, 2014

The Parochial World of Global Thinkers

The magazine Prospect has just published its list of fifty candidates for the title of global thinker. It is rather different from last year. Number one in 2013, Richard Dawkins, biologist and atheist spokesman, is out. Jonathon Derbyshire, Managing Editor of Prospect, says in an interview with the magazine's Digital Editor that this is because Dawkins has been saying the same thing for several years. Presumably Prospect only noticed this year.

The list is top heavy with philosophers and economists and Americans and Europeans. There is one candidate from China, one from Africa, one from Brazil and none from Russia. There is one husband and wife. A large number are graduates of Harvard or have taught there and quite a few are from Yale, MIT, Berkeley, Cambridge and Oxford. One wonders if the selectors made some of their choices by going through the contents pages of New Left Review. So far I have counted six contributors.

There are also no Muslims. Was Prospect worried about a repetition of that unfortunate affair in 2008?

All in all, apart from Pope Francis, this does not look like a global list. Unless, that is, thinking has largely retreated to the humanities and social science faculties of California, New England and Oxbridge.

Tuesday, April 01, 2014

Comparing the THE and QS Reputation Rankings

This year's Times Higher Education (THE) Reputation Rankings were a bit boring, at least at the top, and that is just what they should be.

The top ten are almost the same as last year. Harvard is still first and MIT is second. Tokyo has dropped out of the top ten to 11th place and has been replaced by Caltech. Stanford is up three places and is now third. Cambridge and Oxford are both down one place. Further down, there is some churning but it is difficult to see any clear and consistent trends, although the media have done their best to find stories: UK universities falling, sliding or slipping; no Indian, Irish or African universities in the top 100.

These rankings may be more interesting for who is not there than for who is. There are some notable absentees from the top 100. Last year Tokyo Metropolitan University was, according to THE and data providers Thomson Reuters (TR), first in the world, along with MIT, for research impact. Yet it fails to appear in the top 100 in a reputation survey in which research has a two-thirds weighting. Rice University, joint first in the world for research impact with Moscow State Engineering Physics Institute in 2012, is also absent. How is this possible? Am I missing something?

In general, the THE-TR reputation survey, the data collection for which was contracted out to the pollsters Ipsos Mori CT, appears to be quite rigorous and reliable. Survey forms were sent out to a clearly defined group, researchers with papers in the ISI indexes. THE claim that this means that their respondents must therefore be active producers of academic research. That is stretching it a bit. Getting your name on an article published in a reputable journal might mean a high degree of academic competence or it could just mean having some sort of influence over the research process. I have heard a report about an Asian university where researchers were urged to put their heads of department on the list of co-authors. Still, on balance it seems that the respondents to the THE survey are mostly from a stable group, namely those who have usually made some sort of contribution to a research paper of sufficient merit to be included in an academic journal.

TR also appear to have used a systematic approach in sending out the survey forms. When the first survey was being prepared in 2010 they announced that the forms would be emailed according to the number of researchers recorded by UNESCO in 2007. It is not clear if this procedure has been followed strictly over the last four years. Oceania, presumably Australia and New Zealand, appears to have a very large number of responses this year, 10%, although TR reported in 2010 that UNESCO found only 2.1% of the world's researchers in that region.

The number of responses received appears reasonably large although it has declined recently. In 2013 TR collected 10,536 responses, considerably fewer than in 2012, when it was 16,639. Again, it is not clear what happened.

The number of responses from the various subject areas has changed somewhat. Since 2012 the proportion from the social sciences has gone from 19% to 22%, as has that from engineering and technology, while the proportion from the life sciences has gone from 16% to 22%.

QS do not publish reputation surveys but it is possible to filter their ranking scores to find out how universities performed on their academic survey.

The QS approach is less systematic. They started out using the subscription lists of World Scientific, a Singapore-based academic publishing company with links to Imperial College London. Then they added respondents from Mardev, a publisher of academic lists, to beef up the number of names in the humanities. Since then the balance has shifted, with more names coming from Mardev and some topping up from World Scientific. QS have also added a sign-up facility where people can apply to receive survey forms. That was suspended in April 2013 but has recently been revived. They have also asked universities to submit lists of potential respondents and respondents to suggest further names. The exact number of responses coming from all these different sources is not known.

Over the last few years QS have made their survey rather more rigorous. First, respondents were not allowed to vote for the universities where they were currently employed. They were restricted to one response per computer and universities were not allowed to solicit votes or instruct staff who to vote for or who not to vote for. Then they were told not to promote any form of participation in the surveys.

In addition to methodological changes, the proportion of responses from different countries has changed significantly since 2007 with a large increase from Latin America, especially Brazil and Mexico, the USA and larger European countries and a fall in those from India, China and the Asia-Pacific region. All of this means that it is very difficult to figure out whether the rise or fall of a university reflects a change in methodology or distribution of responses or a genuine shift in international reputation.

Comparing the THE-TR and QS surveys there is some overlap at the top. The top five are the same in both although in a different order: Harvard, MIT, Stanford, Oxford and Cambridge.

After that, we find that the QS academic survey favours universities in Asia-Pacific and Latin America. Tokyo is seventh according to QS but THE-TR have it in 11th place. Peking is 19th for QS and 41st for THE-TR. Sao Paulo is 51st in the QS indicator but is in the 81-90 band in the THE-TR rankings. The Autonomous National University of Mexico (UNAM) is not even in THE-TR's top 100 but QS put it 48th.

On the other hand Caltech, Moscow State University, Seoul National University and Middle East Technical University do much better with THE-TR than with QS.

I suspect that the QS survey is tapping a younger, less experienced pool of respondents from less regarded universities and from countries with high aspirations but so far limited achievements.

Sunday, March 30, 2014

The Nature Publication Index

Nature has long been regarded as the best or one of the two best scientific journals in the world. Papers published there and in Science account for 20% of the weighting for Shanghai Jiao Tong University's Academic Ranking of World Universities, the same as Nobel and Fields awards or publications in the whole of the Science Citation and Social Science Citation Indexes.

Sceptics may wonder whether Nature has seen better years and is perhaps sliding away from the pinnacle of scientific publishing. It has had some embarrassing moments in recent decades, including the publication of a 1978 paper that gave credence to the alleged abilities of the psychic Uri Geller, the report of a study by Jacques Benveniste and others that purported to show that water has a memory, the questionable "hockey stick" article on global warming in 1998 and seven retracted papers on superconductivity by Jan Hendrik Schon.

But it still seems that Nature is highly regarded by the global scientific community and that the recent publication of the Nature Publication Index is a reasonable guide to current trends in scientific research. This counts the number of publications in Nature in 2013.

The USA remains on top, with Harvard first, MIT second and Stanford third, although China continues to make rapid progress. For many parts of the world (Latin America, Southern Europe, Africa) scientific achievement is extremely limited. Looking at the Asia-Pacific rankings, much of the region, including Indonesia, Bangladesh and the Philippines, is almost a scientific desert.

Sunday, March 23, 2014

At Last! A Really Useful Ranking

Wunderground lists the top 25 snowiest universities in the US.

The top five are:

1.  Syracuse University
2.  Northern Arizona University (that's interesting)
3.  The University at Buffalo: SUNY
4.  Montana State University
5.  University of Colorado Boulder

Tuesday, March 04, 2014

Reactions to the QS Subject Rankings

It looks as though the QS subject rankings are a big hit. Here is just a sample of headlines and quotations from around the world.

World Ranking Recognises Agricultural Excellence at Lincoln [New Zealand]

CEU [Central European University, Hungary] Programs Rank Among the World's Top 100

Boston-Area Schools Rank Top in the World in These 5 Fields

"Cardiff has been ranked as one of the top universities in the world in a number of different subjects, according to a recent international league table."

NTU [National Taiwan University] leads local universities making QS rankings list

Swansea University continues to excel in QS world subject rankings

Penn State Programs Rank Well in 2014 QS World Rankings by Subject

Anna Varsity [India] Enters Top 250 in QS World Univ Rankings

Moscow State University among 200 best in the world

New Ranking Says Harvard And MIT Are The Best American Universities For 80% of Academic Subjects

QS: The University of Porto ranked among the best in the world

4 Indian Institutions in 2014 World Ranking

"The Institute of Education [London] has been ranked as the world's leading university for Education in the 2014 QS World University Rankings."

Nine UvA [University of Amsterdam] subject areas listed in QS World University Rankings top 50

"The University of Newcastle's [Australia] Civil and Structural Engineering discipline has surged in the QS World University Rankings by Subject list"


Sunday, March 02, 2014

The QS Subject Rankings: Reposting

QS have come out with their 2014 University Rankings by Subject, three months earlier than last year. Maybe this is to get ahead of Times Higher whose latest Reputation Rankings will be published next week.

The methodology of these rankings has not changed since last year so I am just reposting my article, which was first published in the Philippine Daily Inquirer on 27th May 2013 and then reposted here on 29th May 2013.

The QS University Rankings by Subject: Warning 

It is time for the Philippines to think about constructing its own objective and transparent ranking or rating systems for its colleges and universities that would learn from the mistakes of the international rankers.

The ranking of universities is getting to be big business these days. There are quite a few rankings appearing from Scimago, Webometrics, University Ranking of Academic Performance (from Turkey), the Taiwan Rankings, plus many national rankings.

No doubt there will be more to come.

In addition, the big three of the ranking world—Quacquarelli Symonds (QS), Times Higher Education and Shanghai Jiao Tong University’s Academic Ranking of World Universities—are now producing a whole range of supplementary products, regional rankings, new university rankings, reputation rankings and subject rankings.

There is nothing wrong, in principle, with ranking universities. Indeed, it might be in some ways a necessity. The problem is that there are very serious problems with the rankings produced by QS, even though they seem to be better known in Southeast Asia than any of the others.
This is especially true of the subject rankings.

No new data

The QS subject rankings, which have just been released, do not contain new data. They are mostly based on data collected for last year’s World University Rankings—in some cases extracted from the rankings and, in others, recombined or recalculated.

There are four indicators used in these rankings. They are weighted differently for the different subjects and, in two subjects, only two of the indicators are used.

The four indicators are:

1.  A survey of academics or people who claim to be academics or used to be academics, taken from a variety of sources. This is the same indicator used in the world rankings. Respondents were asked to name the best universities for research.
2.  A survey of employers, which seem to comprise anyone who chooses to describe himself or herself as an employer or a recruiter.
3.  The number of citations per paper. This is a change from the world rankings, where the calculation was citations per faculty.
4.  H-index. This is something that is easier to give examples for than to define. If a university publishes one paper and the paper is cited once, then it gets an index of one. If it publishes two or more papers and at least two of them are cited at least twice each, then the index is two, and so on. This is a way of combining quantity of research with quality as measured by influence on other researchers.
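The h-index is easier to see in code than in words. A short sketch of the standard computation (not QS's own code):

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with at least 4 citations each
print(h_index([1]))               # 1
print(h_index([0, 0]))            # 0
```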

Out of these four indicators, three are about research and one is about the employability of a university’s graduates.

These rankings are not at all suitable for use by students wondering where they should go to study, whether at undergraduate or graduate level.

The only part that could be of any use is the employer review and that has a weight ranging from 40 percent for accounting and politics to 10 percent for arts and social science subjects, like history and sociology.

But even if the rankings are to be used just to evaluate the quantity or quality of research, they are frankly of little use. They are dominated by the survey of academic opinion, which is not of professional quality.

There are several ways in which people can take part in the survey. They can be nominated by a university, they can sign up themselves, they can be recommended by a previous respondent or they can be asked because they have subscribed to an academic journal or an online database.

Apart from checking that they have a valid academic e-mail address, it is not clear whether QS makes any attempt to check whether the survey respondents are really qualified to make any judgements about research.

Not plausible

The result is that the academic survey and also the employer survey have produced results that do not appear plausible.

In recent years, there have been some odd results from QS surveys. My personal favorite is the New York University Tisch School of the Arts, which set up a branch in Singapore in 2007 and graduated its first batch of students from a three-year Film course in 2010. In the QS Asian University Rankings of that year, the Singapore branch got zero for the other criteria (presumably the school did not submit data) but it was ranked 149th in Asia for academic reputation and 114th for employer reputation.

Not bad for a school that had yet to produce any graduates when the survey was taken early in the year.

In all of the subject rankings this year, the two surveys account for at least half of the total weighting and, in two cases, Languages and English, all of it.

Consequently, while some of the results for some subjects may be quite reasonable for the world top 50 or the top 100, after that they are sometimes downright bizarre.

The problem is that although QS has a lot of respondents worldwide, when it gets down to the subject level there can be very few. In pharmacy, for example, there are only 672 for the academic survey and in materials science 146 for the employer survey. Since the leading global players will get a large share of the responses, this means that universities further down the list will be getting a handful of responses for the survey. The result is that the order of universities in any subject in a single country like the Philippines can be decided by just one or two responses to the surveys.
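A toy model shows how fragile the tail of such a list is when responses are scarce (invented universities and counts; QS's actual weighting of responses is not public):

```python
# Tail of a subject ranking, ordered by raw survey mentions.
tail = {"Univ A": 3, "Univ B": 2, "Univ C": 2}

def rank(votes):
    """Universities sorted by number of mentions, highest first."""
    return [u for u, _ in sorted(votes.items(), key=lambda kv: -kv[1])]

print(rank(tail))    # ['Univ A', 'Univ B', 'Univ C']
tail["Univ C"] += 2  # just two more respondents name Univ C
print(rank(tail))    # ['Univ C', 'Univ A', 'Univ B']
```

Two responses are enough to move a university from last to first in this national tail, which is exactly the scale of movement the subject rankings can report as real change.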

Another problem is that, after a few obvious choices like Harvard, MIT (Massachusetts Institute of Technology) and Tokyo, most respondents probably rely on a university's general reputation and that can lead to all sorts of distortions.

Many of the subject rankings at the country level are quite strange. Sometimes they even include universities that do not offer courses in that subject. We have already seen that there are universities in the Philippines that are ranked for subjects that they do not teach.

Somebody might say that maybe they are doing research in a subject while teaching in a department with a different name, such as an economic historian teaching in the economics department but publishing in history journals and getting picked up by the academic survey for history.

Maybe, but it would not be a good idea for someone who wants to study history to apply to that particular university.

Another example is from Saudi Arabia, where King Fahd University of Petroleum and Minerals was apparently top for history, even though it does not have a history department or indeed anything where you might expect to find a historian. There are several universities in Saudi Arabia that may not teach history very well but at least they do actually teach it.

These subject rankings may have a modest utility for students who can pick or choose among top global universities and need some idea whether they should study engineering at SUNY (State University of New York) Buffalo (New York) or Leicester (United Kingdom) or linguistics at Birmingham or Michigan.

But they are of very little use for anyone else.

Thursday, February 20, 2014

Changing Responses to the QS Academic Survey

QS have published an interactive map showing the percentage distribution of the 62,084 responses to its academic survey in 2013. These are shown in tabular form below. In brackets is the percentage of the 3,069 responses in 2007. The symbol -- means that the percentage response was below 0.5 in 2007 and not indicated by QS. There is no longer a link to the 2007 data but the numbers were recorded in a post on this blog on the 4th of December 2007.

The proportion of respondents from the USA rose substantially between 2007 and 2013. There were also increases for European countries such as the UK, Italy, Germany, France, Spain, Hungary, Russia, Netherlands and Portugal although there were declines for some smaller countries like Belgium, Denmark, Sweden and Switzerland.

The percentage of respondents from Japan and Taiwan rose but there were significant falls for India, China, Malaysia, Hong Kong, New Zealand, Australia, Singapore, Indonesia and the Philippines.

The most notable change is the growing number of responses from Latin America including Brazil, Mexico, Chile, Argentina and Colombia.

US   17.4   (10.0)
UK   6.5   (5.6)
Brazil   6.3  (1.1)
Italy  4.7    (3.3)
Germany   3.8 (3.0)
Canada   3.4 (4.0)
Australia   3.2  (3.5)
France   2.9    (2.4)
Japan   2.9    (1.9)
Spain   2.7    (2.3)
Mexico   2.6  (0.8)
Hungary   2.0   --
Russia 1.7   (0.7)
India 1.7   (3.5)
Chile  1.7     --
Ireland   1.6    (1.5)
Malaysia  1.5   (3.2)
Belgium 1.4  (2.6)
Hong Kong 1.4  (1.9)
Taiwan 1.3  (0.7)
Netherlands 1.2   (0.6)
New Zealand 1.2  (4.1)
Singapore 1.2  (2.5)
China 1.1   (1.6)
Portugal 1.1  (0.9)
Colombia 1.1   --
Argentina  1.0  (0.7)
South Africa 1.0   (0.7)
Denmark  0.9  (1.2)
Sweden  0.9  (1.7)
Kazakhstan  0.9   --
Israel 0.8   --
Switzerland  0.8  (1.5)
Austria 0.8  (1.3)
Romania 0.8  --
Turkey 0.7  (1.1)
Pakistan 0.7  --
Norway  0.6   --
Poland 0.6   (0.8)
Thailand 0.6   (0.6)
Finland 0.8   (0.5)
Greece 0.7  (0.7)
Ukraine 0.5   --
Indonesia   0.5  (1.2)
Czech 0.5   --
Peru 0.4   --
Slovenia 0.4   --
Saudi Arabia 0.4   --
Lithuania 0.4   --
Uruguay  0.3   --
Philippines 0.3   (1.8)
Bulgaria 0.3   --
UAE  0.3   --
Egypt 0.3   --
Paraguay  0.2   --
Jordan 0.2   --
Nigeria   0.2   --
Latvia 0.2   --
Venezuela  0.2   --
Estonia 0.2   --
Ecuador  0.2   --
Slovakia  0.2   --
Iraq 0.2   --
Jamaica 0.1   --
Azerbaijan 0.1   --
Iran 0.1  (0.7)
Palestine 0.1   --
Cyprus 0.1   --
Kuwait 0.1   --
Bahrain 0.1   --
Vietnam 0.1   --
Algeria 0.1   --
Puerto Rico 0.1   --
Costa Rica 0.1   --
Brunei 0.1   --
Panama 0.1   --
Taiwan 0.1   --
Sri Lanka 0.1   --
Oman  0.1   --
Iceland 0.1   --
Qatar 0.1   --
Bangladesh 0.1   --


The SIRIS Lab has some interesting visualizations of the THE and QS rankings for 2013 and the changing Shanghai Rankings from 2003 to 2013.

Be warned. They can get quite addictive.

Tuesday, February 18, 2014

The New Webometrics Rankings

The latest Webometrics rankings are out.

In the overall rankings the top five are:

1.  Harvard
2.  MIT
3.  Stanford
4.  Cornell
5.  Columbia.

Looking at the indicators one by one, the top five for presence (number of webpages in the main webdomain) are:

1.  Karolinska Institute
2.  National Taiwan University
3.  Harvard
4.  University of California San Francisco
5.  PRES Universite de Bordeaux.

The top five for impact (number of external inlinks received from third parties) are:

1.  University of California Berkeley
2.  MIT
3.  Harvard
4.  Stanford
5.  Cornell.

The top five for openness (number of rich files published in dedicated websites) are:

1.  University of California San Francisco
2.  Cornell
3.  Pennsylvania State University
4.  University of Kentucky
5.  University of Hong Kong.

The top five for excellence (number of papers in the 10% most cited category) are:

1.  Harvard
2.  Johns Hopkins
3.  Stanford
4.  UCLA
5.  Michigan

Saturday, February 08, 2014

The Triple Package

I have just finished reading The Triple Package by Amy Chua and Jed Rubenfeld, a heavily anecdotal book that tells us, as every reader of the New York Times now knows, what really determines success.

An irritating thing is the presentation of urban legends -- no dogs, no Cubans and so on -- and generalizations to support the authors' thesis.

Here is one example: "men like Alfred Kazin, Norman Mailer, Delmore Schwartz, Saul Bellow, Clement Greenberg, Norman Podhoretz, and so many of the New York intellectuals who grew up excluded from anti-Semitic bastions of education and culture but went on to become famous writers and critics".

Alfred Kazin went to City College of New York when it was a selective institution. Norman Mailer went to Harvard at the age of 16 and, after serving in the army, to the Sorbonne. Delmore Schwartz attended Columbia, the University of Wisconsin and New York University and did postgraduate work at Harvard with Alfred North Whitehead. Saul Bellow was at the University of Chicago and then Northwestern. He was also a postgraduate student at the University of Wisconsin. Clement Greenberg studied at Syracuse University. Norman Podhoretz was accepted by Harvard and NYU but went to Columbia, which offered him a full scholarship. He went to Cambridge on a Fulbright and was offered a fellowship at Harvard, which he turned down.

Bellow famously endured several anti-Semitic slights and sneers, and no doubt so did the others. But can we really say that they were excluded from bastions of education?


Thursday, February 06, 2014

The Best Universities for Research

It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings starting in August. We have had young university rankings, best student cities, most international universities and BRICS rankings.

Something is missing though: a ranking of top universities for research. So to assuage the pent-up demand, here are the top 20 universities for research according to six different ranking indicators. There is considerable variation, with only two universities, Harvard and Stanford, appearing in every list.

First the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five year period.

1.   Harvard
2.   Tokyo
3.   Toronto
4.   Tsinghua
5.   Sao Paulo
6.   Michigan Ann Arbor
7.   Johns Hopkins
8.   UCLA
9.   Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia

Next we have the normalized impact scores from Scimago, which measure citations to research publications taking account of field. This might be considered a measure of the quality of research rather than quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers.

1.   MIT
2.   Harvard
3.   University of California San Francisco
4=  Stanford
4=  Princeton
6.   Duke
7.   Rice
8.   Chicago
9=  Columbia
9=  University of California Berkeley
9=  University of California Santa Cruz
12.  University Of California Santa Barbara
13.  Boston University
14= Johns Hopkins
14= University of Pennsylvania
16.  University of California San Diego
17= UCLA
17= University of Washington
17= Washington University of St Louis
20.  Oxford

The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field so medical schools and technological institutes can do very well.

1.   Weizmann Institute of Science
2.   Caltech
3.   Rockefeller University
4.   Harvard
5.   Stanford
6.   Gwangju Institute of Science and Technology
7.   UCLA
8.   University of California San Francisco
9.   Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell

The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low impact scores. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, provided that total publications do not rise too much.

1=  MIT
1=  Tokyo Metropolitan University
3=  University of California Santa Cruz
3=  Rice
5.   Caltech
6.   Princeton
7.   University of California Santa Barbara
8.   University of California Berkeley
9=  Harvard
9=  Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14.  University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19.  Washington University of St Louis
20.  Boston College
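For readers who like to see the arithmetic, here is a rough Python sketch of field normalization plus a regional modification. The exact Thomson Reuters formulas are not public in full detail, so the baseline figures and the square-root form of the modification below are assumptions for illustration only.

```python
def normalized_impact(papers, world_baseline):
    """Mean citation impact: each paper's citations divided by the
    world average for its (field, year) cell, then averaged.

    papers: list of (field, year, citations) tuples.
    world_baseline: dict mapping (field, year) -> world average citations.
    """
    ratios = [c / world_baseline[(field, year)] for field, year, c in papers]
    return sum(ratios) / len(ratios)

def regional_modification(university_score, country_average):
    # Dividing by the square root of the country's average score boosts
    # universities in countries with generally low impact (assumed form).
    return university_score / country_average ** 0.5

baseline = {("physics", 2012): 10.0, ("sociology", 2012): 2.0}
papers = [("physics", 2012, 20.0), ("sociology", 2012, 1.0)]
score = normalized_impact(papers, baseline)  # (2.0 + 0.5) / 2 = 1.25
print(regional_modification(score, 0.25))    # 2.5: a low-scoring country is boosted
```

Note how the sociology paper with one citation counts for as much normalized impact as a physics paper with five: that is what "normalized by field" means in practice.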

The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year so this indicator could become less stable.

1.   Harvard
2.   Stanford
3.   MIT
4.   University of California Berkeley
5.   Princeton
6.   Michigan Ann Arbor
7.   University of California San Diego
8.   Yale
9.   University of Pennsylvania
10.   UCLA
11=  Caltech
11=  Columbia
13.   University of Washington
14.   Cornell
15.   Cambridge
16.   University of California San Francisco
17.   Chicago
18.   University of Wisconsin Madison
19.   University of Minnesota Twin Cities
20.   Oxford

Finally, the MNCS indicator from the Leiden Ranking, which is the number of field-normalized citations per paper. It is possible for a few widely cited papers in the right discipline to have a disproportionate effect. The high placing for Gottingen results from a single computer science paper that must be cited for intellectual property reasons.

1.    MIT
2.    Gottingen
3.    Princeton
4.    Caltech
5.    Stanford
6.    Rice
7.    University of California Santa Barbara
8.    University of California Berkeley
9.    Harvard
10.  University of California Santa Cruz
11.  EPF Lausanne
12.  Yale
13.  University of California San Francisco
14.  Chicago
15.  University of California San Diego
16.  Northwestern
17.  University of Colorado Boulder
18.  Columbia
19.  University of Texas Austin
20.  UCLA
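The disproportionate effect of a single heavily cited paper is easy to see in a toy MNCS calculation. The field averages below are invented, and the real Leiden indicator also normalizes by publication year and document type, but the arithmetic is the same.

```python
def mncs(papers, field_mean):
    """Mean normalized citation score: average over papers of
    citations divided by the world mean for the paper's field."""
    ratios = [c / field_mean[f] for f, c in papers]
    return sum(ratios) / len(ratios)

field_mean = {"computer science": 5.0, "chemistry": 10.0}

# Ten papers, one of them cited 500 times...
small_output = [("computer science", 500)] + [("chemistry", 10)] * 9
# ...versus one hundred solidly average papers.
large_output = [("chemistry", 10)] * 100

print(mncs(small_output, field_mean))  # 10.9: the one paper dominates
print(mncs(large_output, field_mean))  # 1.0: exactly the world average
```

A small publisher with one blockbuster paper beats a much larger, consistently average one by a factor of ten, which is exactly the Gottingen situation described above.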

Wednesday, February 05, 2014

Monday, February 03, 2014

India and the World Rankings

There is an excellent article in Asian Scientist by Prof Pushkar of BITS Pilani that questions the developing obsession in India with getting into the top 100 or 200 of the world rankings.

Prof Pushkar observes that Indian universities have never done well in global rankings. He says:

"there is no doubt that Indian universities need to play ‘catch up’ in order to place more higher education institutions in the top 400 or 500 in the world. It is particularly confounding that a nation which has sent a successful mission to Mars does not boast of one single institution in the top 100. “Not even one!” sounds like a real downer. Whether one considers the country a wannabe “major” power or an “emerging” power (or not), it is still surprising that India’s universities do not make the grade." 


"It is also rather curious that the “lost decades” of India’s higher education – the 1980s and the 1990s – coincided with a period when the country registered high rates of economic growth. The neglect of higher education finally ended when the National Knowledge Commission drew attention to a “quiet crisis” in its 2006 report."

Even so: 

"(d)espite everything that is wrong with India’s higher education, there is no reason for panic about the absence of its universities in the top 100 or 200. Higher education experts agree that the world rankings of universities are limited in terms of what they measure. Chasing world rankings may do little to improve the overall quality of higher education in the country."

He also refers to the proposal that the Indian Institutes of Technology should combine just for the rankings. Apparently he has been in touch with Phil Baty of THE who is not buying the idea.

I would disagree with Professor Pushkar's argument that combining universities would not be a good idea anyway because THE scales some indicators for size. That is true, but the reputation survey is not scaled, and adding votes in the survey would be beneficial for a combined institution, if one could be created and then accepted by the rankers. Also, you currently need 200 publications a year to be ranked by THE, so there would be a case for smaller places around the world -- although probably not the IITs -- banding together to get past this threshold.

Saturday, February 01, 2014

Recent Research: Rankings Matter

According to an article by Molly Alter and Randall Reback in Education Evaluation and Policy Analysis, universities in the USA get more applications if they receive high quality-of-life ratings and fewer if their peers are highly rated academically.

True for your school: How changing reputations alter demand for selective US colleges


There is a comprehensive literature documenting how colleges’ tuition, financial aid packages, and academic reputations influence students’ application and enrollment decisions. Far less is known about how quality-of-life reputations and peer institutions’ reputations affect these decisions. This article investigates these issues using data from two prominent college guidebook series to measure changes in reputations. We use information published annually by the Princeton Review—the best-selling college guidebook that formally categorizes colleges based on both academic and quality-of-life indicators—and the U.S. News and World Report—the most famous rankings of U.S. undergraduate programs. Our findings suggest that changes in academic and quality-of-life reputations affect the number of applications received by a college and the academic competitiveness and geographic diversity of the ensuing incoming freshman class. Colleges receive fewer applications when peer universities earn high academic ratings. However, unfavorable quality-of-life ratings for peers are followed by decreases in the college’s own application pool and the academic competitiveness of its incoming class. This suggests that potential applicants often begin their search process by shopping for groups of colleges where non-pecuniary benefits may be relatively high.

Friday, January 31, 2014

Department of Remarkable Coincidences

On the day that QS published their top 50 under-50 universities, Times Higher Education announced that it will be holding a Young Universities Summit in Miami in April, at which the top 100 universities under 50 will be revealed.

Also, the summit will see "a consultative discussion on proposed new rankings metrics designed to better capture innovation and knowledge transfer in world rankings in the future."

Innovation?  What could that mean? Maybe counting patents.

Knowledge transfer? Could this mean doing something about the citations indicator? Has someone at THE seen who contributed to multi-author massively cited publications in 2012?


QS Young Universities Rankings

QS have produced a ranking of universities founded in the last fifty years. It is based on data collected for last year's World University Rankings.

The top five are:

1.  Hong Kong University of Science and Technology
2.  Nanyang Technological University, Singapore
3.  Korea Advanced Institute of Science and Technology
4.  City University of Hong Kong
5.  Pohang University of Science and Technology, Korea

There are no universities from Russia or Mainland China on the list although there is one from Taiwan and another from Kazakhstan.

There are nine Australian universities in the top fifty.

Wednesday, January 29, 2014

The 25 Most International Universities

Times Higher Education has produced a succession of spin-offs from their World University Rankings: rankings of Asian universities, young universities and emerging economies universities, reputation rankings, a gender index.

Now there is a list of the world's most international universities, based on the international outlook indicator in the world rankings. This comprises data on international students, international faculty and international research collaboration.

The top five are:

1.   Ecole Polytechnique Federale de Lausanne
2=  Swiss Federal Institute of Technology Zurich
2=  University of Geneva
4.   National University of Singapore
5.   Royal Holloway, University of London

Sunday, January 19, 2014

A bright idea from India

Has someone in India been reading this blog?

In a previous post I suggested that universities might improve their scores in the world rankings by merging. That would help in the QS and THE reputation surveys and the publications indicator in the Shanghai rankings.

If he is being reported correctly, India's Higher Education Secretary Ashok Thakur proposes to go a step further and suggests that all the Indian Institutes of Technology should be assessed together by the international rankers, although presumably continuing to function separately in other respects. According to outlookindia:

"All the 13 IITs may compete as a single unit at the global level for a place among the best in the global ranking list.

Giving an indication in this regard, Higher Education Secretary Ashok Thakur said the idea is to position the IITs as a single unit much like the IIT brand which has become an entity in itself for finding a place among the top three best institutes the world-over.

International ranking agencies such as Times Higher Education and QS World University Ranking would be informed accordingly, he said.

Central universities and other institutes could follow on how the IITs position themselves in the ranking list, he said."

Both QS and THE seem eager to do business in India but this is surely a non-starter. Apart from anything else, it could be followed by all the University of California and other US state university campuses, branches of the National University of Ireland and the Indian Institutes of Science and Management coming together for ranking purposes.

Also, the Secretary should consider that if any IIT follows the lead of Panjab University and joins the Large Hadron Collider project or any other multi-contributor, multi-citation project, any gain in the THE citations indicator would be lost if it had to be shared with the other 12 institutes.

Tuesday, January 07, 2014

Explain Please

I have often noticed that some university administrators and educational bureaucrats are clueless about international university rankings, even when their careers depend on a good performance.

The Economic Times of India reports that the Higher Education Secretary in the Human Resource Development Ministry, Ashok Thakur, said "institutions could improve their scores dramatically in Times Higher Education's globally cited World University Rankings as the British magazine has agreed to develop and include India-specific parameters for assessment from the next time."

This sounds like THE is going to insert a new indicator just for India in their world rankings, which is unbelievable. The Hindu puts it a little differently, suggesting that THE is preparing a separate ranking of Indian universities:
"Times Higher Education (THE) — recognised world over for its ranking of higher education institutions — has agreed to draw up an India-specific indicator that would act as a parameter for global education stakeholders and international students to judge Indian educational institutions.
This was disclosed by Higher Education Secretary in the Union Human Resource Development Ministry Ashok Thakur."

It would be interesting to find out what the minister actually said and what, if anything, THE has agreed to.

Ranking News


The latest Times Higher Education international reputation rankings, based on data collected for last year's World University Rankings, will be announced in Tokyo on March 6th.

The number of responses was 10,536 in 2013, down from 16,639 in 2012 and 17,554 in 2011.

Why is the number of responses falling?

Is the decline linked with changes in the scores for the teaching and research indicators, both of which include components based on the survey?

Sunday, December 22, 2013

Twenty Ways to Rise in the Rankings Quickly and Fairly Painlessly

Times Higher Education has just republished an article by Amanda Goodall, ‘Top 20 ways to improve your world university ranking’.  Much of her advice is very sensible -- appointing university leaders with a strong research record, for example -- but in most cases the road from her suggestions to a perceptible improvement in the rankings is likely to be winding and very long. It is unlikely that any of her proposals would have much effect on the rankings in less than a decade or even two.

So here are 20 realistic proposals for a university wishing to join the rankings game.

Before starting, it is worth noting that any advice about how a university can rise in the rankings should be based on these principles.

·         Rankings are proliferating and no doubt there will be more in the future. There is something for almost anybody if you look carefully enough.

·         The indicators and methodology of the better known rankings are very different. Something that works with one may not work with another. It might even have a negative effect.

·         There is often a price to pay for getting ahead in the rankings. Everybody should consider whether it is worth it. Also, while rising from 300th place to 250th is quite easy, going from 30th to 25th is another matter.

·         Don’t forget the number on the bottom. It might be easier to reduce the number of academic staff than to increase the number of citations or publications.

·         Rankings are at best an approximation to what universities do. Nobody should get too excited about them.

The top 20 ways in which universities can quickly improve their positions in one or more of the international university rankings are:

1.  Get rid of students

Over the years many universities acquire a collection of branch campuses, general studies programmes, night schools, pre-degree programmes and so on. Set them free to become independent universities or colleges. Almost always, these places have relatively more students and relatively fewer faculty than the main campus. The university will therefore do better in the Quacquarelli Symonds (QS) and Times Higher Education (THE) faculty student ratio indicators.  Also, staff in the spun off branches and schools generally produce less research than those at the main campus so you will get a boost in the productivity per capita indicator in the Shanghai ARWU rankings.

2.  Kick out the old and bring in the young

Get rid of ageing professors, especially if they are unproductive and expensive, and hire lots of adjunct and temporary teachers and researchers. Again, this will improve the university's performance on the THE and QS faculty student ratio indicators. They will not count as senior faculty, so this will also be helpful for ARWU.

3.  Hire research assistants

Recruiting cheap or unpaid research assistants (unemployed or unemployable graduate interns?) will boost the score for faculty student ratio in the QS rankings, since QS counts research-only staff for its faculty student indicator. It will not, however, work for the THE rankings. Remember that for QS more faculty are good for faculty student ratio but bad for citations per faculty, so you have to analyse the potential trade-off carefully.

4.  Think about an exit option

If an emerging university wants to be included in the rankings it might be better to focus on just one of them. Panjab University is doing very well in the THE rankings but does not appear in the QS rankings. Remember that if you apply to be ranked by THE and you do not like your placing, it is always possible to opt out by not submitting data next year. QS, however, has a Hotel California policy: once in, you can check out but you can never leave. It does not matter how much you complain about the unique qualities of your institution and how they are neglected by the rankers; QS will go on ranking you whether you like it or not.

5. Get a medical school

If you do not have a medical school or a research and/or teaching hospital, then get one from somewhere. Merge with an existing one or start your own. If you have one, get another one. Medical research produces a disproportionate number of papers and citations, which is good for the QS citations per faculty indicator and the ARWU publications indicator. Remember that this strategy may not help with THE, which uses field normalisation: those citations of medical research will help there only if they are above the world average for field and year.

6. But if you are a medical school, diversify

QS and THE supposedly do not include single-subject institutions in their general rankings, although from time to time one will, like the University of California at San Francisco, Aston Business School or Moscow Engineering Physics Institute, slip through. If you are an independent medical or other single-subject institution, consider adding one or two more subjects; then QS and THE will count you, although you will probably start sliding down the ARWU table.

Update: the QS BRICS rankings include some Russian institutions that look like they focus on one field.

7. Amalgamate

The Shanghai rankings count the total number of publications in the SCI and SSCI, the total number of highly cited researchers and the total number of papers in Nature and Science, all without regard for the number of researchers. THE and QS count the number of votes in their surveys without considering the size of the institution.

What about a new mega-university formed by merging LSE, University College London and Imperial College? Or a très grande école formed from all those little grandes écoles around Paris?

8. Consider the weighting of the rankings

THE gives a 30% weighting to citations but only 2.5% to income from industry. QS gives 40% to its academic survey and just 5% to international faculty. So think about where you are going to spend your money.
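The arithmetic behind this advice is just a weighted sum of indicator scores. The weights below are the published THE 2013-14 weightings; the scaling of raw data onto a 0-100 range is omitted here for simplicity.

```python
# Published THE World University Rankings 2013-14 indicator weightings.
THE_WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international outlook": 0.075,
    "industry income": 0.025,
}

def overall_score(indicator_scores, weights=THE_WEIGHTS):
    """Weighted sum of 0-100 indicator scores."""
    return sum(weights[k] * indicator_scores[k] for k in weights)

# Ten extra points on citations move the total twelve times as far as
# ten extra points on industry income.
base = dict.fromkeys(THE_WEIGHTS, 50.0)
print(round(overall_score(base), 2))                              # 50.0
print(round(overall_score({**base, "citations": 60.0}), 2))       # 53.0
print(round(overall_score({**base, "industry income": 60.0}), 2)) # 50.25
```

The same exercise with the QS weights would tell you that a point on the academic survey is worth eight points on international faculty.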

9.  The wisdom of crowds

Focus on research projects in those fields that have huge multi-"author" publications: particle physics, astronomy and medicine, for example. Such publications often have very large numbers of citations. Even if your researchers make a one-in-two-thousandth contribution, Thomson Reuters, THE's data collector, will give them the same credit as they would get if they were the only authors. This will not work for the Leiden Ranking, which uses fractionalised counting of citations.

Note that this strategy works best when combined with number 10.
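The contrast between the two counting schemes can be shown with invented numbers for a multi-"author" physics paper:

```python
def full_counting_credit(citations, n_institutions):
    # THE-style full counting: every listed institution gets all the citations.
    return citations

def fractional_counting_credit(citations, n_institutions):
    # Leiden-style fractional counting: credit is divided among contributors.
    return citations / n_institutions

# A particle-physics paper with 2,000 contributing institutions and
# 1,000 citations (invented figures):
print(full_counting_credit(1000, 2000))        # 1000 per institution
print(fractional_counting_credit(1000, 2000))  # 0.5 per institution
```

Under full counting, signing on to one such paper is worth a thousand citations; under fractional counting it is worth half of one.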

10.  Do not produce too much

You need to produce 200 papers a year to be included in the THE rankings. But producing many more than that might be counterproductive. If your researchers are producing five thousand papers a year, then five hundred citations from a five-hundred-"author" report on the latest discovery in particle physics will not have much impact. But if you are publishing three hundred papers a year, those citations will make a very big difference. This is why Dr El Naschie's frequently cited papers in Chaos, Solitons and Fractals were a big boost for Alexandria University but not for the Cambridge, Surrey, Cornell and Frankfurt universities with which he also claimed affiliation.

However, Leiden will not rank universities until they reach 500 papers a year.
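The dilution arithmetic above is worth making explicit; the base rate of one citation per paper is invented for illustration.

```python
def average_citations(own_citations, windfall, n_papers):
    """Citations per paper after a single heavily cited
    multi-contributor paper (the "windfall") is added to the count."""
    return (own_citations + windfall) / n_papers

# The same 500-citation windfall, landing on top of one citation per paper:
print(average_citations(300, 500, 300))    # small publisher: ~2.67 per paper
print(average_citations(5000, 500, 5000))  # large publisher: 1.1 per paper
```

The small publisher's citation rate nearly triples; the large publisher's barely moves. Hence the Alexandria effect.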

11.  Moneyball Strategy

In his book Moneyball, Michael Lewis recounted the ascent of the Oakland As baseball team through a strategy of buying undervalued players. The idea was to find players who did things that led to their teams winning even if they did not match the stereotype of a talented player.

This strategy was applied by George Mason University in Virginia, which created a top basketball team by recruiting players who were overlooked by scouts because they were too small or too fat, and a top economics department by recruiting advocates of a market economy at a time when such an idea was unfashionable.

Universities could recruit researchers who are prolific and competent but are unpromotable or unemployable because they are in the wrong group or fail to subscribe enthusiastically to current academic orthodoxies. Maybe start with Mark Regnerus and Jason Richwine.

12. Expand doctoral programmes

One indicator in the THE world rankings is the ratio of doctoral to bachelor degree students.

Panjab University recently announced that it will introduce integrated master's and doctoral programmes. This could be a smart move if it means students no longer go into master's programmes but instead into something that can be counted as a doctoral degree programme.

13.  The importance of names

Make sure that your researchers know which university they are affiliated to and that they know its correct name. Make sure that branch campuses, research institutes and other autonomous or quasi- autonomous groups incorporate the university name in their publications. Keep an eye on Scopus and ISI and make sure they know what you are called. Be especially careful if you are an American state university.

14.   Evaluate staff according to criteria relevant to the rankings

If staff are appointed and promoted according to their collegiality, the enthusiasm with which they take part in ISO exercises, community service, their ability to make the faculty a pleasant place for everybody or their commitment to diversity, then you will get collegial, enthusiastic etc. faculty. But those are things that the rankers do not -- for once with good reason -- attempt to measure.

While you are about it, get rid of interviews for staff and students. Their predictive validity ranges from zero to low.

15.  Collaborate

The more authors a paper has, the more likely it is to be cited, even if only through self-citation. Also, the more collaborators you have, the greater the chances of a good score in the reputation surveys. And do not forget that the percentage of collaborators who are international is also an indicator in the THE rankings.

16. Rebrand

It would be good to have a name that is as distinctive and memorable as possible, so consider a name change. Do you really think that the average scientist filling out the QS or THE reputation surveys is going to remember which of the sixteen Indian Institutes of Technology is especially good in engineering?

17. Be proactive

Rankings are changing all the time so think about indicators that might be introduced in the near future. It would seem quite easy, for example, for rankers to collect data about patent applications.

18. Support your local independence movement

It has been known for a long time that increasing the number of international students and faculty is good for both the THE and QS rankings. But there are drawbacks to just importing students. If it is difficult to move students across borders, why not create new borders?

If Scotland votes for independence in next year’s referendum its scores for international students and international faculty in the QS and THE rankings would go up since English and Welsh students and staff would be counted as international.

19. Accept that some things will never work

Realise that there are some things that are quite pointless from a rankings perspective -- or any other, for that matter. Do not bother telling staff and students to click away at the website to get into Webometrics. Do not have motivational weekends. Do not have quality initiatives unless they get rid of the cats.

20.  Get Thee to an Island

Leiden Ranking has a little-known indicator that measures the geographical distance between collaborators. At the moment first place goes to the Australian National University. Move to Easter Island or the Falklands and you will be top for something.