Tuesday, April 01, 2014

Comparing the THE and QS Reputation Rankings

This year's Times Higher Education (THE) Reputation Rankings were a bit boring, at least at the top, and that is just what they should be.

The top ten are almost the same as last year. Harvard is still first and MIT is second. Tokyo has dropped out of the top ten to 11th place and has been replaced by Caltech. Stanford is up three places and is now third. Cambridge and Oxford are both down one place. Further down, there is some churning but it is difficult to see any clear and consistent trends, although the media have done their best to find stories: UK universities falling, sliding or slipping; no Indian, Irish or African universities in the top 100.

These rankings may be more interesting for who is not there than for who is. There are some notable absentees from the top 100. Last year Tokyo Metropolitan University was, according to THE and data providers Thomson Reuters (TR), first in the world, along with MIT, for research impact. Yet it fails to appear in the top 100 in a reputation survey in which research has a two-thirds weighting. Rice University, joint first in the world for research impact with Moscow State Engineering Physics Institute in 2012, is also absent. How is this possible? Am I missing something?

In general, the THE-TR reputation survey, the data collection for which was contracted out to the pollsters Ipsos MediaCT, appears to be quite rigorous and reliable. Survey forms were sent out to a clearly defined group, researchers with papers in the ISI indexes. THE claim that their respondents must therefore be active producers of academic research. That is stretching it a bit. Getting your name on an article published in a reputable journal might mean a high degree of academic competence, or it could just mean having some sort of influence over the research process. I have heard a report about an Asian university where researchers were urged to put their heads of department on the list of co-authors. Still, on balance it seems that the respondents to the THE survey are mostly from a stable group, namely those who have made some sort of contribution to a research paper of sufficient merit to be included in an academic journal.

TR also appear to have used a systematic approach in sending out the survey forms. When the first survey was being prepared in 2010 they announced that the forms would be emailed according to the number of researchers recorded by UNESCO in 2007. It is not clear if this procedure has been followed strictly over the last four years. Oceania, presumably Australia and New Zealand, appears to have a very large number of responses this year, 10%, although TR reported in 2010 that UNESCO found only 2.1% of the world's researchers in that region.
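If the allocation really did follow the UNESCO distribution, Oceania's share of responses ought to sit close to its share of researchers. A minimal sketch of the arithmetic, using only the two percentages quoted above:

```python
# Rough arithmetic on the Oceania figures quoted above. If responses
# tracked UNESCO's 2007 researcher shares, Oceania's share of responses
# should be close to its share of researchers.

unesco_share = 0.021    # Oceania's share of the world's researchers (UNESCO, 2007)
response_share = 0.10   # Oceania's reported share of responses this year

factor = response_share / unesco_share
print(f"Oceania is over-represented by a factor of about {factor:.1f}")
# -> roughly 4.8 times its expected share
```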

The number of responses received appears reasonably large although it has declined recently. In 2013 TR collected 10,536 responses, considerably fewer than the 16,639 collected in 2012, a drop of more than a third. Again, it is not clear what happened.

The number of responses from the various subject areas has changed somewhat. Since 2012 the proportion from the social sciences has gone from 19% to 22%, as has that from engineering and technology, while the share from the life sciences has gone from 16% to 22%.

QS do not publish a separate reputation ranking, but it is possible to filter their published ranking scores to find out how universities performed on the academic survey indicator alone.
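A minimal sketch of that kind of filtering, assuming a hypothetical CSV export of QS indicator scores (the file and column names here are illustrative, not QS's actual format):

```python
import pandas as pd

# Hypothetical export of QS indicator scores; file and column names are assumptions.
scores = pd.read_csv("qs_2013_indicators.csv")

# Rank universities by the academic reputation indicator alone,
# ignoring the overall weighted score.
reputation = scores.sort_values("academic_reputation", ascending=False)
print(reputation[["university", "academic_reputation"]].head(100))
```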

The QS approach is less systematic. They started out using the subscription lists of World Scientific, a Singapore-based academic publishing company with links to Imperial College London. Then they added respondents from Mardev, a publisher of academic lists, to beef up the number of names in the humanities. Since then the balance has shifted, with more names coming from Mardev and some topping up from World Scientific. QS have also added a sign-up facility through which people can apply to receive survey forms. That was suspended in April 2013 but has recently been revived. They have also asked universities to submit lists of potential respondents and asked respondents to suggest further names. The exact number of responses coming from all these different sources is not known.

Over the last few years QS have made their survey rather more rigorous. First, respondents were not allowed to vote for the universities where they were currently employed. They were restricted to one response per computer, and universities were not allowed to solicit votes or instruct staff who to vote for or not to vote for. Then universities were told not to promote any form of participation in the surveys.
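The first two of those rules are mechanical enough to apply as simple filters. A minimal sketch, with entirely hypothetical file and field names (QS do not release response-level data), of how self-votes and duplicate machines might be screened out:

```python
import pandas as pd

# Hypothetical response file; the file and field names are assumptions.
responses = pd.read_csv("qs_survey_responses.csv")

# Rule 1: discard votes cast for the respondent's own employer.
responses = responses[responses["employer"] != responses["nominated_university"]]

# Rule 2: keep only the first response recorded from each computer.
responses = responses.drop_duplicates(subset="machine_id", keep="first")
```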

In addition to methodological changes, the proportion of responses from different countries has changed significantly since 2007, with a large increase from Latin America, especially Brazil and Mexico, the USA and the larger European countries, and a fall in those from India, China and the Asia-Pacific region. All of this means that it is very difficult to figure out whether the rise or fall of a university reflects a change in methodology or the distribution of responses or a genuine shift in international reputation.
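In principle, the compositional effect could be separated from the reputational one by reweighting each year's responses back to a fixed baseline of country shares before counting votes. A minimal sketch of that idea, with entirely hypothetical data, since neither QS nor THE-TR release response-level files:

```python
import pandas as pd

# Hypothetical response-level data: one row per vote, with the
# respondent's country and the university they nominated.
responses = pd.read_csv("survey_responses.csv")  # columns: country, nominated

# Baseline country shares, e.g. the 2007 distribution (values illustrative).
baseline = {"USA": 0.20, "China": 0.10, "India": 0.08, "Brazil": 0.04}

observed = responses["country"].value_counts(normalize=True)

# Weight each response so the overall country mix matches the baseline.
responses["weight"] = responses["country"].map(
    lambda c: baseline.get(c, 0.0) / observed.get(c, 1.0)
)

# Weighted vote totals are then comparable from one year to the next.
weighted_votes = responses.groupby("nominated")["weight"].sum()
print(weighted_votes.sort_values(ascending=False).head(20))
```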

Comparing the THE-TR and QS surveys, there is some overlap at the top. The top five are the same in both, although in a different order: Harvard, MIT, Stanford, Oxford and Cambridge.

After that, we find that the QS academic survey favours universities in Asia-Pacific and Latin America. Tokyo is seventh according to QS but THE-TR have it in 11th place. Peking is 19th for QS and 41st for THE-TR. Sao Paulo is 51st on the QS indicator but is in the 81-90 band in the THE-TR rankings. The National Autonomous University of Mexico (UNAM) is not even in THE-TR's top 100, but QS put it 48th.

On the other hand, Caltech, Moscow State University, Seoul National University and Middle East Technical University do much better with THE-TR than with QS.

I suspect that the QS survey is tapping a younger, less experienced pool of respondents from less regarded universities and from countries with high aspirations but so far limited achievements.
