Saturday, December 12, 2009

Whither the QS Rankings?

While Times Higher Education is looking around for a new methodology, QS — judging from a recent conversation with Ben Sowter and Tony Martin, and from comments on its website — appears set on continuing with the old system, perhaps with a bit of tweaking.

The need to maintain some sort of continuity is understandable, especially after the yo-yoing of some universities in recent editions of the THE-QS rankings. However, criticism of the rankings is such that it would seem a good idea to seize the opportunity to make some simple changes.

The least liked element of the THE-QS rankings of 2004-09 was the "peer review". Because it was based on the mailing lists of a Singapore-based publishing company with links to Imperial College London, it had an obvious geographical bias. The declared response rate was too low to meet conventional standards of validity. Its weighting was too high. And as a survey of research expertise it was largely redundant, since citations are a far better measure of research impact and quality.

Furthermore, the "peer review" added to the overemphasis on research. The THE-QS rankings gave a 20% weighting to citations, the faculty-student ratio gave a big and obvious boost to universities with large numbers of non-teaching, research-only faculty, and then there was a further 40% for a research-based survey.

I would like to suggest a simple change. Keep the survey of academic opinion (and stop calling it a peer review, because it is nothing of the sort) but use it to assess the general excellence or reputation of universities, perhaps including teaching and student satisfaction. It is not credible that anyone with a functioning mouse who signs up for the World Scientific list thereby becomes competent to assess the research performance of universities, but he or she might well have some idea of the general reputation of institutions. This would require minimal changes to the current procedure: all that is needed is to change the questions.

A couple of other refinements might be in order. The division of the academic world into three super-regions for weighting purposes is too crude: Latin America, Africa, Southwest Asia and Southeast Asia deserve to be treated as separate regions.

Telling everybody that you have sent 180,000 e-mails is asking for trouble if you then get a negligible response. It would be better to use the World Scientific lists to build a pool of people willing to participate in the survey, combine it with names collected at various events, and then send out the survey. If nothing else, the response rate would be a little more respectable.

2 comments:

  1. Richard, we have some exciting news. The "peer review" element for the 2010 Times Higher Education world university rankings will be carried out on our behalf by the leading research company, Ipsos MORI. We are aiming for a much larger sample than achieved by the old THE-QS ranking, with some serious social science behind the activity. My commentary on this is here: http://bit.ly/7GX4CZ
    Thanks.

    ReplyDelete
  2. Looks like the THE ranking will keep some "peer review" in its criteria...
    http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=409595&c=1

    ReplyDelete