Something Needs Explaining
QS Quacquarelli Symonds have published additional information on their web site concerning the selection of the initial list of universities and the administration of the "peer review". I would like to focus on just one issue for the moment, namely the response rate to the e-mail survey. Ben Sowter of QS had already claimed to have surveyed more than 190,000 academics to produce the review. He had said:
"Peer Review: Over 190,000 academics were emailed a request to complete our online survey this year. Over 1600 responded - contributing to our response universe of 3,703 unique responses in the last three years. Previous respondents are given the opportunity to update their response." (THES-QS World University Rankings _ Methodology)
This is a response rate of about 0.8%, less than 1%. I had assumed that the figure of 190,000 was a typographical error and that it should have been 1,900. A response rate of over 80% would have been on the high side, but perhaps respondents were highly motivated by being included in the ranks of "smart people" or by the chance of winning a BlackBerry organiser.
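For anyone who wants to check the arithmetic, here is a minimal sketch in Python using only the figures quoted above; the e-mail counts are QS's own claims, not independently verified data.

```python
# Response rate implied by QS's own figures (quoted above).
emails_sent = 190_000   # academics e-mailed, per Ben Sowter
responses = 1_600       # responses received this year

print(f"Reported rate: {responses / emails_sent:.2%}")  # ~0.84%

# If 190,000 had been a typo for 1,900, the rate would instead be:
hypothetical_sent = 1_900
print(f"Hypothetical rate: {responses / hypothetical_sent:.2%}")  # ~84%
```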
However, the new information provided appears to suggest that QS did indeed survey such a large number.
"So, each year, phase one of the peer review exercise is to invite all previous reviewers to return and update their opinion. Then we purchase two databases, one of 180,000 international academics from the World Scientific (based in Singapore) and another of around 12,000 from Mardev - focused mainly on Arts & humanities which is poorly represented in the former.
We examine the responses carefully and discard any test responses and bad responses and look for any non-academic responses that may have crept in." (Methodology: The Peer Review)
There is a gap between "we purchase" and "we examine the responses", but the implication is that about 192,000 academics (180,000 plus 12,000) were sent e-mails.
If this is the case then we have an extraordinarily low response rate, probably a record in the history of survey research. Kim Sheehan, in an article in the Journal of Computer-Mediated Communication, reports that 31 studies of e-mail surveys show a mean response rate of about 37%. Response rates have been declining in recent years, but even in 2004 the mean response rate was about 24%.
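To put that gap in numbers, a short calculation, again using only figures cited in this post, shows how far the QS figure falls below even the declining benchmarks:

```python
# Comparing the implied QS response rate with the benchmarks cited above.
qs_rate = 1_600 / 192_000   # responses over e-mails apparently sent
mean_overall = 0.37         # mean across 31 e-mail survey studies (Sheehan)
mean_2004 = 0.24            # mean response rate reported for 2004

print(f"QS implied rate: {qs_rate:.2%}")                          # ~0.83%
print(f"Factor below the 2004 mean: {mean_2004 / qs_rate:.0f}x")  # roughly 29x
```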
Either QS did not send out so many e-mails, or there was something wrong with the database, or something else went wrong. Whatever the cause, such a low response rate is in itself enough to render a survey invalid. An explanation is needed.
Thanks for the incredibly interesting stats, Richard!!
QS really needs to be denounced for their poor research skills and lack of integrity.
A response rate of 0.85% is a monstrosity considering the population of research academics in the world, and it would indeed render the results of the survey extremely invalid. Despite this, they still proceeded to report the results.
The impact that the THES-QS Rankings have had on Australian universities has been really significant. In the most recent survey, the ANU comes in at no. 16, which has led the university to proclaim itself the no. 1 university in Australia (despite the fact that the ANU does not attract the top high school students in the country, nor does it have an extremely reputable medical, law, business or engineering school).
However, the real question is one that no one has really been able to answer: how does the ANU (founded in 1947) manage to significantly outrank American research powerhouses such as UCLA, Carnegie Mellon, Georgetown and Dartmouth?
I'd love to get a chance to analyse the raw data myself and reveal all the biases and flaws in the methods used.
I think it's about time someone of social or academic prominence stepped up to the plate to officially denounce the THES-QS Rankings... The best way is for universities to ban academics from responding.
John.