Times Higher Education (THE) has just announced that American university rankings are not fit for purpose.
We have heard that before. In 2009 THE said the same thing about the world rankings that they had published in partnership with the consulting firm Quacquarelli Symonds (QS) since 2004.
The subsequent history of THE's international rankings provides little evidence that the magazine is qualified to make such a claim.
The 2009 announcement was followed by months of consultation with all sorts of experts and organisations. In the end the world rankings of 2010, powered by data from Thomson Reuters (TR), were not quite what anyone had expected. There was an increased dependence on self-submitted data, a reduced but still large emphasis on subjective surveys, and four different measures of income, reduced to three in 2011. Altogether there were 14 indicators, reduced to 13 in 2011, all but two of which were bundled into three super-indicators, making it difficult for anyone to figure out exactly why any institution was rising or falling.
There were also some extraordinary elements in the 2010 rankings, the most obvious of which was placing Alexandria University in 4th place in the world for research impact.
The rankings received a chorus of criticism mixed with some faint praise for trying hard. Philip Altbach of Boston College summed up the whole affair pretty well: “Some of the rankings are clearly inaccurate. Why do Bilkent University in Turkey and the Hong Kong Baptist University rank ahead of Michigan State University, the University of Stockholm, or Leiden University in Holland? Why is Alexandria University ranked at all in the top 200? These anomalies, and others, simply do not pass the smell test.”
THE and TR returned to the drawing board. They did some tweaking here and there and in 2011 got Alexandria University out of the top 200, although more oddities would follow over the next few years, usually associated with the citations indicator. Tokyo Metropolitan University, Cadi Ayyad University of Marrakech, Federico Santa María Technical University, Middle East Technical University and the University of the Andes were at one point or another declared world class for research impact across the full range of disciplines.
Eventually the anomalies became too much, and after breaking with TR in 2015 THE decided to do some spring cleaning and tidy things up.
For many universities and countries the results of the 2015 methodological changes were catastrophic. There was a massive churning, with universities going up and down the tables. Université Paris-Sud, the Korea Advanced Institute of Science and Technology, Boğaziçi University and Middle East Technical University fell scores of places.
THE claimed that this was an improvement. If it was, then the previous editions must have been hopelessly inadequate. But if the previous rankings were the gold standard, then those methodological changes were surely nothing but gratuitous vandalism.
THE has also ventured into faraway regions with snapshot or pilot rankings. The Middle East was treated to a ranking with a single indicator that put Texas A&M University at Qatar, a branch campus housing a single faculty, in first place. For Africa there was a ranking consisting of data extracted from the world rankings without any modification of the indicators, which did not seem to impress anyone.
So one wonders where THE got the chutzpah to tell the Americans that their rankings are not fit for purpose. After all, US News had been doing rankings for two decades before THE, and its America's Best Colleges includes metrics on retention and reputation as well as resources and selectivity. Also, there are now several rankings that already deal directly with the concerns raised by THE.
The Forbes/CCAP rankings include measures of student satisfaction, student indebtedness, on-time graduation, and career success.
The Brookings Institution has a value-added ranking that includes data from the College Scorecard.
The Economist has produced a very interesting ranking that compares graduates' expected and actual earnings to estimate value added.
So exactly what is THE proposing to do?
It seems that there will be a student engagement survey, which apparently will be launched this week and will cover 1,000 institutions. They will also use data on cost, graduation rates and salaries from the Integrated Postsecondary Education Data System (IPEDS) and the College Scorecard. Presumably they are looking for some way of monetising all of this, so large chunks of the data will probably only be revealed as part of benchmarking or consultancy packages.
I suspect that the new rankings will look something like the Guardian university league tables just published in the UK, but much bigger.
The Guardian rankings include measures of student satisfaction, selectivity, spending, staff-student ratio and value added. The latter compares students' entry qualifications with the proportion who go on to get good degrees (a first or an upper second).
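To make that idea concrete, here is a minimal sketch of how a value-added score of that general kind could be computed. This is not the Guardian's actual methodology (which models individual students and uses its own weightings); the institutions and figures below are invented purely for illustration.

```python
# Illustrative sketch only: compare each institution's actual share of
# "good degrees" with the share predicted from its average entry tariff,
# and treat the residual as a crude value-added score.
from dataclasses import dataclass

@dataclass
class Institution:
    name: str
    entry_tariff: float       # average entry qualification points (hypothetical)
    good_degree_share: float  # share of graduates with a first or upper second

institutions = [
    Institution("Alpha", 180, 0.85),
    Institution("Beta", 140, 0.72),
    Institution("Gamma", 120, 0.70),
    Institution("Delta", 100, 0.55),
]

def value_added_scores(data):
    """Least-squares fit of good-degree share on entry tariff; the residual
    (actual minus predicted) serves as the value-added score."""
    n = len(data)
    mean_x = sum(i.entry_tariff for i in data) / n
    mean_y = sum(i.good_degree_share for i in data) / n
    cov = sum((i.entry_tariff - mean_x) * (i.good_degree_share - mean_y) for i in data)
    var = sum((i.entry_tariff - mean_x) ** 2 for i in data)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return {i.name: i.good_degree_share - (intercept + slope * i.entry_tariff)
            for i in data}

if __name__ == "__main__":
    for name, score in sorted(value_added_scores(institutions).items(),
                              key=lambda kv: kv[1], reverse=True):
        print(f"{name}: {score:+.3f}")
```

On this toy measure an institution scores well if its students do better than their entry qualifications would predict, which is the intuition behind value-added tables generally.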
It seems that THE are planning something different from the research-centred, industry-orientated university rankings that they have been doing so far and are venturing out into new territory: institutions that are two or three tiers below the elite and do little or no research.
There could be a market for this kind of ranking, but it is very far from certain that THE are capable of doing it or that it would be financially feasible.