Maybe I'll get my five minutes of fame for being first with a Dylan quotation. I was a bit slow because, unlike Jonah Lehrer, I wanted to check that the quotation actually exists.
Times Higher Education (THE) have announced that they will be introducing reforms to their World University Rankings and ending their partnership with the media and data giant Thomson Reuters (TR).
Exactly why is not stated. It could be rooted in a financial disagreement. Maybe THE feels betrayed because TR let US News use the reputation survey for their new Best Global Universities rankings. Perhaps THE got fed up with explaining why places like Bogazici University, Federico Santa Maria Technical University and Royal Holloway were world-beaters for research impact, outshining Yale, Oxford and Cambridge.
The reputation survey will now be administered by THE itself in cooperation with Elsevier and will make use of the Scopus database. A new THE team will collect institutional data from universities, the Scopus database and the SciVal analysis tool.
The coming months will reveal what THE have in store, but for now here is a list of recommendations. No doubt there will be many more from all sorts of people.
Display each indicator separately instead of lumping them together into Teaching, Research and International Outlook. It is impossible to work out exactly what is causing a rise or fall in the rankings unless they are separated.
Try to find some way of reducing the volatility of the reputation survey. US News do this by using a five-year average and QS by rolling over unchanged responses for a further two years.
Consider including questions about undergraduate teaching or doing another survey to assess student satisfaction.
Reduce the weighting of the citations indicator and use more than one measure of citations to assess research quality (citations per paper), faculty quality (citations per faculty) and research impact (total citations). Use field normalisation, but sparingly and sensibly, and forget about that regional modification. A rough sketch of these measures follows this list.
Drop the Industry Income: Innovation indicator. It is unfair to liberal arts colleges and private universities and too dependent on input from institutions. Think about using patents instead.
Income is an input. Do not use it unless it is to assess the efficiency of universities in producing research or graduates.
Consider dropping the international students indicator or at least reducing its weighting. It is too dependent on geography and encourages all sorts of immigration scams.
Benchmark scores against the means of a constant number of institutions. If you do not, the mean indicator scores will fluctuate from year to year, causing all sorts of distortions. The second sketch after this list illustrates the idea.
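To make the citations recommendation concrete, here is a minimal Python sketch of the three separate measures plus a simple field normalisation. Every number (the publication records, the faculty headcount and the field baselines) is invented for illustration; this is not how THE, Thomson Reuters or Elsevier actually calculate anything.

# Hypothetical publication records for one university: (field, citations)
papers = [
    ("medicine", 42), ("medicine", 7), ("physics", 120),
    ("physics", 3), ("economics", 15),
]
faculty_count = 2500  # assumed headcount, used for the per-faculty measure
# Assumed world-average citations per paper in each field
field_average = {"medicine": 20.0, "physics": 15.0, "economics": 8.0}

total_citations = sum(c for _, c in papers)               # research impact
citations_per_paper = total_citations / len(papers)       # research quality
citations_per_faculty = total_citations / faculty_count   # faculty quality

# Field-normalised impact: each paper's citations divided by its field's
# world average, then averaged; 1.0 means "world average for the field".
normalised_impact = sum(c / field_average[f] for f, c in papers) / len(papers)

print(total_citations, round(citations_per_paper, 1),
      round(citations_per_faculty, 3), round(normalised_impact, 2))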
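In the same spirit, here is a sketch of benchmarking against a constant group of institutions: raw scores are standardised against the mean and standard deviation of a fixed benchmark set rather than against whatever population happens to be ranked that year, so the scale does not drift when universities enter or leave the ranking. All scores below are invented.

from statistics import mean, stdev

# Raw indicator scores for a benchmark group that is kept constant every year
benchmark_scores = [55.0, 62.0, 48.0, 71.0, 66.0, 59.0, 44.0, 80.0]
bench_mean, bench_sd = mean(benchmark_scores), stdev(benchmark_scores)

def benchmarked(score):
    """Standardise a raw indicator score against the constant benchmark group."""
    return (score - bench_mean) / bench_sd

# Every ranked institution is scored on the same fixed scale, however many
# universities join or leave the ranking in a given year.
for name, raw in [("University A", 74.0), ("University B", 51.0)]:
    print(name, round(benchmarked(raw), 2))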
2 comments:
Interesting development here!
With regard to the citations indicators, they should begin to fractionalize the citations on each paper, and they should consider using the new excellence indicator (PP(top 10%)) instead of the normalized impact (crown indicator). The normalized impact (crown indicator) can be skewed if a university has a few highly cited papers.
THE's next ranking will clearly be substantially different from the previous one. So THE should point out how much of the change in institutions' ranks is due to changes in method.
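To illustrate the distinction the commenter above draws, here is a small sketch of fractional counting, a mean-normalised ("crown"-style) impact score and a PP(top 10%) share. The citation counts, collaboration numbers and top-10% threshold are all invented; the point is only that one very highly cited paper dominates the mean-based score but counts just once in the top-10% share.

# (citations, number of collaborating institutions, assumed field-average citations)
papers = [(400, 4, 10.0), (6, 1, 10.0), (4, 2, 10.0), (2, 1, 10.0)]
TOP10_THRESHOLD = 25.0  # assumed citation count marking the field's top 10%

# Fractional counting: a paper shared by n institutions counts 1/n for each.
fractional_output = sum(1 / n for _, n, _ in papers)

# Mean-normalised impact: dominated by the single 400-citation paper.
crown = sum(c / avg for c, _, avg in papers) / len(papers)

# PP(top 10%): share of papers above the threshold; the outlier counts only once.
pp_top10 = sum(c >= TOP10_THRESHOLD for c, _, _ in papers) / len(papers)

print(round(fractional_output, 2), round(crown, 1), pp_top10)  # 2.75, 10.3, 0.25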