A notorious feature of the THE-QS rankings was their over-valuation of British and Australian universities. It would seem that Times Higher and Thomson Reuters are not really bothered by this. Indeed it looks like they are set on course to add to this bias in their new rankings, at least as far as British universities are concerned. An opinion piece by Jonathan Adams, the Director of Research Evaluation at Thomson Reuters, echoes previous comments in THE by lamenting the maltreatment of the London School of Economics in the old league table.
"The London School of Economics is generally agreed to be an outstanding institution globally. But how can we judge that? A lot of people would like to study there. If you wanted an informed opinion, you would consult the people who work there. A lot of people who have been there have gone on to great things. These are good indicators that the place is intellectually vibrant and delivers excellent teaching, and those values are endorsed internationally.
Good, but not perfect. Three major problems spring to mind. First, that quick summary tells us there are many ways in which we may value what a university does. It is a knowledge business and a source for teaching, research and dissemination to users. Second, the LSE is a specialist. Its astronomy is weak, so we need to consider subject portfolio. And, third, what will we measure? I need an informed expert to confirm my judgment, but as I can't send my expert to every institution, I need a proxy indicator (not a "metric": an indicator).
Our view of the LSE does not translate readily into anything useful unless we are careful and we make sure our information is appropriate. The LSE stood at only 67th in the last Times Higher Education-QS World University Rankings - some mistake surely? Yes, and quite a big one. LSE academics publish papers in social and economic sciences, which have lower citation rates than the natural sciences; so on the simple "citations per paper" used by QS in analysing the Scopus publications data, it slipped way down the list. Not a good way of comparing it with nearby King's College London, which has a huge medical school.
We need a lot more information than has typically been gathered before we can build an even halfway sensible picture of what a university is doing."
The problem with this is that many institutions that scored lower than LSE in the rankings are also agreed, by some people somewhere, to be outstanding. The “good indicators” raise more questions. A lot of people want to study at LSE. Is that because of its intrinsic merits or shrewd marketing? And who is the "you" who would consult the LSE? A lot of its alumni and alumnae have done great things? No doubt many have become MPs, civil servants, university administrators and CEOs, but given the current moral condition of British politics and the performance of the British and European economies, that might not be something to be proud of.
It is difficult to concur with the claim that LSE has been treated unfairly in previous rankings. In 2009 they were number five for social sciences and 32nd for arts and humanities. They got top marks for international faculty and international students and in the employer review. They did somewhat less well in the academic survey, which had a disproportionate number of respondents from Britain and Commonwealth countries with large numbers of British alumni and alumnae, but that is surely to be expected when LSE excels in a very limited range of disciplines.
LSE also did badly in the citations per faculty indicator (not citations per paper – QS used that for their Asian rankings, not the world rankings), partly because it is a specialist social science institution and it is conventional in the social sciences to produce fewer papers and to cite them less frequently, but also because LSE does not actually produce as much social science research, as measured by Scopus and ISI publications, as general institutions such as the Universities of Manchester, Birmingham, Harvard, Yale, Chicago, Toronto, Melbourne and Sao Paulo.
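The distinction between the two indicators matters arithmetically: a specialist school with a small faculty can do well on one and badly on the other. A minimal sketch, using entirely hypothetical figures (not real data for LSE or any other institution):

```python
# Hypothetical figures for illustration only: a small specialist social
# science school vs. a large general university with a medical school.
institutions = {
    "Specialist": {"papers": 800, "citations": 4000, "faculty": 500},
    "General": {"papers": 9000, "citations": 90000, "faculty": 4000},
}

for name, d in institutions.items():
    per_paper = d["citations"] / d["papers"]    # indicator QS used in its Asian rankings
    per_faculty = d["citations"] / d["faculty"]  # indicator QS used in the world rankings
    print(f"{name}: {per_paper:.1f} cites/paper, {per_faculty:.1f} cites/faculty")
```

On these invented numbers the specialist trails on both counts (5.0 vs 10.0 per paper, 8.0 vs 22.5 per faculty), simply because the hard sciences generate more papers and more citations per paper; neither figure says anything about the quality of the work.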
It is difficult to think of changes in the structure or content of the rankings that would benefit LSE but not a host of others. Giving extra weighting to social science publications is an excellent idea and would boost LSE relative to King’s College or Imperial College (I wonder if THE is prepared to let Imperial slip a few places) but it would probably help US state universities and European universities even more. Counting “contributions to society”, such as sitting on committees and commissions and boards of directors, would help LSE a bit but might well help Japanese universities and the French grandes écoles a lot more.
LSE is a narrowly based specialist institution and QS gave it as much as or more than it deserved by ranking it highly in the social science and arts and humanities categories and putting it in the top 100 in the general world rankings. It is good at what it does but it does not do all that much. It would be a shame if the rankings are going to be restructured to promote it beyond its real merits.
The other item is Rick Trainor’s review of Robert Zemsky’s book Making Reform Work. In the course of his review Trainor, who is president of King’s College London, says that:
"Most fundamentally, while the US debate is premised on a clear and widespread belief in the great, if imperilled, merits of the US system, British opinion often pays too little attention to the successes of UK universities, even in comparison with their US counterparts. For example, British commentators often overlook UK universities' superior completion rates, the greater rigour concerning undergraduate assessment inherent in the existence of an external examiner system, their greater ability (allowing for the much greater size of the US population and its university system) to attract overseas students and, as suggested by the Sainsbury report, their arguably superior record in commercialisation.
Of course, this is not to suggest that the UK higher education system is perfect, any more than US universities are. Nonetheless, there has been too little recognition in the UK of its high international research standing (aided by rises in public investment in recent years), despite persisting American strength and rapidly rising competition from countries such as China and India. Likewise, the UK system receives too little credit domestically for its success in protecting standards despite the huge increase in UK student numbers during the past 25 years. Similarly too few observers on this side of the Atlantic have learned one of the basic lessons propounded by Zemsky: that outstanding achievement in higher education depends on adequate resources - for teaching (which was substantially underfunded, even before the UK's public expenditure crisis began) as well as for research. "
This is a rather odd set of claims. Superior completion rates? I wonder how that happened. Greater rigour because of the external examiner system? Really? Do British universities still have a high international research standing? Just look at their performance on the Shanghai rankings after removing the cushion of the thirty per cent weighting for Nobel and Fields laureates. Have standards really been protected? Would more money make any difference?
It is beginning to look as though an implicit consensus is developing in the British higher educational establishment that the rankings should reflect its self-serving view of the merits of British higher education and that they have an important role to play in fending off the economic crisis. It appears that THE and Thomson Reuters are only too happy to oblige.
6 comments:
Richard,
You appear to be contradicting yourself.
You say: "A notorious feature of the THE-QS rankings was its over-valuation of British and Australian universities. It would seem that Times Higher and Thomson Reuters are not really bothered by this".
You leap to that conclusion on the basis that we picked out the LSE as an example to explain how we would change the way we measure research excellence in order to stop penalising universities which specialise in the social sciences in favour of institutions with strong specialisms in the hard sciences.
But you then say:
"It is difficult to think of changes in the structure or content of the rankings that would benefit LSE but not a host of others. Giving extra weighting to social science publications is an excellent idea and would boost LSE... but it would probably help US state universities and European universities even more."
So on the one hand you allege that we are not addressing concerns about the over-valuation of British universities, and then you concede that one of our planned improvements to the rankings methodology will help US and European universities.
It is starting to look as though you have a chip on your shoulder!
Best wishes,
Phil Baty
Editor, Times Higher Education World University Rankings
I was pointing to the contradiction between the desire to reward LSE for its excellence in research in citation-limited fields and the probable consequences of giving extra weight to social science publications, which may well benefit institutions with a diverse research profile as much as or more than it benefits LSE. The suggested changes to the citations indicator are overdue and welcome, but they may not have the effect that seems to be expected. I was not conceding that such a change would benefit US and European universities but giving a warning that this might happen.
The bottom line of any change is that LSE does not offer the full range of excellence that is expected of a world-class university and therefore it should not be expected to perform well in any general ranking.
Suspicions of bias will be alleviated if you provide examples of non-British institutions that you feel were penalised by the old rankings.
Thanks Richard.
LSE tends to get used as an example to explain some of the problems with the old QS methodology, as it is an extreme case. But there are many others in other countries that will perhaps have a better showing in the new improved 2010 world university rankings.
The point is that we want to improve on the old QS methodology that we have now rejected, to remove clear bias in favour of the hard sciences. We all know that a world-class journal article in the humanities will not have the same volume of citations as similarly world-class work in other fields. What we are suggesting is that when using citations as a proxy for research excellence, we will "normalise" by subject, to take into account very different citations habits and volumes in different disciplines. We do this to make the world university rankings more balanced and sophisticated, not to benefit any individual institutions. As journalists working in the academic community, we are interested in getting as true a picture as possible, not in benefitting any institution or nation over another.
Phil Baty
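The subject normalisation Baty describes can be sketched numerically: each paper's citation count is divided by the world average for its field, so equally well-cited work in low-citation and high-citation disciplines scores the same. The field baselines below are invented for illustration, not Thomson Reuters' actual figures:

```python
# Invented world-average citation rates per field (illustrative only).
field_baseline = {"economics": 4.0, "medicine": 20.0}

# Two papers, each cited at twice its field's average rate.
papers = [
    ("economics", 8),   # 8 citations vs. a field average of 4
    ("medicine", 40),   # 40 citations vs. a field average of 20
]

normalised = [cites / field_baseline[field] for field, cites in papers]
print(normalised)  # both papers score 2.0: equally excellent in their fields
```

On a raw citations-per-paper count the medical paper would look five times better; after normalisation the two count equally, which is the whole point of the change.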
You really seem to have it in for LSE and you also seem to know very little about it. There is no reason why a specialist university should not be regarded as world class.
LSE is a specialist, but its speciality is very broad, as is, say, Imperial's. And the social sciences impact on every area of activity, from literature to bio-engineering. If LSE is too narrow in approach to be considered world class, then Imperial, MIT, Caltech and more than a few other places should be removed from consideration also. In reality there are plenty of 'all-round universities' that in practice are narrow in range: KCL for example has very little social science, Essex, despite its official status, is really a social science and humanities university, and so on.
The truly specialist and narrow institutions such as the School of Hygiene and Tropical Medicine or LBS probably should not appear in a general ranking.
Universities should be judged on the strength of what they try to do, not what you think they should do.
LSE has produced about 35 world leaders from a small student body in just over a hundred years, without inherited wealth or connections, and about a quarter of all the Nobel winners in economics studied or taught there; its impact on public policy across the world has been huge. The fact that league tables, with their ridiculous bias towards big science, find it difficult to quantify this is not a justification for placing LSE below comparatively humdrum universities which cannot match its credentials. That is why people think a ranking of 66th is absurd, especially when the same table placed it at 11th a few years ago.