Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Saturday, February 27, 2010
Phil Baty in Times Higher Education writes about fluctuations in the old THE-QS rankings
"Magazines that compile league tables have an interest in instability - playing around with their methodologies to ensure rankings remain newsworthy.
This was the argument made by Alice Gast, president of Lehigh University, Pennsylvania, at the Lord Dearing memorial conference at the University of Nottingham this month.
She has a point. Dramatic movements in the league tables make the news and generate interest - helpful for the circulation figures.
But too much movement raises questions about credibility: everyone knows that it takes more than 12 months for an 800-year-old university to lose its status, or for a young pretender to ascend the heights. "
The THE-QS rankings were famous for their yearly fluctuations. This, of course, helped to make them much more popular than the reliable but boring Shanghai rankings (unless you were prepared to spend a few hours cutting and pasting the indicator scores of universities in the 300s and 400s into an Excel file, in which case they could be interesting). The rises and falls resulted from changes in methodology, errors, corrections of errors and inconsistent application of guidelines.
Still, there are cases when universities undergo serious restructuring, pour massive funds into research or recruit administrators of the highest calibre, and these developments should be reflected in any valid index. Rankings that do not show some upward movement by, say, the Hong Kong University of Science and Technology or King Abdullah University of Science and Technology ought to be considered suspect.
Equally, it is striking that the major rankings contain elements (the THE-QS academic opinion survey, the Nobel laureates in the Shanghai rankings, even eleven-year-old publications and citations in the Taiwan rankings) that disguise the steady relative decline of Oxford and Cambridge over the last two decades.
We shall have to wait until 2011 to see if the new THE ranking will avoid the suspicious fluctuations of the THE-QS rankings and also be sensitive to genuine changes in international higher education.
Wednesday, February 24, 2010
Richard Vedder of the Center for College Affordability and Productivity produces an interesting US ranking based on value for money for students. One key element is data provided by the famous, or notorious, site RateMyProfessors. I used to think that it would be a step forward in international university rankings to try to do something like this on a global scale. Now I am not so sure.
I assume that everybody has heard of the tragic shooting at the University of Alabama at Huntsville. My first suspicion was that the alleged murderer, Amy Bishop, was a talented but socially awkward academic who had snapped after being denied tenure on flimsy grounds of collegiality or for being politically incorrect.
That does not look like being the case. A blog, Shepherds and Black Sheep, has analysed her research output and found that it seems inflated: one article has her children as co-authors, another was published in an "online vanity press", and several more were co-authored with her husband.
Bishop's pages at RateMyProfessors are also interesting. At first sight they look quite impressive with a total score of 3.6 out of 5, fifth best in her faculty, and 3.4 for clarity and 3.7 for helpfulness. But there are some oddities.
The user comments start with three excellent reviews, one on 26 May 2009 and two on 19 May. A little odd. Back in June 2004 there were also two rave reviews posted on the same day.
It is also noticeable that the good reviews tend to cluster together, with three consecutive good reviews in January and February 2006 and another three, one after the other, in November 2004 and January 2005.
Another odd thing is that the helpfulness and clarity indicators show a very distinctive distribution curve. For helpfulness, Bishop had 6 ones (the worst), 2 twos, 5 threes, 3 fours, and then 17 fives (the best). For clarity, it was 7 ones, 5 twos, 4 threes, 3 fours and 15 fives. Note the dramatic jump from four to five in both categories.
Compare this with a low scoring teacher who gets 4 fours and 4 fives for helpfulness and 5 fours and 6 fives for clarity.
Compare also a high scoring faculty member with 13 fours and 25 fives for helpfulness and 15 fours and 28 fives for clarity. A big jump from four to five but proportionately much less than Bishop's.
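The "proportionately much less" point can be put in numbers. Here is a quick sketch (my own illustration, using only the figures quoted above) comparing the five-to-four jump for Bishop and for the high-scoring colleague:

```python
# Quantifying the "jump from four to five" in the helpfulness ratings
# quoted in this post. The names and the helper function are mine.

def five_to_four_ratio(counts):
    """counts maps a rating (1-5) to the number of reviews at that rating."""
    return counts[5] / counts[4]

# Helpfulness figures from the post
bishop = {1: 6, 2: 2, 3: 5, 4: 3, 5: 17}
high_scorer = {4: 13, 5: 25}

print(round(five_to_four_ratio(bishop), 1))       # 17 fives to 3 fours
print(round(five_to_four_ratio(high_scorer), 1))  # 25 fives to 13 fours
```

Bishop's fives outnumber her fours by more than five to one, against less than two to one for the colleague, which is what makes her distribution look odd.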
Is it possible to rig RateMyProfessors?
According to Tenured Radical it is very easy.
"To test my theory that ratings could be posted by people who had never been my students, I went to the dreaded site, and registered myself, under my own name, as a Zenith student. Easy-peasy. The only false information I provided was a birth date that made me 19 years old (I wish!) and the box I checked that affirmed my status as a Zenith sophomore. I then successfully added a rating about myself. You can see it here: it's the anxious looking green emoticon that has the comment "interesting." I thought it only fair to add something right down the middle, neither good nor bad. Inflammatory perhaps, but arrogant never, that's my motto. "
So, I have a strong suspicion that someone had been going to RateMyProfessors and posting effusive comments about Bishop (and nasty ones about other faculty members?).
Perhaps RateMyProfessors is not such a good indicator after all.
Tuesday, February 23, 2010
A survey was conducted recently for Thomson Reuters to provide input for the forthcoming Times Higher Education World University Rankings. The results of the survey can be accessed at the Global Institutional Profiles Project set up by Thomson Reuters.
The results of the survey are important since they might provide a clue to what the new ranking will look like.
There were 350 respondents from the “global academic community”. This is apparently more than the numbers that answered similar surveys by QS, but it does not seem very large, especially when THE has raised justified concerns about the low and possibly unrepresentative numbers participating in the THE-QS survey of academic opinion.
Of those 350, 107 were from the UK (31%), 90 from the US, 30 from Australia, nine from Canada and seven from New Zealand. Thirty-three were from the rest of Europe, 32 from Asia and 42 from the others, i.e. not from North America, Europe, Asia and Australasia. With nearly a third of the respondents coming from the UK and over two thirds from just five English-speaking countries, this is a distinctly Anglo-Saxon-centric affair.
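The arithmetic behind "nearly a third" and "over two thirds" checks out, as a quick tally of the figures given above shows (my own back-of-the-envelope calculation, nothing from the survey itself):

```python
# Respondent composition figures quoted in this post.
total = 350
by_country = {"UK": 107, "US": 90, "Australia": 30, "Canada": 9, "New Zealand": 7}

uk_share = by_country["UK"] / total
anglo_share = sum(by_country.values()) / total  # five English-speaking countries

print(f"UK share: {uk_share:.0%}")        # close to a third
print(f"Anglo share: {anglo_share:.0%}")  # over two thirds
```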
The first question was the level of familiarity with various rankings. The THE ranking was the one with which the largest number of respondents were familiar. This is a slightly odd result since from 2004 to 2009 THE published rankings under the name THE (S) - QS World University Rankings. The Times Higher Education World University Rankings (minus QS) have yet to appear. No doubt, THE would claim that it was they who published the ranking and can retrospectively rename them if they wish.
Summarizing the responses to the survey, it seems that the respondents believe that university rankings
• are useful
• have methodological problems
• are biased
• encourage the manipulation of data
• encourage a focus on numerical comparisons
• use data that is not transparent or reproducible
• do not include appropriate metrics
• favour research institutions
Among the information that respondents need or would like to have are:
• Publications and citations
• Research awards
• Patents
• Faculty student ratio
• Faculty activity ratios (teaching income/research grants/publications per staff)
• Number of faculty by gender, international, ethnicity or race
• Number of graduate programs and degrees
• Collaboration
• Community engagement
• Perceptions of researchers, employers, alumni and community
Some of this, community engagement for instance, is too vague to be useful. Other items contradict the stated objectives of the ranking system under development: including patents and research collaboration in a general ranking would add more bias in favour of the natural and applied sciences. Others betray the American or European concerns of the respondents: alumni have little significance outside the USA. It is also noticeable that nobody seems interested in student perceptions.
Friday, February 19, 2010
This, in an article by Phil Baty of THE in The Australian, sounds promising.
"So we've started again. For 2010, Thomson Reuters has hired pollsters Ipsos MORI to carry out the reputation survey, and it has committed to obtaining 25,000 responses, from a carefully targeted and properly sampled group that will represent the true demographics of global higher education.
This may mean a slip in the performance of British and Australian institutions but if that is the case then so be it. We are interested in getting closer to the truth. "
Click here for a survey by Nick Clark in the World Education News and Reviews
Click here for another presentation from the CHEA International Seminar by Angela Yong-chi Hou of the Higher Education Evaluation and Accreditation Council of Taiwan.
Click here for a comprehensive and informative presentation by Robert Morse of US News at the CHEA International Seminar in Washington DC on America's best Colleges Rankings: A Brief History.
Phil Baty argues in yesterday's Times Higher Education that, despite the "jiggery-pokery" employed by some universities to get a better position in university rankings, "there is no need to sacrifice mission to position"
He refers to several cases of university administrators manipulating data to rise in the rankings. One example is Albion College in the USA, which divided a small alumnus donation into smaller annual payments. Frankly, I wonder if this is worth getting worried about. Surely a far greater scandal in American colleges is the admission, in order to please alumni and get money out of them, of large numbers of academically unqualified student athletes.
The article then discusses "the less dishonest but nevertheless deleterious effects of rankings, such as pressing staff to publish in English-language journals, which may lift an institution's profile but may not best serve its local community".
This is true, but it should be noted that THE has shifted from using Scopus data to Thomson Reuters, whose database has been criticised for its overwhelmingly English-language content.
Baty is right on target when he comments on institutions importing large numbers of foreign students in order to boost their score on the internationalisation indicator in the THE-QS rankings. There are, though, other reasons, mainly financial, for doing this. In the UK and Australia it is likely that in many cases this has contributed to a decline in quality.
Counting international students is rather different from counting international faculty. In most cases, students pay, or someone pays for them, to travel abroad to go to university but universities pay international faculty to come to them.
It would be a good idea if THE dropped the international student indicator. If they are going to keep it, then one simple and helpful measure might be to make the showing of a passport part of the definition of "international". In other words, treat the European Union, or at least the Schengen Area, as a single country.
Thursday, February 18, 2010
The answer is very good but there seem to be a few US universities that are better.
Tilburg University has just produced a new ranking of Economics schools based on publications in ISI-indexed journals in Economics, Econometrics and Finance. Harvard is first with a score of 551, followed by Chicago (385). LSE is eighth, alongside Northwestern University, with a score of 280. Oxford is 22nd and University College London 29th. Tilburg itself is 23rd.
If LSE can only reach eighth place in Economics, then what can we expect from an objective ranking in the natural sciences and the arts and humanities?
Wednesday, February 17, 2010
The country share of visitors to this blog is as follows. Noticeably absent are China and Russia, unless they are in 'unknown' (12%).
United States 22%
United Kingdom 8%
Switzerland 7%
Canada 4%
Malaysia 4%
Singapore 4%
Germany 3%
Nigeria 3%
France 3%
Indonesia 2%
India 2%
Poland 2%
Spain 2%
Mexico 1%
Czech Republic 1%
Greece 1%
Brunei 1%
Belgium 1%
Australia 1%
Ireland 1%
Japan 1%
Saturday, February 13, 2010
I recently came across a site called StrategicFIRST that ranks websites according to traffic and indicates an estimated value for each site. I am not sure how reliable it is, but here are the data for some sites associated with international university rankings.
Estimated Value
Webometrics (webometrics.info) $97,281
QS Quacquarelli Symonds (topuniversities.com) $89,122
Scimago (scimagojr.com) $86,528
Academic Ranking of World Universities (arwu.org) $79,545
Times Higher Education (timeshighereducation.co.uk) $79,132
HEEACT (heeact.edu.tw) $23,684
University Ranking Watch (rankingwatch.blogspot.com) $5,176
Global Universities Ranking [Russia] (globaluniversitiesranking.org) $3,941
Princeton Review (princetonreview.com/college-rankings.aspx) $3,802
There is a comment by Nunzio Quacquarelli on the QS topuniversities rankings blog.
Here is an extract:
"In October 2009, QS and THE ended their collaboration under which THE was licensed to publish the QS results known as “Times Higher Education (THE) – QS World University Rankings”. Since then, THE have announced they intend to produce their own rankings and have been systematically critical of QS’ methodology as part of their explanation for the split. This is surprising; THE consistently praised the QS methodology throughout the six-year publishing collaboration. Indeed, their former publishing director described it as one of the best partnerships in the history of THE.
Similarly, Ann Mroz, Editor of THE wrote in October 2008: "These rankings use an unprecedented amount of data to deliver the most accurate measure available of the world’s best universities, and of the strength of different nations’ university systems. They are important for governments wanting to gauge the progress of their education systems, and are used in planning by universities across the world."
Phil Baty, Associate Editor of THE wrote only on October 10 2009: “Congratulations on a highly successful campaign on the rankings again this year. The internet is buzzing.” Yet it seems our objectives and methodological principles have subsequently diverged. QS will continue to produce our rankings using citation data from the Scopus database of Elsevier. THE have decided to align themselves with Thomson Reuters’ academic citation database."
Thursday, February 11, 2010
The new Webometrics ranking is out.
Some interesting points
The top 20 are all in the USA.
The best non-US university is Cambridge at 27.
British universities do not do very well. Oxford is at 37, University College London at 57 and Imperial at 157 while Webometrics joins the anti-LSE conspiracy by putting it at 234.
The top European universities seem to be in the North -- Edinburgh, Oslo, Helsinki. Something about the cold weather?
Regional Rankings
Best in Latin America: Sao Paulo
Best in Europe: Cambridge
Best in Central and Eastern Europe: Charles University
Best in Asia: Tokyo
Best in South East Asia: National University of Singapore
Best in South Asia: Indian Institute of Technology Bombay
Best in the Arab World: King Saud University
Best in Oceania: Australian National University
Best in Africa: Cape Town
Finally Israeli universities should get a special award for mobility. They manage to be in Asia and Europe at the same time.
Tuesday, February 09, 2010
The rise of China to scientific superpower status has been well documented. See here for a report by Jonathan Adams, Christopher King and Nan Ma.
This can be confirmed by a simple search of the Scopus database which reveals 38,360 scientific publications from China in 1999 compared to 250,452 in 2009. For the United States the corresponding figures were 311,879 and 367,641.
The UK, France and Germany recorded modest increases over the decade while research output in Russia actually fell.
A certain amount of caution is in order. These figures refer to the quantity of research, not to its quality and China does have a large, although stable, population. Still, the West has cause to be concerned.
Some other countries have improved quite considerably over the decade. Korea, India, Australia and Hong Kong have doubled or nearly doubled and Thailand has more than tripled its research output.
It is especially noticeable that Malaysia is catching up with Singapore. The former had 1,235 publications in 1999 and the latter 4,538. In 2009 the figures were 7,834 and 10,993.
However, the prize for rapid growth goes to Iran, which had 1,351 publications in 1999 and 19,088 in 2009. Compare this with Israel: 11,918 and 16,335.
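Put as growth multiples over the decade, the contrast is stark. A quick calculation from the Scopus counts quoted in this post (my own arithmetic, not from the database itself):

```python
# Publication counts (1999, 2009) as quoted in this post.
pubs = {
    "China": (38_360, 250_452),
    "United States": (311_879, 367_641),
    "Iran": (1_351, 19_088),
    "Israel": (11_918, 16_335),
}

for country, (y1999, y2009) in pubs.items():
    # Growth multiple over the decade
    print(f"{country}: x{y2009 / y1999:.1f}")
```

Iran's output grew roughly fourteen-fold while Israel's grew by well under half, which is the point of the comparison.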
If research in Iran goes on advancing at this rate, and if other countries in the region also develop their scientific capabilities, and if the ultra-orthodox extend their assault on reason and science into Israeli schools and universities, it looks as though Hamas and Hezbollah are going to be the least of Israel's problems.
Friday, February 05, 2010
There has been a lot of discussion about university rankings recently. In Times Higher Education, Phil Baty refers to a comment in the satirical magazine Private Eye about the forthcoming European Union rankings. Why spend public money on the ranking of universities when there are already two recognised rankings? Perhaps, it has something to do with the striking absence of continental European universities from the upper reaches of the THE-QS and Shanghai rankings.
Baty claims to be less cynical than Private Eye. He says that:
"While I am sure CHERPA will strive to be fully independent, it is a group made up exclusively of European universities, and was set up in direct response to Europe's poor showing in the current rankings, so some suspicion is inevitable.
More serious, and entertaining, questions have been asked over other rankings. Russia's RatER raised eyebrows for putting Moscow State University in fifth place, ahead of Harvard and Cambridge, and a ranking from France's Mines ParisTech has been ridiculed for putting five French universities into the top 20."
However, one should not assume that the forthcoming THE rankings will be biased because
"these concerns give THE great confidence - as an independent magazine we are free from the influence of any institution or authority.
We are accountable only to our readers - an increasingly international community of thousands of academics and university administrators. "
But this raises certain questions. Is THE not accountable to the company that owns it? Another question concerns that "increasingly international" community. "Increasingly" from what to what? And to whom are those administrators accountable?
The national bias of the Paris Mines ranking is indisputable. There the top French institution is in sixth place. In the most recent THE-QS rankings the top French institution was 38th, in the Russian RatER rankings 36th, in the Shanghai Academic Ranking of World Universities 40th, in the Taiwan rankings 88th and in Webometrics 129th.
The bias of the Russian rankings is even more glaring. They put Moscow State University in 5th place. In no other ranking did it even get into the top fifty.
I am not suggesting that there is anything dishonest about the Paris and Russian rankings. The Paris ranking is as transparent as it is possible to be. It simply counts the number of CEOs of top 500 companies who attended particular schools. Everything is in the public record. The Russian rankings are not so transparent. The problem here is that their questionnaire contains many references to indicators specific to Russia and the CIS. It is also written in a style that many people would find close to incomprehensible.
The bias in the Paris and Russian rankings stems not from dishonesty but from the choice of criteria that are likely to give an advantage to universities in their countries while downplaying or ignoring those in which their countries are not so strong.
In contrast, the Shanghai, Taiwan, Webometrics, and Scimago rankings appear to have no home country bias at all.
What about THE? The old THE-QS rankings were pretty obviously biased in favour of British universities. Last year they had Cambridge in second place. The Shanghai rankings put it in 4th place, although that will not be sustained as the impact of old Nobel winners fades. In the Paris Mines ranking it was 7th, in the Russian rankings 8th, in the Taiwan rankings 15th, in Webometrics 22nd, in Scimago 34th and in the Leiden green index (the size-independent, field-normalized average impact) 37th.
We will see if Cambridge and Imperial College maintain their suspiciously high places in the new THE rankings. If they start slipping a little I will be inclined to agree that THE has in fact overcome its anglocentric bias.
Tuesday, February 02, 2010
The Chronicle of Higher Education has a substantial article on world rankings by Aisha Labi. She describes a number of recent developments:
- The European Union "began moving ahead in the development of a nuanced and more complex rankings system". No doubt it will soon start moving as fast as Concorde.
- A Russian ranking was met with derision, even in Russia.
- THE and QS "had an acrimonious split, with each now promising to produce a superior product."
There are some comments from Phil Baty of THE who describes the old rankings as "no longer fit for purpose". He indicates that the new THE rankings will see two improvements. One is a new academic survey that will be larger, better targeted and more representative. The other is some sort of extra weighting for the social science citations.
Meanwhile Ben Sowter of QS defends
"its [QS] continuing emphasis on a peer-review component, adding that it seeks increased input from academics and aims to increase response numbers through measures such as translated surveys for academics in non-English-speaking institutions.
"Of all the measures that different rankings are using at a global level, from my perspective peer review is the one that is fairest to universities with different disciplines," he says. The use of peer reviews "enables institutions with great strengths in the arts and humanities to shine in a way that they are not able to in other measures." "
So it looks like there will be a survey war, with THE flaunting the size of its sample and QS stressing the diversity of its own.
Of course, the last word goes to Nian Cai Liu of Shanghai Jiao Tong University: "We think that more and diversified rankings are good for the higher-education community and the general public in general,"
Saturday, January 30, 2010
A comprehensive and interesting report on university rankings from the Swedish National Agency for Higher Education is available here.
Thanks to Beerkens' Blog
Finally something has appeared on the QS Intelligence Unit Blog. Ben Sowter writes:
"The QS World University Rankings will continue to be published in 2010, albeit through a number of new channels which we are working on. At present, there are no plans to alter the methodology, in fact it seems important to maintain some comparability in a time when a number of new and different interpretations are going to emerge. So in 2010, we are focused on improving our engagement with institutions, redesigning some of our data collection systems to be more user-friendly and intuitive, and our work in specific regional and discipline oriented contexts."
I am not sure that keeping the methodology is a good idea but it is understandable. However, even with the same basic methods there are a couple of minor changes that might help QS find a niche in the "holistic" ranking market as Times Higher appears to focus on making fine distinctions among leading research institutions. One would be to use the academic survey to ask about general excellence or activities other than research. The other would be to remove non-teaching faculty from the faculty totals when calculating faculty student ratio. As it is, the QS rankings are heavily weighted towards research, with an academic survey asking about research, an indicator based on citations and a teaching resources measure that includes researchers who never teach.
Now that QS have done an Asian ranking and are apparently preparing Arab and Latin American ones, they could also outflank THE by preparing survey forms in additional languages. They offered a Spanish option last year. They ought to have the resources to produce forms in Chinese, French, German, Arabic and Japanese.
Thursday, January 28, 2010
In this week's Times Higher Education, Phil Baty discusses the role of reputational surveys in university ranking. It was a distinctive feature of the THE-QS rankings that they devoted 40% of the weighting to a survey of academic opinion about the research excellence of universities. Baty points out that "The reputation survey used in the now-defunct Times Higher Education-QS World University Rankings was one of its most controversial elements: a survey of a tiny number of academics should not determine 40 per cent of a university's score".
It was not so much that a tiny number of academics was surveyed but that a tiny number responded and that this (relatively) tiny number was heavily biased towards particular countries and regions. A very obvious effect of the survey was to boost the position of Oxford and Cambridge well beyond anything they would have attained on indicators based on other more objective factors.
Whether THE can produce a better survey remains to be seen. But at least they have at last stopped calling it a peer review.
Monday, January 25, 2010
An article in the Financial Times describes the impressive growth of scientific research in China
"China has experienced the strongest growth in scientific research over the past three decades of any country, according to figures compiled for the Financial Times, and the pace shows no sign of slowing.
Jonathan Adams, research evaluation director at Thomson Reuters, said China’s “awe-inspiring” growth had put it in second place to the US – and if it continues on its trajectory it will be the largest producer of scientific knowledge by 2020.
Thomson Reuters, which indexes scientific papers from 10,500 journals worldwide, analysed the performance of four emerging markets countries: Brazil, Russia, India and China, over the past 30 years."
In contrast, the performance of Indian universities and institutes has been rather limp:
" A symptom of this is the poor performance of India in international comparisons of university standards. The 2009 Asian University Rankings, prepared by the higher education consultancy QS, shows the top Indian institution to be IIT Bombay at number 30; 10 universities in China and Hong Kong are higher in the table.
Part of India’s academic problem may be the way red tape ties up its universities, says Ben Sowter, head of the QS intelligence unit. Another issue is that the best institutions are so overwhelmed with applications from would-be students and faculty within India that they do not cultivate the international outlook essential for world-class universities. This looks set to change as India’s human resource minister has stepped up efforts to build links with US and UK institutions. "
A couple of observations. China's research output might not be so impressive if population were taken into account. I also wonder if India's relatively poor performance is the result of a failure to cultivate an international outlook. Is China really so much more international than India? Is it possible that other factors are more important?
Monday, January 18, 2010
A notorious feature of the THE-QS rankings was their over-valuation of British and Australian universities. It would seem that Times Higher and Thomson Reuters are not really bothered by this. Indeed, it looks like they are set on a course to add to this bias in their new rankings, at least as far as British universities are concerned. An opinion piece by Jonathan Adams, the Director of Research Evaluation at Thomson Reuters, echoes previous comments in THE by lamenting the maltreatment of the London School of Economics in the old league table.
"The London School of Economics is generally agreed to be an outstanding institution globally. But how can we judge that? A lot of people would like to study there. If you wanted an informed opinion, you would consult the people who work there. A lot of people who have been there have gone on to great things. These are good indicators that the place is intellectually vibrant and delivers excellent teaching, and those values are endorsed internationally.
Good, but not perfect. Three major problems spring to mind. First, that quick summary tells us there are many ways in which we may value what a university does. It is a knowledge business and a source for teaching, research and dissemination to users. Second, the LSE is a specialist. Its astronomy is weak, so we need to consider subject portfolio. And, third, what will we measure? I need an informed expert to confirm my judgment, but as I can't send my expert to every institution, I need a proxy indicator (not a "metric": an indicator).
Our view of the LSE does not translate readily into anything useful unless we are careful and we make sure our information is appropriate. The LSE stood at only 67th in the last Times Higher Education-QS World University Rankings - some mistake surely? Yes, and quite a big one. LSE academics publish papers in social and economic sciences, which have lower citation rates than the natural sciences; so on the simple "citations per paper" used by QS in analysing the Scopus publications data, it slipped way down the list. Not a good way of comparing it with nearby King's College London, which has a huge medical school.
We need a lot more information than has typically been gathered before we can build an even halfway sensible picture of what a university is doing."
The problem with this is that there are many institutions that scored lower than LSE in the rankings that are agreed by some people somewhere to be outstanding. The “good indicators” raise more questions. A lot of people want to study at LSE. Is that because of its intrinsic merits or shrewd marketing? And who is the "you" who would consult the LSE? A lot of its alumni and alumnae have done great things? No doubt many have become MPs, civil servants, university administrators and CEOs but given the current moral condition of British politics and the performance of the British and European economies that might not be something to be proud of.
It is difficult to concur with the claim that LSE has been treated unfairly in previous rankings. In 2009 they were number five for social sciences and 32nd for arts and humanities. They got top marks for international faculty and international students and in the employer review. They did somewhat less well in the academic survey, which had a disproportionate number of respondents from Britain and Commonwealth countries with large numbers of British alumni and alumnae, but that is surely to be expected when LSE excels in a very limited range of disciplines.
LSE also did badly in the citations per faculty indicator (not citations per paper; QS used that for their Asian rankings, not the world rankings), partly because it is a specialist social science institution and it is conventional in the social sciences to produce fewer papers and to cite them less frequently, but also because LSE actually does not produce as much social science research, as measured by Scopus and ISI publications, as general institutions such as the Universities of Manchester, Birmingham, Harvard, Yale, Chicago, Toronto, Melbourne and Sao Paulo.
It is difficult to think of changes in the structure or content of the rankings that would benefit LSE but not a host of others. Giving extra weighting to social science publications is an excellent idea and would boost LSE relative to King’s College or Imperial College (I wonder if THE is prepared to let Imperial slip a few places), but it would probably help US state universities and European universities even more. Counting “contributions to society”, such as sitting on committees and commissions and boards of directors, would help LSE a bit but might well help Japanese universities and the French grandes écoles a lot more.
LSE is a narrowly based specialist institution and QS gave it as much as or more than it deserved by ranking it highly in the social science and arts and humanities categories and putting it in the top 100 in the general world rankings. It is good at what it does but it does not do all that much. It would be a shame if the rankings were restructured to promote it beyond its real merits.
The other item is Rick Trainor’s review of Robert Zemsky’s book Making Reform Work. In the course of his review Trainor, who is principal of King’s College London, says that:
"Most fundamentally, while the US debate is premised on a clear and widespread belief in the great, if imperilled, merits of the US system, British opinion often pays too little attention to the successes of UK universities, even in comparison with their US counterparts. For example, British commentators often overlook UK universities' superior completion rates, the greater rigour concerning undergraduate assessment inherent in the existence of an external examiner system, their greater ability (allowing for the much greater size of the US population and its university system) to attract overseas students and, as suggested by the Sainsbury report, their arguably superior record in commercialisation.
Of course, this is not to suggest that the UK higher education system is perfect, any more than US universities are. Nonetheless, there has been too little recognition in the UK of its high international research standing (aided by rises in public investment in recent years), despite persisting American strength and rapidly rising competition from countries such as China and India. Likewise, the UK system receives too little credit domestically for its success in protecting standards despite the huge increase in UK student numbers during the past 25 years. Similarly too few observers on this side of the Atlantic have learned one of the basic lessons propounded by Zemsky: that outstanding achievement in higher education depends on adequate resources - for teaching (which was substantially underfunded, even before the UK's public expenditure crisis began) as well as for research. "
This is a rather odd set of claims. Superior completion rates? I wonder how that happened. Greater rigour because of the external examiner system? Really? Do British universities still have a high international research standing? Just look at their performance in the Shanghai rankings after removing the cushion of the thirty per cent weighting for Nobel and Fields laureates. Have standards really been protected? Would more money make any difference?
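That "cushion" can be quantified. In the Shanghai (ARWU) methodology the Alumni and Award indicators, which count Nobel and Fields laureates, together carry thirty per cent of the weighting; strip them out, rescale the remaining indicators to sum to one, and a university that lives on historical laureates drops noticeably. The indicator scores below are invented for illustration, not real ARWU data.

```python
# Back-of-the-envelope sketch: remove the 30% laureate weighting
# (Alumni + Award) from an ARWU-style composite and rescale the rest.
# The indicator scores for "old_univ" are invented, not real data.

WEIGHTS = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20,
           "NS": 0.20, "PUB": 0.20, "PCP": 0.10}
LAUREATE_INDICATORS = {"Alumni", "Award"}

def overall(scores, weights):
    """Weighted composite score."""
    return sum(weights[k] * scores[k] for k in weights)

def without_laureates(scores):
    """Composite after dropping laureate indicators and rescaling to 1."""
    kept = {k: w for k, w in WEIGHTS.items() if k not in LAUREATE_INDICATORS}
    total = sum(kept.values())  # 0.70
    return overall(scores, {k: w / total for k, w in kept.items()})

# An imaginary old university: strong on historical laureates,
# weaker on current research output.
old_univ = {"Alumni": 90, "Award": 85, "HiCi": 55,
            "NS": 50, "PUB": 60, "PCP": 50}

print(overall(old_univ, WEIGHTS))   # 64.0
print(without_laureates(old_univ))  # ~54.3: the score drops once the cushion goes
```

A ten-point fall in the composite score from a single methodological choice is exactly the kind of thing that can disguise a two-decade relative decline.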
It is beginning to look as though an implicit consensus is developing in the British higher educational establishment that the rankings should reflect its self-serving view of the merits of British higher education and that they have an important role to play in fending off the economic crisis. It appears that THE and Thomson Reuters are only too happy to oblige.
Saturday, January 16, 2010
Although there has been a lot of activity, so far mainly rhetorical, at Times Higher Education and Thomson Reuters about their forthcoming rankings, nothing has been heard from QS apart from an advert for a manager of a university ranking for Latin America and Iberia.
Nothing has been added to the 2010 ranking news page since December and Ben Sowter’s blog has been silent for a month.
Are they preparing a response to THE or are they just fading away?
Tuesday, January 12, 2010
Kiplinger has produced its 2009-2010 ranking of US universities. This is very much a student consumer ranking that measures the value for money delivered by each institution. It is based on information about student debt, tuition costs, financial aid, gender ratio, class size and average SAT scores, among other things.
There is no doubt a lot of room for argument about the validity of the data and how the indicators were weighted but this sort of index does seem very useful.
I am wondering if something like this could be incorporated into existing international rankings. A lot of Kiplinger's data would be difficult or impossible to obtain outside the US, but information about things like tuition fees, gender ratio, class size, and number of books in the library is widely available.
The top five private universities are:
1. Caltech
2. Princeton
3. Yale
4. Rice
5. Harvard
The top five public schools (for out-of-state students) are:
1. SUNY Binghamton
2. SUNY College at Geneseo
3. University of North Carolina at Chapel Hill
4. University of Florida
5. College of New Jersey
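A value-for-money composite of the Kiplinger kind is simple to build in principle: normalise each indicator to a 0-1 scale, invert the ones where lower is better (cost, debt), and combine them with chosen weights. The sketch below uses invented indicator names, weights and figures; it is an illustration of the general technique, not Kiplinger's actual method.

```python
# A minimal value-for-money composite, in the spirit of a student
# consumer ranking. Indicators, weights and figures are all invented.

INDICATORS = {
    # name: (weight, higher_is_better)
    "avg_sat":      (0.4, True),
    "tuition_cost": (0.3, False),
    "student_debt": (0.3, False),
}

def score(unis):
    """Weighted composite per university, min-max normalised per indicator."""
    result = {}
    for name in unis:
        total = 0.0
        for ind, (w, higher) in INDICATORS.items():
            vals = [u[ind] for u in unis.values()]
            lo, hi = min(vals), max(vals)
            x = (unis[name][ind] - lo) / (hi - lo) if hi > lo else 0.5
            total += w * (x if higher else 1 - x)  # invert "lower is better"
        result[name] = round(total, 3)
    return result

unis = {
    "A": {"avg_sat": 1450, "tuition_cost": 50000, "student_debt": 25000},
    "B": {"avg_sat": 1350, "tuition_cost": 20000, "student_debt": 15000},
}
print(score(unis))  # B wins on value despite lower SAT scores
```

As with any such index, the real arguments are over the choice of indicators and weights, not the arithmetic.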
Sunday, January 10, 2010
The European Union is trying to develop a new ranking system to rival the existing ones. The motivation is fairly transparent. The object, as reported in the EUObserver is "to improve the ranking of European universities and improve Europe's economic power".
The EUObserver provides an excellent and succinct summary of the forces underlying the universities ranking boom.
"This means the rankings are increasingly receiving more attention for different specific purposes: Students use them to short-list their choice of university; public and private institutions use them to decide on funding allocations; universities use them to promote themselves; while some politicians use them as a measure of national economic achievements or aspirations. "
It seems that planning for the new rankings took place in the second half of 2009 and that in the first half of 2010 they will be tested on 150 institutions around the world, but only for engineering and business studies.
At that rate, THE, QS and Shanghai Jiao Tong University have nothing to worry about.
Thomson Reuters have set up a new site here. It contains information, although not much so far, about the new Times Higher ranking system.
They will "address industry concerns over current profile systems... The 21st century research institution has many fluid layers, and Thomson Reuters is committed to developing an equally robust and dynamic dataset".
Notice that they are talking about research institutions as though universities do nothing but research and that they refer to higher education as an industry.
The page provides some hints about what might be included in the forthcoming rankings: peer review, scholarly outputs, citation patterns, funding levels and faculty characteristics.
I do not know whether there is any significance in the absence of internationalisation and faculty student ratio from the list.
The page could have done with some editing. There are too many barely meaningful adjectives -- robust, dynamic, flexible, data-driven, globally significant. And exactly what is a "fluid layer"?