Saturday, August 28, 2010

From THE

I am reproducing Phil Baty's column from Times Higher Education in its entirety:

One of the things that I have been keen to do as editor of the Times Higher
Education World University Rankings is to engage as much as possible with our
harshest critics.

Our editorial board was trenchant in its criticism of our old rankings. In particular, Ian Diamond, principal of the University of Aberdeen and former chief executive of the Economic and Social Research Council, was scathing about our use of research citations.

The old system failed to normalise data to take account of the dramatically different citation volumes between different disciplines, he said - unfairly hitting strong work in fields with lower average figures. We listened, learned and have corrected this
weakness for the 2010 rankings.

Another strong critic is blogger Richard Holmes, an academic at the Universiti Teknologi MARA in Malaysia. Through his University Ranking Watch blog, he has perhaps done more than anyone to highlight the weaknesses in existing systems: indeed, he highlighted many of the problems that helped convince us to develop a new methodology with a new data provider, Thomson Reuters.

He has given us many helpful suggestions as we develop our improved methodology. For example, he advised that we should reduce the weighting given to the proportion of international students on campus, and we agreed. He added that we should increase the weighting given to our new teaching indicators, and again we concurred.

Of course, there are many elements that he and others will continue to disagree with us on, and we welcome that. We are not seeking anyone's endorsement. We simply ask for open engagement - including criticism - and we expect that process will continue long after the new tables are published.

There are still issues to be resolved but it does appear that the new THE rankings are making progress on several fronts. There is a group of indicators that attempts to measure teaching effectiveness. The weighting given to international students, an indicator that is easily manipulable and that has had very negative backwash effects, has been reduced. The inclusion of funding as a criterion, while obviously favouring wealthy regions, does measure an important input. The weighting assigned to the subjective academic survey has been reduced and it is now drawn from a clearly defined and at least moderately qualified set of respondents.

There are still areas where questions remain. I am not sure that citations per paper is the only way to measure impact. At the very least, the h-index could be added, which would add another ingredient to the mix.
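For readers unfamiliar with the measure, the h-index is simple to compute from a list of per-paper citation counts. The sketch below is purely illustrative and is not part of any ranking methodology discussed here:

```python
def h_index(citations):
    """Return the h-index: the largest h such that there are
    h papers with at least h citations each."""
    # Sort citation counts from most-cited to least-cited
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Example: five papers with these citation counts
print(h_index([25, 8, 5, 3, 3]))  # -> 3 (three papers have at least 3 citations each)
```

Because it combines productivity (number of papers) with impact (citations per paper), it would complement a pure citations-per-paper indicator rather than duplicate it.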

Also, there are details that need to be sorted out. Exactly what sort of faculty will be counted in the various scalings? Will self-citation be counted? I also suspect that not everybody will be enthusiastic about using statistics from UNESCO for weighting the results of the reputational survey. That is not exactly the most efficient organization in the world. There is also a need for a lot more information about the workings of the reputational survey. What was the response rate, and exactly how many responses were there from individual countries?

Something that may well cause problems in the future is the proposed indicator of the ratio of doctoral degrees to undergraduate degrees. If this is retained, it is easy to predict that universities everywhere will be encouraging or coercing applicants to master's programs to switch to doctoral programs.

Still, it does seem that THE is being more open and honest about the creation of the new rankings than other ranking organizations and that the final result will be a significant improvement.

Tuesday, August 24, 2010

America's Best Colleges 2011

US News and World Report's America's Best Colleges 2011 is now out.

The top ten National Universities are:

1. Harvard
2. Princeton
3. Yale
4. Columbia
5. Stanford
6. University of Pennsylvania
7 = Caltech
7 = MIT
9 = Dartmouth
9 = Duke
9 = Chicago

Monday, August 23, 2010

Shanghai Rankings: Shifting Research Landscape

My article in University World News on the 2010 Academic Ranking of World Universities can be viewed here.

Thursday, August 19, 2010

More from THE

Phil Baty in Times Higher Education gives us some clues about what the forthcoming THE World University Rankings will contain.

"While all the self-reported material bears the imprimatur of the supplying institutions (and our tables include only those that have cooperated with our exercise) and it has been vetted for quality, the consultation had some concerns about its consistency and robustness - especially in this inaugural year. For example, not all institutions could provide a clear or internationally comparable figure for their research income from industry.

For maximum robustness, we plan to give extra weighting to data that have been sourced independently of the institutions themselves and are globally consistent.

Citations data, for example, which are widely accepted as a strong proxy for research quality, will have a high weighting - perhaps about 30 per cent of the total ranking score.

We also have high confidence in the validity and independence of the results of our reputation survey. Although we may yet adjust its weighting, this subjective measure will not be weighted as highly as it was in our old methodology (2004-09), where reputation was worth 40 per cent."

It looks as though citations per paper, a measure of a paper's influence throughout a research community, will count for a lot in the forthcoming rankings. It is questionable whether such a high weighting for a single component is justified. At the very least it could be combined with other measures of quality such as the h-index, which is, in effect, a measure of both productivity and impact.

The reluctance to place too much emphasis on research income, and perhaps other types of income, is understandable but perhaps unfortunate. This indicator would give the new rankings a distinctive feature and might also allow us to see whether institutions are giving value for money.

It was inevitable that the reputational survey would never be given the same weight that its predecessor received in the THE-QS rankings. Whether its results are really valid (we still do not know the response rate) remains to be seen.

Monday, August 16, 2010

Why international students are not a good indicator of quality

Times Higher Education describes a dispute between Coventry University and a recruiting agent in Chennai. According to the article, Ram Beegala was hired as a recruiting consultant and would only be paid if he succeeded in getting the number of Indian students above 450.

There is a comment by "To John" which might be slightly exaggerated:

"It is no secret that the Indian students who cannot get into any of their universities and colleges are the ones that are willing to come to the UK. Their intention is the 20 hour/week work allowed and assume rightly once they use the university route to get into UK they can stay in the country to work. In my university which recruits these students, the drop out rates for such students is high as they work more than 20 weeks to meet their expenses. Their attendance drops down after a few months. I have yet to come across a single non-EU student who comes with enough funds to complete a 3 UG degree. They are told by agents that they can work in the UK to meet part of their fees and all the living expenses. The students coming in to do MSc are poorly equipped and struggle to pass their modules and write project proposals."

Big Names and Unsung Heroes

In Times Higher Education, Phil Baty hints that the reduction in the weighting for subjective indicators in the forthcoming THE rankings will mean that those dominant in the past will suffer a decline and that there may be some new schools at the top.

"We can expect some big-name institutions to take a hit in the new World
University Rankings.

Why? Because the rankings we will publish this autumn will be based less on subjective opinion and more on objective evidence".


"Under the initial proposals for our methodology, currently being refined in line with responses from the global academy, reputational measures are worth no more than 20 per cent of overall scores.

I have also set a cap to ensure that subjective elements are never again anywhere near the 50 per cent used in our previous methodology. This means that big names with big reputations that lack world-class research output and influence to match will suffer in comparison with previous exercises. Conversely, unsung heroes have a better chance of recognition".

Another Ranking

The 4icu ranking of 200 top universities is based on web popularity.

The top five are:

1. Stanford
2. MIT
3. National Autonomous University of Mexico
4. Berkeley
5. Peking

The Forbes Ranking

The 2010 edition of the Forbes College Rankings is now out. These are basically an evaluation from the students' viewpoint. The criteria are the number of alumni in Who's Who in America, ratings on RateMyProfessors, graduation rates, the number of students and faculty winning national awards, and accumulated student debt.

There are some surprises. Top place goes to Williams College, a private liberal arts college that does not even get into Shanghai's top 500. The service academies do very well. On the other hand, Harvard is 8th, Yale 10th and Chicago 20th.

Saturday, August 14, 2010

Long Term Trends in the Shanghai Rankings

The Shanghai Rankings are noted for their methodological stability. Whereas frequent changes combined with the insertion and removal of errors produced wild fluctuations in the THE-QS rankings, the ARWU have remained essentially the same since they started. The Shanghai index never aroused as much public interest as the now defunct THE-QS league table but over the long run it is more likely to reveal real and significant trends.

If we compare the 2004 rankings with those just announced there are some noticeable changes over six years. Cambridge and Oxford have each slipped a couple of places while Imperial College and University College London have moved up a bit, although not as high as their implausible position in THE-QS. Tokyo has slipped from 14th to 20th and Kyoto from 21st to 24th. The leading Australian university has also fallen.

Russia has stagnated with only two institutions in the top 500 in 2004 and 2010. India has fallen back with the University of Calcutta dropping out of the rankings. The rising stars for scientific research are Mainland China (8 in 2004 and 22 in 2010), South Korea (7 in 2004 and 10 in 2010), Brazil (4 in 2004 and 6 in 2010) and the Middle East (none in 2004 and 4 from Saudi Arabia, Turkey and Iran in 2010).

The Shanghai Rankings

The 2010 Academic Ranking of World Universities by Shanghai Jiao Tong University is now out. See here.

Friday, August 06, 2010

From QS

A media advisory has been sent by Martin Ince, Chair of the Advisory Board of QS World University Rankings. See here.

The document describes the structure of the current rankings. Something interesting is that apparently the number of responses has increased to over 13,000, although about half of those would be from people who filled out the form in 2009 and 2008 and did not update their forms this year. The number of respondents is now about the same as that reported by Times Higher for their survey, although THE will no doubt point out that they can be fairly confident that their respondents are still alive and working in academia.

The number of respondents is less important than the response rate, and so far neither QS nor THE has said how many forms were distributed.