Tuesday, March 19, 2013

Some More on the THE World University Rankings 2


I have calculated the mean scores for the indicator groups in the 2012-13 Times Higher Education World University Rankings. The mean scores for the 400 universities included in the published 2012-13 rankings are:

Teaching   41.67
International Outlook 52.35
Industry Income 50.74
Research 40.84
Citations 65.25

For Industry Income, N is 363 since 37 universities, mainly in the US, did not submit data. This might be a smart move if the universities realized that they were likely to receive a low score. N is 400 for the others.

There are considerable differences between the indicators, which are probably due to Thomson Reuters' methodology. Although THE publishes data for 200 universities on its website and another 200 on an iPad/iPhone app, there are in fact several hundred more universities that are not included in the published rankings but whose scores are used to calculate the overall mean from which scores for the ranked universities are derived.

A higher score on an indicator means a greater distance above the mean of all the institutions in the Thomson Reuters database.
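THE's methodology notes describe standardising each indicator as a z-score against all institutions in the database (the scores are then rescaled to a 0-100 range). As a minimal sketch of the standardisation step only, using hypothetical raw values rather than any actual THE data:

```python
from math import sqrt

def z_scores(values):
    """Standardise raw indicator values: each score is the
    distance from the overall mean in standard-deviation units."""
    n = len(values)
    mean = sum(values) / n
    sd = sqrt(sum((v - mean) ** 2 for v in values) / n)
    return [(v - mean) / sd for v in values]

# Hypothetical raw citation-impact values for five universities
raw = [1.2, 0.8, 2.5, 0.6, 0.9]
print([round(z, 2) for z in z_scores(raw)])
```

A university well above the database mean gets a large positive z-score, which is why a high indicator score reflects distance from the whole population rather than just rank within the top 400.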

The high scores for citations mean that there is a large gap between the top 400 and the lesser places outside the top 400.

I suspect that the low scores for teaching and research are due to the influence of the academic survey which contributes to both indicator clusters. We have already seen that after the top six, the curve for the survey is relatively flat.

The citations indicator already has a disproportionate influence, contributing 30% to the overall weighting. That 30% is of course a nominal figure. Since universities on average are getting more for citations than for the other indicators, it has in practice a correspondingly greater weighting.

Friday, March 15, 2013

Some More on the THE World University Rankings 2012-13

Here are some observations based on a simple analysis of the Times Higher Education World University Rankings of 2012-13.

First, calculating the Pearson correlation between the indicator groups produces some interesting points. If a ranking is valid we would expect the correlations between indicators to be fairly high but not too high. If the correlations between indicators are above .800, this suggests that they are basically measuring the same thing and that there is no point in having more than one indicator. On the other hand, it is safe to assume that if an indicator does measure quality or desired characteristics in some way, it will have a positive relationship with other valid indicators.
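For readers who want to reproduce this kind of analysis, the Pearson correlation can be computed directly from two columns of indicator scores. A minimal sketch (the sample scores below are hypothetical, not the actual THE data):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical teaching and research scores for five universities
teaching = [89.1, 74.3, 62.0, 55.8, 48.2]
research = [92.4, 70.1, 65.5, 50.3, 51.0]
print(round(pearson(teaching, research), 3))
```

A value near +1 means the two indicators rank universities almost identically; a value near zero means they carry essentially independent information.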

One notable feature of the 2012-13 rankings is that the relationship between international outlook (international faculty, students and research collaboration) and the other indicators is negative or very slight. With teaching it is .025 (not significant), with industry income .003 (not significant), with research .156 and with citations .158. This adds to my suspicion that internationalisation, at least among those universities that get into the world rankings, does not per se say very much about quality.

Industry income correlates modestly with teaching (.350) and research (.396), insignificantly with international outlook (.003) and negatively and insignificantly with citations (-.008).

The correlation between research  and teaching is very high at .905. This may well be because  the survey of academic opinion contributes to the teaching and the research indicators. There are different questions -- one about research and one about postgraduate supervision -- but the difference between the responses is probably quite small.

It is also very interesting that the correlation between scores for research and citations is rather modest at .410. Since volume of publications, funding and reputation should contribute to research influence, which is what citations are supposed to measure, this suggests that the citations indicator needs a careful review.

Teaching, research and international outlook are composites of several indicators. It would be very helpful if THE or Thomson Reuters released the scores for the separate indicators.

Sunday, March 10, 2013


The California Paradox

Looking at the Times Higher Education reputation rankings, I noticed that there were two Californian universities in the  superbrand six and seven in the top 50. This is not an anomaly. A slightly different seven can be found in the THE World University Rankings. California does even better in the Shanghai ARWU with three in the top six and 11 in the top 50. This is a slight improvement on 2003 when there were ten. According to ARWU, California would be the second best country in the world for higher education if it became independent.
California’s performance is not so spectacular according to QS who have just four Californian institutions in their top fifty, a fall from 2004 when they had five (I am not counting the University of California at San Francisco which, being a single subject medical school, should not have been there). Even so it is still a creditable performance.
But, if we are to believe many commentators, higher education in California, at least public higher education, is dying if not already dead.

According to Andy Kroll in Salon:

"California’s public higher education system is, in other words, dying a slow death. The promise of a cheap, quality education is slipping away for the working and middle classes, for immigrants, for the very people whom the University of California’s creators held in mind when they began their grand experiment 144 years ago. And don’t think the slow rot of public education is unique to California: that state’s woes are the nation’s".

The villains according to Kroll are Californian taxpayers who refuse to accept adding to a tax burden that is among the highest in the world.
It is surprising that the death throes of higher education in California have gone unnoticed by the well known international rankers.
It is also surprising that public and private universities that are still highly productive and by international standards still lavishly funded exist in the same state as secondary and elementary schools that are close to being the worst in the nation in terms of student performance. The relative and absolute decline in educational achievement is matched by a similar decline in the overall economic performance of the state.

It may just be a matter of time before, in the coming decades, Californian universities follow primary and secondary education into irreversible decline.

Preserving data

Times Higher and QS have both renovated their ranking pages recently and both seem to have removed access to some data from previous years. THE used to provide links to the Times Higher Education (Supplement) - Quacquarelli Symonds rankings of 2004-2010 but apparently no longer does. QS do not seem to give access to these rankings before 2007. In both cases, I will update if it turns out that there is a way to get to these rankings.


There is, however, a site which has the rankings for the top 200 of the THES - QS Rankings of 2004-2007.