Wednesday, August 17, 2016

The Shanghai Rankings: More Interesting This Year




The Shanghai rankings are usually the most stable and therefore the least interesting (for journalists, politicians and bureaucrats) of the current array.

This year, however, they are quite volatile. The reason is that the Shanghai Ranking Consultancy has completed the transition from the old to the new list of highly cited researchers supplied by Thomson Reuters. In 2014 and 2015 it used both lists with equal weighting, which softened the abruptness of the transition. In addition, the rankings now count only primary affiliations. As a result there have been enough ascents and descents to gladden the hearts of higher education journalists and experts.

It should be noted that the effect of this is largely to accelerate trends that were in progress anyway. The old list was clearly out of date and it was time for a new one.

First, my predictions. Harvard is still number one. Wisconsin at Madison, Rutgers and Virginia Polytechnic Institute have all fallen. Aalborg, Nanyang Technological University, Peking, Chiba and Tsinghua have risen. Peking and Tsinghua are now in the top 100 and heading for the top 50.

But the University of Tehran has not risen. It has fallen by 84 places, presumably because it lost a highly cited researcher during the second half of 2015.

Overall, the rankings provide more evidence for the rise of China with two universities in the top 100 and 54 in the top 500 compared with none and 44 last year, but not the rest of Asia. South Korea has gone from 12 in the top 500 to 11, Japan from 18 to 16 and Israel 6 to 5. India still has only one representative in the top 500 and Malaysia two.

Meanwhile the USA now has 137 universities in the top 500 compared with 146 last year.

Rapidly rising institutions include Toulouse School of Economics, from 375th to 265th, largely because of being given a free pass this year for papers in Nature and Science, University of the Witwatersrand from 244th to 204th, University of Queensland from 77th to 55th and King Abdullah University of Science and Technology from 352nd to 254th.

KwaZulu-Natal has fallen from 413th to 494th, Dartmouth College from 215th to 271st and Universiti Malaya from 353rd to 413th.


Saturday, August 13, 2016

My Predictions for the Shanghai Rankings




Watch this, Nate Silver.

In the latest edition of the Shanghai Academic Ranking of World Universities (ARWU), to be announced on Monday, first place will go to Harvard.

The methodology for this prediction is based on an extremely complex and sophisticated algorithm that incorporates a large number of variables and will remain a secret for the moment.

Now for some easier predictions.

The Shanghai rankings are generally famous for their stability and consistency which makes them rather boring for journalists and naive administrators. No shocking headlines about catastrophic plunges in the rankings after the latest vandalism by government Scrooges.

But the Shanghai rankers have had problems with their highly cited researchers indicator. Thomson Reuters have stopped adding to their old list of highly cited researchers and have published a new one. The Shanghai Ranking Consultancy combined the two lists in 2014 and 2015 and have said that this year only the new list will be used in calculating the overall score.

Last year Shanghai published a list of scores for the highly cited indicator in 2013 (the old list), combined scores in 2015 and scores if the new list alone had been used.

Here are the universities that will rise or fall by ten points on the highly cited indicator (two points in the weighted overall rankings, since the indicator carries a 20 per cent weight) as the new list replaces the combined lists. This assumes that universities have not recruited or lost highly cited researchers during 2015. If they have, then the predictions will be incorrect. Also, changes in the highly cited indicator may be balanced by changes in other indicators.
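The ten-to-two conversion is just the indicator change scaled by its weight. A minimal sketch, assuming the published 20 per cent weight for the highly cited (HiCi) indicator; the function name is mine, not an ARWU formula:

```python
# Sketch of how a swing on one ARWU indicator feeds into the overall
# score, assuming the published 20% weight for the highly cited
# researchers (HiCi) indicator. Figures are illustrative, not real data.

HICI_WEIGHT = 0.20

def overall_change(indicator_change, weight=HICI_WEIGHT):
    """Change in the weighted overall score from a change on one indicator."""
    return indicator_change * weight

# A ten-point swing on HiCi moves the overall score by two points.
print(overall_change(10))
```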

Predicted to Fall
University of Wisconsin, Madison
Rutgers: State University of New Jersey
Virginia Polytechnic Institute

Predicted to Rise
Aalborg University
Nanyang Technological University
Peking University
Chiba University
Tsinghua University
University of Tehran

Thursday, August 11, 2016

Value Added Ranking


There has been a lot of talk about ranking universities by factors other than the usual mix of contributions to research and innovation, reputation surveys and inputs such as spending, teaching resources or student quality.

The emerging idea is that universities should be assessed according to their ability to teach students or to inculcate desirable skills or attributes.

Much of this is powered by the growing awareness that American and European secondary schools are failing to produce sufficient numbers of students with the ability to undertake and complete anything that could realistically be called a university education. It is unlikely that this is the fault of the schools. The unavoidable verdict of recent research is that the problem with schools has very little to do with institutional racism, a lack of grit, resilience or the current X factor, or the failure to adopt Finnish, Chinese or Singaporean teaching methods. It is simply that students entering the school system are on average less intelligent than they were and those leaving are consequently also less intelligent.

There is now a market for rankings that will measure the quality of universities not by their resources, wealth or research output but by their ability to add value to students and to prepare them for employment or to enable them to complete their courses.

This could, however, lead to massively perverse consequences. If universities are assessed according to the percentage of entrants who graduate within a certain period or their employability then there could be a temptation to dilute graduation requirements.

Nevertheless, the idea of adding value is one that is clearly becoming more popular. It can be seen in the attempt to introduce a national rating system in the US and in the UK to use the proposed Teaching Excellence Framework (TEF) to rank universities.

One UK ranking that includes a value added measure is the Guardian University Guide. This includes eight indicators, three of which measure student satisfaction. Other indicators are staff student ratio and spending per student. There is also a measure of student outcomes, that is graduate level employment or entry into a postgraduate course after six months; one of the quality of students, measured by A level qualifications; and one a measure of value added, that is the difference between students' entry level exam results and their eventual degree results.

It is therefore possible to get a rough idea of what factors might actually produce positive student outcomes.

The overall ranking for 2015-16 starts by being quite conventional with the top three places going to Cambridge, Oxford and St Andrews. Some might be surprised by Exeter in 9th place and Loughborough in 11th, ahead of LSE and UCL.

Measuring student quality by exam scores produces unsurprising results at the top. Cambridge is first followed by Oxford and Imperial. For staff student ratio the top three are UCL, Oxford and SOAS and for spending per student Oxford, Cambridge and the University of the Arts London.

For student satisfaction with courses, Bath, Keele and UEA are in the lead while Oxford is 5th and Cambridge 12th. It's when we look at the Value Added that we find some really unusual results. The top three are Gloucester, Edinburgh and Abertay.

After plugging the indicator scores into an SPSS file we can calculate the correlations between the desired outcome, that is graduate level employment or postgraduate study and a variety of possible associated factors.

Here in descending order are the correlations with career prospects:

average entry tariff .820
student staff ratio .647
spending per student .569
satisfaction with course .559
satisfaction with teaching .531
value added .335
satisfaction with feedback -.171

It would seem that if you want to know which university is best for career prospects then the most important piece of data is the average academic ability of the students. The student staff ratio and money spent are also significant as is satisfaction with courses and teaching. 

The correlation between value added and career prospects is much less and rather modest.

The universities were divided into thirds according to average entry tariff. In the top third of universities there was a strong correlation between career prospects and average entry tariff, .628, and a modest one with spending, .355. Nothing else was associated with career success.

In the middle third the factor most associated with career prospects was course satisfaction, .498, followed by average entry tariff, .449, staff student ratio, .436, and satisfaction with teaching, .362. Satisfaction with feedback and value added were insignificant.

However, for the least selective third of universities, the picture was rather different. The factor most strongly associated with career success was satisfaction with feedback, .493, followed by value added, .479, course satisfaction, .470, satisfaction with teaching, .439, and average entry tariff, .401. The relationship with spending and staff student ratio was insignificant.

The evidence of the Guardian rankings is that value added would only be of interest to students at or applying to the least selective third of UK universities. For the rest it is of no importance. It is debatable whether it is worth making it the centre of a new set of rankings.