Thursday, June 09, 2016

THE is coming to America



Times Higher Education (THE) has just announced that American university rankings are not fit for purpose.

We have heard that before. In 2009 THE said the same thing about the world rankings that they had published in partnership with the consulting firm Quacquarelli Symonds (QS) since 2004.

The subsequent history of THE's international rankings provides little evidence that the magazine is qualified to make such a claim.

The announcement of 2009 was followed by months of consultation with all sorts of experts and organisations. In the end the world rankings of 2010, powered by data from Thomson Reuters (TR), were not quite what anyone had expected. There was an increased dependence on self-submitted data, a reduced but still large emphasis on subjective surveys, and four different measures of income, reduced to three in 2011. Altogether there were 14 indicators, reduced to 13 in 2011, all but two of which were bundled into three super-indicators, making it difficult for anyone to figure out exactly why any institution was falling or rising.

There were also some extraordinary elements in the 2010 rankings, the most obvious of which was the placing of Alexandria University in 4th place in the world for research impact.
The rankings received a chorus of criticism mixed with some faint praise for trying hard. Philip Altbach of Boston College summed up the whole affair pretty well.

“Some of the rankings are clearly inaccurate. Why do Bilkent University in Turkey and the Hong Kong Baptist University rank ahead of Michigan State University, the University of Stockholm, or Leiden University in Holland? Why is Alexandria University ranked at all in the top 200? These anomalies, and others, simply do not pass the smell test."  
THE and TR returned to the drawing board. They did some tweaking here and there and in 2011 got Alexandria University out of the top 200, although more oddities would follow over the next few years, usually associated with the citations indicator. Tokyo Metropolitan University, Cadi Ayyad University of Marrakech, Federico Santa María Technical University, Middle East Technical University and the University of the Andes were at one point or another declared world class for research impact across the full range of disciplines.

Eventually the anomalies became too much, and after breaking with TR in 2015 THE decided to do some spring cleaning and tidy things up.


For many universities and countries the results of the 2015 methodological changes were catastrophic. There was massive churning, with universities going up and down the tables. Université Paris-Sud, the Korea Advanced Institute of Science and Technology, Bogazici University and Middle East Technical University fell scores of places.

THE claimed that this was an improvement. If it was then the previous editions must have been hopelessly inadequate. But if the previous rankings were the gold standard of rankings then those methodological changes were surely nothing but gratuitous vandalism.

THE has also ventured into faraway regions with snapshot or pilot rankings. The Middle East was treated to a ranking with a single indicator that put Texas A&M University at Qatar, a branch campus housing a single faculty, in first place. For Africa there was a ranking consisting of data extracted from the world rankings without any modification of the indicators, which did not seem to impress anyone.


So one wonders where THE got the chutzpah to tell the Americans that their rankings are not fit for purpose. After all, US News was doing rankings for two decades before THE, and its America's Best Colleges includes metrics on retention and reputation as well as resources and selectivity. There are also now several rankings that already deal directly with the concerns raised by THE.


The Forbes/CCAP rankings include measures of student satisfaction, degree of student indebtedness, on-time graduation, and career success.

The Brookings Institution has a value-added ranking that includes data from the College Scorecard.

The Economist has produced a very interesting ranking that compares expected and actual value added.

So exactly what is THE proposing to do?

It seems that there will be a student engagement survey, apparently to be launched this week and covering 1,000 institutions. THE will also use data on cost, graduation rates and salaries from the Integrated Postsecondary Education Data System (IPEDS) and the College Scorecard. Presumably they are looking for some way of monetising all of this, so probably large chunks of the data will only be revealed as part of benchmarking or consultancy packages.

I suspect that the new rankings will look something like the Guardian university league tables just published in the UK, but much bigger.

The Guardian rankings include measures of student satisfaction, selectivity, spending, staff-student ratio and value added. The latter compares entry qualifications with the number of students getting good degrees (a first or upper second).

It seems that THE is planning something different from the research-centred, industry-orientated university rankings it has produced so far, venturing into new territory: institutions two or three tiers below the elite that do little or no research.

There could be a market for this kind of ranking, but it is far from certain that THE is capable of doing it or that it would be financially feasible.



Tuesday, May 31, 2016

UK rises in the U21 system rankings

The comparison of national higher education systems by Universitas 21 shows that the UK has risen from 10th place to 4th since 2012.

These rankings consist of four groups of indicators: resources, connectivity, environment and output. Since 2012 British higher education has risen from 27th to 12th for resources, 13th to 10th for environment and 6th to 4th for connectivity. It was in second place for output in both 2012 and 2016, but its score rose from 62.2 to 69.9 over the four years.

Every few months, whenever any sort of ranking is published, there is an outcry from British universities that austerity and government demands and interference and immigration controls are ruining higher education.

If the U21 rankings have any validity then it would seem that British universities have been very generously funded in comparison to other countries.

Perhaps they could return some of the money or at least say thank you to the state that has been so kind to them.


Monday, May 23, 2016

Does a programming competition in Thailand show the future of the world economy?



The ACM ICPC (Association for Computing Machinery International Collegiate Programming Contest) has been called the "Olympics of Programming Competitions". The competitors are teams of university students who grapple with complex real-world problems. It is a "battle of logic, strategy and mental endurance" and is the apex of a series of local and regional competitions.

Success in the competition requires a high level of intelligence, genuine team formation and rigorous training. It is the antithesis of the intellectual levelling and  narcissistic cult of safe spaces that has infected American and, perhaps to a lesser extent, British universities.

The finals have just been completed in Thailand. The top five are:

1.  St. Petersburg State University
2.  Shanghai Jiao Tong University
3.  Harvard University
4.  Moscow Institute of Physics and Technology
5.  University of Warsaw

The list of universities in the top ten, the top fifty and the total number of finalists is interesting. If this competition reflects the current level of intelligence of university students, then the future for China, patches of the rest of Asia, and Russia and Eastern Europe looks bright. The USA may do well if -- a very big if -- it can continue to attract large numbers of Chinese students and immigrants. For Africa and Western Europe, including the UK, the economy of the 21st century may be bleak.

Below, countries are ranked according to the number of universities in the top ten, the top fifty and all finalists.


Rank   Country          Top 10   Top 50   Total finalists
1      Russia           5        10       12
2      USA              2        6        23
3      Poland           2        3        3
4      China            1        10       17
5      Brazil           -        2        6
6      Japan            -        2        4
7      Ukraine          -        2        3
8=     Belarus          -        2        2
8=     Taiwan           -        2        2
10     Bangladesh       -        1        3
11=    Canada           -        1        2
11=    Iran             -        1        2
11=    South Korea      -        1        2
11=    Vietnam          -        1        2
15=    Argentina        -        1        1
15=    Croatia          -        1        1
15=    Finland          -        1        1
15=    Hong Kong        -        1        1
15=    North Korea      -        1        1
15=    Singapore        -        1        1
21     India            -        -        6
22     Egypt            -        -        4
23=    Mexico           -        -        3
23=    Syria            -        -        3
25=    Australia        -        -        2
25=    Colombia         -        -        2
25=    Netherlands      -        -        2
28=    Chile            -        -        1
28=    Cuba             -        -        1
28=    Czech Republic   -        -        1
28=    Jordan           -        -        1
28=    Macao            -        -        1
28=    Pakistan         -        -        1
28=    Peru             -        -        1
28=    Philippines      -        -        1
28=    Slovakia         -        -        1
28=    South Africa     -        -        1
28=    Spain            -        -        1
28=    Switzerland      -        -        1
28=    Thailand         -        -        1
28=    UK               -        -        1
28=    Venezuela        -        -        1
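The "8=", "11=" and similar entries in the table are competition-style tied ranks: tied countries share a rank and the next distinct country skips ahead. As a rough sketch (not the method used to compile the table), here is how such a ranking can be computed from a small illustrative subset of the counts above, treating blanks as zero:

```python
# Competition-style ranking: tied entries share a rank and the next
# distinct entry skips ahead (1, 2, 2, 4, ...), with "=" marking ties.
# Illustrative subset of the table above; blanks treated as 0.
counts = {
    "Russia":  (5, 10, 12),
    "USA":     (2, 6, 23),
    "Poland":  (2, 3, 3),
    "China":   (1, 10, 17),
    "Belarus": (0, 2, 2),
    "Taiwan":  (0, 2, 2),
}

def rank(table):
    # Sort by (top 10, top 50, total finalists), all descending.
    ordered = sorted(table.items(), key=lambda kv: kv[1], reverse=True)
    ranks, prev_key, prev_rank = [], None, 0
    for pos, (country, key) in enumerate(ordered, start=1):
        r = prev_rank if key == prev_key else pos
        ranks.append((r, country))
        prev_key, prev_rank = key, r
    # Append "=" when a rank value is shared by more than one country.
    shared = {r for r, _ in ranks if sum(1 for r2, _ in ranks if r2 == r) > 1}
    return [(f"{r}=" if r in shared else str(r), c) for r, c in ranks]

for label, country in rank(counts):
    print(label, country)
```

With this subset, Belarus and Taiwan tie on every count and so both come out as "5=", mirroring the shared ranks in the full table.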

Sunday, May 22, 2016

Don’t rush to conclusions from the THE rankings

My 15th May post, "Don't rush to conclusions from the THE rankings", has been republished in University World News under the title "No need for British universities to worry about their dip in THE reputation rankings".

Sunday, May 15, 2016

One more thing about the THE reputation rankings

I have just remembered something about the THE reputation rankings that is worth noting.

THE have broken out the scores for teaching reputation and research reputation for the first fifty universities and this gives us a chance to ask if there is any meaningful difference between teaching and research reputation.

The answer is that there is not. The correlation between the teaching and research scores is .986, so high that for practical purposes they are exactly the same thing. The 15% weighting given for teaching (actually "postgraduate supervision") reputation may be unrelated to undergraduate teaching or even to taught master's teaching. The emphasis on research in the THE world rankings is therefore even higher than THE claims, at least at the top.

This has already been pointed out by Alex Usher of Higher Education Strategy Associates of Canada, who found a correlation of .991 in 2013.
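The figure quoted is presumably a simple Pearson correlation between the two sets of reputation scores. A minimal sketch of the calculation, using made-up teaching and research scores (not THE's actual data), shows how near-identical score lists produce a coefficient close to 1:

```python
import math

def pearson(xs, ys):
    # Pearson correlation: covariance over the product of standard deviations.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical reputation scores for five universities; because the two
# lists move together almost perfectly, the correlation is close to 1.
teaching = [100.0, 81.9, 72.5, 66.2, 37.1]
research = [100.0, 80.7, 73.9, 64.1, 35.5]
print(round(pearson(teaching, research), 3))
```

A correlation this close to 1 means one score is essentially a linear function of the other, which is why the two survey questions can be treated as measuring the same thing.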