Sunday, October 18, 2015

Going Up and Going Down

A revised version of a previous post has been published at University World News. Readers are welcome to comment here.

Sunday, October 11, 2015

More on Politics and Rankings

The Higher Education Minister of Malaysia has praised the country's leading university, Universiti Malaya (UM), for getting into the top 150 of the Quacquarelli Symonds (QS) World University Rankings. He also noted that UM and other Malaysian universities had done well in the QS subject rankings.

The problem with relying on QS or Times Higher Education (THE) is that their rankings are prone to volatility because of their reliance on reputation surveys, which can be unstable outside the top dozen or so universities. Things have been made worse this year by methodological changes. In the case of QS, one change was to give more credit to citations in the humanities and social sciences, thereby helping universities that publish mainly or entirely in English.

A more consistent view of university performance might be found in the Shanghai or US News rankings.

Rankings Become Big Politics

University performance in global rankings has become a favorite weapon of politicians around the world. Scotland's First Minister has noted that there are five Scottish universities in the top 200 of the Times Higher Education World University Rankings and that the Scottish government will "continue to work with our universities to make sure that they continue to be that fantastic success story".

She did not mention that there are only two Scottish universities in the top 200 of the Shanghai rankings and in the US News Best Global Universities.



Thursday, October 08, 2015

Tokyo Metropolitan University is Still in the Japanese Top Ten

Until recently Tokyo Metropolitan University had an advertisement with Times Higher Education proclaiming their perfect score of 100 for citations. This year the score fell to 72.2, so now they just say "TMU ranks 9th among Japanese universities in the Times Higher Education World University Rankings 2015-2016".

I hope they got a discount.

Saturday, October 03, 2015

Where Should Rankers get Data From?

Times Higher Education (THE) have started publishing some basic university statistics on their rankings page: number of students, student-staff ratio, international students and female-male ratio.

Already some observers have noted that the data does not always match that found in institutional and official sources. I have heard that the number of students given for several German universities is significantly lower than that found in other sources.

The Online Citizen in Singapore has found that the island's two leading tertiary institutions, National University of Singapore and Nanyang Technological University, have claimed 34% and 33% international students respectively on the THE site, although in 2013 the Minister of Education had said that the proportion of international students in Singaporean universities was only 16%.

There are several plausible and innocent explanations for this and similar discrepancies. It could be that part-time students, students at branch campuses, online students, permanent residents, students attached to research institutes, or commuters living in Malaysia are counted in one set of figures but not the other.

But there is a serious and general problem with institutional data for university rankings. Even if everybody concerned is completely honest, there are many points at which ambiguous definitions, conflicting estimates, and duplication or omission of data can undermine the accuracy of ranking indicators. In the case of Germany, there might be some argument over whether doctoral candidates count as students or as teaching and/or research staff.

QS used to have a validation hierarchy: national statistics first, followed by institutional data, data from websites, old data, third-party data and smart averages, in that order. If it is still applied rigorously, this would be the best approach.
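For readers who like to see such things concretely, here is a minimal sketch of how a hierarchy of that kind might work. It is my own illustration, not QS's actual procedure or code, and the source labels and figures are invented.

```python
# Minimal sketch of a data-validation hierarchy of the kind described above.
# My own illustration, not QS's procedure; labels and numbers are hypothetical.

PRIORITY = [
    "national_statistics",   # official government statistics
    "institutional_data",    # figures submitted by the university itself
    "website_data",          # figures taken from the university's website
    "old_data",              # a previously validated figure
    "third_party_data",      # external databases and directories
    "smart_average",         # an estimate based on comparable institutions
]

def pick_value(candidates: dict) -> tuple:
    """Return (source, value) for the highest-priority source that supplied a figure."""
    for source in PRIORITY:
        if candidates.get(source) is not None:
            return source, candidates[source]
    raise ValueError("no usable figure for this indicator")

# The institutional figure is used only because nothing more authoritative exists.
print(pick_value({"institutional_data": 34000, "old_data": 31500}))
```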

I understand that both QS and THE reserve the right to overrule institutional data, although how strict they are in practice I do not know. THE have a particularly difficult task since they allow universities to opt in or out as they please. Should THE be too demanding about the data supplied, a university might simply decide not to be ranked for a year.

On balance, it is probably good sense for ranking organisations to rely on publicly accessible data when they can and to minimise input from universities.

Friday, October 02, 2015

Very Interesting Rankings from Times Higher Education


The latest edition of the Times Higher Education (THE) World University Rankings has just been published, along with a big dose of self-flattery and congratulations to the winners of what is beginning to look more like a lottery than an objective exercise in comparative assessment.

The background to the story is that at the end of last year THE broke with their data supplier Thomson Reuters (TR) and announced the dawn of a new era of transparency and accountability.

There were quite a few things wrong with the THE rankings, especially with the citations indicator, which supposedly measured research impact and was given nearly a third of the total weighting. This meant that THE was faced with a serious dilemma. Keeping the old methodology would be a problem, but radical reform would raise the question of why THE would want to change what they claimed was a uniquely trusted and sophisticated methodology with carefully calibrated indicators.

It seems that THE have decided to make a limited number of changes but to postpone making a decision about other issues.

They have broadened the academic reputation survey, sending out forms in more languages and getting more responses from outside the USA. Respondents are now drawn from researchers with publications in the Scopus database, which is much larger than the Web of Science and which now also supplies the publication and citation data. In addition, THE have excluded 649 "freakish" multi-author papers from their calculations and diluted the effect of the regional modification that boosted the scores of low-performing countries in the citations indicator.

These changes have led to implausible fluctuations with some institutions rising or falling dozens or hundreds of places. Fortunately for THE, the latest winners are happy to trumpet their success and the losers so far seem to have lapsed into an embarrassed silence.

When they were published on the 30th of September the rankings provided lots of headline fodder about who was up or down.

The Irish Times announced that the rankings showed  Trinity College Dublin had fallen while University College Dublin was rising.

In the Netherlands the University of Twente bragged about its “sensationally higher scores”.

Study International asserted that “Asia Falters” and that Britain and the US were still dominant in higher education.

The London Daily Telegraph claimed that European universities were matching the US.

The Hindu found something to boast about by noting that India was at last the equal of co-BRICS member Brazil.

Russian media celebrated the remarkable achievement of Lomonosov Moscow State University in rising 35 places.

And, of course, the standard THE narrative was trotted out again. British universities are wonderful but they will only go on being wonderful if they are given as much money as they want and are allowed to admit as many overseas students as they want.

The latest rankings support this narrative of British excellence by showing Oxford and Cambridge overtaking Harvard, which was pushed into sixth place. But is such a claim believable? Has anything happened in the labs or lecture halls at any of those places between 2014 and 2015 to cause such a shift?

In reality, what probably happened was that the Oxbridge duo were not actually doing anything better this year but that Harvard’s eclipse came from a large drop from 92.9 to 83.6 points for THE’s composite teaching indicator. Did Harvard’s teaching really deteriorate over twelve months? It is more likely that there were relatively fewer American respondents in the THE survey but one cannot be sure because there are four other statistics bundled into the indicator.

While British universities appeared to do well, French ones performed disastrously. The École Normale Supérieure recorded a substantial gain, going from 78th to 54th place, but every other French institution in the rankings fell, sometimes by dozens of places. École Polytechnique went from 61st place to 101st, Université Paris-Sud from 120th to 188th, and the University of Strasbourg from the 201-225 band to 301-350, in every case because of a substantial fall in the citations indicator. If switching to Scopus was intended to help non-English-speaking countries, it did not do France any good.

Meanwhile, the advance of Asia has apparently come to an end or gone into screeching reverse. Many Asian universities slipped down the ladder, although the top Chinese schools held their ground. Some Japanese and Korean universities fell dozens of places. The University of Tokyo went from 23rd to 43rd place, largely because of a fall in the citations indicator from 74.7 points to 60.9, and the University of Kyoto from 59th to 88th, with another drop in the score for citations. Among the casualties was Tokyo Metropolitan University, which used to advertise its perfect score of 100 for citations on the THE website. This year, stripped of the citations for mega-papers in physics, its citation score dropped to a rather tepid 72.2.

The Korean flagships have also foundered. Seoul National University fell 35 places and the Korea Advanced Institute of Science and Technology (KAIST) fell 66, largely because of declines in the scores for teaching and research. Pohang University of Science and Technology (POSTECH) fell 50 places, losing points in all indicators except income from industry.

The most catastrophic fall was in Turkey. There were four Turkish universities in the top 200 last year; all of them have dropped out. Several Turkish universities contributed to the Large Hadron Collider project, with its multiple authors and multiple citations, and they also benefited from producing comparatively few research papers and from the regional modification, all of which gave them artificially high scores for the citations indicator in 2014 but not this year.

The worst case was Middle East Technical University, which held 85th place in 2014, helped by an outstanding score of 92 for citations and reasonable scores for the other indicators. This year it was in the 501-600 band, with reduced scores for everything except industry income and a very low score of 28.8 for citations.

The new rankings appear to have restored the privilege given to medical research. In the upper reaches we find St George's, University of London, a medical school which according to THE is the world's leading university for research impact; Charité - Universitätsmedizin Berlin, a teaching hospital affiliated to Humboldt University and the Free University of Berlin; and Oregon Health and Science University.

It also appears that THE's methodology continues to give an undeserved advantage to small or specialized institutions such as the Scuola Superiore Sant'Anna in Pisa, which does not appear to be a truly independent university; the Copenhagen Business School; and Rush University in Chicago, the academic branch of a private hospital.

These rankings appear so far to have got a good reception in the mainstream press, although it is likely that before long we will hear some negative reactions from independent experts and from Japan, Korea, France, Italy and the Middle East.

THE, however, have just postponed the hard decisions that they will eventually have to make.

Monday, September 28, 2015

Japanese Barbarians Out to Crush Humanities!

The international education media has been getting very excited recently about what appeared to be an extraordinary act of cultural vandalism by the Japanese Ministry of Education.

It seems that the ministry has been behaving like the Taliban on a rampage through the Louvre and has ordered public universities to stop teaching the humanities and social sciences.

Noah Smith, an Assistant Professor of Finance at Stony Brook University, SUNY, and a freelance writer, wrote that public universities had been ordered to stop teaching social sciences, humanities and law, although apparently the "order" was non-binding.

Meanwhile Takamitsu Sawa announced in the Japan Times that the humanities were under attack and that someone on the ministry's panel of learned persons had said that students should study accounting software instead of Samuelson's Economics and translation instead of Shakespeare.

Eventually, the Financial Times revealed that the ministry had been misinterpreted and that the abolition of the humanities referred to a number of unneeded teacher training programs. This was supported by an authoritative comment from a former government official.

So it seems that Samuelson and Shakespeare are safe from the rampage of utilitarian barbarians.

Perhaps Japanese universities can now adopt the best practices of Columbia and the University at Buffalo for the teaching of art.


Sunday, September 27, 2015

Latest on the THE Rankings Methodology

Times Higher Education (THE) have now officially announced the methodology of next week's World University Rankings. There are some changes although major problems are still not addressed.

First, THE is now getting data from Scopus rather than Thomson Reuters. The Scopus database is more inclusive -- it covers 22,000 publications compared with the Web of Science's 12,000 -- and includes more papers from non-English-speaking countries, so this may give an advantage to some universities in Eastern Europe and Asia.

Second, THE has tried to make its reputation survey more inclusive, making forms available in an additional six languages and reducing the bias towards the USA.

Third, 649 papers with more than 1,000 listed authors, mainly in physics, will not be counted for the citations indicator.

Fourth, the citations indicator will be divided into two halves with equal weighting. One half will be calculated with and one half without the "regional modification", by which the overall citation impact score of a university is divided by the square root of the score for the country in which it is located. In previous editions of these rankings this modification gave a big boost to universities in low-scoring countries such as Chile, India, Turkey and Russia.
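To make the arithmetic concrete, here is a rough sketch of how the new 50:50 arrangement might work. It is based on the description above rather than on THE's actual calculation, and the assumption that impact is expressed with the world average at 1.0 is mine.

```python
import math

# Rough sketch of the blended citations indicator described above.
# Based on THE's published description, not their actual code; "impact" is
# assumed to be field-normalised citation impact with the world average at 1.0.

def blended_impact(university_impact: float, country_avg_impact: float) -> float:
    """Average the unmodified impact with the regionally modified impact, 50:50."""
    modified = university_impact / math.sqrt(country_avg_impact)
    return 0.5 * university_impact + 0.5 * modified

# A university at the world average (1.0) in a country averaging 0.25 used to be
# lifted to 1.0 / sqrt(0.25) = 2.0; under the 50:50 blend it now gets only 1.5.
print(blended_impact(1.0, 0.25))   # 1.5
```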

It is likely that institutions such as Bogazici University, Panjab University, Federico Santa Maria Technical University and Universite Cadi Ayyad, which have benefited from contributing to mega-papers such as those emanating from the Large Hadron Collider project, will suffer from the exclusion of these papers from the citations indicator. Their pain will be increased by the dilution of the regional modification.

It is possible that such places may get some compensation in the form of more responses in the reputation survey or higher publication counts in the Scopus database but that is far from certain. I suspect that several university administrators are going to be very miserable next Thursday.

There is something else that should not be forgotten. The scores published by THE are not raw data but standardised scores derived from means and standard deviations. Since THE are including more universities in this year's rankings, and since most of them are likely to have low scores for most indicators, it follows that the overall mean scores of ranked universities will fall. This will have the effect of raising the standardised scores of the 400 or so universities that score above the mean. It is likely that this effect will vary from indicator to indicator, and so the final overall scores will be even more unpredictable and volatile.
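A toy example, using invented numbers of my own, shows the direction of the effect. How much any given university gains depends on how the new entrants shift both the mean and the spread of each indicator, which is exactly why the final scores are hard to predict.

```python
import statistics

# Toy illustration of the standardisation effect described above; the numbers
# are invented. THE converts raw indicator values to z-scores (and then to a
# 0-100 scale), so a university's published score depends on the mean and
# spread of the whole ranked population, not just on its own performance.

def z_score(value: float, population: list) -> float:
    return (value - statistics.mean(population)) / statistics.pstdev(population)

last_year = [90, 70, 50, 30, 10]             # a small ranked population
this_year = last_year + [20, 20, 20, 20]     # newly added low scorers

# The same raw performance of 90 looks stronger once the newcomers drag the mean down.
print(round(z_score(90, last_year), 2))   # 1.41
print(round(z_score(90, this_year), 2))   # 2.02
```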

Tuesday, September 22, 2015

Looking Inside the Engine: The Structure of the Round University Rankings

Many of those interested in international university rankings have been frustrated by the lack of transparency in the Quacquarelli Symonds (QS) and the Times Higher Education (THE) rankings.

The QS rankings assign a fifty per cent weighting to two surveys collected from a variety of channels -- I think six for the employer survey and five for the academic survey -- with different and fluctuating response rates.

The THE rankings have lumped five indicators in a Teaching cluster, three in a Research cluster and three in an International cluster. So how can anyone figure out just what is causing a university to rise or fall in the rankings?

A major step forward in transparency has now come with the recent publication of the Round University Rankings (RUR) by a Russian organisation that uses data from Thomson Reuters (TR), who provided the data for the Times Higher Education world and regional rankings from 2009 until the end of last year.

RUR have published the separate scores for all of the indicators. They have retained 12 out of the 13 indicators used in the THE rankings from 2011 to 2014, dropping income from industry as a percentage of research income, and added another eight.

I doubt that RUR could afford to pay TR very much for the data and I suspect that TR's motive in allowing the dissemination of such a large amount of information is to preempt THE or anyone else trying to move upstream in the drive to monetise data.

It is now possible to see whether the various indicators are measuring the same thing and hence are redundant, whether and to what extent they are associated with other indicators and whether there is any link between markers of input and markers of output.

Here is a crude analysis of a very small sample of sixteen universities, one in fifty of the RUR rankings, starting with Harvard and ending with the Latvia Transport and Telecom Institute. I hope that a more detailed analysis of the entire corpus can be done in a few weeks.
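For what it is worth, the sampling and correlation calculations are of the kind sketched below. This is only an outline of the approach, not the exact calculation, and the file name and column names are my own invention.

```python
import pandas as pd
from scipy.stats import pearsonr

# Sketch of the sampling and correlation analysis described above; the file
# name and column names are hypothetical.

scores = pd.read_csv("rur_2015_scores.csv")   # one row per university, in rank order

# Every fiftieth university, from Harvard (rank 1) downwards: a sample of about sixteen.
sample = scores.iloc[::50]

# Pearson correlation, with its p-value, between two indicator groups.
r, p = pearsonr(sample["teaching"], sample["financial_sustainability"])
print(f"Teaching vs Financial Sustainability: r = {r:.3f}, p = {p:.3f}")

# Or the whole correlation matrix across the indicator columns at once.
print(sample.corr(numeric_only=True).round(3))
```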

The combined indicator groups

Three groups, Teaching, Research, Financial Sustainability, are fairly closely associated with one another. The teaching cluster correlates .634 with Research and .735 with Financial Sustainability. Research correlates .702 with Financial Sustainability.

The International Diversity group appears to be the odd one out here. It correlates significantly with Research (.555) but not with Teaching or Financial Sustainability. This suggests that internationalisation, at least in the form of recruiting more international students, may not always be a strong marker of quality.


The Reputation Indicators

Looking at the three reputation indicators, teaching, international teaching and research, we can see that for practical purposes they are measuring the same thing. The correlation between the Research Reputation and Teaching Reputation scores is .986 and between Research Reputation and International Teaching Reputation .925. Between Teaching Reputation and International Teaching Reputation it is .941.

Alex Usher of Higher Education Strategy Associates has claimed a correlation of .99 between teaching and research reputation scores in the THE rankings up to 2014. The figures from the RUR rankings are a bit lower but essentially the reputation indicators are measuring the same thing, whatever it is, and there is no need to count them more than once.

Other Unnecessary Indicators

Turning to other indicators, the correlation between Academic Staff per Student and Academic Staff per Bachelor Degrees is high at .834. The latter, which has not appeared in any previous ranking, could be omitted without a significant loss of information.

There is an extremely high correlation, .989, between Citations per Academic and Research Staff and Papers per Academic and Research Staff. It sounds rather counter-intuitive, but it seems that as a measure of research productivity one is as good as the other, at least when dealing with more than a few hundred elite universities.

There is a correlation of .906 between Institutional Income per Academic Staff and Institutional Income per Student.

It would appear, then, that the THE rankings of 2011-2014, with 13 indicators, had already passed the point at which additional indicators become redundant and provide no further information.

Inputs and Outputs

There are some clues about the possible relationship between indicators that could be regarded as inputs and those that might be counted as outputs.

Academic Staff per Student does not correlate significantly with Teaching Reputation (.350, sig .183). It is positively and significantly associated only with Doctoral Degrees per Bachelor Degrees (.510). The correlation with the overall score is, however, quite high and significant at .552.

There is some evidence that a diverse international faculty might have a positive impact on research output and quality. The correlations between International Faculty and Normalised Citation Impact, Papers per Academic and Research Staff, and the overall score are positive and significant. On the other hand, the correlations of International Collaboration and of International Students with the overall score are weak and insignificant.

Money seems to help, at least as far as research is concerned. There are moderately high and significant correlations between Institutional Income per Academic Staff and Citations per Academic and Research Staff, Papers per Academic and Research Staff, Normalised Citation Impact and the overall score.

Research Income per Academic Staff correlates highly and significantly with Teaching Reputation, International Teaching Reputation, Research Reputation, Citations per Academic  and Research Staff, Papers per Academic and Research Staff, Normalised Citation Impact and the overall score.

Saturday, September 19, 2015

Who's Interested in the QS World University Rankings?

And here are the first ten results (excluding this blog and the QS page) from a Google search for this year's QS world rankings. Compare with ARWU and RUR. Does anyone notice any patterns?


Canada falls in World University Rankings' 2015 list

UBC places 50th, SFU 225th in QS World University Rankings

Who's Interested in the Round University Rankings?


The top results from a Google search for responses to the recently published Round University Rankings:



New Ranking from Russia

Who's Interested in the Shanghai Rankings?

First results from a Google search for responses to the latest edition of the Shanghai world rankings.

Radboud University: 132nd place on ARWU/Shanghai ranking 2015

Friday, September 18, 2015

The Italian Ranking Dance

As noted in the previous post, the latest QS world rankings have not been well received by the Italian blog ROARS. Their opinion of the reaction of the Italian media and public was summarised by posting the following video.





Who believes QS?

From the Italian site ROARS: Return on Academic Research (translated from Italian):


According to the Quacquarelli Symonds (QS) ranking, something must have happened in Siena, because in a single year the university has lost 220 places. Pavia and Turin have collapsed by more than 150 places and dropped out of the top 500; Pisa, Tor Vergata, Federico II of Naples, the Catholic University of Milan, Genoa, Perugia and Bicocca have each lost more than 100 positions. The meltdown is simply due to the fact that QS has changed the methodology used to construct its ranking. Only the Polytechnics of Milan and Turin gained places, as predicted by Richard Holmes more than a month ago when news of the change in methodology emerged. I hope that the collapse of the Italian university system in 2015, "certified" by QS and caused by the change of methodology, will be a lesson: rankings are not a serious way to evaluate the performance of universities. Unfortunately, judging from the press releases of POLIMI and POLITO, it seems that the lesson has not sunk in.