Tuesday, March 17, 2015

QS Subject Rankings Postponed


From the QS topuniversities site

The planned publication of the QS World University Rankings by Subject 2015 has been postponed for the next few weeks.
In 2015, we have introduced minor methodological refinements which have allowed for improved discrimination, particularly among specialist institutions that are now featuring more materially in our work. Our new subjects for this year include several – such as Veterinary Sciences, Art & Design, Architecture, Dentistry, and Business & Management Studies – that are delivered by single-faculty institutions as well as large, comprehensive universities.
We approach our work with passion, dedication, integrity and a strong sense of responsibility. In response to feedback, we have decided to extend the consultation process, to fully articulate the methodological refinements of the QS World University Rankings by Subject.
Please check back on TopUniversities.com for details on the revised release date of the QS World University Rankings by Subject 2015.

Monday, March 16, 2015

Malaysia and the Rankings: The Saga Continues

Malaysia has had a long and complicated relationship with global rankings ever since that wonderful moment in 2004 when the Times Higher Education Supplement (THES)–Quacquarelli Symonds (QS) World University Rankings, as they were then, put Universiti Malaya (UM), the country's oldest institution, in the top 100 universities of the world.

It turned out that UM was only in the top 100 because of a ridiculous error by data gatherers QS, who counted ethnic Indians and Chinese as international and so boosted the scores for the international faculty and international student indicators. This was followed in 2005 by the correction of the error, or "clarification of data" as THES put it, and UM's extraordinary fall out of the top 100, often explained by higher education experts as a change in methodology.

There was another fall in 2007 when QS introduced several methodological changes, including the use of z-scores, that is, calibrating scores against the indicator means, and prohibiting survey respondents from voting for their own universities.
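The z-score calibration mentioned above can be sketched in a few lines: each university's raw indicator value is expressed in standard deviations from the mean across all ranked universities, so scores on different indicators become comparable. The figures below are invented for illustration, not actual QS data.

```python
# Minimal sketch of z-score standardisation as used in rankings
# methodology: calibrate each raw indicator value against the
# mean and standard deviation of all ranked universities.
# The numbers are illustrative, not actual QS data.

def z_scores(raw):
    mean = sum(raw) / len(raw)
    sd = (sum((x - mean) ** 2 for x in raw) / len(raw)) ** 0.5
    return [(x - mean) / sd for x in raw]

# e.g. a citations-per-faculty indicator for five hypothetical universities
citations_per_faculty = [12.0, 45.0, 30.0, 8.0, 5.0]
print([round(z, 2) for z in z_scores(citations_per_faculty)])
# → [-0.53, 1.64, 0.66, -0.79, -0.99]
```

One consequence, relevant to the instability discussed below, is that every university's standardised score depends on which other universities are in the pool that year.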

In 2009 UM made something of a recovery rising from 230th in the Times Higher Education (the Supplement bit was dropped in 2008) charts to 180th, largely because of an increase in the number of faculty and a reduction in the number of students reported to QS.

In 2010 THE and QS went their separate ways, publishing their own rankings with different methodologies. UM dropped out of the top 200 of the QS rankings but was back again in 2011 and has now reached 151st place. It has stayed clear of the THE World University Rankings, which require the annual resubmission of data.

Every time UM or any of the other Malaysian universities rises or falls, it becomes a political issue. Ascent is seen as proof of the strength of Malaysian higher education, decline is the result of policy failures.

Recently, Malaysia's Second Minister of Education argued that Malaysian higher education was now world class and on a par with countries such as the United Kingdom, Germany and Australia, because of its improved performance in the QS rankings and because it had attracted 135,000 foreign students.

Not everyone was impressed by this. Opposition MP Tony Pua criticised the reliance on the QS rankings, saying that they had been condemned by prominent academics such as Simon Marginson and that UM was not ranked by THE and performed much less well in the other rankings such as the Academic Ranking of World Universities.

The minister has riposted by noting that four Malaysian researchers were included in Thomson Reuters' list of influential scientific minds and that UM had been given five stars by QS.

So, who's right about Malaysian higher education?

First, Tony Pua is quite right about the inadequacies of the QS world rankings. They can be unstable, since the number of universities included in the rankings changes from year to year, and this can affect the scores for each indicator. Many of the scores for the objective indicators, such as faculty-student ratio and international students, seem exaggerated and appear to have had a bit of massaging somewhere along the line.

The biggest problem with the QS rankings is the academic and employer reputation surveys. These collect data from a variety of sources, have low response rates and are very volatile. They include respondents whose names are submitted by universities and those who nominate themselves. There were some suspiciously high scores for the academic reputation indicator in 2014: Peking University in 19th place, National Taiwan University in 37th, the University of Buenos Aires in 52nd and the Pontifical Catholic University of Chile in 78th.

The employer survey also produces some counter-intuitive results: Università Commerciale Luigi Bocconi in 33rd place, the Indian Institute of Technology Bombay in 60th, the American University of Beirut in 85th and the Universidad de los Andes in 98th.

The QS world rankings can therefore be considered a poor reflection of overall quality.

Some critics have asserted that the THE rankings are superior and that Malaysian universities are being evasive by staying away from them. It is true that THE has won the approval of the British political and educational establishment. David Willetts, former universities and science minister, has joined the advisory board of THE's parent company. THE has highlighted comments on its recent reputation rankings by Greg Clark, the universities, science and cities minister; Vince Cable, the Business Secretary; and Wendy Piatt, director of the Russell Group of research-intensive universities.

However, more informed observers, such as Alex Usher of Higher Education Strategy Associates and Isidro Aguillo of the Cybermetrics Lab, have little regard for these rankings.

Even Simon Marginson, who has moved to the London Institute of Education, now accepts that they are "fatally flawed once outside the top 50 universities".

The THE rankings have some serious methodological flaws. They assign a 33% weighting to a reputation survey. After the top six universities, the number of responses drops off rapidly; outside the top 100 the number of votes is so small that a few additional responses can have a disproportionate effect on the indicator scores, and consequently on overall scores. QS gives an even greater weighting to reputation – 50% – but reduces annual fluctuations by carrying over responses for a further two years if they are not updated. The new Best Global Universities rankings produced by US News take a five-year average of their reputation scores.
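The stabilising effect of averaging survey results over several years can be sketched with a toy calculation; the yearly scores below are invented for illustration and are not actual US News or THE data.

```python
# Illustrative sketch (invented numbers): averaging reputation survey
# scores over a multi-year window, as US News does over five years,
# damps the swings that small response counts produce in any one year.

def rolling_mean(scores, window=5):
    # average each year's score with up to (window - 1) preceding years
    return [sum(scores[max(0, i - window + 1):i + 1]) /
            (i + 1 - max(0, i - window + 1))
            for i in range(len(scores))]

yearly = [40, 55, 38, 60, 42, 58, 41]   # volatile single-year scores
smoothed = rolling_mean(yearly)
print([round(s, 1) for s in smoothed])
```

The single-year scores here swing across a 22-point range, while the five-year averages stay within a few points of each other, which is the point of the smoothing.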

In addition, the THE rankings assign a 30% weighting to their Citations: Research Impact indicator, which is constructed so that publications with hundreds of contributing institutions, usually in physics, astronomy or medicine, can give a university an undeserved score for citations. Since their beginning, the THE rankings have shown bizarre results for research impact, putting places like Alexandria University, Moscow State Engineering Physics Institute, Federico Santa Maria Technical University in Valparaiso and Cadi Ayyad University in Marrakech in the top ranks of the world for research impact.

Yes, QS putting Tsinghua University in overall 47th place is questionable (the Shanghai Academic Ranking of World Universities has it in the 101-150 band), but on balance this is more plausible than putting, as THE does, the Scuola Normale Superiore di Pisa in 63rd place (Shanghai has it in the 301-400 band).

The only situation in which it would make sense for UM to take part in the THE rankings would be if it was going to start a first rate particle physics programme including participation in the Large Hadron Collider project with multi-author publications that would bring in thousands of citations.

Rather than relying on the questionable QS and THE rankings, it would be a good idea to look at the progress of UM according to the less well known but technically competent research-based rankings. The Scimago Institution Rankings show that, in a ranking of higher education institutions by research output, UM was in 718th place in 2009. Since then it has risen to 266th place, behind 16 British (out of 189), 15 German and seven Australian universities and institutes.

This is similar to the CWTS Leiden Ranking, which has UM in 270th place for number of publications (calculated with default settings), or the Shanghai rankings, where Nobel prizes are among the indicators and which place it in the 301-400 band.

This does not necessarily mean that there has been similar progress in graduate employability or in the quality of research. It does, however, mean that for research output Universiti Malaya, and maybe two or three other Malaysian universities, is now competitive with second-tier universities in Britain and Germany.

This is probably not quite what most people mean by world-class but it is not impossible that in a decade UM could, if it sticks to current policies, be the rival of universities like Sheffield, Cardiff or Leeds.

But such progress depends on Malaysian universities focussing on their core missions and not falling into the quagmire of mission creep.

It also depends on something being done to remedy the very poor performance of Malaysian secondary schools. If that does not happen then the future of Malaysian higher education and the Malaysian economy could be very bleak.