Monday, April 18, 2016

Round University Rankings


The latest Round University Rankings have been released by the Russian company RUR Rankings Agency. These are essentially holistic rankings that attempt to go beyond the measurement of research output and quality. There are twenty indicators, although some of them, such as Teaching Reputation, International Teaching Reputation and Research Reputation, or International Students and International Bachelors, are so similar that the additional information they provide is limited.

Basically these rankings cover much the same ground as the Times Higher Education (THE) World University Rankings. The income from industry indicator is not included but there are an additional eight indicators. The data is taken from Thomson Reuters' Global Institutional Profiles Project (GIPP) which was used by THE for their rankings from 2010 to 2014.

Unlike THE, which lumps its indicators together into groups, the RUR profiles list the scores for each indicator separately. In addition, the rankings provide data for seven consecutive years, from 2010 to 2016. This offers an unusual opportunity to examine in detail the development of universities over seven years, as measured by 20 indicators. This is not the case with other rankings, which have fewer indicators or have changed their methodology.

It should be noted that participation in the GIPP is voluntary, so the universities included in each edition can differ. For example, in 2015, 100 universities dropped out of the project and 62 joined.

It is, however, possible to examine a number of claims that have been made about changes in university quality over the last few years. I will take a look at these in the next few posts.

For the moment, here are the top five in the overall rankings and the dimension rankings.

Overall
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Chicago


Teaching
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Duke

Research
1.   Caltech
2.   Harvard
3.   Stanford
4.   Northwestern University
5.   Erasmus University Rotterdam

International Diversity
1.   EPF Lausanne
2.   Imperial College London
3.   National University of Singapore
4.   University College London
5.   Oxford

Financial Sustainability
1.   Caltech
2.   Harvard
3.   Scuola Normale Superiore Pisa
4.   Pohang University of Science and Technology
5.   Karolinska Institute

Unfortunately these rankings have received little or no recognition outside Russia. Here are some examples of the coverage within Russia.


MIPT entered the top four universities in Russia according to the Round University Ranking

Russian Universities in the lead in terms of growth in the international ranking of Round University Ranking

TSU [Tomsk State University]  has entered the 100 best universities for the quality of teaching

[St Petersburg]

Russian universities to top intl rankings by 2020 – Education Minister Livanov to RT


Saturday, April 16, 2016

Some fairly new news from THE

The latest of many spin-offs from the THE World University Rankings is a list of 150 universities less than 50 years old. The data is extracted from the world rankings released at the end of last year. THE have, however, reduced the weighting given to their academic reputation survey, so the results are a little different.

Is it possible that THE are thinking about reducing the weighting for the reputation survey in this year's world rankings?

Here are the top ten new universities with their overall scores and then the overall scores in the World University Rankings in brackets. It can be seen that changing the weighting for this indicator does in some cases make a difference, although not a spectacular one.


1.   Ecole Polytechnique Federale de Lausanne (Switzerland) 76.8 (76.1)
2.   Nanyang Technological University (Singapore) 72.5 (68.2)
3.   Hong Kong University of Science and Technology (Hong Kong) 70.7 (67.2)
4.   Maastricht University (Netherlands) 66.1 (59.9)
5.   Pohang University of Science and Technology (Korea) 65.5 (56.9)
6.   Korea Advanced Institute of Science and Technology (Korea) 60.8 (53.0)
7.   University of Konstanz (Germany) 58.9 (50.8)
8.   Karlsruhe Institute of Technology (Germany) 58.6 (54.5)
9.   Pierre and Marie Curie University (France) 58.2 (57.0)
10.  Scuola Normale Superiore Pisa (Italy) 57.3 (50.2)

Most of the 150 universities were outside the top 200 in the world rankings and did not receive an overall score (although it could be calculated easily enough) so there is some new data here. 

As expected the young universities rankings have received a lot of publicity from the world media. Here's a sample.

University of Calgary ranks again as a top young global university by Times Higher Education

Thursday, April 14, 2016

Are there any more rankings left?

There seems to be an unending stream of new rankings. So far, from the big three or four, we have had subject rankings, field rankings, European, Asian, African, Latin American, Middle East and North Africa rankings, BRICS and emerging economies rankings, reputation rankings, young universities, old universities, most international universities rankings and research income from industry rankings.

From outside the charmed triangle or square we have had random rankings, length-of-name rankings, green rankings, Twitter and LinkedIn rankings and rich universities rankings, and of course in the USA a mixed bag of best universities for squirrels, gay-friendly colleges, top party schools and so on. I am a keen follower of the latter: when the US Air Force Academy gets in the top ten I shall pack up and move to Antarctica.

So are there any more international university rankings in the pipeline?

A few suggestions. Commonwealth universities, OIC universities, cold universities, high universities (altitude that is), poor universities, fertile universities (measured by branch campuses).

One that would be fun to see would be a Research Impact ranking based on those universities that have achieved a top placing in the THE year- and field-normalised, regionally modified, standardised citations ranking.

Some notable inclusions would be St. George's University of London, Rice University, Tokyo Metropolitan University, Federico Santa Maria Technical University, Florida Institute of Technology, National Research Nuclear University MEPhI and Alexandria University.



Monday, April 11, 2016

Ranking Rankers' Twitter Accounts

Over at Campus Morning Mail, Stephen Matchett reflects on the limited attention that U-Multirank has received compared with THE's ranking of 150 young universities. He notes that THE has 192,000 followers on Twitter, while U-Multirank has only 2,390.

Digging further, here are the number of twitter followers for various people and organisations that have something to do with international university rankings.


World Uni Ranking (THE) 25,400
Phil Baty (THE) 15,300
QS 3,667
Bob Morse (US News) 2,122
Isidro Aguillo (Webometrics) 2,221
Ellie Bothwell, (THE) 2,049
World Univ Ranking (Shanghai) 1,226
Ben Sowter (QS) 1,115
CWTS Leiden 641
Proyecto U-Ranking (Spain) 561
RUR ranking (Russia)   532
5 top 100 (Russia)  355
Centre for Higher Education (Germany) 308
Richard Holmes 175

Sunday, April 10, 2016

Interview with Round University Ranking

See here for the text of an interview with Round University Ranking of Russia.


How to Survive Changes in Ranking Methodology

How to survive changes in ranking methodology

Thursday, April 07, 2016

How to Read International University Rankings


The London Daily Telegraph has published an article on 'How to Read the Different University Rankings', which refers to two international rankings, QS and THE, along with their spin-offs and some national ones. I assume this was intended for British students who might like a different perspective on UK university quality or who might be thinking of venturing abroad.

The article is not very satisfactory as it refers only to the QS and THE rankings and is uncritical.

So here is a brief survey of some global rankings for prospective students.

International university rankings fall into three groups: Internet-based rankings like Webometrics and 4icu; research-based rankings like the Academic Ranking of World Universities (ARWU) produced by the Shanghai Ranking Consultancy; and "holistic" rankings such as the Quacquarelli Symonds (QS) and Times Higher Education (THE) world rankings, which claim to measure reputation, teaching quality, internationalisation and other factors and combine their indicators into a single index.

The first thing that you need to know is where an institution stands in the global university hierarchy. Most rankings cover only a fraction of the world's higher education institutions. Webometrics, however, rates the presence, impact, openness and research quality (measured by Google Scholar Citations) of nearly 24,000 universities. Only 5,483 of these get a score for quality, so a university ranked 5,484 or lower is not doing anything resembling research at all.

While the Webometrics rankings are quite volatile at the top, the distinction between a university in 200th and one in 2,000th place is significant, even more so between 1,000th and 10,000th.

Webometrics can also help determine whether an institution is in fact a university in any sense of the word. If it isn't there the chances are that it isn't really a university. I recently heard of someone who claimed degrees from Gordon University in Florida.  A search of Webometrics revealed nothing so it is very likely that this is not a reputable institution.

Here 4icu is less helpful, since it ranks only 11,606 institutions and does not provide information about specific indicators.

Looking at the research rankings, the oldest and most respected by academics is the ARWU published by Shanghai Ranking Consultancy. This has six criteria, all of them related to research. The methodology is straightforward and perhaps unsophisticated by today's standards. The humanities are excluded, it has nothing to say about teaching and it favours large, old and rich universities. It also privileges medical schools, as shown by last year's world rankings which put the University of California at San Francisco, a medical school, in eighteenth place in the world.

On the other hand, it is stable and sensible and by default the ranking of choice for research students and faculty.

ARWU also publishes subject rankings, which are worth checking. But note that ARWU has proved vulnerable to the tactics of King Abdulaziz University (KAU) in Jeddah, which hands out contracts to over 200 adjunct faculty members who list KAU as a secondary affiliation. This has put the university in the world's top ten in the ARWU mathematics rankings.

There are a number of other global research rankings that could be consulted since they produce results that may be different from Shanghai. These include the National Taiwan University Rankings, the Best Global Universities published by US News, University Ranking by Academic Performance produced by Middle East Technical University and the CWTS Leiden Ranking. These, especially the Leiden Ranking, reach a high level of technical sophistication and should be consulted by anyone thinking of postgraduate study anywhere. The Leiden Ranking is very helpful in providing stability intervals for each score.

Since 2004 a number of rankings have appeared that attempt to go beyond simply measuring research output and quality. These are problematical in many ways. It is very difficult to find data that is comparable across international borders and their methodology can change. In addition, they rely on reputation surveys which can be biased and unreliable. At the moment the two best known international rankings of this type are the World University Rankings published by QS and THE.

It must be pointed out that even these are still top-heavy with research indicators. The QS rankings give a 40% weighting to an opinion survey of research quality and another 20% to citations. Even the faculty-student ratio indicator is sometimes a measure of research rather than teaching, since it can be improved by adding research-only staff to the faculty. The THE rankings allot 30% to five research indicators, another 30% to citations, 2.5% to international research collaboration and 2.5% to research income from industry.
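To make the arithmetic of these weightings concrete, here is a minimal sketch of how a composite ranking score is assembled as a weighted sum of indicator scores. The weight breakdown loosely follows the QS scheme just described, but the exact split and all the indicator scores are invented for illustration; real rankers also normalise scores (QS uses z-scores) before weighting, a step this sketch skips.

```python
# Sketch: how a composite ranking score is typically assembled.
# Weights loosely modelled on the QS scheme discussed above;
# the indicator scores below are invented for illustration.

weights = {
    "academic_survey": 0.40,
    "citations": 0.20,
    "faculty_student_ratio": 0.20,
    "employer_survey": 0.10,
    "international_mix": 0.10,
}

def composite_score(indicator_scores, weights):
    """Weighted sum of 0-100 indicator scores."""
    return sum(weights[k] * indicator_scores[k] for k in weights)

example_university = {
    "academic_survey": 85.0,
    "citations": 60.0,
    "faculty_student_ratio": 70.0,
    "employer_survey": 55.0,
    "international_mix": 40.0,
}

print(round(composite_score(example_university, weights), 2))  # 69.5
```

Note how the 40% survey weighting dominates: a strong survey score can carry an otherwise mediocre set of indicators.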

You should bear in mind that these rankings have been accused, not without justification, of national bias. A paper by Christopher Claassen of the University of Glasgow has found that the QS and THE rankings are seriously biased towards UK universities.

The metrics that do attempt to measure teaching quality in the THE and QS rankings are not very helpful. Both have faculty-student ratio data, but this is a very imprecise proxy for teaching resources. The THE rankings include five indicators in their super-indicator "Teaching: the Learning Environment", two of which measure the number of doctoral students or doctoral degrees, which does not say much about undergraduate instruction.

It is also a good idea to check the scores for the individual indicators that are combined to make up the composite score. If a university has a disproportionately high score on an indicator with a heavy weighting, like QS's academic opinion survey (40%) or THE's citations indicator (30%), then alarm bells should start ringing.
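That sanity check can be operationalised in a rough way: flag any heavily weighted indicator whose score stands far above the university's other indicator scores. The function, thresholds and numbers below are all hypothetical; they simply illustrate the pattern worth looking for.

```python
# Sketch: flag indicators whose score is far above a university's
# other scores -- the warning sign discussed above. All numbers invented.

def flag_outlier_indicators(scores, weights, threshold=25.0, heavy=0.30):
    """Return heavily weighted indicators whose score exceeds the
    mean of the remaining indicators by at least `threshold`."""
    flags = []
    for name, score in scores.items():
        others = [v for k, v in scores.items() if k != name]
        mean_others = sum(others) / len(others)
        if weights[name] >= heavy and score - mean_others >= threshold:
            flags.append(name)
    return flags

# A hypothetical university: strong survey score, weak everything else.
weights = {"survey": 0.40, "citations": 0.30, "staffing": 0.30}
scores = {"survey": 95.0, "citations": 40.0, "staffing": 45.0}

print(flag_outlier_indicators(scores, weights))  # ['survey']
```

A reader doing this by eye is applying the same logic: one inflated, heavily weighted indicator should prompt a closer look at how that score was earned.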

In some ways the new Round University Ranking from Russia is an improvement. It uses data from Thomson Reuters, as THE did until last year, and it does nearly everything that THE and QS do plus a few more things besides. Altogether there are 20 indicators, although some of these, such as the three reputation indicators, are so similar that they are effectively redundant.

Recently a consortium of European organisations and universities created U-Multirank, which takes a different approach. This is basically an online evaluation tool that allows users to choose how they wish to compare and sort universities. Unfortunately, its coverage is rather uneven. Data about teaching and learning is limited for Asia, North America and the UK, although it is good for Western Europe.

International rankings are generally not helpful in providing information about graduate employability, although QS does include a reputation survey of employers and the Jeddah-based Center for World University Rankings counts alumni who become CEOs of major companies.

The main global rankers also publish regional and specialist spin-offs: Latin American, Asian and European rankings, new university rankings, subject rankings. These should be treated with scepticism since they depend on a relatively small number of data points and consequently can be unreliable.

To summarise, these are things to remember when using global rankings:

  • first check with Webometrics to find out the approximate standing of a university in the world
  • for prospective postgraduate students in science and medicine, the Shanghai rankings should be consulted and confirmed by looking at other research rankings
  • potential undergraduate students should look at more than one "holistic" ranking and should always check the scores for specific indicators
  • be very sceptical of universities that get a good score for only one or two indicators or that do well in only one ranking
  • always look at scores for subjects if available but remember that they may be based on limited and unreliable data
  • for teaching related information the best source is probably U-Multirank but this is available only for some universities.







Friday, April 01, 2016

Really Useful Rankings: The Lowest Ranked Universities in the World

University rankings are getting a bit predictable, especially at the top. The top university is either in California or Cambridge, Massachusetts (or once in another Cambridge somewhere). The top twenty will be nearly all US places with a few British. Some of the established rankers do try to liven things up a bit: Nanyang Technological University in the top twenty, Middle East Technical University in the top hundred, but that does not do much to relieve the tedium.

So I thought it might be interesting to find the lowest ranked universities in the world. The trouble is that most rankings do not rank many institutions, so the best place to look is Webometrics, which tries to rank every university on the planet. Here are the 22 institutions ranked equal 23,892nd by Webometrics, which means they have no Presence, no Impact, no Openness and not one Google Scholar Citation. Whether their existence goes beyond a website to include buildings and people I'll leave for you to find out.

Instituto Oswaldo Cruz de Certificacao, Brazil
Faculdade des Americas, Brazil
Escuela Normal Prescolar Adolfo Viguri Viguri, Mexico
Universidad Continente Americano Celaya, Mexico
Colegio Universitario y Tecnologico del Noreste, Mexico
Benemerita Escuela Normal Federalizada de Tamaulipas, Mexico
Universidad Virtual Latinoamericano, Venezuela
Faculdade Zumbi dos Palmares, Brazil
Faculdade Vizcaya, Brazil
Mugla Vocational Higher School, Turkey
Higher Polytechnic School Zagreb, Croatia
Sadat Institute of Higher Education, Afghanistan
Saint Paul Institute, Cambodia
Zhangzhou Institute of Technology, China
Technological University Thanlyin, Myanmar
Abasyn University, Pakistan
Salipur College, India
Adusumilli Vijaya Group of Colleges, India
Kaboora Institute of Higher Education, Afghanistan
Madina Engineering College, India
Universidade Lueji A'Nkonde, Angola
Reseau Marwan, Morocco






Sunday, March 27, 2016

The Goldman Sachs Rankings

From efinancialcareers

"Goldman Sachs recruits primarily from US Ivy League universities. This might sound entirely predictable, but Goldman does have some surprises in the places it recruits from. In Asia, for example, it recruits mostly from local universities. In the UK, it seems to draw most of its operations hires from Warwick.
Based on the 22,000 Goldman Sachs CVs in the eFinancialCareers CV database, we’ve created rankings of the top universities the bank hires from by sector and region.
Globally, the best university for getting into Goldman Sachs is the London School of Economics (LSE), our data suggests, followed by Columbia University and the University of Pennsylvania – both of which have a strong emphasis on financial services careers."
Globally the best places for a career with Goldman Sachs are:

1.  London School of Economics
2.  Columbia University
3.  University of Pennsylvania
4.  Princeton University
5.  Harvard University.

In the US they are:

1.  Columbia University
2.  University of Pennsylvania
3.  Princeton University
4.  Cornell University
5.  Harvard University.

In Asia they are:

1.  University of Hong Kong
2.  National University of Singapore
3.  Nanjing University
4.  London School of Economics
5.  Indian Institute of Technology (which one is not stated).

In Europe they are:
1.  London School of Economics
2.  Cambridge University
3.  Imperial College London
4.  Oxford University
5.  Warwick University.




Saturday, March 26, 2016

Ranking Rankings Again


Magnus Gunnarsson of the University of Gothenburg has reminded me of a 2010 report which included an assessment of methodology based on IREG's Berlin principles.

There were 16 Berlin principles, 14 of which were given weighted subscores in the report. The values and weighting were determined  subjectively although the rankers were evidently well informed.

The Berlin principles were grouped in four categories, Purpose and Goals of Rankings, Design and Weighting of Indicators, Collection and Processing of Data and Presentation of Ranking Results. For more information see here.

The ranking of rankings by methodology is as follows. It is obviously out of date.


Position, ranking and overall methodology score:

1=   CHE (Germany) 3.10
1=   Webometrics 3.10
3    HEEACT (now National Taiwan University) 2.90
4    ARWU (Shanghai) 2.80
5=   High Impact Universities (Australia) 2.60
5=   Observatory (sustainable development) 2.60
7=   Scimago 2.40
7=   CWTS Leiden Ranking 2.40
9    THE 2.30
10   Mines Paris Tech 2.20
11   QS 2.10
12   Rater (Russia) 1.70

Wednesday, March 23, 2016

Update on Trinity College Dublin

Some things have been clarified about the now famous message sent by John Boland, vice-president and dean of research at Trinity College Dublin to various stakeholders.

It seems that the message was about both the QS and the THE academic surveys, referring to the sign up facility for the former and the one dollar donation to charity for the latter.

Apparently THE has no problems with the message.

On the other hand, QS thinks that its guidelines have been breached.

"QS has been made aware of recent communications from Trinity College Dublin (TCD) that appear to contravene the guidelines used by QS to ensure the accuracy of its university rankings. In accordance with our standard operating procedure, we have notified TCD that its “awareness” campaign is in breach of these guidelines."

The issue is not suggesting that anyone should go to the sign-up facility; that is permitted under clearly stated guidelines. What is not permitted is suggesting what they should do after they have signed up.

"It is acceptable and even encouraged for institutions to communicate with employers and academics worldwide to showcase their achievements. Institutions are welcome to invite contacts to sign up for possible selection for our survey using our Academic or Employer Sign Up Facilities, but any message soliciting a specific response in our surveys, represents unfair manipulation of the results and will not be tolerated. "

It is fairly clear what the dean of research was suggesting people should do, although I can see how he might have thought he was being subtle enough to get around the QS guidelines.

One interesting aspect of this affair is that QS is much stricter about influencing the surveys than THE.

I think the main lesson to be learnt from this affair is that it is unwise to allow senior administrators to get involved with any sort of rankings initiative or strategy.





Sunday, March 20, 2016

Turning the Tide: Contributing to the Decline of American Universities


Harvard Graduate School of Education has come out with a plan to radically overhaul the admissions process in American universities. The document has been endorsed by a huge, if that word won't trigger a trauma for someone, number of certified academic gatekeepers.

The report argues that the current university admissions process emphasises personal success at the expense of community engagement, puts too much stress on applicants and discriminates against students from disadvantaged communities.

Proposals include "promoting more meaningful contributions to others", "assessing students' ethical engagement and contributions to others in ways that reflect varying types of family and community contributions across race, culture and class" and "redefining achievement in ways that both level the playing field for economically diverse students and reduce excessive achievement pressure."

Detailed recommendations include students doing at least a year of "sustained service or community engagement", which could include working to contribute to family finances. The report recommends that what should count for university admissions is "whether students immersed themselves in an experience and the emotional and ethical awareness and skills generated by that experience."

It is predictable that this will greatly increase the stress experienced by applicants who would have to find some sort of service for a year, immerse themselves in it, generate emotional and ethical skills and then be able to convince admissions officers that they have done so. Or, let us be realistic, find teachers or advisors who will show them how to do this. It will also prompt a massive invasion of privacy if colleges are really going to demand evidence that community service is authentic and meaningful.

Students will also be encouraged to undertake activities that "deepens their appreciation of diversity." Would it be too cynical to suspect that this is code for political indoctrination?

The report also urges that students take fewer AP and IB courses and that colleges should consider making the SAT and ACT optional.

None of these are particularly novel but put together they are likely to cause a shift in the qualities required for admission to America's elite schools. Students will care for others, be passionate about community engagement, appreciate diversity and have authentic extra-curricular activities or at least they will be highly skilled at pretending that they are. They will also be less academically able and prepared and universities will inevitably have to adjust to this.

Also, admissions offices will require more money and resources to make sure that those diversity experiences are meaningful and that community engagement is authentic. But that shouldn't be a problem: somehow money is always available when it is really needed.

Over the next few years look for a migration of talented students and researchers from the US.

Trinity College Dublin: Waiting for Clarification

Yesterday I published a post about Trinity College Dublin that referred to a report in the London Sunday Times about TCD encouraging its graduates to respond to questionnaires from THE.

Now I have seen a report in Inside Higher Education that says that TCD has been encouraging people to sign up for the QS academic survey sign up facility and to vote for the college.

I have therefore deleted the post until it is clear whether TCD has been sending messages about the QS or the THE surveys or both, and whether it has violated the guidelines of either or both.


Friday, March 18, 2016




"What we are seeing worldwide, from India to the UK to the US, is the rebellion against the inner circle of no-skin-in-the-game policymaking "clerks" and journalists-insiders, that class of paternalistic semi-intellectual experts with some Ivy league, Oxford-Cambridge, or similar label-driven education who are telling the rest of us 1) what to do, 2) what to eat, 3) how to speak, 4) how to think... and 5) who to vote for.
With psychology papers replicating less than 40%, dietary advice reversing after 30y of fatphobia, macroeconomic analysis working worse than astrology, microeconomic papers wrong 40% of the time, the appointment of Bernanke who was less than clueless of the risks, and pharmaceutical trials replicating only 1/5th of the time, people are perfectly entitled to rely on their own ancestral instinct and listen to their grandmothers with a better track record than these policymaking goons."

And, one might add, the publishers of sophisticated and prestigious rankings who would have us believe that a university can be world-class one year and at the bottom of the table the next.