Friday, April 22, 2016

No more excellence at the University of Missouri






No, it is not a new initiative to identify more inclusive measures of achievement.


From Peter Wood at Minding the Campus:

"The University of Missouri has eliminated Respect and Excellence.  I have to write this in a hurry because it won’t be long before others will seize on this gift.  Respect and Excellence are the names for two residence halls at the University.  They are being closed because the University suddenly finds that its enrollments are plummeting.  Two other dorms were closed already in light of the crisis.
Let’s bask in the irony for a moment or two longer.  The University of Missouri arrived at this juncture by cravenly submitting to the demands of activists and the threats of football players who decided to abet the activists.  On November 9, System President Tim Wolfe and Chancellor R. Bowen Loftin resigned rather than face down those threats."
There is definitely something wrong with the administration if they did not anticipate the reaction to this.

Trinity College Dublin: A Case of Rankings Abuse



Trinity College Dublin (TCD) is giving itself a public flogging over its fall in the rankings. This is rather odd since in fact it has been doing pretty well over the last few years with one exception.

An article in the Irish Times reveals that the Provost of TCD, Patrick Prendergast, is planning a new business school and student accommodation but finds it difficult to raise the necessary cash.


"The college’s planned expansion comes at a time when many in higher education say the sector faces a funding crisis. Rising student numbers, declining state funding and restrictions on staff recruitment mean that many have had to make dramatic cuts to make ends meet.
Trinity and some of the bigger universities have at least been able to plug many of the funding gaps with private income, such as international students’ fees, research and other commercial sources."
According to the provost things are getting pretty rough.
“We have overcrowded classrooms. Our staff-student ratio is a way out of kilter. The universities have been very resilient; they have managed to keep going successfully. But is it sustainable? I and other university presidents don’t think it is sustainable at current funding levels.”
What does this have to do with rankings?
"University rankings are considered vital to attracting international students and research funding. However, Trinity, like many Irish universities, has slid down world rankings in recent years as it copes with an increase in students and a reduction in funding.
Last month, TCD found itself at the centre of controversy when one of the main ranking agencies, QS, accused the college of violating its rules by influencing academics involved in its annual survey."
The provost claims:


“If Ireland really wants to be an island known for the talent of its people, and have companies locate here, then we can’t afford to have that one global indicator of the quality of education systems – rankings – decline."

But is it true that TCD is sliding down the rankings?

Let's take a look at the Shanghai Academic Ranking of World Universities (ARWU). This measures research in the natural and social sciences at the highest level. The methodology of these rankings has remained largely stable with some minor exceptions.  In 2004  there were some changes to help social science institutions a bit and in 2014 they began to use a new list of highly cited researchers. The latter move helped TCD a bit in 2014 and, assuming the highly cited researchers do not leave, will help a bit more later this year. 

Therefore, if there has been a significant change in scores in these rankings it is a reasonable assumption that this does actually reflect a change in quality.

Looking at the scores for the separate indicators in the Shanghai rankings, the score for alumni with Nobel and Fields awards fell from 15.4 to 10.3, for faculty with awards from 14.4 to 13.3, and for papers in Nature and Science very slightly from 13.2 to 13.1 between 2004 and 2015.

In contrast, TCD did much better for highly cited researchers (from zero to 12.3), publications in Web of Science journals (27.1 to 31.0) and productivity per capita (the sum of the above indicators divided by number of faculty). For all criteria, the top scoring university, Caltech for productivity and Harvard for everything else, gets 100.
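As a back-of-the-envelope check, the published ARWU weights (10% alumni, 20% each for awards, highly cited researchers, Nature and Science papers, and publications, 10% productivity per capita) can be applied to the indicator scores quoted above. The Python sketch below omits the per-capita indicator, since its score is not quoted here, and renormalises the remaining weights; it is an illustration, not ARWU's exact calculation.

```python
# Rough ARWU-style weighted score for TCD, using the indicator
# scores quoted above. The per-capita indicator is omitted, so the
# remaining weights (summing to 0.90) are renormalised.
WEIGHTS = {"alumni": 0.10, "awards": 0.20, "hici": 0.20,
           "nature_science": 0.20, "publications": 0.20}

tcd_2004 = {"alumni": 15.4, "awards": 14.4, "hici": 0.0,
            "nature_science": 13.2, "publications": 27.1}
tcd_2015 = {"alumni": 10.3, "awards": 13.3, "hici": 12.3,
            "nature_science": 13.1, "publications": 31.0}

def weighted_score(scores):
    """Weighted sum of indicator scores, renormalised over the
    weights actually used."""
    total_weight = sum(WEIGHTS.values())
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS) / total_weight

print(round(weighted_score(tcd_2004), 1))
print(round(weighted_score(tcd_2015), 1))
```

On these figures the renormalised score rises from about 13.9 in 2004 to about 16.6 in 2015, consistent with TCD's move up the bands, with the new highly cited researchers doing most of the work.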

Overall, TCD moved from the 202-300 band to the 151-200 band. A rough guess is that TCD has moved up  about 15 places altogether. About five of these places were gained because of the new highly cited researchers list. At this rate it could be in the top 100 in another century or so. That is very long term but as Basil Fawlty said, "these things take time".

Of course, the Shanghai rankings do not measure teaching, quality of students, income or internationalisation. For a detailed look at a more diverse set of criteria we can turn to the Round University Rankings. These are published by a Russian company but use data from Thomson Reuters, the same source that Times Higher Education (THE) used from 2010 to 2014. There are, however, 20 indicators compared to THE's 13, and they cover the period from 2010 to 2016.

Overall, TCD improved significantly, going from 174th to 102nd place.

In the teaching dimension, TCD rose slightly from 207th place to 197th, doing slightly better for two doctoral degree indicators, quite a bit worse for two academic staff indicators and staying in exactly the same place for teaching reputation (187th).

For research, TCD improved significantly from 2010 to 2016, going from 193rd to 67th, doing much better for papers, citations, completed doctoral degrees and research reputation but worse for normalised citation impact.

TCD did very well for international diversity in 2010 (31st) but slipped back a bit by 2016 (46th).

For financial sustainability, TCD's relative position worsened considerably, falling from 229th to 412th.

So the data from RUR suggest that TCD's income and faculty resources were declining relative to other universities over this period. But so far this has had no significant effect on TCD's teaching profile, while research has improved noticeably in both quantity and quality.

TCD could quite plausibly claim to be the university with a tiger in its tank, getting more research and educating more students with limited financial and faculty resources.

Turning to the QS scores, TCD was ranked 87th in the world in 2004 and had risen to 71st place by 2014, before falling back to 78th in 2015, still better than in 2004. It is not a good idea to draw any conclusions from the decline between 2014 and 2015 because in 2015 QS introduced a number of substantial methodological changes.

To summarise, TCD was doing better over fairly long periods in the Shanghai, RUR and QS rankings. Possibly the fall in the QS rankings between 2014 and 2015 portends difficulty ahead, and perhaps the fall in income documented in RUR may eventually have a knock-on effect, although so far it has not. Still, it seems that TCD has on the whole been doing well. So what is the Provost talking about?

It is the THE rankings that appear to show TCD in a bad light. In 2010 TCD was in 76th place overall and by 2015 had fallen to 160th. It is difficult to tell exactly what happened because 11 of the 13 indicators are bundled into three super-indicators and it is not clear exactly what contributed to rising or falling scores. In 2015 TCD had higher scores for International Orientation and lower scores for Teaching, Research, Citations and Income from Industry. The biggest decline was in the Research score, from 45.3 to 30.8.

Clearly, THE is the odd man out as far as TCD is concerned and should not be taken too seriously. Firstly, there was a batch of methodological changes in 2011 and another set of major changes last year, which produced upheavals for universities around the world, including those in France, Korea and Turkey. In addition, these rankings generate a lot of noise because of exchange rate fluctuations, the use of surveys which can be quite volatile outside the top fifty or so, and a citations indicator where (until last year) a single paper or adjunct faculty member could produce an enormous change in scores.

THE have said  -- and here they must be given credit -- that:  "Because of changes in the underlying data, we strongly advise against direct comparisons with previous years’ World University Rankings."

It seems that TCD is doing the academic equivalent of taking a dive.

Monday, April 18, 2016

Off Topic: The Independent Gets Really Creative

The Independent has just posted a creativity test composed of one question. I can remember this question from a non-examination social studies class in grammar school several decades ago.

Surely the voice of the British intelligentsia can be more creative than that.

Round University Rankings


The latest Round University Rankings have been released by the Russian company, RUR Rankings Agency. These are essentially holistic rankings that attempt to go beyond the measurement of research output and quality. There are twenty indicators, although some of them, such as Teaching Reputation, International Teaching Reputation and Research Reputation, or International Students and International Bachelors, are so similar that the extra information they provide is limited.

Basically these rankings cover much the same ground as the Times Higher Education (THE) World University Rankings. The income from industry indicator is not included but there are an additional eight indicators. The data is taken from Thomson Reuters' Global Institutional Profiles Project (GIPP) which was used by THE for their rankings from 2010 to 2014.

Unlike THE, which lumps its indicators together into groups, the scores in the RUR are listed separately in the profiles. In addition, the rankings provide data for seven consecutive years, from 2010 to 2016. This provides an unusual opportunity to examine in detail the development of universities over that period, measured by 20 indicators. This is not the case with other rankings, which have fewer indicators or which have changed their methodology.

It should be noted that participation in the GIPP is voluntary and therefore the universities in each edition could be different. For example, in 2015 100 universities dropped out of the project and 62 joined.

It is, however,  possible to examine a number of claims that have been made about changes in university quality over the last few years. I will  take a look at these in the next few posts.

For the moment, here are the top five in the overall rankings and the dimension rankings.

Overall
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Chicago


Teaching
1.   Caltech
2.   Harvard
3.   Stanford
4.   MIT
5.   Duke

Research
1.   Caltech
2.   Harvard
3.   Stanford
4.   Northwestern University
5.   Erasmus University Rotterdam

International Diversity
1.   EPF Lausanne
2.   Imperial College London
3.   National University of Singapore
4.   University College London
5.   Oxford

Financial Sustainability
1.   Caltech
2.   Harvard
3.   Scuola Normale Superiore Pisa
4.   Pohang University of Science and Technology
5.   Karolinska Institute

Unfortunately these rankings have received little or no recognition outside Russia. Here are some examples of the coverage inside Russia.


MIPT entered the top four universities in Russia according to the Round University Ranking

Russian Universities in the lead in terms of growth in the international ranking of Round University Ranking

TSU [Tomsk State University]  has entered the 100 best universities for the quality of teaching

[St Petersburg]

Russian universities to top intl rankings by 2020 – Education Minister Livanov to RT


Saturday, April 16, 2016

Some fairly new news from THE

The latest of many spin-offs from the THE world university rankings is a list of 150 universities less than 50 years old. The data is extracted from the world rankings released at the end of last year. THE have, however, reduced the weighting given to their academic reputation survey, so the results are a little different.

Is it possible that THE are thinking about reducing the weighting for the reputation survey in this year's world rankings?

Here are the top ten new universities with their overall scores in this ranking, followed in brackets by their overall scores in the World University Rankings. It can be seen that changing the weighting for this indicator does in some cases make a difference, although not a spectacular one.


1.  Ecole Polytechnique Federale Lausanne, Switzerland: 76.8 (76.1)
2.  Nanyang Technological University, Singapore: 72.5 (68.2)
3.  Hong Kong University of Science and Technology, Hong Kong: 70.7 (67.2)
4.  Maastricht University, Netherlands: 66.1 (59.9)
5.  Pohang University of Science and Technology, Korea: 65.5 (56.9)
6.  Korea Advanced Institute of Science and Technology, Korea: 60.8 (53.0)
7.  University of Konstanz, Germany: 58.9 (50.8)
8.  Karlsruhe Institute of Technology, Germany: 58.6 (54.5)
9.  Pierre and Marie Curie University, France: 58.2 (57.0)
10. Scuola Normale Superiore Pisa, Italy: 57.3 (50.2)

Most of the 150 universities were outside the top 200 in the world rankings and did not receive an overall score (although it could be calculated easily enough) so there is some new data here. 
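The arithmetic behind such a re-weighting is easy to sketch. If an overall score is a weighted sum of indicator scores, cutting the reputation weight and redistributing the freed weight proportionally across the other indicators lifts universities that are weak on reputation and drags down those that are strong on it. The weights and scores below are invented for illustration; they are not THE's actual figures.

```python
def reweight(weights, cut_key, new_weight):
    """Reduce one indicator's weight and redistribute the freed weight
    proportionally across the remaining indicators."""
    freed = weights[cut_key] - new_weight
    rest = sum(w for k, w in weights.items() if k != cut_key)
    return {k: (new_weight if k == cut_key else w + freed * w / rest)
            for k, w in weights.items()}

def overall(weights, scores):
    """Weighted sum of indicator scores."""
    return sum(weights[k] * scores[k] for k in weights)

# Invented example: a young university strong on citations, weak on reputation.
weights = {"reputation": 0.33, "citations": 0.30, "other": 0.37}
scores = {"reputation": 40.0, "citations": 80.0, "other": 60.0}

before = overall(weights, scores)
after = overall(reweight(weights, "reputation", 0.15), scores)
print(before, after)  # the overall score rises once reputation counts for less
```

This is presumably why a young university with little accumulated prestige can score noticeably higher in the young universities list than in the parent ranking.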

As expected, the young universities ranking has received a lot of publicity from the world media. Here's a sample.

University of Calgary ranks again as a top young global university by Times Higher Education










Thursday, April 14, 2016

Are there any more rankings left?

There seems to be an unending stream of new rankings. So far, from the big three or four, we have had subject rankings, field rankings, European, Asian, African, Latin American, and Middle East and North Africa rankings, BRICS and emerging economies rankings, reputation rankings, young universities, old universities, most international universities rankings, and research income from industry rankings.

From outside the charmed triangle or square we have had random rankings, length-of-name rankings, green rankings, Twitter and LinkedIn rankings and rich universities rankings, and of course in the USA a mixed bag of best universities for squirrels, gay-friendly colleges, top party schools and so on. I am a keen follower of the latter: when the US Air Force Academy gets in the top ten I shall pack up and move to Antarctica.

So are there any more international university rankings in the pipeline?

A few suggestions. Commonwealth universities, OIC universities, cold universities, high universities (altitude that is), poor universities, fertile universities (measured by branch campuses).

One that would be fun to see would be a Research Impact ranking based on those universities that have achieved a top placing in the THE year- and field-normalised, regionally modified, standardised citations ranking.

Some notable inclusions would be St. George's University of London, Rice University, Tokyo Metropolitan University, Federico Santa Maria Technical University, Florida Institute of Technology, National Research Nuclear University MEPhI and Alexandria University.



Monday, April 11, 2016

Ranking Rankers' Twitter Accounts

Over at Campus Morning Mail, Stephen Matchett reflects on the limited attention that U-Multirank has received compared with THE's ranking of 150 young universities. He notes that THE has 192,000 followers on Twitter, while U-Multirank has only 2,390.

Digging further, here are the numbers of Twitter followers for various people and organisations that have something to do with international university rankings.


World Uni Ranking (THE) 25,400
Phil Baty (THE) 15,300
QS 3,667
Bob Morse (US News) 2,122
Isidro Aguillo (Webometrics) 2,221
Ellie Bothwell, (THE) 2,049
World Univ Ranking (Shanghai) 1,226
Ben Sowter (QS) 1,115
CWTS Leiden 641
Proyecto U-Ranking (Spain) 561
RUR ranking (Russia)   532
5 top 100 (Russia)  355
Centre for Higher Education (Germany) 308
Richard Holmes 175

Sunday, April 10, 2016

Interview with Round University Ranking

See here for the text of an interview with Round University Ranking of Russia.


How to Survive Changes in Ranking Methodology

How to survive changes in ranking methodology

Thursday, April 07, 2016

How to Read International University Rankings


The London Daily Telegraph has published an article on 'How to Read the Different University Rankings' which refers to two international rankings, QS and THE, and their spin-offs, plus some national ones. I assume this was intended for British students who might like to get a different perspective on UK university quality or who might be thinking of venturing abroad.

The article is not very satisfactory as it refers only to the QS and THE rankings and is uncritical.

So here is a brief survey of some global rankings for prospective students.

International university rankings fall into three groups: Internet-based rankings like Webometrics and 4icu, research-based rankings like the Academic Ranking of World Universities (ARWU) produced by the Shanghai Ranking Consultancy, and "holistic" rankings such as the Quacquarelli Symonds (QS) and Times Higher Education (THE) world rankings, which claim to measure reputation, teaching quality, internationalisation and other factors and combine indicators into a single index.

The first thing that you need to know is where an institution stands in the global university hierarchy. Most rankings cover only a fraction of the world's higher education institutions. Webometrics, however, rates the presence, impact, openness and research quality, measured by Google Scholar Citations, of nearly 24,000 universities. Only 5,483 of these get a score for quality, so a university ranked 5,484th or below is not doing anything resembling research at all.

While the Webometrics rankings are quite volatile at the top, the distinction between a university in 200th and one in 2,000th place is significant, even more so between 1,000th and 10,000th.

Webometrics can also help determine whether an institution is in fact a university in any sense of the word. If it isn't there the chances are that it isn't really a university. I recently heard of someone who claimed degrees from Gordon University in Florida.  A search of Webometrics revealed nothing so it is very likely that this is not a reputable institution.

Here 4icu is less helpful since it ranks only 11,606 places and does not provide information about specific indicators.

Looking at the research rankings, the oldest and most respected by academics is the ARWU published by Shanghai Ranking Consultancy. This has six criteria, all of them related to research. The methodology is straightforward and perhaps unsophisticated by today's standards. The humanities are excluded, it has nothing to say about teaching and it favours large, old and rich universities. It also privileges medical schools, as shown by last year's world rankings which put the University of California at San Francisco, a medical school, in eighteenth place in the world.

On the other hand, it is stable and sensible and by default the ranking of choice for research students and faculty.

ARWU also publishes subject rankings which should be checked. But it is worth noting that ARWU has proved vulnerable to the tactics of King Abdulaziz University (KAU) in Jeddah which hands out contracts to over 200 adjunct faculty members who list KAU as a secondary affiliation. This has put the university  in the world's top ten in the ARWU mathematics rankings.

There are a number of other global research rankings that could be consulted since they produce results that may be different from Shanghai. These include the National Taiwan University Rankings, the Best Global Universities published by US News, University Ranking by Academic Performance produced by Middle East Technical University and the CWTS Leiden Ranking. These, especially the Leiden Ranking, reach a high level of technical sophistication and should be consulted by anyone thinking of postgraduate study anywhere. The Leiden Ranking is very helpful in providing stability intervals for each score.

Since 2004 a number of rankings have appeared that attempt to go beyond simply measuring research output and quality. These are problematical in many ways. It is very difficult to find data that is comparable across international borders and their methodology can change. In addition, they rely on reputation surveys which can be biased and unreliable. At the moment the two best known international rankings of this type are the World University Rankings published by QS and THE.

It must be pointed out that even these are still top-heavy with research indicators. The QS rankings have a 40% weighting for an opinion survey of research quality and another 20% for citations. Even the faculty-student ratio indicator is sometimes a measure of research rather than teaching, since it can be improved by adding research-only staff to the faculty. The THE rankings allot 30% to five research indicators, another 30% to citations, 2.5% to international research collaborations and 2.5% to research income from industry.

You should bear in mind that these rankings have been accused, not without justification, of national bias. A paper by Christopher Claassen of the University of Glasgow has found that the QS and THE rankings are seriously biased towards UK universities.

The metrics that do attempt to measure teaching quality in the THE and QS rankings are not very helpful. Both have faculty-student ratio data, but this is a very imprecise proxy for teaching resources. The THE rankings include five indicators in their super-indicator "Teaching: the Learning Environment", two of which measure the number of doctoral students or doctoral degrees, which does not say much about undergraduate instruction.

It is also a good idea to check the scores for the criteria that are combined to make up the composite score. If a university has a disproportionately high score for an indicator with a high weighting like QS's academic opinion survey (40%) or THE's citations indicator (30%) then alarm bells should start ringing.
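One way to run that check is to compute each indicator's contribution to the composite and flag any single indicator supplying an outsized share. In the sketch below the weights mimic the QS scheme described above, but the university's scores and the 50% threshold are hypothetical choices for illustration.

```python
def contribution_shares(weights, scores):
    """Fraction of the composite score contributed by each indicator."""
    total = sum(weights[k] * scores[k] for k in weights)
    return {k: weights[k] * scores[k] / total for k in weights}

def flag_outliers(weights, scores, threshold=0.5):
    """Return indicators contributing more than `threshold` of the composite."""
    return [k for k, share in contribution_shares(weights, scores).items()
            if share > threshold]

# QS-like weights; the scores describe a hypothetical university with a
# stellar survey result and mediocre scores everywhere else.
qs_like_weights = {"academic_survey": 0.40, "citations": 0.20,
                   "faculty_student": 0.20, "employer_survey": 0.10,
                   "international": 0.10}
scores = {"academic_survey": 95.0, "citations": 30.0,
          "faculty_student": 35.0, "employer_survey": 40.0,
          "international": 45.0}

print(flag_outliers(qs_like_weights, scores))  # the survey dominates the composite
```

A university flagged this way is not necessarily gaming anything, but its headline rank is resting on one leg and deserves a closer look.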

In some ways the new Round University Ranking from Russia is an improvement. It uses data from Thomson Reuters as THE did until last year. It does nearly everything that THE and QS do and a few more things besides. Altogether there are 20 indicators, although some of these, such as three reputation indicators,  are so similar that they are effectively redundant.

Recently a consortium of European organisations and universities created U-Multirank, which takes a different approach. This is basically an online evaluation tool that allows users to choose how they wish to compare and sort universities. Unfortunately, its coverage is rather uneven. Data about teaching and learning is limited for Asia, North America and the UK, although good for Western Europe.

International rankings are generally not helpful in providing information about graduate employability although QS do include a reputation survey of employers and the Jeddah based Center for World University Rankings counts alumni who become CEOs of major companies.

The main global rankers also publish regional and specialist spin-offs: Latin American, Asian and European rankings, new university rankings, subject rankings. These should be treated with scepticism since they depend on a relatively small number of data points and consequently can be unreliable.

To summarise, these are things to remember when using global rankings:

  • first check with Webometrics to find out the approximate standing of a university in the world
  • for prospective postgraduate students in science and medicine, the Shanghai rankings should be consulted and confirmed by looking at other research rankings
  • potential undergraduate students should look at more than one "holistic" ranking and should always check the scores for specific indicators
  • be very sceptical of universities that get a good score for only one or two indicators or that do well in only one ranking
  • always look at scores for subjects if available but remember that they may be based on limited and unreliable data
  • for teaching related information the best source is probably U-Multirank but this is available only for some universities.







Friday, April 01, 2016

Really Useful Rankings: The Lowest Ranked Universities in the World

University rankings are getting a bit predictable, especially at the top. The top university is either in California or Cambridge, Massachusetts (or once in another Cambridge somewhere). The top twenty will be nearly all US places with a few British. Some of the established rankers do try to liven things up a bit: Nanyang Technological University in the top twenty, Middle East Technical University in the top hundred, but that does not do much to relieve the tedium.

So I thought it might be interesting to find the lowest ranked universities in the world. The trouble is that most rankings do not rank many institutions, so the best place to look is Webometrics, which tries to rank all the universities on the planet. Here are the 22 institutions ranked 23,892 by Webometrics, which means they have no Presence, no Impact, no Openness and not one Google Scholar citation. Whether their existence goes beyond a website to include buildings and people I'll leave for you to find out.

Instituto Oswaldo Cruz de Certificacao, Brazil
Faculdade des Americas, Brazil
Escuela Normal Prescolar Adolfo Viguri Viguri, Mexico
Universidad Continente Americano Celaya, Mexico
Colegio Universitario y Tecnologico del Noreste, Mexico
Benemerita Escuela Normal Federalizada de Tamaulipas, Mexico
Universidad Virtual Latinoamericano, Venezuela
Faculdade Zumbi dos Palmares, Brazil
Faculdade Vizcaya, Brazil
Mugla Vocational Higher School, Turkey
Higher Polytechnic School Zagreb, Croatia
Sadat Institute of Higher Education, Afghanistan
Saint Paul Institute, Cambodia
Zhangzhou Institute of Technology, China
Technological University Thanlyin, Myanmar
Abasyn University, Pakistan
Salipur College, India
Adusumilli Vijaya Group of Colleges, India
Kaboora Institute of Higher Education, Afghanistan
Madina Engineering College, India
Universidade Lueji A'Nkonde, Angola
Reseau Marwan, Morocco






Sunday, March 27, 2016

The Goldman Sachs Rankings

From efinancialcareers

"Goldman Sachs recruits primarily from US Ivy League universities. This might sound entirely predictable, but Goldman does have some surprises in the places it recruits from. In Asia, for example, it recruits mostly from local universities. In the UK, it seems to draw most of its operations hires from Warwick.
Based on the 22,000 Goldman Sachs CVs in the eFinancialCareers CV database, we’ve created rankings of the top universities the bank hires from by sector and region.
Globally, the best university for getting into Goldman Sachs is the London School of Economics (LSE), our data suggests, followed by Columbia University and the University of Pennsylvania – both of which have a strong emphasis on financial services careers."
Globally the best places for a career with Goldman Sachs are:

1.  London School of Economics
2.  Columbia University
3.  University of Pennsylvania
4.  Princeton University
5.  Harvard University.

In the US they are:

1.  Columbia University
2.  University of Pennsylvania
3.  Princeton University
4.  Cornell University
5.  Harvard University.

In Asia they are:

1.  University of Hong Kong
2.  National University of Singapore
3.  Nanjing University
4.  London School of Economics
5.  Indian Institute of Technology (which one is not stated).

In Europe they are:
1.  London School of Economics
2.  Cambridge University
3.  Imperial College London
4.  Oxford University
5.  Warwick University.




Saturday, March 26, 2016

Ranking Rankings Again


Magnus Gunnarsson of the University of Gothenburg has reminded me of a 2010 report which included an assessment of methodology based on IREG's Berlin principles.

There were 16 Berlin principles, 14 of which were given weighted subscores in the report. The values and weighting were determined  subjectively although the rankers were evidently well informed.

The Berlin principles were grouped in four categories, Purpose and Goals of Rankings, Design and Weighting of Indicators, Collection and Processing of Data and Presentation of Ranking Results. For more information see here.

The ranking of rankings  by methodology is as follows. It is obviously out of date.


Position, ranking and overall method score:

1=  CHE (Germany): 3.10
1=  Webometrics: 3.10
3   HEEACT (now National Taiwan University): 2.90
4   ARWU (Shanghai): 2.80
5=  High Impact Universities (Australia): 2.60
5=  Observatory (sustainable development): 2.60
7=  Scimago: 2.40
7=  CWTS Leiden Ranking: 2.40
9   THE: 2.30
10  Mines ParisTech: 2.20
11  QS: 2.10
12  Rater (Russia): 1.70

Wednesday, March 23, 2016

Update on Trinity College Dublin

Some things have been clarified about the now famous message sent by John Boland, vice-president and dean of research at Trinity College Dublin to various stakeholders.

It seems that the message was about both the QS and THE academic surveys, referring to the sign-up facility for the former and the one-dollar donation to charity for the latter.

Apparently THE has no problems with the message.

On the other hand, QS thinks that its guidelines have been breached.

"QS has been made aware of recent communications from Trinity College Dublin (TCD) that appear to contravene the guidelines used by QS to ensure the accuracy of its university rankings. In accordance with our standard operating procedure, we have notified TCD that its “awareness” campaign is in breach of these guidelines."

The issue is not suggesting that anyone should go to the sign-up facility. That is permitted in clearly stated guidelines. What is not permitted is suggesting what they should do after they have signed up.

"It is acceptable and even encouraged for institutions to communicate with employers and academics worldwide to showcase their achievements. Institutions are welcome to invite contacts to sign up for possible selection for our survey using our Academic or Employer Sign Up Facilities, but any message soliciting a specific response in our surveys, represents unfair manipulation of the results and will not be tolerated. "

It is fairly clear what the dean was suggesting people should do, although I can see how he might have thought he was being subtle enough to get around the QS guidelines.

One interesting aspect of this affair is that QS is much stricter about influencing the surveys than THE.

I think the main lesson to be learnt from this affair is that it is unwise to allow senior administrators to get involved with any sort of rankings initiative or strategy.





Sunday, March 20, 2016

Turning the Tide: Contributing to the Decline of American Universities


Harvard Graduate School of Education has come out with a plan to radically overhaul the admissions process in American universities. The document has been endorsed by a huge, if that word won't trigger a trauma for someone, number of certified academic gatekeepers.

The report argues that the current university admissions process emphasises personal success at the expense of community engagement, puts too much stress on applicants and discriminates against students from disadvantaged communities.

Proposals include "promoting more meaningful contributions to others", "assessing students' ethical engagement and contributions to others in ways that reflect varying types of family and community contributions across race, culture and class" and "redefining achievement in ways that both level the playing field for economically diverse students and reduce excessive achievement pressure."

Detailed recommendations include students doing at least a year of "sustained service or community engagement", which could include working to contribute to family finances. The report recommends that what should count for university admissions is "whether students immersed themselves in an experience and the emotional and ethical awareness and skills generated by that experience."

It is predictable that this will greatly increase the stress experienced by applicants who would have to find some sort of service for a year, immerse themselves in it, generate emotional and ethical skills and then be able to convince admissions officers that they have done so. Or, let us be realistic, find teachers or advisors who will show them how to do this. It will also prompt a massive invasion of privacy if colleges are really going to demand evidence that community service is authentic and meaningful.

Students will also be encouraged to undertake activities that "deepen their appreciation of diversity." Would it be too cynical to suspect that this is code for political indoctrination?

The report also urges that students take fewer AP and IB courses and that colleges should consider making the SAT and ACT optional.

None of these are particularly novel but put together they are likely to cause a shift in the qualities required for admission to America's elite schools. Students will care for others, be passionate about community engagement, appreciate diversity and have authentic extra-curricular activities or at least they will be highly skilled at pretending that they are. They will also be less academically able and prepared and universities will inevitably have to adjust to this.

Also, admissions offices will require more money and resources to make sure that those diversity experiences are meaningful and that community engagement is authentic. But that shouldn't be a problem. Somehow money is always available when it is really needed.

Over the next few years look for a migration of talented students and researchers from the US.

Trinity College Dublin: Waiting for Clarification

Yesterday I published a post about Trinity College Dublin that referred to a report in the London Sunday Times about TCD encouraging its graduates to respond to questionnaires from THE.

Now I have seen a report in Inside Higher Education that says that TCD has been encouraging people to sign up for the QS academic survey sign up facility and to vote for the college.

I have therefore deleted the post until it is clear whether TCD has been sending messages about the QS or the THE surveys or both, and whether it has violated the guidelines of either or both.












Friday, March 18, 2016




"What we are seeing worldwide, from India to the UK to the US, is the rebellion against the inner circle of no-skin-in-the-game policymaking "clerks" and journalists-insiders, that class of paternalistic semi-intellectual experts with some Ivy league, Oxford-Cambridge, or similar label-driven education who are telling the rest of us 1) what to do, 2) what to eat, 3) how to speak, 4) how to think... and 5) who to vote for.
With psychology papers replicating less than 40%, dietary advice reversing after 30y of fatphobia, macroeconomic analysis working worse than astrology, microeconomic papers wrong 40% of the time, the appointment of Bernanke who was less than clueless of the risks, and pharmaceutical trials replicating only 1/5th of the time, people are perfectly entitled to rely on their own ancestral instinct and listen to their grandmothers with a better track record than these policymaking goons."

And, one might add, the publishers of sophisticated and prestigious rankings who would have us believe that a university can be world-class one year and at the bottom of the table the next.

Thursday, March 17, 2016

Trinity College Dublin Gets Upset About Rankings

Trinity College Dublin has done remarkably well in global university rankings over the last decade. Since 2004 it has steadily risen from the 201-300 band to the 151-200 band in the Shanghai Academic Ranking of World Universities (ARWU). Its score for publications went from 27.1 in 2004 to 31 in 2015 (Harvard is 100) and for productivity per capita (scores for Nobel and Fields awards, papers in Nature and Science, highly cited researchers and publications divided by number of faculty) it rose from 13.9 to 19 (Caltech is 100).

The Shanghai rankings measure only research. Nonetheless, this is genuine progress even if slow and boring: at this rate Trinity will catch up with Harvard for publications in another 170 years or so.

So why is Trinity not celebrating this excellent achievement? Instead it is getting very excited about its poor and declining performance in the QS and THE world rankings.

It is a serious mistake to be concerned about falling in these rankings. Last year QS and THE made significant methodological changes so it is meaningless to make year on year comparisons.

Even if QS and THE make no further changes, these rankings are likely to remain unacceptably volatile. Both rely heavily on reputation surveys, for which scores tend to be very low outside the top fifty or so and are consequently susceptible to short-term fluctuations, although QS does damp down short-term changes by recycling unchanged survey responses. THE has three income-based indicators, institutional income, research income and income from industry and commerce, so it is exposed to fluctuations resulting from exchange-rate changes. If THE were serious about producing valid and reliable rankings it would use three- or five-year averages for the income indicators.


And so, as you might have guessed, Trinity is developing a rankings strategy:

"The Rankings Steering Group, set up as part of the strategy, is chaired by the Provost, Patrick Prendergast, and has identified the QS World University Rankings and the Times Higher Education rankings as a priority. The strategy will focus on areas such as outputs, citations, funding levels, staff composition and reputation."




Wednesday, March 16, 2016

Plagiarism Data from Russia

The Russian website Dissernet has published information about plagiarism in Russian universities. The worst offender, according to the Moscow Times, is the Financial University, followed by the St Petersburg State University of Economics and the Plekhanov Economic University.

It seems that plagiarism is a big problem in Russian universities, although it could simply be that some Russians are more serious about exposing academic fraud than their counterparts in other countries.

Saturday, March 12, 2016

I'm glad somebody's noticed


Uwe Brandenburg, Managing Director of CHE Consult in Times Higher Education:


"Frankly, I think the success of certain institutions in rankings is more to do with the rankings’ methodology than anything else. They inevitably favour factors that are statistically more likely to be found among certain universities than others."

Ranking Rankings 1: Stability


Updated 13/03/16, 15/03/16

Making a start on ranking global university rankings, here is the average change in position of the top 20 universities of 2014 in seven global university rankings between 2014 and 2015.

Note: both QS and THE introduced major methodological changes in 2015.

The table refers only to the top 20. Things might (or might not) be different if the top 100 or 500 were considered.

The Shanghai ARWU and URAP are so far well ahead of the others for stability.



Mean position change of top 20 universities, 2014-15

Shanghai Academic Ranking of World Universities (ARWU)   0.30
Center for World University Rankings (CWUR) (Jeddah)   0.40
University Ranking by Academic Performance (URAP) (Middle East Technical University)   0.55
THE World University Rankings   2.10
Round University Rankings (Russia)   2.35
QS World University Rankings   3.50
Webometrics   8.85
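The averages above are straightforward to compute: for each university in the first year's top 20, take the absolute difference between its positions in the two years, then average. A minimal sketch in Python, using invented positions for three hypothetical universities:

```python
def mean_position_change(year1, year2):
    """Average absolute change in rank for universities present in both years.

    year1 and year2 map university name -> position in that year's ranking.
    """
    changes = [abs(year2[u] - year1[u]) for u in year1 if u in year2]
    return sum(changes) / len(changes)

# Hypothetical positions in two consecutive editions of a ranking.
ranks_2014 = {"Alpha U": 1, "Beta U": 2, "Gamma U": 3}
ranks_2015 = {"Alpha U": 1, "Beta U": 4, "Gamma U": 2}

print(mean_position_change(ranks_2014, ranks_2015))  # (0 + 2 + 1) / 3 = 1.0
```

In practice one would feed in the full top-20 lists scraped from each ranking's published tables.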

News Lite from Times Higher

The mainstream media in the UK and parts of Europe is getting very excited about the latest ranking from Times Higher Education, the top 200 European universities.

There is nothing new here. THE have just pulled the European universities out of last year's world rankings.

Here are some of the headlines that have greeted the remarkable revelations.



Leicester Mercury
Google News
Quarter of top 200 European universities are in UK - league table

Wessex Scene

World News Online

Myinforms
Dundee University 'on a global scale' after European rankings accolade

Bailiwick Express
A quarter of Europe's top 200 universities are in the UK

Flanders Today
In-Cyprus


The Tab
UCL ranked 5th best university in Europe

Plymouth Herald

Shanghai Daily





Monday, March 07, 2016

Is it possible to rank rankings?

At a recent seminar at Ural Federal University in Ekaterinburg the question was raised of whether we could evaluate and rank rankings.

That's a tough one. Different rankings have different approaches, audiences and methodologies. The Shanghai rankings embody the concerns of the Chinese ruling class, convinced that salvation lies in science and technology and disdainful -- not entirely without reason -- of the social sciences and humanities. The Times Higher Education world rankings have claimed to be looking for quality, recently tempered by a somewhat unconvincing concern for inclusiveness.

But it is possible that there are some simple metrics that could be used to compare global rankings. Here are some suggestions.

Stability
Universities are big places typically with thousands of students and hundreds of staff. In the absence of administrative reorganisation or methodological changes we should not expect dramatic change from one year to another. Apparently a change of four places over a year is normal for the US News America's Best Colleges so nobody should get excited about going up or down a couple of places.

It would seem reasonable then that rankings could be ordered according to the average change in position over a year. I have already done some calculations with previous years' rankings (see posts 09/07/14, 17/07/14, 04/11/14).

So we could rank these international rankings according to the mean number of position changes in the top 100 between 2013 and 2014. The fewer the more stable the rankings.

1.   Quacquarelli Symonds World University Rankings        3.94
2.   Times Higher Education World University Rankings     4.34
3.   Shanghai ARWU 4.92
4.   National Taiwan University Rankings   7.30
5.   CWUR (Jeddah)   10.59
6.   Webometrics   12.08

Consistency and Redundancy
It is reasonable that if the various ranking indicators are measuring quality or highly valued attributes there should be at least a modest correlation  between them. Good students will attract good teachers who might also have the attributes, such as an interest in their field or reading comprehension skills, required to do research. Talented researchers will be drawn to places that are generously funded or highly reputed.

On the other hand, if  there is a very high correlation between two indicators, perhaps above .850, then this probably means that they are measuring the same thing. One of them could be discarded.

Transparency
Some rankings have adopted the practice of putting universities into bands rather than giving them individual scores. This is, I suppose, a sensible way of discouraging people from getting excited about insignificant fluctuation but it might also suggest a lack of confidence in the rankers' data or the intention of selling the data in some way, perhaps in the guise of benchmarking. Since 2010 THE have bundled indicator scores into clusters making it very difficult to figure out exactly what is causing universities to rise or fall. Rankings could be ranked according to the number of universities for which overall scores and indicator scores are provided.

Inclusiveness
It would be very easy to rank rankings according to the number of universities that they include. This is something where they vary considerably. The Shanghai ARWU ranks 500 universities while Webometrics ranks close to 24,000.

Comprehensiveness
Some rankings such as ARWU, the NTU rankings (Taiwan) and URAP (Middle East Technical University) measure only research output and impact. The THE and QS rankings attempt to include metrics related, perhaps distantly, to teaching quality and innovation. QS, for example, has a faculty-student ratio indicator that purports to measure teaching quality.

Balance
Some rankings award a disproportionate weighting to a single indicator, such as QS's academic survey (40%) or THE's citations indicator (30%). Also, if a university or universities are getting disproportionately high scores for a specific indicator, this might mean that the rankings are being manipulated or are seriously flawed in some way.

External Validation
How do we know that rankings measure what they are supposed to measure? It might be possible to measure the correlation between international rankings and national rankings which often include more data and embody local knowledge about the merits of universities.

Replicability
How long would it take to check whether the rankings have given your university the correct indicator score? Try it for yourself with the Shanghai highly cited researchers indicator. Go here and find the number of highly cited researchers with Harvard as their primary affiliation and the number with your university. Find the square root of both numbers. Then give Harvard a score of 100 and adjust your university's score accordingly.
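The steps just described amount to a few lines of arithmetic. A sketch, assuming purely hypothetical counts of 80 highly cited researchers for Harvard and 20 for your university:

```python
import math

def hici_score(count, harvard_count):
    """ARWU-style highly cited researchers score: square root of the
    count of researchers with the university as primary affiliation,
    scaled so that Harvard = 100."""
    return 100 * math.sqrt(count) / math.sqrt(harvard_count)

# Hypothetical counts of highly cited researchers (primary affiliation).
harvard = 80
my_university = 20

print(round(hici_score(my_university, harvard), 1))  # sqrt(20)/sqrt(80) * 100 = 50.0
```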

Now for the THE citations impact indicator. This is normalised by field and by year of citation, so that what matters is not the raw number of citations that a publication gets but how that number compares with the world average across 334 fields and across the first, second, third, fourth, fifth or sixth year after publication.

Dead simple, isn't it?

And don't forget the regional modification.
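Setting the regional modification aside, the core of field-and-year normalisation can be sketched very simply: each paper's citation count is divided by the world average for papers of its field and year, and the results are averaged. The figures below are invented for illustration; the real calculation covers 334 fields and six citation years:

```python
# Invented papers: (citations received, world average citations
# for papers in the same field published in the same year).
papers = [
    (12, 6.0),   # twice the world average for its field and year
    (3, 6.0),    # half the world average
    (10, 10.0),  # exactly average
]

# Field- and year-normalised impact: mean ratio to the world average.
# A score of 1.0 means citations exactly at world-average levels.
impact = sum(cites / world_avg for cites, world_avg in papers) / len(papers)
print(round(impact, 2))  # (2.0 + 0.5 + 1.0) / 3 = 1.17
```

Even this toy version shows why replication is hard: without the 334 field averages for every publication year, an outsider cannot check the score.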

I hope the next post will be a ranking of rankings according to stability.