Thursday, April 07, 2016

How to Read International University Rankings


The London Daily Telegraph has published an article, 'How to Read the Different University Rankings', which refers to two international rankings, QS and THE, their spin-offs and some national ones. I assume this was intended for British students who might like a different perspective on UK university quality or who might be thinking of venturing abroad.

The article is not very satisfactory as it refers only to the QS and THE rankings and is uncritical.

So here is a brief survey of some global rankings for prospective students.

International university rankings fall into three groups: Internet-based rankings such as Webometrics and 4icu; research-based rankings such as the Academic Ranking of World Universities (ARWU) produced by the Shanghai Ranking Consultancy; and "holistic" rankings such as the Quacquarelli Symonds (QS) and Times Higher Education (THE) world rankings, which claim to measure reputation, teaching quality, internationalisation and other factors and combine their indicators into a single index.

The first thing that you need to know is where an institution stands in the global university hierarchy. Most rankings cover only a fraction of the world's higher education institutions. Webometrics, however, rates the presence, impact, openness and research quality, measured by Google Scholar Citations, of nearly 24,000 universities. Only 5,483 of these get a score for quality, so a university ranked 5,484th or below is not doing anything resembling research at all.

While the Webometrics rankings are quite volatile at the top, the distinction between a university in 200th and one in 2,000th place is significant, even more so between 1,000th and 10,000th.

Webometrics can also help determine whether an institution is in fact a university in any sense of the word. If it isn't there, the chances are that it isn't really a university. I recently heard of someone who claimed degrees from Gordon University in Florida. A search of Webometrics revealed nothing, so it is very likely that this is not a reputable institution.

Here 4icu is less helpful, since it ranks only 11,606 institutions and does not provide information about specific indicators.

Looking at the research rankings, the oldest and most respected by academics is the ARWU published by Shanghai Ranking Consultancy. This has six criteria, all of them related to research. The methodology is straightforward and perhaps unsophisticated by today's standards. The humanities are excluded, it has nothing to say about teaching and it favours large, old and rich universities. It also privileges medical schools, as shown by last year's world rankings which put the University of California at San Francisco, a medical school, in eighteenth place in the world.

On the other hand, it is stable and sensible and by default the ranking of choice for research students and faculty.

ARWU also publishes subject rankings, which are worth checking. But it is worth noting that ARWU has proved vulnerable to the tactics of King Abdulaziz University (KAU) in Jeddah, which hands out contracts to over 200 adjunct faculty members who list KAU as a secondary affiliation. This has put the university in the world's top ten in the ARWU mathematics rankings.

There are a number of other global research rankings that could be consulted since they produce results that may be different from Shanghai. These include the National Taiwan University Rankings, the Best Global Universities published by US News, University Ranking by Academic Performance produced by Middle East Technical University and the CWTS Leiden Ranking. These, especially the Leiden Ranking, reach a high level of technical sophistication and should be consulted by anyone thinking of postgraduate study anywhere. The Leiden Ranking is very helpful in providing stability intervals for each score.

Since 2004 a number of rankings have appeared that attempt to go beyond simply measuring research output and quality. These are problematic in many ways. It is very difficult to find data that are comparable across international borders, and their methodologies can change. In addition, they rely on reputation surveys, which can be biased and unreliable. At the moment the two best known international rankings of this type are the World University Rankings published by QS and THE.

It must be pointed out that even these are still top-heavy with research indicators. The QS rankings have a 40% weighting for an opinion survey of research quality and another 20% for citations. Even the faculty-student ratio indicator is sometimes a measure of research rather than teaching, since it can be improved by adding research-only staff to the faculty. The THE rankings allot 30% to five research indicators and another 30% to citations, plus 2.5% to international research collaboration and 2.5% to research income from industry.

You should bear in mind that these rankings have been accused, not without justification, of national bias. A paper by Christopher Claassen of the University of Glasgow has found that the QS and THE rankings are seriously biased towards UK universities.

The metrics that do attempt to measure teaching quality in the THE and QS rankings are not very helpful. Both have faculty student ratio data but this is a very imprecise proxy for teaching resources. The THE rankings include five indicators in their super-indicator "Teaching: the Learning Environment", two of which measure the number of doctoral students or doctoral degrees, which does not say much about undergraduate instruction.

It is also a good idea to check the scores for the criteria that are combined to make up the composite score. If a university has a disproportionately high score for an indicator with a high weighting like QS's academic opinion survey (40%) or THE's citations indicator (30%) then alarm bells should start ringing.

In some ways the new Round University Ranking from Russia is an improvement. It uses data from Thomson Reuters as THE did until last year. It does nearly everything that THE and QS do and a few more things besides. Altogether there are 20 indicators, although some of these, such as three reputation indicators, are so similar that they are effectively redundant.

Recently a consortium of European organisations and universities created U-Multirank, which takes a different approach. This is basically an online evaluation tool that allows users to choose how they wish to compare and sort universities. Unfortunately, its coverage is rather uneven. Data about teaching and learning is limited for Asia, North America and the UK, although good for Western Europe.

International rankings are generally not helpful in providing information about graduate employability, although QS do include a reputation survey of employers and the Jeddah-based Center for World University Rankings counts alumni who become CEOs of major companies.

The main global rankers also publish regional and specialist spin-offs: Latin American, Asian and European rankings, new university rankings, subject rankings. These should be treated with scepticism since they depend on a relatively small number of data points and consequently can be unreliable.

To summarise, these are things to remember when using global rankings:

  • first check with Webometrics to find out the approximate standing of a university in the world
  • for prospective postgraduate students in science and medicine, the Shanghai rankings should be consulted and confirmed by looking at other research rankings
  • potential undergraduate students should look at more than one "holistic" ranking and should always check the scores for specific indicators
  • be very sceptical of universities that get a good score for only one or two indicators or that do well in only one ranking
  • always look at scores for subjects if available but remember that they may be based on limited and unreliable data
  • for teaching related information the best source is probably U-Multirank but this is available only for some universities.







Friday, April 01, 2016

Really Useful Rankings: The Lowest Ranked Universities in the World

University rankings are getting a bit predictable, especially at the top. The top university is either in California or Cambridge, Massachusetts (or once in another Cambridge somewhere). The top twenty will be nearly all US places with a few British. Some of the established rankers do try to liven things up a bit: Nanyang Technological University in the top twenty, Middle East Technical University in the top hundred, but that does not do much to relieve the tedium.

So I thought it might be interesting to find the lowest ranked universities in the world. The trouble is most rankings do not rank many institutions so the best place to look is Webometrics which tries to rank all the universities on the planet. Here are the 22 institutions ranked 23,892 by Webometrics which means they have no Presence, no Impact, no Openness and not one Google Scholar Citation. Whether their existence goes beyond a website to include buildings and people I'll leave for you to find out.

Instituto Oswaldo Cruz de Certificacao, Brazil
Faculdade des Americas, Brazil
Escuela Normal Prescolar Adolfo Viguri Viguri, Mexico
Universidad Continente Americano Celaya, Mexico
Colegio Universitario y Tecnologico del Noreste, Mexico
Benemerita Escuela Normal Federalizada de Tamaulipas, Mexico
Universidad Virtual Latinoamericano, Venezuela
Faculdade Zumbi dos Palmares, Brazil
Faculdade Vizcaya, Brazil
Mugla Vocational Higher School, Turkey
Higher Polytechnic School Zagreb, Croatia
Sadat Institute of Higher Education, Afghanistan
Saint Paul Institute, Cambodia
Zhangzhou Institute of Technology, China
Technological University Thanlyin, Myanmar
Abasyn University, Pakistan
Salipur College, India
Adusumilli Vijaya Group of Colleges, India
Kaboora Institute of Higher Education, Afghanistan
Madina Engineering College, India
Universidade Lueji A'Nkonde, Angola
Reseau Marwan, Morocco






Sunday, March 27, 2016

The Goldman Sachs Rankings

From efinancialcareers

"Goldman Sachs recruits primarily from US Ivy League universities. This might sound entirely predictable, but Goldman does have some surprises in the places it recruits from. In Asia, for example, it recruits mostly from local universities. In the UK, it seems to draw most of its operations hires from Warwick.
Based on the 22,000 Goldman Sachs CVs in the eFinancialCareers CV database, we’ve created rankings of the top universities the bank hires from by sector and region.
Globally, the best university for getting into Goldman Sachs is the London School of Economics (LSE), our data suggests, followed by Columbia University and the University of Pennsylvania – both of which have a strong emphasis on financial services careers."
Globally the best places for a career with Goldman Sachs are:

1.  London School of Economics
2.  Columbia University
3.  University of Pennsylvania
4.  Princeton University
5.  Harvard University.

In the US they are:

1.  Columbia University
2.  University of Pennsylvania
3.  Princeton University
4.  Cornell University
5.  Harvard University.

In Asia they are:

1.  University of Hong Kong
2.  National University of Singapore
3.  Nanjing University
4.  London School of Economics
5.  Indian Institute of Technology (which one is not stated).

In Europe they are:
1.  London School of Economics
2.  Cambridge University
3.  Imperial College London
4.  Oxford University
5.  Warwick University.




Saturday, March 26, 2016

Ranking Rankings Again


Magnus Gunnarsson of the University of Gothenburg has reminded me of a 2010 report which included an assessment of methodology based on IREG's Berlin principles.

There were 16 Berlin principles, 14 of which were given weighted subscores in the report. The values and weightings were determined subjectively, although the rankers were evidently well informed.

The Berlin principles were grouped in four categories, Purpose and Goals of Rankings, Design and Weighting of Indicators, Collection and Processing of Data and Presentation of Ranking Results. For more information see here.

The ranking of rankings by methodology is as follows. It is obviously out of date.


Position   Ranking                                    Overall method score
1=         CHE (Germany)                              3.10
1=         Webometrics                                3.10
3          HEEACT (now National Taiwan University)    2.90
4          ARWU (Shanghai)                            2.80
5=         High Impact Universities (Australia)       2.60
5=         Observatory (sustainable development)      2.60
7=         Scimago                                    2.40
7=         CWTS Leiden Ranking                        2.40
9          THE                                        2.30
10         Mines ParisTech                            2.20
11         QS                                         2.10
12         Rater (Russia)                             1.70

Wednesday, March 23, 2016

Update on Trinity College Dublin

Some things have been clarified about the now famous message sent by John Boland, vice-president and dean of research at Trinity College Dublin to various stakeholders.

It seems that the message was about both the QS and the THE academic surveys, referring to the sign up facility for the former and the one dollar donation to charity for the latter.

Apparently THE has no problems with the message.

On the other hand, QS thinks that its guidelines have been breached.

"QS has been made aware of recent communications from Trinity College Dublin (TCD) that appear to contravene the guidelines used by QS to ensure the accuracy of its university rankings. In accordance with our standard operating procedure, we have notified TCD that its “awareness” campaign is in breach of these guidelines."

The issue is not suggesting that anyone should go to the sign-up facility. That is permitted in clearly stated guidelines. What is not permitted is suggesting what they should do after they have signed up.

"It is acceptable and even encouraged for institutions to communicate with employers and academics worldwide to showcase their achievements. Institutions are welcome to invite contacts to sign up for possible selection for our survey using our Academic or Employer Sign Up Facilities, but any message soliciting a specific response in our surveys, represents unfair manipulation of the results and will not be tolerated. "

It is fairly clear what the dean was suggesting people should do, although I can see how he might have thought he was being subtle enough to get around the QS guidelines.

One interesting aspect of this affair is that QS is much stricter about influencing the surveys than THE.

I think the main lesson to be learnt from this affair is that it is unwise to allow senior administrators to get involved with any sort of rankings initiative or strategy.





Sunday, March 20, 2016

Turning the Tide: Contributing to the Decline of American Universities


Harvard Graduate School of Education has come out with a plan to radically overhaul the admissions process in American universities. The document has been endorsed by a huge, if that word won't trigger a trauma for someone, number of certified academic gatekeepers.

The report argues that the current university admissions process emphasises personal success at the expense of community engagement, puts too much stress on applicants and discriminates against students from disadvantaged communities.

Proposals include "promoting more meaningful contributions to others", "assessing students' ethical engagement and contributions to others in ways that reflect varying types of family and community contributions across race, culture and class" and "redefining achievement in ways that both level the playing field for economically diverse students and reduce excessive achievement pressure."

Detailed recommendations include students doing at least a year of "sustained service or community engagement", which could include working to contribute to family finances. The report recommends that what should count for university admissions is "whether students immersed themselves in an experience and the emotional and ethical awareness and skills generated by that experience."

It is predictable that this will greatly increase the stress experienced by applicants who would have to find some sort of service for a year, immerse themselves in it, generate emotional and ethical skills and then be able to convince admissions officers that they have done so. Or, let us be realistic, find teachers or advisors who will show them how to do this. It will also prompt a massive invasion of privacy if colleges are really going to demand evidence that community service is authentic and meaningful.

Students will also be encouraged to undertake activities that  "deepens their appreciation of diversity." Would it be too cynical to suspect that this is code for political indoctrination?

The report also urges that students take fewer AP and IB courses and that colleges should consider making the SAT and ACT optional.

None of these are particularly novel but put together they are likely to cause a shift in the qualities required for admission to America's elite schools. Students will care for others, be passionate about community engagement, appreciate diversity and have authentic extra-curricular activities or at least they will be highly skilled at pretending that they are. They will also be less academically able and prepared and universities will inevitably have to adjust to this.

Also, admissions offices will require more money and resources to make sure that those diversity experiences are meaningful and that community engagement is authentic. But that shouldn't be a problem. Somehow money is always available when it is really needed.

Over the next few years look for a migration of talented students and researchers from the US.

Trinity College Dublin: Waiting for Clarification

Yesterday I published a post about Trinity College Dublin that referred to a report in the London Sunday Times about TCD encouraging its graduates to respond to questionnaires  from THE.

Now I have seen a report in Inside Higher Education that says that TCD has been encouraging people to sign up for the QS academic survey sign up facility and to vote for the college.

I have therefore deleted the post until it is clear whether TCD has been sending messages about the QS or the THE surveys or both, and whether it has violated the guidelines of either or both.












Friday, March 18, 2016




"What we are seeing worldwide, from India to the UK to the US, is the rebellion against the inner circle of no-skin-in-the-game policymaking "clerks" and journalists-insiders, that class of paternalistic semi-intellectual experts with some Ivy league, Oxford-Cambridge, or similar label-driven education who are telling the rest of us 1) what to do, 2) what to eat, 3) how to speak, 4) how to think... and 5) who to vote for.
With psychology papers replicating less than 40%, dietary advice reversing after 30y of fatphobia, macroeconomic analysis working worse than astrology, microeconomic papers wrong 40% of the time, the appointment of Bernanke who was less than clueless of the risks, and pharmaceutical trials replicating only 1/5th of the time, people are perfectly entitled to rely on their own ancestral instinct and listen to their grandmothers with a better track record than these policymaking goons."

And, one might add, the publishers of sophisticated and prestigious rankings who would have us believe that a university can be world-class one year and at the bottom of the table the next.

Thursday, March 17, 2016

Trinity College Dublin Gets Upset About Rankings

Trinity College Dublin has done remarkably well in global university rankings over the last decade. Since 2004 it has steadily risen from the 201-300 band to the 151-200 band in the Shanghai Academic Ranking of World Universities (ARWU). Its score for publications went from 27.1 in 2004 to 31 in 2015 (Harvard is 100) and for productivity per capita (scores for Nobel and Fields awards, papers in Nature and Science, highly cited researchers and publications divided by number of faculty) it rose from 13.9 to 19 (Caltech is 100).

The Shanghai rankings measure only research. Nonetheless, this is genuine progress even if slow and boring: at this rate Trinity will catch up with Harvard for publications in another 170 years or so.

So why is Trinity not celebrating this excellent achievement? Instead it is getting very excited about its poor and declining performance in the QS and THE world rankings.

It is a serious mistake to be concerned about falling in these rankings. Last year QS and THE made significant methodological changes so it is meaningless to make year on year comparisons.

Even if QS and THE make no further changes, these rankings are likely to be unacceptably volatile. Both rely heavily on reputation surveys for which scores tend to be very low once you get outside the top fifty or so and consequently are susceptible to short term fluctuations, although QS does damp down short term changes by recycling unchanged survey responses. THE has three income based indicators, institutional income, research income and income from industry and commerce so it is exposed to fluctuations resulting from exchange rate changes. If THE were serious about producing valid and reliable rankings they would use three or five year averages for the income indicators.


And so, as you might have guessed, Trinity is developing a rankings strategy:

"The Rankings Steering Group, set up as part of the strategy, is chaired by the Provost, Patrick Prendergast, and has identified the QS World University Rankings and the Times Higher Education rankings as a priority. The strategy will focus on areas such as outputs, citations, funding levels, staff composition and reputation."




Wednesday, March 16, 2016

Plagiarism Data from Russia

The Russian website Dissernet has published information about plagiarism in Russian universities. The worst offender, according to the Moscow Times, is the Financial University, followed by the St Petersburg State University of Economics and the Plekhanov Economic University.

It seems that plagiarism is a big problem in Russian universities, although it could simply be that some Russians are more serious about exposing academic fraud than their counterparts in other countries.

Saturday, March 12, 2016

I'm glad somebody's noticed


Uwe Brandenburg, Managing Director of CHE Consult in Times Higher Education:


"Frankly, I think the success of certain institutions in rankings is more to do with the rankings’ methodology than anything else. They inevitably favour factors that are statistically more likely to be found among certain universities than others."

Ranking Rankings 1: Stability


Updated 13/03/16, 15/03/16

Making a start on ranking global university rankings, here is the average change in position of the top 20 universities of 2014 in seven global university rankings between 2014 and 2015.

Note: both QS and THE introduced major methodological changes in 2015.

The table refers only to the top 20. Things might (or might not) be different if the top 100 or 500 were considered.

The Shanghai ARWU and URAP are so far well ahead of the others for stability.



Ranking                                                    Mean position change of top 20 universities, 2014-15
Shanghai Academic Ranking of World Universities (ARWU)     0.30
Center for World University Rankings (CWUR) (Jeddah)       0.40
University Ranking by Academic Performance (URAP)
(Middle East Technical University)                         0.55
THE World University Rankings                              2.10
Round University Rankings (Russia)                         2.35
QS World University Rankings                               3.50
Webometrics                                                8.85

News Lite from Times Higher

The mainstream media in the UK and parts of Europe is getting very excited about the latest ranking from Times Higher Education, the top 200 European universities.

There is nothing new here. THE have just pulled the European universities out of last year's world rankings.

Here are some of the headlines that have greeted the remarkable revelations.



Leicester Mercury

Google News
"Quarter of top 200 European universities are in UK - league table"

Wessex Scene

World News Online

Myinforms
"Dundee University 'on a global scale' after European rankings accolade"

Bailiwick Express
"A quarter of Europe's top 200 universities are in the UK"

Flanders Today

In-Cyprus

The Tab
"UCL ranked 5th best university in Europe"

Plymouth Herald

Shanghai Daily





Monday, March 07, 2016

Is it possible to rank rankings?

At a recent seminar at Ural Federal University in Ekaterinburg, the question was raised of whether we could evaluate and rank rankings.

That's a tough one. Different rankings have different approaches, audiences and methodologies. The Shanghai rankings embody the concerns of the Chinese ruling class, convinced that salvation lies in science and technology and disdainful -- not entirely without reason -- of the social sciences and humanities. The Times Higher Education world rankings have claimed to be looking for quality, recently tempered by a somewhat unconvincing concern for inclusiveness.

But it is possible that there are some simple metrics that could be used to compare global rankings. Here are some suggestions.

Stability
Universities are big places typically with thousands of students and hundreds of staff. In the absence of administrative reorganisation or methodological changes we should not expect dramatic change from one year to another. Apparently a change of four places over a year is normal for the US News America's Best Colleges so nobody should get excited about going up or down a couple of places.

It would seem reasonable then that rankings could be ordered according to the average change in position over a year. I have already done some calculations with previous years' rankings (see posts 09/07/14, 17/07/14, 04/11/14).

So we could rank these international rankings according to the mean change in position in the top 100 between 2013 and 2014. The fewer the changes, the more stable the ranking.

1.   Quacquarelli Symonds World University Rankings        3.94
2.   Times Higher Education World University Rankings     4.34
3.   Shanghai ARWU 4.92
4.   National Taiwan University Rankings   7.30
5.   CWUR (Jeddah)   10.59
6.   Webometrics   12.08
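The calculation behind these stability figures is straightforward. Here is a minimal sketch in Python; the university names and ranks are invented purely for illustration:

```python
def mean_position_change(year1, year2, top_n=100):
    """Mean absolute change in rank for universities in the top
    `top_n` of the first year that also appear in the second year."""
    changes = [abs(year2[u] - rank)
               for u, rank in year1.items()
               if rank <= top_n and u in year2]
    return sum(changes) / len(changes)

# Invented example data: {university: rank}
ranks_2013 = {"Alpha U": 1, "Beta U": 2, "Gamma U": 3}
ranks_2014 = {"Alpha U": 2, "Beta U": 1, "Gamma U": 3}

# Alpha and Beta swap places (change of 1 each), Gamma stays put
print(mean_position_change(ranks_2013, ranks_2014, top_n=3))  # → 0.6666666666666666
```

A real comparison would also need a rule for universities that drop out of the list entirely, which is one reason published stability figures can differ.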

Consistency and Redundancy
It is reasonable to expect that if the various ranking indicators are measuring quality or highly valued attributes, there should be at least a modest correlation between them. Good students will attract good teachers who might also have the attributes, such as an interest in their field or reading comprehension skills, required to do research. Talented researchers will be drawn to places that are generously funded or highly reputed.

On the other hand, if there is a very high correlation between two indicators, perhaps above 0.850, then this probably means that they are measuring the same thing, and one of them could be discarded.

Transparency
Some rankings have adopted the practice of putting universities into bands rather than giving them individual scores. This is, I suppose, a sensible way of discouraging people from getting excited about insignificant fluctuation but it might also suggest a lack of confidence in the rankers' data or the intention of selling the data in some way, perhaps in the guise of benchmarking. Since 2010 THE have bundled indicator scores into clusters making it very difficult to figure out exactly what is causing universities to rise or fall. Rankings could be ranked according to the number of universities for which overall scores and indicator scores are provided.

Inclusiveness
It would be very easy to rank rankings according to the number of universities that they include. This is something where they vary considerably. The Shanghai ARWU ranks 500 universities while Webometrics ranks close to 24,000.

Comprehensiveness
Some rankings, such as ARWU, the NTU rankings (Taiwan) and URAP (Middle East Technical University), measure only research output and impact. The THE and QS rankings attempt to include metrics related, perhaps distantly, to teaching quality and innovation. QS has an employer survey indicator that purports to measure employability.

Balance
Some rankings award a disproportionate weighting to a single indicator, such as QS's academic survey (40%) or THE's citations indicator (30%). Also, if a university or universities are getting disproportionately high scores for a specific indicator, this might mean that the rankings are being manipulated or are seriously flawed in some way.

External Validation
How do we know that rankings measure what they are supposed to measure? It might be possible to measure the correlation between international rankings and national rankings which often include more data and embody local knowledge about the merits of universities.

Replicability
How long would it take to check whether the rankings have given your university the correct indicator score? Try it for yourself with the Shanghai highly cited researchers indicator. Go here and find the number of highly cited researchers with Harvard as their primary affiliation and the number with your university. Find the square root of both numbers. Then give Harvard a score of 100 and adjust your university's score accordingly.
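The arithmetic involved is just a square root rescaled against the top scorer. A sketch, with hypothetical counts (Harvard's actual number will differ):

```python
import math

def hici_score(n_hici, n_harvard):
    """Shanghai-style indicator score: square root of the raw count,
    scaled so that the top scorer (Harvard) gets 100."""
    return 100 * math.sqrt(n_hici) / math.sqrt(n_harvard)

# Hypothetical: Harvard has 90 highly cited researchers, yours has 10
print(round(hici_score(10, 90), 1))  # → 33.3
```

The square root compresses the range, so a university with a ninth of Harvard's highly cited researchers still scores a third of Harvard's 100.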

Now for the THE citations impact indicator. This is normalised by field and by year of citation, so that what matters is not the raw number of citations a publication receives but how that number compares with the world average for publications in the same field, out of 334 fields, appearing in the same year, with citations counted in the first through sixth year after publication.

Dead simple isn't it?

And don't forget the regional modification.
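To see why this is so hard to replicate, here is a toy version of field- and year-normalised impact, leaving out the 334 real fields, the citation-window bookkeeping and the regional modification; every number below is invented:

```python
# Each paper: (field, year, citations). World averages are hypothetical;
# in the real indicator they come from the full citation database.
papers = [("physics", 2012, 40), ("physics", 2013, 10), ("biology", 2012, 5)]
world_avg = {("physics", 2012): 20.0, ("physics", 2013): 8.0, ("biology", 2012): 10.0}

def normalised_impact(papers, world_avg):
    """Mean of citations divided by the world average for the
    same field and publication year; 1.0 means exactly average."""
    ratios = [c / world_avg[(field, year)] for field, year, c in papers]
    return sum(ratios) / len(ratios)

print(normalised_impact(papers, world_avg))  # → 1.25
```

Even in this stripped-down form you cannot compute the score without the world-average table, which only the data provider holds, and that is the point: the indicator is effectively unverifiable from the outside.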

I hope the next post will be a ranking of rankings according to stability.









Wednesday, March 02, 2016

The Decline of Free Speech in American Universities


The Foundation for Individual Rights in Education has just released its list of the ten worst colleges for free speech in the US. Here they are, along with the incidents that put them on the list.

Mount St Mary's University, Maryland

Two faculty were sacked for criticising the President's plan to get rid of low performing students, "drowning the bunnies" as he so charmingly put it. They were later reinstated.

Northwestern University

Laura Kipnis was investigated for sexual harassment for writing an essay criticising the sexual harassment mania sweeping US colleges. She was cleared only after writing an account of her persecution in the Chronicle of Higher Education.

Louisiana State University

Theresa Buchanan was fired for using profanity in the classroom for a pedagogical reason.

University of California, San Diego

The administration attempted to defund a student newspaper for making fun of "safe spaces".

St Mary's University of Minnesota

An adjunct classics professor was fired for sexual harassment, which may have had something to do with an authentic production of Seneca's Medea. He was also fired from his other job as a janitor (!).

University of Oklahoma

Two fraternity members were expelled for leading a racist chant, even though the Supreme Court has ruled that offensive speech is protected by the First Amendment.

Marquette University

John McAdams was suspended for criticising an instructor for suppressing a student's negative comments about same-sex marriage.

Colorado College 

A student was suspended for unchivalrous remarks about African American women on Yik Yak.

University of Tulsa 

A student was removed from class because of Facebook posts written by his fiancée criticising a professor.

Wesleyan University

The student government voted to remove funding from a student newspaper that was mildly critical of Black Lives Matter.






Wednesday, February 24, 2016

Britain leads in sniffing edge research


There must be some formula that can predict the scientific research that will go viral over the social media or reach the pages of the popular press or even Times Higher Education (THE).

The number of papers in the natural and social sciences is getting close to uncountable. So why out of all of them has THE showcased a study of the disgust reported by students when sniffing sweaty T-shirts from other universities?

Anyway, here is a suggestion to the authors for a follow-up study. Have the students read the latest QS, THE or Shanghai world rankings before having a sniff and see if that makes any difference to the disgust experienced.

Tuesday, February 23, 2016

Should the UK stay in the EU?

There are 130 universities in the UK, and the vice-chancellors of 103 of them have signed a letter praising the role of the European Union in supporting the UK's world-class universities.

There are some notable names missing, among them University College London, Manchester, Warwick and York, but such a high degree of consensus among the higher bureaucracy is rather suspicious.




The Ranking Effect

The observer effect in science refers to the changes that a phenomenon undergoes as a result of being observed.

Similarly, in the world of university ranking, indicators that may once have been useful measures of quality sometimes become less so once they are included in the global rankings.

Counting highly cited researchers might have been a good measure of research quality but its value was greatly reduced once someone found out how easy it was to recruit adjunct researchers and to get them to list their new "employer" as an affiliation.

International students and international faculty could also be markers of quality but not so much if universities are deliberately recruiting under-qualified staff and students just to boost their rankings score.

Or, as the Irish writer Eoin O'Malley aptly puts it in the online magazine Village, "As Goodhart's Law warns us: when a measure becomes a target, it ceases to be a good measure".

O'Malley argues that reputation surveys are little more than an exercise in name reputation, that Nobel awards do not measure anything important (although I should point out that Trinity College Dublin would not get any credit for Samuel Beckett since the Shanghai Rankings do not count literature awards), and that the major criteria used by rankers do not measure anything of interest to students.

Irish universities have experienced a disproportionate impact from the methodological changes introduced by QS and Times Higher Education towards the end of last year. I suspect that Dr O'Malley's criticism will have a receptive audience.






Friday, January 15, 2016

Aussies not impressed with THE any more


Back in 2012 The Australian published a list of the most influential figures in Australian higher education. In 14th place was Phil Baty, the editor of the Times Higher Education (THE) World University Rankings.

Recently, the newspaper came out with another influential list full of the usual bureaucrats and boosters plus the Australian dollar at number five. Then at number 10 was not a person, not even Times Higher Education, but "rankings". A step up for rankings but a demotion for THE.

To make things worse for THE, the Leiden Ranking and the Shanghai Academic Ranking of World Universities were designated the leaders.

Then we have a reference to "new and increasingly obscure league tables peddled by unreliable metrics merchants, with volatile methodologies triggering inexplicably spectacular rises and falls from grace."

But who are those new and increasingly obscure league tables? It can't be URAP, the National Taiwan University Rankings, the QS world rankings, or Scimago, because they are not new. The US News Best Global Universities and the Russian Round University Ranking are new, but so far their methodology is not volatile. Webometrics can be a bit volatile sometimes but it is also not new. Maybe they are referring to the QS subject rankings.

Or could it be that The Australian is thinking of the THE World University Rankings? What happened last autumn to universities in France, Korea and Turkey was certainly a case of volatile methodology. But new? Maybe The Australian has decided that the methodology was changed so much that it constituted a new league table.




Sunday, January 10, 2016

Diversity Makes You Brighter ... if You're a Latino Stockpicker in Texas or Chinese in Singapore


Nearly everybody, or at least those who run the western mainstream media, agrees that some things are sacred. Unfortunately,  this is not always obvious to the uncredentialled who from time to time need to be beaten about their empty heads with the "findings" of "studies".

So we find that academic papers, often with small or completely inappropriate samples, minimal effect sizes, marginal significance levels, dubious data collection procedures, unreproduced results or implausible assumptions, are published in top-flight journals, cited all over the Internet or even showcased in the pages of the "quality" or mass-market press.

For example, anyone with any sort of mind knows that the environment is the only thing that determines intelligence.

So in 2009 we had an article in the Journal of Neuroscience that supposedly proves that a stimulating environment will not only make its beneficiaries more intelligent but also the children of the experimental subjects.

A headline in the Daily Mail proclaimed that "Mothers who enjoyed a stimulating childhood 'have brainier babies'".

The first sentence of the report claims that "[a] mother's childhood experiences may influence not only her own brain development but also that of her sons and daughters, a study suggests."

Wonderful. This could, of course, be an argument for allowing programs like Head Start to run for another three decades so that their effects would show up in the next generation. Then the next sentence gives the game away.

"Researchers in the US found that a stimulating environment early in life improved the memory of female mice with a genetic learning defect."

Notice that the experiment involved mice and not humans or any other mammal bigger than a ferret, that it improved memory and nothing else, and that the subjects had a genetic learning defect.

Still, that did not stop the MIT Technology Review from reporting Moshe Szyf of McGill University as saying "[i]f the findings can be conveyed to human, it means that girls' education is important not just to their generation but to the next one".

All of this, if confirmed, would be a serious blow against modern evolutionary theory. The MIT Technology Review got it right when it spoke about a comeback for Lamarckianism. But if there is anything scientists should have learnt over the last few decades it is that an experiment that appears to overthrow current theory, not to mention common sense and observation, is often flawed in some way. Confronted with evidence in 2011 that neutrinos were travelling faster than light, physicists with CERN reviewed their experimental procedures until they found that the apparent theory busting observation was caused by a loose fibre optic cable.

If a study had shown that a stimulating environment had a negative effect on the subjects or on the next generation or that it was stimulation for fathers that made the difference, would it have been cited in the Daily Mail or the MIT Technology Review? Would it even have been published in the Journal of Neuroscience? Wouldn't everybody have been looking for the equivalent of a loose cable?

A related idea that has reached the status of unassailable truth is that the famous academic achievement gap between Asians and Whites, and African Americans and Hispanics, could be eradicated by some sort of environmental manipulation such as spending money, providing safe spaces or laptops,  boosting self esteem or fine tuning teaching methods.

A few years ago Science, the apex of scientific research, published a paper by Geoffrey L. Cohen, Julio Garcia, Nancy Apfel and Allison Master that claimed that a few minutes spent writing an essay affirming students' values (the control group wrote about somebody else's values) would start a process leading to an improvement in their relative academic performance. This applied only to low-achieving African American students.

I suspect that anyone with any sort of experience of secondary school classrooms would be surprised by the claim that such a brief exercise could have such a disproportionate impact.

The authors in their conclusion say:

"Finally, our apparently disproportionate results rested on an obvious precondition: the existence in the school of adequate material, social, and psychological resources and support to permit and sustain positive academic outcomes. Students must also have had the skills to perform significantly better. What appear to be small or brief events in isolation may in reality be the last element required to set in motion a process whose other necessary conditions already lay, not fully realised, in the situation."

In other words, the experiment would not work unless there were "adequate material, social, and psychological resources and support" in the school, and unless students "have had the skills to perform significantly better".

Is it possible that a school with all those resources, support and skills might also be one where students, mentors, teachers or classmates might just somehow leak who was in the experimental and who was in the control group?

Perhaps the experiment really is valid. If so we can expect to see millions of US secondary school students and perhaps university students writing their self affirmation essays and watch the achievement gap wither away.

In 2012, this study made the top 20 of studies that Psychfiledrawer would like to see reproduced, along with studies showing that participants were more likely to give up trying to solve a puzzle if they ate radishes than if they ate cookies, that anxiety-reducing interventions boost exam scores, that music training raises IQ, and, of course, Rosenthal and Jacobson's famous study showing that teacher expectations can change students' IQ.

Geoffrey Cohen has provided a short list of studies that he claims replicate his findings. I suspect that only someone already convinced of the reality of self affirmation would be impressed.

Another variant of the environmental determinism creed is that diversity (racial or perhaps gender, although certainly not intellectual or ideological) is a wonderful thing that enriches the lives of everybody. There are powerful economic motives for universities to believe this, and so we find that a succession of dubious studies are showcased as though they were the last and definitive word on the topic.

The latest such study is by Sheen S. Levine, David Stark and others and was the basis for an op-ed in the New York Times (NYT).

The background is that the US Supreme Court back in 2003 had decided that universities could not admit students on the basis of race but they could try to recruit more minority students because having large numbers of a minority group would be good for everybody. Now the court is revisiting the issue and asking whether racial preferences can be justified by the benefits they supposedly provide for everyone.

Levine and Stark in their NYT piece claim that they can, and refer to a study that they published with four other authors in the Proceedings of the National Academy of Sciences. Essentially, this involved an experiment in simulated stock trading, and it was found that homogenous "markets" in Singapore and Kingsville, Texas (ethnically Chinese and Latino respectively) were less accurate in pricing stocks than those that were ethnically diverse, with participants from minority groups (Indian and Malay in Singapore; non-Hispanic White, Black and Asian in Texas).

They argue that:

"racial and ethnic diversity matter for learning, the core purpose of a university. Increasing diversity is not only a way to let the historically disadvantaged into college, but also to promote sharper thinking for everyone.

Our research provides such evidence. Diversity improves the way people think. By disrupting conformity, racial and ethnic diversity prompts people to scrutinize facts, think more deeply and develop their own opinions. Our findings show that such diversity actually benefits everyone, minorities and majority alike."

From this very specific exercise the authors conclude that diversity is beneficial for American universities, which are surely not comparable to a simulated stock market.

Frankly, if this is the best they can do to justify diversity then it looks as though affirmative action in US education is doomed.

Looking at the original paper also suggests that quite different conclusions could be drawn. It is true that in each country the diverse market was more accurate than the homogenous one (Chinese in Singapore, Latino in Texas) but the homogenous Singapore market was more accurate than the diverse Texas market (see fig. 2) and very much more accurate than the homogenous Texas market. Notice that this difference is obscured by the way the data is presented.

There is a moral case for affirmative action provided that it is limited to the descendants of the enslaved and the dispossessed but it is wasting everybody's time to cherry-pick studies like these to support questionable empirical claims and to stretch their generalisability well beyond reasonable limits.








Wednesday, January 06, 2016

Towards a transparent university ranking system


For the last few years global university rankings have been getting more complicated and more "sophisticated".

Data makes its way from branch campuses, research institutes and far-flung faculties and departments and is analysed, decomposed, recomposed, scrutinised for anomalies and outliers, and then enters the files of the rankers, where it is normalised, standardised, square-rooted, weighted and/or subjected to regional modification. Sometimes what comes out the other end makes sense: Harvard in first place, Chinese high fliers flying higher. Sometimes it stretches academic credulity: Alexandria University in fourth place in the world for research impact, King Abdulaziz University in the world's top ten for mathematics.

The transparency of the various indicators in the global rankings varies. Checking the scores for Nature and Science papers and indexed publications in the Shanghai rankings is easy if you have access to the Web of Knowledge. It is also not difficult to check the numbers of faculty and students on the QS, Times Higher Education (THE) and US News web sites.

On the other hand, getting into the data behind the THE citations is close to impossible. Citations are normalised by field, year of publication and year of citation. Then, until last year the score for each university was adjusted by division by the square root of the citation impact score of the country in which it was located. Now this applies to half the score for the indicator. Reproducing the THE citations score is impossible for almost everybody since it requires calculating the world average citation score for 250 or 300 fields and then the total citation score for every country.
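As a rough illustration, the country adjustment described above can be sketched in a few lines of Python. This is a minimal sketch with invented numbers: the real methodology first normalises citations by field and year across the whole database, which is exactly the part that outsiders cannot reproduce.

```python
import math

def adjusted_citation_score(university_score, country_score, blend=0.5):
    """Sketch of THE's country adjustment to the citation indicator.

    Until last year the whole university score was divided by the
    square root of its country's citation impact score; now that
    adjustment applies to only half of the score (blend=0.5).
    """
    adjusted = university_score / math.sqrt(country_score)
    return blend * adjusted + (1 - blend) * university_score

# Invented numbers: a university scoring 60 in a country with an
# average citation impact score of 0.64 (square root 0.8).
print(adjusted_citation_score(60, 0.64))  # roughly 67.5
```

Even this toy version shows where the opacity comes from: the denominator requires the average citation score of the whole country, which in turn depends on field-normalised scores for every ranked institution.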

It is now possible to access third party data from sources such as Google, World Intellectual Property Organisation and various social media such as LinkedIn. One promising development is the creation of public citation profiles by Google Scholar.

The Cybermetrics Lab in Spain, publishers of the Webometrics Ranking Web of Universities, has announced the beta version of a ranking based on nearly one million individual profiles in the Google Scholar Citations database. The object is to see whether this data can be included in future editions of the Ranking Web of Universities.

It uses data from the institutional profiles and counts the citations in the top ten public profiles for each institution, excluding the first profile.
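That counting rule can be sketched as follows, assuming "excluding the first profile" means dropping the single most-cited profile so that one outlier researcher cannot dominate an institution's score (the function name and citation counts here are invented):

```python
def institution_score(profile_citations):
    """Sum citations over an institution's top ten Google Scholar
    Citations profiles, excluding the most-cited profile."""
    top = sorted(profile_citations, reverse=True)
    return sum(top[1:11])

# Invented citation counts for one institution's public profiles:
# the 900-citation profile is excluded, the next four are summed.
print(institution_score([900, 400, 250, 120, 80]))  # 850
```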

The ranking is incomplete since many researchers and institutions have not participated fully. There are, for example, no Russian institutions in the top 600. In addition, there are technical issues such as the duplication of profiles.

The leading university is Harvard which is well ahead of its closest rival, the University of Chicago. English speaking universities are dominant with 17 of the top 20 places going to US institutions and three, Oxford, Cambridge and University College London, going to the UK.

Overall the top twenty are:

  1.   Harvard University
  2.   University of Chicago
  3.   Stanford University
  4.   University of California Berkeley
  5.   Massachusetts Institute of Technology (MIT)
  6.   University of Oxford
  7.   University College London
  8.   University of Cambridge
  9.   Johns Hopkins University
  10.   University of Michigan
  11.   Michigan State University
  12.   Yale University
  13.   University of California San Diego
  14.   UCLA
  15.   Columbia University
  16.   Duke University
  17.   University of Washington
  18.   Princeton University
  19.   Carnegie Mellon University
  20.   Washington University St Louis.

The top universities in selected countries and regions are:

Africa: University of Cape Town, South Africa 244th
Arab Region: King Abdullah University of Science and Technology, Saudi Arabia 148th
Asia and Southeast Asia: National University of Singapore 40th
Australia and Oceania: Australian National University 57th
Canada: University of Toronto 22nd
China: Zhejiang University 85th
France: Université Paris 6 Pierre and Marie Curie 133rd
Germany: Ludwig Maximilians Universität München 194th
Japan: Kyoto University 100th
Latin America: Universidade de São Paulo 164th
Middle East: Hebrew University of Jerusalem 110th
South Asia: Indian Institute of Science Bangalore 420th.

This seems plausible and sensible so it is likely that the method could be extended and improved.

Tuesday, January 05, 2016

Worth Reading 5: Another Year, Another Methodology

International Higher Education, published by Boston College, has an article by Ellen Hazelkorn and Andrew Gibson that reviews the recent changes in the methodology of the brand name world rankings.

Nothing new here, except that they have noticed the Round University Rankings from Russia.

Tuesday, December 22, 2015

Worth Reading 4: Rankings influence public perceptions of German universities



Der Ranking-Effekt Zum Einfluss des „Shanghai-Rankings“ auf die medial dargestellte Reputation deutscher Universitäten

Tim Hegglin · Mike S. Schäfer

Publizistik (2015) 60:381–402 DOI 10.1007/s11616-015-0246-

English Abstract

Increasingly, universities find themselves in a competition about public visibility and reputation in which media portrayals play a crucial role. But universities are complex, heterogeneous institutions which are difficult to compare. University rankings offer a seemingly simple solution for this problem: they reduce the complexity inherent to institutions of higher education to a small number of measures and easy-to-understand ranking tables, which may be particularly attractive for media as they conform to news values and media preferences. Therefore, we analyze whether the annual publications of the "Shanghai Ranking" influence media coverage about the included German universities. Based on a content analysis of broadsheet print media, our data show that a ranking effect exists: after the publication of the ranking results, included universities are presented as more reputable in the media. This effect is particularly strong among better-ranked universities. It does not, however, increase over a 10-year time period.

English title: The Ranking Effect: How the "Shanghai Ranking" Influences the Mediated Reputation of German Universities.

Thanks to Christian Scholz of the University of Hamburg for alerting me to this paper.

Monday, December 21, 2015

Worth Reading 3

Matthew David, Fabricating World Class: Global university league tables, status differentiation and myths of global competition

accepted for publication in the British Journal of Sociology of Education


This paper finds that UK media coverage of global university rankings is strongly biased towards the Russell Group, which supposedly consists of elite research-intensive universities; emphasises the superiority of some US universities; and interprets whatever happens in the rankings as evidence that British universities, especially those in the Russell Group, need and deserve as much money as they want.

For example, he quotes the Daily Mail as saying in 2007 that "Vice chancellors are now likely to seize on their strong showing [in the THES-QS world university rankings] to press the case for the £3,000-a-year cap on tuition fees to be lifted when it is reviewed in 2009," while the Times in 2008, when UK universities slipped, said: "Vice chancellors and commentators voiced concern that, without an increase in investment, Britain's standing as a first-class destination for higher education could be under threat".

The media and elite universities also claim repeatedly that lavishly funded Asian universities are overtaking the impoverished and neglected schools of the West.

David argues that none of this is supported by the actual data of the rankings. He looks at the top 200 of the three well known rankings QS, THE, ARWU up to 2012.

I would agree with most of these conclusions, especially the argument that the rankings data he uses do not support either US superiority or the rise of Asia.

I would go further and suggest that changes to the QS rankings in 2008 and 2015, ad hoc adjustments to the employer survey in 2011 and 2012, changes in the rules for submission of data, variations in the degree of engagement with the rankings, and the instability resulting from the unstable pool from which ranked universities are drawn would render the QS rankings invalid as a measure of any but the most obvious trends.

Similarly, the THE rankings, started in 2010, underwent substantial changes in 2011 and then in 2015. Between those years there were fluctuations for many universities, because a few papers could have a disproportionate impact on the citations indicator and again because the pool of ranked universities from which indicator means are calculated is unstable.

If, however, we take the Shanghai rankings over the course of eleven years and look at the full five hundred rankings then we do find that Asia, or more accurately some of it, is rising.

The number of Chinese universities in the ARWU top 500 rose from 16 in 2004 to 44 in 2015. The number of South Korean universities rose from 8 to 12, and Australian from 14 to 20.

But the number of Indian universities remained unchanged at three, while the number of Japanese fell from 36 to 18.

David does not argue that Asia is not rising, merely that looking at the top level of the rankings does not show that it is.

What is probably more important in the long run is the comparative performance not of universities but  of secondary school systems. Here the future of the US, the UK and continental Europe does indeed look bleak while that of East Asia and the Chinese diaspora is very promising.



Saturday, December 19, 2015

Go East Young (and Maybe not so Young) Man and Woman!

Mary Collins, a distinguished immunologist at University College London (UCL), is leaving to take up an academic appointment in Japan. Going with her is her husband Tim Hunt, a Nobel winner, who was the victim of a particularly vicious witch hunt about some allegedly sexist remarks over dinner. She had apparently applied for the Japanese post before the uproar but her departure was hastened by the disgraceful way he was treated by UCL.

Could this be the beginning of a massive drain of academic talent from the West to Asia? What would it take to persuade people like Nicholas Christakis, Erika Christakis, Joshua Richwine, Mark Regnerus, Andrea Quenette, K.C. Johnson, and Matt Tyler to trade in abuse and harassment by the "progressive" academic establishment for a productive scholarly or administrative career in Korea, Japan, China or the Pacific rim?

Meanwhile Russia and the Arab Gulf are also stepping up their recruitment of foreign scientists. Has Mary Collins started a new trend?