Tuesday, October 14, 2014

Shanghai without the Awards

Updated. The link to the site is here.

The Center for World-Class Universities (CWU) at Shanghai Jiao Tong University has produced an interesting new ranking by removing the Alumni and Awards indicators from its Academic Ranking of World Universities. These indicators have been criticised for allowing western universities to live off their intellectual capital and ignoring the rise of newcomers in Asia.

So what would ARWU look like without the Nobel and Fields awards?

At the very top things are the same. Harvard is still first and Stanford second. But Cambridge goes down and Oxford goes up.

Universities that would benefit significantly from deleting these indicators include Michigan, rising from 23rd to 13th, Pennsylvania State University from 58th to 35th, the University of Florida, Tsinghua University, Alberta, Peking, Sao Paulo, Tel Aviv, Zhejiang and Scuola Normale Pisa, which would rise to the 201-300 band.

CWU have calculated the ratio between places in ARWU and the Alternative Ranking. The higher the score the greater the benefit from the Awards and Alumni indicators. The biggest gainers from Nobel and Fields laureates are Princeton, Moscow State University and Paris Sud (11).

The countries that have benefited most from these indicators are the USA, France and Germany.

It looks as though the ARWU has favoured the Ivy League, continental European universities and Cambridge at the expense of American public universities and the rising stars of Asia.

Saturday, October 11, 2014

Another Global Ranking

Just when you thought you could stop reading about rankings.

For several years the US News & World Report (USNWR), publishers of America's Best Colleges, repackaged the QS World University Rankings and just put its own stamp on them for the American public.

Now, the USNWR has announced that it is going into the global rankings business. It seems that this time they will produce completely new rankings that have nothing to do with the Times Higher Education (THE) rankings. There will also be regional, country and subject rankings.

The data, however, will come from Thomson Reuters (TR), who are also the data providers for the THE world rankings and for two of the indicators, Highly Cited Researchers and Publications, in the Shanghai ARWU. It is definitely unhealthy if TR are going to supply the data, or some of it, for three out of four well-known world rankings.

Bob Morse says that the new rankings will be "powered by Thomson Reuters InCites™ research analytics solutions". Does this mean that universities that do not join InCites will not be ranked? Will universities be allowed to opt in or opt out? Will all data come from TR? Will the survey be shared with THE or will there be another one?

Tuesday, October 07, 2014

The Times Higher Education World University Rankings





Publisher

Times Higher Education



Scope

Global. Data provided for 400 universities. Over 800 ranked.


Top Ten


Place   University
1       California Institute of Technology (Caltech)
2       Harvard University
3       Oxford University
4       Stanford University
5       Cambridge University
6       Massachusetts Institute of Technology (MIT)
7       Princeton University
8       University of California Berkeley
9=      Imperial College London
9=      Yale University



Countries with Universities in the Top Hundred


Country          Number of Universities
USA                  45
UK                   11
Germany               6
Netherlands           6
Australia             5
Canada                4
Switzerland           3
Sweden                3
South Korea           3
Japan                 2
Singapore             2
Hong Kong             2
China                 2
France                2
Belgium               2
Italy                 1
Turkey                1



Top Ranked in Region


Region                        Top University
North America                 California Institute of Technology (Caltech)
Africa                        University of Cape Town
Europe                        Oxford University
Latin America                 Universidade de Sao Paulo
Asia                          University of Tokyo
Central and Eastern Europe    Lomonosov Moscow State University
Arab World                    Cadi Ayyad University, Marrakech
Middle East                   Middle East Technical University
Oceania                       University of Melbourne



Noise Index

In the top 20, this year's THE world rankings are less volatile than the previous edition and this year's QS rankings. They are still slightly less stable than the Shanghai rankings.


Ranking                                                      Average Place Change of Universities in the Top 20
THE World Rankings 2013-2014                                 0.70
THE World Rankings 2012-2013                                 1.20
QS World Rankings 2013-2014                                  1.45
ARWU 2013-2014                                               0.65
Webometrics 2013-2014                                        4.25
Center for World University Rankings (Jeddah) 2013-2014      0.90


Looking at the top 100 universities, the  THE rankings are more stable than last year. The average university in the top 100 in 2013 rose or fell 4.34 places. The QS rankings are now more stable than the THE or Shanghai rankings.

Ranking                                                      Average Place Change of Universities in the Top 100
THE World Rankings 2013-2014                                 4.34
THE World Rankings 2012-2013                                 5.36
QS World Rankings 2013-2014                                  3.94
ARWU 2013-2014                                               4.92
Webometrics 2013-2014                                        12.08
Center for World University Rankings (Jeddah) 2013-2014      10.59


Note: universities falling out of the top 100 are treated as though they fell to 101st position.
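
For anyone who wants to check or extend these figures, here is a minimal Python sketch of the calculation. The ranking data are invented; the treatment of drop-outs follows the note above.

def noise_index(previous, current, top_n=100):
    # Average absolute place change for universities in last year's top_n.
    # previous and current map university name -> rank; a university that
    # drops below top_n (or out of the ranking) is treated as ranked top_n + 1.
    changes = []
    for university, old_rank in previous.items():
        if old_rank > top_n:
            continue
        new_rank = min(current.get(university, top_n + 1), top_n + 1)
        changes.append(abs(new_rank - old_rank))
    return sum(changes) / len(changes)

# Invented three-university example: average movement of 1.0 places
previous = {"Alpha": 1, "Beta": 2, "Gamma": 3}
current = {"Alpha": 2, "Beta": 1, "Gamma": 5}
print(noise_index(previous, current, top_n=3))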


Methodology

See here




Saturday, October 04, 2014

How to win citations and rise in the rankings

A large part of the academic world has either been congratulating itself on performing well in the latest Times Higher Education  (THE) world rankings, the data for which is provided by Thomson Reuters (TR), or complaining that only large injections of public money will keep their universities from falling into the great pit of the unranked.

Some, however, have been baffled by some of the placings reported by THE this year. Federico Santa Maria Technical University in Chile is allegedly the fourth best university in Latin America, Scuola Normale Superiore di Pisa the best in Italy and Turkish universities are apparently the rising stars of the academic world.

When there is a university that appears to be punching above its weight, the cause often turns out to be the citations indicator.

Scuola Normale Superiore di Pisa is 63rd in the world with an overall score of 61.9 but a citations score of 96.4.

Royal Holloway, University of London is 118th in the world with an overall score of 53 but a citations score of 98.9.

The University of California Santa Cruz is top of the world for citations, with an overall score of 53.7 and 100 for citations.

Bogazici University is 139th in the world with an overall score of 51.1 and a citations score of 96.8.

Federico Santa Maria Technical University in Valparaiso is in the 251-275 band, so the total score is not given, although it would be easy enough to work out. It has a score of 99.7 for citations.

So what is going on?

The problem lies with various aspects of Thomson Reuters' methodology.

First they use field normalisation. That means that they do not simply count the number of citations but compare the number of citations in 250 fields with the world average in each field. Not only that, but they compare each year in which the paper is cited with the world average of citations for that year.

The rationale for this is that the number  of citations and the rapidity with which papers are cited vary from field to field. A paper reporting a cure for cancer or the discovery of a new particle will be cited hundreds of times within weeks. A paper in philosophy, economics or history may languish for years before anyone takes notice. John Muth's work on rational expectations was hardly noticed or cited for years before eventually starting a revolution in economic theory. So universities should be compared to the average for fields and years. Otherwise, those that are strong in the humanities and social sciences will be penalised.
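
A rough sketch of the idea, in Python with invented world averages, is below. TR's actual calculation, which also normalises by the year in which each citation is received and is not public in full, will differ in detail, but the shape is the same: each paper is scored against the average for its field and year, and a small portfolio can be dominated by one heavily cited paper.

# Hypothetical world averages of citations per paper, by field and publication year
world_average = {
    ("particle physics", 2012): 12.0,
    ("philosophy", 2012): 0.8,
}

# Hypothetical portfolio of a small university
papers = [
    {"field": "particle physics", "year": 2012, "citations": 1631},
    {"field": "philosophy", "year": 2012, "citations": 2},
]

# Each paper is compared with the world average for its field and year;
# the university's impact score is the mean of these ratios.
ratios = [p["citations"] / world_average[(p["field"], p["year"])] for p in papers]
print(sum(ratios) / len(ratios))  # about 69.2: one mega-cited paper dominates a small output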

Up to a point this is not a bad idea, but it assumes that all disciplines are equally valuable and demanding. If the world has decided that it will fund medical research or astrophysics, and will support journals and pay researchers to read and cite other researchers' papers, rather than media studies or education, then this is perhaps something rankers and data collectors should take account of.

In any case, by normalising for so many fields and then throwing normalisation by year into the mix, TR increase the likelihood of statistical anomalies. If someone can get a few dozen citations within a couple of years after publication in a field where citations, especially early ones, average below one a year then this could give an enormous boost to a university's citation score. That is precisely what happened with Alexandria University in 2010. Methodological tweaking has mitigated the risk to some extent but not completely. A university could also get a big boost by getting credit, no matter how undeserved, for a breakthrough paper or a review that is widely cited.

So let's take a look at some of the influential universities in the 2014 THE rankings. Scuola Normale Superiore di Pisa (SNSP) is a small research intensive institution that might not even meet the criteria to be ranked by TR. Its output is modest, 2,407 publications in the Web of Science core collection between 2009 and 2013, although for a small institution that is quite good.

One of those publications is 'Observation of a new boson...' in Physics Letters B in September 2012, which has been cited 1,631 times.

The paper has 2,896 "authors", whom I counted by looking for semicolons in the "find" box, affiliated to 228 institutions. Five of them are from SNSP.

To put it crudely, SNSP is making an "authorship" contribution of 0.17% to the paper but getting 100% of the citation credit, as does every other contributor. Perhaps its researchers are playing a leading role in the Large Hadron Collider project, or perhaps it has made a disproportionate financial contribution, but TR provide no reason to think so.

The University of the Andes, supposedly the second best university in Latin America, is also a contributor to this publication, as is Panjab University, supposedly the second best institution in the Indian subcontinent.

Meanwhile, Royal Holloway, University of London has contributed to 'Observation of a new particle...' in the same journal and issue. This has received 1,734 citations and involved 2,932 authors from 267 institutions, along with Tokyo Metropolitan University, Federico Santa Maria Technical University, Middle East Technical University and Bogazici University.

The University of California Santa Cruz is one of 119 institutions that contributed to the 2010 'Review of Particle Physics', which has been cited 3,739 times to date. Like all the other contributors it gets full credit for all those citations.

It is not just the number of citations that boosts citation impact scores but also their occurrence within a year or two of publication so that the number of citations is much greater than the average for that field and those years.

The proliferation of papers with hundreds of authors is not confined to physics. There are several examples from medicine and genetics as well.

At this point the question arises: why not divide the citations for each paper among the authors? This is an option available in the Leiden Ranking, so it should not be beyond TR's technical capabilities.
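
As a rough illustration, using the citation and institution counts quoted above, fractional counting would look something like this. Dividing by institutions rather than authors is my choice for the example; Leiden offers fractional counting as an option, but this is not TR's method.

# Figures from the post for 'Observation of a new boson...'
citations = 1631
institutions = 228

full_credit = citations                       # what every contributor gets under whole counting
fractional_credit = citations / institutions  # roughly 7.2 citations per institution
print(full_credit, round(fractional_credit, 1))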

Or why not stop counting multi-authored publications when they exceed a certain quota of authors? This is exactly what TR did earlier this year when collecting data for its new highly cited researchers lists. Physics papers with more than 30 institutional affiliations were omitted, a very sensible procedure that should have been applied across the board.

So basically, one route to success in the rankings is to get into a multi-collaborator, mega-cited project.

But that is not enough in itself. There are hundreds of universities contributing to these publications, but not all of them reap such disproportionate benefits. It is important not to publish too much. A dozen LHC papers will do wonders if you publish 400 or 500 papers a year; four thousand a year and they will make little difference. One reason for the success of otherwise obscure institutions is that the number of papers by which the citations are divided is small.

So why on earth are TR using a method that produces such laughable results? Let's face it, if any other ranker put SNS Pisa, Federico Santa Maria or Bogazici at the top of its flagship indicator we would go deaf from the chorus of academic tut-tutting.

TR, I suspect, are doing this because this method is identical or nearly identical to that used for their InCites system for evaluating individual academics within institutions, which appears very lucrative, and they do not want the expense and inconvenience of recalculating data.

Also perhaps, TR have become so enamoured of the complexity and sophistication of their operations that they really do think that they have actually discovered pockets of excellence in unlikely places that nobody else has the skill or the resources to even notice.

But we have not finished. There is one more element in TR's distinctive methodology: the regional modification, introduced in 2011.

This means that the normalised citation impact score of the university is divided by the square root of the impact score of  the country in which it is located. A university located in a low scoring country will get a bonus that will be greater the lower the country's impact score. This would clearly be an advantage to countries like Chile, India and Turkey.
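
In code, the modification amounts to something like the sketch below, with invented scores.

from math import sqrt

# Invented numbers: the same normalised citation score looks much better
# when the university's home country has a low overall impact score.
def regional_adjust(university_score, country_score):
    return university_score / sqrt(country_score)

print(round(regional_adjust(0.8, 0.5), 2))   # 1.13 for a university in a low-scoring country
print(round(regional_adjust(0.8, 1.2), 2))   # 0.73 for the same university in a high-scoring country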

Every year there are more multi-authored, multi-cited papers. It would not be surprising if university presidents started scanning the author lists of publications like the Review of Particle Physics, sending out recruitment letters and getting ready for ranking stardom.







Friday, October 03, 2014

Scuola Normale Superiore di Pisa: Are they dancing in the streets?

There is a lot of coverage of that huge pocket of excellence in Italy at ROARS: Return on Academic Research. I am still trying to make sense of the Google translation.

The university with more research influence than.....

With apologies to the Sydney Morning Herald which had a headline about Caltech.

I don't know if the people of Valparaiso are aware that they are home to a world-class university, but if they do find out this might be a nice headline.

Federico Santa Maria Technical University. More research influence than Princeton, Stanford, Harvard, Oxford, Cambridge, Yale, Duke, Bogazici, Colorado School of Mines.... (insert as you wish.)

Thursday, October 02, 2014

Which universities have the greatest research influence?

Times Higher Education (THE) claims that its Citations: Research Influence indicator, prepared by Thomson Reuters (TR), is the flagship of its World University Rankings. It is strange, then, that the magazine has never published a research influence ranking, although that ought to be just as interesting as its Young Universities Ranking, Reputation Rankings or gender index.

So let's have a look at the top 25 universities in the world this year ranked for research influence, as measured by field- and year-normalised citations calculated by Thomson Reuters.

Santa Cruz and Tokyo Metropolitan have the same impact as MIT. Federico Santa Maria Technical University is ahead of Princeton. Florida Institute of Technology beats Harvard. Bogazici University and Scuola Normale Superiore do better than Oxford and Cambridge.

Are they serious?

Apparently. There will be an explanation in the next post. Meanwhile go and check if you don't believe me. And let me know if there's any dancing in the streets of Valparaiso, Pisa, Golden or Istanbul.


Rank and Score for Citations: Research Influence 2014-15 THE World Rankings

Rank University Score
1= University of California Santa Cruz 100
1= MIT 100
1= Tokyo Metropolitan University 100
4 Rice University 99.9
5= Caltech 99.7
5= Federico Santa Maria Technical University, Chile  99.7
7 Princeton University 99.6
8= Florida Institute of Technology 99.2
8= University of California Santa Barbara 99.2
10= Stanford University 99.1
10= University of California Berkeley 99.1
12= Harvard University 98.9
12= Royal Holloway University of London 98.9
14 University of Colorado Boulder  97.4
15 University of Chicago 97.3
16= Washington University in St Louis 97.1
16= Colorado School of Mines 97.1
18 Northwestern University 96.9
19 Bogazici University, Turkey  96.8
20 Duke University  96.6
21= Scuola Normale Superiore Pisa, Italy 96.4
21= University of California San Diego 96.4
23 Boston College 95.9
24 Oxford University 95.5
25= Brandeis University  95.3
25= UCLA 95.3

Tuesday, September 30, 2014

What good is reputation?

There is an excellent analysis by Alex Usher of Higher Education Strategy Associates of the reputation indicators in the THE and QS world rankings.

Main points include:


  • THE and QS are both insufficiently transparent about their reputation surveys and it is very difficult to judge their reliability. 


  • The numbers responding to the THE survey are very small outside the top 50 and this could cause substantial changes in total scores because of a small increase or decrease in the number of votes.


  • The lack of transparency is influenced by commercial motives.

THE has been dropping Twitter hints about interesting changes in the forthcoming rankings. Are these due to swings in the votes on the surveys?

 Or could it be the Large Hadron Collider Citation Amplifier?





Monday, September 29, 2014

Ranking Status Wars

It looks like Times Higher Education is pulling ahead of QS in the ranking status war.

From Asahi Shimbun in Japan:

"Only the University of Tokyo and Kyoto University made the top 100 of the World University Rankings released in October last year, placing 23rd and 52nd respectively. The rankings are decided by British educational journal Times Higher Education."

Meanwhile, a Norwegian study of rankings (analysis here, original report here) examines only the Shanghai ARWU, the Leiden Rankings and the THE World University Rankings.

Thursday, September 25, 2014

How the Universities of Huddersfield, East London, Plymouth, Salford, Central Lancashire et cetera helped Cambridge overtake Harvard in the QS rankings

It is a cause of pride for the great and the good of British higher education that the country's universities  do brilliantly in certain global rankings. Sometimes though, there is puzzlement about how UK universities can do so well even though the  performance of the national economy  and the level of adult cognitive skills are so mediocre.

In the latest QS World University Rankings Cambridge and Imperial College London pulled off a spectacular feat when they moved ahead of Harvard into joint second place behind MIT, an achievement at first glance as remarkable as Leicester City beating Manchester United. Is this a tribute to the outstanding quality of teaching, inspired leadership or cutting edge research, or perhaps something else?

Neither Cambridge nor Imperial does very well in the research based rankings. Cambridge is 18th and Imperial 26th among higher education institutions in the latest Scimago rankings for output and 32nd and 33rd for normalised impact (citations per paper adjusted for field). Harvard is 1st and 4th for these indicators. In the CWTS Leiden Ranking, Cambridge is 22nd and Imperial 32nd for the mean normalised citation score, sometimes regarded as the flagship of these rankings, while Harvard is 6th.

It is true that Cambridge does much better on the Shanghai Academic Ranking of World Universities with fifth place overall, but that is in large measure due to an excellent score, 96.6, for alumni winning Nobel and Fields awards, some dating back several decades. For Highly Cited Researchers and publications in Nature and Science its performance is not nearly so good.

Looking at the THE World University Rankings, which make some attempt to measure factors other than research, Cambridge and Imperial come in 7th and 10th overall, which is much better than they do in the Leiden and Scimago rankings. However, it is very likely that the postgraduate teaching and research surveys made a significant contribution to this performance. Cambridge is 4th in the THE reputation rankings based on last year's data and Imperial is 13th.

Reputation is also a key to the success of Cambridge and Imperial in the QS world rankings. Take a look at the scores and positions of Harvard, Cambridge and Imperial in the rankings just released.

Harvard  gets 100 points (2nd place) for the academic survey, employer survey (3rd), and citations per faculty (3rd). It has 99.7 for faculty student ratio (29th), 98.1 for international faculty (53rd), and 83.8 for international students (117th). Harvard's big weakness is its relatively small percentage of international students.

Cambridge is in first place for the academic survey and 2nd in the employer survey, in both cases with a score of 100 and one place ahead of Harvard. The first secret of Cambridge's success is that it does much better on reputational measures than for bibliometric or other objective data. It was 18th for faculty student ratio, 73rd for international faculty, 50th for international students and 40th for citations per faculty.

So, Cambridge is ahead for faculty student ratio and international students and Harvard is ahead for international faculty and citations per faculty. Both get 100 for the two surveys.

Similarly, Imperial has 99.9 points for the academic survey (14th), 100 for the employer survey (7th), 99.8 for faculty student ratio (26th), 100 for international faculty (41st), 99.7 (20th) for international students and 96.2 (49th) for citations per faculty. It is behind Harvard for citations per faculty but just enough ahead for international students to squeeze past into joint second place.

The second secret is that QS's standardisation procedure combined with an expanding database means that the scores of the leading universities in the rankings are getting more and more squashed together at the top. QS turns its raw data into Z scores so that universities are measured according to their distance in standard deviations from the mean for all ranked universities. If the number of sub-elite universities in the rankings increases then the overall means for the indicators will fall and the scores of universities at the top end will rise as their distance in standard deviations from the mean increases.

Universities with scores of 98 and 99 will now start getting scores of 100. Universities with recorded scores of 100 will go on getting 100, although they might go up a few invisible decimal points.
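
A toy example shows the mechanism. The rescaling of Z scores onto a 0-100 indicator scale is my assumption (top scorer set to 100); QS does not publish its exact transformation, but any similar rescaling behaves the same way once a long tail of weak universities drags the mean down.

import statistics

def indicator_scores(raw):
    # Convert raw values to Z scores, then rescale so the top scorer gets 100.
    mean = statistics.mean(raw)
    sd = statistics.pstdev(raw)
    z = [(x - mean) / sd for x in raw]
    top = max(z)
    return [round(100 * v / top, 1) for v in z]

leaders = [95, 90, 85, 80]             # invented raw values for four leading universities
print(indicator_scores(leaders)[:2])   # [100.0, 33.3]: a clear gap between first and second

tail = leaders + [40] * 400            # add a long tail of weaker universities
print(indicator_scores(tail)[:2])      # [100.0, 90.8]: the leaders are now squashed together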

In 2008, QS ranked 617 universities. In that year, nine universities had a score of 100 for the academic survey, four for the employer survey, nine for faculty student ratio, six for international faculty, six for international students and seven for citations per faculty.

By 2014 QS was ranking over 830 universities (I assume that those at the end of the rankings marked "NA" are there because they got votes in the surveys but are not ranked because they fail to meet the criteria for inclusion). For each indicator the number of universities getting a score of 100 increased. In 2014 there were 13 universities with a score of 100 for the academic survey, 14 for the employer survey, 16 for faculty student ratio, 41 for international faculty, 15 for international students and 10 for citations per faculty.

In 2008 Harvard got the same score as Cambridge for the academic and employer surveys. It was 0.3 (0.06 weighted) behind for faculty student ratio, 0.6 (0.03 weighted) behind for international faculty, and 14.1 (0.705 weighted) behind for international students. It was, however, 11.5 points (2.3 weighted) ahead for citations per faculty. Harvard was therefore first and Cambridge third.
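
For readers who want to see how those gaps add up, here is the arithmetic as a short sketch. The gap figures are the ones quoted above and the weights are QS's published ones; the surveys cancel out because both universities scored 100.

# QS indicator weights: academic survey 40%, employer survey 10%, faculty-student 20%,
# citations per faculty 20%, international faculty 5%, international students 5%.
weights = {"faculty_student": 0.20, "citations": 0.20,
           "intl_faculty": 0.05, "intl_students": 0.05}

# 2008 gaps between Cambridge and Harvard (positive = Cambridge ahead)
gaps = {"faculty_student": 0.3, "citations": -11.5,
        "intl_faculty": 0.6, "intl_students": 14.1}

overall_gap = sum(gaps[k] * weights[k] for k in gaps)
print(round(overall_gap, 2))   # about -1.5: Harvard's citations lead outweighed everything else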

By 2014 Cambridge had fallen slightly behind Harvard for international faculty. It was slightly ahead for faculty student ratio. Scores for the survey remained the same, 100 for both places. Harvard reduced the gap for international students slightly.

What made the difference in 2014 and put Cambridge ahead of Harvard was citations per faculty. In 2008 Harvard, in fifth place for this indicator with a score of 100, was 11.5 points (2.3 weighted) ahead of Cambridge. In 2014 Cambridge had improved a bit -- it was 40th instead of 49th -- but now got 97.9 points, reducing the difference with Harvard to 2.1 points (0.42 weighted). That was just enough to let Cambridge overtake Harvard.

Cambridge's rise between 2008 and 2014 was thus largely due to the increasing number of ranked universities, which lowered the mean for each indicator, raised the Z scores at the top, and so reduced the effect of Cambridge's comparatively lower citations per faculty score.

The same thing happened to Imperial. It did a bit better for citations, rising from 58th to 49th place, and this brought its score up from 83.10 to 96.20, again allowing it to creep past Harvard.

Cambridge and Harvard should be grateful to those universities filling up the 701+ category at the bottom of the QS rankings. They are the invisible trampoline that propelled "Impbridge" into second place, just behind MIT.

QS should think carefully about adding more universities to their rankings. Another couple of hundred and there will be a dozen universities at the top getting 100 for everything.

Monday, September 22, 2014

Using Ig Nobel awards for ranking countries

Since 1991 Improbable Research has awarded prizes for research that makes people laugh and then think. Highlights this year include dung beetles navigating by starlight, the reaction of reindeer to humans disguised as polar bears and the ethical inferiority of people who can't get up in the morning.

The Ig® Nobel Interactive Database publishes a series of charts, one of which indicates the countries that produce such cutting-edge research. Here are the top ten. The funny thing is that it looks similar to the top ten countries with universities in the top 100 of the QS rankings. The big difference is that Japan does better for Ig Nobel prizes than it does in the QS and the other rankings.



Country              % of Ig Nobel awards
1.   USA                 34.7
2.   UK                  12.3
3.   Japan                9.9
4.   Australia            5.5
5.   France               3.7
6.   Netherlands          3.5
7.   Canada               2.8
8.   Italy                2.6
9.   Switzerland          2.1
10.  China                1.4





Country              Number of Universities in QS Top 100
1.   USA                 28
2.   UK                  19
3.   Australia            8
4.   Netherlands          7
5.   Canada               5
6.   Switzerland          4
7.   Japan                4
8=   Germany              3
8=   China                3
8=   Korea                3
8=   Hong Kong            3


The Uses of Rankings

It is getting difficult to avoid university rankings. They seem to be everywhere, with advertisements in railway stations in the English Midlands proclaiming that the local university is in the top 100 for something and newspaper articles in Malaysia reporting the latest news from QS.

Even Rod Liddle, the Spectator's curmudgeon in residence, has taken notice of the rankings and used them to mount a half-hearted defence of British culture against a scathing attack by Portuguese academic Joao Magueijo, before retreating and conceding that the assault is pretty much on target.

"We might also mention, quietly, that he [Magueijo] has a post at one of the world’s top ten universities and that at least three other British universities are in that top ten, but there is not a Portuguese university in the top 200 (if they have universities)."


Thursday, September 18, 2014

QS World University Rankings 2014



Publisher

QS (Quacquarelli Symonds)



Scope

Global. 701+ universities.


Top Ten


Place   University
1       MIT
2=      Cambridge
2=      Imperial College London
4       Harvard
5       Oxford
6       University College London
7       Stanford
8       California Institute of Technology (Caltech)
9       Princeton
10      Yale



Countries with Universities in the Top Hundred


Country          Number of Universities
USA                  28
UK                   19
Australia             8
Netherlands           7
Canada                5
Switzerland           4
Japan                 4
Germany               3
China                 3
Korea                 3
Hong Kong             3
Denmark               2
Singapore             2
France                2
Sweden                2
Ireland               1
Taiwan                1
Finland               1
Belgium               1
New Zealand           1


Top Ranked in Region


Region                        Top University
North America                 MIT
Africa                        University of Cape Town
Europe                        Cambridge / Imperial College London
Latin America                 Universidade de Sao Paulo
Asia                          National University of Singapore
Central and Eastern Europe    Lomonosov Moscow State University
Arab World                    King Fahd University of Petroleum and Minerals
Middle East                   Hebrew University of Jerusalem



Noise Index

In the top 20, this year's QS world rankings are less volatile than the previous edition but more so than the THE rankings or Shanghai ARWU. The top 20 universities in 2013 rose or fell an average of 1.45 places. The most remarkable change was the rise of Imperial College and Cambridge to second place behind MIT and ahead of Harvard.


Ranking                                                      Average Place Change of Universities in the Top 20
QS World Rankings 2013-2014                                  1.45
QS World Rankings 2012-2013                                  1.70
ARWU 2013-2014                                               0.65
Webometrics 2013-2014                                        4.25
Center for World University Rankings (Jeddah) 2013-2014      0.90
THE World Rankings 2012-2013                                 1.20


Looking at the top 100 universities, the QS rankings are little different from last year. The average university in the top 100 moved up or down 3.94 places, compared to 3.97 between 2012 and 2013. These rankings are more stable than this year's ARWU, which was affected by the new lists of highly cited researchers, and than last year's THE rankings.

Ranking                                                      Average Place Change of Universities in the Top 100
QS World Rankings 2013-2014                                  3.94
QS World Rankings 2012-2013                                  3.97
ARWU 2013-2014                                               4.92
Webometrics 2013-2014                                        12.08
Center for World University Rankings (Jeddah) 2013-2014      10.59
THE World Rankings 2012-2013                                 5.36




Methodology (from topuniversities)

1. Academic reputation (40%)

Academic reputation is measured using a global survey, in which academics are asked to identify the institutions where they believe the best work is currently taking place within their field of expertise.
For the 2014/15 edition, the rankings draw on almost 63,700 responses from academics worldwide, collated over three years. Only participants’ most recent responses are used, and they cannot vote for their own institution. Regional weightings are applied to counter any discrepancies in response rates.
The advantage of this indicator is that it gives a more equal weighting to different discipline areas than research citation counts. Whereas citation rates are far higher in subjects like biomedical sciences than they are in English literature, for example, the academic reputation survey weights responses from academics in different fields equally.
It also gives students a sense of the consensus of opinion among those who are by definition experts. Academics may not be well positioned to comment on teaching standards at other institutions, but it is well within their remit to have a view on where the most significant research is currently taking place within their field.

2. Employer reputation (10%)

The employer reputation indicator is also based on a global survey, taking in almost 28,800 responses for the 2014/15 edition. The survey asks employers to identify the universities they perceive as producing the best graduates. This indicator is unique among international university rankings.
The purpose of the employer survey is to give students a better sense of how universities are viewed in the job market. A higher weighting is given to votes for universities that come from outside of their own country, so it’s especially useful in helping prospective students to identify universities with a reputation that extends beyond their national borders. 

3. Student-to-faculty ratio (20%)

This is a simple measure of the number of academic staff employed relative to the number of students enrolled. In the absence of an international standard by which to measure teaching quality, it provides an insight into the universities that are best equipped to provide small class sizes and a good level of individual supervision.

4. Citations per faculty (20%)

This indicator aims to assess universities’ research output. A ‘citation’ means a piece of research being cited (referred to) within another piece of research. Generally, the more often a piece of research is cited by others, the more influential it is. So the more highly cited research papers a university publishes, the stronger its research output is considered.
QS collects this information using Scopus, the world’s largest database of research abstracts and citations. The latest five complete years of data are used, and the total citation count is assessed in relation to the number of academic faculty members at the university, so that larger institutions don’t have an unfair advantage.

5 & 6. International faculty ratio (5%) and international student ratio (5%)

The last two indicators aim to assess how successful a university has been in attracting students and faculty members from other nations. This is based on the proportion of international students and faculty members in relation to overall numbers. Each of these contributes 5% to the overall ranking results.
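
As a rough illustration of indicator 4 above, the raw calculation is simply total citations over the five-year window divided by faculty numbers. The figures below are invented, and QS's subsequent normalisation and rescaling are not reproduced.

# Invented figures for one university
citations_last_five_years = 250_000
academic_faculty = 4_000

print(citations_last_five_years / academic_faculty)   # 62.5 citations per faculty member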

Monday, September 15, 2014

What makes a world-class university?

According to Times Higher Education, a world-class university, one that is in the top 200 of the THE World University Rankings:


  • has a lot of money
  • has a lot of money for research
  • has a lot of staff compared to students
  • attracts staff and students from abroad
  • collaborates with international researchers.


All of these data are derived from the Thomson Reuters InCites programme and they are all indicators in THE world rankings.

The article compares the top 200 to the top 400 universities. It seems that at the top money talks loudly, but internationalisation has a limited impact -- 20 percent of staff from abroad in the top 200 compared to 18 percent in the top 400, and 43 percent of papers with international collaborators compared to 42 percent.

Why isn't there anything about citations and research impact? Wasn't that supposed to be the rankings' flagship indicator?

Sunday, September 14, 2014

The British Paradox

Times Higher Education, reporting on the latest OECD education report, says,

"The UK is ranked relatively low among the most developed nations for the literacy skills of graduates, with its performance described as “a puzzle” given the elevated reputation of its universities."

It is only a puzzle if you think that reputation is an accurate reflection of reality.

Friday, September 12, 2014

U-Multirank – A university ranking evaluation

My article on U-Multirank can be found in the latest University World News.

Wednesday, September 10, 2014

The Only Ranking You'll Ever Need

The Onion, which is looking more and more like a dull chronicle of everyday life in the US, has just published its 2014 University rankings. Here are some highlights:

1. Harvard.  "Endowment: Never enough"
2. University of Alabama. "Subway Franchises on Campus:104"
3. Oberlin College. "Most popular student activity: Adding the prefix 'cis' to all nouns."
4. University of Phoenix. "Undergraduates: 150,000 students, 5.8 million bots."
5. United States  Military Academy at West Point."Incoming class: 74 percent of admitted students were the Supreme Allied Commander of their high school class"
6. ITT Technical Institute, Pensacola Campus. "Admissions requirement: $25,000"


America's Best Colleges

The US News & World Report's America's Best Colleges has just been published. There are no surprises at the top. Here are the top ten.

1.  Princeton
2.  Harvard
3.  Yale
4= Columbia
4= Stanford
4= Chicago
7.   MIT
8= Duke
8= University of Pennsylvania
10. Caltech

Analysis at the Washington Post indicates little movement at the top. Outside the elite there are some significant changes.

Liberal arts colleges
St. John's College, Annapolis from 123rd to 56th .
Bennington College from 122nd to 89th.

National universities
Northeastern University from 69th to 42nd.
Texas Christian University from 99th to 46th.




Sunday, September 07, 2014

Scottish Independence and the Rankings

What happens if Scotland votes yes for independence?

Forget about Trident, currency union, North Sea oil and how many crosses there will be in the Union Jack.

The really important issue (1) is what happens to Scottish and other British universities in the international university rankings.

English, Welsh and Northern Irish students and staff in Scottish universities will presumably be classified as international. Whether that  happens immediately or over a few years remains to be seen. Also, if most of the Scottish population retains dual nationality, there will be a lot of quibbling over the small print in the instructions Thomson Reuters (TR) and QS send out to universities participating in the rankings. But one way or another there will be a boost for Scottish universities, at least in the short run.

There would be an immediate bonus in the international collaboration indicator in the Times Higher Education (THE) rankings.

There would also be a smaller boost for English, Welsh and Northern Irish universities as well since some Scottish students and faculty would presumably sooner or later become international.

Less certain is the effect of independence on the regional modification in the Citations: Research Impact indicator in the THE rankings. If the overall Scottish field-normalised and year-normalised citation rate is less than that of the rest of the United Kingdom then independence and separate counting would bring another bonus for Scottish universities since they would be benchmarked against a lower number. Whether the rate is in fact lower is something that TR will no doubt be keen to tell us.

Nothing would happen right away in the Shanghai rankings unless an independent Scottish government found a new way of counting university staff. That could affect the Productivity per Capita indicator.

Of course, the long term fate of Scottish education and society would depend on the policies adopted by an independent Scottish government. Alex Salmond's "plans" for currency do not inspire very much confidence, but who knows?


(1) I'm being sarcastic.

Friday, September 05, 2014

Problems with Green Rating

More evidence that self-submitted data for university rankings is not a good idea comes from Inside Higher Ed. An article by Ry Rivard reports that many American colleges have been submitting data that is incomplete or inconsistent for an environmental standards rating published by the Sustainable Endowments Institute.

Wednesday, September 03, 2014

Going on Twitter

As the ranking season gets under way, with America's Best Colleges, the QS world rankings, the THE world rankings and various spin-offs and alternative rankings in the pipeline, I have reactivated my twitter account at Richard Holmes @universities06.

Monday, September 01, 2014

More on Affiliation

Recently, there has been a lot of finger wagging about King Abdulaziz University (KAU), Jeddah, signing up highly cited researchers as secondary affiliations. The idea behind this was to climb up the ladder of the Shanghai rankings, the Academic Ranking of World Universities. These rankings include an indicator, based on Thomson Reuters' (TR) lists of highly cited researchers, which until now gave universities credit for those researchers who list them as secondary affiliation.

The Shanghai Ranking Consultancy decided that this year they would  count secondary affiliations in the old but not the new list "at the suggestion of many institutions and researchers including some Highly Cited Researchers".

It is possible that the highly cited researchers mentioned may have upset their primary affiliations, which might have noticed that the indicator points accruing to KAU would come out of their own scores. Just counting the primary affiliations in the new list meant that institutions such as Stanford, the Tokyo Institute of Technology, Indiana University Bloomington, the University of Sydney and the Indian Institute of Science have lost several points for this indicator.

The highly cited indicator is unique among the well-known international rankings because when researchers change their affiliation all of their papers go with them. It does not matter whether a university has employed a researcher for a day or a decade; it will still get the same credit in this indicator. Everything depends on what researchers put down as their affiliation or affiliations.

All of this is just one manifestation of a problem that has been latent in academic publishing for some years, namely the issue of the affiliation that researchers use when submitting papers or articles. There has probably been quite a bit of small scale fiddling going on for years, with researchers with doctorates from selective universities giving those places as affiliations rather than the technical or education colleges where they are teaching or adjuncts picking the most prestigious of the several institutions where they work.

The best known case of creative affiliation  was that of Mohammed El Naschie whose publication career included questionable claims to affiliation with Cambridge, Frankfurt, Surrey and Alexandria Universities (see High Court of Justice Queen's Bench Division: Neutral Citation Number: [2012] EWHC 1809 (QB)).

Most of these claims did no one any good or any harm, apart from a little embarrassment. However, the Alexandria affiliation, combined with Thomson Reuters' distinctive method of counting citations and the university's relatively few publications, propelled Alexandria into the world's top five for research impact and the top 200 overall in the 2010 Times Higher Education (THE) World University Rankings.

It is possible that a few of the researchers who have signed up for KAU will start showing up in massively cited multi-contributor publications, many of them in physics, that can boost otherwise obscure places into the upper sections of the research impact indicator of the THE rankings.

TR have said that they did not count physics articles with more than 30 authors when they prepared their recent list of highly cited researchers. This could reduce the scores obtained by KAU, Panjab University and some other institutions if TR follow the same procedure in the coming THE world rankings. The issue, however, is not confined to physics.

It is time that journals, databases and ranking organisations began to look carefully at affiliations. At the least, journals should start checking claims and rankers might consider counting only one affiliation per author.





Saturday, August 23, 2014

The Shanghai Rankings Part 2

The Shanghai Rankings have had a reputation for reliability and consistency. The latest rankings have, however, undermined that reputation a little. There have been two methodological changes. One, no longer counting proceedings papers in the Nature and Science and Publications indicators, may not be of any significance. The other is the use of a new list of Highly Cited Researchers prepared by Thomson Reuters covering citations between 2002 and 2012. In this year's rankings this was combined with the old list, which had not been updated since 2004.

One result of this is that there have been some very dramatic changes in the scores for Highly Cited Researchers this year. University of California Santa Cruz's score has risen from 28.9 to 37.9, Melbourne's from 24 to 29.3 and China University of Science and Technology's from 7.2 to 24.5 while that of the Australian National University has fallen from 32.3 to 24.8 and Virginia Polytechnic Institute's from 22.9 to 11.4.

This has had a noticeable impact on total scores. Santa Cruz has risen from the 101-150 band to 93rd place, Melbourne from 54th to 44th and China University of Science and Technology from the 201 - 300 band to the 150-200 band. The Australian National University has fallen from 66th place to 74th and Indiana University at Bloomington has dropped from 85th place to the 101-150 band.


In the top 20, this year's ARWU is more volatile than the two previous editions but still less volatile than any other international ranking. The top 20 universities in 2013 rose or fell an average of 0.65 places.


Ranking                                                      Average Place Change of Universities in the Top 20
ARWU 2013-2014                                               0.65
ARWU 2012-2013                                               0.25
ARWU 2011-2012                                               0.15
Webometrics 2013-2014                                        4.25
Center for World University Rankings (Jeddah) 2013-2014      0.90
THE World Rankings 2012-2013                                 1.20
QS World Rankings 2012-2013                                  1.70


Looking at the top 100 universities, the ARWU is much more volatile than in previous years and more volatile than last year's QS rankings, with the average institution moving up or down 4.92 places.

Ranking                                                      Average Place Change of Universities in the Top 100
ARWU 2013-2014                                               4.92
ARWU 2012-2013                                               1.66
ARWU 2011-2012                                               2.01
Webometrics 2013-2014                                        12.08
Center for World University Rankings (Jeddah) 2013-2014      10.59
THE World Rankings 2012-2013                                 5.36
QS World Rankings 2012-2013                                  3.97







Sunday, August 17, 2014

The Shanghai Rankings (Academic Ranking of World Universities) 2014 Part 1

Publisher

Center for World-Class Universities, Shanghai Jiao Tong  University


Scope

Global. 500 institutions.


Methodology

See ARWU site.

In contrast to the other indicators, the Highly Cited Researchers indicator has undergone substantial changes in recent years, partly as a result of changes by data provider Thomson Reuters. Originally, ARWU used the old list of highly cited researchers prepared by Thomson Reuters (TR), which was first published in 2001 and updated in 2004. Since then no names have been added, although changes of affiliation submitted by researchers were recorded.

Until 2011, when a researcher listed more than one institution as an affiliation, credit for the highly cited indicator was divided equally. Following the recruitment of a large number of part-time researchers by King Abdulaziz University, ARWU introduced a new policy of asking researchers how their time was divided. When there was no response, secondary affiliations were counted as 16%, which was the average time share given by those who responded to the survey.
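
As a back-of-the-envelope sketch of that weighting (my reading of the rule as described above, not ARWU's published code):

def affiliation_weights(reported_secondary_share=None):
    # If the researcher answers the survey, use the reported split;
    # otherwise count the secondary affiliation at the 16% default.
    secondary = 0.16 if reported_secondary_share is None else reported_secondary_share
    return {"primary": round(1 - secondary, 2), "secondary": secondary}

print(affiliation_weights())       # {'primary': 0.84, 'secondary': 0.16}
print(affiliation_weights(0.5))    # pre-2011 style equal division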

In 2013 TR announced that they were introducing a new list based on field-normalised citations over the period 2002-2012. However, problems with the preparation of the new list meant that it could not be used in the 2013 rankings. Instead, the Shanghai rankings repeated the 2012 scores.

During 2013, KAU recruited over 100 highly cited researchers who nominated the university as a secondary affiliation. That caused some comment by researchers and analysts. A paper by Lutz Bornmann and Johann Bauer concluded that to "counteract attempts at manipulation, ARWU should only consider primary institutions of highly cited researchers."

It seems that Shanghai has acted on this advice: "It is worth noting that, upon the suggestion of many institutions and researchers including some Highly Cited Researchers, only the primary affiliations of new Highly Cited Researchers are considered in the calculation of an institution’s HiCi score for the new list."

As a result, KAU has risen into the lower reaches of the 150-200 band on the basis of publications, some papers in Nature and Science and a modest number of primary affiliations among highly cited researchers. That is a respectable achievement but one that would have been much greater if the secondary affiliations had been included.


Perhaps Shanghai should also take note of the suggestion in a paper by Lawrence Cram and Domingo Docampo that "[s]ignificant acrimony accompanies some published comparisons between ARWU and other rankings (Redden, 2013) driven in part by commercial positioning. Given its status as an academic ranking, it may be prudent for ARWU to consider replacing its HiCi indicator with a measure that is not sourced from a commercial provider if such a product can be found that satisfies the criteria (objective, open, independent) used by ARWU."


Top Ten


Place University
1 Harvard
2 Stanford
3 MIT
4 University of California Berkeley
5 Cambridge
6 Princeton
7 California Institute of Technology (Caltech)
8 Columbia
9= Chicago
9= Oxford



Countries With Universities in the Top 100



Country            Number of Universities
United States          52
United Kingdom          8
Switzerland             5
Germany                 4
France                  4
Netherlands             4
Australia               4
Canada                  4
Japan                   3
Sweden                  3
Belgium                 2
Israel                  2
Denmark                 2
Norway                  1
Finland                 1
Russia                  1