Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Friday, December 19, 2014
Study International
Study International is a new web site published by Hybrid News that will provide news, analysis and advice about international higher education. Some posts from this blog have been republished there; see, for example, this post.
Wednesday, December 10, 2014
Is Asia really rising?
Over the last few years there has been a lot of talk about the continuing rise of Asian universities. In The Conversation, Gerard Postiglione of the University of Hong Kong has pointed out that Asian universities now take one in eight of the top 200 places in the Times Higher Education World University Rankings and he predicts that by 2040 a quarter of the top universities will be Asian.
It is interesting that he apparently regards the THE rankings as the arbiter of excellence despite an eccentric methodology that, among other oddities, claims that an excellent but small research institute is the best university in Italy and among the best in the world.
Exactly what progress in the THE rankings means is difficult to decide. A rise in the score for the research indicator, for example, could result from an increase in the number of publications, a fall in the number of academic and/or research staff, an increase in research income, an improvement in the research reputation survey or a combination of some or all of these.
Predicting what will happen in the THE rankings has become even more difficult since THE broke up with their data supplier, Thomson Reuters, raising the possibility that there will be another round of methodological changes.
There are some sceptics such as Alex Usher of Higher Education Strategy Associates but in general the rise of Asia and the decline of the US and UK seems to have become part of the accepted wisdom of Western pontificators.
So is Asia rising? And if it is, is it the whole of the continent or just parts of it?
The problem is that the rankings vary in their ability to identify medium-term trends. QS and THE give a large weighting to reputation surveys that are inherently volatile. They also use an unstable number of institutions to generate the means from which processed scores are calculated, and this can lead to fluctuations in the final overall scores.
The Academic Ranking of World Universities (ARWU) produced by the Shanghai Center for World-Class Universities is probably the most useful for identifying changes over the last decade since there were no significant changes between 2004 (when schools with strengths in the social sciences were helped by exemption from the Nature and Science indicator) and 2014 (when Thomson Reuters issued a new list of highly cited researchers).
The number of universities in the Shanghai top 500 provides strong evidence that some parts of Asia are making rapid progress. The number of mainland Chinese universities (excluding Hong Kong and Taiwan) has risen from eight to 33. The number of Korean universities has gone from eight to ten, Taiwanese from three to six, Saudi from zero to four, and Malaysian from zero to two.
But some parts of Asia appear to be stagnant or in relative decline. The number of Japanese universities has fallen from 36 to 19 and Indian from three to one while the number of Hong Kong institutions has remained the same at five.
Looking at the performance of some national flagships in the ARWU publications indicator provides more evidence of an expansion of research in some Asian countries. Compared to Harvard's benchmark score of 100, Peking University has risen from 49.8 in 2004 to 63.6 in 2014. Other Asian universities have also had substantial growth over the decade. Seoul National University went from 62.6 to 67.8, National Taiwan University from 52.6 to 57.9 and Istanbul University from 30.7 to 34.9.
Note that the raw numbers of publications have been modified by a logarithm so that in 2004 Peking was in fact publishing about a quarter of the number of papers produced by Harvard rather than a half.
On the other hand, Tokyo University, which in 2004 had the second highest publications score in the world, fell from 91.9 to 73 and the University of Hong Kong from 46.4 to 44.
If we look at Shanghai's Productivity per Capita indicator, which measures quality by dividing five combined indicators by the number of faculty, we find some Asian universities doing well. Peking goes from 5.9 to 16.5, Seoul National University from 19 to 23.4 and National Taiwan University from 17.5 to 19.9. Tokyo, meanwhile, has fallen from 49.8 to 29.2. Hong Kong University, on the other hand, has risen from 13.1 to 22.4.
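As a rough sketch of how a per-capita figure of this kind is put together (the weights follow the published ARWU scheme, but the indicator scores and staff figure below are invented for illustration):

```python
# Rough sketch of the ARWU Productivity per Capita idea: the weighted scores
# of the other five indicators divided by the number of full-time equivalent
# academic staff. Scores and staff figure below are invented.
def per_capita(indicator_scores, weights, fte_staff):
    weighted_sum = sum(s * w for s, w in zip(indicator_scores, weights))
    return weighted_sum / fte_staff

scores = [60.0, 45.0, 30.0, 55.0, 40.0]   # Alumni, Awards, HiCi, N&S, PUB
weights = [0.10, 0.20, 0.20, 0.20, 0.20]  # PCP itself takes the remaining 0.10
print(round(per_capita(scores, weights, fte_staff=2.5), 1))  # 16.0
```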
Confirmation of the trends for research output comes from the Output indicator in the Scimago rankings, which is based on data provided by Scopus. Peking, Seoul National University, National Taiwan University and Universiti Malaya rose between 2009 and 2014. However, the scores for Tokyo and Hong Kong both fell.
On the other hand, the evidence of Scimago's normalised impact indicator, which might measure research quality, shows Peking rising but Seoul National University and Hong Kong falling.
It would seem that China and the overseas Chinese communities and Korea are expanding the quantity of research but progress at higher levels is slower. There are also islands of research productivity in West and Southeast Asia. In Central Asia, the Indian Subcontinent and Indonesia there is very little significant research activity while Japan is actually declining.
One Way of Rising in the Rankings
One way of rising in the rankings is to amalgamate. Watch out for Paris-Saclay in next year's Shanghai rankings.
From BBC Online News:
"Dominique Vernay, the president of this new university, says that within a decade he wants Paris-Saclay to be among the top ranking world universities.
"My goal is to be a top 10 institution," he says. In Europe, he wants Paris-Saclay to be in the "top two or three".
In university rankings, big is beautiful, and the Paris-Saclay will have 70,000 students and 10,000 researchers. There will be an emphasis on graduate courses and recruiting more international students and staff.
The idea of bringing together individual colleges into a "federal university" has been borrowed from the UK.
"Our model isn't that far from the Oxbridge model," he says.
To put it into scale, Mr Vernay says Paris-Saclay is going to be twice the size of the University of California, Berkeley, one of the flagships of the US university system."
Saturday, December 06, 2014
Ranking Status Wars 4
Meanwhile in Saudi Arabia, scholarships granted under the Custodian of the Two Holy Mosques Programme will go to students at 200 scientific universities chosen from the Big Four rankings: US News, Times Higher Education, QS and Shanghai.
Wednesday, December 03, 2014
Ranking Status Wars 3
The US News and World Report's Best Global Universities has been admitted to the ranks of the elite rankings. The Hong Kong government has announced a scholarship programme that will pay tuition and bursaries at universities in the top 100 of the QS world rankings, the THE world rankings, the Shanghai rankings and the USNWR global rankings.
Thursday, November 20, 2014
More on the Ranking Status Wars
The Economist thinks there are two international university rankings worth talking about. Will the prestigious THE rankings continue to be prestigious now that they are no longer powered by Thomson Reuters but have to share their data partner with QS?
"But most universities still have far to go. Only two Chinese institutions number in the top 100 in the Times Higher Education World University Rankings. Shanghai’s Jiao Tong University includes only 32 institutions from mainland China among the world’s 500 best. The government frets about the failure of a Chinese scholar ever to win a Nobel prize in science (although the country has a laureate for literature and an—unwelcome—winner in 2010 of the Nobel peace prize, Liu Xiaobo, an imprisoned dissident)".
The Times [Higher Education rankings] they are a-changing
Maybe I'll get my five minutes of fame for being first with a Dylan quotation. I was a bit slow because, unlike Jonah Lehrer, I wanted to check that the quotation actually exists.
Times Higher Education (THE) have announced that they will be introducing reforms to their World University Rankings and ending their partnership with media and data giant, Thomson Reuters (TR).
Exactly why is not stated. It could be rooted in a financial disagreement. Maybe THE feels betrayed because TR let US News use the reputation survey for their new Best Global Universities rankings. Perhaps THE got fed up with explaining why places like Bogazici University, Federico Santa Maria Technical University and Royal Holloway were world beaters for research impact, outshining Yale, Oxford and Cambridge.
The reputation survey will now be administered by THE itself in cooperation with Elsevier and will make use of the Scopus database. Institutional data will be collected from universities, the Scopus database and the SciVal analysis tool by a new THE team.
The coming months will reveal what THE have in store but for now this is a list of recommendations. No doubt there will be many more from all sorts of people.
Display each indicator separately instead of lumping them together into Teaching, Research and International Outlook. It is impossible to work out exactly what is causing a rise or fall in the rankings unless they are separated.
Try to find some way of reducing the volatility of the reputation survey. US News do this by using a five-year average and QS by rolling over unchanged responses for a further two years (a rough sketch of this kind of smoothing follows this list).
Consider including questions about undergraduate teaching or doing another survey to assess student satisfaction.
Reduce the weighting of the citations indicator and use more than one measure of citations to assess research quality (citations per paper), faculty quality (citations per faculty) and research impact (total citations). Use field normalisation but sparingly and sensibly and forget about that regional modification.
Drop the Industry Income: Innovation indicator. It is unfair to liberal arts colleges and private universities and too dependent on input from institutions. Think about using patents instead.
Income is an input. Do not use it unless it is to assess the efficiency of universities in producing research or graduates.
Consider dropping the international students indicator or at least reducing its weighting. It is too dependent on geography and encourages all sorts of immigration scams.
Benchmark scores against the means of a constant number of institutions. If you do not, the mean indicator scores will fluctuate from year to year causing all sorts of distortions.
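Here is a minimal sketch of the smoothing idea mentioned above, assuming a hypothetical series of annual survey scores; the point is simply that a five-year average moves far less than any single year's result.

```python
# Hypothetical sketch: damping reputation-survey volatility with a
# five-year rolling average, as US News is said to do above.
def rolling_average(scores, window=5):
    """Mean of the most recent `window` annual survey scores."""
    recent = scores[-window:]
    return sum(recent) / len(recent)

# An invented, volatile series of annual reputation scores for one university.
annual_scores = [42.0, 55.0, 38.0, 61.0, 47.0]
print(rolling_average(annual_scores))  # 48.6, far steadier than any single year
```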
Thursday, November 06, 2014
The US News Arab Region Rankings
Hardly a week passes without the publication of yet more international university rankings. This week it was the Best Arab Region Universities from the US News, famous for producing America's Best Colleges for over three decades.
These rankings are research based. There are nine indicators, one of which measures the number of publications and has a weighting of 30 per cent. The other eight relate to citations in some way. There are no indicators measuring faculty student ratio, teaching quality, graduate employment, income or reputation.
Inclusion in the rankings required 400 papers in the Scopus database over a five-year period, 2009 to 2013. It is a serious indictment of Arab universities that only 91 institutions could reach this modest target.
There is an interesting section in the methodology:
"Papers published by Arab region institutions in the subject area of physics and astronomy were excluded based on input from Elsevier's bibliometric experts, who determined that their citation characteristics would distort the results of the overall rankings. There is, however, a separate subject ranking for physics and astronomy that is based on papers published exclusively in those fields."
This presumably means that US News is aware of the distorting effect of physics publications with a large number of contributing authors, which has helped propel institutions such as Panjab University and Federico Santa Maria Technical University into high spots in the THE world rankings.
The rankings show that research in the Arab world is dominated by a few countries. Just over half of the universities in the rankings come from three countries, Egypt, Saudi Arabia and Algeria. However, at the very top the rankings are dominated by Saudi Arabia, which holds the first three places.
The Top Ten are:
1. King Saud University, Saudi Arabia
2. King Abdulaziz University, Saudi Arabia
3. King Abdullah University of Science and Technology, Saudi Arabia
4. Cairo University, Egypt
5. American University of Beirut, Lebanon
6. Mansoura University, Egypt
7. Ain Shams University, Egypt
8. King Fahd University of Petroleum and Minerals, Saudi Arabia
9. Alexandria University, Egypt
10. United Arab Emirates University
There are also 16 subject rankings. Every one of these is topped by a Saudi university except for Social Sciences, which is headed by the American University of Beirut. In first place in the Physics rankings is King Abdulaziz University, which has benefited from those multi-contributor publications which feature at least one of its adjunct faculty with a double affiliation.
Universite Cadi Ayyad Marrakech, Morocco, which was declared by THE to be the best Arab university and best in Africa north of the Kalahari, is in thirtieth place here. I wonder why.
Tuesday, November 04, 2014
The Taiwan (NTU) Rankings
These rankings are entirely research based and make no attempt to measure teaching quality. They also have a very strong bias against the humanities and social sciences: the London School of Economics and the Stockholm School of Economics do not appear at all.
They tend to reward size rather than quality so that Johns Hopkins is in 2nd place and Caltech 36th. The emphasis on citations gives a boost to medical schools like the University of California San Francisco and Rockefeller University.
The results, with these limitations, are quite reasonable.
Publisher
National Taiwan University
Scope
Global. Data provided for 500 universities; 903 ranked, selected from Essential Science Indicators and other rankings.
Indicators
Number of articles 2003-13: 10%
Number of articles 2013: 15%
Number of citations 2003-13: 15%
Number of citations 2012-13: 10%
Average number of citations 2003-13: 10%
h-index 2012-13: 10%
Number of highly cited papers 2003-13: 15%
Number of articles in high-impact journals 2012: 15%
Top Ten (total score)
Place | University |
---|---|
1 | Harvard University |
2 | Johns Hopkins University |
3 | Stanford University |
4 | University of Toronto |
5 | University of Washington Seattle |
6 | University of California Los Angeles |
7 | University of Michigan Ann Arbor |
8= | University of California Berkeley |
8= | Massachusetts Institute of Technology |
8= | University of Oxford |
Countries with Universities in the Top Hundred
Country | Number of Universities |
---|---|
USA | 45 |
UK | 8 |
Netherlands | 7 |
Germany | 5 |
Canada | 5 |
Australia | 4 |
China | 4 |
Sweden | 3 |
Japan | 3 |
France | 2 |
Belgium | 2 |
Denmark | 2 |
Switzerland | 2 |
Singapore | 1 |
Spain | 1 |
Italy | 1 |
Finland | 1 |
Brazil | 1 |
Taiwan | 1 |
Norway | 1 |
South Korea | 1 |
Top Ranked in Region (Total Score)
Region | University |
---|---|
North America | Harvard |
Africa | University of Cape Town |
Europe | Oxford University |
Latin America | Universidade de Sao Paulo |
Asia | University of Tokyo |
Central and Eastern Europe | Charles University in Prague |
Arab World | King Abdulaziz University |
Middle East | Tel Aviv University |
Oceania | University of Melbourne |
Noise Index
In the top 20, the NTU rankings are more volatile than the THE world rankings but less so than QS, with the average university moving up or down 1.2 places since last year.
Ranking | Average Place Change of Universities in the top 20 |
---|---|
NTU rankings 2013-14 | 1.20 |
THE World rankings 2013-14 | 0.70 |
QS World Rankings 2013-2014 | 1.45 |
ARWU 2013 -2014 | 0.65 |
Webometrics 2013-2014 | 4.25 |
Center for World University Ranking (Jeddah) 2013-2014 | 0.90 |
Looking at the top 100 universities, the NTU rankings are very volatile, since several of the indicators cover a one or two year period. The average university has changed 7.3 places over the year.
Ranking | Average Place Change of Universities in the top 100 |
---|---|
NTU rankings 2013-14 | 7.30 |
THE World Rankings 2013-2014 | 4.34 |
QS World Rankings 2013-14 | 3.94 |
ARWU 2013 -2014 | 4.92 |
Webometrics 2013-2014 | 12.08 |
Center for World University Ranking (Jeddah) 2013-2014 | 10.59 |
Note: universities falling out of the top 100 are treated as though they fell to 101st position.
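For anyone who wants to reproduce these figures, the calculation is roughly as sketched below, with placeholder ranks: the average absolute change in position for universities in last year's top 100, with drop-outs treated as falling to 101st.

```python
# Rough sketch of the "Noise Index": the mean absolute change in rank
# between two editions. Universities that drop out of the top N are
# treated as falling to position N + 1. The ranks below are placeholders.
def noise_index(last_year, this_year, top_n=100):
    changes = []
    for uni, old_rank in last_year.items():
        if old_rank > top_n:
            continue  # only universities that started in the top N
        # Falling below (or out of) the top N counts as position N + 1.
        new_rank = min(this_year.get(uni, top_n + 1), top_n + 1)
        changes.append(abs(new_rank - old_rank))
    return sum(changes) / len(changes)

last = {"Alpha U": 1, "Beta U": 50, "Gamma U": 99}
this = {"Alpha U": 2, "Beta U": 45, "Gamma U": 105}  # Gamma drops out of the top 100
print(round(noise_index(last, this), 2))  # (1 + 5 + 2) / 3 = 2.67
```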
Friday, October 31, 2014
Initial Comments on the US News Global Rankings
It was a bit of a surprise when US News & World Report (USNWR) announced that they were going global but perhaps it shouldn't have been. USNWR has been ranking American colleges since the early 1980s, making even the Shanghai Centre for World-Class Universities or QS look like novices. Also, with the advance of globalisation of higher education and research, there is now a market for comparisons of US universities with their international competitors.
The Best Global Universities rankings are research based, except for two indicators, each with a 5% weighting, that count PhD degrees. They are also heavily citation-oriented, with a huge 42.5% weighting going to citations. However, the US News staff have used their common sense and included four measures of citations: normalized citation impact, total citations, number of highly cited papers and percentage of highly cited papers.
The result of this is that many of the high fliers in this year's THE rankings are absent. Bogazici University in Turkey, 14th best in Asia according to THE, is missing. So is Federico Santa Maria Technical University in Chile, according to THE the second best in Latin America, and Panjab University, supposedly the second best in India.
The reason for this contrast is simply that THE and Thomson Reuters rewarded these institutions for a few physics papers with hundreds of participating institutions by using a very inappropriate methodology and giving it a 30% weighting. USNWR have trimmed this indicator to 10% and so the high fliers have been grounded.
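Some rough arithmetic, with invented indicator scores, shows what trimming the citations weight from 30% to 10% does to an institution whose only strong suit is citations (the real rankings use many separate indicators; this collapses everything else into a single "other" score):

```python
# Invented indicator scores for a university that is weak everywhere except
# citations. Cutting the citations weight from 30% to 10% sharply lowers
# its composite score.
citations, other = 99.0, 30.0

the_style_score   = 0.30 * citations + 0.70 * other
usnwr_style_score = 0.10 * citations + 0.90 * other
print(round(the_style_score, 1), round(usnwr_style_score, 1))  # 50.7 36.9
```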
Friday, October 17, 2014
The university rankings business gets bigger and bigger
US News is going global. There are three different Arab/ MENA rankings on the way. Now, QS is getting ready for further growth. This is from Education Investor.
Posted on: 16/10/2014
Exclusive: QS seeks £10m investment
The university rankings provider QS is looking to sell a £10 million stake in its business, EducationInvestor understands.
According to its website, QS runs websites and events that connect graduates and employers. But it is best known for its World University Rankings, which it claims are “the most widely read university comparison of their kind”.
Three sources close to the matter said a deal was on the table, and one said that first round bids had already been submitted. QS wants to raise the cash “half to buy out an existing shareholding and half to use as growth capital”.
However, Nunzio Quacquarelli, managing director and majority shareholder of QS, told EducationInvestor that the firm was “looking at all options, both debt and possibly structured finance”.
“We are looking for some external funding to support our rapid growth. Our vision is to be a leading information company in the higher education sector with global ambitions and [with this funding] we aim to continue on this path.”
QS operates in over 70 countries, and has more than 200 staff and 1,200 clients. Its valuation hasn’t been publicised, but the firm is understood to have an EBITDA of £3.3 million and revenue of £19.8 million.
According to one source, the deal is expected to complete later in the fourth quarter.
Tuesday, October 14, 2014
Shanghai without the Awards
Updated. The link to the site is here.
The Center for World-Class Universities (CWU) at Shanghai Jiao Tong University has produced an interesting new ranking by removing the Alumni and Awards indicators from its Academic Ranking of World Universities. These indicators have been criticised for allowing western universities to live off their intellectual capital and ignoring the rise of newcomers in Asia.
So what would ARWU look like without the Nobel and Fields awards?
At the very top things are the same. Harvard is still first and Stanford second. But Cambridge goes down and Oxford goes up.
Universities that would benefit significantly from deleting these indicators include Michigan, rising from 23rd to 13th, Pennsylvania State University from 58th to 35th, University of Florida, Tsinghua University, Alberta, Peking, Sao Paulo, Tel Aviv, Zhejiang and Scuola Normale Pisa, which would rise to the 201-300 band.
CWU have calculated the ratio between places in ARWU and the Alternative Ranking. The higher the score the greater the benefit from the Awards and Alumni indicators. The biggest gainers from Nobel and Fields laureates are Princeton, Moscow State University and Paris Sud (11).
The countries that have benefited most from these indicators are the USA, France and Germany.
It looks as though the ARWU has favoured the Ivy League, continental European universities and Cambridge at the expense of American public universities and the rising stars of Asia.
Saturday, October 11, 2014
Another Global Ranking
Just when you thought you could stop reading about rankings.
For several years the US News & World Report (USNWR), publishers of America's Best Colleges, repackaged the QS World University Rankings and just put its own stamp on them for the American public.
Now, USNWR has announced that it is going into the global rankings business. It seems that this time they will produce completely new rankings that have nothing to do with the Times Higher Education (THE) rankings. There will also be regional, country and subject rankings.
The data, however, will come from Thomson Reuters (TR), who are also the data providers for the THE world rankings and for two of the indicators, Highly Cited Researchers and Publications, in the Shanghai ARWU rankings. It is definitely unhealthy if TR are going to supply the data, or some of it, for three of the four well-known world rankings.
Bob Morse says that the new rankings will be "powered by Thomson Reuters InCites™ research analytics solutions". Does this mean that universities who do not join InCites will not be ranked? Will universities be allowed to opt in or opt out? Will all data come from TR? Will the survey be shared with THE or will there be another one?
Tuesday, October 07, 2014
The Times Higher Education World University Rankings
Publisher
Times Higher Education
Scope
Global. Data provided for 400 universities. Over 800 ranked.
Top Ten
Place | University |
---|---|
1 | California Institute of Technology (Caltech) |
2 | Harvard University |
3 | Oxford University |
4 | Stanford University |
5 | Cambridge University |
6 | Massachusetts Institute of Technology (MIT) |
7 | Princeton University |
8 | University of California Berkeley |
9= | Imperial College London |
9= | Yale University |
Countries with Universities in the Top Hundred
Country | Number of Universities |
---|---|
USA | 45 |
UK | 11 |
Germany | 6 |
Netherlands | 6 |
Australia | 5 |
Canada | 4 |
Switzerland | 3 |
Sweden | 3 |
South Korea | 3 |
Japan | 2 |
Singapore | 2 |
Hong Kong | 2 |
China | 2 |
France | 2 |
Belgium | 2 |
Italy | 1 |
Turkey | 1 |
Top Ranked in Region
Region | University |
---|---|
North America | California Institute of Technology (Caltech) |
Africa | University of Cape Town |
Europe | Oxford University |
Latin America | Universidade de Sao Paulo |
Asia | University of Tokyo |
Central and Eastern Europe | Lomonosov Moscow State University |
Arab World | University of Marrakech Cadi Ayyad |
Middle East | Middle East Technical University |
Oceania | University of Melbourne |
Noise Index
In the top 20, this year's THE world rankings are less volatile than the previous edition and this year's QS rankings. They are still slightly less stable than the Shanghai rankings.
Ranking | Average Place Change of Universities in the top 20 |
---|---|
THE World rankings 2013-14 | 0.70 |
THE World Rankings 2012-2013 | 1.20 |
QS World Rankings 2013-2014 | 1.45 |
ARWU 2013 -2014 | 0.65 |
Webometrics 2013-2014 | 4.25 |
Center for World University Ranking (Jeddah) 2013-2014 | 0.90 |
Looking at the top 100 universities, the THE rankings are more stable than last year. The average university in the top 100 in 2013 rose or fell 4.34 places. The QS rankings are now more stable than the THE or Shanghai rankings.
Ranking | Average Place Change of Universities in the top 100 |
---|---|
THE World Rankings 2013-2014 | 4.34 |
THE World Rankings 2012-2013 | 5.36 |
QS World Rankings 2013-14 | 3.94 |
ARWU 2013 -2014 | 4.92 |
Webometrics 2013-2014 | 12.08 |
Center for World University Ranking (Jeddah) 2013-2014 | 10.59 |
Note: universities falling out of the top 100 are treated as though they fell to 101st position.
Saturday, October 04, 2014
How to win citations and rise in the rankings
A large part of the academic world has either been congratulating itself on performing well in the latest Times Higher Education (THE) world rankings, the data for which is provided by Thomson Reuters (TR), or complaining that only large injections of public money will keep their universities from falling into the great pit of the unranked.
Some, however, have been baffled by some of the placings reported by THE this year. Federico Santa Maria Technical University in Chile is allegedly the fourth best university in Latin America, Scuola Normale Superiore di Pisa the best in Italy and Turkish universities are apparently the rising stars of the academic world.
When there is a university that appears to be punching above its weight, the cause often turns out to be the citations indicator:
Scuola Normale Superiore di Pisa is 63rd in the world with an overall score of 61.9 but a citations score of 96.4.
Royal Holloway, University of London is 118th in the world with an overall score of 53 but a citations score of 98.9.
The University of California Santa Cruz is top of the world for citations with an overall score of 53.7 and 100 for citations.
Bogazici University is 139th in the world with an overall score of 51.1 and a citations score of 96.8.
Federico Santa Maria Technical University in Valparaiso is in the 251-275 band so the total score is not given, although it would be easy enough to work out. It has a score of 99.7 for citations.
So what is going on?
The problem lies with various aspects of Thomson Reuters' methodology.
First they use field normalisation. That means that they do not simply count the number of citations but compare the number of citations in 250 fields with the world average in each field. Not only that, but they compare each year in which the paper is cited with the world average of citations for that year.
The rationale for this is that the number of citations and the rapidity with which papers are cited vary from field to field. A paper reporting a cure for cancer or the discovery of a new particle will be cited hundreds of times within weeks. A paper in philosophy, economics or history may languish for years before anyone takes notice. John Muth's work on rational expectations was hardly noticed or cited for years before eventually starting a revolution in economic theory. So universities should be compared to the average for fields and years. Otherwise, those that are strong in the humanities and social sciences will be penalised.
Up to a point this is not a bad idea, but it does assume that all disciplines are equally valuable and demanding. If the world has decided that it will fund medical research or astrophysics, and support journals and pay researchers to read and cite other researchers' papers, rather than media studies or education, then this is perhaps something rankers and data collectors should take account of.
In any case, by normalising for so many fields and then throwing normalisation by year into the mix, TR increase the likelihood of statistical anomalies. If someone can get a few dozen citations within a couple of years after publication in a field where citations, especially early ones, average below one a year then this could give an enormous boost to a university's citation score. That is precisely what happened with Alexandria University in 2010. Methodological tweaking has mitigated the risk to some extent but not completely. A university could also get a big boost by getting credit, no matter how undeserved, for a breakthrough paper or a review that is widely cited.
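A crude sketch of the normalisation arithmetic, with invented world-average baselines (the real procedure uses around 250 fields and paper-level expected citation counts), shows how a handful of early citations in a low-citation field can swamp everything else:

```python
# Crude sketch of field- and year-normalised citation impact. The expected
# (world-average) citation counts per field and year are invented; the real
# Thomson Reuters procedure works across roughly 250 fields.
def normalised_impact(papers, expected):
    """papers: list of (field, year, citations); expected: dict keyed by (field, year)."""
    ratios = [cites / expected[(field, year)] for field, year, cites in papers]
    return sum(ratios) / len(ratios)

expected = {("particle physics", 2012): 40.0, ("philosophy", 2013): 0.8}
# One philosophy paper with 40 early citations scores fifty times the world
# average; a physics paper with the same 40 citations scores exactly average.
print(normalised_impact([("philosophy", 2013, 40)], expected))        # 50.0
print(normalised_impact([("particle physics", 2012, 40)], expected))  # 1.0
```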
So let's take a look at some of the influential universities in the 2014 THE rankings. Scuola Normale Superiore di Pisa (SNSP) is a small research-intensive institution that might not even meet the criteria to be ranked by TR. Its output is modest, 2,407 publications in the Web of Science core collection between 2009 and 2013, although for a small institution that is quite good.
One of those publications is 'Observation of a new boson...' in Physics Letters B in September 2012, which has been cited 1,631 times.
The paper has 2,896 "authors", whom I counted by looking for semicolons in the "find" box, affiliated to 228 institutions. Five of them are from SNSP.
To put it crudely, SNSP is making an "authorship" contribution of 0.17% to the paper but getting 100% of the citation credit, as does every other contributor. Perhaps its researchers are playing a leading role in the Large Hadron Collider project or perhaps it has made a disproportionate financial contribution, but TR provide no reason to think so.
The University of the Andes, supposedly the second best university in Latin America, is also a contributor to this publication, as is Panjab University, supposedly the second best institution in the Indian subcontinent.
Meanwhile, Royal Holloway, University of London has contributed to 'Observation of a new particle...' in the same journal and issue. This has received 1,734 citations and involved 2,932 authors from 267 institutions, along with Tokyo Metropolitan University, Federico Santa Maria Technical University, Middle East Technical University and Bogazici University.
The University of California Santa Cruz is one of 119 institutions that contributed to the 'Review of particle physics...' 2010 which has been cited 3,739 times to date. Like all the other contributors it gets full credit for all those citations.
It is not just the number of citations that boosts citation impact scores but also their occurrence within a year or two of publication so that the number of citations is much greater than the average for that field and those years.
The proliferation of papers with hundreds of authors is not confined to physics. There are several examples from medicine and genetics as well.
At this point the question arises: why not divide the citations for each paper among its authors? This is an option available in the Leiden Ranking so it should not be beyond TR's technical capabilities.
Or why not stop counting multi-authored publications when they exceed a certain quota of authors? This is exactly what TR did earlier this year when collecting data for its new highly cited researchers lists. Physics papers with more than 30 institutional affiliations were omitted, a very sensible procedure that should have been applied across the board.
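A sketch, using the figures quoted above and made-up helper names, of the difference between full counting, the fractional counting offered as an option by Leiden, and the affiliation cap TR applied to its highly cited researchers list:

```python
# Citation credit for a mega-authored paper under three counting schemes.
# Figures are those quoted above for 'Observation of a new boson...'.
citations = 1631
total_authors = 2896
authors_from_small_institute = 5

# Full counting: every contributing institution gets all the citations.
full_credit = citations                                               # 1631
# Fractional counting (a Leiden Ranking option): credit in proportion
# to the institution's share of the author list.
fractional_credit = citations * authors_from_small_institute / total_authors
print(round(fractional_credit, 1))                                    # about 2.8

# The cap TR used for its highly cited researchers list: drop physics
# papers with more than 30 institutional affiliations altogether.
def counts_towards_score(n_institutions, cap=30):
    return n_institutions <= cap
print(counts_towards_score(228))                                      # False
```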
So basically, one route to success in the rankings is to get into a multi-collaborator, mega-cited project.
But that is not enough in itself. There are hundreds of universities contributing to these publications, yet not all of them reap such disproportionate benefits. It is important not to publish too much. A dozen LHC papers will do wonders if you publish 400 or 500 papers a year. Four thousand a year and it will make little difference. One reason for the success of otherwise obscure institutions is that the number of papers by which the citations are divided is small.
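Some invented arithmetic makes the dilution effect concrete: the same dozen mega-cited papers dominate the average of a few hundred papers but disappear into an output of several thousand.

```python
# Invented arithmetic: a dozen mega-cited LHC papers (normalised impact 50
# each) averaged into small and large publication counts, with every other
# paper cited at the world average (impact 1.0).
def mean_impact(n_papers, n_mega, mega_score=50.0, ordinary_score=1.0):
    return (n_mega * mega_score + (n_papers - n_mega) * ordinary_score) / n_papers

print(round(mean_impact(400, 12), 2))   # 2.47 - a big boost for a small university
print(round(mean_impact(4000, 12), 2))  # 1.15 - barely noticeable for a large one
```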
So why on earth are TR using a method that produces such laughable results? Let's face it, if any other ranker put SNS Pisa, Federico Santa Maria or Bogazici at the top of its flagship indicator we would go deaf from the chorus of academic tut-tutting.
TR, I suspect, are doing this because this method is identical or nearly identical to that used for their InCites system for evaluating individual academics within institutions, which appears very lucrative, and they do not want the expense and inconvenience of recalculating data.
Also perhaps, TR have become so enamoured of the complexity and sophistication of their operations that they really do think that they have actually discovered pockets of excellence in unlikely places that nobody else has the skill or the resources to even notice.
But we have not finished. There is one more element in TR's distinctive methodology, and that is the regional modification introduced in 2011.
This means that the normalised citation impact score of the university is divided by the square root of the impact score of the country in which it is located. A university located in a low scoring country will get a bonus that will be greater the lower the country's impact score. This would clearly be an advantage to countries like Chile, India and Turkey.
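In formula terms, with invented country scores, the effect is simply this: the lower the national impact score, the bigger the bonus.

```python
import math

# Sketch of the regional modification: a university's normalised citation
# impact is divided by the square root of its country's impact score.
# The country scores below are invented for illustration.
def regionally_modified(university_impact, country_impact):
    return university_impact / math.sqrt(country_impact)

print(regionally_modified(2.0, 1.00))  # 2.0 - no change in a country at the world average
print(regionally_modified(2.0, 0.25))  # 4.0 - the same raw impact doubled in a low-scoring country
```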
Every year there are more multi-authored, multi-cited papers. It would not be surprising if university presidents start scanning the author lists of publications like the Review of Particle Physics, send out recruitment letters and get ready for ranking stardom.
Friday, October 03, 2014
Scuola Normale Superiore di Pisa: Are they dancing in the streets?
There is a lot of coverage of that huge pocket of excellence in Italy at ROARS: Return on Academic Research. I am still trying to make sense of the Google translation.
The university with more research influence than.....
With apologies to the Sydney Morning Herald which had a headline about Caltech.
I don't know if the people of Valparaiso are aware that they are home to a world-class university, but if they do find out this might be a nice headline.
Federico Santa Maria Technical University. More research influence than Princeton, Stanford, Harvard, Oxford, Cambridge, Yale, Duke, Bogazici, Colorado School of Mines.... (insert as you wish.)
Thursday, October 02, 2014
Which universities have the greatest research influence?
Times Higher Education (THE) claims that its Citations: Research Influence indicator, prepared by Thomson Reuters (TR), is the flagship of its World University Rankings. It is strange, then, that the magazine has never published a research influence ranking, although that ought to be just as interesting as its Young Universities Ranking, Reputation Rankings or gender index.
So let's have a look at the top 25 universities in the world this year ranked for research influence, measured by field- and year- normalised citations, by Thomson Reuters.
Santa Cruz and Tokyo Metropolitan have the same impact as MIT. Federico Santa Maria Technical University is ahead of Princeton. Florida Institute of Technology beats Harvard. Bogazici University and Scuola Normale Superiore do better than Oxford and Cambridge.
Are they serious?
Apparently. There will be an explanation in the next post. Meanwhile go and check if you don't believe me. And let me know if there's any dancing in the streets of Valparaiso, Pisa, Golden or Istanbul.
Rank and Score for Citations: Research Influence 2014-15 THE World Rankings
Rank | University | Score |
---|---|---|
1= | University of California Santa Cruz | 100 |
1= | MIT | 100 |
1= | Tokyo Metropolitan University | 100 |
4 | Rice University | 99.9 |
5= | Caltech | 99.7 |
5= | Federico Santa Maria Technical University, Chile | 99.7 |
7 | Princeton University | 99.6 |
8= | Florida Institute of Technology | 99.2 |
8= | University of California Santa Barbara | 99.2 |
10= | Stanford University | 99.1 |
10= | University of California Berkeley | 99.1 |
12= | Harvard University | 98.9 |
12= | Royal Holloway University of London | 98.9 |
14 | University of Colorado Boulder | 97.4 |
15 | University of Chicago | 97.3 |
16= | Washington University of St Louis | 97.1 |
16= | Colorado School of Mines | 97.1 |
18 | Northwestern University | 96.9 |
19 | Bogazici University, Turkey | 96.8 |
20 | Duke University | 96.6 |
21= | Scuola Normale Superiore Pisa, Italy | 96.4 |
21= | University of California San Diego | 96.4 |
23 | Boston College | 95.9 |
24 | Oxford University | 95.5 |
25= | Brandeis University | 95.3 |
25= | UCLA | 95.3 |
Tuesday, September 30, 2014
What good is reputation?
There is an excellent analysis by Alex Usher of Higher Education Strategy Associates of the reputation indicators in the THE and QS world rankings.
Main points include:
- THE and QS are both insufficiently transparent about their reputation surveys and it is very difficult to judge their reliability.
- The numbers responding to the THE survey are very small outside the top 50, and this could cause substantial changes in total scores because of a small increase or decrease in the number of votes.
- The lack of transparency is influenced by commercial motives.
THE has been dropping Twitter hints about interesting changes in the forthcoming rankings. Are these due to swings in the votes on the surveys?
Or could it be the Large Hadron Collider Citation Amplifier?
Monday, September 29, 2014
Ranking Status Wars
It looks like Times Higher Education is pulling ahead of QS in the ranking status war.
From Asahi Shimbun in Japan:
"Only the University of Tokyo and Kyoto University made the top 100 of the World University Rankings released in October last year, placing 23rd and 52nd respectively. The rankings are decided by British educational journal Times Higher Education."
Meanwhile, a Norwegian study of rankings (analysis here, original report here) examines only the Shanghai ARWU, the Leiden Rankings and the THE World University Rankings.
Thursday, September 25, 2014
How the Universities of Huddersfield, East London, Plymouth, Salford, Central Lancashire et cetera helped Cambridge overtake Harvard in the QS rankings
It is a cause of pride for the great and the good of British higher education that the country's universities do brilliantly in certain global rankings. Sometimes though, there is puzzlement about how UK universities can do so well even though the performance of the national economy and the level of adult cognitive skills are so mediocre.
In the latest QS World University Rankings Cambridge and Imperial College London pulled off a spectacular feat when they moved ahead of Harvard into joint second place behind MIT, an achievement at first glance as remarkable as Leicester City beating Manchester United. Is this a tribute to the outstanding quality of teaching, inspired leadership or cutting edge research, or perhaps something else?
Neither Cambridge nor Imperial does very well in the research based rankings. Cambridge is 18th and Imperial 26th among higher education institutions in the latest Scimago rankings for output and 32nd and 33rd for normalised impact (citations per paper adjusted for field). Harvard is 1st and 4th for these indicators. In the CWTS Leiden Ranking, Cambridge is 22nd and Imperial 32nd for the mean normalised citation score, sometimes regarded as the flagship of these rankings, while Harvard is 6th.
It is true that Cambridge does much better on the Shanghai Academic Ranking of World Universities with fifth place overall, but that is in large measure due to an excellent score, 96.6, for alumni winning Nobel and Fields awards, some dating back several decades. For Highly Cited Researchers and publications in Nature and Science its performance is not nearly so good.
Looking at the THE World University Rankings, which make some attempt to measure factors other than research, Cambridge and Imperial come in 7th and 10th overall, which is much better than they do in the Leiden and Scimago rankings. However, it is very likely that the postgraduate teaching and research surveys made a significant contribution to this performance. Cambridge is 4th in the THE reputation rankings based on last year's data and Imperial is 13th.
Reputation is also a key to the success of Cambridge and Imperial in the QS world rankings. Take a look at the scores and positions of Harvard, Cambridge and Imperial in the rankings just released.
Harvard gets 100 points (2nd place) for the academic survey, employer survey (3rd), and citations per faculty (3rd). It has 99.7 for faculty student ratio (29th), 98.1 for international faculty (53rd), and 83.8 for international students (117th). Harvard's big weakness is its relatively small percentage of international students.
Cambridge is in first place for the academic survey and 2nd in the employer survey, in both cases with a score of 100 and one place ahead of Harvard. The first secret of Cambridge's success is that it does much better on reputational measures than for bibliometric or other objective data. It was 18th for faculty student ratio, 73rd for international faculty, 50th for international students and 40th for citations per faculty.
So, Cambridge is ahead for faculty student ratio and international students and Harvard is ahead for international faculty and citations per faculty. Both get 100 for the two surveys.
Similarly, Imperial has 99.9 points for the academic survey (14th), 100 for the employer survey (7th), 99.8 for faculty student ratio (26th), 100 for international faculty (41st), 99.7 (20th) for international students and 96.2 (49th) for citations per faculty. It is behind Harvard for citations per faculty but just enough ahead for international students to squeeze past into joint second place.
The second secret is that QS's standardisation procedure combined with an expanding database means that the scores of the leading universities in the rankings are getting more and more squashed together at the top. QS turns its raw data into Z scores so that universities are measured according to their distance in standard deviations from the mean for all ranked universities. If the number of sub-elite universities in the rankings increases then the overall means for the indicators will fall and the scores of universities at the top end will rise as their distance in standard deviations from the mean increases.
Universities with scores of 98 and 99 will now start getting scores of 100. Universities with recorded scores of 100 will go on getting 100, although they might go up a few invisible decimal points.
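A small simulation, with entirely made-up indicator values, shows the effect: holding a handful of elite scores fixed while padding the pool with sub-elite universities lowers the mean and pushes the Z score of the same raw score upwards.

```python
# Simulation sketch with invented data: enlarging the ranked pool with
# low-scoring universities inflates the Z scores of those at the top.
import statistics

def z_score(x, values):
    return (x - statistics.mean(values)) / statistics.pstdev(values)

elite = [90, 80, 75, 70]            # hypothetical raw indicator values
pool_2008 = elite + [50] * 613      # roughly the size of the 2008 exercise
pool_2014 = elite + [50] * 830      # roughly the size of the 2014 exercise

print(round(z_score(80, pool_2008), 1))   # smaller Z with the smaller pool
print(round(z_score(80, pool_2014), 1))   # larger Z once more universities are added
```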
In 2008, QS ranked 617 universities. In that year, nine universities had a score of 100 for the academic survey, four for the employer survey, nine for faculty student ratio, six for international faculty, six for international students and seven for citations per faculty.
By 2014 QS was ranking over 830 universities (I assume that those at the end of the rankings marked "NA" are there because they got votes in the surveys but are not ranked because they fail to meet the criteria for inclusion). For each indicator the number of universities getting a score of 100 increased. In 2014 there were 13 universities with a score of 100 for the academic survey, 14 for the employer survey, 16 for faculty student ratio, 41 for international faculty, 15 for international students and 10 for citations per faculty.
In 2008 Harvard got the same score as Cambridge for the academic and employer surveys. It was 0.3 (0.06 weighted) behind for faculty student ratio, 0.6 (0.03 weighted) behind for international faculty, and 14.1 (0.705 weighted) behind for international students. It was, however, 11.5 points (2.3 weighted) ahead for citations per faculty. Harvard was therefore first and Cambridge third.
By 2014 Cambridge had fallen slightly behind Harvard for international faculty. It was slightly ahead for faculty student ratio. Scores for the surveys remained the same, 100 for both places. Harvard reduced the gap for international students slightly.
What made the difference in 2014 and put Cambridge ahead of Harvard was the citations per faculty indicator. In 2008 Harvard, in fifth place for citations with a score of 100, was 11.5 points (2.3 weighted) ahead of Cambridge. In 2014 Cambridge had improved a bit on this indicator -- it was 40th instead of 49th -- but the decisive change was that it now got 97.9 points, reducing the difference with Harvard to 2.1 points (0.42 weighted). That was just enough to let Cambridge overtake Harvard.
Cambridge's rise between 2008 and 2014 was thus largely due to the increasing number of ranked universities which led to lower means for each indicator which led to higher Z scores at the top of each indicator and so reduced the effect of Cambridge's comparatively lower citations per faculty score.
The same thing happened to Imperial. It did a bit better for citations, rising from 58th to 49th place, and this brought a rise in points from 83.10 to 96.20, again allowing it to creep past Harvard.
Cambridge and Harvard should be grateful to those universities filling up the 701+ category at the bottom of the QS rankings. They are the invisible trampoline that propelled "Impbridge" into second place, just behind MIT.
QS should think carefully about adding more universities to their rankings. Another couple of hundred and there will be a dozen universities at the top getting 100 for everything.
Monday, September 22, 2014
Using Ig Nobel awards for ranking countries
Since 1991 Improbable Research has awarded prizes for research that makes people laugh and then think. Highlights this year include dung beetles navigating by starlight, the reaction of reindeer to humans disguised as polar bears and the ethical inferiority of people who can't get up in the morning.
The Ig® Nobel Interactive Database publishes a series of charts, one of which indicates the countries that produce such cutting-edge research. Here are the top ten. The funny thing is that it looks similar to the top ten of countries with universities in the top 100 in the QS rankings. The big difference is that Japan does better for Ig Nobel prizes than it does in the QS and the other rankings.
Country | % of Ig Nobel awards |
---|---|
1. USA | 34.7 |
2. UK | 12.3 |
3. Japan | 9.9 |
4. Australia | 5.5 |
5. France | 3.7 |
6. Netherlands | 3.5 |
7. Canada | 2.8 |
8. Italy | 2.6 |
9. Switzerland | 2.1 |
10. China | 1.4 |
Country | Number of Universities in QS top 100 |
---|---|
1. USA | 28 |
2. UK | 19 |
3. Australia | 8 |
4. Netherlands | 7 |
5. Canada | 5 |
6. Switzerland | 4 |
7. Japan | 4 |
8= Germany | 3 |
8= China | 3 |
8= Korea | 3 |
8= Hong Kong | 3 |
The Uses of Rankings
It is getting difficult to avoid university rankings. They seem to be everywhere, with advertisements in railway stations in the English Midlands proclaiming that the local university is in the top 100 for something and newspaper articles in Malaysia reporting the latest news from QS.
Even Rod Liddle, the Spectator's curmudgeon in residence, has taken notice of the rankings and used them to mount a half-hearted defence of British culture against a scathing attack by the Portuguese academic João Magueijo, before retreating and conceding that the assault is pretty much on target.
"We might also mention, quietly, that he [Magueijo] has a post at one of the world’s top ten universities and that at least three other British universities are in that top ten, but there is not a Portuguese university in the top 200 (if they have universities)."
Sunday, September 21, 2014
Reactions to the QS Rankings 2014
Here is a selection of headlines and quotations about the latest QS world rankings from around the world.
'Australian National University moves into top 25 global rankings', The Guardian
'Strong Showing For Singapore & Hong Kong In 2014/2015 QS Rankings'
Asian Scientist
'BGU rises 39 places in QS World University Rankings to break in to top 300 this year'
Ben Gurion University of the Negev
The QS--"Quirky Silliness"--world university rankings are back
Leiter Reports: A Philosophy Blog
'St. Petersburg State University has not been able to get into the list of the top 200 QS'
Recent News Technology
'University rankings show decline in Irish colleges'
Irish Times
'While U.K. universities raked [sic] highly in the QS World University rankings, the scarcity of European peers should be worrying for the rest of the EU.'
The Wall Street Journal
'Four British institutions ranked in top six of world's universities'
Times Live
'Estonia has reached record high in the QS World University Rankings'
University of Tartu
'Mahidol leads Thai universities in Asia'
Bangkok Post
'No Indian varsity in top 200 global ranking'
Indian Express
'UP and Ateneo's University Rankings Worldwide 2014-2015 Improved'
Inquirer
Thursday, September 18, 2014
QS World University Rankings 2014
Publisher
QS (Quacquarelli Symonds)
Scope
Global. 701+ universities.
Top Ten
Place | University |
---|---|
1 | MIT |
2= | Cambridge |
2= | Imperial College London |
4 | Harvard |
5 | Oxford |
6 | University College London |
7 | Stanford |
8 | California Institute of Technology (Caltech) |
9 | Princeton |
10 | Yale |
Countries with Universities in the Top Hundred
Country | Number of Universities |
---|---|
USA | 28 |
UK | 19 |
Australia | 8 |
Netherlands | 7 |
Canada | 5 |
Switzerland | 4 |
Japan | 4 |
Germany | 3 |
China | 3 |
Korea | 3 |
Hong Kong | 3 |
Denmark | 2 |
Singapore | 2 |
France | 2 |
Sweden | 2 |
Ireland | 1 |
Taiwan | 1 |
Finland | 1 |
Belgium | 1 |
New Zealand | 1 |
Top Ranked in Region
Region | University |
---|---|
North America | MIT |
Africa | University of Cape Town |
Europe | Cambridge, Imperial College London |
Latin America | Universidade de Sao Paulo |
Asia | National University of Singapore |
Central and Eastern Europe | Lomonosov Moscow State University |
Arab World | King Fahd University of Petroleum and Minerals |
Middle East | Hebrew University of Jerusalem |
Noise Index
In the top 20, this year's QS world rankings are less volatile than the previous edition but more so than the THE rankings or the Shanghai ARWU. The universities in the 2013 top 20 rose or fell an average of 1.45 places this year. The most remarkable change was the rise of Imperial College and Cambridge to joint second place, behind MIT and ahead of Harvard.
Ranking | Average Place Change of Universities in the top 20 |
---|---|
QS World Rankings 2013-2014 | 1.45 |
QS World Rankings 2012-2013 | 1.70 |
ARWU 2013 -2014 | 0.65 |
Webometrics 2013-2014 | 4.25 |
Center for World University Ranking (Jeddah) 2013-2014 | 0.90 |
THE World Rankings 2012-2013 | 1.20 |
Looking at the top 100 universities, the QS rankings are little different from last year. The average university in the top 100 moved up or down 3.94 places compared to 3.97 between 2012 and 2013. These rankings are more reliable than this year's ARWU, which was affected by the new lists of highly cited researchers, and last year's THE rankings.
Ranking | Average Place Change of Universities in the top 100 |
---|---|
QS World Rankings 2013-14 | 3.94 |
QS World Rankings 2012-2013 | 3.97 |
ARWU 2013 -2014 | 4.92 |
Webometrics 2013-2014 | 12.08 |
Center for World University Ranking (Jeddah) 2013-2014 | 10.59 |
THE World Rankings 2012-2013 | 5.36 |
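For what it is worth, the noise index used in the two tables above is simply the mean absolute change in rank for the universities that were in one year's top N, tracked into the following year. A minimal sketch with hypothetical rank data:

```python
# Minimal sketch of the noise index: average place change for the
# universities in year one's top N. Rank data here are illustrative.

def noise_index(ranks_year1, ranks_year2, top_n=20):
    changes = [abs(ranks_year2[u] - r)
               for u, r in ranks_year1.items() if r <= top_n]
    return sum(changes) / len(changes)

ranks_2013 = {"MIT": 1, "Harvard": 2, "Cambridge": 3}
ranks_2014 = {"MIT": 1, "Harvard": 4, "Cambridge": 2}
print(noise_index(ranks_2013, ranks_2014, top_n=3))   # (0 + 2 + 1) / 3 = 1.0
```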
Methodology (from topuniversities)
1. Academic reputation (40%)
Academic reputation is measured using a global survey, in which academics are asked to identify the institutions where they believe the best work is currently taking place within their field of expertise.
For the 2014/15 edition, the rankings draw on almost 63,700 responses from academics worldwide, collated over three years. Only participants’ most recent responses are used, and they cannot vote for their own institution. Regional weightings are applied to counter any discrepancies in response rates.
The advantage of this indicator is that it gives a more equal weighting to different discipline areas than research citation counts. Whereas citation rates are far higher in subjects like biomedical sciences than they are in English literature, for example, the academic reputation survey weights responses from academics in different fields equally.
It also gives students a sense of the consensus of opinion among those who are by definition experts. Academics may not be well positioned to comment on teaching standards at other institutions, but it is well within their remit to have a view on where the most significant research is currently taking place within their field.
2. Employer reputation (10%)
The employer reputation indicator is also based on a global survey, taking in almost 28,800 responses for the 2014/15 edition. The survey asks employers to identify the universities they perceive as producing the best graduates. This indicator is unique among international university rankings.
The purpose of the employer survey is to give students a better sense of how universities are viewed in the job market. A higher weighting is given to votes for universities that come from outside of their own country, so it’s especially useful in helping prospective students to identify universities with a reputation that extends beyond their national borders.
3. Student-to-faculty ratio (20%)
This is a simple measure of the number of academic staff employed relative to the number of students enrolled. In the absence of an international standard by which to measure teaching quality, it provides an insight into the universities that are best equipped to provide small class sizes and a good level of individual supervision.
4. Citations per faculty (20%)
This indicator aims to assess universities’ research output. A ‘citation’ means a piece of research being cited (referred to) within another piece of research. Generally, the more often a piece of research is cited by others, the more influential it is. So the more highly cited research papers a university publishes, the stronger its research output is considered.
QS collects this information using Scopus, the world’s largest database of research abstracts and citations. The latest five complete years of data are used, and the total citation count is assessed in relation to the number of academic faculty members at the university, so that larger institutions don’t have an unfair advantage.
5 & 6. International faculty ratio (5%) and international student ratio (5%)
The last two indicators aim to assess how successful a university has been in attracting students and faculty members from other nations. This is based on the proportion of international students and faculty members in relation to overall numbers. Each of these contributes 5% to the overall ranking results.
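Putting the six weights together, the overall score is essentially a weighted sum of the indicator scores. The sketch below uses Harvard's 2014 indicator scores quoted earlier in this post; note that QS rescales the final totals, so this simple sum will not reproduce the published overall score exactly.

```python
# Sketch of the overall QS score as a weighted sum of the six indicators.
# Indicator scores are Harvard's 2014 figures quoted earlier in this post.

WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def overall_score(scores):
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

harvard_2014 = {
    "academic_reputation": 100, "employer_reputation": 100,
    "faculty_student_ratio": 99.7, "citations_per_faculty": 100,
    "international_faculty": 98.1, "international_students": 83.8,
}
print(round(overall_score(harvard_2014), 1))   # about 99.0 before QS's final rescaling
```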