Nature has long been regarded as the best, or one of the two best, scientific journals in the world. Papers published there and in Science account for 20% of the weighting in Shanghai Jiao Tong University's Academic Ranking of World Universities, the same weight given to Nobel and Fields awards or to publications in the whole of the Science Citation and Social Science Citation Indexes.
Sceptics may wonder whether Nature has seen better years and is perhaps sliding away from the pinnacle of scientific publishing. It has had some embarrassing moments in recent decades, including the publication of a 1978 paper that gave credence to the alleged abilities of the psychic Uri Geller, the report of a study by Jacques Benveniste and others that purported to show that water has a memory, the questionable "hockey stick" article on global warming in 1998 and seven retracted papers on superconductivity by Jan Hendrik Schon.
But it still seems that Nature is highly regarded by the global scientific community and that the recently published Nature Publication Index is a reasonable guide to current trends in scientific research. The index counts the number of publications in Nature in 2013.
The USA remains on top, with Harvard first, MIT second and Stanford third, although China continues to make rapid progress. For many parts of the world (Latin America, Southern Europe, Africa) scientific achievement is extremely limited. Looking at the Asia-Pacific rankings, much of the region, including Indonesia, Bangladesh and the Philippines, is almost a scientific desert.
Sunday, March 30, 2014
Sunday, March 23, 2014
At Last! A Really Useful Ranking
Wunderground lists the top 25 snowiest universities in the US.
The top five are:
1. Syracuse University
2. Northern Arizona University (that's interesting)
3. The University at Buffalo: SUNY
4. Montana State University
5. University of Colorado Boulder
Monday, March 17, 2014
Reactions to the THE Reputation Rankings
The public appetite for university rankings seems insatiable, even when they just contain recycled data from last year's international rankings. Here is a sample of headlines about the recent Times Higher Education University Reputation Rankings, which consist of data included in the 2013-14 World University Rankings.
UC Berkeley 6th in worldwide university reputation ranking
UW-Madison ranked 28th among universities worldwide
Southern schools have worse reputations than Northern peers
UCLA rated world's No.2 public university
Times Higher Education Reputation Rankings Names UCSD as 40th Best University
Leading UK universities sliding down world rankings
Israel's Technion Ranked Among World's Top 100 Universities for 2014
No Indian university in top 100 global list, Harvard voted best
NZ universities rank among the world's best
Britain dropping down world university league table
IIT Bombay, IISc Bangalore falls from world top 200 rank: Survey
Survey: Caltech, UCLA Among Top 10 Global University Brands
Irish universities fail to make lecturers' most prestigious list
Four NL universities in top 100
University of Cape Town's Global Rankings Drops
Oxbridge slides down the latest university world rankings as US institutions tighten their grip on the top spots
Asian Universities Rising in Global Rankings
Tuesday, March 04, 2014
Reactions to the QS Subject Rankings
It looks as though the QS subject rankings are a big hit. Here is just a sample of headlines and quotations from around the world.
World Ranking Recognises Agricultural Excellence at Lincoln [New Zealand]
CEU [Central European University, Hungary] Programs Rank Among the World's Top 100
Boston-Area Schools Rank Top in the World in These 5 Fields
"Cardiff has been ranked as one of the top universities in the world in a number of different subjects, according to a recent international league table."
NTU [National Taiwan University] leads local universities making QS rankings list
Swansea University continues to excel in QS world subject rankings
Penn State Programs Rank Well in 2014 QS World Rankings by Subject
Anna Varsity [India] Enters Top 250 in QS World Univ Rankings
Moscow State University among 200 best in the world
New Ranking Says Harvard And MIT Are The Best American Universities For 80% of Academic Subjects
QS: The University of Porto ranked among the best in the world
4 Indian Institutions in 2014 World Ranking
"The Institute of Education [London] has been ranked as the world's leading university for Education in the 2014 QS World University Rankings."
Nine UvA [University of Amsterdam] subject areas listed in QS World University Rankings top 50
"The University of Newcastle's [Australia] Civil and Structural Engineering discipline has surged in the QS World University Rankings by Subject list"
Sunday, March 02, 2014
The QS Subject Rankings: Reposting
QS have come out with their 2014 University Rankings by Subject, three months earlier than last year. Maybe this is to get ahead of Times Higher Education, whose latest Reputation Rankings will be published next week.
The methodology of these rankings has not changed since last year, so I am just reposting my article, which was first published in the Philippine Daily Inquirer on 27th May 2013 and then reposted here on 29th May 2013.
The QS University Rankings by Subject: Warning
It is time for the Philippines to think about constructing its own objective and transparent ranking or rating systems for its colleges and universities that would learn from the mistakes of the international rankers.
The ranking of universities is getting to be big business these days. There are quite a few rankings appearing from Scimago, Webometrics, University Ranking of Academic Performance (from Turkey), the Taiwan Rankings, plus many national rankings.
No doubt there will be more to come.
In addition, the big three of the ranking world—Quacquarelli Symonds (QS), Times Higher Education and Shanghai Jiao Tong University’s Academic Ranking of World Universities—are now producing a whole range of supplementary products, regional rankings, new university rankings, reputation rankings and subject rankings.
There is nothing wrong, in principle, with ranking universities. Indeed, it might be in some ways a necessity. The problem is that there are very serious problems with the rankings produced by QS, even though they seem to be better known in Southeast Asia than any of the others.
This is especially true of the subject rankings.
No new data
The QS subject rankings, which have just been released, do not contain new data. They are mostly based on data collected for last year’s World University Rankings—in some cases extracted from the rankings and, in others, recombined or recalculated.
There are four indicators used in these rankings. They are weighted differently for the different subjects and, in two subjects, only two of the indicators are used.
The four indicators are:
A survey of academics or people who claim to be academics or used to be academics, taken from a variety of sources. This is the same indicator used in the world rankings. Respondents were asked to name the best universities for research.
A survey of employers, which seems to comprise anyone who chooses to describe himself or herself as an employer or a recruiter.
The number of citations per paper. This is a change from the world rankings, where the calculation is citations per faculty member.
H-index. This is something that is easier to illustrate than to define. If a university publishes one paper and the paper is cited once, then it gets an index of one. If it publishes two or more papers and at least two of them are cited at least twice each, then the index is two, and so on. This is a way of combining the quantity of research with its quality, as measured by influence on other researchers.
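Since the h-index crops up in several rankings, here is a minimal sketch of how it can be computed from a list of citation counts. This only illustrates the definition above; the function name and the sample citation counts are hypothetical, not QS data.

```python
def h_index(citation_counts):
    # The h-index is the largest h such that at least h papers
    # have been cited at least h times each.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank        # this paper still supports a larger index
        else:
            break           # counts are sorted, so no later paper can help
    return h

# Hypothetical example: five papers with these citation counts
print(h_index([5, 4, 3, 2, 1]))   # 3 -- three papers are cited at least three times each
```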
Out of these four indicators, three are about research and one is about the employability of a university’s graduates.
These rankings are not at all suitable for use by students wondering where they should go to study, whether at undergraduate or graduate level.
The only part that could be of any use is the employer review and that has a weight ranging from 40 percent for accounting and politics to 10 percent for arts and social science subjects, like history and sociology.
But even if the rankings are to be used just to evaluate the quantity or quality of research, they are frankly of little use. They are dominated by the survey of academic opinion, which is not of professional quality.
There are several ways in which people can take part in the survey. They can be nominated by a university, they can sign up themselves, they can be recommended by a previous respondent or they can be asked because they have subscribed to an academic journal or an online database.
Apart from checking that they have a valid academic e-mail address, it is not clear whether QS makes any attempt to check whether the survey respondents are really qualified to make any judgements about research.
Not plausible
The result is that the academic survey and also the employer survey have produced results that do not appear plausible.
In recent years, there have been some odd results from QS surveys. My personal favorite is the New York University Tisch School of the Arts, which set up a branch in Singapore in 2007 and graduated its first batch of students from a three-year Film course in 2010. In the QS Asian University Rankings of that year, the Singapore branch got zero for the other criteria (presumably the school did not submit data) but it was ranked 149th in Asia for academic reputation and 114th for employer reputation.
Not bad for a school that had yet to produce any graduates when the survey was taken early in the year.
In all of the subject rankings this year, the two surveys account for at least half of the total weighting and, in two cases, Languages and English, all of it.
Consequently, while some of the results for some subjects may be quite reasonable for the world top 50 or the top 100, after that they are sometimes downright bizarre.
The problem is that although QS has a lot of respondents worldwide, when it gets down to the subject level there can be very few. In pharmacy, for example, there are only 672 for the academic survey and in materials science 146 for the employer survey. Since the leading global players will get a large share of the responses, this means that universities further down the list will be getting a handful of responses for the survey. The result is that the order of universities in any subject in a single country like the Philippines can be decided by just one or two responses to the surveys.
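To get a feel for how fragile an ordering based on a handful of responses can be, here is a small illustrative simulation. The respondent numbers, probabilities and trial counts are invented for the sake of the example and have nothing to do with QS's actual data.

```python
import random

random.seed(0)

def wrong_order_rate(p_better=0.6, respondents=5, trials=20_000):
    # Two universities; each respondent names the genuinely stronger one
    # with probability p_better. Return the fraction of simulated surveys
    # in which the stronger university fails to come out ahead.
    wrong = 0
    for _ in range(trials):
        votes_strong = sum(random.random() < p_better for _ in range(respondents))
        if votes_strong * 2 <= respondents:   # the weaker university ties or wins
            wrong += 1
    return wrong / trials

print(wrong_order_rate(respondents=5))    # roughly 0.3: the published order flips in about a third of surveys
print(wrong_order_rate(respondents=500))  # close to 0: a large sample is far more stable
```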
Another problem is that, after a few obvious choices like Harvard, MIT (Massachusetts Institute of Technology) and Tokyo, most respondents probably rely on a university’s general reputation and that can lead to all sorts of distortions.
Many of the subject rankings at the country level are quite strange. Sometimes they even include universities that do not offer courses in that subject. We have already seen that there are universities in the Philippines that are ranked for subjects that they do not teach.
Somebody might say that maybe they are doing research in a subject while teaching in a department with a different name, such as an economic historian teaching in the economics department but publishing in history journals and getting picked up by the academic survey for history.
Maybe, but it would not be a good idea for someone who wants to study history to apply to that particular university.
Another example is from Saudi Arabia, where King Fahd University of Petroleum and Minerals was apparently top for history, even though it does not have a history department or indeed anything where you might expect to find a historian. There are several universities in Saudi Arabia that may not teach history very well but at least they do actually teach it.
These subject rankings may have a modest utility for students who can pick or choose among top global universities and need some idea whether they should study engineering at SUNY (State University of New York) Buffalo (New York) or Leicester (United Kingdom) or linguistics at Birmingham or Michigan.
But they are of very little use for anyone else.
Thursday, February 20, 2014
Changing Responses to the QS Academic Survey
QS have published an interactive map showing the percentage distribution of the 62,084 responses to their academic survey in 2013. These are shown in tabular form below. In brackets is the percentage of the 3,069 responses in 2007. The symbol -- means that the percentage response was below 0.5 in 2007 and not indicated by QS. There is no longer a link to the 2007 data but the numbers were recorded in a post on this blog on the 4th of December 2007.
The proportion of respondents from the USA rose substantially between 2007 and 2013. There were also increases for European countries such as the UK, Italy, Germany, France, Spain, Hungary, Russia, Netherlands and Portugal although there were declines for some smaller countries like Belgium, Denmark, Sweden and Switzerland.
The percentage of respondents from Japan and Taiwan rose but there were significant falls for India, China, Malaysia, Hong Kong, New Zealand, Australia, Singapore, Indonesia and the Philippines.
The most notable change is the growing number of responses from Latin America, including Brazil, Mexico, Chile, Argentina and Colombia.
US 17.4 (10.0)
UK 6.5 (5.6)
Brazil 6.3 (1.1)
Italy 4.7 (3.3)
Germany 3.8 (3.0)
Canada 3.4 (4.0)
Australia 3.2 (3.5)
France 2.9 (2.4)
Japan 2.9 (1.9)
Spain 2.7 (2.3)
Mexico 2.6 (0.8)
Hungary 2.0 --
Russia 1.7 (0.7)
India 1.7 (3.5)
Chile 1.7 --
Ireland 1.6 (1.5)
Malaysia 1.5 (3.2)
Belgium 1.4 (2.6)
Hong Kong 1.4 (1.9)
Taiwan 1.3 (0.7)
Netherlands 1.2 (0.6)
New Zealand 1.2 (4.1)
Singapore 1.2 (2.5)
China 1.1 (1.6)
Portugal 1.1 (0.9)
Colombia 1.1 --
Argentina 1.0 (0.7)
South Africa 1.0 (0.7)
Denmark 0.9 (1.2)
Sweden 0.9 (1.7)
Kazakhstan 0.9 --
Israel 0.8 --
Switzerland 0.8 (1.5)
Austria 0.8 (1.3)
Romania 0.8 --
Turkey 0.7 (1.1)
Pakistan 0.7 --
Norway 0.6 --
Poland 0.6 (0.8)
Thailand 0.6 (0.6)
Finland 0.8 (0.5)
Greece 0.7 (0.7)
Ukraine 0.5 --
Indonesia 0.5 (1.2)
Czech 0.5 --
Peru 0.4 --
Slovenia 0.4 --
Saudi Arabia 0.4 --
Lithuania 0.4 --
Uruguay 0.3 --
Philippines 0.3 (1.8)
Bulgaria 0.3 --
UAE 0.3 --
Egypt 0.3 --
Paraguay 0.2 --
Jordan 0.2 --
Nigeria 0.2 --
Latvia 0.2 --
Venezuela 0.2 --
Estonia 0.2 --
Ecuador 0.2 --
Slovakia 0.2 --
Iraq 0.2 --
Jamaica 0.1 --
Azerbaijan 0.1 --
Iran 0.1 (0.7)
Palestine 0.1 --
Cyprus 0.1 --
Kuwait 0.1 --
Bahrain 0.1 --
Vietnam 0.1 --
Algeria 0.1 --
Puerto Rico 0.1 --
Costa Rica 0.1 --
Brunei 0.1 --
Panama 0.1 --
Taiwan 0.1 --
Sri Lanka 0.1 --
Oman 0.1 --
Iceland 0.1 --
Qatar 0.1 --
Bangladesh 0.1 --
The SIRIS Lab
The SIRIS Lab has some interesting visualizations of the THE and QS rankings for 2013 and the changing Shanghai Rankings from 2003 to 2013 (thanks to wowter.net).
Be warned. They can get quite addictive.
Tuesday, February 18, 2014
The New Webometrics Rankings
The latest Webometrics rankings are out.
In the overall rankings the top five are:
1. Harvard
2. MIT
3. Stanford
4. Cornell
5. Columbia.
Looking at the indicators one by one, the top five for presence (number of webpages in the main webdomain) are:
1. Karolinska Institute
2. National Taiwan University
3. Harvard
4. University of California San Francisco
5. PRES Universite de Bordeaux.
The top five for impact (number of external inlinks received from third parties) are:
1. University of California Berkeley
2. MIT
3. Harvard
4. Stanford
5. Cornell.
The top five for openness (number of rich files published in dedicated websites) are:
1. University of California San Francisco
2. Cornell
3. Pennsylvania State University
4. University of Kentucky
5. University of Hong Kong.
The top five for excellence (number of papers in the 10% most cited category) are:
1. Harvard
2. Johns Hopkins
3. Stanford
4. UCLA
5. Michigan
Saturday, February 08, 2014
The Triple Package
I have just finished reading The Triple Package by Amy Chua and Jed Rubenfeld, a heavily anecdotal book that tells us, as every reader of the New York Times now knows, what really determines success.
An irritating thing is the presentation of urban legends -- no dogs, no Cubans and so on -- and generalizations to support the authors' thesis.
Here is one example: "men like Alfred Kazin, Norman Mailer, Delmore Schwartz, Saul Bellow, Clement Greenberg, Norman Podhoretz, and so many of the New York intellectuals who grew up excluded from anti-Semitic bastions of education and culture but went on to become famous writers and critics".
Alfred Kazin went to City College of New York when it was a selective institution. Norman Mailer went to Harvard at the age of 16 and, after serving in the army, to the Sorbonne. Delmore Schwartz attended Columbia, the University of Wisconsin and New York University and did postgraduate work at Harvard with Alfred North Whitehead. Saul Bellow was at the University of Chicago and then Northwestern. He was also a postgraduate student at the University of Wisconsin. Clement Greenberg studied at Syracuse University. Norman Podhoretz was accepted by Harvard and NYU but went to Columbia, which offered him a full scholarship. He went to Cambridge on a Fulbright and was offered a fellowship at Harvard, which he turned down.
Bellow famously endured several anti-Semitic slights and sneers, as no doubt did the others. But can we really say that they were excluded from bastions of education?
Thursday, February 06, 2014
The Best Universities for Research
It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings start in August. We have had young university rankings, best student cities, most international universities and BRICS rankings.
Something is missing though: a ranking of top universities for research. So, to assuage the pent-up demand, here are the top 20 universities for research according to six different ranking indicators. There is considerable variation, with only two universities, Harvard and Stanford, appearing in every list.
First the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five year period.
1. Harvard
2. Tokyo
3. Toronto
4. Tsinghua
5. Sao Paulo
6. Michigan Ann Arbor
7. Johns Hopkins
8. UCLA
9. Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia
Next we have the normalized impact scores from Scimago, which measure citations to research publications taking account of field. This might be considered a measure of the quality of research rather than quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers.
1. MIT
2. Harvard
3. University of California San Francisco
4= Stanford
4= Princeton
6. Duke
7. Rice
8. Chicago
9= Columbia
9= University of California Berkeley
9= University of California Santa Cruz
12. University Of California Santa Barbara
13. Boston University
14= Johns Hopkins
14= University of Pennsylvania
16. University of California San Diego
17= UCLA
17= University of Washington
17= Washington University of St Louis
20. Oxford
The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field so medical schools and technological institutes can do very well.
1. Weizmann Institute of Science
2. Caltech
3. Rockefeller University
4. Harvard
5. Stanford
6. Gwangju Institute of Science and Technology
7. UCLA
8. University of California San Francisco
9. Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell
The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low impact scores. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, providing that total publications do not rise too much.
1= MIT
1= Tokyo Metropolitan University
3= University of California Santa Cruz
3= Rice
5. Caltech
6. Princeton
7. University of California Santa Barbara
8. University of California Berkeley
9= Harvard
9= Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14. University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19. Washington University of St Louis
20. Boston College
The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year so this indicator could become less stable.
1. Harvard
2. Stanford
3. MIT
4. University of California Berkeley
5. Princeton
6. Michigan Ann Arbor
7. University of California San Diego
8. Yale
9. University of Pennsylvania
10. UCLA
11= Caltech
11= Columbia
13. University of Washington
14. Cornell
15. Cambridge.
16. University of California San Francisco
17. Chicago
18. University of Wisconsin Madison
19. University of Minnesota Twin Cities
20. Oxford
Finally, the MNCS indicator from the Leiden Ranking, which is the number of field normalized citations per paper. It is possible for a few widely cited papers in the right discipline to have a disproportionate effect. The high placing for Gottingen results from a single computer science paper the citation of which is required for intellectual property reasons.
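As a rough sketch of what a field-normalized citation score involves: each paper's citations are divided by the world average for its field and publication year, and the ratios are averaged. This is only an illustration of the general idea, not Leiden's exact procedure, and the baselines and paper counts below are invented.

```python
# Hypothetical world-average citations per paper, by field and publication year
FIELD_BASELINE = {
    ("molecular biology", 2012): 12.0,
    ("mathematics", 2012): 2.5,
}

def mncs(papers):
    # papers is a list of (citations, field, year) tuples
    ratios = [cites / FIELD_BASELINE[(field, year)] for cites, field, year in papers]
    return sum(ratios) / len(ratios)

# Three hypothetical papers: one unusually well-cited maths paper dominates the average,
# which is the kind of distortion mentioned above.
print(mncs([(6, "molecular biology", 2012),    # 0.5 times its field average
            (3, "mathematics", 2012),          # 1.2 times its field average
            (50, "mathematics", 2012)]))       # 20 times its field average -> MNCS of about 7.2
```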
1. MIT
2. Gottingen
3. Princeton
4. Caltech
5. Stanford
6. Rice
7. University of California Santa Barbara
8. University of California Berkeley
9. Harvard
10. University of California Santa Cruz
11. EPF Lausanne
12. Yale
13. University of California San Francisco
14. Chicago
15. University of California San Diego
16. Northwestern
17. University of Colorado Boulder
18. Columbia
19. University of Texas Austin
20. UCLA
Tuesday, February 04, 2014
Will Global Rankings Boost Higher Education in Emerging Countries?
My article in University World News can be accessed here.
Monday, February 03, 2014
India and the World Rankings
There is an excellent article in Asian Scientist by Prof Pushkar of BITS Pilani that questions the developing obsession in India with getting into the top 100 or 200 of the world rankings.
Prof Pushkar observes that Indian universities have never done well in global rankings. He says:
"there is no doubt that Indian universities need to play ‘catch up’ in order to place more higher education institutions in the top 400 or 500 in the world. It is particularly confounding that a nation which has sent a successful mission to Mars does not boast of one single institution in the top 100. “Not even one!” sounds like a real downer. Whether one considers the country a wannabe “major” power or an “emerging” power (or not), it is still surprising that India’s universities do not make the grade."
and
"It is also rather curious that the “lost decades” of India’s higher education – the 1980s and the 1990s – coincided with a period when the country registered high rates of economic growth. The neglect of higher education finally ended when the National Knowledge Commission drew attention to a “quiet crisis” in its 2006 report."
Even so:
"(d)espite everything that is wrong with India’s higher education, there is no reason for panic about the absence of its universities in the top 100 or 200. Higher education experts agree that the world rankings of universities are limited in terms of what they measure. Chasing world rankings may do little to improve the overall quality of higher education in the country."
He also refers to the proposal that the Indian Institutes of Technology should combine just for the rankings. Apparently he has been in touch with Phil Baty of THE who is not buying the idea.
I would disagree with Professor Ashok's argument that combining universities would not be a good idea anyway because THE scales some indicators for size. That is true, but the reputation survey is not scaled and adding votes in the survey would be beneficial for a combined institution, if one could be created and then accepted by the rankers. Also, you currently need 200 publications a year to be ranked by THE, so there would be a case for smaller places around the world -- although probably not the IITs -- banding together to get past this threshold.
Saturday, February 01, 2014
Recent Research: Rankings Matter
According to an article by Molly Alter and Randall Reback in Educational Evaluation and Policy Analysis, universities in the USA get more applications if they receive high quality-of-life ratings and fewer if their peers are highly rated academically.
True for your school: How changing reputations alter demand for selective US colleges
Abstract
There is a comprehensive literature documenting how colleges’ tuition, financial aid packages, and academic reputations influence students’ application and enrollment decisions. Far less is known about how quality-of-life reputations and peer institutions’ reputations affect these decisions. This article investigates these issues using data from two prominent college guidebook series to measure changes in reputations. We use information published annually by the Princeton Review—the best-selling college guidebook that formally categorizes colleges based on both academic and quality-of-life indicators—and the U.S. News and World Report—the most famous rankings of U.S. undergraduate programs. Our findings suggest that changes in academic and quality-of-life reputations affect the number of applications received by a college and the academic competitiveness and geographic diversity of the ensuing incoming freshman class. Colleges receive fewer applications when peer universities earn high academic ratings. However, unfavorable quality-of-life ratings for peers are followed by decreases in the college’s own application pool and the academic competitiveness of its incoming class. This suggests that potential applicants often begin their search process by shopping for groups of colleges where non-pecuniary benefits may be relatively high.
Friday, January 31, 2014
Department of Remarkable Coincidences
On the day that QS published their top 50 under-50 universities, Times Higher Education has announced that it will be holding a Young Universities Summit in Miami in April at which the top 100 universities under 50 will be revealed.
Also, the summit will see "a consultative discussion on proposed new rankings metrics designed to better capture innovation and knowledge transfer in world rankings in the future."
Innovation? What could that mean? Maybe counting patents.
Knowledge transfer? Could this mean doing something about the citations indicator? Has someone at THE seen who contributed to multi-author massively cited publications in 2012?
QS Young Universities Rankings
QS have produced a ranking of universities founded in the last fifty years. It is based on data collected for last year's World University Rankings.
The top five are:
1. Hong Kong University of Science and Technology
2. Nanyang Technological University, Singapore
3. Korea Advanced Institute of Science and Technology
4. City University of Hong Kong
5. Pohang University of Science and Technology, Korea
There are no universities from Russia or Mainland China on the list although there is one from Taiwan and another from Kazakhstan.
There are nine Australian universities in the top fifty.
Wednesday, January 29, 2014
The 25 Most International Universities
Times Higher Education has produced a succession of spin-offs from their World University Rankings: rankings of Asian universities, young universities and emerging economies universities, reputation rankings, a gender index.
Now there is a list of the world's most international universities, based on the international outlook indicator in the world rankings. This comprises data on international students, international faculty and international research collaboration.
The top five are:
1. Ecole Polytechnique Federale de Lausanne
2= Swiss Federal Institute of Technology Zurich
2= University of Geneva
4. National University of Singapore
5. Royal Holloway, University of London
Sunday, January 19, 2014
A bright idea from India
Has someone in India been reading this blog?
In a previous post I suggested that universities might improve their scores in the world rankings by merging. That would help in the QS and THE reputation surveys and the publications indicator in the Shanghai rankings.
If he is being reported correctly, the Indian Higher Education Secretary, Ashok Thakur, proposes to go a step further and suggests that all the Indian Institutes of Technology should be assessed together by the international rankers, although presumably continuing to function separately in other respects. According to outlookindia:
"All the 13 IITs may compete as a single unit at the global level for a place among the best in the global ranking list.
Giving an indication in this regard, Higher Education Secretary Ashok Thakur said the idea is to position the IITs as a single unit much like the IIT brand which has become an entity in itself for finding a place among the top three best institutes the world-over.
International ranking agencies such as Times Higher Education and QS World University Ranking would be informed accordingly, he said.
Central universities and other institutes could follow on how the IITs position themselves in the ranking list, he said."
Both QS and THE seem eager to do business in India but this is surely a non-starter. Apart from anything else, it could be followed by all the University of California and other US state university campuses, branches of the National University of Ireland and the Indian Institutes of Science and Management coming together for ranking purposes.
Also, the Secretary should consider that if any IIT follows the lead of Panjab University and joins the Hadron Collider Project or any other multi-contributor, multi-citation project, any gain in the THE citations indicator would be lost if it had to be shared with the other 12 institutes.
Tuesday, January 07, 2014
Explain Please
I have often noticed that some university administrators and educational bureaucrats are clueless about international university rankings, even when their careers depend on a good performance.
The Economic Times of India reports that the Higher Education Secretary in the Human Resource Development Ministry, Ashok Thakur, said "institutions could improve their scores dramatically in Times Higher Education's globally cited World University Rankings as the British magazine has agreed to develop and include India-specific parameters for assessment from the next time."
This sounds like THE is going to insert a new indicator just for India in their world rankings, which is unbelievable. The Hindu puts it a little differently, suggesting that THE is preparing a separate ranking of Indian universities:
"Times Higher Education (THE) — recognised world over for its ranking of higher education institutions — has agreed to draw up an India-specific indicator that would act as a parameter for global education stakeholders and international students to judge Indian educational institutions.
This was disclosed by Higher Education Secretary in the Union Human Resource Development Ministry Ashok Thakur."
It would be interesting to find out what the minister actually said and what, if anything, THE has agreed to.
Ranking News
06/01/14
The latest Times Higher Education international reputation rankings, based on data collected for last year's World University Rankings, will be announced in Tokyo on March 6th.
The number of responses was 10,536 in 2013, down from 16,639 in 2012 and 17,554 in 2011.
Why is the number of responses falling?
Is the decline linked with changes in the scores for the teaching and research indicators, both of which include components based on the survey?
Saturday, December 21, 2013
Twenty Ways to Rise in the Rankings Quickly and Fairly Painlessly
Times Higher Education has just republished an article by Amanda Goodall, ‘Top 20 ways to improve your world university ranking’. Much of her advice is very sensible -- appointing university leaders with a strong research record, for example -- but in most cases the road from her suggestions to a perceptible improvement in the rankings is likely to be winding and very long. It is unlikely that any of her proposals would have much effect on the rankings in less than a decade or even two.
So here are 20 realistic proposals for a university wishing to join the rankings game.
Before starting, any advice about how a university can rise in the rankings should be based on these principles:
· Rankings are proliferating and no doubt there will be more in the future. There is something for almost anybody if you look carefully enough.
· The indicators and methodology of the better known rankings are very different. Something that works with one may not work with another. It might even have a negative effect.
· There is often a price to pay for getting ahead in the rankings. Everybody should consider whether it is worth it. Also, while rising from 300th place to 250th is quite easy, going from 30th to 25th is another matter.
· Don’t forget the number on the bottom. It might be easier to reduce the number of academic staff than to increase the number of citations or publications.
· Rankings are at best an approximation to what universities do. Nobody should get too excited about them.
The top 20 ways in which universities can quickly improve their positions in one or more of the international university rankings are:
1. Get rid of students
Over the years many universities acquire a collection of branch campuses, general studies programmes, night schools, pre-degree programmes and so on. Set them free to become independent universities or colleges. Almost always, these places have relatively more students and relatively fewer faculty than the main campus. The university will therefore do better in the Quacquarelli Symonds (QS) and Times Higher Education (THE) faculty student ratio indicators. Also, staff in the spun off branches and schools generally produce less research than those at the main campus, so you will get a boost in the productivity per capita indicator in the Shanghai ARWU rankings.
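To see the arithmetic, here is a minimal sketch with purely hypothetical staff and student numbers (not any ranker's actual calculation) of how shedding a student-heavy branch flatters the faculty-student ratio:

```python
# A minimal sketch (hypothetical numbers) of why spinning off a
# student-heavy, faculty-light branch flatters the faculty-student ratio.

def faculty_per_student(faculty, students):
    return faculty / students

main_faculty, main_students = 2000, 20000      # hypothetical main campus
branch_faculty, branch_students = 100, 5000    # hypothetical night-school branch

combined = faculty_per_student(main_faculty + branch_faculty,
                               main_students + branch_students)
main_only = faculty_per_student(main_faculty, main_students)

print(f"Combined ratio:   {combined:.3f}")   # 0.084 faculty per student
print(f"Main campus only: {main_only:.3f}")  # 0.100 -- better without the branch
```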
2. Kick out the old and bring in the young
Get rid of ageing professors, especially if unproductive and expensive, and hire lots of indentured servants (sorry: adjunct and temporary teachers and researchers). Again, this will improve the university’s performance on the THE and QS faculty student ratio indicators. They will not count as senior faculty, so this will be helpful for ARWU.
3. Hire research assistants
Recruiting slave labour (that is, cheap or unpaid research assistants; unemployed or unemployable graduate interns?) will boost the score for faculty student ratio in the QS rankings, since QS counts research-only staff for their faculty student indicator. It will not, however, work for the THE rankings. Remember that for QS more faculty are good for faculty student ratio but bad for citations per faculty, so you have to analyse the potential trade-off carefully.
4. Think about an exit option
If an emerging university wants to be included in the rankings it might be better to focus on just one of them. Panjab University is doing very well in the THE rankings but does not appear in the QS rankings. But remember that if you apply to be ranked by THE and you do not like your placing, it is always possible to opt out by not submitting data next year. QS, however, has a Hotel California policy: once in, you can check out but you can never leave. It does not matter how much you complain about the unique qualities of your institution and how they are neglected by the rankers, QS will go on ranking you whether you like it or not.
5. Get a medical school
If you do not have a medical school or a research and/or teaching hospital then get one from somewhere. Merge with an existing one or start your own. If you already have one, get another one. Medical research produces a disproportionate number of papers and citations, which is good for the QS citations per faculty indicator and the ARWU publications indicator. Remember this strategy may not help so much with THE, who use field normalisation: those citations of medical research will help there only if they are above the world average for their field and year.
Update August 2016: QS now have a moderate form of field normalisation so the advantage of a medical school is reduced but the Shanghai rankings are still biased towards medical research.
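As a rough illustration of what field normalisation does, here is a sketch with made-up citation counts and world baselines (not THE's actual formula): each paper's citations are divided by the world average for its field and year before being averaged, so a pile of medical citations only helps if the papers beat their baseline.

```python
# Sketch of field-normalised citation impact (made-up numbers).
# Each paper's citations are divided by the world average for its
# field and year; only papers above that baseline push the score past 1.

papers = [
    # (citations, world average for the paper's field and year)
    (40, 25),   # clinical medicine paper, above baseline
    (10, 25),   # clinical medicine paper, below baseline
    (6,  4),    # mathematics paper, above its (lower) baseline
]

raw_citations = sum(c for c, _ in papers)
normalised_impact = sum(c / baseline for c, baseline in papers) / len(papers)

print(f"Raw citations:     {raw_citations}")          # 56
print(f"Normalised impact: {normalised_impact:.2f}")  # ~1.17
```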
6. But if you are a medical school, diversify
QS and THE supposedly do not include single subject institutions in their general rankings, although from time to time one will, like the University of California at San Francisco, Aston Business School or (National Research Nuclear University) Moscow Engineering Physics Institute (MEPhI), slip through. If you are an independent medical or single subject institution, consider adding one or two more subjects; then QS and THE will count you, although you will probably start sliding down the ARWU table.
Update August 2016: the QS BRICS rankings include some Russian institutions that look like they focus on one field, and National Research Nuclear University MEPhI is back in the THE world rankings.
7. Amalgamate
The Shanghai rankings count the total number of publications in the SCI and SSCI, the total number of highly cited researchers and the total number of papers without regard for the number of researchers. THE and QS count the number of votes in their surveys without considering the number of alumni.
What about a new mega university formed by merging LSE, University College London and Imperial College? Or a très grande école from all those little grandes écoles around Paris?
Update August 2016: This is pretty much what the University of Paris-Saclay is doing.
8. Consider the weighting of the rankings
THE gives a 30% weighting to citations and 2.5% to income from industry. QS gives 40% to its academic survey and 5% to international faculty. So think about where you are going to spend your money.
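A back-of-the-envelope sketch, using the weightings quoted above and a hypothetical ten-point gain on a 0-100 indicator scale, of why the weightings should guide the spending:

```python
# Sketch: effect of a 10-point indicator gain under different weightings.
# Indicator scores are assumed to be on a 0-100 scale; weights follow the
# percentages quoted above.

weights = {
    "THE citations": 0.30,
    "THE industry income": 0.025,
    "QS academic survey": 0.40,
    "QS international faculty": 0.05,
}

gain = 10  # hypothetical 10-point improvement on one indicator

for indicator, w in weights.items():
    print(f"{indicator:26s} +{gain * w:.2f} overall points")
# THE citations              +3.00 overall points
# THE industry income        +0.25 overall points
# QS academic survey         +4.00 overall points
# QS international faculty   +0.50 overall points
```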
9. The wisdom of crowds
Focus on research projects in those fields that have huge multi-“author” publications, particle physics, astronomy and medicine for example. Such publications often have very large numbers of citations. Even if your researchers make a one in two thousandth contribution, Thomson Reuters, THE’s data collector, will give them the same credit as they would get if they were the only authors. This will not work for the Leiden Ranking, which uses fractionalised counting of citations. Note that this strategy works best when combined with number 10.
Update August 2016: THE methodological changes in 2015 mean that this does not work any more. Look at what happened to Middle East Technical University. But it is still worth looking out for projects with dozens or scores of contributors.
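The difference between full and fractional counting is easy to show with invented numbers (a sketch, not Leiden's exact method): a 2,000-author paper with 1,000 citations.

```python
# Sketch: full counting vs fractional counting of a multi-author paper.
# The figures are illustrative only.

citations = 1000
authors = 2000
our_authors = 1      # our university contributed one author out of 2,000

# Full counting (the approach described above): the whole paper is ours.
full_credit = citations                                # 1000 citations credited

# Fractional counting (Leiden-style): credit in proportion to authorship.
fractional_credit = citations * our_authors / authors  # 0.5 citations credited

print(full_credit, fractional_credit)  # 1000 0.5
```

Under full counting a token contribution to a mega-collaboration is worth as much as sole authorship; under fractional counting it is worth almost nothing.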
10. Do not produce too much
You need to produce 200 papers a year to be included in the THE rankings. But producing more papers than this might be counterproductive. If your researchers are producing five thousand papers a year then those five hundred citations from a five hundred “author” report on the latest discovery in particle physics will not have much impact. But if you are publishing three hundred papers a year those citations will make a very big difference. This is why Dr El Naschie’s frequently cited papers in Chaos, Solitons and Fractals were a big boost for Alexandria University but not for Cambridge, Surrey, Cornell and Frankfurt universities, with whom he also claimed affiliation. However, Leiden will not rank universities until they reach 500 papers a year.
Update August 2016: See number 9.
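A rough sketch of the dilution effect, assuming a baseline of two citations per ordinary paper (a made-up figure): the same 500-citation mega-paper moves average citations per paper far more at a small producer than at a large one.

```python
# Sketch: one 500-citation mega-paper diluted by total output.
# Ordinary papers are assumed to average 2 citations each (made up).

def avg_citations(total_papers, baseline_per_paper=2, mega_citations=500):
    ordinary = (total_papers - 1) * baseline_per_paper
    return (ordinary + mega_citations) / total_papers

print(f"{avg_citations(300):.2f}")    # ~3.66 citations per paper
print(f"{avg_citations(5000):.2f}")   # ~2.10 citations per paper
```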
11. Moneyball Strategy
In his book Moneyball, Michael Lewis recounted the ascent of the Oakland A's baseball team through a strategy of buying undervalued players. The idea was to find players who did things that led to their teams winning even if they did not match the stereotype of a talented player.
This strategy was applied by George Mason University in Virginia, which created a top basketball team by recruiting players who were overlooked by scouts because they were too small or too fat, and a top economics department by recruiting advocates of a market economy at a time when such an idea was unfashionable.
Universities could recruit researchers who are prolific and competent but are unpromotable or unemployable because they are in the wrong group or fail to subscribe enthusiastically to current academic orthodoxies. Maybe start with Mark Regnerus and Jason Richwine.
Update August 2016: See the story of Tim Groseclose's move from UCLA to George Mason.
12. Expand doctoral programmes
One indicator in the THE world rankings is the ratio of doctoral to bachelor degree students. Panjab University recently announced that it will introduce integrated master's and doctoral programmes. This could be a smart move if it means students no longer go into master’s programmes but instead into something that can be counted as a doctoral degree programme.
13. The importance of names
Make sure that your researchers know which university they are affiliated to and that they know its correct name. Make sure that branch campuses, research institutes and other autonomous or quasi-autonomous groups incorporate the university name in their publications. Keep an eye on Scopus and ISI and make sure they know what you are called. Be especially careful if you are an American state university.
14. Evaluate staff according to criteria relevant to the rankings
If staff are to be appointed and promoted according to their collegiality, the enthusiasm with which they take part in ISO exercises, community service, ability to make the faculty a pleasant place for everybody or commitment to diversity, then you will get collegial, enthusiastic etc faculty. But those are things that the rankers do not -- for once with good reason -- attempt to measure.
While you are about it, get rid of interviews for staff and students. Their predictive validity ranges from zero to low.
15. Collaborate
The more authors a paper has the more likely it is to be cited, even if it is only self-citation. Also, the more collaborators you have the greater the chances of a good score in the reputation surveys. And do not forget that the percentage of collaborators who are international is also an indicator in the THE rankings.
16. Rebrand
It would be good to have names that are as distinctive and memorable as possible. Consider a name change. Do you really think that the average scientist filling out the QS or the THE reputation surveys is going to remember which of the sixteen (?) Indian Institutes of Technology is especially good in engineering?
Update August 2016: But not too memorable. I doubt that Lovely Professional University will get the sort of public interest it is hoping for.
17. Be proactive
Rankings are changing all the time so think about indicators that might be introduced in the near future. It would seem quite easy, for example, for rankers to collect data about patent applications.
Update August 2016: Make sure everyone keeps their Google Scholar Citations Profiles up to date.
18. Support your local independence movement
It has been known for a long time that increasing the number of international students and faculty is good for both the THE and QS rankings. But there are drawbacks to just importing students. If it is difficult to move students across borders why not create new borders?
If Scotland votes for independence in next year’s referendum its scores for international students and international faculty in the QS and THE rankings would go up, since English and Welsh students and staff would be counted as international.
Update August 2016: Scotland didn't but there may be another chance.
19. Accept that some things will never work
Realise that there are some things that are quite pointless from a rankings perspective. Or any other for that matter. Do not bother telling staff and students to click away at the website to get into Webometrics. Believe it or not, there are precautions against that sort of thing. Do not have motivational weekends. Do not have quality initiatives unless they get rid of the cats.
Update August 2016: That should read do not do anything "motivational". The only thing they motivate is the departure of people with other options.
20. Get Thee to an Island
Leiden Ranking has a little known ranking that measures the distance between collaborators. At the moment the first place goes to the Australian National University. Move to Easter Island or the Falklands and you will be top for something.
Thursday, December 19, 2013
The QS BRICS Rankings
Quacquarelli Symonds (QS), in partnership with Interfax, the Russian news agency, have just published their BRICS [Brazil, Russia, India, China, South Africa] University Rankings. The top ten are:
1. Peking University
2. Tsinghua University
3. Lomonosov Moscow State University
4. Fudan University
5. Nanjing University
6= University of Science and Technology China
6= Shanghai Jiao Tong University
8. Universidade de Sao Paulo
9. Zhejiang University
10. Universidade Estadual de Campinas
The highest ranked Indian university is the Indian Institute of Technology Delhi in thirteenth place and the top South African institution is the University of Cape Town which is eleventh.
The methodology is rather different from the QS World University Rankings. The weighting for the academic survey has been reduced to 30% and that for the employer survey has gone up to 20%. Faculty student ratio accounts for 20% as it does in the world rankings, staff with PhDs for 10%, papers per faculty for 10%, citations per paper for 5% and international faculty and students for 5%.
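For readers who want to see how such weightings combine, here is a minimal sketch, with invented indicator scores for a hypothetical university, of how a weighted composite could be formed from the QS BRICS indicators listed above (QS's actual scaling and normalisation are not reproduced here):

```python
# Sketch of a weighted composite using the QS BRICS weightings quoted above.
# The indicator scores (0-100) are invented for a hypothetical university.

weights = {
    "academic survey": 0.30,
    "employer survey": 0.20,
    "faculty student ratio": 0.20,
    "staff with PhDs": 0.10,
    "papers per faculty": 0.10,
    "citations per paper": 0.05,
    "international faculty and students": 0.05,
}

scores = {
    "academic survey": 72, "employer survey": 65, "faculty student ratio": 80,
    "staff with PhDs": 90, "papers per faculty": 55, "citations per paper": 40,
    "international faculty and students": 30,
}

overall = sum(weights[k] * scores[k] for k in weights)
print(f"Composite score: {overall:.1f}")   # 68.6
```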
There are some noticeable differences between these rankings and the BRICS and emerging countries rankings produced by Times Higher Education and Thomson Reuters.
Moscow State University is ahead of the University of Cape Town in the QS rankings but well behind in the THE rankings.
In the QS rankings the Indian Institutes of Technology are supreme among Indian institutions. There are seven before the University of Calcutta appears in 52nd place. In the THE rankings the best Indian performer was Panjab University, which is absent from the QS rankings.
I suspect that Panjab University is an example of rankings shopping, where universities target one specific ranking, and that there is a very smart person directing its ranking strategy. Panjab University has invested money in participation in the Hadron Collider project, exactly where it would profit from TR's field normalised citations indicator, while the number of publications did not rise excessively. Recently the university has proposed to establish integrated master's and doctoral programs, good for two TR indicators, and to increase research collaboration, good for another.
The Moscow State Engineering Physics Institute, which was removed from the THE world rankings this year because it was a single subject institution, is in 65th place in this table.
Sunday, December 08, 2013
Africa Excels in Latest THE Rankings, China Performs Poorly
It is unlikely that you will see a headline like this in the mainstream media. Experts and analysts have focused almost exclusively on the number of universities in the top 10 or the top 50 or the top 100 of the Times Higher Education (THE) BRICS and Emerging Economies Rankings (BRICSEE) -- powered, in case anyone has forgotten, by Thomson Reuters -- and concluded that China is the undisputed champion of the world.
Looking at the number of universities in the BRICSEE rankings compared to population -- as in the previous post -- gives a different picture with China still ahead of Russia, India and Brazil but not so much.
Another way of analysing a country's higher education system is by looking at the proportion of universities that achieve "world class" status.
Assuming -- a big assumption I agree -- that getting into the BRICSEE top 100 is a measure of world class quality, then the percentage of a country's universities that are world class might be considered a guide to the overall quality of the higher education system.
Here is a ranking of the BRICS and emerging countries according to the percentage of universities in the THE BRICSEE top 100.
The total number of universities is in brackets and is derived from Webometrics.
First place goes to South Africa. Egypt is third. Even Morocco does better than Russia and Brazil. China does not do very well although it is still quite a bit ahead of India, Russia and Brazil. Taiwan remains well ahead of Mainland China.
Of course, this should not be taken too seriously. It is probably a lot harder to start a university in Taiwan than it is in Brazil or India. South Africa has a large number of professional schools and private colleges, not counted by Webometrics, that may be of a similar standard to universities in other countries.
Some of the high fliers might find that their positions are precarious. Egypt's third university in the BRICSEE rankings is Alexandria, which is still reaping the benefits from Dr El Naschie's much cited papers in 2007 and 2008, but that will not last long. The UAE's role as an international higher education hub may not survive a fall in the price of oil.
1. South Africa (25) 20.00%
2. Taiwan (157) 13.38%
3. Egypt (59) 5.08%
4. Turkey (164) 4.27%
5= UAE (50) 4.00%
5= Hungary (75) 4.00%
7. Czech Republic (82) 3.66%
8. Thailand (188) 2.66%
9. Chile (78) 2.56%
10. Malaysia (91) 2.20%
11. China (1164) 1.98%
12. Poland (440) 0.91%
13. India (1604) 0.62%
14. Morocco (212) 0.47%
15. Colombia (285) 0.35%
16. Brazil (1662) 0.24%
17. Mexico (898) 0.22%
18. Russia (1188) 0.17%
19= Indonesia (358) 0%
19= Philippines (265) 0%
19= Pakistan (300) 0%
19= Peru (92) 0%
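The percentages above are straightforward to reproduce: universities in the BRICSEE top 100 divided by the Webometrics count for the country. A quick check with a few of the figures quoted in the list:

```python
# Reproducing the percentages above: BRICSEE top-100 entries divided by
# the Webometrics count of universities for each country.

countries = {
    # country: (universities in BRICSEE top 100, total universities per Webometrics)
    "South Africa": (5, 25),
    "Taiwan": (21, 157),
    "Egypt": (3, 59),
    "China": (23, 1164),
    "India": (10, 1604),
}

for country, (top100, total) in sorted(
        countries.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    print(f"{country:12s} {100 * top100 / total:.2f}%")
# South Africa 20.00%
# Taiwan       13.38%
# Egypt        5.08%
# China        1.98%
# India        0.62%
```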
Thursday, December 05, 2013
The THE BRICS and Emerging Markets Rankings: Good News for China?
Times Higher Education (THE) has just published its BRICS and Emerging Economies Rankings. The methodology is the same as that used in their World University Rankings and the data was supplied by Thomson Reuters. Emerging economies are those listed in the FTSE Emerging Markets Indices.
At first sight, China appears to do very well with Peking University in first place and Tsinghua University in second and a total of 23 universities in the top 100.
Third place goes to the University of Cape Town while National Taiwan University is fourth and Bogazici University in Turkey is fifth.
Taiwan has 21 universities in the Top 100, India 10, Turkey 7 and South Africa and Thailand 5 each.
Although China tops the list of "hot emergent properties", as THE puts it, and Simon Marginson compares the PRC favourably to Russia which is "in the doldrums", we should remember that China does have a large population. When we look at population size, China's achievement shrinks considerably while Taiwan emerges as the undisputed winner, Eastern Europe does very well and the gap between Russia and China is drastically reduced.
The following is the number of universities in the BRICS and Emerging Economies University Rankings per 1,000,000 population (Economist Pocket World in Figures 2010). The total number of universities in the rankings is in brackets.
1. Taiwan (21) 0.913
2. United Arab Emirates (2) 0.400
3= Czech Republic (3) 0.300
3= Hungary (3) 0.300
5. Chile (2) 0.118
6. South Africa (5) 0.114
7. Poland (4) 0.102
8. Turkey (7) 0.093
9= Malaysia (2) 0.077
9= Thailand (5) 0.077
11. Egypt (3) 0.039
12. Morocco (1) 0.031
13. Brazil (4) 0.021
14. Colombia (1) 0.021
15. Mexico (2) 0.018
16. China (23) 0.017
17. Russia (2) 0.014
18. India (10) 0.008
19= Indonesia (0) 0.000
19= Pakistan (0) 0.000
19= Peru (0) 0.000
19= Philippines (0) 0.000
It is very significant that the top two universities in these rankings are in China. But, taking population size into consideration, it looks as though Mainland China is still way behind Taiwan, Singapore and Hong Kong and even the smaller nations of Eastern Europe.
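For the record, the per-million figures are just the count divided by population. A quick sketch with a few of the entries, using approximate populations in millions (rounded, for illustration only):

```python
# Sketch: universities in the rankings per million population.
# Populations (in millions) are approximate and for illustration only.

entries = {
    # country: (universities in the rankings, approximate population in millions)
    "Taiwan": (21, 23),
    "China": (23, 1338),
    "Russia": (2, 142),
    "India": (10, 1186),
}

for country, (unis, pop_m) in entries.items():
    print(f"{country:8s} {unis / pop_m:.3f} per million")
# Taiwan   0.913 per million
# China    0.017 per million
# Russia   0.014 per million
# India    0.008 per million
```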