QS has published an interactive map showing the percentage distribution of the 62,084 responses to its 2013 academic survey. These are shown in tabular form below, with the percentage of the 3,069 responses in 2007 given in brackets. The symbol -- means that the percentage response was below 0.5 in 2007 and was not indicated by QS. There is no longer a link to the 2007 data, but the numbers were recorded in a post on this blog on the 4th of December 2007.
The proportion of respondents from the USA rose substantially between 2007 and 2013. There were also increases for European countries such as the UK, Italy, Germany, France, Spain, Hungary, Russia, the Netherlands and Portugal, although there were declines for some smaller countries like Belgium, Denmark, Sweden and Switzerland.
The percentage of respondents from Japan and Taiwan rose, but there were significant falls for India, China, Malaysia, Hong Kong, New Zealand, Australia, Singapore, Indonesia and the Philippines.
The most notable change is the growing number of responses from Latin America, including Brazil, Mexico, Chile, Argentina and Colombia.
US 17.4 (10.0)
UK 6.5 (5.6)
Brazil 6.3 (1.1)
Italy 4.7 (3.3)
Germany 3.8 (3.0)
Canada 3.4 (4.0)
Australia 3.2 (3.5)
France 2.9 (2.4)
Japan 2.9 (1.9)
Spain 2.7 (2.3)
Mexico 2.6 (0.8)
Hungary 2.0 --
Russia 1.7 (0.7)
India 1.7 (3.5)
Chile 1.7 --
Ireland 1.6 (1.5)
Malaysia 1.5 (3.2)
Belgium 1.4 (2.6)
Hong Kong 1.4 (1.9)
Taiwan 1.3 (0.7)
Netherlands 1.2 (0.6)
New Zealand 1.2 (4.1)
Singapore 1.2 (2.5)
China 1.1 (1.6)
Portugal 1.1 (0.9)
Colombia 1.1 --
Argentina 1.0 (0.7)
South Africa 1.0 (0.7)
Denmark 0.9 (1.2)
Sweden 0.9 (1.7)
Kazakhstan 0.9 --
Israel 0.8 --
Switzerland 0.8 (1.5)
Austria 0.8 (1.3)
Romania 0.8 --
Turkey 0.7 (1.1)
Pakistan 0.7 --
Norway 0.6 --
Poland 0.6 (0.8)
Thailand 0.6 (0.6)
Finland 0.8 (0.5)
Greece 0.7 (0.7)
Ukraine 0.5 --
Indonesia 0.5 (1.2)
Czech Republic 0.5 --
Peru 0.4 --
Slovenia 0.4 --
Saudi Arabia 0.4 --
Lithuania 0.4 --
Uruguay 0.3 --
Philippines 0.3 (1.8)
Bulgaria 0.3 --
UAE 0.3 --
Egypt 0.3 --
Paraguay 0.2 --
Jordan 0.2 --
Nigeria 0.2 --
Latvia 0.2 --
Venezuela 0.2 --
Estonia 0.2 --
Ecuador 0.2 --
Slovakia 0.2 --
Iraq 0.2 --
Jamaica 0.1 --
Azerbaijan 0.1 --
Iran 0.1 (0.7)
Palestine 0.1 --
Cyprus 0.1 --
Kuwait 0.1 --
Bahrain 0.1 --
Vietnam 0.1 --
Algeria 0.1 --
Puerto Rico 0.1 --
Costa Rica 0.1 --
Brunei 0.1 --
Panama 0.1 --
Taiwan 0.1 --
Sri Lanka 0.1 --
Oman 0.1 --
Iceland 0.1 --
Qatar 0.1 --
Bangladesh 0.1 --
Thursday, February 20, 2014
The SIRIS Lab
The SIRIS Lab has some interesting visualizations of the THE and QS rankings for 2013 and the changing Shanghai Rankings from 2003 to 2013 (thanks to wowter.net).
Be warned. They can get quite addictive.
Tuesday, February 18, 2014
The New Webometrics Rankings
The latest Webometrics rankings are out.
In the overall rankings the top five are:
1. Harvard
2. MIT
3. Stanford
4. Cornell
5. Columbia.
Looking at the indicators one by one, the top five for presence (number of webpages in the main webdomain) are:
1. Karolinska Institute
2. National Taiwan University
3. Harvard
4. University of California San Francisco
5. PRES Universite de Bordeaux.
The top five for impact (number of external inlinks received from third parties) are:
1. University of California Berkeley
2. MIT
3. Harvard
4. Stanford
5. Cornell.
The top five for openness (number of rich files published in dedicated websites) are:
1. University of California San Francisco
2. Cornell
3. Pennsylvania State University
4. University of Kentucky
5. University of Hong Kong.
The top five for excellence (number of papers in the 10% most cited category) are:
1. Harvard
2. Johns Hopkins
3. Stanford
4. UCLA
5. Michigan
Saturday, February 08, 2014
The Triple Package
I have just finished reading The Triple Package by Amy Chua and Jed Rubenfeld, a heavily anecdotal book that tells us, as every reader of the New York Times now knows, what really determines success.
An irritating feature is the presentation of urban legends -- no dogs, no Cubans and so on -- and generalizations to support the authors' thesis.
Here is one example: "men like Alfred Kazin, Norman Mailer, Delmore Schwartz, Saul Bellow, Clement Greenberg, Norman Podhoretz, and so many of the New York intellectuals who grew up excluded from anti-Semitic bastions of education and culture but went on to become famous writers and critics".
Alfred Kazin went to City College of New York when it was a selective institution. Norman Mailer went to Harvard at the age of 16 and, after serving in the army, to the Sorbonne. Delmore Schwartz attended Columbia, the University of Wisconsin and New York University and did postgraduate work at Harvard with Alfred North Whitehead. Saul Bellow was at the University of Chicago and then Northwestern, and was also a postgraduate student at the University of Wisconsin. Clement Greenberg studied at Syracuse University. Norman Podhoretz was accepted by Harvard and NYU but went to Columbia, which offered him a full scholarship. He went to Cambridge on a Fulbright and was offered a fellowship at Harvard, which he turned down.
Bellow famously endured several anti-Semitic slights and sneers, and no doubt so did the others. But can we really say that they were excluded from bastions of education?
Thursday, February 06, 2014
The Best Universities for Research
It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings start appearing in August. We have had young university rankings, best student cities, most international universities and BRICS rankings.
Something is missing, though: a ranking of top universities for research. So, to assuage the pent-up demand, here are the top 20 universities for research according to six different ranking indicators. There is considerable variation, with only two universities, Harvard and Stanford, appearing in every list.
First, the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five-year period.
1. Harvard
2. Tokyo
3. Toronto
4. Tsinghua
5. Sao Paulo
6. Michigan Ann Arbor
7. Johns Hopkins
8. UCLA
9. Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia
Next we have the normalized impact scores from Scimago, which measure citations to research publications while taking the field of publication into account. This might be considered a measure of the quality of research rather than its quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers, since the score is calculated per paper rather than per head. A sketch of how this kind of normalization works follows the list.
1. MIT
2. Harvard
3. University of California San Francisco
4= Stanford
4= Princeton
6. Duke
7. Rice
8. Chicago
9= Columbia
9= University of California Berkeley
9= University of California Santa Cruz
12. University of California Santa Barbara
13. Boston University
14= Johns Hopkins
14= University of Pennsylvania
16. University of California San Diego
17= UCLA
17= University of Washington
17= Washington University in St Louis
20. Oxford
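For readers who want the arithmetic spelled out, here is a minimal sketch of field normalization in Python. Scimago does not publish its code, so the fields, baselines and figures below are invented for illustration; only the general per-paper logic is shown.

```python
from collections import defaultdict

# Toy data: (university, field, citations) for each paper.
papers = [
    ("Alpha U", "medicine", 40),
    ("Alpha U", "medicine", 10),
    ("Alpha U", "mathematics", 6),
    ("Beta U",  "mathematics", 2),
    ("Beta U",  "medicine", 25),
]

# Hypothetical world average citations per paper in each field.
world_average = {"medicine": 25.0, "mathematics": 4.0}

def normalized_impact(papers):
    """Mean of (citations / world field average) over a university's papers.

    A score of 1.0 means papers are cited at the world average for their
    fields. Faculty who publish nothing contribute no papers, so they
    cannot drag the score down.
    """
    ratios = defaultdict(list)
    for university, field, citations in papers:
        ratios[university].append(citations / world_average[field])
    return {u: sum(r) / len(r) for u, r in ratios.items()}

print(normalized_impact(papers))
# Alpha U scores about 1.17 and Beta U 0.75; a raw citation count would
# have rewarded the medical papers simply because medicine attracts more
# citations per paper than mathematics does.
```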
The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field, so medical schools and technological institutes can do very well; a sketch of the arithmetic follows the list.
1. Weizmann Institute of Science
2. Caltech
3. Rockefeller University
4. Harvard
5. Stanford
6. Gwangju Institute of Science and Technology
7. UCLA
8. University of California San Francisco
9. Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell
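The arithmetic here is simple division, but a toy example shows why the lack of field normalization matters. This is a minimal sketch with invented figures; QS's data cleaning and scaling are not reproduced.

```python
# name: (total citations over five years, FTE faculty) -- invented numbers
institutions = {
    "Medical Institute": (400_000, 1_500),
    "Comprehensive U":   (600_000, 4_000),
}

for name, (citations, faculty) in institutions.items():
    print(f"{name}: {citations / faculty:.0f} citations per faculty member")

# Medical Institute: 267, Comprehensive U: 150. The medical school comes
# out ahead despite fewer total citations, because heavily cited
# disciplines lift the whole institution when no field correction is made.
```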
The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low citation impact. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, provided that total publications do not rise too much. A sketch of how the regional modification might work follows the list.
1= MIT
1= Tokyo Metropolitan University
3= University of California Santa Cruz
3= Rice
5. Caltech
6. Princeton
7. University of California Santa Barbara
8. University of California Berkeley
9= Harvard
9= Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14. University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19. Washington University in St Louis
20. Boston College
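Neither THE nor Thomson Reuters has published the regional modification in full, but it has been described as dividing a university's field-normalized score by the square root of its country's average score. Treating that description as an assumption, here is a minimal sketch with invented country averages.

```python
import math

# Hypothetical country averages of field-normalized citation impact.
country_average = {"HighImpactLand": 1.2, "LowImpactLand": 0.5}

def regionally_modified(score, country):
    # Assumed formula: divide by the square root of the country mean,
    # which boosts universities in low-impact countries far more than it
    # penalizes those in high-impact ones.
    return score / math.sqrt(country_average[country])

print(round(regionally_modified(1.0, "HighImpactLand"), 2))  # 0.91
print(round(regionally_modified(1.0, "LowImpactLand"), 2))   # 1.41
```

Two universities with identical field-normalized scores end up more than 50 per cent apart purely because of where they are located, which is why this adjustment attracts criticism.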
The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year, so this indicator could become less stable.
1. Harvard
2. Stanford
3. MIT
4. University of California Berkeley
5. Princeton
6. Michigan Ann Arbor
7. University of California San Diego
8. Yale
9. University of Pennsylvania
10. UCLA
11= Caltech
11= Columbia
13. University of Washington
14. Cornell
15. Cambridge
16. University of California San Francisco
17. Chicago
18. University of Wisconsin Madison
19. University of Minnesota Twin Cities
20. Oxford
Finally, there is the MNCS indicator from the Leiden Ranking, the mean number of field-normalized citations per paper. It is possible for a few widely cited papers in the right discipline to have a disproportionate effect: the high placing for Gottingen results from a single computer science paper that must be cited for intellectual property reasons. A sketch of the calculation follows the list.
1. MIT
2. Gottingen
3. Princeton
4. Caltech
5. Stanford
6. Rice
7. University of California Santa Barbara
8. University of California Berkeley
9. Harvard
10. University of California Santa Cruz
11. EPF Lausanne
12. Yale
13. University of California San Francisco
14. Chicago
15. University of California San Diego
16. Northwestern
17. University of Colorado Boulder
18. Columbia
19. University of Texas Austin
20. UCLA
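The MNCS calculation itself is well documented: each paper's citations are divided by the world average for its field and publication year, and the resulting ratios are averaged. The sketch below uses invented baselines and ignores Leiden's refinements such as fractional counting, but it shows both the calculation and the outlier sensitivity behind results like Gottingen's.

```python
# Hypothetical world-average citations by (field, publication year).
field_year_average = {("cs", 2010): 5.0, ("cs", 2011): 4.0}

def mncs(papers):
    """papers: list of (field, year, citations) tuples.
    Returns the mean normalized citation score; 1.0 = world average."""
    ratios = [c / field_year_average[(f, y)] for f, y, c in papers]
    return sum(ratios) / len(ratios)

ordinary = [("cs", 2010, 5), ("cs", 2011, 4), ("cs", 2010, 6)]
outlier = ordinary + [("cs", 2010, 500)]  # one paper cited 100x the norm

print(round(mncs(ordinary), 2))  # 1.07 -- close to the world average
print(round(mncs(outlier), 2))   # 25.8 -- one extreme paper swamps the mean
```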
Tuesday, February 04, 2014
Will Global Rankings Boost Higher Education in Emerging Countries?
My article in University World News can be accessed here.
Monday, February 03, 2014
India and the World Rankings
There is an excellent article in Asian Scientist by Prof Pushkar of BITS Pilani that questions the developing obsession in India with getting into the top 100 or 200 of the world rankings.
Prof Pushkar observes that Indian universities have never done well in global rankings. He says:
"there is no doubt that Indian universities need to play ‘catch up’ in order to place more higher education institutions in the top 400 or 500 in the world. It is particularly confounding that a nation which has sent a successful mission to Mars does not boast of one single institution in the top 100. “Not even one!” sounds like a real downer. Whether one considers the country a wannabe “major” power or an “emerging” power (or not), it is still surprising that India’s universities do not make the grade."
and
"It is also rather curious that the “lost decades” of India’s higher education – the 1980s and the 1990s – coincided with a period when the country registered high rates of economic growth. The neglect of higher education finally ended when the National Knowledge Commission drew attention to a “quiet crisis” in its 2006 report."
Even so:
"(d)espite everything that is wrong with India’s higher education, there is no reason for panic about the absence of its universities in the top 100 or 200. Higher education experts agree that the world rankings of universities are limited in terms of what they measure. Chasing world rankings may do little to improve the overall quality of higher education in the country."
He also refers to the proposal that the Indian Institutes of Technology should combine just for the rankings. Apparently he has been in touch with Phil Baty of THE, who is not buying the idea.
I would disagree with Professor Ashok's argument that combining universities would not be a good idea anyway because THE scales some indicators for size. That is true, but the reputation survey is not scaled, and adding votes in the survey would benefit a combined institution if one could be created and then accepted by the rankers. Also, you currently need 200 publications a year to be ranked by THE, so there would be a case for smaller places around the world -- although probably not the IITs -- banding together to get past this threshold.
Saturday, February 01, 2014
Recent Research: Rankings Matter
According to an article by Molly Alter and Randall Reback in Educational Evaluation and Policy Analysis, universities in the USA get more applications if they receive high quality-of-life ratings and fewer if their peers are highly rated academically.
True for your school: How changing reputations alter demand for selective US colleges
Abstract
There is a comprehensive literature documenting how colleges’ tuition, financial aid packages, and academic reputations influence students’ application and enrollment decisions. Far less is known about how quality-of-life reputations and peer institutions’ reputations affect these decisions. This article investigates these issues using data from two prominent college guidebook series to measure changes in reputations. We use information published annually by the Princeton Review—the best-selling college guidebook that formally categorizes colleges based on both academic and quality-of-life indicators—and the U.S. News and World Report—the most famous rankings of U.S. undergraduate programs. Our findings suggest that changes in academic and quality-of-life reputations affect the number of applications received by a college and the academic competitiveness and geographic diversity of the ensuing incoming freshman class. Colleges receive fewer applications when peer universities earn high academic ratings. However, unfavorable quality-of-life ratings for peers are followed by decreases in the college’s own application pool and the academic competitiveness of its incoming class. This suggests that potential applicants often begin their search process by shopping for groups of colleges where non-pecuniary benefits may be relatively high.