QS have published an interactive map showing the percentage distribution of the 62,084 responses to its academic survey in 2013. These are shown in tabular form below. In brackets is the percentage of the 3,069 responses in 2007. The symbol -- means that the percentage response was below 0.5 in 2007 and not indicated by QS. There is no longer a link to the 2007 data but the numbers were recorded in a post on this blog on the 4th of December 2007.
The proportion of respondents from the USA rose substantially between 2007 and 2013. There were also increases for European countries such as the UK, Italy, Germany, France, Spain, Hungary, Russia, Netherlands and Portugal although there were declines for some smaller countries like Belgium, Denmark, Sweden and Switzerland.
The percentage of respondents from Japan and Taiwan rose but there were significant falls for India, China, Malaysia, Hong Kong, New Zealand, Australia, Singapore, Indonesia and the Philippines.
The most notable change is the growing number of responses from Latin America including Brazil, Mexico, Chile, Argentina and Colombia.
US 17.4 (10.0)
UK 6.5 (5.6)
Brazil 6.3 (1.1)
Italy 4.7 (3.3)
Germany 3.8 (3.0)
Canada 3.4 (4.0)
Australia 3.2 (3.5)
France 2.9 (2.4)
Japan 2.9 (1.9)
Spain 2.7 (2.3)
Mexico 2.6 (0.8)
Hungary 2.0 --
Russia 1.7 (0.7)
India 1.7 (3.5)
Chile 1.7 --
Ireland 1.6 (1.5)
Malaysia 1.5 (3.2)
Belgium 1.4 (2.6)
Hong Kong 1.4 (1.9)
Taiwan 1.3 (0.7)
Netherlands 1.2 (0.6)
New Zealand 1.2 (4.1)
Singapore 1.2 (2.5)
China 1.1 (1.6)
Portugal 1.1 (0.9)
Colombia 1.1 --
Argentina 1.0 (0.7)
South Africa 1.0 (0.7)
Denmark 0.9 (1.2)
Sweden 0.9 (1.7)
Kazakhstan 0.9
Israel 0.8 --
Switzerland 0.8 (1.5)
Austria 0.8 (1.3)
Romania 0.8 --
Turkey 0.7 (1.1)
Pakistan 0.7 --
Norway 0.6 --
Poland 0.6 (0.8)
Thailand 0.6 (0.6)
Finland 0.8 (0.5)
Greece 0.7 (0.7)
Ukraine 0.5 --
Indonesia 0.5 (1.2)
Czech 0.5 --
Peru 0.4 --
Slovenia 0.4 --
Saudi Arabia 0.4 --
Lithuania 0.4 --
Uruguay 0.3 --
Philippines 0.3 (1.8)
Bulgaria 0.3 --
UAE 0.3 --
Egypt 0.3 --
Paraguay 0.2 --
Jordan 0.2 --
Nigeria 0.2 --
Latvia 0.2 --
Venezuela 0.2 --
Estonia 0.2 --
Ecuador 0.2 --
Slovakia 0.2 --
Iraq 0.2 --
Jamaica 0.1 --
Azerbaijan 0.1 --
Iran 0.1 (0.7)
Palestine 0.1 --
Cyprus 0.1 --
Kuwait 0.1 --
Bahrain 0.1 --
Vietnam 0.1 --
Algeria 0.1 --
Puerto Rico 0.1 --
Costa Rica 0.1 --
Brunei 0.1 --
Panama 0.1 --
Taiwan 0.1 --
Sri Lanka 0.1 --
Oman 0.1 --
Iceland 0.1 --
Qatar 0.1 --
Bangladesh 0.1 --
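For anyone who wants to check the claims above against the table, here is a minimal Python sketch that computes the percentage-point change in share for a handful of countries copied from the list (countries with no 2007 figure are simply left out):

# Share of QS academic survey responses by country: (2013, 2007), from the table above.
shares = {
    "US": (17.4, 10.0),
    "UK": (6.5, 5.6),
    "Brazil": (6.3, 1.1),
    "India": (1.7, 3.5),
    "Malaysia": (1.5, 3.2),
    "New Zealand": (1.2, 4.1),
}

# Percentage-point change between the two surveys, biggest gains first.
changes = {country: round(y2013 - y2007, 1) for country, (y2013, y2007) in shares.items()}
for country, delta in sorted(changes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{country:12s} {delta:+.1f} percentage points")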
Thursday, February 20, 2014
The SIRIS Lab
The SIRIS Lab has some interesting visualizations of the THE and QS rankings for 2013 and the changing Shanghai Rankings from 2003 to 2013 (thanks to wowter.net).
Be warned. They can get quite addictive.
Tuesday, February 18, 2014
The New Webometrics Rankings
The latest Webometrics rankings are out.
In the overall rankings the top five are:
1. Harvard
2. MIT
3. Stanford
4. Cornell
5. Columbia.
Looking at the indicators one by one, the top five for presence (number of webpages in the main webdomain) are:
1. Karolinska Institute
2. National Taiwan University
3. Harvard
4. University of California San Francisco
5. PRES Universite de Bordeaux.
The top five for impact (number of external inlinks received from third parties) are:
1. University of California Berkeley
2. MIT
3. Harvard
4. Stanford
5. Cornell.
The top five for openness (number of rich files published in dedicated websites) are:
1. University of California San Francisco
2. Cornell
3. Pennsylvania State University
4. University of Kentucky
5. University of Hong Kong.
The top five for excellence (number of papers in the 10% most cited category) are:
1. Harvard
2. Johns Hopkins
3. Stanford
4. UCLA
5. Michigan
Saturday, February 08, 2014
The Triple Package
I have just finished reading The Triple Package by Amy Chua and Jed Rubenfeld, a heavily anecdotal book that tells us, as every reader of the New York Times now knows, what really determines success.
An irritating thing is the presentation of urban legends -- no dogs, no Cubans and so on -- and generalizations to support the authors' thesis.
Here is one example: "men like Alfred Kazin, Norman Mailer, Delmore Schwartz, Saul Bellow, Clement Greenberg, Norman Podhoretz, and so many of the New York intellectuals who grew up excluded from anti-Semitic bastions of education and culture but went on to become famous writers and critics".
Alfred Kazin went to City College of New York when it was a selective institution. Norman Mailer went to Harvard at the age of 16 and, after serving in the army, to the Sorbonne. Delmore Schwartz attended Columbia, the University of Wisconsin and New York University and did postgraduate work at Harvard with Alfred North Whitehead. Saul Bellow was at the University of Chicago and then Northwestern. He was also a postgraduate student at the University of Wisconsin. Clement Greenberg studied at Syracuse University. Norman Podhoretz was accepted by Harvard and NYU but went to Columbia, which offered him a full scholarship. He went to Cambridge on a Fulbright and was offered a fellowship at Harvard which he turned down.
Bellow famously endured several anti-Semitic slights and sneers, as no doubt did the others. But can we really say that they were excluded from bastions of education?
Thursday, February 06, 2014
The Best Universities for Research
It seems to be the time of year when there is a slow trickle of university ranking spin-offs before the big three world rankings begin in August. We have had young university rankings, best student cities, most international universities and BRICS rankings.
Something is missing though, a ranking of top universities for research. So to assuage the pent up demand here are the top 20 universities for research according to six different ranking indicators. There is considerable variation with only two universities, Harvard and Stanford, appearing in every list.
First the top twenty universities for research output according to Scimago. This is measured by publications in the Scopus database over a five year period.
1. Harvard
2. Tokyo
3. Toronto
4. Tsinghua
5. Sao Paulo
6. Michigan Ann Arbor
7. Johns Hopkins
8. UCLA
9. Zhejiang
10. University of Washington
11. Stanford
12. Graduate University of the Chinese Academy of Sciences
13. Shanghai Jiao Tong University
14. University College London
15. Oxford
16. Universite Pierre et Marie Curie Paris 6
17. University of Pennsylvania
18. Cambridge
19. Kyoto
20. Columbia
Next we have the normalized impact scores from Scimago, which measure citations to research publications taking account of field. This might be considered a measure of the quality of research rather than quantity. Note that a university would not be harmed if it had a large number of non-performing faculty who never wrote papers.
1. MIT
2. Harvard
3. University of California San Francisco
4= Stanford
4= Princeton
6. Duke
7. Rice
8. Chicago
9= Columbia
9= University of California Berkeley
9= University of California Santa Cruz
12. University Of California Santa Barbara
13. Boston University
14= Johns Hopkins
14= University of Pennsylvania
16. University of California San Diego
17= UCLA
17= University of Washington
17= Washington University of St Louis
20. Oxford
The citations per faculty indicator in the QS World University Rankings also uses Scopus. It is not normalized by field so medical schools and technological institutes can do very well.
1. Weizmann Institute of Science
2. Caltech
3. Rockefeller University
4. Harvard
5. Stanford
6. Gwangju Institute of Science and Technology
7. UCLA
8. University of California San Francisco
9. Karolinska Institute
10. University of California Santa Barbara
11. University of California San Diego
12. London School of Hygiene and Tropical Medicine
13. MIT
14. Georgia Institute of Technology
15. University of Washington
16. Northwestern University
17. Emory
18. Tel Aviv
19. Minnesota Twin Cities
20. Cornell
The Times Higher Education -- Thomson Reuters Research Impact Citations Indicator is normalized by field (250 of them) and by year of publication. In addition, there is a "regional modification" that gives a big boost to universities in countries with generally low impact scores. A good score on this indicator can be obtained by contributing to multi-contributor publications, especially in physics, providing that total publications do not rise too much.
1= MIT
1= Tokyo Metropolitan University
3= University of California Santa Cruz
3= Rice
5. Caltech
6. Princeton
7. University of California Santa Barbara
8. University of California Berkeley
9= Harvard
9= Stanford
11. Florida Institute of Technology
12. Chicago
13. Royal Holloway, University of London
14. University of Colorado Boulder
15= Colorado School of Mines
15= Northwestern
17= Duke
17= University of California San Diego
19. Washington University of St Louis
20. Boston College
The Shanghai Academic Ranking of World Universities Highly Cited indicator counts the number of researchers on the lists compiled by Thomson Reuters. It seems that new lists will now be produced every year so this indicator could become less stable.
1. Harvard
2. Stanford
3. MIT
4. University of California Berkeley
5. Princeton
6. Michigan Ann Arbor
7. University of California San Diego
8. Yale
9. University of Pennsylvania
10. UCLA
11= Caltech
11= Columbia
13. University of Washington
14. Cornell
15. Cambridge
16. University of California San Francisco
17. Chicago
18. University of Wisconsin Madison
19. University of Minnesota Twin Cities
20. Oxford
Finally, the MNCS indicator from the Leiden Ranking, which is the number of field-normalised citations per paper. It is possible for a few widely cited papers in the right discipline to have a disproportionate effect. The high placing for Gottingen results from a single computer science paper whose citation is required for intellectual property reasons.
1. MIT
2. Gottingen
3. Princeton
4. Caltech
5. Stanford
6. Rice
7. University of California Santa Barbara
8. University of California Berkeley
9. Harvard
10. University of California Santa Cruz
11. EPF Lausanne
12. Yale
13. University of California San Francisco
14. Chicago
15. University of California San Diego
16. Northwestern
17. University of Colorado Boulder
18. Columbia
19. University of Texas Austin
20. UCLA
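As an illustration of the MNCS idea described above, here is a minimal Python sketch with invented numbers; the real Leiden calculation also normalises by publication year and document type, which is omitted here:

# Mean Normalised Citation Score (MNCS), sketched with invented figures.
# Each paper's citations are divided by the world average for its field,
# and the normalised values are then averaged over the university's papers.
world_average = {"physics": 8.0, "clinical medicine": 12.0, "mathematics": 3.0}

papers = [  # (field, citations) for one hypothetical university
    ("physics", 40),
    ("clinical medicine", 6),
    ("mathematics", 9),
]

normalised = [cites / world_average[field] for field, cites in papers]
mncs = sum(normalised) / len(normalised)
print(f"MNCS = {mncs:.2f}")  # one heavily cited paper can dominate a small output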
Tuesday, February 04, 2014
Will Global Rankings Boost Higher Education in Emerging Countries?
My article in University World News can be accessed here.
Monday, February 03, 2014
India and the World Rankings
There is an excellent article in Asian Scientist by Prof Pushkar of BITS Pilani that questions the developing obsession in India with getting into the top 100 or 200 of the world rankings.
Prof Pushkar observes that Indian universities have never done well in global rankings. He says:
"there is no doubt that Indian universities need to play ‘catch up’ in order to place more higher education institutions in the top 400 or 500 in the world. It is particularly confounding that a nation which has sent a successful mission to Mars does not boast of one single institution in the top 100. “Not even one!” sounds like a real downer. Whether one considers the country a wannabe “major” power or an “emerging” power (or not), it is still surprising that India’s universities do not make the grade."
and
"It is also rather curious that the “lost decades” of India’s higher education – the 1980s and the 1990s – coincided with a period when the country registered high rates of economic growth. The neglect of higher education finally ended when the National Knowledge Commission drew attention to a “quiet crisis” in its 2006 report."
Even so:
"(d)espite everything that is wrong with India’s higher education, there is no reason for panic about the absence of its universities in the top 100 or 200. Higher education experts agree that the world rankings of universities are limited in terms of what they measure. Chasing world rankings may do little to improve the overall quality of higher education in the country."
He also refers to the proposal that the Indian Institutes of Technology should combine just for the rankings. Apparently he has been in touch with Phil Baty of THE who is not buying the idea.
I would disagree with Prof Pushkar's argument that combining universities would not be a good idea anyway because THE scales some indicators for size. That is true but the reputation survey is not scaled and adding votes in the survey would be beneficial for a combined institution if one could be created and then accepted by the rankers. Also, you currently need 200 publications a year to be ranked by THE so there would be a case for smaller places around the world -- although probably not the IITs -- banding together to get past this threshold.
Saturday, February 01, 2014
Recent Research: Rankings Matter
According to an article by Molly Alter and Randall Reback in Educational Evaluation and Policy Analysis, universities in the USA get more applications if they receive high quality-of-life ratings and fewer if their peers are highly rated academically.
True for your school: How changing reputations alter demand for selective US colleges
Abstract
There is a comprehensive literature documenting how colleges’ tuition, financial aid packages, and academic reputations influence students’ application and enrollment decisions. Far less is known about how quality-of-life reputations and peer institutions’ reputations affect these decisions. This article investigates these issues using data from two prominent college guidebook series to measure changes in reputations. We use information published annually by the Princeton Review—the best-selling college guidebook that formally categorizes colleges based on both academic and quality-of-life indicators—and the U.S. News and World Report—the most famous rankings of U.S. undergraduate programs. Our findings suggest that changes in academic and quality-of-life reputations affect the number of applications received by a college and the academic competitiveness and geographic diversity of the ensuing incoming freshman class. Colleges receive fewer applications when peer universities earn high academic ratings. However, unfavorable quality-of-life ratings for peers are followed by decreases in the college’s own application pool and the academic competitiveness of its incoming class. This suggests that potential applicants often begin their search process by shopping for groups of colleges where non-pecuniary benefits may be relatively high.
Friday, January 31, 2014
Department of Remarkable Coincidences
On the day that QS published their top 50 under-50 universities, Times Higher Education has announced that it will be holding a Young Universities Summit in Miami in April at which the top 100 universities under 50 will be revealed.
Also, the summit will see "a consultative discussion on proposed new rankings metrics designed to better capture innovation and knowledge transfer in world rankings in the future."
Innovation? What could that mean? Maybe counting patents.
Knowledge transfer? Could this mean doing something about the citations indicator? Has someone at THE seen who contributed to multi-author massively cited publications in 2012?
QS Young Universities Rankings
QS have produced a ranking of universities founded in the last fifty years. It is based on data collected for last year's World University Rankings.
The top five are:
1. Hong Kong University of Science and Technology
2. Nanyang Technological University, Singapore
3. Korea Advanced Institute of Science and Technology
4. City University of Hong Kong
5. Pohang University of Science and Technology, Korea
There are no universities from Russia or Mainland China on the list although there is one from Taiwan and another from Kazakhstan.
There are nine Australian universities in the top fifty.
Wednesday, January 29, 2014
The 25 Most International Universities
Times Higher Education has produced a succession of spin-offs from their World University Rankings: rankings of Asian universities, young universities and emerging economies universities, reputation rankings, a gender index.
Now there is a list of the world's most international universities, based on the international outlook indicator in the world rankings. This comprises data on international students, international faculty and international research collaboration.
The top five are:
1. Ecole Polytechnique Federale de Lausanne
2= Swiss Federal Institute of Technology Zurich
2= University of Geneva
4. National University of Singapore
5. Royal Holloway, University of London
Sunday, January 19, 2014
A bright idea from India
Has someone in India been reading this blog?
In a previous post I suggested that universities might improve their scores in the world rankings by merging. That would help in the QS and THE reputation surveys and the publications indicator in the Shanghai rankings.
If he is being reported correctly, Indian Higher Education Secretary Ashok Thakur proposes to go a step further and suggests that all the Indian Institutes of Technology should be assessed together by the international rankers, although presumably continuing to function separately in other respects. According to outlookindia:
"All the 13 IITs may compete as a single unit at the global level for a place among the best in the global ranking list.
Giving an indication in this regard, Higher Education Secretary Ashok Thakur said the idea is to position the IITs as a single unit much like the IIT brand which has become an entity in itself for finding a place among the top three best institutes the world-over.
International ranking agencies such as Times Higher Education and QS World University Ranking would be informed accordingly, he said.
Central universities and other institutes could follow on how the IITs position themselves in the ranking list, he said."
Both QS and THE seem eager to do business in India but this is surely a non-starter. Apart from anything else, it could be followed by all the University of California and other US state university campuses, branches of the National University of Ireland and the Indian Institutes of Science and Management coming together for ranking purposes.
Also, the Secretary should consider that if any IIT follows the lead of Panjab University and joins the Hadron Collider Project or any other multi-contributor, multi-citation project, any gain in the THE citations indicator would be lost if it had to be shared with the other 12 institutes.
Tuesday, January 07, 2014
Explain Please
I have often noticed that some university administrators and educational bureaucrats are clueless about international university rankings, even when their careers depend on a good performance.
The Economic Times of India reports that the Higher Education Secretary in the Human Resource Development Ministry, Ashok Thakur, said "institutions could improve their scores dramatically in Times Higher Education's globally cited World University Rankings as the British magazine has agreed to develop and include India-specific parameters for assessment from the next time."
This sounds like THE is going to insert a new indicator just for India in their world rankings, which is unbelievable. The Hindu puts it a little differently, suggesting that THE is preparing a separate ranking of Indian universities:
"Times Higher Education (THE) — recognised world over for its ranking of higher education institutions — has agreed to draw up an India-specific indicator that would act as a parameter for global education stakeholders and international students to judge Indian educational institutions.
This was disclosed by Higher Education Secretary in the Union Human Resource Development Ministry Ashok Thakur."
It would be interesting to find out what the minister actually said and what, if anything, THE has agreed to.
Ranking News
06/01/14
The latest Times Higher Education international reputation rankings, based on data collected for last year's World University Rankings, will be announced in Tokyo on March 6th.
The number of responses was 10,536 in 2013, down from 16,639 in 2012 and 17,554 in 2011.
Why is the number of responses falling?
Is the decline linked with changes in the scores for the teaching and research indicators, both of which include components based on the survey?
Saturday, December 21, 2013
Twenty Ways to Rise in the Rankings Quickly and Fairly Painlessly
Times Higher Education has just republished an article by Amanda Goodall, ‘Top 20 ways to improve your world university ranking’. Much of her advice is very sensible -- appointing university leaders with a strong research record, for example -- but in most cases the road from her suggestions to a perceptible improvement in the rankings is likely to be winding and very long. It is unlikely that any of her proposals would have much effect on the rankings in less than a decade or even two.
So here are 20 realistic proposals for a university wishing to join the rankings game.
Before starting, any advice about how a university can rise in the rankings should be based on these principles:
· Rankings are proliferating and no doubt there will be more in the future. There is something for almost anybody if you look carefully enough.
· The indicators and methodology of the better known rankings are very different. Something that works with one may not work with another. It might even have a negative effect.
· There is often a price to pay for getting ahead in the rankings. Everybody should consider whether it is worth it. Also, while rising from 300th place to 250th is quite easy, going from 30th to 25th is another matter.
· Don’t forget the number on the bottom. It might be easier to reduce the number of academic staff than to increase the number of citations or publications.
· Rankings are at best an approximation to what universities do. Nobody should get too excited about them.
The top 20 ways in which universities can quickly improve their positions in one or more of the international university rankings are:
1. Get rid of students
Over the years many universities acquire a collection of branch campuses, general studies programmes, night schools, pre-degree programmes and so on. Set them free to become independent universities or colleges. Almost always, these places have relatively more students and relatively fewer faculty than the main campus. The university will therefore do better in the Quacquarelli Symonds (QS) and Times Higher Education (THE) faculty student ratio indicators. Also, staff in the spun off branches and schools generally produce less research than those at the main campus so you will get a boost in the productivity per capita indicator in the Shanghai ARWU rankings.
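To see why this works, here is a back-of-the-envelope sketch with invented numbers; nothing here comes from any ranker's actual data:

# Hypothetical main campus plus a teaching-heavy branch campus.
main_faculty, main_students = 2000, 20000    # ratio 1:10
branch_faculty, branch_students = 200, 6000  # ratio 1:30

combined = (main_faculty + branch_faculty) / (main_students + branch_students)
main_only = main_faculty / main_students

print(f"faculty per student, combined campus: {combined:.3f}")  # about 0.085
print(f"faculty per student, after spin-off:  {main_only:.3f}")  # 0.100 -- looks better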
2. Kick out the old and bring in the young
Get rid of ageing professors, especially if unproductive and expensive, and hire lots of indentured servants -- adjunct and temporary teachers and researchers. Again, this will improve the university's performance on the THE and QS faculty student ratio indicators. They will not count as senior faculty so this will be helpful for ARWU.
3. Hire research assistants
Recruiting slave labour -- cheap or unpaid research assistants (unemployed or unemployable graduate interns?) -- will boost the score for faculty student ratio in the QS rankings, since QS counts research-only staff for their faculty student indicator. It will not, however, work for the THE rankings. Remember that for QS more faculty are good for faculty student ratio but bad for citations per faculty so you have to analyse the potential trade off carefully.
4. Think about an exit option
If an emerging university wants to be included in the rankings it might be better to focus on just one of them. Panjab University is doing very well in the THE rankings but does not appear in the QS rankings. But remember that if you apply to be ranked by THE and you do not like your placing then it is always possible to opt out by not submitting data next year. But QS has a Hotel California policy: once in, you can check out but you can never leave. It does not matter how much you complain about the unique qualities of your institution and how they are neglected by the rankers, QS will go on ranking you whether you like it or not.
5. Get a medical school
If you do not have a medical school or a research and/or teaching hospital then get one from somewhere. Merge with an existing one or start your own. If you have one, get another one. Medical research produces a disproportionate number of papers and citations which is good for the QS citations per faculty indicator and the ARWU publications indicator. Remember this strategy may not help so much with THE who use field normalisation. Those citations of medical research will help there only if they are above the world average for field and year.
Update August 2016: QS now have a moderate form of field normalisation so the advantage of a medical school is reduced but the Shanghai rankings are still biased towards medical research.
6. But if you are a medical school, diversify
QS and THE supposedly do not include single subject institutions in their general rankings, although from time to time one will, like the University of California at San Francisco, Aston Business School or (National Research Nuclear University) Moscow Engineering Physics Institute (MEPhI), slip through. If you are an independent medical or single subject institution consider adding one or two more subjects then QS and THE will count you although you will probably start sliding down the ARWU table.
Update August 2016: the QS BRICS rankings include some Russian institutions that look like they focus on one field and National Research Nuclear University MePhI is back in the THE world rankings.
7. Amalgamate
The Shanghai rankings count the total number of publications in the SCI and SSCI, the total number of highly cited researchers and the total number of papers without regard for the number of researchers. THE and QS count the number of votes in their surveys without considering the number of alumni.
What about a new mega university formed by merging LSE, University College London and Imperial College? Or a tres grande ecole from all those little grandes ecoles around Paris?
Update August 2016: This is pretty much what the University of Paris-Saclay is doing.
8. Consider the weighting of the rankings
THE gives a 30% weighting to citations and 2.5% to income from industry. QS gives 40% to its academic survey and 5% to international faculty. So think about where you are going to spend your money.
9. The wisdom of crowds
Focus on research projects in those fields that have huge multi-"author" publications, particle physics, astronomy and medicine for example. Such publications often have very large numbers of citations. Even if your researchers make a one in two thousandth contribution Thomson Reuters, THE's data collector, will give them the same credit as they would get if they were the only authors. This will not work for the Leiden Ranking which uses fractionalised counting of citations. Note that this strategy works best when combined with number 10.
Update August 2016: THE methodological changes in 2015 mean that this does not work any more. Look at what happened to Middle East Technical University. But it is still worth looking out for projects with dozens or scores of contributors.
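Here is a sketch, again with invented numbers, of the difference between full counting (each affiliated institution gets complete credit for a multi-author paper) and the fractionalised counting used by Leiden:

# One hypothetical 500-"author" physics paper with 1,000 citations,
# to which our university contributed 2 of the 500 authors.
citations = 1000
total_authors = 500
our_authors = 2

full_credit = citations                                      # full counting: 1000
fractional_credit = citations * our_authors / total_authors  # fractional: 4

print(f"full counting credit:       {full_credit}")
print(f"fractional counting credit: {fractional_credit:.0f}")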
10. Do not produce too much
You need to produce 200 papers a year to be included in the THE rankings. But producing more papers than this might be counterproductive. If your researchers are producing five thousand papers a year then those five hundred citations from a five hundred "author" report on the latest discovery in particle physics will not have much impact. But if you are publishing three hundred papers a year those citations will make a very big difference. This is why Dr El Naschie's frequently cited papers in Chaos, Solitons and Fractals were a big boost for Alexandria University but not for Cambridge, Surrey, Cornell and Frankfurt universities with whom he also claimed affiliation. However, Leiden will not rank universities until they reach 500 papers a year.
Update August 2016: See number 9.
11. Moneyball Strategy
In his book Moneyball, Michael Lewis recounted the ascent of the Oakland As baseball team through a strategy of buying undervalued players. The idea was to find players who did things that led to their teams winning even if they did not match the stereotype of a talented player.
This strategy was applied by George Mason University in Virginia who created a top basketball team by recruiting players who were overlooked by scouts because they were too small or too fat and a top economics department by recruiting advocates of a market economy at a time when such an idea was unfashionable.
Universities could recruit researchers who are prolific and competent but are unpromotable or unemployable because they are in the wrong group or fail to subscribe enthusiastically to current academic orthodoxies. Maybe start with Mark Regnerus and Jason Richwine.
Update August 2016: See the story of Tim Groseclose's move from UCLA to George Mason
12. Expand doctoral programmes
One indicator in the THE world rankings is the ratio of doctoral to bachelor degree students. Panjab University recently announced that they will introduce integrated masters and doctors programmes. This could be a smart move if it means students no longer go into master's programmes but instead into something that can be counted as a doctoral degree programme.
13. The importance of names
Make sure that your researchers know which university they are affiliated to and that they know its correct name. Make sure that branch campuses, research institutes and other autonomous or quasi-autonomous groups incorporate the university name in their publications. Keep an eye on Scopus and ISI and make sure they know what you are called. Be especially careful if you are an American state university.
14. Evaluate staff according to criteria relevant to the rankings
If staff are to be appointed and promoted according to their collegiality, the enthusiasm with which they take part in ISO exercises, community service, ability to make the faculty a pleasant place for everybody or commitment to diversity then you will get collegial, enthusiastic etc faculty. But those are things that the rankers do not -- for once with good reason -- attempt to measure.
While you are about it get rid of interviews for staff and students. Predictive validity ranges from zero to low.
15. Collaborate
The more authors a paper has the more likely it is to be cited, even if it is only self-citation. Also, the more collaborators you have the greater the chances of a good score in the reputation surveys. And do not forget the percentage of collaborators who are international is also an indicator in the THE rankings.
16. Rebrand
It would be good to have names that are as distinctive and memorable as possible. Consider a name change. Do you really think that the average scientist filling out the QS or the THE reputation surveys is going to remember which of the sixteen (?) Indian Institutes of Technology is especially good in engineering?
Update August 2016: But not too memorable. I doubt that Lovely Professional University will get the sort of public interest it is hoping for.
17. Be proactive
Rankings are changing all the time so think about indicators that might be introduced in the near future. It would seem quite easy, for example, for rankers to collect data about patent applications.
Update August 2016: Make sure everyone keeps their Google Scholar Citations Profiles up to date.
18. Support your local independence movement
It has been known for a long time that increasing the number of international students and faculty is good for both the THE and QS rankings. But there are drawbacks to just importing students. If it is difficult to move students across borders why not create new borders?
If Scotland votes for independence in next year's referendum its scores for international students and international faculty in the QS and THE rankings would go up since English and Welsh students and staff would be counted as international.
Update August 2016: Scotland didn't but there may be another chance.
19. Accept that some things will never work
Realise that there are some things that are quite pointless from a rankings perspective. Or any other for that matter. Do not bother telling staff and students to click away at the website to get into Webometrics. Believe it or not, there are precautions against that sort of thing. Do not have motivational weekends. Do not have quality initiatives unless they get rid of the cats.
Update August 2016: That should read do not do anything "motivational". The only thing they motivate is the departure of people with other options.
20. Get Thee to an Island
Leiden Ranking has a little known ranking that measures the distance between collaborators. At the moment the first place goes to the Australian National University. Move to Easter Island or the Falklands and you will be top for something.
Thursday, December 19, 2013
The QS BRICS Rankings
Quacquarelli Symonds (QS), in partnership with Interfax, the Russian news agency, have just published their BRICS [Brazil, Russia, India, China, South Africa] University Rankings. The top ten are:
1. Peking University
2. Tsinghua University
3. Lomonosov Moscow State University
4. Fudan University
5. Nanjing University
6= University of Science and Technology China
6= Shanghai Jiao Tong University
8. Universidade de Sao Paulo
9. Zhejiang University
10. Universidade Estadual de Campinas
The highest ranked Indian university is the Indian Institute of Technology Delhi in thirteenth place and the top South African institution is the University of Cape Town which is eleventh.
The methodology is rather different from the QS World University Rankings. The weighting for the academic survey has been reduced to 30% and that for the employer survey has gone up to 20%. Faculty student ratio accounts for 20% as it does in the world rankings, staff with PhDs for 10%, papers per faculty for 10%, citations per paper for 5% and international faculty and students for 5%.
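As a rough illustration of how those weights combine into an overall score, here is a minimal Python sketch; the indicator scores below are invented and QS's own scaling and normalisation of each indicator are not reproduced:

# QS BRICS weights as described above, as fractions of the overall score.
weights = {
    "academic survey": 0.30,
    "employer survey": 0.20,
    "faculty student ratio": 0.20,
    "staff with PhDs": 0.10,
    "papers per faculty": 0.10,
    "citations per paper": 0.05,
    "international faculty and students": 0.05,
}

# Hypothetical indicator scores (0-100) for a single university.
scores = {
    "academic survey": 85,
    "employer survey": 70,
    "faculty student ratio": 60,
    "staff with PhDs": 90,
    "papers per faculty": 55,
    "citations per paper": 40,
    "international faculty and students": 30,
}

overall = sum(weights[k] * scores[k] for k in weights)
print(f"overall score: {overall:.1f} out of 100")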
There are some noticeable differences between these rankings and the BRICS and emerging countries rankings produced by Times Higher Education and Thomson Reuters.
Moscow State University is ahead of the University of Cape Town in the QS rankings but well behind in the THE rankings.
In the QS rankings the Indian Institutes of Technology are supreme among Indian institutions. There are seven before the University of Calcutta appears in 52nd place. In the THE rankings the best Indian performer was Panjab University, which is absent from the QS rankings.
I suspect that Panjab University is an example of rankings shopping, where universities target one specific ranking, and that there is a very smart person directing its ranking strategy. Panjab University has invested money in participation in the Hadron Collider project, exactly where it would profit from TR's field normalised citations indicator, while the number of publications did not rise excessively. Recently the university has proposed to establish integrated master's and doctoral programs, good for two TR indicators and to increase research collaboration, good for another.
The Moscow State Engineering Physics Institute, which was removed from the THE world rankings this year because it was a single subject institution, is in 65th place in this table.
Sunday, December 08, 2013
Africa Excels in Latest THE Rankings, China Performs Poorly
It is unlikely that you will see a headline like this in the mainstream media. Experts and analysts have focused almost exclusively on the number of universities in the top 10 or the top 50 or the top 100 of the Times Higher Education (THE) BRICS and Emerging Economies Rankings (BRICSEE) -- powered, in case anyone has forgotten, by Thomson Reuters -- and concluded that China is the undisputed champion of the world.
Looking at the number of universities in the BRICSEE rankings compared to population -- as in the previous post -- gives a different picture with China still ahead of Russia, India and Brazil but not so much.
Another way of analysing a country's higher education system is by looking at the proportion of universities that achieve "world class" status.
Assuming -- a big assumption I agree -- that getting into the BRICSEE top 100 is a measure of world class quality, then the percentage of a country's universities that are world class might be considered a guide to the overall quality of the higher education system.
Here is a ranking of the BRICS and emerging countries according to the percentage of universities in the THE BRICSEE top 100.
The total number of universities is in brackets and is derived from Webometrics.
First place goes to South Africa. Egypt is third. Even Morocco does better than Russia and Brazil. China does not do very well although it is still quite a bit ahead of India, Russia and Brazil. Taiwan remains well ahead of Mainland China.
Of course, this should not be taken too seriously. It is probably a lot harder to start a university in Taiwan than it is in Brazil or India. South Africa has a large number of professional schools and private colleges that are not counted by Webometrics that may be of a similar standard to universities in other countries.
Some of the high fliers might find that their positions are precarious. Egypt's third university in the BRICSEE rankings is Alexandria, which is still reaping the benefits from Dr El Naschie's much cited papers in 2007 and 2008, but that will not last long. The UAE's role as an international higher education hub may not survive a fall in the price of oil.
1. South Africa (25) 20.00%
2. Taiwan (157) 13.38%
3. Egypt (59) 5.08%
4. Turkey (164) 4.27%
5= UAE (50) 4.00%
5= Hungary (75) 4.00%
7. Czech Republic (82) 3.66%
8. Thailand (188) 2.66%
9. Chile (78) 2.56%
10. Malaysia (91) 2.20%
11. China (1164) 1.98%
12. Poland (440) 0.91%
13. India (1604) 0.62%
14. Morocco (212) 0.47%
15. Colombia (285) 0.35%
16. Brazil (1662) 0.24%
17. Mexico (898) 0.22%
18. Russia (1188) 0.17%
19= Indonesia (358) 0%
19= Philippines (265) 0%
19= Pakistan (300) 0%
19= Peru (92) 0%
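For anyone who wants to reproduce the figures, here is a minimal Python sketch of the calculation for a few of the rows above, using the BRICSEE top 100 counts and the Webometrics totals as given:

# (universities in the BRICSEE top 100, total universities per Webometrics)
countries = {
    "South Africa": (5, 25),
    "Taiwan": (21, 157),
    "Egypt": (3, 59),
    "China": (23, 1164),
    "India": (10, 1604),
    "Russia": (2, 1188),
}

share = {c: 100 * top100 / total for c, (top100, total) in countries.items()}
for country, pct in sorted(share.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{country:13s} {pct:5.2f}%")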
Thursday, December 05, 2013
The THE BRICS and Emerging Markets Rankings: Good News for China?
Times Higher Education (THE) has just published its BRICS and Emerging Economies Rankings. The methodology is the same as that used in their World University Rankings and the data was supplied by Thomson Reuters. Emerging economies are those listed in the FTSE Emerging Markets Indices.
At first sight, China appears to do very well with Peking University in first place and Tsinghua University in second and a total of 23 universities in the top 100.
Third place goes to the University of Cape Town, while National Taiwan University is fourth and Bogazici University in Turkey is fifth.
Taiwan has 21 universities in the Top 100, India 10, Turkey 7 and South Africa and Thailand 5 each.
Although China tops the list of "hot emergent properties", as THE puts it, and Simon Marginson compares the PRC favourably to Russia which is "in the doldrums", we should remember that China does have a large population. When we look at population size, China's achievement shrinks considerably while Taiwan emerges as the undisputed winner, Eastern Europe does very well and the gap between Russia and China is drastically reduced.
The following is the number of universities in the THE BRICS and Emerging Economies University Rankings per 1,000,000 population (populations from the Economist Pocket World in Figures 2010). The total number of universities in the rankings is in brackets. A sketch of the calculation appears at the end of this post.
1. Taiwan (21) 0.913
2. United Arab Emirates (2) 0.400
3= Czech Republic (3) 0.300
3= Hungary (3) 0.300
5. Chile (2) 0.118
6. South Africa (5) 0.114
7. Poland (4) 0.102
8. Turkey (7) 0.093
9= Malaysia (2) 0.077
9= Thailand (5) 0.077
11. Egypt (3) 0.039
12. Morocco (1) 0.031
13. Brazil (4) 0.021
14. Colombia (1) 0.021
15. Mexico (2) 0.018
16. China (23) 0.017
17. Russia (2) 0.014
18. India (10) 0.008
19= Indonesia (0) 0.000
19= Pakistan (0) 0.000
19= Peru (0) 0.000
19= Philippines (0) 0.000
It is very significant that the top two universities in these rankings are in China. But, taking population size into consideration, it looks as though Mainland China is still way behind Taiwan, Singapore and Hong Kong and even the smaller nations of Eastern Europe.
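For anyone who wants to check the arithmetic, here is a minimal sketch in Python. The per-million figure is simply the count of ranked universities divided by population in millions; the populations below are approximate values back-calculated to be consistent with the table above rather than quoted from the Pocket World in Figures, so treat them as illustrative.

```python
# Ranked universities in the THE BRICS and Emerging Economies top 100 per million people.
# Populations (millions) are approximations chosen to match the table above.
entries = {
    "Taiwan": (21, 23.0),
    "South Africa": (5, 44.0),
    "China": (23, 1330.0),
    "Russia": (2, 141.0),
}

per_million = {country: ranked / population for country, (ranked, population) in entries.items()}
for country, rate in sorted(per_million.items(), key=lambda item: item[1], reverse=True):
    print(f"{country}: {rate:.3f} universities per million population")
# Taiwan: 0.913
# South Africa: 0.114
# China: 0.017
# Russia: 0.014
```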
Wednesday, November 27, 2013
Mysteries of the Rankings
One of the problems with interpreting the Times Higher Education (THE) World University Rankings is that some of the indicators are bundled together with others, so that it is very difficult to work out exactly what has happened from one year to the next.
One group of indicators, International Outlook: People, Research, includes the ratio of international to domestic staff, the ratio of international to domestic students and the proportion of papers with an international co-author.
Research: Volume, Income, Reputation includes the results of a reputation survey; research income, which is scaled against staff numbers and normalised for purchasing power parity; and the number of papers per staff member.
Teaching: the Learning Environment is based on five indicators: the academic reputation survey, the staff-student ratio, the ratio of doctoral to bachelor's degrees, the number of doctorates divided by the number of academic staff, and institutional income scaled against academic staff numbers.
This means that if a university enjoys a rise or suffers a fall for the international, teaching or research indicators it is impossible to see exactly what was the cause.
If, for example, the score for teaching for a given university increased it could result from one or more of the following:
- a rise in the number of votes received in the academic survey
- an increase in the number of teaching staff
- a fall in the number of students
- an increase in the number of doctoral degrees awarded
- a fall in the number of bachelor degrees awarded
- an increase in institutional income.
The published scores for the indicator groups do not represent raw data. They are z-scores and can therefore change if the mean scores for all ranked universities change. So, a rise in the teaching indicators score might also result from one or more of the following, each of which could lower the mean from which the university z-scores are calculated (a small numerical illustration follows the list):
- a fall in the mean number of votes for ranked universities in the academic survey
- a reduction in the mean number of teaching staff at ranked universities
- an increase in the mean number of students at ranked universities
- a fall in the mean number of doctoral degrees awarded at ranked universities
- an increase in the mean number of bachelor degrees awarded at ranked universities
- a fall in the mean institutional income of ranked universities.
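To see how this works, here is a toy z-score calculation in Python with entirely hypothetical numbers (THE's actual standardisation is more elaborate, so this is only a sketch of the principle): a university whose own figure is unchanged gets a higher standardised score simply because weaker institutions have entered the ranking and pulled the mean down.

```python
import statistics

def z_score(value, field):
    """Standardised score of one university's figure relative to all ranked universities."""
    return (value - statistics.mean(field)) / statistics.pstdev(field)

# Year 1: our university awards 100 doctorates; the ranked field is fairly strong.
field_year_1 = [100, 150, 200, 250, 300]
# Year 2: our university is unchanged, but two weaker institutions join the ranking,
# pulling the field mean down.
field_year_2 = [100, 150, 200, 250, 300, 50, 60]

print(round(z_score(100, field_year_1), 2))  # -1.41
print(round(z_score(100, field_year_2), 2))  # -0.66: higher, with no change in the raw data
```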
Unless THE and TR disaggregate their data, interpreting their results is going to be very problematic. It would also be unwise to judge the impact of changes in public or university policies by looking at changes in the rankings.
An example is the University of Hong Kong (UHK). Between 2012 and 2013 it fell from 35th to 43rd place in the THE world rankings, its overall score falling from 75.6 to 65.5. There were slight falls for citations, from 62.1 to 61.5, and international outlook, from 81.7 to 80.3, and a somewhat larger fall for income from industry, from 62.5 to 56.9. There were very large falls for the combined teaching indicators, from 78.4 to 61.6, and the research indicators, from 85.9 to 69.9.
THE may be on to something. UHK has slipped in the QS and Shanghai rankings as well. But exactly what they are on to is not clear.
At the recent World Class Universities conference in Shanghai, Gerard Postiglione referred to suggestions that the fall in the position of UHK and other Hong Kong research universities in the THE rankings may have resulted from a conflict between indigenous and international leadership, difficulties in attracting funding from industry, and the transition from a British-style three-year degree programme to a four-year American-style programme.
It is difficult to see how the change from a three-year to a four-year system would have a direct effect. Such a change could affect the THE rankings by increasing the number of undergraduate students relative to doctoral candidates or to faculty, but it began to come into effect in 2012, whereas the data in the 2013 rankings refer to 2011. It is possible, though, that arguments about the impending switch had an effect on the reputation survey. Unless THE and TR publish separate data for all their indicators, it is not possible to be certain about the causes of the decline of Hong Kong universities.
Sunday, November 10, 2013
BRICS PLUS
We heard recently that QS will be having a BRICS ranking (Brazil, Russia, India, China, South Africa).
Now, Times Higher Education has announced its intention to produce BRICS and Emerging Economies Rankings in early December.
The emerging economies will include "South Africa, Czech Republic, Hungary, Malaysia, Mexico, Poland, Taiwan, Thailand, Turkey, Chile, Colombia, Egypt, Indonesia, Morocco, Pakistan, Peru, Philippines and UAE."
It seems that THE will be using the same methodology as the World University Rankings.
Friday, November 01, 2013
Watching the Rankings: Print Edition
Monday, October 28, 2013
Another Ranking on the Way
Just when you thought there was no more space left for a new ranking, there was this story in University World News:
"The Russian government is behind the world’s first university ranking for the BRICS countries – Brazil, Russia, India, China and South Africa. UK-based QS Quacquarelli Symonds will produce the pilot BRICS ranking in December.
Last Wednesday QS announced that it had been appointed by the Interfax Group, a leading information provider in Russia and Eurasian nations, to produce a ranking of the BRICS countries, to be called the 'QS University Rankings: BRICS'.
The British company said the ranking had “received support from ministries of education and higher education around the world”.
After Russian President Vladimir Putin announced last year that a goal was to have at least five Russian universities in the top 100 in global university rankings by 2020, the Ministry of Education sprang into action that included using rankings as a measure of progress.
According to Wednesday’s statement, Interfax was selected out of five bidders to launch two pilot rankings. The first will include universities in Commonwealth of Independent States countries – formerly within the Soviet Union – and the second will rank BRICS universities."
Actually, it is not really new. Webometrics has a BRICS ranking, with one Russian, one Brazilian and eight Chinese universities in the top ten. It will be interesting to see if QS can produce a better performance for Russia.
Friday, October 25, 2013
QS goes to Oman.
It seems that QS and THE have been quite busy lately, seeking "engagement" with various countries. QS was in Oman in September. The report contains an interesting insight into how methodological issues can lead to a university falling or rising in the rankings through no fault or merit of its own.
"A few days after the seminar, the global launch of the QS World University Rankings 2013/14 was held in Turkey, Istanbul, during which it was revealed that Sultan Qaboos University had dropped in its rankings this year too. In an exclusive statement, the head of QS Intelligence Unit, Ben Sowter, observed that most institutions in the Middle East featured in the QS World University Rankings 2013/14 have dropped in rank this year.
'Scores for academic reputation and research citations have declined across the region this year, which has caused most institutions to lose ground on the international competition,” Sowter said about the Middle East’s drop in rankings. Sowter further added that “having said that, there were over 100 new universities added to the list this year; and many of the institutions worldwide already in the rankings have noticeably improved in academic reputation. This has led to some universities, such as Sultan Qaboos University in Oman, showing a drop in ranking, even though their score may have improved relative to last year.”
Commenting on the drop of Sultan Qaboos University, he said “Sultan Qaboos University was first featured in 2011. We were initially unsuccessful in reaching anyone at SQU in order to file official numbers; hence figures that were available at the time on the university website were taken for the faculty and student numbers. However, it seems that the number used for academic staff was actually the total number of staff, thus inflating the faculty/student ratio of SQU resulting in a higher ranking.
This was corrected by an official submission in 2012 by the administration at SQU. Since the position published in 2011 was unnaturally high, the drops in 2012 and 2013 have been largely corrective, rather than reflecting deterioration in SQU’s actual performance.” He added that “Analysis of our results over time reveal that institutions in general are producing more research, attracting more international research and doing a better job of communicating their achievements to the world at large. Increasingly, institutions need to exhibit continuous improvement just to maintain the same position, and a drop in overall ranking may not signify an objective deterioration in performance. Such may be the case for SQU.” '
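The mechanism Sowter describes is easy to illustrate with made-up numbers (these are not SQU's actual figures): counting all staff instead of academic staff alone makes an institution appear far better resourced per student than it really is.

```python
# Hypothetical figures, purely for illustration -- not SQU's actual data.
students = 15000
academic_staff = 1000
total_staff = 2500  # academic plus administrative and support staff

correct_ratio = academic_staff / students   # faculty per student using academic staff only
inflated_ratio = total_staff / students     # faculty per student using all staff by mistake

print(f"Correct faculty/student ratio:  {correct_ratio:.3f}")   # 0.067
print(f"Inflated faculty/student ratio: {inflated_ratio:.3f}")  # 0.167
# Using total staff makes the institution look 2.5 times better staffed per student,
# which feeds directly into a higher score on a faculty/student indicator.
```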
"A few days after the seminar, the global launch of the QS World University Rankings 2013/14 was held in Turkey, Istanbul, during which it was revealed that Sultan Qaboos University had dropped in its rankings this year too. In an exclusive statement, the head of QS Intelligence Unit, Ben Sowter, observed that most institutions in the Middle East featured in the QS World University Rankings 2013/14 have dropped in rank this year.
'Scores for academic reputation and research citations have declined across the region this year, which has caused most institutions to lose ground on the international competition,” Sowter said about the Middle East’s drop in rankings. Sowter further added that “having said that, there were over 100 new universities added to the list this year; and many of the institutions worldwide already in the rankings have noticeably improved in academic reputation. This has led to some universities, such as Sultan Qaboos University in Oman, showing a drop in ranking, even though their score may have improved relative to last year.”
Commenting on the drop of Sultan Qaboos University, he said “Sultan Qaboos University was first featured in 2011. We were initially unsuccessful in reaching anyone at SQU in order to file official numbers; hence figures that were available at the time on the university website were taken for the faculty and student numbers. However, it seems that the number used for academic staff was actually the total number of staff, thus inflating the faculty/student ration of SQU resulting in a higher ranking.
This was corrected by an official submission in 2012 by the administration at SQU. Since the position published in 2011 was unnaturally high, the drops in 2012 and 2013 have been largely corrective, rather than reflecting deterioration in SQU’s actual performance.” He added that “Analysis of our results over time reveal that institutions in general are producing more research, attracting more international research and doing a better job of communicating their achievements to the world at large. Increasingly, institutions need to exhibit continuous improvement just to maintain the same position, and a drop in overall ranking may not signify an objective deterioration in performance. Such may be the case for SQU.” '
Wednesday, October 16, 2013
THE Criticised by its Own Adviser
Simon Marginson has been a critic of the QS university rankings for some time. Now, he has finally added the Times Higher Education rankings to the list.
According to the Australian:
"What we should collectively do, in my view, is start to critique and discredit the bad social science at the base of multi-indicator rankings," he said. "We are universities; it is not hard for us to say what is good science and what is bad. We need to push at bad ranking methods or at least weaken their legitimacy." He told the Canberra meeting that threats by QS to sue him, and the predilection of governments to use rankings as a proxy for quality, made speaking out even more important. He said the results of questionable rankings "slide in all directions" from year to year because they mix survey and objective data, and adjust arbitrary weightings. "The link back to the real world is over-determined by indicator selection, weightings, poor survey returns and ignorant respondents, scaling decisions and surface fluctuation that is driven by small changes between almost equally ranked universities," he said. "The rankers shape the table, not the real state of the sector - or not enough. There is scope for manipulation in conversations between the universities and the rankers." Professor Marginson, who has been a vocal opponent of survey-based rankings for years, sits on the advisory board of the THE. While that ranking was superior to QS, it was still fatally flawed once outside the top 50 universities, he said.
The Rise of Pseudoscience
In the midst of celebrating its rise to glory in the Times Higher Education World University Rankings, Panjab University has taken the time to hold a meeting. And what has "India's Top University" done?
It has agreed to give affiliation to the Homeopathic Medical College and Hospital in Sector 26.
PU is not alone. The appeasement of pseudoscience is becoming increasingly widespread. In Malaysia the Cyberjaya University College of Medical Sciences is offering a degree in homeopathy, apparently approved by the Malaysian Qualifications Agency. Meanwhile, the British Health Secretary is a true believer in homeopathy. No doubt the Ministry of Defence will be appointing generals who believe in complementary warfare like archery -- after all, it worked wonders at Agincourt.