Discussion and analysis of international university rankings and topics related to the quality of higher education. Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes2000@yahoo.com
Sunday, March 27, 2011
Say It Loud
Phil Baty has an article in THE, based on a speech in Hong Kong, entitled 'Say it loud: I'm a ranker and I'm proud'. Very interesting, but personally I prefer the James Brown version.
Saturday, March 26, 2011
Growth of Academic Publications: Southwest Asia, 2009-2010
One of several surprises in last year's THE rankings was the absence of any Israeli university from the Top 200. QS had three and, as noted earlier on this blog, over a fifth of Israeli universities were in the Shanghai 500, a higher proportion than any other country. It seems that in the case of at least two universities, Tel Aviv and the Hebrew University of Jerusalem, there was a failure of communication that meant that data was not submitted to Thomson Reuters, who collect and analyse data for THE.
The high quality of Israeli universities might seem rather surprising since Israeli secondary school students perform poorly on international tests of scholastic attainment and the national average IQ is mediocre. It could be that part of the reason for the strong Israeli academic performance is the Psychometric Entrance Test for university admission that measures quantitative and verbal reasoning and also includes an English test. The contrast with the trend in the US, Europe and elsewhere towards holistic assessment, credit for leadership, community involvement, overcoming adversity and being from the right post code area is striking.
Even so, Israeli scientific supremacy in the Middle East is looking precarious. Already the annual production of academic papers in Israel has been exceeded by Iran and Turkey.
Meanwhile, the total number of papers produced in Israel is shrinking while that of Iran and Turkey continues to grow at a respectable rate. The fastest growth in Southwest Asia comes from Saudi Arabia and the smaller Gulf states of Qatar and Bahrain.
Countries ranked by percentage increase in publications in the ISI Science, Social Science and Arts and Humanities indexes and Conference Proceedings between 2009 and 2010. (total 2010 publications in brackets)
1. Saudi Arabia 35% (3924)
2. Qatar 31% (453)
3. Syria 14% (333)
4. Bahrain 13% (184)
5. Palestine 9% (24)
6. UAE 6% (303)
7. Turkey 5% (26835)
8. Lebanon 4% (2058)
9. Iran 4% (21047)
10. Oman 4% (494)
11. Jordan 1% (1637)
12. Iraq -3% (333)
13. Israel -4% (17719)
14. Yemen -8% (125)
15. Kuwait -13% (759)
(data collected 23/3/11)
This of course may not say very much about the quality of research. A glance at the ISI list of highly cited researchers shows that Israel is ahead, for the moment at least, with 50 compared to 29 for Saudi Arabia and one each for Turkey and Iran.
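The ranking above is simple arithmetic: percentage change in publication counts between two years, sorted in descending order. A minimal sketch, using a handful of the figures quoted above (the 2009 counts here are back-calculated from the 2010 totals and the rounded growth percentages, so they are approximations, not the original ISI data):

```python
# Rank countries by year-on-year growth in publication counts.
data = {
    # country: (publications_2010, publications_2009)
    "Saudi Arabia": (3924, 2907),
    "Qatar": (453, 346),
    "Turkey": (26835, 25557),
    "Iran": (21047, 20238),
    "Israel": (17719, 18457),
}

def pct_change(new, old):
    """Percentage change from old to new, rounded to the nearest whole percent."""
    return round(100 * (new - old) / old)

# Sort descending by growth rate, as in the table above.
ranked = sorted(data.items(), key=lambda kv: pct_change(*kv[1]), reverse=True)

for i, (country, (new, old)) in enumerate(ranked, start=1):
    print(f"{i}. {country} {pct_change(new, old):+d}% ({new})")
```

Run on these five countries, the sketch reproduces the ordering in the table: Saudi Arabia first at 35%, Israel last at -4%.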
Thursday, March 24, 2011
Growth in Academic Publications: Southeast Asia 2009-2010
Countries ranked by percentage increase in publications in the ISI Science, Social Science and Arts and Humanities indexes and Conference Proceedings between 2009 and 2010. (total 2010 publications in brackets)
1. Malaysia 31% (8603)
2. Laos 30% (96)
3. Indonesia 30% (1631)
4. Brunei 16% (88)
5. Papua New Guinea 5% (67)
6. Vietnam 5% (1247)
7. Singapore 4% (11900)
8. Thailand 2% (2248)
9. Timor Leste 0% (4)
10. Cambodia -5% (158)
11. Myanmar -12% (78)
Singapore is still the dominant research power in Southeast Asia but Malaysia and Indonesia, admittedly with much larger populations, are closing fast. Thailand is growing very slowly and Myanmar is shrinking.
(data collected 23/3/11)
Tuesday, March 22, 2011
Comparing Rankings 3: Omissions
The big problem with the Asiaweek rankings of 1999-2000 was that they relied on data submitted by universities. This meant that if enough were dissatisfied they could effectively sabotage the rankings by withholding information, which is in fact what happened.
The THES-QS rankings, and since 2010 the QS rankings, avoided this problem by ranking universities whether they liked it or not. Nonetheless, there were a few omissions in the early years: Lancaster, Essex, Royal Holloway University of London and the SUNY campuses at Binghamton, Buffalo and Albany.
In 2010 THE decided that they would not rank universities that did not submit data, a principled decision but one that has its dangers. Too many conscientious objectors (or maybe poor losers) and the rankings would begin to lose face validity.
When the THE rankings came out last year, there were some noticeable absentees, among them the Chinese University of Hong Kong, the University of Queensland, Tel Aviv University, the Hebrew University of Jerusalem, the University of Texas at Austin, the Catholic University of Louvain, Fudan University, Rochester, Calgary, the Indian Institutes of Technology and Science and Sciences Po Paris.
As Danny Byrne pointed out in University World News, Texas at Austin and Moscow State University were in the top 100 in the Reputation Rankings but not in the THE World University Rankings. Producing a reputation-only ranking without input from the institutions could be a smart move for THE.
Monday, March 21, 2011
QS comments on the THE Reputation Ranking
In University World News, Danny Byrne from QS comments on the new THE reputation ranking.
So why has THE decided to launch a world ranking based entirely on institutional reputation? Is it for the benefit of institutions like Moscow State University, which did not appear in THE's original top 200 but now appears 33rd in the world?
The data on which the new reputational ranking is based has been available for six months and comprised 34.5% of the world university rankings published by THE in September 2010.
But this is the first time the magazine has allowed anyone to view this data in isolation. Allowing users to access the data six months ago may have attracted less attention, but it would perhaps have been less confusing for prospective students.
The order of the universities in the reputational rankings differs from THE's overall ranking. But no new insights have been offered and nothing has changed. This plays into the hands of those who are sceptical about university rankings.
Wednesday, March 16, 2011
Worth Reading
Ellen Hazelkorn, 'Questions Abound as the College-Rankings Race Goes Global' in Chronicle of Higher Education
"It is amazing that more than two decades after U.S. News & World Report first published its special issue on "America's Best Colleges," and almost a decade since Shanghai Jiao Tong University first published the Academic Ranking of World Universities, rankings continue to dominate the attention of university leaders. Indeed, the range of people watching them now includes politicians, students, parents, businesses, and donors. Simply put, rankings have caught the imagination of the public and have insinuated their way into public discourse and almost every level of government. There are even iPhone applications to help individuals and colleges calculate their ranks.
More than 50 country-specific rankings and 10 global rankings are available today, including the European Union's new U-Multirank, due this year. What started as small-scale, nationally focused guides for students and parents has become a global business that heavily influences higher education and has repercussions well beyond academe."
Tuesday, March 15, 2011
Bright Ideas Department
This is from today's Guardian:
The coalition is considering a Soviet-style central intervention policy to effectively fine individual universities if they impose unreasonable tuition fees next year.
Vince Cable, the business secretary whose department is responsible for universities, and David Willetts, the universities minister, are looking at allowing colleges that charge a modest fee to expand and constraining those that are charging too much.
The government, through the Higher Education Funding Council, sets the grant and numbers for each university and has the power to fine a university as much as £3,000 per student if it over-recruits in a single year.
Ministers are looking at cutting funding from universities that unreasonably charge the maximum £9,000 fee from 2012-13. They admit it is likely most universities will charge well over £8,000 a year.
One minister said: "A form of dramatic centralisation is under active consideration - a form of Gosplan if you like," a reference to the Russian state planning committee set up in the 1920s.
Next bright idea? A Gulag for recalcitrant vice-chancellors? Re-education camps for those who don't take their teaching philosophy statements seriously enough?
Saturday, March 12, 2011
Going Global Hong Kong 2011
Speeches about rankings by Martin Davidson, British Council, Phil Baty, THE, John Molony, QS, and others can be seen here.
Thursday, March 10, 2011
A Bit More on the THE Reputation Rankings
There is a brief article in the Guardian with a lot of comments.
Incidentally, I don't see Alexandria, Hong Kong Baptist and Bilkent Universities in the top 100 for reputation despite the outstanding work that gave them high scores for research impact in the 2010 THE WUR. Perhaps I'm not looking hard enough.
The THE Reputation Rankings
Times Higher Education have constructed a reputation ranking from the data collected for last year's World University Rankings. There is a weighting of two thirds for research and one third for postgraduate teaching. The top five are:
1. Harvard
2. MIT
3. Cambridge
4. UC Berkeley
5. Stanford
Scores are given only for the top fifty universities. Then another fifty are sorted in bands of ten without scores. Evidently, the number of responses favouring universities outside the top 100 was so small that it was not worth listing.
This means that the THE reputational survey reveals significant differences between Harvard and MIT or between Cambridge and Oxford but it would be of no help to those wondering whether to study or work at the University of Cape Town or the University of Kwazulu-Natal or Trinity College Dublin or University College Dublin.
The scores for research reputation (top fifty for total reputation scores only) show a moderate correlation with the THE citations indicator (.422) and, perhaps surprisingly, a higher correlation with the citations per faculty score on the QS World University Rankings of 2010 (.538).
Looking at the QS academic survey, which asked only about research, we can see that there was an insignificant correlation of .213 between the QS scores and the score for citations per faculty in the QS rankings (THE reputation ranking top 50 only). However, there was a higher correlation of .422 between the QS survey and the THE citations indicator, the same as that between the THE research reputation scores and the THE citations indicator.
Comparing the two research surveys with a third party, the citations indicator in the Scimago 2010 rankings, the THE research reputation survey did better with a correlation of .438 compared to an insignificant .188 for the QS academic survey.
This seems to suggest that the THE reputational survey does a better job at differentiating between the world's elite universities. But once we leave the top 100 it is perhaps less helpful and there may still be a role for the QS rankings.
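The comparisons above are Pearson product-moment correlations between pairs of indicator scores. A minimal sketch of the calculation; the ten score pairs below are invented for illustration and are not the actual THE or QS figures:

```python
import math

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical reputation and citation scores for ten universities.
reputation = [100, 91, 80, 75, 70, 62, 55, 50, 48, 40]
citations  = [ 98, 85, 88, 70, 72, 60, 58, 45, 50, 42]

print(round(pearson(reputation, citations), 3))
```

A value near 1 indicates the two indicators rank institutions similarly; values like the .188 and .213 quoted above indicate essentially no agreement.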
Wednesday, March 09, 2011
The Second Wave
It seems that another wave of rankings is coming. The new edition of America's best graduate schools will be out soon, QS will be releasing detailed subject rankings and, according to Bloomberg Businessweek, THE's ranking by reputation is imminent. When reputation alone is considered, the academic Anglosphere appears to dominate.
Tuesday, March 08, 2011
Comment on the Paris Ranking
Ben Wildavsky in the Chronicle of Higher Education says:
The Mines ParisTech ranking is an explicitly chauvinistic exercise, born of French unhappiness with the dismal showing of its universities in influential surveys such as the Academic Ranking of World Universities created at Shanghai Jiao Tong University in 2003. When designing the Mines ParisTech ranking, with a view to influencing the architects of the Shanghai methodology, the college says in the FAQ section of its survey results, “we believed it was useful to highlight the good results of French institutions at a time when the Shanghai ranking was widely and is still widely discussed, and not always to the advantage of our own schools and universities.” What’s more, it goes on, “these results constitute a genuine communication tool at an international level, both for the recruitment of foreign students as well as among foreign companies which are not always very familiar with our education system.” Given the genesis of the ranking, it doesn’t seem too surprising that three French institutions made it into this year’s top 10 — École Polytechnique and École Nationale d’Administration joined HEC Paris — while Mines ParisTech itself placed 21st in the world.
Sunday, March 06, 2011
The Paris rankings
The fifth edition of the Professional Ranking of World Universities from Mines ParisTech has just been published. This is based on a single indicator: the number of alumni among the CEOs of the world's largest companies. Here are the top ten:
1. Harvard
2. Tokyo
3. Keio
4. HEC, France
5. Kyoto
5. Oxford
7. Ecole Polytechnique
8. Waseda
9. ENA
10. Seoul National University