Sunday, August 19, 2012
The Shanghai Rankings 2
The Shanghai Rankings get more interesting when we look at the individual indicators. Here are the 2012 top five for the Alumni indicator, which counts alumni who have won Nobel Prizes and Fields Medals.
1. Harvard
2. Cambridge
3. MIT
4. Berkeley
5. Columbia
In the top fifty for this indicator are the École Normale Supérieure, Moscow State University, the Technical University of Munich, Göttingen, Strasbourg and the City College of the City University of New York.
Essentially, this indicator allows universities that have seen better decades to gain a few points from an academic excellence that has long been in decline. City College of New York is an especially obvious victim of politics and bureaucracy.
The top five in the Awards indicator, faculty who have won Nobel prizes and Fields medals, are:
1. Harvard
2. Cambridge
3. Princeton
4. Chicago
5. MIT
The top fifty includes the Universities of Buenos Aires, Heidelberg, Paris Dauphine, Bonn, Munich and Freiburg. Again, this indicator may be a pale reflection of past glory rather than a sign of future accomplishments.
Saturday, August 18, 2012
The Shanghai Rankings 1
The 2012 edition of Shanghai Jiao Tong University's Academic Ranking of World Universities has been published. Here are the top ten, which are the same as last year's top ten.
1. Harvard
2. Stanford
3. MIT
4. UC Berkeley
5. Cambridge
6. Caltech
7. Princeton
8. Columbia
9. Chicago
10. Oxford
It is necessary to go down to the 19th and 20th places to find any changes. Tokyo is now 19th and University College London 20th, reversing last year's order and restoring that of 2003.
Saturday, August 11, 2012
What’s up at Wits?
The University of the Witwatersrand is in turmoil. Faculty are going on strike for higher salaries, claiming that there has been a drastic decline in quality in recent years. Evidence for this decline is the university's fall of more than a hundred places in the QS World University Rankings. The administration has argued that these rankings are not valid.
The University of the Witwatersrand is one of SA's largest and oldest academic institutions. According to its strategic planning division, at the end of last year there were about 1,300 academic staff, 2,000 administrative staff and nearly 30,000 students, with 9,000 of these being postgraduates.
There is no doubt that Wits has pockets of excellence, and many talented academics who are players on the global stage. However, this excellence is being overwhelmed and dragged down by inefficient bureaucracy in its administrative processes.
There are more administrative staff than academic staff, and as one academic said: "It is impossible to get anything done."
David Dickinson, president of the Academic Staff Association of Wits University, which has more than 700 members and is threatening to strike, said: "Between 2007 and last year, we fell more than 100 places in the QS World University Rankings. A significant problem is that the most important part of the university has been forgotten: its employees."
The university is ranked second in the country, after the University of Cape Town, but scraped into the top 400 in the world at 399th in the QS World University Rankings for last year.
The faculty are correct about the QS rankings. Between 2007 and 2011 the university fell from 283rd place to 399th. The decline was especially apparent in the employer review, from 191st to below 301st, and in international faculty, from 69th to 176th.
But there is a problem. From 2007 to 2011 Wits steadily improved on some indicators in the Shanghai rankings: from 10.9 to 11.2 for publications in Nature and Science, from 26.2 to 29.9 for publications, and from 14.8 to 16.3 for faculty productivity. The score for alumni winning Nobel prizes has declined from 23.5 to 21.2, but this was because the university's two laureate alumni were being measured against an improving score for front-runner Harvard.
So which ranking is correct? Probably both, because they refer to two different periods. The alumni who contributed to the Alumni indicator in ARWU graduated in 1982 and 2002. Publications and papers in Nature and Science could reflect the fruits of research projects that began up to a decade earlier.
The QS rankings (formerly the THE-QS rankings) are heavily weighted towards two surveys of debatable validity. The decline in Wits's employer review score from 59 points (well above the mean of 50) to 11 is remarkable and almost certainly has nothing to do with the university itself: it is more likely the result of a flooding of the survey by supporters of other institutions, leading to a massive increase in the average number of responses.
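To see how a flooded survey can sink a university whose own numbers have not changed, here is a toy standardisation in Python of the kind rankers use. It is an illustration only, not QS's published formula, and all the response counts are invented:

from statistics import mean, stdev

def scaled_score(own_responses, all_responses):
    # Standardise around a mean of 50, ten points per standard deviation.
    return 50 + 10 * (own_responses - mean(all_responses)) / stdev(all_responses)

wits = 40                                     # hypothetical response count, unchanged over time
field_2007 = [wits, 20, 30, 50, 25, 35]       # modest totals: Wits sits above the mean
field_2011 = [wits, 200, 300, 500, 250, 350]  # flooded survey: the same count now lags badly

print(round(scaled_score(wits, field_2007), 1))  # about 56: above the mean of 50
print(round(scaled_score(wits, field_2011), 1))  # about 35: well below the mean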
The decline in other scores, such as international faculty and faculty-student ratio, could be the result of short-term policy changes. However, if it is correct that research and teaching are being strangled by bureaucracy and mistaken policies, then sooner or later we should start seeing indications in the Shanghai rankings.
Sunday, August 05, 2012
Philippine Universities and the QS English Rankings
The QS subject rankings have produced quite a few surprises. Among them is the high position of several Philippine universities in the 2012 English Literature and Language ranking. In the top 100 we find Ateneo de Manila University, the University of the Philippines and De La Salle University. Ateneo de Manila in 24th place is ahead of Birmingham, Melbourne, Lancaster and University College Dublin.
How did the Philippine universities do so well? First, the subject rankings are based on different combinations of criteria. Those for English Literature and Language have a 90 per cent weighting for the academic survey conducted in 2011 and 10 per cent for the employer survey. Unlike the natural sciences, there is no weighting for citations. Essentially, then, the English ranking is a measure of reputation in that subject, and these universities were picked by a large number of survey respondents.
One feature of the QS academic survey is that respondents can choose to nominate universities globally or by region. Ateneo de Manila's performing better than Birmingham or Melbourne in this subject most probably means that it was being compared with others in Asia while the latter were assessed internationally.
Also, the category English Literature and Language is an extremely diverse one, covering scholars toiling away at a critical edition of Chaucer, post-modern cultural theorists and researchers in language education. I suspect that the high scores for Ateneo de Manila and the other universities came from dozens of postgraduate TESOL students in the US and Australia. It would be a good idea for QS to have separate rankings for English literature and English language education.
As usual, university administrators seem to be somewhat confused about the rankings. The Dean of the Faculty of Arts and Letters at the University of Santo Tomas is reported as saying:
The University, he pointed out, did not get any request for data from QS, the London consultancy that comes out with annual university rankings:
“With due respect to the QS, I think we should also know how the data is being collected, because as far as we are concerned, we are the academic unit taking care of arts and humanities and philosophy and literature,” he told the Varsitarian.
The QS survey may have been perception-based, and data gathering could have relied on what’s available on the Internet, Vasco added. “The question is, how do they source the data? Do they simply get it from the general information known about the University? Do they simply get it from the website? What if the website is not updated? What information will you get there?” he asked.
Vasco also said it would be difficult to compete in other clusters of the Arts and Humanities category of the QS subject rankings, namely Philosophy, Modern Languages, Geography, History, and Linguistics.
“[We] do not offer the same breadth of programs being surveyed under the arts and humanities cluster in the QS survey,” Vasco said.
The growing number of participants in the QS survey has contributed to the general decline of Philippine schools in various QS rankings, the Artlets dean noted. “More and more international universities from highly industrialized countries are participating, like universities from Europe, North America, and even Asia-Pacific,” he said. “Chances are, Philippine schools will slide down to lower rankings.”
For once, QS is being unfairly treated. The methodology of the subject rankings is explained quite clearly here.
Friday, August 03, 2012
QS Stars
University World News (UWN) has published an article by David Jobbins about QS Stars, which are awarded to universities that pay (most of them, anyway) for an audit and a three-year licence to use the stars, and which are shown alongside the listings in the QS World University Rankings. Participation is not spread evenly around the world: according to a QS brochure, it is mainly mediocre universities, or worse, that have signed up. Nearly half of the universities that have opted for the stars are from Indonesia.
Jobbins refers to a report in Private Eye which in turn refers to the Irish Examiner. He writes:
The stars appear seamlessly alongside the listing for each university on the World University Rankings, despite protestations from QS that the two are totally separate operations.
The UK magazine Private Eye reported in its current issue that two Irish universities – the University of Limerick and University College Cork, UCC – had paid “tens of thousands” of euro for their stars.
The magazine recorded that UCC had told the Irish Examiner that the €22,000 (US$26,600) cost of obtaining the stars was worthwhile, as it could be recouped through additional international student recruitment.
The total cost for the audit and a three-year licence is US$30,400, according to the scheme prospectus.
The Irish Examiner article by Neil Murray is quite revealing about the motivation for signing up for an audit:
UCC paid almost €22,000 for its evaluation, which includes a €7,035 audit fee and three annual licence fees of €4,893. It was awarded five-star status, which it can use for marketing purposes for the next three years.
The audit involved a visit to the college by QS researchers but is mostly based on analysis of data provided by UCC on eight criteria. The university’s five-star rating is largely down to top marks for research, infrastructure, internationalisation, innovation, and life science, but it got just three stars for teaching and engagement.
So now we know how much a single international student adds to the revenue of an Irish university.
About 3,000 international students from more than 100 countries earn UCC approximately €19 million a year.
UCC vice-president for external affairs Trevor Holmes said there are plans to raise the proportion of international students from 13% — one of the highest of any Irish college — to 20%.
"Should UCC’s participation in QS Stars result in attracting a single additional, full-time international student to study at UCC then the costs of participation are covered," he said.
"In recent times, unlike many other Irish universities, UCC has not been in a position to spend significant sums on marketing and advertising domestically or internationally. QS Stars represents a very cost-effective approach of increasing our profile in international media and online."
So far, there is nothing really new here. The QS Stars system has been well publicised and it probably was a factor in Times Higher Education dropping QS as its data collecting partner and replacing them with Thomson Reuters.
What is interesting about the UWN article is that a number of British and American universities have been given the stars without paying anything. These include Oxford and Cambridge and 12 leading American institutions that are described by QS as "independently audited based on publicly available information". It would be interesting to know whether the universities gave permission to QS to award them stars in the rankings. Also, why are there differences between the latest rankings and the QS brochure? Oxford does not have any stars in last year's rankings but is on the list in the brochure. Boston University has stars but is not on the list. It may be just a matter of updating.
It would probably be a good idea for QS to remove the stars from the rankings and keep them in the university profiles.
Monday, July 30, 2012
New International Ranking
A new ranking has appeared, the CWUR World Universities Rankings published by the Center for World University Ranking in Jeddah, Saudi Arabia. The top ten are:
1. Harvard
2. MIT
3. Stanford
4. Cambridge
5. Caltech
6. Princeton
7. Oxford
8. Yale
9. Columbia
10. UC Berkeley
The criteria, with their weightings, are as follows (a rough sketch of how a weighted composite might be built appears after the list):
- Quality of Faculty. This is based on full-time faculty members who have won a variety of awards, including the Nobel Prize and the Fields Medal, as in the Shanghai rankings, and others such as the Abel, Templeton and World Food Prizes. Weighting of 4.
- Quality of Research: Publications in top journals. For science and the social sciences, top journals are those included in the ISI Journal Citation Reports, weighted according to the Article Influence Score (AIS). For the humanities, the list of journals is compiled from the INT1 set of prestigious international journals published by the European Reference Index for the Humanities. Weighting of 1.
- Quality of Research: Highly influential research. This is based on the number of publications in journals multiplied by the journals' AIS. Weighting of 1.
- Quality of Research: Citations. This includes citations from journals in science, the social sciences and the arts and humanities. Weighting of 1.
- Quality of Research: Patents. Weighting of 1.
- Alumni who have won awards -- listed under Quality of Faculty -- relative to the institution's size, which is determined by current enrollment. Weighting of 4.
- Number of alumni who are heads of companies in the Forbes Global 2000 list. Weighting of 4.
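Since CWUR publishes only rank orders for each indicator, a composite presumably has to be built from those ranks. Here is a minimal sketch of one way a weighted combination could work, using the weights above; the indicator names and example ranks are invented, and this is not CWUR's published formula:

WEIGHTS = {
    "faculty_awards": 4,        # Quality of Faculty
    "top_journal_pubs": 1,      # Quality of Research: top journals
    "influential_research": 1,  # Quality of Research: highly influential research
    "citations": 1,
    "patents": 1,
    "alumni_awards": 4,
    "alumni_ceos": 4,
}

def composite(ranks):
    # Weighted sum of per-indicator ranks: lower is better.
    return sum(WEIGHTS[name] * rank for name, rank in ranks.items())

# Hypothetical per-indicator ranks for two universities:
a = {"faculty_awards": 1, "top_journal_pubs": 1, "influential_research": 1,
     "citations": 1, "patents": 2, "alumni_awards": 1, "alumni_ceos": 1}
b = {"faculty_awards": 3, "top_journal_pubs": 2, "influential_research": 2,
     "citations": 2, "patents": 1, "alumni_awards": 2, "alumni_ceos": 3}

print(composite(a), composite(b))  # 17 and 39: university a comes out on top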
The top 100 includes several medical schools and graduate-only and specialised institutions like Rockefeller University and UC San Francisco. There are five Japanese universities and one Korean, but none from Hong Kong or mainland China. An indication of the ranking's objectivity is that Israeli schools do well, despite the ranking being compiled in Saudi Arabia.
It is disappointing that the new rankings include only 100 institutions. Also, they do not give scores but only rank order for the various indicators, something that will make it difficult to track performance if further editions appear.
If the CWUR rankings had appeared in 2003 at the same time as the Shanghai rankings they would have been judged to be more comprehensive and valid. But, after nine years Shanghai is the market leader for research-based rankings and catching up will be a difficult task.
Friday, July 27, 2012
If you want to be a millionaire, go to...
Skandia Millionaire Monitor has conducted a survey of millionaires in several countries. British millionaires were asked which university they had attended. The top five were:
1. London
2. Oxford
3. Cambridge
4. Leeds
5. Manchester
Something interesting is that at every university except St. Andrews, including Oxford and Cambridge, state-school-educated millionaires outnumbered those with a private education.
Thursday, July 12, 2012
UI GreenMetric World University Rankings
Universitas Indonesia has been asking universities to take part in a ranking based on "university sustainability." According to UI:
"The world faces unprecedented civilizational challenges such as population trends, global warming, and overexploitation of natural resources, oil-dependent energy, water and food shortages and sustainability. We realize that higher education has a crucial role to play in addressing these challenges. UI Green Metric raises awareness as it helps assess and compare efforts at education for sustainable development, sustainability research, campus greening, and social outreach."
The ranking has six criteria: Setting and Infrastructure, Energy and Climate Change, Waste, Water, Transportation and Education.
The ranking is based entirely on data submitted by universities, and that in itself drastically limits its validity. Also, should the promotion of sustainability, however worthy a cause, be a major concern of universities? Is there nobody else taking an interest in such things?
Monday, July 09, 2012
The QS Subject Rankings
QS has produced rankings of universities by subject. These seem to be quite popular, probably because the methodology and weighting vary from one subject to another, so that almost everybody can score well in something.
Outside the top forty or fifty in each subject, however, they should not be taken too seriously. They depend on only two or three criteria in varying combinations: the academic survey, the employer survey and citations per paper.
So, citations per paper contribute 50% of the weighting for biology and earth sciences but nothing for English and 10% for philosophy and sociology. A high score for biology could be the result of a large number of citations, indicating -- perhaps -- a substantial research impact. A high score for English (language and literature) is largely due to the survey of academic opinion, a rather dubious instrument.
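To make the effect concrete, here is a small sketch of how the same raw indicator scores produce very different subject scores under different weightings. The citation weights are the ones quoted above; the survey splits for biology and philosophy are my assumptions, and the scores are invented:

SUBJECT_WEIGHTS = {
    "biology":    {"academic": 0.40, "employer": 0.10, "citations": 0.50},
    "english":    {"academic": 0.90, "employer": 0.10, "citations": 0.00},
    "philosophy": {"academic": 0.80, "employer": 0.10, "citations": 0.10},
}

def subject_score(indicators, subject):
    weights = SUBJECT_WEIGHTS[subject]
    return sum(weights[name] * indicators[name] for name in weights)

# A university with middling survey results but heavily cited papers:
uni = {"academic": 60.0, "employer": 55.0, "citations": 95.0}
for subject in SUBJECT_WEIGHTS:
    print(subject, round(subject_score(uni, subject), 1))
# biology 77.0, english 59.5, philosophy 63.0: strong citations only
# help where they carry weight.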
Anyway, MIT is first for these subjects:
Linguistics
Computer Science
Chemical Engineering
Civil Engineering
Electrical Engineering
Mechanical Engineering
Economics and Econometrics
Physics and Astronomy
Mathematics
Chemistry
Materials Science
Harvard for these:
Modern Languages
Medicine
Psychology
Pharmacy and Pharmacology
Earth and Marine Sciences
Politics and International Studies
Law
Sociology
Education
Oxford for these:
Philosophy
Geography
History
Stanford for these:
Environmental Sciences
Statistics and Operational Research
Communication and Media Studies
and Cambridge for:
English Literature and Language.
As we get to the lower reaches of these rankings, the number of responses to the surveys or the number of citations gets smaller, so that trivial changes in the number of citations will lead to large movements in the table.
El Naschie vs. Nature
The journal Nature has been totally vindicated. Mrs Justice Victoria Sharp has dismissed El Naschie's claims. I would be very surprised if there has ever been a more unambiguous judgement.
To review the case, at the end of 2008 Nature published an article, Self-publishing editor set to retire, which described how Mohamed El Naschie, the editor of the applied mathematics/theoretical physics journal, Chaos, Solitons and Fractals, had published an unusually large number of his own papers, which were of poor quality, without proper peer review. Furthermore, the journal had acquired a falsely high impact factor through self-citation and citation by a limited number of friends and disciples.
El Naschie sued Nature and author Quirin Schiermeier for libel. Now, Mrs Justice Sharp has found for the defendants.
The case is of interest to this blog since it was the citation of El Naschie's papers by himself and a few associates that contributed to Alexandria University's reaching fourth place for research impact in the 2010 Times Higher Education (THE) World University Rankings, powered by Thomson Reuters (TR). El Naschie did not, of course, do it all by himself. TR's methodology inflated his citations because they were recent, because they were assigned to a low-citing field, applied maths, and because he was affiliated to a university in a low-citing region. Since then TR has tweaked its citation indicator to avoid the repetition of such a strange result.
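The mechanism is easy to illustrate. In field-and-year normalised impact measures (the general approach behind TR's indicator; the exact formula is theirs, and the baselines below are invented), each paper's citations are divided by the world average for papers of the same field and year. A recent paper in a low-citing field has a tiny baseline, so a modest, self-generated citation count produces a spectacular ratio:

# Invented world baselines: expected citations per paper by (field, year).
WORLD_BASELINE = {
    ("applied_maths", 2009): 1.5,       # recent paper, low-citing field
    ("molecular_biology", 2004): 40.0,  # older paper, high-citing field
}

def normalised_impact(papers):
    # papers: list of (field, year, citations); mean ratio to the baseline.
    ratios = [cites / WORLD_BASELINE[(field, year)] for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# Thirty citations, easily arranged among friends, to a 2009 applied-maths
# paper count for far more than sixty to a 2004 molecular biology paper:
print(normalised_impact([("applied_maths", 2009, 30)]))      # 20.0
print(normalised_impact([("molecular_biology", 2004, 60)]))  # 1.5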
This is a victory for academic freedom although one wonders what would have happened if El Naschie had chosen a critic with a less substantial bank account.
Here are some comments. First place goes to El Naschie Watch, which has been following the affairs of El Naschie for some time.
El Naschie Watch
Nature
BBC News
New Scientist
Guardian
Times Higher Education
Friday, July 06, 2012
Power and Responsibility: The Growing Influence of Global Rankings
My article can be accessed at University World News. Comments can be submitted here.
Sunday, June 24, 2012
Productive Universities
QS has been analysing university research output using Scopus data. The world's most productive university, measured by the number of papers, is Toronto. The top ten includes Harvard and five more US institutions, University College London, Sao Paulo and Tokyo.
Harvard is first for total citations and Rockefeller, a specialist medical school, for citations per paper.
It seems that the presence or absence of a medical school makes a lot of difference to performance measures based on total publications or citations. In general, there are substantial differences between disciplines, with medicine and the humanities at opposite ends of the spectrum. The performance of schools like Toronto may to some extent reflect their balance of disciplines.
Times Higher and Thomson Reuters would say that the answer to this problem lies in normalisation. But that raises another question, namely whether all disciplines can be considered equal.
Friday, June 22, 2012
Boring is Good
QS have produced the second instalment of their "Latin University Rankings" (i.e. Latin American University rankings). This time there have been few changes. The top seven are the same as last year. According to QS, "the familiar look of the top ten in 2012 QS University Rankings: Latin America is evidence that last year’s inaugural exercise provided a fair and accurate overview of the current hierarchy of the region’s universities".
True, but does that mean that other rankings were invalidated by noticeable instability?
Here are the top ten:
1. Universidade de Sao Paulo, Brazil
2. Pontificia Universidad Catolica de Chile
3. Universidade Estadual de Campinas, Brazil
4. Universidad de Chile
5. Universidad Nacional Autonoma de Mexico
6. Universidad de Los Andes, Colombia
7. Tecnologico de Monterrey, Mexico
8. Universidade Federal do Rio de Janeiro, Brazil
9. Universidad de Concepcion, Chile
10. Universidad de Santiago de Chile
Wednesday, June 20, 2012
The Complete University Guide
David Jobbins has drawn my attention to the online British Complete University Guide. This includes the 2013 League Table with the top five being:
1. Cambridge
2. LSE
3. Oxford
4. Imperial College
5. Durham
At the bottom we have Southampton Solent, West of Scotland, London Metropolitan, East London and Bolton.
The criteria are entry standards, student satisfaction, research assessment and graduate prospects.
Sunday, June 17, 2012
A little bit of sex but not too much, we're British university students
Student Beans has published a British university sex league, which consists of the results of a survey of the number of sex partners since starting university. At the top is Bangor University with an average of 8.31, followed by Heriot-Watt and Plymouth. This is probably a result of savage cuts which have curtailed library hours and left students with nothing else to do.
At the bottom are Roehampton (1.83), Chester (1.71) and Exeter (1.15). There seems to be no obvious explanation for such a broad variation. Comparing scores with those in the QS rankings produced only a trivial and insignificant correlation.
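For anyone who wants to repeat the check, a Pearson correlation is enough; the QS scores below are placeholders, not the real figures:

from scipy.stats import pearsonr

# Reported mean partner counts and hypothetical QS overall scores.
partners = [8.31, 1.83, 1.71, 1.15]  # Bangor, Roehampton, Chester, Exeter
qs_score = [45.0, 38.0, 33.0, 72.0]  # placeholder values for illustration

r, p = pearsonr(partners, qs_score)
print(f"r = {r:.2f}, p = {p:.2f}")  # a small r with a large p: trivial and insignificant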
The methodology does not look very sound: there is no sign of any proper sampling or precautions against multiple responses.
The most sexually active students are in economics, social work, marketing and leisure. No surprises there. The least are in education (that's a relief), earth sciences, theology and, valiantly trying to slow the pace of global warming, environmental science.
Tuesday, June 12, 2012
The Uses of Rankings
The Indian Universities Grants Commission has laid down new regulations "to ensure academic collaboration between Indian and foreign educational institutes follows the highest standards".
The foreign institutions allowed to collaborate with Indian universities and colleges "must figure in list of top 500 global educational institutes, as ranked by the Times Higher Education Rankings or the Shanghai Rankings".
This sounds a little odd. The Times Higher Education World University Rankings only list 400 universities on their iPhone app. Perhaps they will provide the Indian authorities with the remaining 100.
Another problem is that the Shanghai and THE rankings, especially the latter, are not totally stable. So what happens if a university enters the top 500 before a contract is signed and then slips out the year after?
Thursday, May 31, 2012
The THE New University Rankings
Times Higher Education have produced their ranking of 100 universities founded in the last fifty years. Here are the top ten:
1. Pohang University of Science and Technology
2. École Polytechnique Fédérale de Lausanne
3. Hong Kong University of Science and Technology
4. University of California, Irvine
5. Korea Advanced Institute of Science and Technology
6. Université Pierre et Marie Curie
7. University of California, Santa Cruz
8. University of York
9. Lancaster University
10. University of East Anglia
The list looks rather different from the QS new university ranking published two days ago. That is unsurprising since the QS table is heavily weighted towards two reputation surveys while the THE rankings are influenced by various measures of income and by normalised citations.
Wednesday, May 30, 2012
QS Asian University Rankings
QS have just published their 2012 Asian University Rankings. I will comment in a bit more detail later.
The top ten are:
1. The Hong Kong University of Science and Technology
2. National University of Singapore
3. University of Hong Kong
4. Seoul National University
5. Chinese University of Hong Kong
6. Peking University
7. Korea Advanced Institute of Science and Technology
8. University of Tokyo
9. Pohang University of Science and Technology
10. Kyoto University
Tuesday, May 29, 2012
The QS Under 50 Top 50
Early this month, Times Higher Education announced that they would publish a ranking of the top 100 universities less than 50 years old. The date for publication was May 31.
Now QS have just announced their ranking of new universities. The top ten are:
1. Chinese University of Hong Kong
2. Hong Kong University of Science and Technology
3. Warwick
4. Nanyang Technological University, Singapore
5. Korea Advanced Institute of Science and Technology
6. University of York, UK
7. Pohang University of Science and Technology
8. Maastricht University
9. City University of Hong Kong
10. University of California Irvine
East Asia, especially Hong Kong and Korea, makes a strong showing, although there are no mainland Chinese universities in the top 50.
No doubt there will be quiet smirks around the QS offices. And no doubt THE will say something about originality on Thursday.
Saturday, May 19, 2012
Exaggerated Metaphor Alert
Bloomberg has an interesting article by Mark C. Taylor, claiming that competition is killing higher education in the US. I think that, like the Roman soldier in Night at the Museum, he is speaking metaphorically.
There are some amusing points about the craze for buildings and programs of every conceivable variety.
"It’s about “keeping up with the Joneses,” an official at Wright State University said in a Dayton Daily News article last fall detailing why colleges in Ohio were spending hundreds of millions of dollars on student centers and other nonacademic attractions in a down economy. In Georgia, state legislators arereviewing questionable practices used to fund 173 projects to build student housing, parking garages, stadiums and recreation centers.
Private universities with large endowments often start the cycle. Schools such as Harvard University and New York University, for example, take on billion-dollar debts. In a trickle-down effect, less affluent schools also feel pressure to borrow and spend -- money they do not have."
Then he describes how some schools have been gaming the rankings by reclassifying tutorials in order to decrease class size or by creating superfluous and expensive doctoral programs.
'Second- and third-tier universities often create unneeded doctoral programs to become eligible for additional federal support and to increase their global profile. For example, the University of North Texas has 36,000 students and advertises itself as “a student-focused public research university” offering “97 bachelor’s, 82 master’s and 35 doctoral degree programs.”
Even this is not enough. Although severe budget shortfalls have led to cuts of as much as 90 percent for some programs, the university is adding new doctoral programs in a quest for the elusive top-tier status. This makes no educational sense and violates basic market principles. If successful, the University of North Texas will join too many other schools that are spending large amounts for unneeded programs that turn out products -- doctoral graduates -- for which the supply far outweighs the demand. This is a national issue, as pointed out in an article this month in the Chronicle of Higher Education titled “The Ph.D. Now Comes With Food Stamps.” '
Sunday, May 13, 2012
Ranking Countries
Universitas 21, a global alliance of research intensive universities, has produced a ranking of higher education systems. The US is top but the UK performs less well in tenth place. Muslim countries do particularly badly.
What might be interesting would be to compare resources with output and produce an index of efficiency.
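A crude version of that efficiency index would just divide each country's output score by its resources score. A sketch, with invented figures rather than anything from the U21 report:

# Hypothetical country scores on U21-style 0-100 scales.
systems = {
    "Country A": {"resources": 90.0, "output": 85.0},
    "Country B": {"resources": 45.0, "output": 60.0},
}

for name, scores in systems.items():
    efficiency = scores["output"] / scores["resources"]
    print(f"{name}: efficiency index = {efficiency:.2f}")
# Country A: 0.94, Country B: 1.33 -- the better-resourced system is not
# necessarily the more efficient one.

Here is the report's own summary: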
"A nation’s economic development depends crucially on the presence of an educated and skilled workforce and on technological improvements that raise productivity. The higher education sector contributes to both these needs: it educates and trains; it undertakes pure and applied research. Furthermore, in a globalised world, a quality higher education system that is well-connected internationally facilitates the introduction of new ideas, and fosters trade and other links with foreign countries, through the movement of students and researchers across national frontiers.
Given the importance of higher education, a nation needs a comprehensive set of indicators in order to evaluate the quality and worth of its higher education system. A good higher education system is well-resourced and operates in a favourable regulatory environment. Domestic and international connectivity are also important. The success of the system is measured by output variables such as research performance, participation rates and employment. We use such indicators to derive a ranking of national higher education systems. The measures are grouped under four main headings: Resources, Environment, Connectivity and Output.
The resource measures we use relate to government expenditure, total expenditure, and R&D expenditure in tertiary institutions. The environment variable comprises the gender balance in students and academic staff, a data quality variable and a quantitative index of the policy and regulatory environment based on survey results. We surveyed the following attributes of national systems of higher education: degree of monitoring (and its transparency), freedom of employment conditions and in the choice of the CEO, and diversity of funding. Our survey results are combined with those from the World Economic Forum. Data limitations restrict the connectivity variables to numbers of international students and articles written jointly with international collaborators.
Nine output measures are included and cover research output and its impact, the presence of world-class universities, participation rates and the qualifications of the workforce. The appropriateness of training is measured by relative unemployment rates. The measures are constructed for 48 countries and territories at various stages of development.
The top ten countries, in rank order, are the United States, Sweden, Canada, Finland, Denmark, Switzerland, Norway, Australia, the Netherlands and the United Kingdom. "
Monday, May 07, 2012
And now for something a little bit different
Times Higher Education has announced that it will publish a ranking of new universities (less than fifty years old).
The Times Higher Education 100 Under 50 will – as its name suggests – rank the world’s top 100 universities under the age of 50. The table and analysis will be published online and as a special supplement to the magazine on 31 May, 2012.
The vast majority of the world’s top research-led universities have at least one thing in common: they are old. Building upon centuries of scholarly tradition, institutions such as the University of Oxford, which can trace its origins back to 1096, can draw on endowment income generated over many years and have been able to cultivate rich networks of loyal and successful alumni (including in Oxford’s case a string of British Prime Ministers) to help build enduring brands.
Deja Vu All Over Again
Malaysia's love-hate affair with international rankings has taken another twist. The official target now is to get one university into the top 50 and three into the top 100 in the QS rankings. That basically means that a Malaysian university will have to be the equal of New South Wales, Tsinghua or Warwick.
Last year Universiti Malaya got into the top 500 in the Shanghai ARWU ranking. That is a solid achievement and it might mean more if Malaysia could get another university there.
This is part of its efforts to have a local university ranked among the world's top 50 universities by 2020.
Deputy Higher Education Minister Datuk Saifuddin Abdullah said the National Higher Education Strategic Plan also called for at least three local universities to be ranked among the world's top 100 universities.
To achieve this, he told the house that it needed to continuously recruit international students and participate in international education fairs to promote the "Education Malaysia" brand.
He was replying to Senator Mohd Khalid Ahmad who wanted to know why no local universities had been ranked among the world's top 200.
Saifuddin said the ministry was also intensifying promotional activities on the Internet and introducing student mobility programmes. This will allow them to take short-term courses with credits, and have better staff and student exchange programmes with foreign universities.
He said they were also having better scholarship coordination with foreign agencies and other bodies to facilitate the intake of foreign students at local universities.
He said the QS World University Ranking (QS WUR) was the preferred benchmark used to gauge a university.
What have I done?
I was recently in a public library somewhere in Southeast Asia. While browsing around I discovered that access to this blog was blocked because of "other adult material".
I thought that perhaps someone was upset with university rankings in general, which is entirely understandable, but the THE, QS, ARWU, Webometrics and HEEACT sites were all unblocked.
My best guess is that the filter software interprets anything with "watch" in it as something to do with voyeurism. Or perhaps the post about "does size really matter?" was misunderstood.
Or perhaps it just means that this blog is very mature and sophisticated.