Tuesday, August 28, 2012

An International Student Bubble?

Something of a mania for international students seems to be developing. In a single issue of University World News there are stories from Canada, China and Poland about plans to recruit students from abroad.


A new report urging Canadian universities to nearly double international student enrolment by 2022 signals a fundamental policy change in Canada.

The report, released last week, recommends that Canada increase the number of foreign students from 240,000 in 2011 to 450,000 by 2022.

The government-appointed panel led by Amit Chakma, president and vice-chancellor of the University of Western Ontario, also laid out a blueprint for how the federal government ought to support universities in their recruitment efforts.

From China

China has been wooing foreign universities and foreign students in a bid to internationalise its universities and as part of a ‘soft power’ policy to project itself internationally.

“China wants to be seen as a major player internationally in terms of education,” said Anthony Welch, a professor of international education at the University of Sydney.

“There is a clear national policy in China of ‘soft power’ using education. I would argue that is a good thing for all partners,” said Yang Rui, an assistant professor in Hong Kong University’s faculty of education.

The article by Yojana Sharma also refers to efforts by universities and governments in Malaysia and Singapore to recruit more students from abroad.

From Poland

Polish universities have introduced a free iPhone and iPad app to spread information internationally about opportunities in Polish higher education, and an Android version is promised soon.

The use of the latest technology will move the promotion of Polish higher education to a completely new level, according to a Polish Press Agency report quoting Dr Wojciech Marchwica of the Perspektywy Educational Foundation (Fundacja Edukacyjna Perspektywy), coordinator of the Study in Poland programme.
The universities are hoping to attract high-quality students from Ukraine, Russia, Belarus and Kazakhstan.

A few years ago, Ukraine was declared by the Study in Poland coordinating committee to be a priority source country, as it is tied to Poland by history, culture and geographical proximity.

The effort has already brought measurable results: the number of students from Ukraine grew from 1,989 in 2005 to 6,321 in 2012, more than tripling. In 2009 Study in Poland opened its first foreign office in Kyiv, at the Kyiv Polytechnic Institute.

China, Canada, Poland, Singapore and Malaysia are not the only places competing for more international students.

So why is there such a craze for moving students back and forth across international borders?

One reason for adding more international students is that it is probably the easiest way to rise in the rankings (excluding ARWU) and the one with the quickest returns for universities outside the top 200 or 300. Getting faculty to do research and write papers is not always popular and may produce a backlash, especially if senior staff have political connections. Writing papers that are readable and citeable is even more difficult. Recruiting faculty to boost faculty-student ratios can be expensive and may have an adverse impact on other indicators. The QS surveys are rather opaque and the THE citations indicator painfully complicated. But finding students who can cross a frontier to get a degree is comparatively easy and may even pay for itself. For University College Cork, just one international student would cover the cost of joining QS Stars.

There are other reasons. Canada appears to be hunting for students from abroad as a proxy for a meritocratic immigration policy. The problem here is that those talented engineers and computer scientists may be followed by not so talented spouses, siblings and cousins. China appears to be using universities to further diplomatic objectives, and Poland seems to be trying to challenge Russian cultural hegemony.


Comment on the QS Subject Rankings

An enjoyable, although perhaps a little intemperate, comment on the QS philosophy rankings from the Leiter Reports:

Several readers sent this item, the latest worthless misinformation from the "world universities" ranking industry, in which "QS" (which, contrary to rumor, does not actually stand for "Quirky Silliness") is a main player. As a commenter at The Guardian site notes, five of the universities ranked tops in Geography do not even have geography departments! And which are the "top five" US universities in philosophy?
1. Harvard University
2. University of California, Berkeley
3. Princeton University
4. Stanford University
5. Yale University
That corresponds decently to the top five American research universities, to be sure, but it has nothing to do with the top five U.S. philosophy departments, at least not in the 21st century. But it should hardly be surprising that if you ask academics teaching in philosophy departments in Japan or Italy to rank the best philosophy departments, many of them will use general university reputation as a proxy. Indeed, every department that is pretty obviously "overrated" in philosophy in this list is at a top research university, and every department obviously underrated is not: so, e.g., Rutgers comes in at a mere 13th, Pittsburgh at 18th (behind Brown and Penn), and North Carolina at 20th.
One may hope that no student thinking about post-graduate work will base any decisions on this nonsense.
Important Dates

September 11th. From QS Intelligence Unit

QS Intelligence Unit is pleased to invite you to attend this afternoon event featuring the global exclusive release of the full QS World University Rankings® 2012-2013 on Tuesday 11th September 2012 at Trinity College Dublin. Be among 300 university delegates present for a focused ninety-minute session and networking reception on the eve of the EAIE conference.


September 12th. From Morse Code

The 2013 edition of U.S. News's Best Colleges rankings will go live on usnews.com on Wednesday, September 12. National Universities, National Liberal Arts Colleges, Regional Universities, and Regional Colleges are included in these rankings.

Our website will have the most complete version of the rankings, tables, and lists. It will have extensive statistical profiles for each school as well as wide-ranging interactivity and a college search to enable students and parents to find the school that best fits their needs. These exclusive rankings will also be published in our Best Colleges 2013 edition guidebook, which will go on sale September 18 on newsstands and at usnews.com.


October 3rd.  Times Higher Education
 
The annual THE rankings, which UK universities and science minister David Willetts said are "fast becoming something of a fixture in the academic calendar", will be published live online at 21.00 BST on 3 October.
A special rankings print supplement will also be published with the 4 October edition of THE, and the results will be available on a free interactive iPhone application.

Monday, August 27, 2012

The Shanghai Rankings 3

Two of the indicators in the Shanghai rankings measure research achievement at the highest level. The highly cited researchers indicator is based on a list of those scientists who have been cited most frequently by other researchers. Since ARWU counts current but not past affiliations of researchers, it is possible for a university to boost its score by recruiting researchers. This indicator might then be seen as signalling a willingness to invest in and retain international talent, and hence as a sign of future excellence.

The top five for this indicator are

1,  Harvard
2.  Stanford
3.  UC Berkeley
4.  MIT
5.  Princeton

A number of US state universities and non-Ivy League schools do well on this indicator: the University of Michigan (6th), the University of Washington (13th), the University of Minnesota (19th), Penn State (23rd) and Rutgers (42nd).

Before this year, the methodology for this indicator was simple. If a highly cited researcher had two affiliations then there was a straightforward fifty-fifty division. Things were complicated when King Abdulaziz University (KAU) in Jeddah signed up scores of researchers on part-time contracts, a story recounted in Science. ARWU has responded deftly by asking researchers with joint affiliations to indicate how their time is divided, and this seems to have deflated KAU's score considerably while having little or no effect on anyone else.
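The shift from an even split to self-reported fractional affiliations can be sketched in a few lines. This is a hypothetical illustration, not ARWU's actual code; the researcher names and time fractions are invented:

```python
# Sketch of fractional affiliation credit for highly cited researchers.
# Each researcher reports what fraction of their time is spent at each
# affiliated institution (fractions sum to 1). All data here is invented.
researchers = [
    {"name": "A", "affiliations": {"Univ X": 0.9, "KAU": 0.1}},
    {"name": "B", "affiliations": {"Univ Y": 1.0}},
    {"name": "C", "affiliations": {"Univ X": 0.5, "KAU": 0.5}},
]

def hcr_scores(researchers):
    """Sum each institution's fractional share of its highly cited researchers."""
    scores = {}
    for r in researchers:
        for inst, share in r["affiliations"].items():
            scores[inst] = scores.get(inst, 0.0) + share
    return scores

print(hcr_scores(researchers))
```

Under the old rule every joint affiliation counted for 0.5 regardless of the actual time split, so a part-time signing was worth as much as a half-time professorship; under self-reported fractions, researcher A above contributes only 0.1 to KAU.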

The top five universities for papers in Nature and Science are:

1.  Harvard
2.  Stanford
3.  MIT
4.  UC Berkeley
5.  Cambridge

High fliers on this indicator include several specialised science and medical institutions such as Imperial College London, Rockefeller University, the Karolinska Institutet and the University of Texas Southwestern Medical Center.
Self Citation

In 2010 Mohamed El Naschie, former editor of the journal Chaos, Solitons and Fractals, embarrassed a lot of people by launching the University of Alexandria into the world's top five universities for research impact in the new Times Higher Education (THE) World University Rankings. He did this partly by diligent self citation and partly by a lot of mutual citation with a few friends and another journal. He was also helped by a ranking indicator that gave the university disproportionate credit for citations in a little-cited field, for citations in a short period of time and for being in a country where there are few citations.

Clearly self citation was only part of the story of Alexandria's brief and undeserved success, but it was not an insignificant one.

It now seems that Thomson Reuters (TR), who collect and process the data for THE, are beginning to get a bit worried about "anomalous citation patterns". According to an article by Paul Jump in THE:

When Thomson Reuters announced at the end of June that a record 26 journals had been consigned to its naughty corner this year for "anomalous citation patterns", defenders of research ethics were quick to raise an eyebrow.

"Anomalous citation patterns" is a euphemism for excessive citation of other articles published in the same journal. It is generally assumed to be a ruse to boost a journal's impact factor, which is a measure of the average number of citations garnered by articles in the journal over the previous two years.

Impact factors are often used, controversially, as a proxy for journal quality and, even more contentiously, for the quality of individual papers published in the journal and even of the people who write them.

When Thomson Reuters discovers that anomalous citation has had a significant effect on a journal's impact factor, it bans the journal for two years from its annual Journal Citation Reports (JCR), which publishes up-to-date impact factors.

"Impact factor is hugely important for academics in choosing where to publish because [it is] often used to measure [their] research productivity," according to Liz Wager, former chair of the Committee on Publication Ethics.

"So a journal with a falsely inflated impact factor will get more submissions, which could lead to the true impact factor rising, so it's a positive spiral."

One trick employed by editors is to require submitting authors to include superfluous references to other papers in the same journal.

A large-scale survey by researchers at the University of Alabama in Huntsville's College of Business Administration published in the 3 February edition of Science found that such demands had been made of one in five authors in various social science and business fields.

That TR are beginning to crack down on self citation is good news. But will they follow their rival QS and stop counting self citations in the citations indicator in their rankings? When I spoke to Simon Pratt of TR at the World Class Universities conference in Shanghai at the end of last year, he seemed adamant that they would go on counting self citations.

Even if TR and THE start excluding self citations, it would probably not be enough. It may soon become necessary to exclude intra-journal citations as well.
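For reference, the impact factor at the centre of all this is just a ratio: citations received this year to what a journal published in the previous two years, divided by the number of citable items it published in those two years. A toy calculation, with invented figures:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year impact factor: average citations per recent citable item."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Invented figures: 450 citations in 2012 to the journal's 2010-11 papers,
# of which there were 150.
print(impact_factor(450, 150))  # → 3.0
```

Excessive self or intra-journal citation inflates the numerator directly, which is why coerced references to "other papers in the same journal" are such an effective trick.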

Friday, August 24, 2012

Universiti Malaya Again

In many countries performance in international university rankings has become as much a symbol of national accomplishment as winning Olympic medals or qualifying for the World Cup. When a local university rises in the rankings it is cause for congratulations for everyone, especially for administrators. When they fall it is an occasion for soul-searching and a little bit of schadenfreude for opposition groups.

Malaysia has been particularly prone to this syndrome. There was a magical moment in 2004 when the first THES-QS ranking put Universiti Malaya (UM), the country's first university, in the world's top 100. Then it went crashing down. Since then it has moved erratically up and down around the 200th position.

Lim Kit Siang, leader emeritus of the Malaysian opposition has this to say in his blog:

At the University of Malaya’s centennial celebrations in June 2005, the then Deputy Prime Minister Datuk Seri Najib Razak threw the challenge to University of Malaya to raise its 89th position among the world’s top 100 universities in THES-QS (Times Higher Education Supplement-Quacquarelli Symonds) ranking in 2004 to 50 by the year 2020.

Instead of accepting Najib’s challenge with incremental improvement of its THES ranking, the premier university went into a free fall when in 2005 and 2006 it fell to 169th and 192nd ranking respectively, and in the following two years in 2007 and 2008, fell out of the 200 Top Universities ranking altogether.

In 2009, University of Malaya made a comeback to the 200 Top Universities Ranking when it was placed No. 180, but in 2010 it again fell out of the 200 Top Universities list when it dropped to 207th placing.

For the 2011 QS Top 200 Universities Ranking, University of Malaya returned to the Top 200 Universities Ranking, being placed at No. 167.

In the THES-QS World University Rankings 2009, University of Malaya leapfrogged 50 places from No. 230 placing in 2008 to No. 180 in 2009; while in the 2011 QS World University Ranking, University of Malaya leapt 40 places from No. 207 in 2010 to No. 167 in 2011.

The QS World University Rankings 2012 will be released in 20 days’ time. Can University of Malaya make another leapfrog as in 2009 and 2011 to seriously restore her place as one of the world’s top 100 universities before 2015?


The government has announced that in addition to Najib’s challenge to University of Malaya in 2005 to be among the world’s Top 50 universities by 2020, the National Higher Education Strategic Plan called for at least three Malaysian universities to be ranked among the world’s top 100 universities.

Recently, the U.S. News World’s Best Universities Rankings included five local universities in its Top 100 Asian Universities, but this is not really something to celebrate.

The U.S. News World’s Best Universities Ranking is actually based on the QS 2012 Top 300 Asian University Rankings released on May 30 this year, which commented that overall, although University of Malaya improved its ranking as compared to 2011 ranking, the majority of Malaysian universities dropped in their rankings this year as compared to 2011.

There is a lot of detail missing here. UM's fluctuating scores had nothing to do with failed or successful policies but resulted from errors, corrections of errors or "clarifications of data", changes in methodology, and variations in the collection and reporting of data.

UM was only in the top 100 of the THES-QS rankings because of a mistake by QS, the data collectors, who thought that ethnic minority students and faculty were actually foreigners and therefore handed out a massive and undeserved boost on the international faculty and international students indicators.

Its fall in 2005 was the result of QS's belated realisation of its mistake.

The continued decline in 2007 may have been because QS changed its procedures to prevent respondents to the academic survey from voting for their own institutions, or because of the introduction of Z scores, which substantially boosted the citations-per-faculty scores of mediocre universities like Peking but only slightly those of laggards like UM.
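The mechanics of the Z-score effect are worth a quick sketch. Raw citations-per-faculty figures are heavily skewed by a few runaway leaders; standardising them compresses the leader's advantage and pulls the rest of the field toward the middle, so a published score can jump sharply with no change in the underlying data. The transformation below (50 + 10z, capped at 100) illustrates the general technique only, not QS's published formula, and all figures are invented:

```python
import statistics

# Invented citations-per-faculty figures: one runaway leader, a mid-table
# university and a laggard.
raw = {"Leader": 100.0, "MidTable": 10.0, "Laggard": 1.0}
vals = list(raw.values())
mean, sd = statistics.mean(vals), statistics.pstdev(vals)

# Straight scaling against the leader: everyone else looks tiny.
linear = {k: 100 * v / max(vals) for k, v in raw.items()}

# Z-score transformation (50 + 10*z, capped at 100): the pack is pulled
# up toward the mean and the leader's advantage is compressed.
zscore = {k: min(100.0, 50 + 10 * (v - mean) / sd) for k, v in raw.items()}

print(linear, zscore)
```

Under linear scaling the mid-table university scores 10; after standardisation it scores in the low 40s, a dramatic rise produced purely by the change of method.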

The rise in 2009 from 230th to 180th position was largely the result of a big improvement in the score for faculty-student ratio, comprising both a reported fall in the number of students and a reported rise in the number of faculty. It is unlikely that the university administration had thrown 6,000 students into the Klang River: more probably somebody told somebody that diploma and certificate students need not be included in the data reported to QS.

Whether UM rises again in the QS rankings is less interesting than its performance in the Shanghai Academic Ranking of World Universities. In 2011 it moved into the top 500 with scores of 3.4 for highly cited researchers, 34.6 for ISI-indexed publications (compared with 100 for front-runner Harvard) and 16 for per capita productivity (where the top scorer was Caltech).

In 2012 UM had the same score for highly cited researchers and registered a score of 38.6 for publications and a slight improvement to 16.7 for productivity. This meant that UM was now ranked in 439th place and that reaching the 300-400 band in a few years' time would not be impossible.

UM has managed to make it into the Shanghai rankings by actively encouraging research among its faculty and by recruiting international researchers, policies that are unpopular and in marked contrast to those of other Malaysian universities.

What will happen in the QS rankings when they come out next month? Something to watch out for is the employer survey, which has a weighting of ten per cent. In 2011 something odd was going on. Apparently there had been an enthusiastic response to the rankings in Latin America, especially to the employer survey, so that QS resorted to capping the scores for many universities. They reported that:


"QS received a dramatic level of response from Latin America in 2011, these counts and all subsequent analysis have been adjusted by applying a weighting to responses from countries with a distinctly disproportionate level of response."
It seems that one effect of the inflated number of responses was to raise the mean score, so that universities with below-average scores saw a dramatic fall in their adjusted scores. If there is a further increase in responses this year, universities like UM may see a further reduction on this indicator.
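One plausible form of such an adjustment is to down-weight responses from any country supplying a disproportionate share of the pool. QS has not published its exact method, so the sketch below, with invented counts and shares, is only an illustration of the general technique:

```python
# Invented survey response counts and expected country shares.
responses = {"Brazil": 5000, "Germany": 500, "Malaysia": 400}
expected_share = {"Brazil": 0.10, "Germany": 0.10, "Malaysia": 0.08}
total = sum(responses.values())

def weighted_responses(responses, expected_share, total):
    """Scale each country's responses down to its expected share of the pool."""
    out = {}
    for country, n in responses.items():
        actual_share = n / total
        # Weight < 1 only when a country supplies more responses than expected.
        weight = min(1.0, expected_share[country] / actual_share)
        out[country] = n * weight
    return out

print(weighted_responses(responses, expected_share, total))
```

With these numbers, the over-represented country's 5,000 responses are scaled down to its expected 10 per cent of the pool (590), while the others are left untouched. Any university whose supporters supplied the flood of responses would see its adjusted score fall accordingly.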

Sunday, August 19, 2012

The Shanghai Rankings 2

The Shanghai Rankings get more interesting when we look at the individual indicators. Here are the 2012 top five for Alumni who have won Nobel prizes and Fields medals.

1. Harvard
2. Cambridge
3. MIT
4. Berkeley
5. Columbia

In the top fifty for this indicator there are the Ecole Normale Superieure, Moscow State University, the Technical University of Munich, Goettingen, Strasbourg and the City College of New York (CUNY).

Essentially, this indicator allows universities that have seen better decades to gain a few points from an academic excellence that has long been in decline. City College of New York is an especially obvious victim of politics and bureaucracy.

The top five in the Awards indicator, faculty who have won Nobel prizes and Fields medals, are:

1.  Harvard
2.  Cambridge
3.  Princeton
4.  Chicago
5.  MIT

The top fifty includes the Universities of Buenos Aires, Heidelberg, Paris Dauphine, Bonn, Munich and Freiburg. Again, this indicator may be a pale reflection of past glory rather than a sign of future accomplishments.





Saturday, August 18, 2012

The Shanghai Rankings 1

The 2012 edition of Shanghai Jiao Tong University's Academic Ranking of World Universities has been published. Here are the top ten, which are the same as last year's top ten.

1.  Harvard
2.  Stanford
3.  MIT
4.  UC Berkeley
5.  Cambridge
6.  Caltech
7.  Princeton
8.  Columbia
9.  Chicago
10. Oxford

It is necessary to go down to the 19th and 20th places to find any changes. Tokyo is now 19th and University College London 20th, reversing last year's order and restoring that of 2003.

Saturday, August 11, 2012


What’s up at Wits?
 
The University of the Witwatersrand is in turmoil. Faculty are going on strike for higher salaries, claiming that there has been a drastic decline in quality in recent years. Evidence for this decline is the university's fall by more than a hundred places in the QS world rankings. The administration has argued that these rankings are not valid.

The University of the Witwatersrand is one of SA's largest and oldest academic institutions. According to its strategic planning division, at the end of last year there were about 1,300 academic staff, 2,000 administrative staff and nearly 30,000 students, 9,000 of them postgraduates.

There is no doubt that Wits has pockets of excellence, and many talented academics who are players on the global stage. However, this excellence is being overwhelmed and dragged down by inefficient bureaucracy in its administrative processes.

There are more administrative staff than academic staff, and as one academic said: "It is impossible to get anything done."

David Dickinson, president of the Academic Staff Association of Wits University, which has more than 700 members and is threatening to strike, said: "Between 2007 and last year, we fell more than 100 places in the QS World University Rankings. A significant problem is that the most important part of the university has been forgotten: its employees."

The university is ranked second in the country, after the University of Cape Town, but scraped into the top 400 in the world at 399th in the QS World University Rankings last year.

The faculty are correct about the QS rankings. Between 2007 and 2011 the university fell from 283rd place to 399th. The decline was especially apparent in the employer review, from 191st to below 301st, and in international faculty, from 69th to 176th.

But there is a problem. From 2007 to 2011 Wits steadily improved on some indicators in the Shanghai rankings: from 10.9 to 11.2 for publications in Nature and Science, from 26.2 to 29.9 for publications and from 14.8 to 16.3 for per capita productivity. The score for alumni winning Nobel prizes declined from 23.5 to 21.2, but only because the university's unchanged count of two laureate alumni was being measured against a rising score for front-runner Harvard.

So which ranking is correct? Probably both are, because they refer to two different periods. The alumni who contributed to the Alumni indicator in ARWU graduated in 1982 and 2002. Publications and papers in Nature and Science could reflect the fruits of research projects that began up to a decade earlier.

The QS rankings (formerly the THE-QS rankings) are heavily weighted towards two surveys of debatable validity. The declining score for Wits in the employer review, from 59 points (well above the mean of 50) to 11, is remarkable and almost certainly has nothing to do with the university; it is the result of a flooding of the survey by supporters of other institutions, leading to a massive increase in the average number of responses.

The decline in other scores, such as international faculty and faculty-student ratio, could be the result of short-term policy changes. However, if it is correct that research and teaching are being strangled by bureaucracy and mistaken policies, then sooner or later we should start seeing indications in the Shanghai rankings.


Sunday, August 05, 2012

Philippine Universities and the QS English Rankings

The QS subject rankings have produced quite a few surprises. Among them is the high position of several Philippine universities in the 2012 English Literature and Language ranking. In the top 100 we find Ateneo de Manila University, the University of the Philippines and De La Salle University. Ateneo de Manila in 24th place is ahead of Birmingham, Melbourne, Lancaster and University College Dublin.

How did the Philippine universities do so well? First, the subject rankings are based on different combinations of criteria. The rankings for English Literature and Language have a 90 per cent weighting for the academic survey conducted in 2011 and 10 per cent for the employer survey. Unlike those for the natural sciences, they give no weight to citations. Essentially, then, the English ranking is a measure of reputation in that subject, and these universities were picked by a large number of survey respondents.

One feature of the QS academic survey is that respondents can choose to nominate universities globally or by region. Ateneo de Manila's performing better than Birmingham or Melbourne in this subject most probably means that it was being compared with others in Asia while the latter were assessed internationally.

Also, the category English Literature and Language is an extremely diverse one, covering scholars toiling away at a critical edition of Chaucer, post-modern cultural theorists and researchers in language education. I suspect that the high scores for Ateneo de Manila and the other universities came from dozens of postgraduate TESOL students in the US and Australia. It would be a good idea for QS to have separate rankings for English literature and English language education.

As usual, university administrators seem to be somewhat confused about the rankings. The Dean of the Faculty of Arts and Letters at the University of Santo Tomas is reported as saying:

The University, he pointed out, did not get any request for data from QS, the London consultancy that comes out with annual university rankings:
“With due respect to the QS, I think we should also know how the data is being collected, because as far as we are concerned, we are the academic unit taking care of arts and humanities and philosophy and literature,” he told the Varsitarian.
The QS survey may have been perception-based, and data gathering could have relied on what’s available on the Internet, Vasco added. “The question is, how do they source the data? Do they simply get it from the general information known about the University? Do they simply get it from the website? What if the website is not updated? What information will you get there?” he asked.
Vasco also said it would be difficult to compete in other clusters of the Arts and Humanities category of the QS subject rankings, namely Philosophy, Modern Languages, Geography, History, and Linguistics.
“[We] do not offer the same breadth of programs being surveyed under the arts and humanities cluster in the QS survey,” Vasco said.
The growing number of participants in the QS survey has contributed to the general decline of Philippine schools in various QS rankings, the Artlets dean noted. “More and more international universities from highly industrialized countries are participating, like universities from Europe, North America, and even Asia-Pacific,” he said. “Chances are, Philippine schools will slide down to lower rankings.”

For once, QS is being unfairly treated. The methodology of the subject rankings is explained quite clearly here.



Friday, August 03, 2012


QS Stars

University World News (UWN) has published an article by David Jobbins about QS Stars, which are awarded to universities that pay (most of them, anyway) for an audit and a three-year licence to use the stars, and which are shown alongside the listings in the QS World University Rankings. Participation is not spread evenly around the world: according to a QS brochure, it is mainly mediocre universities, or worse, that have signed up, and nearly half of the universities that have opted for the stars are from Indonesia.

Jobbins refers to a report in Private Eye which in turn refers to the Irish Examiner. He writes:

The stars appear seamlessly alongside the listing for each university on the World University Rankings, despite protestations from QS that the two are totally separate operations.

The UK magazine Private Eye reported in its current issue that two Irish universities – the University of Limerick and University College Cork, UCC – had paid “tens of thousands” of euro for their stars.

The magazine recorded that UCC had told the Irish Examiner that the €22,000 (US$26,600) cost of obtaining the stars was worthwhile, as it could be recouped through additional international student recruitment.

The total cost for the audit and a three-year licence is US$30,400, according to the scheme prospectus.


 The Irish Examiner article by Neil Murray is quite revealing about the motivation for signing up for an audit:

UCC paid almost €22,000 for its evaluation, which includes a €7,035 audit fee and three annual licence fees of €4,893. It was awarded five-star status, which it can use for marketing purposes for the next three years.

The audit involved a visit to the college by QS researchers but is mostly based on analysis of data provided by UCC on eight criteria. The university’s five-star rating is largely down to top marks for research, infrastructure, internationalisation, innovation, and life science, but it got just three stars for teaching and engagement.
About 3,000 international students from more than 100 countries earn UCC approximately €19 million a year.

UCC vice-president for external affairs Trevor Holmes said there are plans to raise the proportion of international students from 13% — one of the highest of any Irish college — to 20%.

"Should UCC’s participation in QS Stars result in attracting a single additional, full-time international student to study at UCC then the costs of participation are covered," he said.

"In recent times, unlike many other Irish universities, UCC has not been in a position to spend significant sums on marketing and advertising domestically or internationally. QS Stars represents a very cost-effective approach of increasing our profile in international media and online."
So now we know how much a single international student adds to the revenue of an Irish university.
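The break-even arithmetic is easy to check from the figures quoted in the article:

```python
# Checking the break-even claim with the figures quoted above.
annual_revenue = 19_000_000   # euro, from about 3,000 international students
students = 3_000
audit_fee = 7_035             # one-off audit fee
licence_fee = 4_893           # per year, over a three-year licence

per_student_per_year = annual_revenue / students   # average revenue per student
total_cost = audit_fee + 3 * licence_fee           # total cost of QS Stars

print(per_student_per_year, total_cost)
```

On these averages a student is worth roughly €6,333 a year, so one extra student staying the full three years brings in about €19,000, slightly under the €21,714 total cost. The break-even claim therefore holds only if the student stays longer or pays above-average fees.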

So far, there is nothing really new here. The QS Stars system has been well publicised, and it probably was a factor in Times Higher Education's decision to drop QS as its data-collecting partner and replace it with Thomson Reuters.

What is interesting about the UWN article is that a number of British and American universities have been given the stars without paying anything. These include Oxford and Cambridge and 12 leading American institutions that are described by QS as "independently audited based on publicly available information". It would be interesting to know whether the universities gave permission to QS to award them stars in the rankings. Also, why are there differences between the latest rankings and the QS brochure? Oxford does not have any stars in last year's rankings but is on the list in the brochure. Boston University has stars but is not on the list. It may be just a matter of updating.

It would probably be a good idea for QS to remove the stars from the rankings and keep them in the university profiles.




Monday, July 30, 2012

New International Ranking

A new ranking has appeared, the CWUR World University Rankings, published by the Center for World University Ranking in Jeddah, Saudi Arabia. The top ten are:

1.  Harvard
2.  MIT
3.  Stanford
4.  Cambridge
5.  Caltech
6.  Princeton
7.  Oxford
8.  Yale
9.  Columbia
10. UC Berkeley

The criteria are:
  • Quality of Faculty. This is based on full time faculty members who have won a variety of awards, including the Nobel Prize and the Fields Medal, as in the Shanghai rankings, and     others such as the Abel, Templeton and World Food Prizes. Weighting of 4.
  • Quality of research: Publications in top journals. For science and the social sciences top journals are those included in the ISI journal citation reports weighted according to the Article Impact Score (AIS). For the humanities, the list of journals is compiled from the INT1 set of prestigious international journals published by the European reference Index for the Humanities. Weighting of 1.
  • Quality of Research: Highly influential research. This is based on the number of publications in journals multiplied by the journals' AIS. Weighting of 1.
  • Quality of Research: Citations. This includes citations from journals in science, the social sciences and the arts and humanities. Weighting of 1.
  • Quality of Research: Patents. Weighting of 1.
  • Alumni who have won awards -- listed under Quality of Faculty -- relative to the institution's size, which is determined by current enrollment. Weighting of 4.
  • Number of alumni who are heads of companies in the Forbes Global 2000 list. Weighting of 4.
Basically, this is an expanded and elaborated version of the Shanghai ARWU. Like the Shanghai rankings, the CWUR rankings claim to assess quality of teaching by the accomplishments of alumni although they use many more international prizes. Similarly, faculty quality is assessed by the number of staff who have won these prizes. The quality of research is measured by four factors rather than three and these may be more resistant to manipulation than the highly cited researchers indicator in the Shanghai rankings. In addition, the new rankings include publications and citations in the arts and humanities.
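As a rough illustration of how the weightings listed above might combine into a single score (a sketch only: the indicator names and scores here are hypothetical, and CWUR itself publishes only rank orders for its indicators, not scores):

```python
# Illustrative only: a weighted composite from the seven CWUR indicators.
# The weights (4, 1, 1, 1, 1, 4, 4) are those listed above; the indicator
# scores in the example are invented and assumed to lie on a 0-100 scale.
WEIGHTS = {
    "faculty_awards": 4,            # Quality of Faculty
    "top_journal_publications": 1,  # publications in top journals
    "influential_research": 1,      # publications weighted by journal AIS
    "citations": 1,
    "patents": 1,
    "alumni_awards": 4,
    "alumni_ceos": 4,               # Forbes Global 2000 heads of companies
}

def composite_score(indicator_scores: dict) -> float:
    """Weighted average of the indicator scores."""
    total_weight = sum(WEIGHTS.values())
    return sum(WEIGHTS[k] * indicator_scores[k] for k in WEIGHTS) / total_weight

example = {
    "faculty_awards": 90, "top_journal_publications": 80,
    "influential_research": 85, "citations": 88,
    "patents": 70, "alumni_awards": 95, "alumni_ceos": 92,
}
print(round(composite_score(example), 2))  # 89.44
```

Note that the faculty and alumni indicators together account for 12 of the 16 weighting points, so the four research-output measures count for comparatively little.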
 
The top 100 includes several medical schools and graduate-only and specialised institutions such as Rockefeller University and UC San Francisco. There are five Japanese universities and one Korean, but none from Hong Kong or mainland China. An indication of the ranking's objectivity is that Israeli schools do well.


It is disappointing that the new rankings include only 100 institutions. Also, they do not give scores but only rank order for the various indicators, something that will make it difficult to track performance if further editions appear.

If the CWUR rankings had appeared in 2003 at the same time as the Shanghai rankings they would have been judged to be more comprehensive and valid.  But, after nine years Shanghai is the market leader for research-based rankings and catching up will be a difficult task.



Friday, July 27, 2012

If you want to be a millionaire, go to...

Skandia Millionaire Monitor has conducted a survey of millionaires in several countries. British millionaires were asked which university they had attended. The top five were:

1.  London
2.  Oxford
3.  Cambridge
4.  Leeds
5.  Manchester

Something interesting is that at every university except St. Andrews, including Oxford and Cambridge, state-school-educated millionaires outnumbered those with a private education.

Thursday, July 12, 2012

UI GreenMetric World University Rankings

Universitas Indonesia has been asking universities to take part in a ranking based on "university sustainability."  According to UI:

"The world faces unprecedented civilizational challenges such as population trends, global warming, and overexploitation of natural resources, oil-dependent energy, water and food shortages and sustainability. We realize that higher education has a crucial role to play in addressing these challenges. UI Green Metric raises awareness as it helps assess and compare efforts at education for sustainable development, sustainability research, campus greening, and social outreach."

The ranking has six criteria: Setting and Infrastructure, Energy and Climate Change, Waste, Water, Transportation and Education.

The ranking is based entirely on data submitted by universities, and that in itself drastically limits its validity. Also, should the promotion of sustainability, however worthy a cause, be a major concern of universities? Is there nobody else taking an interest in such things?



Monday, July 09, 2012

The QS Subject Rankings

QS has produced rankings of universities by subject. These seem to be quite popular, probably because the methodology and weighting vary from one subject to another, so that almost everybody can score well in something.

Outside the top forty or fifty in each subject, however, they should not be taken too seriously. They depend on only two or three criteria in varying combinations: the academic survey, the employer survey and citations per paper.

So, citations per paper contribute 50% of the weighting for biology and earth sciences but nothing for English and 10% for philosophy and sociology. A high score for biology could be the result of a large number of citations, indicating -- perhaps -- a substantial research impact. A high score for English (language and literature) is largely due to the survey of academic opinion, a rather dubious instrument.

Anyway, MIT is first for these subjects:

Linguistics
Computer Science
Chemical Engineering
Civil Engineering
Electrical Engineering
Mechanical Engineering
Economics and Econometrics
Physics and Astronomy 
Mathematics
Chemistry
Materials Science

Harvard for these:

Modern Languages
Medicine
Psychology
Pharmacy and Pharmacology
Earth and Marine Sciences
Politics and International Studies
Law
Sociology
Education

Oxford for these:

Philosophy
Geography
History

Stanford for these:

Environmental Sciences
Statistics and Operational Research
Communication and Media Studies

and Cambridge for:

English Literature and Language.


As we get to the lower reaches of these rankings, the number of responses to the surveys or the number of citations gets smaller, so that trivial changes in the number of citations can lead to large movements up or down the tables.

El Naschie vs. Nature

The journal Nature has been totally vindicated. Mrs Justice Victoria Sharp has dismissed El Naschie's claims. I would be very surprised if there has ever been a more unambiguous judgement.
To review the case, at the end of 2008 Nature published an article, Self-publishing editor set to retire, which described how Mohamed El Naschie, the editor of the applied mathematics/theoretical physics journal, Chaos, Solitons and Fractals, had published an unusually large number of his own papers, which were of poor quality, without proper peer review. Furthermore, the journal had acquired a falsely high impact factor through self-citation and citation by a limited number of friends and disciples.

El Naschie sued Nature and author Quirin Schiermeier for libel. Now, Mrs Justice Sharp has found for the defendants.

The case is of interest to this blog since it was the citation of El Naschie's papers by himself and a few associates that contributed to Alexandria University's reaching fourth place for research impact in the 2010 Times Higher Education (THE) World University Rankings, powered by Thomson Reuters (TR). El Naschie did not, of course, do it all by himself. TR's methodology inflated his citations because they were recent, because they were assigned to a low-citing field, applied maths, and because he was affiliated to a university in a low-citing region. Since then TR has tweaked its citation indicator to avoid a repetition of such a strange result.

This is a victory for academic freedom although one wonders what would have happened if El Naschie had chosen a critic with a less substantial bank account.

Here are some comments. First place goes to El Naschie Watch, which has been following the affairs of El Naschie for some time.

El Naschie Watch

Nature

BBC News

New Scientist

Guardian

Times Higher Education



Friday, July 06, 2012

Power and Responsibility: The Growing Influence of Global Rankings


My article can be accessed at University World News. Comments can be submitted here.

Sunday, June 24, 2012

Productive Universities

QS has been analysing university research output using Scopus data. The world's most productive university, measured by the number of papers, is Toronto. The top ten contains Harvard and five more US institutions, along with University College London, Sao Paulo and Tokyo.

Harvard is first for total citations and Rockefeller, a specialist medical school, for citations per paper.

It seems that the presence or absence of a medical school makes a lot of difference to performance measures based on total publications or citations. In general, there are substantial differences between disciplines, with medicine and the humanities at opposite ends of the spectrum. The performance of schools like Toronto may to some extent reflect their balance of disciplines.

Times Higher and Thomson Reuters would say that the answer to this problem lies in normalisation. But that raises another question, namely whether all disciplines can be considered equal.
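A minimal sketch of what field normalisation means in practice; the baseline figures below are invented for illustration and are not TR's actual values:

```python
# Field normalisation in miniature: a paper's citations are divided by the
# average citation count for its field, so that a well-cited humanities
# paper can register as strongly as a well-cited medical one.
# The baselines here are assumed, purely for illustration.
FIELD_BASELINES = {"medicine": 20.0, "humanities": 2.0}

def normalised_impact(papers):
    """Mean of (citations / field baseline) over (field, citations) pairs."""
    scores = [cites / FIELD_BASELINES[field] for field, cites in papers]
    return sum(scores) / len(scores)

# Ten citations mean very different things in the two fields:
print(normalised_impact([("medicine", 10)]))    # 0.5: below the field average
print(normalised_impact([("humanities", 10)]))  # 5.0: well above it
```

The technique removes the raw advantage of medical schools, but only by building in the assumption that a field-average paper in any discipline is worth the same as a field-average paper in any other.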

Friday, June 22, 2012

Boring is Good

QS have produced the second instalment of their "Latin University Rankings" (i.e. Latin American University rankings). This time there have been few changes. The top seven are the same as last year. According to QS, "the familiar look of the top ten in 2012 QS University Rankings: Latin America is evidence that last year’s inaugural exercise provided a fair and accurate overview of the current hierarchy of the region’s universities".

True, but does that mean that other rankings were invalidated by noticeable instability?

Here are the top ten:

1.  Universidade de Sao Paulo, Brazil
2.  Pontificia Universidad Catolica de Chile
3.  Universidade Estadual de Campinas, Brazil
4.  Universidad de Chile
5.  Universidad Nacional Autonoma de Mexico
6.  Universidad de Los Andes, Colombia
7. Tecnologico de Monterrey, Mexico
8. Universidade Federal do Rio de Janeiro, Brazil
9. Universidad de Concepcion, Chile 
10. Universidad de Santiago de Chile




Wednesday, June 20, 2012

The Complete University Guide

David Jobbins has drawn my attention to the online British Complete University Guide. This includes the 2013 League Table with the top five being:

1.  Cambridge
2.  LSE
3.  Oxford
4.  Imperial College
5.  Durham

At the bottom we have Southampton Solent, West of Scotland, London Metropolitan, East London and Bolton.

The criteria are entry standards, student satisfaction, research assessment and graduate prospects.

Sunday, June 17, 2012

A little bit of sex but not too much, we're British university students

Student Beans has published a British university sex league, which consists of the results of a survey of the number of sex partners since starting university. At the top is Bangor University with an average of 8.31, followed by Heriot-Watt and Plymouth. This is probably a result of savage cuts which have curtailed library hours and left students with nothing else to do.

At the bottom are Roehampton (1.83), Chester (1.71) and Exeter (1.15). There seems to be no obvious explanation for such a broad variation. Comparing scores with those in the QS rankings produced only a trivial and insignificant correlation.
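For what it's worth, a comparison of that kind can be sketched as below; the figures here are made up for illustration rather than being the actual Student Beans or QS numbers:

```python
# Pearson's r between average partner counts and ranking scores,
# computed by hand. All data below are hypothetical.
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

partners = [8.31, 6.50, 5.90, 1.83, 1.71, 1.15]    # hypothetical averages
qs_scores = [40.0, 55.0, 48.0, 45.0, 52.0, 70.0]   # hypothetical QS scores
print(round(pearson_r(partners, qs_scores), 2))
```

A value near zero, as reported above, means knowing a university's ranking score tells you essentially nothing about its students' love lives.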

The methodology does not look very sound: there is no sign of any proper sampling or precautions against multiple responses.

The most sexually active students are in economics, social work, marketing and leisure. No surprises there. The least are in education (that's a relief), earth sciences, theology and, valiantly trying to slow the pace of global warming, environmental science.

Tuesday, June 12, 2012

The Uses of Rankings

India's University Grants Commission has laid down new regulations "to ensure academic collaboration between Indian and foreign educational institutes follows the highest standards".

The foreign institutions allowed to collaborate with Indian universities and colleges "must figure in list of top 500 global educational institutes, as ranked by the Times Higher Education Rankings or the Shanghai Rankings".

This sounds a little odd. The Times Higher Education World University Rankings only have 400 universities listed on their iPhone app. Perhaps they will provide the Indian authorities with the remaining 100.

Another problem is that the Shanghai and THE rankings, especially the latter, are not totally stable. So what happens if a university enters the top 500 before a contract is signed and then slips out the year after?


Thursday, May 31, 2012

The THE New University Rankings

Times Higher Education have produced their ranking of 100 universities founded in the last fifty years. Here are the top ten:

1. Pohang University of Science and Technology
2. École Polytechnique Fédérale de Lausanne
3. Hong Kong University of Science and Technology
4. University of California, Irvine
5. Korea Advanced Institute of Science and Technology
6. Université Pierre et Marie Curie
7. University of California, Santa Cruz
8. University of York
9. Lancaster University
10. University of East Anglia

The list looks rather different from the QS new university ranking published two days ago. That is unsurprising, since the QS table is heavily weighted towards two reputation surveys while the THE rankings are influenced by various measures of income and by normalised citations.

Wednesday, May 30, 2012

QS Asian University Rankings

QS have just published their 2012 Asian University Rankings. I will comment in a bit more detail later.

The top ten are:

1.  The Hong Kong University of Science and Technology
2.  National University of Singapore
3.  University of Hong Kong
4.  Seoul National University
5.  Chinese University of Hong Kong
6.  Peking University
7.  Korea Advanced Institute of Science and Technology
8.  University of Tokyo
9.  Pohang University of Science and Technology
10. Kyoto University


Tuesday, May 29, 2012

The QS Under 50 Top 50

Early this month, Times Higher Education announced that they would publish a ranking of the top 100 universities less than 50 years old. The date for publication was May 31.

Now QS have just announced their ranking of new universities. The top ten are

1.  Chinese University of Hong Kong
2.  Hong Kong University of Science and Technology
3.  Warwick
4.  Nanyang Technological University, Singapore
5.  Korea Advanced Institute of Science and Technology
6.  University of York, UK
7.  Pohang University of Science and Technology
8.  Maastricht University
9.  City University of Hong Kong
10. University of California Irvine

East Asia, especially Hong Kong and Korea, makes a strong showing, although there are no mainland Chinese universities in the top 50.

No doubt there will be quiet smirks around the QS offices. And no doubt THE will say something about originality on Thursday.