Monday, December 19, 2011

Leiden Ranking: Many Ways to Rate Research

My article on the Leiden Rankings in University World News can be found here.

It looks as though a two-tier international university ranking system is emerging.

At the top we have the 'big three', Shanghai's Academic Ranking of World Universities, the QS World University Rankings and, since 2010, the Times Higher Education World University Rankings.

These receive massive attention from the media, are avidly followed by academics, students and other stakeholders and are often quoted in promotional literature. Graduation from a university included in these has even been proposed as a requirement for immigration.

Then we have the rankings by SCImago and Webometrics, both from Spain, the Performance Ranking of Scientific Papers for World Universities produced by the Higher Education Evaluation and Accreditation Council of Taiwan, and the Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University.

These rankings get less publicity but are technically very competent and in some ways more reliable than the better-known rankings.

Thursday, December 08, 2011

Update 6 on El Naschie vs Nature

There have been no reports for several days and the trial is now over. There will be a judgement in January.
What to do about the research bust

Mark Bauerlein has an article in the Chronicle of Higher Education on the disparity between the extraordinary effort and intelligence poured into scholarly writing in the humanities and the meager attention such writing receives.

"I devised a study of literary research in four English departments at public universities—the University of Georgia, the University at Buffalo, the University of Vermont, and the University of Illinois at Urbana-Champaign—collecting data on salaries, books and articles published, and the reception of those works. The findings:
  • Those universities pay regular English faculty, on average, around $25,000 a year to produce research. According to the faculty handbooks, although universities don't like to set explicit proportions, research counts as at least one-third of professors' duties, and we may calculate one-third of their salaries as research pay. This figure does not include sabbaticals, travel funds, and internal grants, not to mention benefits, making the one-third formula a conservative estimate.
  • Professors in those departments respond diligently, producing ample numbers of books and articles in recent years. At Georgia, from 2004 to 2009, current faculty members produced 22 authored books, 15 edited books, and 200 research essays. The award of tenure didn't produce any drop-off in publication, either. Senior professors continue their inquiries, making their departments consistently relevant and industrious research centers.
  • Finally, I calculated the impact of those publications by using Google Scholar and my own review of books published in specific areas to count citations. Here the impressive investment and productivity appear in sobering context. Of 13 research articles published by current SUNY-Buffalo professors in 2004, 11 of them received zero to two citations, one had five, one 12. Of 23 articles by Georgia professors in 2004, 16 received zero to two citations, four of them three to six, one eight, one 11, and one 16. "
Bauerlein suggests that these limited citation counts are telling us something, that talented scholars might find better things to do and that society might direct resources elsewhere.

The QS World University Rankings would apparently agree. Their citations indicator simply counts the total number of citations and divides it by the total number of faculty. This is a very crude measure, especially since it counts the current number of faculty but then counts the citations to articles written over a five-year period. Any university seeking a boost in the QS rankings could simply axe a few English, history and philosophy specialists and replace them with oncologists and engineers. True, the world would lose studies about Emily Dickinson's Reluctant Ecology of Place, cited once according to Google Scholar, or Negotiations of Homoerotic Tradition in Paradise Regained, but if this were accompanied by even a small advance in cancer treatment, who would really care? There would be an even better effect on the Shanghai rankings, which do not count publications or citations in the humanities but still include the faculty in their productivity indicator.
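The arithmetic behind this is simple enough to sketch. The numbers below are invented, purely to show how swapping lightly cited humanities staff for heavily cited scientists moves a citations-per-faculty score when citations are summed over several years but faculty are counted today:

```python
# Toy model of a QS-style citations-per-faculty indicator (invented numbers).
# Citations are summed over a five-year window; faculty are counted now.

def citations_per_faculty(total_citations_5yr, current_faculty):
    """Total citations over five years divided by current faculty head count."""
    return total_citations_5yr / current_faculty

# A hypothetical university: 1,000 faculty, 40,000 citations over five years.
before = citations_per_faculty(40_000, 1_000)

# Replace 50 humanities specialists (say, 1 citation each over five years)
# with 50 oncologists and engineers (say, 100 citations each over five years).
after = citations_per_faculty(40_000 - 50 * 1 + 50 * 100, 1_000)

print(before, after)  # 40.0 44.95
```

A five per cent swap of staff produces a twelve per cent rise in the score, without a single new citation for anyone already employed.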

But there are those who would argue that while disciplines differ in their citation practices, they must be regarded as being on the same level in all other respects. Thomson Reuters, who collect the data for the Times Higher Education rankings, now normalise their data so that citations in a specific discipline in a specific country in a specific year are benchmarked against the average for that discipline in that country in that year. That would mean that the article by the Buffalo professors with five citations might look quite good.
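A toy sketch of how such benchmarking works. The averages below are invented and this is not Thomson Reuters' actual procedure, but it captures the idea that a normalised score of 1.0 means "average for this discipline, country and year":

```python
# Invented benchmark averages for (discipline, country, year) cells.
benchmarks = {
    ("English literature", "US", 2004): 1.5,   # citations per paper, on average
    ("Oncology", "US", 2004): 25.0,
}

def normalised_impact(citations, discipline, country, year):
    """Citations divided by the average for the paper's discipline/country/year."""
    return citations / benchmarks[(discipline, country, year)]

# Five citations is dismal for a 2004 US oncology paper...
print(normalised_impact(5, "Oncology", "US", 2004))            # 0.2
# ...but looks impressive for a 2004 US English-literature article.
print(normalised_impact(5, "English literature", "US", 2004))  # about 3.3
```

The same five citations land far above average in one cell and far below it in another, which is exactly why normalisation flatters low-citation fields and places.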

I have a suggestion for those professors of English and other disciplines which hardly anyone seems to read anymore. Go to some Central Asian or East African republic where the papers in your field get only a few citations: the next article you write with its miserable handful of citations will be well above average for that country and your new university will suddenly perform well in the Times Higher rankings. Just make sure that your employer produces two hundred papers a year altogether.

Friday, December 02, 2011

European Universities and Rankings

Research Trends, the newsletter from Scopus, reports on a conference of European universities that discussed international rankings. The participants found positive aspects to rankings but also had criticisms:

Going through the comparison of the various methodologies, the report details what is actually measured, how the scores for indicators are measured, and how the final scores are calculated — and therefore what the results actually mean.
The first criticism of university rankings is that they tend to principally measure research activities and not teaching. Moreover, the ‘unintended consequences’ of the rankings are clear, with more and more institutions tending to modify their strategy in order to improve their position in the rankings instead of focusing on their main missions.
For some ranking systems, lack of transparency is a major concern, and the QS World University Ranking in particular was criticized for not being sufficiently transparent.
The report also reveals the subjectivity in the proxies chosen and in the weight attached to each, which leads to composite scores that reflect the ranking provider’s concept of quality (for example, it may be decided that a given indicator may count for 25% or 50% of overall assessment score, yet this choice reflects a subjective assessment of what is important for a high-quality institute). In addition, indicator scores are not absolute but relative measures, which can complicate comparisons of indicator scores. For example, if the indicator is number of students per faculty, what does a score of, say, 23 mean? That there are 23 students per faculty member? Or does it mean that this institute has 23% of the students per faculty compared with institutes with the highest number of students/faculty? Moreover, considering simple counts or relative values is not neutral. As an example, the Academic Ranking of World Universities ranking does not take into consideration the size of the institutions.
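The ambiguity over relative scores that the report raises can be put in a few lines of arithmetic (the figures are invented):

```python
# A published students-per-faculty "score" of 23 admits two readings.

max_ratio = 35   # highest students-per-faculty ratio among ranked institutes (invented)
published = 23   # the score as it appears in the ranking table

# Reading 1: the score is the raw ratio itself.
as_raw_ratio = published                          # 23 students per faculty member

# Reading 2: the score is a percentage of the highest observed ratio.
as_percent_of_max = published / 100 * max_ratio   # 8.05 students per faculty member

print(as_raw_ratio, as_percent_of_max)
```

Unless the provider says which convention applies, the same published number describes two very different institutions.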

I am not sure these criticisms are entirely fair. It seems that the weighting of the various indicators in the Times Higher Education rankings emerged from a lot of to-ing and fro-ing between various stakeholders and advisers. In the end, far too much weighting was given to citations but that is not quite the same as assigning arbitrary or subjective values.

The Shanghai rankings do have an indicator, productivity per capita, that takes faculty size into account although it is only ten per cent of the total ranking. The problem here is that faculty in the humanities are counted but not their publications.

I am not sure why QS is being singled out with regard to transparency. The THE rankings are also, perhaps in a different way, quite opaque. Aggregate scores are given for teaching environment, research and international orientation without indicating the scores that make up these criteria.

So what is to be done?

The EUA report makes several recommendations for ranking-makers, including the need to mention what the ranking is for, and for whom it is intended. Among the suggestions to improve the rankings, the following received the greatest attention from the audience:
  1. Include non-journal publications properly, including books, which are especially important for social sciences and the arts and humanities;
  2. Address language issues (is an abstract available in English, as local language versions are often less visible?);
  3. Include more universities: currently the rankings assess only 1–3% of the 17,000 existing universities worldwide;
  4. Take into consideration the teaching mission with relevant indicators.

The first of these may become feasible now that Thomson Reuters has a book citation index. The second and third are uncontroversial. The fourth is very problematical in many ways.

The missing indicator here is student quality. To be very blunt, universities can educate and instruct students but they can do very little to make them brighter.  A big contribution to any university ranking would be a comparison of the relative cognitive ability of its students. That, however, is a goal that requires passing through many minefields.

Thursday, December 01, 2011

Diversity and Rankings

Robert Morse, director of data research at the US News and World Report discusses the question of whether "diversity" should be included in the ranking of American law schools.

"I was one of the speakers on the "Closing Plenary: Reforming U.S. News Rankings to Include Diversity" panel, which discussed many of the issues pertaining to whether U.S. News should add a measure of diversity directly into the Best Law Schools rankings. I pointed out that U.S. News believes diversity is important and that is why we already publish a separate law school diversity index.

Our current index identifies law schools where law students are most and least likely to encounter classmates from a different racial or ethnic group. However, the current index does not measure how successful each law school is at meeting a diversity goal or benchmark at the school, state, local, or national level. It also gives schools enrolling one ethnic group a low score, though that school's enrolment may match its state's ethnic population or the school may be a Historically Black College or University. It's for these reasons the current index would not be appropriate to add into the rankings".

Diversity here does not mean diversity of ideology, religion, class, politics or nationality. It simply means the numbers of recognised minorities, mainly African-Americans, Hispanics and Asian Americans.

It is interesting to look at the diversity index and to see the likely effect of including diversity in the law school rankings. The most diverse law school is the University of Hawaii. The University of the District of Columbia and Florida International University also get high scores. Low scorers include Harvard, Yale and UCLA.

Somehow, I do not think that an indicator that benefited Florida International University at the expense of Harvard would add to the credibility of these rankings.

Unless it can be demonstrated that there is something magically transformative about the statistical profile of a law school reflecting that of its city, state, or nation, this proposal does not sound like a very good idea.
The Utility of Rankings

Another advantage of a good performance in international university rankings is that graduates will be able to get into Russian postgraduate programs, provided their university is in a G8 country.

Russia’s education ministry is currently drawing up a list of foreign universities whose qualifications will be recognized.

The list will include only universities located within the G8 countries that enter the top 300 in the Academic Ranking of World Universities or the QS World University Rankings. Officials say there will be around 300 institutions meeting the criteria.

The reform is intended to attract more students to take part in Russian MA and PhD programs.

Saturday, November 26, 2011

Update 5 on El Naschie vs Nature

The BBC has another piece on the Nature case by Pallab Ghosh. It seems that El Naschie is now admitting that his papers were not peer reviewed but argues that this was because he had no peers who could review them:

He said that he would discuss his papers with fellow scientists, and only when he thought that they were of a sufficiently high standard would he publish them. "I am too arrogant and have too much self-respect to allow a bad paper to pass through," he said.
Prof El Naschie called one witness, Prof Otto Rossler - an honorary editor of Chaos, Solitons and Fractals.
He told the court that there was no-one who could peer review him, referring to Prof El Naschie, because "if you have something new to offer, peer review is dangerous", adding that in such cases "peer review delays progress in science".
Prof El-Naschie asked his witness whether he thought that his (Prof El Naschie) papers were of "poor quality".
Prof Rossler replied: "On the contrary, they were very important and will become more important in the future."
And he added: "You are the most hard-working and diligent scientist I have ever met."

It is useful to compare Rossler with Neil Turok, Nature's expert witness. Rossler is best known lately for warning that the Large Hadron Collider risked creating black holes that would destroy the world. See El Naschie Watch for this and other information. He is also a self-proclaimed simultaneous submitter, something that for journal editors is almost as bad as plagiarism. A comment on Rossler's claims concludes: "To conclude: this text would not pass the referee process in a serious journal".

It seems that it is increasingly difficult to argue that Alexandria University's remarkable scores for research impact in the Times Higher Education World University Rankings were the result of outstanding, excellent or even controversial research.

I am not sure what an honorary editor is but at the moment Rossler is not listed as any sort of editor at the official Chaos Solitons and Fractals home page.

Incidentally, since the trial will presumably turn to the question of El Naschie's affiliations at some point, this page lists him as Founding Editor but does not give any affiliation.
The Politics of Ranking

One of the more interesting aspects of the university ranking business is the way it is used by local politicians to advance their agenda. This is especially obvious in Malaysia where errors and methodological changes have sent local universities bouncing up and down the QS rankings. Every rise is proclaimed to be a vindication of government policy while every fall is accompanied by head shaking from the opposition.

This year Universiti Malaya moved into the QS top 200. There is nothing surprising about that: it has been there before. More significant was getting into the Shanghai Academic Ranking of World Universities top 500. That is a lot harder but also more likely to reflect real underlying changes. It seems that UM has finally realised that a little bit of encouragement and financial support can produce quite significant results in a short period of time.

Patrick Lee, in the blog of opposition leader Lim Kit Siang, comments that:

PETALING JAYA: Malaysia has little to show for its universities despite spending more money on tertiary education than do many other countries.
Malaysian universities lag behind many counterparts in Asia, including those located in neighbouring countries like Thailand and Singapore, according to a World Bank report released today.
“While Malaysia spends slightly more than most countries on its university students, leading Malaysian universities perform relatively poorly in global rankings,” said the report, entitled Malaysia Economic Monitor: Smart Cities.
Citing the Quacquarelli Symonds (QS) World University Rankings 2010, it noted that Universiti Malaya (UM) was ranked 207th worldwide and 29th in Asia.
It also quoted a US News and World 2011 report on the World’s Best Universities, which put UM, Universiti Kebangsaan Malaysia, Universiti Sains Malaysia and Universiti Putra Malaysia at 167th, 279th, 335th and 358th place respectively.
Even more worrying, the World Bank report observed, was the “increasing gap” between Malaysia’s and Singapore’s universities.
It compared UM with the National University of Singapore (NUS), which QS cited as the leading university in Southeast Asia.
“The gap between UM and NUS has been high and generally increasing, especially in the sciences,” the report said.
According to the report, UM and NUS were on par when it came to science and technology in 2005. However, UM has lost out to NUS over the past six years.
The report also said many of Malaysia’s university graduates did not seem to have the skills that would help them get employment.

Firstly, the QS and the US News and World Report rankings are the same. Secondly, it is a lot easier to start a university in Malaysia than in Singapore.

Even so, moving into the Shanghai rankings is a real advance and should be recognised as such. 
Update 4 on El Naschie vs Nature

Gervase de Wilde has an article on the case in Inforrm's Blog. He refers to the Guardian's account and then adds the following comment:

The case seems to offer ammunition to libel reformers. Even in the absence of the ill-advised and incoherent aspects of his case which were excluded before trial, and of the implicit comparisons of his work to Einstein’s made during the first five days at the High Court, his claim against a venerable and highly respected scientific journal seems a poor substitute for meeting their allegations head on in some form of correspondence or public debate. Moreover, the journal had published the Claimant’s own defence of his methods in running CSF, that he sought to emphasise scientific content above impressive affiliations, in the original article.

A spokesperson for the Libel Reform campaign, speaking to the Guardian, commented that reform can’t come soon enough, since
“Scientists expect publications like Nature to investigate and write about controversies within the scientific community. The threat of libel action is preventing scientific journals from discussing what is good and bad science.”

However, the public interest defence argued for by campaigners is one which is already being employed. The BBC reports that Andrew Caldecott QC’s opening statement for the Defendants described their defence as relying on the article being “true, honest opinion and responsible journalism on an issue of public interest”.

As the choice of witnesses indicates, the case does touch on the seemingly incomprehensible branch of physics in which the Claimant has made his academic career. In this respect there is a threat of a libel action stifling academic debate, and a similarity to BCA v Singh 2010 EWCA Civ 350, where opinions expressed in a controversy on what was essentially a scientific matter were at issue. But it is also about the methods he employed in running a publication in the context of a widely recognised system of accreditation and review, and about allegations regarding the professional affiliations which feature on his website. These are the kind of criticisms that might be made about any professional person, and would not necessarily come under scope of a scientific exception for “rigorous debate” on good and bad science urged by campaigners.

It is questionable whether the phrase "academic career" is appropriate, since El Naschie does not appear to have held any formal permanent academic post recently, unless one counts the award of an Emeritus Professorship by Alexandria University, a strange distinction since there is no sign of the professorship from which he retired.

Saturday, November 19, 2011

Update 3 on El Naschie vs Nature

The Guardian has a substantial report on the case by Alok Jha. It seems that El Naschie believes that expert witness Neil Turok is unqualified to understand his work. It is difficult to see how this argument, even if valid, is relevant to the point of whether or not peer review took place.

Should the court decide in favour of El Naschie, it would provide some sort of justification for the methods used in the citations indicator in the Times Higher Education rankings which gave high scores to Alexandria University mainly or partly because of the many citations of papers by El Naschie.

El Naschie is suing Nature as a result of a news article published in 2008, after the scientist's retirement as editor-in-chief of the journal Chaos, Solitons and Fractals. The article alleged that El Naschie had self-published several research papers, some of which did not seem to have been peer reviewed to an expected standard and also said that El Naschie claimed affiliations and honorary professorships with international institutions that could not be confirmed by Nature. El Naschie claims the allegations in the article were false and had damaged his reputation.
On Friday, Nature called Professor Neil Turok, a cosmologist and director of the Perimeter Institute in Canada, as an expert witness to assess some of the work published by El Naschie.
Turok described his expertise as being in cosmology. "I work at the theoretical end of cosmology … my work consists of applying unified theories, such as string theory, to the most difficult questions in cosmology, namely the beginning of the universe or the initial singularity, the moment where everything was at a single point in the conventional description."
In his evidence, Turok said he found it difficult to understand the logic in some of El Naschie's papers. The clear presentation of scientific ideas was an important step in getting an idea accepted, he said. "There are two questions – one is whether the work is clearly presented and readers would be able to understand it. It would be difficult for a trained theoretical physicist to understand [some of El Naschie's papers]. I couldn't understand it and I made a serious attempt to understand it. The second question is about the correctness of the theory and that will be decided by whether it agrees with experiments. Most theories in theoretical physics are speculative – we form a logical set of rules and deductions and we try, ultimately, to test the deductions in experiments. For me, clear presentation is the first thing in the presentation of a theory."
In response, El Naschie pointed out that even Albert Einstein had made mistakes in his publications. "Einstein is the most sloppy scientist ever. He never defined his quantities, he doesn't put in references and he made so many mistakes of mathematics and concepts. He was a very natural man when he explained something to lay people. But Einstein, whom I admire very much because he had imagination and the courage to stand up to the bloody Nazis, Einstein was an extremely sloppy man."
Later in the session, El Naschie accused Turok of having "no idea" about mathematics and being unqualified to assess his work. "If somebody doesn't understand things, it's his own limitation," El Naschie said.

Friday, November 18, 2011

Times Higher Social Science Rankings

1.  Stanford
2.  Harvard
3.  Oxford
The Influence of Rankings

Varsity, the student newspaper at Cambridge, suggests that British universities are recruiting staff in order to improve their position in the QS rankings:

Matthew Knight, chairman of Universities HR and the University of Leeds HR director, said: “Within the context of £9,000 fees, many universities have a strategic drive to improve the quality of the student experience.
“Therefore, many are taking the opportunity to improve student staff ratios regardless of the numbers of applicants. So there’s a lot of recruitment going on at some universities, although there’s no specific pattern to this.”
As the QS World University Rankings use student-faculty ratios as the only globally comparable indicator to determine their tables, an increase in employment can be used to promote a university’s image and attract students.

Thursday, November 17, 2011

Update 2 on El Naschie and Nature

Note that New Scientist describes El Naschie as an "independent physicist". Does this imply that he has no affiliation and that Nature was correct in questioning his claims to academic status?
Update on El Naschie and Nature

The New Scientist has provided some coverage of the trial which is also discussed at El Naschie Watch. On November 15, this item by Chelsea Whyte appeared:

Benjamin De Lacy Costello, a materials scientist at the University of the West of England in Bristol, UK, testified yesterday that when El Naschie was editor, the peer-review process at Chaos, Solitons and Fractals was "frustrating" and unlike that of other journals.

With regard to the dispute over El Naschie's affiliations, Timothy John Pedley, former head of the department of applied mathematics and theoretical physics at the University of Cambridge, said that El Naschie was a visiting scholar with access to libraries and collaborations at the department, but was not an honorary scholar working with the privileges of a professor.

On November 16 this update appeared:
Update: Mohamed El Naschie, a former editor of the scientific journal Chaos, Solitons and Fractals, appeared in London's High Court today for the libel lawsuit he has brought against the scientific journal Nature.

El Naschie is representing himself.
During El Naschie's cross-examination of journalist Quirin Schiermeier, who wrote the 2008 article about him, Schiermeier stood by the content of the work, saying, "We wrote the article because you published 58 papers in one year in a journal where you acted as editor-in-chief. That is unusual and potentially unethical."

El Naschie responded that he felt it wasn't unheard of for journals to publish work that isn't peer-reviewed. He also said that his work had been stolen. "We published my work to secure it," he told the court. "Senior people are above this childish, vain practice of peer review."

I am not an expert, but it seems that El Naschie no longer disputes that his pattern of self-publication was unusual or that there had been little or no peer review. He is simply claiming that publication was necessary to preempt the theft of his work by rivals and that the absence of peer review was excused by his seniority. Whether that is inconsistent with Nature's comments is, I assume, a matter for the judge to decide.

El Naschie and Nature

The El Naschie vs Nature case is under way at the Royal Courts of Justice in London.

Briefly, Mohamed El Naschie, the former editor of the journal Chaos, Solitons and Fractals, is suing Nature and the writer Quirin Schiermeier over comments on the publication in that journal of many of his own papers.

El Naschie is claiming that he was defamed by the suggestion that his papers were of poor quality and were published without a normal peer review process. He also claims that he had been defamed by the imputation that he had claimed academic affiliations to which he was not entitled.

The case is of vital importance to academic freedom since if successful it would mean that wealthy persons could stifle even the most balanced and temperate comments on scientific and scholarly activities.

It is also of importance to the question of international university ranking since El Naschie's unusual self-publication and self-citation within a short period of time, in a field where citations are low, allowed Alexandria University to achieve an extraordinarily high score in the 2010 Times Higher Education World University Rankings. Even this year, the university had an unreasonably high score in the ranking's research impact indicator. If El Naschie were successful in his claim then Times Higher and Thomson Reuters, who collected and analysed the data for the rankings, would be able to argue that they had uncovered a small pocket of excellence.

The case has been covered extensively in El Naschie Watch and has been discussed in the scientific press.

Updates will be provided from time to time.

Wednesday, November 16, 2011

The THE Subject Rankings

The ranking season has drawn to a close, or at least it will when we have digested the feasibility report from the European Commission's U-Multirank project. Meanwhile, to tie up some loose ends, here are the top 3 from each of THE's subject group rankings.

Engineering and Technology

1.  Caltech
2.  MIT
3.  Princeton

Arts and Humanities

1.  Stanford
2.  Harvard
3.  Chicago

Clinical, Pre-Clinical and Health

1.  Oxford
2.  Harvard
3.  Imperial College London

Life Sciences

1.  Harvard
2.  MIT
3.  Cambridge

Physical Sciences

1.  Caltech
2.  Princeton
3.  UC Berkeley

Social Sciences

To be posted on the 17th of November.

Monday, November 07, 2011

Conference in Shanghai

I hope to post something in a day or two on the recent World Class Universities conference in Shanghai. Meanwhile, there is an interesting comment by Alex Usher of Higher Education Strategy Associates, a Canadian consulting firm.

"In discussions like this the subject of rankings is never far away, all the more so at this meeting because its convenor, Professor Nian Cai Liu, is also the originator of the Academic Ranking of World Universities, also known as the Shanghai Rankings. This is one of three main competing world rankings in education, the others being the Times Higher Education Supplement (THES) and the QS World Rankings.

The THES and QS rankings are both commercially-driven exercises. QS actually used to do rankings for THES, but the two parted ways a couple of years ago when QS’s commercialism was seen to have gotten a little out of hand. After the split, THES got a little ostentatious about wanting to come up with a “new way” of doing rankings, but in reality, the two aren’t that different: they both rely to a considerable degree on institutions submitting unverified data and on surveys of “expert” opinion. Shanghai, on the other hand, eschews surveys and unverified data, and instead relies entirely on third-party data (mostly bibliometrics).

In terms of reliability, there’s really no comparison. If you look at the correlation between the indicators used in each of the rankings, THES and QS are very weak (meaning that the final results are highly sensitive to the weightings), while the Shanghai rankings are very strong (meaning their results are more robust). What that means is that, while the Shanghai rankings are an excellent rule-of-thumb indicator of concentrations of scientific talent around the world, the QS and THES rankings in many respects are simply measuring reputation.

(I could be a bit harsher here, but since QS are known to threaten academic commentators with lawsuits, I’ll be circumspect.)

Oddly, QS and THES get a lot more attention in the Canadian press than do the Shanghai rankings. I’m not sure whether this is because of a lingering anglophilia or because we do slightly better in those rankings (McGill, improbably, ranks in the THES’s top 20). Either way, it’s a shame, because the Shanghai rankings are a much better gauge of comparative research output, and with its more catholic inclusion policy (500 institutions ranked compared to the THES’s 200), it allows more institutions to compare themselves to the best in the world – at least as far as research is concerned. "

Some technical points. First, the Times Higher Education Supplement changed its name to Times Higher Education when it converted to a magazine format in 2008.

Second, the Shanghai rankings are not entirely free from commercial pressures themselves, although that has probably had the laudable effect of maintaining a stable methodology since 2003.

Third, both THE and QS accept data from institutions but both claim to have procedures to validate them. Also, the Shanghai rankings do include data from government agencies in their productivity per capita criterion and in some places that might not be any more valid than data from universities.

Fourth, until recently there has been a significant difference in the expert opinion used by THE and by QS. Most of QS's survey respondents were drawn from the mailing lists of the Singapore- and London-based academic publisher World Scientific, while THE's are drawn from those who have published papers in the ISI indexes. All other things being equal, we would expect THE's respondents to be more expert. This year the difference has been reduced somewhat, as QS are getting most of their experts from the Mardev lists, supplemented by a sign-up facility.

Fifth, although THE publish a list of 200 universities in print and on their site, a fairly easily downloadable iPhone app lists 400 universities.

The most important point, though, is the question of consistency. It is quite true that the various indicators in the Shanghai rankings correlate quite closely or very closely with one another (.46 to .90 in 2011, according to a conference paper by Ying Chen and Yan Wu of the Shanghai Center for World-Class Universities), while some of those in the QS and THE rankings have little or no relation to one another. However, it could be argued that if two indicators correlate highly with one another then they are to some extent measuring the same thing and one of them is redundant. Still, that is probably better than indicators which statistically have little to do with one another.
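
The correlation point can be made concrete with a toy sketch. All the numbers below are invented indicator scores for five imaginary universities, and `pearson` is just a bare-hands correlation helper; the point is only that when two indicators track each other the final ranking is robust, and when they do not, the weightings decide everything.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented per-university scores for three indicators.
publications = [90, 80, 70, 60, 50]
awards_hi    = [88, 82, 71, 58, 51]   # tracks publications closely
reputation   = [95, 40, 85, 30, 70]   # barely related to publications

print(round(pearson(publications, awards_hi), 2))   # close to 1
print(round(pearson(publications, reputation), 2))  # much weaker
```

With the first pair, any mix of weightings produces much the same ordering; with the second, shifting weight between the two indicators reshuffles the table.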

What is more important, perhaps, is consistency from one year to another. The main virtue of the Shanghai rankings is that changes in position can be assumed to reflect actual real-world changes, whereas those in the THE and QS rankings could easily be the result of methodological changes or, in the case of THE, omissions or inclusions.

Friday, October 28, 2011

An Error

This replaces an earlier post.

Last year Times Higher Education admitted to an error involving Monash University and the University of Adelaide:

Also, after the launch of the World University Rankings 2010 it became apparent that, owing to a data processing error, the ranking positions of two Australian universities in the top 200 list were incorrect — the University of Adelaide and Monash University.

Both universities remain in the top 1 per cent of world universities.

This year, a representative of Adelaide commented on the error: 

Adelaide's DVCR Mike Brooks said it had been "disconcerting'' that there had been a data processing error last year in the first iteration of the revised rankings since their split from QS. "It certainly raises further questions about the credibility of the rankings,'' Professor Brooks said.

"Based on our own analysis we believe that we have a similar ranking this year to that of 2010. The shift in position is attributed to the error in the processing last year, ongoing changes in THE methodology and increased competition.''

"I think the students and the wider community are able to judge for themselves.  As South Australia's leading research-university and only member of the Group of Eight, I know that we are in an incredibly strong position for the future.''

Adelaide's fall seems to have been largely due to a massive fall in its score for research impact. How much of this was due to the correction of the 2010 error, how much to changes in THE methodology and how much to the inherent instability of the normalisation procedure is not clear.

Monday, October 17, 2011

GLOBAL: Despite ranking changes, questions persist 

My article on the Times Higher Education World University Rankings can be accessed at University World News.

The international university ranking scene is starting to look like the heavyweight boxing division. Titles are proliferating and there is no longer an undisputed champion of the world. Caltech has just been crowned top university by Times Higher Education and Thomson Reuters, their data collectors and analysts, while QS have put Cambridge in first place. Over at Webometrics, MIT holds the number one spot. But Harvard has the consolation of remaining the top university in the Scimago and HEEACT rankings as well as the Academic Ranking of World Universities, ARWU, published by Shanghai Jiao Tong University.

Read here

Sunday, October 09, 2011

Rising Stars of the THE - TR Rankings

These are some of the universities that have risen significantly in the rankings compared to last year:

Karolinska Institute
UC Davis
Michigan State

Caltech in First Place

The big news of the 2011 THE - TR rankings is that Caltech has replaced Harvard as the world's top university. So how exactly did they do it?

According to the Times Higher iPad apps for this year and last (easily downloadable from the rankings page), Harvard's total score fell from 96.1 to 93.9 and Caltech's from 96.0 to 94.8, turning a 0.1 Harvard lead into one of 0.9 for Caltech.

Harvard continued to do better than Caltech in two indicators, with 95.8 for teaching and 67.5 for international orientation, compared with 95.7 and 56.0 for Caltech.

Caltech is much better than Harvard in industry income - innovation, but that indicator has a weighting of only 2.5%.

Harvard's slight lead in the research indicator has turned into a slight lead of 0.8 for Caltech.

Caltech is still ahead for citations but Harvard caught up a bit, narrowing the lead to 0.1.

So, it seems that what made the difference was the research indicator. It seems unlikely that Caltech could overcome Harvard's massive lead in reputation for research and postgraduate teaching: last year it was 100 compared with 23.5. That leaves us with research income per faculty.
According to Phil Baty:

"Harvard reported funding increases that are similar in proportion to those of many other universities, whereas Caltech reported a steep rise (16 per cent) in research funding and an increase in total institutional income."

This seems generally compatible with Caltech's 2008-2009 financial statement according to which:

Before accounting for investment losses, total unrestricted revenues increased 6.7% including JPL, and 14.0% excluding JPL


Research awards in FY 2009 reached an all-time high of $357 million, including $29 million of funds secured from the federal stimulus package. Awards from federal sponsors increased by 34.4%, while awards from nonfederal sponsors increased by 20.7%.  We also had a good year in terms of private giving, as donors continue to recognize the importance of the research and educational efforts of our outstanding faculty and students.

It seems that research income is going to be the tie-breaker at the top of the THE - TR rankings. This might not be such a good thing. Income is an input, not a product, although universities everywhere apparently treat it as one. There will be negative backwash effects if academics devote their energies to securing grants rather than actually doing research.
Update on Alexandria

Elnaschiewatch reports that Hend Hanafi, President of Alexandria University, has resigned following prolonged student protests.

Apparently she was under fire because of her links to the old regime but one wonders whether her university's apparent fall of nearly 200 places in the THE - TR rankings gave her a final push. If so, we hope that Times Higher will send a letter of apology for unrealistically raising the hopes of faculty and students. 
Meanwhile over in Alexandria
One of the strangest results of the 2010 THE - TR rankings was the elevation of Alexandria University in Egypt to the improbable status of fourth best university in the world for research impact and 147th overall. It turned out that this was almost entirely the work of precisely one marginal academic figure, Mohamed El Naschie, former editor of the journal Chaos Solitons and Fractals, whose work was copiously cited by himself, by other authors in his journal and by authors in an Israeli-published journal (now purchased by De Gruyter) of which he was an editor.

The number of citations collected by El Naschie was not outrageously high, but it was much higher than usual for his discipline and many of them came within a year of publication. This meant that El Naschie and Alexandria University received massive credit for his citations, since Thomson Reuters' normalisation system meant comparison with the international average in a field where citations are low, especially in the first year after publication.

Alexandria was not the only university to receive an obviously inflated score for research impact. Hong Kong Baptist University received a score of 97.6 and Bilkent one of 95.7, although in those two cases it seems that the few papers that contributed to these scores did have genuine merit.

It should be remembered that the citation scores were averages and that a few highly cited papers could have a grossly disproportionate effect if the total number of published papers was low.
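
A toy illustration of that averaging effect, with all numbers invented: `mean_normalised_impact` is a hypothetical stand-in for a field-normalised citation average, where 1.0 means "cited exactly at the world average for the field and year".

```python
def mean_normalised_impact(paper_scores):
    """Average field-normalised citation score across a university's papers.

    1.0 = cited exactly at the world average for the paper's field and year."""
    return sum(paper_scores) / len(paper_scores)

# Invented numbers: a small institution with 20 papers, one of them cited
# 50x the field average, versus a large one with 2,000 ordinary papers.
small = [1.0] * 19 + [50.0]
large = [1.2] * 2000

print(mean_normalised_impact(small))           # 3.45 -- one paper inflates the mean
print(round(mean_normalised_impact(large), 2)) # 1.2
```

On this (invented) arithmetic, the small institution with one spectacular outlier comfortably "outperforms" the large one, which is the pattern the Alexandria case displayed.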

This year Thomson Reuters went to some lengths to reduce the impact of a few highly cited papers. They have to some extent succeeded. Alexandria's score for citations is down to 61.4 (it is in 330th place overall), Bilkent's to 60.8 (222nd place overall) and HKBU's to 59.7 (290th place overall).

These scores are not as ridiculous as those of 2010, but they are still unreasonable. Are we really expected to believe that these schools have a greater research impact than the University of Sydney, Kyoto University, the London School of Economics, Monash University and Peking University, all of which have scores in the fifties for this indicator?

I for one cannot believe that a single paper or a few papers, no matter how worthwhile, can justify inclusion in the top 300 world universities.

There is another problem. Normalisation of citations by year is inherently unstable. One or two papers in a low citation discipline cited within a year of publication will give a boost to the citations indicator score but after a year their impact diminishes because the citations are now coming more than a year after publication.
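
A rough sketch of why the one-year window is unstable, with invented baseline figures for a low-citation field: the same four citations look spectacular against a first-year average and ordinary against a second-year one.

```python
def normalised_score(citations, world_average):
    """Citations relative to the world average for the same field and year."""
    return citations / world_average

# Invented baselines for a low-citation field: papers published this year
# average 0.2 citations so far; by the following year the average is 2.0.
cites = 4

print(round(normalised_score(cites, 0.2), 1))  # 20.0 -- huge boost at first
print(round(normalised_score(cites, 2.0), 1))  # 2.0  -- fades a year later
```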

Alexandria's score was due to fall anyway, because El Naschie has published very little lately, so his contribution to the citations score would have fallen whatever methodological changes were introduced. And if he ever starts publishing again?

Also, if Thomson Reuters are normalising by field across the board, this raises the possibility that universities will be able to benefit simply by reclassifying research grants, moving research centres from one field to another, manipulating abstracts and keywords, and so on.

Friday, October 07, 2011

Who else is down ?

Just looking at the top 200 of the THE rankings, these universities have fallen quite a bit.

University of North Carolina Chapel Hill
Ecole Normale Superieure
Ecole Polytechnique
Trinity College Dublin
University College Dublin
William and Mary College
University of Virginia
Asian Decline?

The Shanghai rankings have shown that universities in Korea, China (including Taiwan and Hong Kong) and the Middle East have been steadily advancing over the years. Did they get it wrong?

The latest Times Higher Education - Thomson Reuters rankings appear to prove that Asian universities have inexplicably collapsed over the last year. Tokyo has dropped from 26th to 30th place. Peking has fallen twelve places to 49th. Pohang University of Science and Technology and the Hong Kong University of Science and Technology have slipped out of the top fifty. Bilkent and Hong Kong Baptist University are way down. The decline of China University of Science and Technology is disastrous, from 49th to 192nd. Asian universities are going to be dangerous places for the next few days, with students and teachers dodging university administrators jumping out of office windows.

Of course, massive declines like this do not reflect reality: they are simply the result of the methodological changes introduced this year. 

Anyone accessing a ranking site or downloading an iPad app should be made to click on a box reading "I understand that the methodological changes in the rankings mean that comparison with last year's ranking is pointless and I promise not to issue a public statement or say anything to anyone until I have had a cup of tea and I have made sure that everybody else understands this."

Thursday, October 06, 2011

New Arrivals in the THE Top 200.

Every time a new ranking is published there are cries for the dismissal or worse of vice-chancellors or presidents who allowed their universities to lose ground. There will no doubt be more demands as the results of this year's THE rankings are digested. This will be very unjust since there are reasons why universities might take a tumble that have nothing to do with any decline in quality.

First, Thomson Reuters, THE's data collectors, have introduced several methodological changes. In the top 20 or 30 these might not mean very much, but lower down the effect could be very large.

Second, rankers sometimes make mistakes, and so do those who collect data for institutions.

Third, many new universities have taken part this year. I counted thirteen just in the top 200 and there are certainly many more in the 200s and 300s. A university ranked 200th last year would lose 13 places even if it had exactly the same relative score.

The thirteen newcomers are Texas at Austin, Rochester, the Hebrew University of Jerusalem, the University of Florida, Brandeis, the Chinese University of Hong Kong, Nijmegen, the Medical University of South Carolina, Louvain, Université Paris Diderot (Paris VII), Queen's University (Canada), Sao Paulo and Western Australia.
Highlights of the THE rankings

Some interesting results.

57. Ohio State University
103. Cape Town
107. Royal Holloway
149. Birkbeck
184. Iowa State
197. Georgia Health Sciences University
201-225. Bilkent
201-225. University of Medicine and Dentistry of New Jersey
226-250. Creighton University, USA
226-250. Tokyo Metropolitan
251-275. Wayne State
276-300. University of Crete
276-300. University of Iceland
276-300. Istanbul Technical University
276-300. Queensland University of Technology
276-300. Tokyo Medical and Dental University
301-350. Alexandria
301-350. Aveiro University
301-350. Hertfordshire
301-350. Plymouth University, UK
301-350. Sharif University of Technology
301-350. National University of Ireland, Maynooth
301-350. Taiwan Ocean University
301-350. Old Dominion University, USA
THE Rankings Out

Here is the top 10.

1. Caltech
2. Harvard
3. Stanford
4. Oxford
5. Princeton
6. Cambridge
7. MIT
8. Imperial College London
9. Chicago
10. Berkeley
THE Rankings: Caltech Ousts Harvard

This is from the Peninsula in Qatar

LONDON: US and British institutions once again dominate an annual worldwide league table of universities published yesterday, but there is a fresh name at the top, unseating long-time leader Harvard.
California Institute of Technology (Caltech) knocked the famous Massachusetts institution from the summit of the Times Higher Education (THE) league table for the first time in eight years, with US schools claiming 75 of the top 200 places.
Next is Britain, which boasts 32 establishments in the top 200, but an overhaul in the way in which the country’s universities are funded has raised concerns over its continuing success.
Asia’s increasing presence in the annual table has stalled, with 30th placed University of Tokyo leading the continent’s representation.
China’s top two universities hold on to their elite status, but no more institutions from the developing powerhouse managed to break into the top 200.
THE attributed Caltech’s success to “consistent results across the indicators and a steep rise in research funding”.
THE Rankings


The Guardian appears to have heard something.

On Thursday, the Times Higher Education publishes its global universities rankings. As usual, UK universities shine disproportionately. Altogether a dozen are in the top 100 in the world, with seven in the top 50.

Wednesday, October 05, 2011

Latin American Rankings

QS have produced their new Latin American rankings. The Top five are:

1. Universidade de Sao Paulo
2. Pontificia Universidad Catolica de Chile
3. Universidade Estadual de Campinas, Brazil
4. Universidad de Chile
5. Universidad Nacional Autonoma de Mexico (UNAM)

In Times Higher Education, Terrance Karran claims that universities that do well in the THE rankings (and the other ones?) are those that show more regard for academic freedom, which he equates with "compliance" with the AAUP's academic freedom statement.

Perhaps an annual prize could be awarded to the university that has the most academic freedom. I propose that it be called the Lawrence Summers Prize.

David Willetts, the British minister for universities and science, says that he expects more British universities to be in the Times Higher Education World University Rankings top 200.

And if more British universities, then fewer........?

Tuesday, October 04, 2011

The US News rankings

The U.S. News rankings of American colleges and universities were released on September 13th. For more information go here.

The top 10 national universities are:

1. Harvard
2. Princeton
3. Yale
4. Columbia
5= Caltech
5= MIT
5= Stanford
5= Chicago
5= University of Pennsylvania
10. Duke

Wednesday, September 14, 2011

Announcement from THE

Times Higher Education have just announced that they will only rank 200 universities this year. Another 200 will be listed alphabetically but not ranked.

Let us be clear: the Times Higher Education World University Rankings list only the world’s top 200 research-led global universities.

We stop our annual list at the 200th place for two reasons. First, it helps us to make sure that we compare like with like. Although those ranked have different histories, cultures, structures and sizes, they all share some common characteristics: they recruit from the same global pool of students and staff; they push the boundaries of knowledge with research published in the world’s leading journals; and they teach at both the undergraduate and doctoral level in a research-led environment.
We unashamedly rank only around 1 per cent of the world’s universities – all of a similar type – because we recognise that the sector’s diversity is one of its great strengths, and not every university should aspire to be one of the global research elite.
But we also stop the ranking list at 200 in the interests of fairness. It is clear that the lower down the tables you go, the more the data bunch up and the less meaningful the differentials between institutions become. The difference between the institutions in the 10th and 20th places, for example, is much greater than the difference between number 310 and number 320. In fact, ranking differentials at this level become almost meaningless, which is why we limit it to 200.
If THE are going to provide sufficient detail about the component indicators to enable analysts to work out how universities compare with each other, this would be a good idea. It would avoid raucous demands that university heads resign whenever the top national university slips 20 places in the rankings, but would still allow analysts to figure out exactly where schools stand.

It is true, as Phil Baty says, that there is not much difference between being 310th and 320th, but there is, or there would be if the methodology were valid, a difference between 310th and 210th. If THE are just going to present us with a list of 200 universities that did not (quite?) make it into the top 200, a lot of usable information will be lost.

The argument that THE is interested only in ranking the leading research-led institutions seems to run counter to THE's emphasis on its bundle of teaching indicators and to the claim that normalisation of citations data can uncover hidden pockets of excellence. If we are concerned only with universities with a research-led environment, then a few pockets, or even a single pocket, should be of little concern.

One also wonders what would happen if disgruntled universities decided that it was not worth the effort of collecting masses of data for TR and THE if the only reward is to be lumped among 200 also-rans.

Tuesday, September 13, 2011

700 Universities

QS have released a ranked list of 700 universities. See here.

Sunday, September 11, 2011

QS: The Employer Survey

The employer survey indicator in the QS World University Rankings might be regarded as a valuable assessment tool since it provides an external check on university quality. There are, however, some odd things about this indicator in the 2011 QS Rankings.

Thirteen universities are given scores of 100, of which 10 are listed as in 4th= place, presumably meaning that they had scores that were identical down to the first or second decimal place. Then 15 schools are listed as being in 15th place with a score of 90, 48 in 51st place with a score of 59.4 and 52 in 100th= place with a score of 55.9.

This probably has something to do with a massive upsurge in responses from Latin America, although exactly what happened is not clear. QS report that:

"QS received a dramatic level of response from Latin America in 2011, these counts and all subsequent analysis have been adjusted by applying a weighting to responses from countries with a distinctly disproportionate level of response."
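
QS have not published the details of this adjustment, but a cap on any one country's share of responses is one plausible form it could take. The sketch below is purely hypothetical: the `reweight` function, the `cap_share` parameter and the response counts are all invented for illustration.

```python
def reweight(responses_by_country, cap_share=0.25):
    """Down-weight countries whose share of survey responses exceeds a cap.

    A guess at the kind of adjustment QS describe; the real rule is unpublished."""
    total = sum(responses_by_country.values())
    weights = {}
    for country, n in responses_by_country.items():
        share = n / total
        weights[country] = cap_share / share if share > cap_share else 1.0
    return weights

# Invented response counts with a Latin American surge.
counts = {"Brazil": 5000, "US": 2000, "UK": 1500, "Germany": 1500}
weights = reweight(counts)
print(weights)  # Brazil down-weighted to 0.5, the rest left at 1.0
```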

The economist David Blanchflower has dismissed the QS rankings as "a load of old baloney".

Much of what he says is sensible, indeed obvious. But not entirely.

"This ranking is complete rubbish and nobody should place any credence in it."

A bit too strong. The QS rankings are not too bad in parts, having improved over the last few years, and are moderately accurate at sorting out universities within a country or region. I doubt that anyone seriously thinks that Cambridge is the best university in the world, unless we start counting May balls and punting on the Cam, but it is quite reasonable to say that it is better than Oxford or Durham. Similarly, I wonder if anyone could argue that it is rubbish that Tokyo is the best university in Japan or Cape Town the best in Africa.

"It is unclear whether having more foreign students and faculty should even have a positive rank; less is probably better."

Students yes, but if nothing else more international faculty does mean that a university is recruiting from a larger pool of talent.

Blanchflower does not mention the academic and employer surveys, both of which are flawed but do provide another dimension of assessment, or the faculty-student ratio, which is very crude but might have a slightly closer relationship to teaching quality than the number of alumni who received Nobel prizes decades ago.

He then goes on to compare the QS rankings unfavorably with the Shanghai rankings (which are produced by Shanghai Jiao Tong University, not, as he calls it, the University of Shanghai). I would certainly agree with most of what he says here, but I think we should remember that, flawed as they are, the QS rankings do, unlike the Shanghai index, give some recognition to excellence in the arts and humanities, make some attempt to assess teaching and provide a basis for discriminating among those universities without Nobel prize winners or Fields medalists.

Finally, I would love to see if Blanchflower has any comments on last year's THE-Thomson Reuters rankings which put Alexandria, Bilkent and Hong Kong Baptist University among the world's research superpowers.

Friday, September 09, 2011

Well Done, QS
QS have just indicated that they have excluded self-citations from their citations per faculty indicator in this year's World University Rankings. This is a very positive move that will remove some of the distortions that have crept into this indicator over the last few years. It would have been even better if they had excluded citations within journals and within institutions. Maybe next year.
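
The arithmetic of the change is simple enough. A hypothetical sketch follows: the function name and all the figures are invented, although the 93,000 self-citation count for Harvard quoted below suggests the rough magnitude involved.

```python
def citations_per_faculty(citations, self_citations, faculty, exclude_self=True):
    """Citations-per-faculty indicator, optionally stripping self-citations
    (the change QS say they have made this year)."""
    total = citations - self_citations if exclude_self else citations
    return total / faculty

# Invented figures: a large university where ~13% of citations are
# self-citations, loosely echoing the Harvard proportion quoted below.
with_self = citations_per_faculty(720_000, 93_000, 4000, exclude_self=False)
without = citations_per_faculty(720_000, 93_000, 4000, exclude_self=True)
print(round(with_self), round(without))  # 180 157
```

The absolute drop is modest, but because every institution's self-citation rate differs, the change reshuffles relative positions on this indicator.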

It will be interesting to see if Times Higher Education and Thomson Reuters do the same with their rankings in October. It would not be very difficult and it might help to exclude Alexandria University and a few others from an undeserved place among the world's top universities for research impact.

(By the way Karolinska Institute is not in the US)

Although it may not make very much difference at the very top of this indicator, it seems that some places have suffered severely and others have benefited from the change. According to the QS intelligence unit:

  • Of all of the institutions we looked at the institution with the largest absolute number of self-citations, by some margin, is Harvard with over 93,000 representing 12.9% of their overall citations count
  • The top five institutions producing over 3,000 papers, in terms of proportion of self-citations are all in Eastern Europe – St Petersburg State University, Czech Technical University, Warsaw University of Technology, Babes-Bolyai University and Lomonosov Moscow State University
  • The top five in terms of the difference in citations per paper when self-citations are excluded are Caltech, Rockefeller, UC Santa Cruz, ENS Lyon and the University of Hawaii
  • And the top 10 in terms of the difference in citations per faculty when self-citations are included are:
# Institution Country
1 California Institute of Technology (Caltech) United States
2 Rockefeller University United States
3 Stanford University United States
4 Gwangju Institute of Science and Technology (GIST) South Korea
5 Karolinska Institute United States
6 Princeton University United States
7 Leiden University Netherlands
8 Harvard University United States
9 University of California, San Diego (UCSD) United States
10 University of California, San Francisco (UCSF) United States

Wednesday, September 07, 2011

The Best University in the World
Update 8/9/2011 -- some comments added

For many people the most interesting thing about the QS rankings is the battle for the top place. The Shanghai rankings put Harvard in first place year after year and no doubt will do so for the next few decades. QS, when it was in partnership with Times Higher Education, also routinely put Harvard first. This is scarcely surprising, since the research prowess of Cambridge has steadily declined in recent years. Still, Cambridge, Oxford and two London colleges did quite well, mainly because they got high scores for international faculty and students and for the academic survey (not surprising, since a disproportionate number of responses came from the UK, Australia and New Zealand), but not well enough to overcome their less distinguished research records.

Last year, however, Cambridge squeezed past Harvard. This was not because of the academic and employer surveys: those remained at 100 for both places. What happened was that between 2009 and 2010 Cambridge's score for citations per faculty increased from 89 to 93. This would be a fine achievement if it represented a real improvement. Unfortunately, almost every university with a score above 60 for this indicator in 2009 went up by a similar margin in 2010, while universities with scores below 50 slumped. Evidently, there was a new method of converting raw scores. Perhaps a mathematician out there can help.
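
Whatever the new method was, the general point is that a converted score depends on the conversion function as much as on raw performance: change the function and the displayed gap between two universities changes even when their raw scores do not. A purely hypothetical sketch, in which both transforms and the raw figures are invented:

```python
def linear_scale(raw, raw_max):
    """Simple linear rescaling of a raw indicator value to 0-100."""
    return 100 * raw / raw_max

def sqrt_scale(raw, raw_max):
    """A compressive rescaling (a hypothetical stand-in for whatever QS
    changed): it lifts mid-range scores and squeezes the top together."""
    return 100 * (raw / raw_max) ** 0.5

# Invented raw citation rates, with the leader normalised to 1.0.
raw_chaser, raw_leader = 0.79, 1.0

print(round(linear_scale(raw_chaser, raw_leader)))  # 79
print(round(sqrt_scale(raw_chaser, raw_leader)))    # 89 -- gap to the leader narrows
```

Under the first transform the chaser trails the leader by 21 points; under the second, by 11, without either university publishing a single extra paper.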

And this year?

Cambridge and Harvard are both at 100 for the academic and employer surveys, just like last year. (Note that although Harvard does better than Cambridge in both surveys, they get the same reported score of 100.)

For the faculty-student ratio, Harvard narrowed the gap a little, from 3 to 2.5 points. In citations per faculty, Cambridge slipped by 0.3 points. However, Cambridge pulled further ahead on international students and faculty.

Basically, from 2004 to 2009 Harvard reigned supreme because its obvious superiority in research was more than enough to offset the advantages Cambridge enjoyed with regard to internationalisation (a small country and policies favouring international students), the faculty-student ratio (which counts non-teaching research staff) and the academic survey (disproportionate responses from the UK and Commonwealth). But this year and last, the change in the method of converting raw scores for citations per faculty artificially boosted Cambridge's overall score.

So, is Cambridge really the world's top university?

Tuesday, September 06, 2011

The THE-TR Rankings

The THE-TR World University Rankings will be published on October 6th.

There will be some changes. The weighting given to the citations indicator will be slightly reduced to 30% and internationalisation gets 7.5% instead of 5%.

There will be some tweaking of the citations indicator to avoid a repeat of the Alexandria and other anomalies. Let's hope it works.

In the research indicator there will be a reduction in the weighting given to the survey and public research income as a percentage of research income will be removed.

There will, unfortunately, be a slight increase in the weighting given to international students and a decline in that for international faculty.

Monday, September 05, 2011

Commentary on the 2011 QS World University Rankings

From India

"University of Cambridge retains its number one spot ahead of Harvard, according to the QS World University Rankings 2011, released today. Meanwhile, MIT jumps to the third position, ahead of Yale and Oxford.

While the US continues to dominate the world ranking scenario, taking 13 of top 20 and 70 of top 300 places, 14 of 19 Canadian universities have ranked lower than 2010. As far as Europe is concerned, Germany, one of the emerging European destinations in recent times, has no university making it to the top 50 despite its Excellence Initiative.

Asian institutions - particularly those from Japan, Korea, Singapore, Hong Kong and China - have fared well at a discipline level in subject rankings produced by QS this year - this is particularly true in technical and hard science fields.

Despite the Indian government's efforts to bring about a radical change in the Indian higher education sector, no Indian university has made it to the top 200 this year. However, China has made it to the top 50 and Middle East in the top 200 for the first time.

According to Ben Sowter, QS head of research, "There has been no (relative) improvement from any Indian institution this year. The international higher education scene is alive with innovation and change, institutions are reforming, adapting and revolutionising. Migration amongst international students and faculty continues to grow with little sign of slowing. Universities can no longer do the same things they have always done and expect to maintain their position in a ranking or relative performance.""
Commentary on 2011 QS World University Rankings

SEÁN FLYNN, Education Editor
TCD AND UCD have continued to slide down the world university rankings in a trend which will concern Government, business and heads of colleges.
The latest QS rankings – published this morning – show a substantial drop in ranking for most Irish universities.
TCD drops down 13 places to 65; UCD is down 20 places from 114 to 134. NUI Galway suffers the most dramatic fall, down 66 places to 298. UCC bucked the trend, up marginally from 184 to 181.
The new international league table is a serious blow to the Irish university sector. Two years ago TCD was in the elite top 50 colleges, while UCD was in the top 100. Over the past two years both of Ireland’s leading colleges have lost significant ground.
The fall in Irish rankings was widely expected as the university sector has struggled to cope with a 6 per cent decline in employment and a funding crisis.
Commentary on the 2011 QS World University Rankings

"In tough times, good news comes for Australian institutions in Eighth QS World University Rankings®
- Eighth annual QS World University Rankings® sees all of the Group of Eight featured in the top 300
- Australian National University (26) remains Australia’s best-performing university but falls by 6 places.
- Seventeen Australian institutions featured in the top 300
- Based on six indicators including surveys of over 33,000 global academics and 16,000 graduate employers, the largest of their kind ever conducted
- New in 2011: results published alongside comparative international tuition fees"
Commentary on the 2011 QS World University Rankings

"PETALING JAYA: Universiti Malaya (UM) is the only Malaysian institution that has made it to the top 200 of the QS World University Rankings 2011/12.
It moved up 40 places to 167 this year compared to 207 in 2010.

Universiti Kebangsaan Malaysia (UKM), Universiti Sains Malaysia (USM), Universiti Putra Malaysia (UPM) and Universiti Teknologi Malaysia (UTM) have all slid down the rankings (see table).

UKM is ranked 279 this year compared to 263 in 2010; USM at 335 (309), UPM 358 (319) and UTM at between 401 and 450 (365).
For the first time, the International Islamic University Malaysia (IIUM) and Universiti Teknologi Mara (UiTM) were included in the rankings at 451-500 and 601+ respectively."
Commentary on the QS 2011 World University Rankings

"Dubai: UAE University (UAEU) has moved up 34 places to come 338th in the Quacquarelli Symonds (QS) World University Rankings, which looked at more than 2,000 institutions to come up with a top 500 list.
UAEU officials said the university is working toward a top 100 spot. The university was also ranked 299th in the Life Sciences & Medicine subject category. The University of Cambridge was ranked as the top university in the world, followed by Harvard University, Massachusetts Institute of Technology (MIT), Yale University and the University of Oxford.
Saudi Arabia's King Saud University (KSU) came 200th and tops the list among Middle East institutions with King Fahd University of Petroleum & Minerals (KFUPM) and King Abdul Aziz University (KAU) coming in second and fifth respectively. American University of Beirut came third and UAEU fourth."
QS Rankings Published

The rankings have now been published and can be accessed here.

The top 300 are included with total scores and tuition fees.
QS Rankings Update

Some highlights are provided by CNW

  • Global:  University of Cambridge retains number one spot ahead of Harvard, while MIT jumps to third ahead of Yale and Oxford; 38 countries in top 300
  • Government and private funding for technology-focussed research is eroding the dominance of traditional comprehensive universities. The average age of the top 100 institutions has dropped by seven years since 2010, reflecting the emergence of newer specialist institutions particularly in Asia
  • US/Canada: US takes 13 of top 20 and 70 of top 300 places; McGill (17) and Toronto (23) both up, but 14 of 19 Canadian universities rank lower than in 2010
  • UK/Ireland: Oxford (5) and Imperial (6) leapfrog UCL (7), as four UK universities make the top 10; TCD (65) and UCD (134) both drop
  • Continental Europe: ETH Zurich (18) leads ENS Paris (33), EPFL (35) and ParisTech (36); no German university in top 50 despite Excellence Initiative
  • Asia: HKU (22) leads Tokyo (25), NUS (28) and Kyoto (32); India: IITB drops out of top 200; China: Tsinghua (47) joins Peking (46) in top 50
  • Australia: Gap between ANU (26) and Melbourne (31) closes from 18 to five, ahead of Sydney (38); G8 all make top 100
  • Middle East: King Saud University (200) makes top 200 for first time
  • Latin America: USP (169) makes top 200 for first time; five universities in top 300 (Brazil, Chile and Argentina)
QS Rankings Update

The BBC reports on Scottish universities in the rankings.

The University of Glasgow has climbed 18 places in an international league table of higher education institutions.
Glasgow is now 59th in the QS World University Rankings, ahead of St Andrews which is in 97th place.
The University of Edinburgh is the highest ranked Scottish institution moving up two places to 20th position.
Principal of Glasgow, Professor Anton Muscatelli, said it had confirmed its position as one of the world's leading universities.

QS Rankings Update

The National reports that UAE University has risen from 372nd to 338th place.

QS Rankings 2011 Update

According to the Herald Sun, Cambridge has retained its place at the top of the QS rankings.

MELBOURNE is clawing its way up the ranks of the world's best universities, but Canberra is clinging on to top spot.
Australian National University is the nation's best tertiary institution, claiming 26th spot in the international league table.
But Melbourne University is hot on its heels - ranked 31st after jumping seven spots over the past year.
The UK's famed Cambridge University has claimed pole position, followed by Harvard University, Massachusetts Institute of Technology and Yale University in the US.
Oxford University rounded out the top five, according to the QS World University Rankings, released yesterday.
Victoria's other top performers were Monash University, which jumped one spot to 60, and RMIT at 228.

QS Rankings Update

Although they have not yet been released, some news about the QS 2011 rankings is trickling out.

From Todayonline in Singapore

NTU [Nanyang Technological University] , NUS [National University of Singapore] climb the ladder in global university rankings 
NTU leaps 16 places to take 58th spot, while NUS moves up three notches to take 28th spot

Saturday, September 03, 2011

The QS Rankings are Coming

QS will release their 2011 World University Rankings at 0101 GMT on Monday. They have already sent out fact files to the 600+ listed universities.

Things to look for:

Will Harvard regain its position at the top from Cambridge? It might if QS revert to the previous method of converting the raw scores for the citations per faculty indicator.

Will spending cuts lead to a decline in the observed quality of British universities?

Will universities in China, Korea, Latin America and the Middle East repeat the successes they recorded in the Shanghai rankings?

Will Universiti Malaya return to the top 200? If it does, will it be acknowledged as the number one Malaysian university?

Watch this space

Monday, August 29, 2011

Japanese Universities Send a Strong Request

A very interesting document from the top 11 Japanese research universities has appeared. They are unhappy with the citations indicator in last year's Times Higher Education / Thomson Reuters World University Rankings.

"The purpose of analyzing academic research data, particularly publication and citation trends is to provide diverse objective information on universities and other academic institutions that can be used by researchers and institutions for various evaluations and the setting of objectives. The 2010 Thomson Reuters / THE World University Rankings, however, do not give sufficient consideration to the unique characteristics of universities in different countries or the differing research needs and demands from society based on country, culture and academic field. As a result, those rankings are likely to lead to an unbalanced misleading and misuse of the citation index.

RU11 strongly requests, therefore, that Thomson Reuters / THE endeavors to contribute to academic society by providing objective and impartial data, rather than imposing a simplistic and trivialized form of university assessment."

It is a tactical mistake to go on about uniqueness. This is an excuse that has been used too often by institutions whose flaws have been revealed by international rankings.

Still, they do have a point. They go on to compare the position of Asian universities on the citations indicator in the THE-TR rankings with three other measures: the citations per paper indicator in the 2010 QS Asian University Rankings, citations per paper over an 11-year period from TR's Essential Science Indicators, and the citations per faculty indicator in the 2010 QS World University Rankings (the document says citations per paper here, but I assume citations per faculty is meant, since the QS World University Rankings do not have a citations per paper indicator). On the THE-TR indicator, leading Japanese universities do badly while Chinese, Korean and other Asian universities do very well.

They complain that the THE-TR rankings emphasise "home run papers" and research that produces immediate results, and that regional modification (normalisation) discriminates against Japanese universities.

This is no doubt a large part of the story, but I suspect that the distortions of the 2010 THE-TR indicator also reflect differences in the practice of self-citation and intra-university citation, the fact that TR's methodology actually favors those who publish relatively few papers, and its bias towards low-cited disciplines.
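To make the normalisation complaint concrete, here is a minimal sketch in Python. The formula TR actually uses has not been disclosed, and all names and figures here are invented for illustration; the point is only that dividing citation impact by a regional average can lift a university from a low-citation region above one with higher raw impact from a high-citation region.

```python
# Hypothetical sketch of "regional modification": a university's
# citation impact divided by the average impact of its region.
# TR's real formula is not public; these numbers are illustrative.

def regionally_modified(impact, region, regional_average):
    """Divide raw citation impact by the region's average impact."""
    return impact / regional_average[region]

# Invented averages: a region whose universities are cited less on
# average gives its members a larger boost after modification.
regional_average = {"Japan": 1.2, "China": 0.8}

japan_univ = regionally_modified(1.3, "Japan", regional_average)
china_univ = regionally_modified(1.0, "China", regional_average)

# The Chinese university's raw impact (1.0) is lower than the
# Japanese one's (1.3), yet it scores higher after modification.
print(round(japan_univ, 3))  # 1.083
print(round(china_univ, 3))  # 1.25
```

Whether this mechanism, or something like it, is what actually disadvantages the RU11 universities depends on details TR has not released.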

The document continues:

"1. The ranking of citations based on either citations per author (or faculty) or citations per paper represent two fundamentally different ways of thinking with regards to academic institutions: are the institutions to be viewed as an aggregation of their researchers, or as an aggregation of the papers they have produced?  We believe that the correct approach is to base the citations ranking on citations per faculty as has been the practice in the past.

2. We request a revision of the method used for regional modification. 

3. We request the disclosure of the raw numerical data used to calculate the citation impact score for the various research fields at each university."
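The distinction in point 1 is easy to illustrate. In this hedged sketch with entirely invented figures, a large university producing many moderately cited papers and a small selective one producing fewer, better-cited papers swap places depending on which denominator is used:

```python
# Invented figures showing how the two denominators the RU11 document
# contrasts can reverse a ranking.

def per_paper(cites, papers):
    return cites / papers

def per_faculty(cites, faculty):
    return cites / faculty

# name: (total citations, papers, faculty) -- illustrative only
universities = {
    "Large Comprehensive U": (50_000, 10_000, 2_000),
    "Small Selective U":     (12_000,  1_500, 1_000),
}

per_paper_rank = sorted(
    universities,
    key=lambda u: per_paper(universities[u][0], universities[u][1]),
    reverse=True)
per_faculty_rank = sorted(
    universities,
    key=lambda u: per_faculty(universities[u][0], universities[u][2]),
    reverse=True)

print(per_paper_rank[0])    # Small Selective U (8.0 vs 5.0 per paper)
print(per_faculty_rank[0])  # Large Comprehensive U (25.0 vs 12.0 per faculty)
```

Neither denominator is "correct" in the abstract; as the RU11 document says, they embody different views of what a university is.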

I suspect that TR and THE would reply that their methodology identifies pockets of excellence (which for some reason cannot be found anywhere in the Japanese RU11), that the RU11 are just poor losers and that they are right and QS is wrong.

This question might be resolved by looking at other measures of citations such as those produced by HEEACT, SCImago and ARWU.

It could be that this complaint, if it was sent to TR, was the reason for TR and THE announcing that they were changing the regional weighting process this year. If that turns out to be the case and TR is perceived as changing its methodology to suit powerful vested interests, then we can expect many academic eyebrows to be raised.

If the RU11 are still unhappy, then THE and TR might see a repeat of the demise of the Asiaweek rankings, brought on in part by a mass abstention by Japanese and other universities.