Thursday, December 23, 2010

Top US Colleges by Salary

PayScale has published its ranking of US colleges by mid-career median salary. The top five are:

1. Harvey Mudd College
2. Princeton
3. Dartmouth
4. Harvard
5. Caltech

The top schools in various categories are:

Engineering: Harvey Mudd
Ivy League: Princeton
Liberal Arts: Harvey Mudd
Party Colleges: Union College, NY
Private Research Universities: Princeton
State Universities: Colorado School of Mines

Saturday, December 04, 2010

Can 25 Million Citations be Wrong?

Perhaps not, but a few hundred might.

University World News has an article by Phil Baty, deputy editor of Times Higher Education, that discusses the recent THE World University Rankings. He is mainly concerned with the teaching component of the rankings and I hope to discuss this in a little while. However, there are some remarks about the citation component that are worth commenting on. He says:

"We look at research in a number of different ways, examining research reputation, income and research volume (through publication in leading academic journals indexed by Thomson Reuters). But we give the highest weighting to an indicator of 'research influence', measured by the number of times a university's published research is cited by academics around the globe.

We looked at more than 25 million citations over a five-year period from more than five million articles. All the data were normalised to reflect variations in citation volume between different subject areas.

This indicator has proved controversial, as it has shaken up the established order, giving high scores to some smaller institutions with clear pockets of research excellence, often at the expense of the larger research-intensive universities.

We make no apology for recognising quality over quantity, but we concede that our decision to openly include in the tables the two or three extreme statistical outliers, in the interests of transparency, has given some fuel for criticism, and has given us some food for thought for next year's table."

First, normalisation of data means that the number of citations is compared to a benchmark derived from the world average number of citations for a subject area. A large number of citations might mean that an article has been warmly received. It might equally well mean that the article was in a field where articles are typically cited a lot. Comparing raw citation counts in a field like literary studies with those in medical research would be unfair to the former, since citations there are relatively scarce. So part of the reason for Alexandria University's remarkable success in the recent THE WUR was not just the number of citations of the papers of Mohamed El Naschie but also that he was publishing in a field with a low frequency of citations. Had he published papers in medicine, nobody would have noticed.
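
To make the arithmetic concrete, here is a rough sketch in Python of how such field (and year) normalisation might work. The baseline figures and the structure are my own illustration, not Thomson Reuters' actual formula, which has not been published in detail.

    # Illustrative only: assumed world-average citation rates per (field, year).
    world_baseline = {
        ("mathematical physics", 2008): 2.0,   # a lightly cited field (invented figure)
        ("clinical medicine", 2008): 15.0,     # a heavily cited field (invented figure)
    }

    def normalised_impact(papers):
        """papers: list of (field, year, citations); returns the mean of actual/expected."""
        ratios = [cites / world_baseline[(field, year)] for field, year, cites in papers]
        return sum(ratios) / len(ratios)

    # The same 20 citations look very different once normalised:
    print(normalised_impact([("mathematical physics", 2008, 20)]))   # 10.0
    print(normalised_impact([("clinical medicine", 2008, 20)]))      # about 1.3

The point is simply that 20 citations to a mathematics paper can count for several times more than 20 citations to a medical paper.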

It is also very likely -- although I cannot recall seeing direct confirmation -- that Thomson Reuters were benchmarking by year, so that a university would score more for a citation to a recently published article than for one to an article published four years earlier.

In the case of Alexandria and other universities that scored unexpectedly well, we are not talking about millions of citations. We are talking about dozens of papers and hundreds of citations, the effect of which has been enormously magnified because the papers were in a low-citation field and were cited within months of publication.

Remember also that we are talking about averages. This means that a university will get a higher score the smaller the number of its papers published in ISI-indexed journals. Alexandria did not do so well just because El Naschie published a lot and was cited a lot. It also did well because, overall, it published few articles. Had Alexandria researchers published more, its score would have been correspondingly lower.
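
A toy calculation shows how strong this averaging effect is. All the numbers below are invented; the only point is that a handful of very highly cited papers transforms a small publication count and almost disappears in a large one.

    # Suppose a few "super-cited" papers each score 50 times the world average,
    # while every other paper is roughly at the world average (1.0).
    def average_impact(super_papers, ordinary_papers):
        total = super_papers * 50.0 + ordinary_papers * 1.0
        return total / (super_papers + ordinary_papers)

    print(average_impact(5, 95))     # about 3.45 -- a small university looks spectacular
    print(average_impact(5, 2000))   # about 1.12 -- a large one barely moves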

Perhaps El Naschie constitutes a clear pocket of excellence, although that is not entirely uncontroversial. But he is a clear pocket of excellence who only became visible because of the modest achievement of the rest of the university. Conversely, there are probably many modestly cited researchers in Europe and the USA who might welcome a move to a university in Asia or Latin America where a few papers and citations in a lightly cited discipline would blossom into such a pocket.

Is Alexandria one of two or three anomalies? There are in fact many more, perhaps not quite so obvious, and this can be seen by comparing scores for research impact with other rankings of research output and impact, such as HEEACT and Scimago, or with the scores for research in the THE rankings themselves. It would also be interesting if THE released the results of the academic reputational survey.

Consider what would happen if we had a couple of universities that were generally similar, with the same income, staff-student ratio and so on. One, however, had published two or three times as many ISI-indexed articles as the other. Both had a few researchers who had been cited more frequently than is usual for their discipline. Under the current system, the first university would get a much lower score than the second. Can this really be considered a preference for quality over quantity? Only if we think that publishing in ISI journals adds to quantity but does not indicate quality.

I hope that 'food for thought' means a radical revision of the citations indicator.

A minimal list of changes would include adding more markers of research impact, removing self-citations and citations from the same university and the same journal, and combining the scores for the various disciplinary fields.

If this can be done, then the THE rankings may become what was promised.

Friday, December 03, 2010

Is there a Future for Citations?

simplification administrative has some caustic comments on the role of self-citation and reciprocal citation in the remarkable performance of Alexandria University in the 2010 THE rankings.

The title, 'Bibliometry -- already broken', is perhaps unduly pessimistic but THE and Thomson Reuters are going to have to move quickly if they are to rescue their rankings. An obvious remedy would include removing self-citations and intra-university and intra-journal citations from the count.

Monday, November 29, 2010


An article by William Patrick Leonard in the Korea Herald discusses how this year's rankings by Shanghai Jiao Tong University, THE and QS, whatever their faults, indicate a long-term shift in academic excellence from the USA and Europe to China and other parts of Asia.

"All three release their findings as the school year begins. Each employs a similar blend of indicators, which purportedly measure the relative quality of the institutions surveyed. All emphasize institutional reputation expressed in the quality and quantity of faculty publications, peer assessments, faculty/student ratios, budgets and other input quality measures.

These rankings are clearly flawed; for example, it is not evident that the volume of scholarly publications or peer assessments reflects quality in the classroom. Nevertheless, the rankings show that other fast-growing countries are willing to apply their resources to higher education, just as the United States has been doing for years"
He then observes how in the US sport seems to take precedence over education.

"Yet, instead of strengthening our academic programming, some are planning costly recreational diversions. In May, the National Football Foundation & College Hall of Fame announced that “six new college football teams are set to take the field for the first time this season with 11 more programs set to launch between 2011 and 2013.” Such an announcement is simultaneously sad and humorous. The resources spent to implement and subsequently prop up these programs could be used to improve technology, science, and engineering programs. Sadly, some institutions have opted for stadiums over instructional infrastructure."

Saturday, November 20, 2010

Comment on the New York Times Article

This is from Paul Wouters at CWTS, Leiden University

"However, the reason for this high position is the performance of exactly one (1) academic: Mohamed El Naschie, who published 323 articles in the Elsevier journal Chaos, Solitons and Fractals of which he is the founding editor. His articles frequently cite other articles in the same journal by the same author. On many indicators, the Alexandrian university does not score very high, but on the number of citations indicator the university scores 99.8 which puts it as the 4th most highly cited in the world. This result clearly does not make any sense at all. Apparently, the methodology used by the THES is not only problematic because it puts a high weight on surveys and perceived reputation. It is also problematic because the way the THES survey counts citations and makes them comparable across fields (in technical terms, the way these counts are normalized) is not able to filter out these forms of self-promotion by self-citations. In other words: the way the THES uses citation analysis does not meet one of the requirements of sound indicators: robustness against simple forms of manipulation."

BTW, to be a bit pedantic, it's not THES any more.
The Influence of Rankings

From an announcement about merit scholarships from The Islamic Development Bank.

"The successful candidate must secure admission to one universities listed in the Times Higher Education Supplement (THES)."

That obviously needs updating but does it mean Alexandria but not Texas?

Thursday, November 18, 2010


A previous post, "Debate, anyone?", contained some data about comparative rates of self-citation among various universities. The methodology used was not appropriate (calculating the number of articles that contained self-citations as a percentage of the sum of citations). I am recalculating, although the relative level of self-citation among universities is most unlikely to be affected.
Article in New York Times

The New York Times has a long article, which is being quoted globally, about the THE World University Rankings. So far it has been cited by newspapers in Italy, Spain, France and Egypt, with no doubt more to come.

Sunday, November 14, 2010

Another Ranking

An organisation called Eduroute has produced a ranking of universities by "web quality". The idea sounds interesting but we need to be given some more information. The top five are:

1. Harvard
2. MIT
3. Cornell
4. Stanford
5. UC Berkeley

After that there are some surprises with National Taiwan University in 7th place, University of the Basque Country 16th, Sofia University 44th, Yildiz Technical University 57th.

The structure of these rankings is as follows:

Volume 20%: "The volume of information published is measure by a set of commands that run on the major search engines."

Online scientific information 10%: "Eduroute measures this aspect through the search engines which specialize in publishing researches and scholarly articles and which search a university's website for all available publications."

Links quantity 30%: "Here Eduroute measures the number of incoming links whether these links are from academic or nonacademic websites."

Quality of links and contents 40%: "it was of great importance to measure this aspect of any website in order to reflect the true size of a university's website on the internet and to measure the degree in which the university is concerned with the quality of content it provides on its website."

This is rather vague and, since only rank order is given, there is no explanation for the high position of the University of the Basque Country, Yildiz Technical University and so on.

The location and personnel of Eduroute also remain mysterious. I did, however, receive this message:

"Eduroute is an organization involved in determining rankings for universities. We pride ourselves in offering people with a true collection of information that will assist them when it comes to classifying universities and their rankings.

When coming up with rankings for universities we put into consideration several parameters in order to come up with as accurate a conclusion as possible. This methodology is frequently evaluated and improved on to obtain a solid benchmark that can cast a true reflection of rankings for universities. Eduroute focuses on studying universities’ websites. We believe that the support and investment a university inputs into its website is proportional to the degree of interaction of the website and its users (students, staff and lecturers). The volume and content of the university’s website is analysed while also putting into consideration the traffic flow to the website. The number of external links leading to the university’s website is also a key factor as it is a reflection of how popular the site is. Such parameters and more are useful while determining the rankings for universities. Educational institutions also have the opportunity of registering with us so as to be included in the rankings.

At Eduroute we put all our energies in ensuring the rankings for universities we offer are as accurate as possible. We believe this information is of vital importance both to the general public and to the universities and as such we offer a professional service that satisfies both parties.


May Attia

Project Manager"

Friday, November 12, 2010

Article by Philip Altbach

Inside Higher Education has a substantial and perceptive article on international university rankings by Philip Altbach.

Towards the end there is a round-up of the current season. Some quotations:

Forty percent of the QS rankings are based on a reputational survey. This probably accounts for the significant variability in the QS rankings over the years. Whether the QS rankings should be taken seriously by the higher education community is questionable.


Some of AWRU’s criteria clearly privilege older, prestigious Western universities — particularly those that have produced or can attract Nobel prizewinners. The universities tend to pay high salaries and have excellent laboratories and libraries. The various indexes used also heavily rely on top peer-reviewed journals in English, again giving an advantage to the universities that house editorial offices and key reviewers. Nonetheless, AWRU’s consistency, clarity of purpose, and transparency are significant advantages.


Some of the rankings are clearly inaccurate. Why do Bilkent University in Turkey and the Hong Kong Baptist University rank ahead of Michigan State University, the University of Stockholm, or Leiden University in Holland? Why is Alexandria University ranked at all in the top 200? These anomalies, and others, simply do not pass the "smell test." Let it be hoped that these, and no doubt other, problems can be worked out.

Sunday, October 24, 2010

Navarra Round Table

Presentations by Nian Cai Liu, Zhuo Lin Feng, Daniel Torres-Salinas, Jean Rapp, Phil Baty and Isidro Aguillo at the Ranking Round Table in Navarra can be found here.

Tuesday, October 19, 2010

The THE-QS World Universities Rankings, 2004-2009

See here for a draft of an article on the THE-QS rankings.

Sunday, October 17, 2010

Debate, anyone?

Times Higher Education and Thomson Reuters have said that they wish to engage and that they will be happy to debate their new rankings methodology. So far we have not seen much sign of a debate, although I will admit that perhaps more things were said at the recent seminars in London and Spain than got into print. In particular, they have been rather reticent about defending the citations indicator, which gives the whole ranking a very distinctive cast and which is likely to drag down what could have been a promising development in ranking methodology.

First, let me comment on the few attempts to defend this indicator, which accounts for nearly a third of the total weighting and for more in some of the subject rankings. It has been pointed out that David Willetts, British Minister for Universities and Science, has congratulated THE on its new methodology.

“I congratulate THE for reviewing the methodology to produce this new picture of the best in higher education worldwide. It should prompt all of us who care about our universities to see how we can improve the range and quality of the data on offer. Prospective students — in all countries — should have good information to hand when deciding which course to study, and where. With the world to choose from, it is in the interests of universities themselves to publish figures on graduate destinations as well as details of degree ...”

Willetts has praised THE for reviewing its methodology. So have many of us, but that is not quite the same as endorsing what has emerged from that review.

Steve Smith, President of Universities UK and Vice-Chancellor of Exeter University, is explicit in supporting the new rankings, especially the citations component.

But, as we shall see in a moment, there are serious issues with the robustness of citations as a measure of research impact and, if used inappropriately, they can become indistinguishable from a subjective measure of reputation.

The President of the University of Toronto makes a similar point, praising the new rankings’ reduced emphasis on subjective reputational surveys and referring to the citations (knowledge transfer?) indicator.

It might be argued that this indicator is noteworthy for revealing that some universities possess hitherto unsuspected centres of research excellence. An article by Phil Baty in THE of the 16th of September refers to the most conspicuous case, a remarkably high score for citations by Alexandria University, which according to the THE rankings has had a greater research impact than any university in the world except Caltech, MIT and Princeton. Baty suggests that there is some substance to Alexandria University’s extraordinary score. He refers to Ahmed Zewail, a Nobel prize winner who left Alexandria with a master’s degree some four decades ago. Then he mentions some frequently cited papers by a single author in one journal.

The author in question is Mohamed El Naschie, who writes on mathematical physics and the journals – there are two that should be given the credit for Alexandria’s performance, not one – are Chaos, Solitons and Fractals and the International Journal of Nonlinear Sciences and Numerical Simulation. The first is published by Elsevier and was until recently edited by El Naschie. It has published a large number of papers by El Naschie and these have been cited many times by himself and by some other writers in CSF and IJNSNS.

The second journal is edited by Ji-Huan He of Donghua University in Shanghai, China with El Naschie as co-editor and is published by the Israeli publishing company, Freund Publishing House Ltd of Tel Aviv.

An amusing digression. In the instructions for authors in the journal the title is given as International Journal of Nonlinear Sciences and Numerical Stimulation. This could perhaps be described as a Freundian slip.

Although El Naschie has written a large number of papers and these have been cited many times, his publication and citation record is far from unique. He is not, for example, found in the ISI list of highly cited researchers. His publications and citations were perhaps necessary to push Alexandria into THE’s top 200 universities but they were not enough by themselves. This required a number of flaws in TR’s methodology.

First, TR assigned a citation impact score that compared the actual citations of a paper with a benchmark based on the expected number of citations for a specific subject in a specific year. Mathematics is a field where citations are relatively infrequent and usually occur a few years after publication. Since El Naschie published in a field in which citations are relatively scarce, and published quite recently, this boosted the impact score of his papers. The reason for using this approach is clear and sensible: to overcome the distorting effects of varying citation practices in different disciplines when comparing individual researchers or departments. But there are problems if this method is used to compare whole universities. A great deal depends on when the cited and citing articles were published and in which subject category they were classified by TR.

A question for TR: how are articles classified? Is it possible to influence the category in which they are placed by the use of keywords or the wording of the title?

Next, note that TR were measuring average citation impact. A consequence of this is that the publication of large numbers of papers that are cited less frequently than the high fliers could drag down the score. This explains an apparent oddity of the citation scores in the 2010 THE rankings. El Naschie listed nine universities as his affiliation in varying combinations between 2004 and 2008, yet it was only Alexandria that managed to leave the Ivy League and Oxbridge standing in the research impact dust. Recently, El Naschie’s list of affiliations has consisted of Alexandria, Cairo, Frankfurt University and Shanghai Jiao Tong University.

What happened was quite simply that all the others were producing so many papers that El Naschie’s made little or no difference. For once, it would be quite correct if El Naschie announced that he could not have done it without the support of his colleagues. Alexandria University owes its success not only to El Naschie and his citers but also to all those researchers who refrained from submitting articles to ISI-indexed journals or conference proceedings.

TR have some explaining to do here. If an author lists more than one affiliation, are they all counted? Or are fractions awarded for each paper? Is there any limit on the number of affiliations that an author may have? I think that it is two but would welcome clarification.

As for the claim that Alexandria is strong in research, a quick look at the Scimago rankings is enough to dispose of that. It is ranked 1,047th in the 2010 rankings, which admittedly include many non-university organizations, for total publications over a decade. Also, one must ask how much of El Naschie’s writing was actually done in Alexandria, seeing that he had eight other affiliations between 2004 and 2008.

It has to be said that even if El Naschie is, as has been claimed in comments on Phil’s THE article and elsewhere, one of the most original thinkers of our time, it is strange that THE and TR should use a method that totally undermines their claim that the new methodology is based on evidence rather than reputation. By giving any sort of credence to the Alexandria score, THE are asking us to believe that Alexandria is strong in research because precisely one writer is highly reputed by himself and a few others. Incidentally, will TR tell us what score Alexandria got in the research reputation survey?

I am not qualified to comment on the scientific merits of El Naschie’s work. At the moment it appears, judging from the comments in various physics blogs, that among physicists and mathematicians there are more detractors than supporters. There are also few documented signs of conventional academic merit in recent years, such as permanent full-time appointments or research grants. None of his papers between 2004 and 2008 in ISI-indexed journals, for example, apparently received external funding. His affiliations, if documented, turn out to be honorary, advisory or visiting. To be fair, readers might wish to visit El Naschie’s site. I will also publish any comments of a non-libellous nature that support or dispute the scientific merits of his writings.

Incidentally, it is unlikely that Alexandria’s score of 19.3 for internationalisation was faked. TR use a logarithm. If there were zero international staff and students, a university would get a score of 1, and a score of 19.3 actually represents a small percentage. On the other hand, I do wonder whether Alexandria counted those students in the branch campuses in Lebanon, Sudan and Chad.
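
Since TR have not published the exact transformation, the following is only a plausible illustration of how a logarithmic scale behaves, chosen so that zero per cent maps to a score of 1, as suggested above. On a scale like this, a score of 19.3 corresponds to only about one or two per cent international staff and students.

    import math

    # Hypothetical log scale: 0% -> 1, 100% -> 100. Not TR's actual formula.
    def log_score(pct_international, max_pct=100.0):
        return 1 + 99 * math.log1p(pct_international) / math.log1p(max_pct)

    print(log_score(0.0))     # 1.0
    print(log_score(1.35))    # about 19.3
    print(log_score(100.0))   # 100.0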

Finally, TR did not take the very simple and obvious step of not counting individual self-citations. Had they done so, they would have saved everybody, including themselves, a lot of trouble. It would have been even better if they had excluded intra-institutional and intra-journal citation. See here for the role of citations among the editorial board of IJNSNS in creating an extraordinarily high Journal Impact Factor.
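
The filtering involved would not be complicated. Here is a minimal sketch of the idea; the record fields and the test are my own simplification, and a real implementation over Web of Science data would need proper author and institution disambiguation.

    # Drop citations that are self-citations or that stay within one university or one journal.
    def is_suspect(cited, citing):
        same_author      = bool(set(cited["authors"]) & set(citing["authors"]))
        same_institution = bool(set(cited["institutions"]) & set(citing["institutions"]))
        same_journal     = cited["journal"] == citing["journal"]
        return same_author or same_institution or same_journal

    def filtered_citation_count(cited_paper, citing_papers):
        return sum(1 for c in citing_papers if not is_suspect(cited_paper, c))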

THE and TR have done everyone a great service by highlighting the corrosive effect of self citation on the citations tracking industry. It has become apparent that there are enormous variations in the prevalence of self citation in its various forms and that these have a strong influence on the citation impact score.

Professor Dirk Van Damme is reported to have said at the London seminar that the world’s elite universities were facing a challenge from universities in the bottom half of the top 200. If this were the case then THE could perhaps claim that their innovative methodology had uncovered reserves of talent ignored by previous rankings. But what exactly was the nature of the challenge? It seems that it was the efficiency with which the challengers turned research income into citations. And how did they do that?

I have taken the simple step of dividing the score for citations by the score for the research indicator (which includes research income) and then sorting the resulting values. The top ten are Alexandria, Hong Kong Baptist University, Barcelona, Bilkent, William and Mary, ENS de Lyon, Royal Holloway, Pompeu Fabra, University College Dublin and the University of Adelaide.
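
For anyone who wants to check, the calculation is as simple as it sounds. The scores below are placeholders standing in for the published indicator scores (only Alexandria's citations score of 99.8 comes from the rankings themselves; the rest are invented for illustration).

    # Ratio of the citations score to the research score, sorted in descending order.
    scores = {
        "Alexandria":        {"citations": 99.8, "research": 9.0},
        "Hong Kong Baptist": {"citations": 88.0, "research": 12.0},
        "Harvard":           {"citations": 98.8, "research": 98.7},
    }

    ratios = {name: s["citations"] / s["research"] for name, s in scores.items()}
    for name, ratio in sorted(ratios.items(), key=lambda kv: kv[1], reverse=True):
        print(name, round(ratio, 1))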

Seriously, these are a threat to the world’s elite?

The high scores for citations relative to research were the result of a large number of citations or a small number of total publications or both. It is of interest to note that in some cases the number of citations was the result of assiduous self-citation.

This section of the post contained comments about comparative rates of self-citation among various universities. The method used was not correct and I am recalculating.

As noted already, using the THE iPad app to change the importance attached to various indicators can produce very different results. This is a list of universities that rise more than a hundred places when the citations indicator is set to 'not important'. They have suffered perhaps because of a lack of super-cited papers, perhaps also because they just produced too many papers.

Sung Kyun Kwan
Texas A and M
Shanghai Jiao Tong University
Delft University of Technology
National Chiao Tung University (Taiwan)
Royal Institute of Technology Sweden

Here is a list of universities that fall more than 100 places when the citations indicator is set to ‘not important’. They have benefitted from a few highly cited papers or low publication counts or a combination of the two.

Boston College
University of California Santa Cruz
Royal Holloway, University of London
Pompeu Fabra
Kent State University
Hong Kong Baptist University
Victoria University Wellington
Tokyo Metropolitan University
University of Warsaw

There are many others that rise or fall seventy, eighty, ninety places when citations are taken out of the equation. This is not a case of a few anomalies. The whole indicator is one big anomaly.

Earlier, Jonathan Adams, in a column that has attracted one comment, said:

"Disciplinary diversity is an important factor, as is international diversity. How would you show the emerging excellence of a really good university in a less well known country such as Indonesia? This is where we would be most controversial, and most at risk, in using the logic of field-normalisation to add a small weighting in favour of relatively good institutions in countries with small research communities. Some may feel that we got that one only partially right."

The rankings do not include universities in Indonesia, really good or otherwise. The problem is with good, mediocre and not very good universities in the US, UK, Spain, Turkey, Egypt, New Zealand, Poland and so on. It is a huge weighting, not a small one; the universities concerned range from relatively good to relatively bad; in one case the research community seems to consist of one person; and many are convinced that TR got that one totally wrong.

Indonesia may be less well known to TR but it is very well known to itself and neighbouring countries.

I will publish any comments by anyone who wishes to defend the citation indicator of the new rankings. Here are some questions they might wish to consider.

Was it a good idea to give such a heavy weighting to research impact, 32.5% in the overall rankings and 37.5% in at least two subject rankings? Is it possible that commercial considerations, citations data being a lucrative business for TR, had something to do with it?

Are citations such a robust indicator? Is there not enough evidence now to suggest that manipulation of citations, including self-citation, intra-institutional citation and intra-journal citation, is so pervasive that the robustness of this measure is very slight?

Since there are several ways to measure research impact, would it not have been a good idea to have used several methods? After all, Leiden University has several different ways of assessing impact. Why use only one?
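
To illustrate what using more than one measure might look like, here is a sketch of two impact indicators that could be reported side by side. They are inspired by, but not identical to, the kinds of indicators Leiden uses; the threshold and the data are invented.

    def mean_normalised_impact(ratios):
        """Average of actual/expected citation ratios -- sensitive to a single extreme paper."""
        return sum(ratios) / len(ratios)

    def share_highly_cited(ratios, threshold=4.0):
        """Share of papers cited at least `threshold` times the world average --
        much harder to inflate with one outlier."""
        return sum(1 for r in ratios if r >= threshold) / len(ratios)

    papers = [0.5, 0.8, 1.1, 0.9, 60.0]        # one extreme outlier among ordinary papers
    print(mean_normalised_impact(papers))       # 12.66 -- dominated by the outlier
    print(share_highly_cited(papers))           # 0.2  -- the outlier counts only once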

Why set the threshold for inclusion so low, at 50 papers per year?

Monday, October 11, 2010

Auditing University Rankings

The Chronicle of Higher Education reports on a meeting of the International Rankings Expert Group's (IREG) Observatory on Academic Ranking and Excellence.

The organisation has set up a mechanism to audit the various university rankings.

"The audit project, which he [Gero Federkeil] is helping to manage, will be based closely on IREG's principles, which emphasize clarity and openness in the purposes and goals of rankings, the design and weighting of indicators, the collection and processing of data, and the presentation of results.

"We all say that rankings should aim at delivering transparency about higher-education institutions, but we think there should be transparency about rankings too," Mr. Federkeil said. The audit process could eventually give rise to an IREG quality label, which would amount to an identification of trustworthy rankings, thereby enhancing the credibility of rankings and improving their quality, Mr. Federkeil said.

At the Berlin meeting last week, Mr. Federkeil and Ying Cheng, of the Center for World-Class Universities at Shanghai Jiao Tong University, which produces the best-known and most influential global ranking of universities, outlined the proposed methodology and procedure for the audit. The IREG executive committee will nominate audit teams consisting of three to five people. The chair of each team must not have any formal affiliation with a ranking organization, and at least one member of the audit team must be a member of the IREG executive committee. Audits will be based on self-reported data as well as possible on-site visits, and each full audit is expected to take about five months to complete."

The executive committee of IREG includes Liu Nian Cai from Shanghai Rankings Consultancy, Bob Morse from US News & World Report and Gero Federkeil from CHE, the German ranking body.

Members of the Observatory include HEEACT (Taiwan), the Kazakhstan and Slovak ranking agencies and QS Intelligence Agency.

Would anybody like to guess who will be the first to be audited?

Friday, October 01, 2010

Return of the British Gift for Understatement

One of the tiresome things about the ranking season is the mania for inflated adjectives that readers of THE and other publications and sites have had to endure: rigorous, innovative, transparent, robust, sophisticated and so on.

It was a relief to read a column by Jonathan Adams in today's THE, which uses words like "small" and "good". Here is an example.

Disciplinary diversity is an important factor, as is international diversity. How would you show the emerging excellence of a really good university in a less well known country such as Indonesia? This is where we would be most controversial, and most at risk, in using the logic of field-normalisation to add a small weighting in favour of relatively good institutions in countries with small research communities. Some may feel that we got that one only partially right.

I think the bit about "using the logic of field-normalisation to add a small weighting in favour of relatively good institutions in countries with small research communities" refers to the disconcerting presentation of Alexandria University as the 4th best in the world for research impact.

Thursday, September 30, 2010

The THE Life Sciences Ranking (subscription required)

First is MIT, then Harvard, then Stanford. Nothing to argue about there.

For research impact (i.e. citations) the order is:

1. MIT
2. University of Barcelona
3. Harvard
4. Princeton
5. Stanford
6. Oxford
7. Dundee
8. Hong Kong
9. Monash
10. Berkeley

In this subject group, the citations indicator gets 37.5%.
Rankings Undermined by Flawed Indicator

See commentary in University World News

Tuesday, September 28, 2010

From the Straits Times of Singapore

Students at Nanyang Technological University in Singapore have been dismayed by its poor performance in the THE World University Rankings. Apparently they are concerned about future job prospects. The university was ranked 174th, much lower than it is used to. An apparently poor performance for citations (222nd) was at the root of the problem.

Su Guanning, the President, has written to the Straits Times:

"The QS 2010 Ranking is today the most widely-used world university ranking.
Nanyang Technological University was placed 74th in the list and the National
University of Singapore 31st, both down by one position from last year. The
performance of both universities has been consistent over the last four years.

NTU started off as a practice-oriented, teaching university in 1991 and
is in fact the youngest university in the Top 100 ranked in the QS 2010.

The Times Higher Education 2010 rankings is entirely new. Its criteria
have yet to be accepted by many universities. A detailed analysis reveals it is
88 per cent computed from research-related indicators with unusual normalization
of data resulting in some bizarre results. In a joint article in Edmonton
Journal, Indira Samarasekera, the President and Vice-Chancellor of the
University of Alberta, and Carl Amrhein, the Provost and Vice-President
(Academic) of the same university, wrote that the Times ranking has very
peculiar outcomes that do not pass the “reasonableness test”. They advised the
public to take the “rankings with a truckload of salt”.

Malcolm Grant, President and Provost of University College London pointed out to The Guardian that research citations, if not intelligently applied, can lead to bizarre
results. He cited the example of Egypt’s Alexandria University being ranked
above Harvard and Stanford universities in research influence in the Times
Higher Education 2010. "

Sunday, September 26, 2010

Something is Awry

A comment by the President of the University of Alberta.

"This year, Times Higher Education partnered with Thompson
Reuters to compile a new ranking, which was released a few days ago. This
ranking is also based on objective and subjective measures. A school’s final
score is based on an aggregate of teaching, research reputation/income,
citations/research influence, industry income and international mix. Academics are surveyed to rate teaching and research quality, a precarious exercise since teaching is difficult, if not impossible, for academics to assess at institutions other than their own. At first blush, this ranking appears to capture a broader range of indicators and does a good job of identifying the top 20. We applaud the University of Toronto for being named among the top 20; this is very good for Canada and Canadian universities.

However, when one looks under the hood, beyond the top 20 universities, the new ranking has some very peculiar outcomes that does not pass the “reasonableness test.” Alexandria University in Egypt received one of the highest scores for citations while also receiving the lowest score for research reputation. By comparison #1 Harvard received near perfect scores on both measures. Logic would suggest that the aggregate score for research reputation should correlate with the score for citations, which measure research influence. However, in spite of the vast discrepancy between its scores, Alexandria, which has never been on any other international ranking, came in ahead of many top universities. Furthermore many universities considered by peers as among the best do not appear in the top 200.
Something is awry.

Completely Obscure

The President of the University of Groningen has commented on the THE 2010-11 World University rankings:

"President of the Board of the University of Groningen Prof. Sibrand Poppema is
very critical of the way that THE has determined the quality of university
education and of the way the number of citations per academic publication has
been calculated. ‘There’s definitely something wrong in that’, said Poppema
about the latter in an interview for the Groningen university paper UK. The
number of citations is a measure of the quality of the academic research.
However, because not all academic fields are cited as frequently as some, THE
uses a calculation method to be able to compare the results even so. Poppema
calls this method ‘completely obscure’."

Friday, September 24, 2010

Here We Go Again

Times Higher Education have released the ranking of the world's top universities in engineering and technology (subscription required). Caltech is number 1 and MIT is second. So far, nothing strange.

But one wonders about the Ecole Normale Superieure in Paris in 34th place and Birkbeck College London in 43rd.

Clicking on the citations indicator, we find that the Ecole has a score of 99.7 and Birkbeck 100.

So Birkbeck has the highest research impact for engineering and technology in the world and ENS Paris the second highest?
Missing the Point

One of the most depressing things about university rankings is that whenever rankers make an error or use inappropriate methodologies to push a university up the table, administrators produce detailed justifications of their ascent, usually avoiding the real reason. Here the head of Alexandria University explains why her university is in the top 200 without mentioning the real reasons, namely the deficiencies of the citation indicator.

Thursday, September 23, 2010

From the Economist

But I suspect that today's league tables say as much about the motives behind those who compile them (and, indeed, those who laud their findings) as they do about the true global standing of the institutions concerned. Britain is poised to slash its public services, and the axe hangs over its universities just as surely as it does over almost all other areas of public life (only the National Health Service and the overseas aid budget have won reprieves). Other countries with rickety public finances are nevertheless splurging on universities, including America, Canada, France and Germany. Even Australia and China, which avoided recession, have big plans.

In such circumstances, it makes sense for British universities to present themselves as a national treasure whose crown is slipping for want of investment. Universities UK, which represents vice-chancellors, issued a statement from Steve Smith, its president, saying, "The tables may show that the UK remains the second-strongest university system in the world, but the most unmistakable conclusion is that this position is genuinely under threat. The higher education sector is one of the UK's international success stories, but it faces unprecedented competition. Our competitors are investing significant sums in their universities, just when the UK is contemplating massive cuts in its expenditure on universities and ..."

He may be right, but the evidence he uses to support his conclusion is far from objective.

And a comment on the article

deanquill wrote: Sep 20th 2010 7:31 GMT: It's notable that Israel, which had two universities just outside The Times' top 100 last year, had none in the list at all this year. The universities apparently failed to return statistical data and were excluded; they say they never received the request from The Times' new survey organizers. Incompetence is more likely than conspiracy but it doesn't say much about the list's accuracy.

Good point, but the magazine in question is not The Times and the data was collected by Thomson Reuters.
Selected Comments from Times Higher Education

Mike Reddin 17 September, 2010
World university rankings take national ranking systems from the ridiculous to the bizarre. Two of the most glaring are made more so by these latest meta analyses.
Number One: R&D funding is scored not by its quality or contribution to learning or understanding but by the amount of money spent on that research; it ranks expensive research higher than cheap research; it ranks a study of 'many things' better than the study of a 'few things'; it ranks higher the extensive and expensive pharmacological trial than the paper written in tranquility over the weekend. I repeat, it does not score 'contribution to knowledge'.

Number Two. Something deceptively similar happens in the ranking of citations. We rank according to number alone - not 'worth' - not whether the paper merited writing in the first place, not whether we are the better for or the worse without it, not whether it adds to or detracts from the sum of human knowledge. Write epic or trash .... as long as it is cited, you score. Let me offer utter rubbish - the more of you that denounce me the better; as long as you cite my name and my home institution.

Which brings me full circle: the 'rankings conceit' equates research / knowledge / learning / thinking / understanding with institutions - in this case, universities and universities alone. Our ranking of student 'outcomes' (our successes/failure as individuals on many scales) wildly presumes that they flow from 'inputs' (universities). Do universities *cause* these outcomes - do they add value to those they have admitted? Think on't. Mike Reddin

jorge Sanchez 18 September, 2010
this is ridiculous~ LSE was placed 67 in the previous year and THE decided to end relations with QS because of this issue. now since THE is no longer teaming up with QS, how could you possibly explain this anomaly by placing LSE ranked 86 in the table????

Mark 18 September, 2010
where is the "chinese university of Hong Kong in the table??? it is no longer in the top 200 best universities....

last year was in the top 50 now is off the table??? is this a serious ranking?????

Of course it's silly 18 September, 2010
Just look at the proposition that teaching is better if you have a higher proportion of doctoral students to undergraduate students.

This is just plainly silly, as 10 seconds thinking about the reputation of teaching in the US will tell you: liberal arts colleges offer extraordinary teaching in the absence of PhD programmes.

Matthew H. Kramer 18 September, 2010
Though some tiers of these rankings are sensible, there are some bizarre anomalies. Mirabile dictu, the University of Texas doesn't appear at all; the University of Virginia is ridiculously low at 72; NYU is absurdly low at 60; the University of Hong Kong is preposterously overrated at 21. Moreover, as has been remarked in some of the previous comments -- and as is evident from a glance at the rankings -- the criteria hugely favor technical institutes. The rank of MIT at 3 is credible, because MIT is outstanding across the board. However, Cal Tech doesn't belong at 2, and Imperial (which has no programs at all in the humanities and social sciences) certainly doesn't belong at 9. Imperial and especially Cal Tech are outstanding in what they do, but neither of them is even close to outstanding across the gamut of subjects that are covered by any full-blown university. I hope that some of these anomalies will be eliminated through further adjustments in the criteria. The exclusion of Texas is itself sufficiently outlandish to warrant some major modifications in those criteria.

Matthew H. Kramer 18 September, 2010
Weird too is the wholesale exclusion of Israeli universities. Hebrew University, Tel Aviv University, and Technion belong among the top 200 in any credible ranking of the world's universities.

Neil Fazel 19 September, 2010
No Sharif, no U. Texas, no Technion. Another ranking to be ignored.

OZ academic 20 September, 2010
While the criteria seem to be OK, although they might be debated, how to carry out the statistical analyses and how to collect the data are the issues for the validity of the poll. The omission of Chinese University of Hong Kong, in the inclusion of the Hong Kong Baptist University and Hong Kong Polytechnic University in the world's top 200 universities, seems to be very "mysterious" to me. As I understand the Chinese University of Hong Kong is more or less of a similar standard in teaching and research in comparison to the Hong Kong University and the Hong Kong University of Science and Technology, but they have some slight edges over the Hong Kong Baptist University and the Hong Kong Polytechnic University. I wonder if there are mix-ups in the data collection processes. If this is true, then there are disputes in this poll not only in the criteria of assessment but also in the accuracy in data collections and analyses.
Texas Opted Out

From the Texas Tribune

"University officials said UT's [University of Texas] absence is not due to an epic fall — they simply declined to participate.

Kristi Fisher, director of UT’s Office of Information Management and Analysis, said they opted out for two reasons. First, budget cuts have caused resource constraints, and projects must be chosen carefully. Also, the survey was using new methodology for the first time, and there was talk it might be suspect. “The last thing we wanted to do was spend a lot of resources to participate in a survey that might have flawed methodology behind it,” Fisher said. "

Wednesday, September 22, 2010

What Happens When You Set the THE Rankings Citations Indicator to Not Important, Continued

This is a selection of the universities that go up when citations is set to not important. The number of places is on the right.

Tokyo University 10
Korean Advanced Institute of Science and Technology 38
Osaka 70
Warwick 78
Kyushu 100
Sung Kyun Kwan 100
Texas A & M 103
Sao Paulo 107
Surrey 123
Shanghai Jiao Tong 158

And a selection of those that fall when citations is set to not important.

Sussex 83
University College Dublin 99
UC Santa Cruz 102
Tasmania 106
Royal Holloway 119
Pompeu Fabra 142
Bilkent 154
Kent State 160
Hong Kong Baptist 164
Alexandria 234
The THE World University Rankings With Citations Set to Not Important

The THE rankings iPhone app has the excellent feature of allowing users to adjust the weightings of the five indicator groups. This is the top 200 when the citations (research impact) indicator is set to 'not important'. The number in brackets on the right is the position in the official ranking.

  1. Harvard (1)
  2. Caltech (2)
  3. MIT (3)
  4. Stanford (4)
  5. Princeton (5)
  6. Imperial College London (9)
  7. Cambridge (6)
  8. Oxford (6)
  9. Yale (10)
  10. UC Berkeley (8)
  11. UC Los Angeles (11)
  12. Johns Hopkins (13)
  13. Swiss Federal Institute of Technology Zurich (15)
  14. University of Michigan (15)
  15. Chicago (12)
  16. Tokyo (26)
  17. Cornell (14)
  18. Toronto (17)
  19. University College London (22)
  20. Columbia (18)
  21. University of Pennsylvania (19)
  22. University of Illinois-Urbana (33)
  23. McGill (35)
  24. Carnegie Mellon (20)
  25. Hong Kong (21)
  26. Georgia Institute of Technology (27)
  27. Kyoto (57)
  28. British Columbia (30)
  29. University of Washington (23)
  30. National University of Singapore (34)
  31. Duke (24)
  32. Peking (37)
  33. University of North Carolina (30)
  34. Karolinska Institute (34)
  35. Tsinghua University, Beijing (58)
  36. Northwestern University (25)
  37. Pohang University of Science and Technology (28)
  38. UC San Diego (32)
  39. Melbourne (36)
  40. UC Santa Barbara (29)
  41. Korean Advanced Institute of Science and Technology (79)
  42. UC Davis (54)
  43. University of Massachusetts (56)
  44. Washington University St Louis (38)
  45. Edinburgh (40)
  46. Australian National University (43)
  47. Minnesota (52)
  48. Purdue (106)
  49. Vanderbilt (51)
  50. LSE (86)
  51. Ecole Polytechnique (39)
  52. Case Western Reserve (65)
  53. Wisconsin (43)
  54. Ohio State (66)
  55. Delft University of Technology (151)
  56. Sydney (71)
  57. Brown (55)
  58. EPF Lausanne (48)
  59. Tokyo Institute of Technology (112)
  60. Osaka (130)
  61. Catholic University of Leuven (119)
  62. University of Virginia (72)
  63. Tohoku (132)
  64. Ecole Normale Superieure Paris (64)
  65. Tufts (53)
  66. University of Munich (61)
  67. Manchester (87)
  68. Hong Kong University of Science and Technology (41)
  69. Emory (61)
  70. Gottingen (43)
  71. Seoul National University (109)
  72. Pittsburgh (54)
  73. Rutgers (105)
  74. New York University (60)
  75. Yeshiva (68)
  76. University of Southern California (73)
  77. Alberta (127)
  78. Uppsala (147)
  79. UC Irvine (49)
  80. University of Science and Technology China (49)
  81. Queensland (81)
  82. Ghent (124)
  83. Zurich (90)
  84. King’s College London (77)
  85. Eindhoven University of Technology (114)
  86. Ruprecht Karl University of Heidelberg (83)
  87. National Chiao Tung University (181)
  88. Rice (47)
  89. Lund (89)
  90. University of Utah (83)
  91. Royal Institute of Technology Sweden (193)
  92. Bristol (68)
  93. McMaster (93)
  94. Boston (59)
  95. Rensselaer Polytechnic Institute (104)
  96. University Of Colorado (67)
  97. Montreal (138)
  98. University of Iowa (132)
  99. National Taiwan University (115)
  100. Leiden (124)
  101. Notre Dame (63)
  102. University of Arizona (95)
  103. George Washington (103)
  104. Texas A & M (207)
  105. Georgetown (164)
  106. Lomonosov Moscow State (237)
  107. National Tsing Hua University (107)
  108. Geneva (118)
  109. Birmingham (145)
  110. Southampton (90)
  111. Wageningen (114)
  112. Medical College of Georgia (158)
  113. Technical University of Munich (101)
  114. New South Wales (152)
  115. Illinois-Chicago (197)
  116. Michigan State (122)
  117. Trinity College Dublin (76)
  118. Tokyo Medical and Dental (217)
  119. Nanyang Technological (174)
  120. Technical University of Denmark (122)
  121. Sheffield (137)
  122. York (81)
  123. St Andrews (103)
  124. Nanjing (120)
  125. Lausanne (136)
  126. Glasgow (128)
  127. VU Amsterdam (13()
  128. Twente (185)
  129. Utrecht (143)
  130. Sung Kyun Kwan (230)
  131. Stony Brook (78)
  132. Wake Forest (90)
  133. Helsinki (102)
  134. Basel (95)
  135. Freiburg (132)
  136. Adelaide (73)
  137. Nagoya (206)
  138. Ruhr University Bochum
  139. Sao Paulo (232)
  140. Free University of Berlin (212)
  141. Maryland College Park (98)
  142. Warwick (220)
  143. Technion (221)
  144. Iowa State (156)
  145. Chalmers University of Technology (223)
  146. Dartmouth (99)
  147. RWTH Aachen (182)
  148. Kansas (232)
  149. Swedish University of Agricultural Sciences (199)
  150. Groningen (170)
  151. State University of Campinas (248)
  152. Nottingham (174)
  153. Leeds (168)
  154. Penn State (109)
  155. Maastricht (209)
  156. Zhejiang (197)
  157. Humboldt (178)
  158. Vienna (195)
  159. Hong Kong Polytechnic (149)
  160. Queen Mary London (120)
  161. Aarhus (167)
  162. Sussex (79)
  163. University of Georgia (246)
  164. National Sun Yat-Sen (163)
  165. William and Mary (75)
  166. Kiel (210)
  167. Lancaster (214)
  168. Indiana University (156)
  169. Newcastle, UK (152)
  170. UC Santa Cruz (68)
  171. Aberdeen (149)
  172. Durham
  173. University College Dublin
  174. Liverpool (165)
  175. Dalhousie (193)
  176. University of Delaware (159)
  177. UC Riverside (117)
  178. University of Amsterdam (165)
  179. Surrey (302)
  180. Konstanz (186)
  181. University of South Carolina (214)
  182. Wurzburg (168)
  183. Cape Town (107)
  184. Tokushima (317)
  185. Reading (210)
  186. Stockholm (129)
  187. University of Waterloo, Canada (267)
  188. Washington State University (264)
  189. Copenhagen (177)
  190. Hokkaido (293)
  191. Hawaii (105)
  192. Yonsei (190)
  193. Leicester (216)
  194. Kyushu (294)
  195. Bergen (135)
  196. Shanghai Jiao Tong (258)
  197. Pierre and Marie Curie (140)
  198. ENS De Lyon (100)
  199. Erasmus (159)
  200. Tromso (227)


Apart from spam about Viagra, where to meet beautiful Russian women and so on, and one racist diatribe, I have until now published every comment sent to this post. However, I think it is time to indicate a change in policy. I will publish any comment provided it does not contain negative general comments about the character of a specific individual. Thus I will allow critical comments about citation patterns or the scientific validity of specific research but not, for example, a statement that someone is a "fraud", unless convicted of fraud in court.

Tuesday, September 21, 2010

From The Students' Room 3

Click here.

From AfghanistanBananistan

I am intriged about the LSE's ranking.

TBH, it does not really matter where a university ranks in the world in relation to other global universities, it just concerns me when UK universities have odd rankings amongst their own. No one would complain if LSE came 4/5th in the UK and 86th in the world, or Warwick came 8th in UK and 240th in the world.

The RAE claims that LSE has the highest proportion of world leading research in the UK, and is in the top 3 in every type of analysis of the results (often tied with Oxford). The RAE may not be perfect, but for LSE to suddenly drop in the THES ranking (esp citations) seems really shocking, and perhaps wrong.

Afterall, someone from THES wrote a few months back that LSE ranking in 66th was 'clearly a big mistake'. This was when THES clearly stated that citations hindered social science and arts institutions. You mean to tell me that LSE ranks 66th (80th in this yers QS ranking), when citations are weighted against social sciences, yet LSE drops to 86th when citations not weighted to disadvantage social science institutions?

I guess my main point is that in the QS ranking LSE has ranked in the top 5 in the world every year for social sciences, with citations the key element of the ranking. How come LSE can have the 4th best citation count in the social sciences, yet when taken account in an overall ranking, where citations are normalised, it has such a low citation count in the THES ranking? I understand Warwick, for it may rank 44th for humanities, but it has to take into account all its own faculties which may drag it down - yet LSE only has social sciences.

I find it hard to conceptualise that Sussex and York universities have more pervasive research throughout ALL their departments than just LSE for social science. Bear in mind that LSE, as a specialist instituon, is one of (probably the biggest) social sicence dept in the world, with more specific social science researchers that anywhere else. Therefore, how come its research is not cited more? What are the 50 odd fellows of the British academy doing, and should they not have been appointed if their research is not influential.

In short, does THES not think that something is up with their methodology, just as they admitted last year. Just because some said there is evidence of the world being flat, does not make it true. I find it hard to believe that the university that created many fields in the social sciences and has won nearly a quarter of all nobel prizes in economics, can rank so badly for its citations score? Aferall, can the fellow social science researchers who voted LSE the 6th best in the world for infoluence IR and is the only uni to compete with the US economics depts in all economics world rankings, be so wrong?

The same logic applies to Warwick vs. Dundee and Lancaster.

If a ranking of the world's best ever football players only had Maradonne at number 20 and Pele at 15, wouldn't the raking lose credibility. You could rank them purely by influence measured by the number of goals scores. Pele/Maradonna would not rank that well, but we them as the best players because we have seen them with our own eyes.

P.S I dont even go to LSE.