Friday, November 12, 2010

Article by Philip Altbach

Inside Higher Ed has a substantial and perceptive article on international university rankings by Philip Altbach.

Towards the end there is a round-up of the current season. Some quotations:

On QS
Forty percent of the QS rankings are based on a reputational survey. This probably accounts for the significant variability in the QS rankings over the years. Whether the QS rankings should be taken seriously by the higher education community is questionable.

On ARWU

Some of ARWU’s criteria clearly privilege older, prestigious Western universities — particularly those that have produced or can attract Nobel prizewinners. The universities tend to pay high salaries and have excellent laboratories and libraries. The various indexes used also heavily rely on top peer-reviewed journals in English, again giving an advantage to the universities that house editorial offices and key reviewers. Nonetheless, ARWU’s consistency, clarity of purpose, and transparency are significant advantages.


On THE

Some of the rankings are clearly inaccurate. Why do Bilkent University in Turkey and the Hong Kong Baptist University rank ahead of Michigan State University, the University of Stockholm, or Leiden University in Holland? Why is Alexandria University ranked at all in the top 200? These anomalies, and others, simply do not pass the "smell test." Let it be hoped that these, and no doubt other, problems can be worked out.

Saturday, October 23, 2010

Navarra Round Table

Presentations by Nian Cai Liu, Zhuo Lin Feng, Daniel Torres-Salinas, Jean Rapp, Phil Baty and Isidro Aguillo at the Ranking Round Table in Navarra can be found here.

Monday, October 18, 2010

The THE-QS World Universities Rankings, 2004-2009

See here for a draft of an article on the THE-QS rankings.

Sunday, October 17, 2010

Debate, anyone?

Times Higher Education and Thomson Reuters have said that they wish to engage and that they will be happy to debate their new rankings methodology. So far we have not seen much sign of a debate, although I will admit that more may have been said at the recent seminars in London and Spain than got into print. In particular, they have been rather reticent about defending the citations indicator, which gives the whole ranking a very distinctive cast and which is likely to drag down what could have been a promising development in ranking methodology.

First, let me comment on the few attempts to defend this indicator, which accounts for nearly a third of the total weighting and for more in some of the subject rankings. It has been pointed out that David Willetts, the British Minister for Universities and Science, has congratulated THE on its new methodology.


“I congratulate THE for reviewing the methodology to produce this new picture of
the best in higher education worldwide. It should prompt all of us who care
about our universities to see how we can improve the range and quality of the
data on offer. Prospective students — in all countries — should have good
information to hand when deciding which course to study, and where. With the
world to choose from, it is in the interests of universities themselves to
publish figures on graduate destinations as well as details of degree
programmes.”
Willetts has praised THE for reviewing its methodology. So have many of us but that is not quite the same as endorsing what has emerged from that review.

Steve Smith, President of Universities UK and Vice-Chancellor of Exeter University is explicit in supporting the new rankings, especially the citations component.


But, as we shall see in a moment, there are serious issues with the robustness of citations as a measure of research impact and, if used inappropriately, they can become indistinguishable from a subjective measure of reputation.

The President of the University of Toronto makes a similar point and praises the new rankings’ reduced emphasis on subjective reputational surveys and refers to the citations (knowledge transfer?) indicator.


It might be argued that this indicator is noteworthy for revealing that some universities possess hitherto unsuspected centres of research excellence. An article by Phil Baty in THE of the 16th of September refers to the most conspicuous case, a remarkably high score for citations by Alexandria University, which according to the THE rankings has had a greater research impact than any university in the world except Caltech, MIT and Princeton. Baty suggests that there is some substance to Alexandria University’s extraordinary score. He refers to Ahmed Zewail, a Nobel prize winner who left Alexandria with a master’s degree some four decades ago. Then he mentions some frequently cited papers by a single author in one journal.

The author in question is Mohamed El Naschie, who writes on mathematical physics and the journals – there are two that should be given the credit for Alexandria’s performance, not one – are Chaos, Solitons and Fractals and the International Journal of Nonlinear Sciences and Numerical Simulation. The first is published by Elsevier and was until recently edited by El Naschie. It has published a large number of papers by El Naschie and these have been cited many times by himself and by some other writers in CSF and IJNSNS.

The second journal is edited by Ji-Huan He of Donghua University in Shanghai, China with El Naschie as co-editor and is published by the Israeli publishing company, Freund Publishing House Ltd of Tel Aviv.

An amusing digression. In the instructions for authors in the journal the title is given as International Journal of Nonlinear Sciences and Numerical Stimulation. This could perhaps be described as a Freundian slip.

Although El Naschie has written a large number of papers and these have been cited many times, his publication and citation record is far from unique. He is not, for example, found in the ISI list of highly cited researchers. His publications and citations were perhaps necessary to push Alexandria into THE’s top 200 universities but they were not enough by themselves. This required a number of flaws in TR’s methodology.

First, TR assigned a citation impact score that compares the actual citations of a paper with a benchmark based on the expected number of citations for a specific subject in a specific year. Mathematics is a field where citations are relatively infrequent and usually occur a few years after publication. Since El Naschie published in a field in which citations are relatively scarce, and published quite recently, this boosted the impact score of his papers. The reason for using this approach is clear and sensible: to overcome the distorting effects of varying citation practices in different disciplines when comparing individual researchers or departments. But there are problems if this method is used to compare whole universities. A great deal depends on when the cited and citing articles are published and in which subject TR classifies them.
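The normalisation described above can be sketched roughly as follows. The baseline values, fields and citation counts here are all invented for illustration; the real TR benchmarks are not public.

```python
# Hypothetical sketch of field- and year-normalised citation impact.
# Each paper's citations are divided by the expected (world average)
# citation count for its subject category and publication year.
# All numbers below are invented for illustration.

world_baseline = {
    # (field, year): expected citations per paper (invented)
    ("mathematics", 2008): 2.1,
    ("molecular biology", 2008): 14.7,
}

papers = [
    {"field": "mathematics", "year": 2008, "citations": 40},
    {"field": "molecular biology", "year": 2008, "citations": 40},
]

def normalised_impact(paper):
    expected = world_baseline[(paper["field"], paper["year"])]
    return paper["citations"] / expected

scores = [normalised_impact(p) for p in papers]
# The same 40 citations count for far more in a low-citation field:
# the mathematics paper scores about 19, the biology paper about 2.7.
```

This is why a cluster of well-cited recent mathematics papers can outscore a much larger body of work in the life sciences.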

A question for TR. How are articles classified? Is it possible to influence the category in which they are placed by the use of key words or the wording of the title?

Next, note that TR were measuring average citation impact. A consequence of this is that the publication of large numbers of papers that are cited less frequently than the high fliers could drag down the score. This explains an apparent oddity of the citation scores in the 2010 THE rankings. El Naschie listed nine universities as his affiliation, in varying combinations, between 2004 and 2008, yet it was only Alexandria that managed to leave the Ivy League and Oxbridge standing in the research impact dust. Recently, El Naschie’s list of affiliations has consisted of Alexandria, Cairo, Frankfurt University and Shanghai Jiao Tong University.

What happened was quite simply that all the others were producing so many papers that El Naschie’s made little or no difference. For once, it would be quite correct if El Naschie announced that he could not have done it without the support of his colleagues. Alexandria University owes its success not only to El Naschie and his citers but also to all those researchers who refrained from submitting articles to ISI–indexed journals or conference proceedings.
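The dilution effect can be shown with toy numbers (all invented):

```python
# Toy illustration of how averaging dilutes a few hyper-cited papers:
# the same highly cited author lifts a small university's mean impact
# far more than a large one's. All numbers are invented.
star_papers = 30          # papers by one prolific, much-cited author
star_impact = 20.0        # normalised impact of each such paper
ordinary_impact = 1.0     # world-average impact of everything else

def mean_impact(other_papers):
    total = star_papers * star_impact + other_papers * ordinary_impact
    return total / (star_papers + other_papers)

small = mean_impact(500)      # small total output: the star dominates
large = mean_impact(20000)    # large total output: the star vanishes
# small comes out above 2 (twice world average); large is barely above 1.
```

The same thirty papers double the average of a small producer while making almost no difference to a large one.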

TR have some explaining to do here. If an author lists more than one affiliation, are they all counted? Or are fractions awarded for each paper? Is there any limit on the number of affiliations that an author may have? I think that it is two, but would welcome clarification.

As for the claim that Alexandria is strong in research, a quick look at the Scimago rankings is enough to dispose of that. It is ranked 1,047th in the 2010 rankings, which admittedly include many non-university organizations, for total publications over a decade. Also, one must ask how much of El Naschie’s writing was actually done in Alexandria, seeing that he had eight other affiliations between 2004 and 2008.

It has to be said that even if El Naschie is, as has been claimed in comments on Phil’s THE article and elsewhere, one of the most original thinkers of our time, it is strange that THE and TR should use a method that totally undermines their claim that the new methodology is based on evidence rather than reputation. By giving any sort of credence to the Alexandria score, THE are asking us to believe that Alexandria is strong in research because precisely one writer is highly reputed by himself and a few others. Incidentally, will TR tell us what score Alexandria got in the research reputation survey?

I am not qualified to comment on the scientific merits of El Naschie’s work. At the moment it appears, judging from the comments in various physics blogs, that among physicists and mathematicians there are more detractors than supporters. There are also few documented signs of conventional academic merit in recent years, such as permanent full-time appointments or research grants. None of his papers between 2004 and 2008 in ISI-indexed journals, for example, apparently received external funding. His affiliations, where documented, turn out to be honorary, advisory or visiting. To be fair, readers might wish to visit El Naschie’s site. I will also publish any comments of a non-libellous nature that support or dispute the scientific merits of his writings.

Incidentally, it is unlikely that Alexandria’s score of 19.3 for internationalisation was faked. TR use a logarithm. If there were zero international staff and students a university would get a score of 1 and a score of 19.3 actually represents a small percentage. On the other hand, I do wonder whether Alexandria counted those students in the branch campuses in Lebanon, Sudan and Chad.
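Purely as an illustration, here is one log transform consistent with the description above (a score of 1 at zero per cent, 100 at one hundred per cent). The actual TR formula is not published, so this is an assumption, not their method.

```python
import math

# Hypothetical logarithmic scoring for internationalisation:
# score 1 when the international percentage is zero, 100 at 100%.
# The exact TR formula is not public; this is only a sketch.
def intl_score(pct):
    return 1 + 99 * math.log10(1 + pct) / math.log10(101)

# Invert numerically to see what percentage a score of 19.3 implies.
pct = 0.0
while intl_score(pct) < 19.3:
    pct += 0.01
# pct comes out well under 2% -- a score of 19.3 really is a small share.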

Finally, TR did not take the very simple and obvious step of not counting individual self-citations. Had they done so, they would have saved everybody, including themselves a lot of trouble. It would have been even better if they had excluded intra-institutional and intra-journal citation. See here for the role of citations among the editorial board of IJNSNS in creating an extraordinarily high Journal Impact Factor.

THE and TR have done everyone a great service by highlighting the corrosive effect of self citation on the citations tracking industry. It has become apparent that there are enormous variations in the prevalence of self citation in its various forms and that these have a strong influence on the citation impact score.

Professor Dirk Van Damme is reported to have said at the London seminar that the world’s elite universities were facing a challenge from universities in the bottom half of the top 200. If this were the case then THE could perhaps claim that their innovative methodology had uncovered reserves of talent ignored by previous rankings. But what exactly was the nature of the challenge? It seems that it was the efficiency with which the challengers turned research income into citations. And how did they do that?

I have taken the simple step of dividing the score for citations by the score for the research indicator (which includes research income) and then sorting the resulting values. The top ten are Alexandria, Hong Kong Baptist University, Barcelona, Bilkent, William and Mary, ENS de Lyon, Royal Holloway, Pompeu Fabra, University College Dublin, the University of Adelaide.
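For anyone who wants to reproduce the calculation, here is a minimal sketch, with invented scores rather than the actual THE data:

```python
# Sketch of the calculation described above: divide each university's
# citations score by its research score, then sort descending to find
# the most "efficient" converters of research income into citations.
# The scores below are invented for illustration.
scores = {
    # name: (citations score, research score)
    "Alexandria": (99.8, 9.1),
    "Caltech": (99.9, 98.2),
    "Harvard": (98.8, 98.7),
    "Bilkent": (91.4, 21.3),
}

ratios = sorted(
    ((cites / research, name) for name, (cites, research) in scores.items()),
    reverse=True,
)
top = [name for _, name in ratios]
# Universities with modest research scores but inflated citation
# scores float to the top of this list.
```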

Seriously, these are a threat to the world’s elite?

The high scores for citations relative to research were the result of a large number of citations or a small number of total publications or both. It is of interest to note that in some cases the number of citations was the result of assiduous self-citation.

This section of the post contained comments about comparative rates of self citation among various universities. The method used was not correct and I am recalculating.
As noted already, using the THE iPad app to change the importance attached to various indicators can produce very different results. This is a list of universities that rise more than a hundred places when the citations indicator is set to ‘not important’. They have suffered perhaps because of a lack of super-cited papers, perhaps also because they just produced too many papers.

Loughborough
Kyushu
Sung Kyun Kwan
Texas A and M
Surrey
Shanghai Jiao Tong University
Delft University of Technology
National Chiao Tung University (Taiwan)
Royal Institute of Technology Sweden
Tokushima
Hokkaido

Here is a list of universities that fall more than 100 places when the citations indicator is set to ‘not important’. They have benefitted from a few highly cited papers or low publication counts or a combination of the two.

Boston College
University of California Santa Cruz
Royal Holloway, University of London
Pompeu Fabra
Bilkent
Kent State University
Hong Kong Baptist University
Alexandria
Barcelona
Victoria University Wellington
Tokyo Metropolitan University
University of Warsaw

There are many others that rise or fall seventy, eighty or ninety places when citations are taken out of the equation. This is not a case of a few anomalies. The whole indicator is one big anomaly.

Earlier, Jonathan Adams, in a column that has attracted one comment, said:

"Disciplinary diversity is an important factor, as is international diversity. How would you show the emerging excellence of a really good university in a less well known country such as Indonesia? This is where we would be most controversial, and most at risk, in using the logic of field-normalisation to add a small weighting in favour of relatively good institutions in countries with small research communities. Some may feel that we got that one only partially right."

The rankings do not include universities in Indonesia, really good or otherwise. The problem is with good, mediocre and not very good universities in the US, UK, Spain, Turkey, Egypt, New Zealand, Poland and elsewhere. It is a huge weighting, not a small one; the universities concerned range from relatively good to relatively bad; in one case the research community seems to consist of one person; and many are convinced that TR got that one totally wrong.


Indonesia may be less well known to TR but it is very well known to itself and neighbouring countries.


I will publish any comments by anyone who wishes to defend the citation indicator of the new rankings. Here are some questions they might wish to consider.


Was it a good idea to give such a heavy weighting to research impact: 32.5% in the overall rankings and 37.5% in at least two subject rankings? Is it possible that commercial considerations (citations data being a lucrative business for TR) had something to do with it?

Are citations such a robust indicator? Is there not enough evidence now to suggest that manipulation of citations, including self-citation, intra-institutional citation and intra-journal citation, is so pervasive that the robustness of this measure is very slight?

Since there are several ways to measure research impact, would it not have been a good idea to have used several methods? After all, Leiden University has several different ways of assessing impact. Why use only one?


Why set the threshold for inclusion so low, at 50 papers per year?

Monday, October 11, 2010

Auditing University Rankings

The Chronicle of Higher Education reports on a meeting of the International Rankings Expert Group's (IREG) Observatory on Academic Ranking and Excellence.

The Organisation has set up a mechanism to audit the various university rankings.

"The audit project, which he [Gero Federkeil] is helping to manage, will be based closely on IREG's principles, which emphasize clarity and openness in the purposes and goals of rankings, the design and weighting of indicators, the collection and processing of data, and the presentation of results.

"We all say that rankings should aim at delivering transparency about higher-education institutions, but we think there should be transparency about rankings too," Mr. Federkeil said. The audit process could eventually give rise to an IREG quality label, which would amount to an identification of trustworthy rankings, thereby enhancing the credibility of rankings and improving their quality, Mr. Federkeil said.

At the Berlin meeting last week, Mr. Federkeil and Ying Cheng, of the Center for World-Class Universities at Shanghai Jiao Tong University, which produces the best-known and most influential global ranking of universities, outlined the proposed methodology and procedure for the audit. The IREG executive committee will nominate audit teams consisting of three to five people. The chair of each team must not have any formal affiliation with a ranking organization, and at least one member of the audit team must be a member of the IREG executive committee. Audits will be based on self-reported data as well as possible on-site visits, and each full audit is expected to take about five months to complete."

The executive committee of IREG includes Liu Nian Cai from Shanghai Rankings Consultancy, Bob Morse from US News & World Report and Gero Federkeil from CHE, the German ranking body.

Members of the Observatory include HEEACT (Taiwan), the Kazakhstan and Slovak ranking agencies and QS Intelligence Agency.

Would anybody like to guess who will be the first to be audited?

Thursday, September 30, 2010

Return of the British Gift for Understatement

One of the tiresome things about the ranking season is the mania for inflated adjectives that readers of THE and other publications and sites have had to endure: rigorous, innovative, transparent, robust, sophisticated and so on.

It was a relief to read a column by Jonathan Adams in today's THE which uses words like "small" and "good". Here is an example.

Disciplinary diversity is an important factor, as is international diversity. How would you show the emerging excellence of a really good university in a less well known country such as Indonesia? This is where we would be most controversial, and most at risk, in using the logic of field-normalisation to add a small weighting in favour of relatively good institutions in countries with small research communities. Some may feel that we got that one only partially right.

I think the bit about "using the logic of field-normalisation to add a small weighting in favour of relatively good institutions in countries with small research communities" refers to the disconcerting presentation of Alexandria University as the 4th best in the world for research impact.
The THE Life Sciences Ranking (subscription required)

First is MIT, then Harvard, then Stanford. Nothing to argue about there.

For research impact (i.e. citations) the order is:

1. MIT
2. University of Barcelona
3. Harvard
4. Princeton
5. Stanford
6. Oxford
7. Dundee
8. Hong Kong
9. Monash
10. Berkeley

In this subject group, the citations indicator gets 37.5%.
Rankings Undermined by Flawed Indicator

See commentary in University World News

Tuesday, September 28, 2010

From the Straits Times of Singapore

Students at Nanyang Technological University in Singapore have been dismayed by its poor performance in the THE World University Rankings. Apparently they are concerned about future job prospects. The university was ranked 174th, much lower than it is used to. The apparent poor performance for citations (222nd) was at the root of the problem.

Su Guanning the President has written to the Straits Times

"The QS 2010 Ranking is today the most widely-used world university ranking.
Nanyang Technological University was placed 74th in the list and the National
University of Singapore 31st, both down by one position from last year. The
performance of both universities has been consistent over the last four years.

NTU started off as a practice-oriented, teaching university in 1991 and
is in fact the youngest university in the Top 100 ranked in the QS 2010.

The Times Higher Education 2010 rankings is entirely new. Its criteria
have yet to be accepted by many universities. A detailed analysis reveals it is
88 per cent computed from research-related indicators with unusual normalization
of data resulting in some bizarre results. In a joint article in Edmonton
Journal, Indira Samarasekera, the President and Vice-Chancellor of the
University of Alberta, and Carl Amrhein, the Provost and Vice-President
(Academic) of the same university, wrote that the Times ranking has very
peculiar outcomes that do not pass the “reasonableness test”. They advised the
public to take the “rankings with a truckload of salt”.

Malcolm Grant, President and Provost of University College London pointed out to The Guardian that research citations, if not intelligently applied, can lead to bizarre
results. He cited the example of Egypt’s Alexandria University being ranked
above Harvard and Stanford universities in research influence in the Times
Higher Education 2010. "

Sunday, September 26, 2010

Something is Awry

A comment by the President of the University of Alberta.


"This year, Times Higher Education partnered with Thompson
Reuters to compile a new ranking, which was released a few days ago. This
ranking is also based on objective and subjective measures. A school’s final
score is based on an aggregate of teaching, research reputation/income,
citations/research influence, industry income and international mix. Academics are surveyed to rate teaching and research quality, a precarious exercise since teaching is difficult, if not impossible, for academics to assess at institutions other than their own. At first blush, this ranking appears to capture a broader range of indicators and does a good job of identifying the top 20. We applaud the University of Toronto for being named among the top 20; this is very good for Canada and Canadian universities.


However, when one looks under the hood, beyond the top 20 universities, the new ranking has some very peculiar outcomes that does not pass the “reasonableness test.” Alexandria University in Egypt received one of the highest scores for citations while also receiving the lowest score for research reputation. By comparison #1 Harvard received near perfect scores on both measures. Logic would suggest that the aggregate score for research reputation should correlate with the score for citations, which measure research influence. However, in spite of the vast discrepancy between its scores, Alexandria, which has never been on any other international ranking, came in ahead of many top universities. Furthermore many universities considered by peers as among the best do not appear in the top 200.
Something is awry.

Completely Obscure

The President of the University of Groningen has commented on the THE 2010-11 World University rankings:


"President of the Board of the University of Groningen Prof. Sibrand Poppema is
very critical of the way that THE has determined the quality of university
education and of the way the number of citations per academic publication has
been calculated. ‘There’s definitely something wrong in that’, said Poppema
about the latter in an interview for the Groningen university paper UK. The
number of citations is a measure of the quality of the academic research.
However, because not all academic fields are cited as frequently as some, THE
uses a calculation method to be able to compare the results even so. Poppema
calls this method ‘completely obscure’."

Friday, September 24, 2010

Here We Go Again

Times Higher Education have released the ranking of the world's top universities in engineering and technology (subscription required). Caltech is number 1 and MIT is second. So far, nothing strange.

But one wonders about the Ecole Normale Superieure in Paris in 34th place and Birkbeck College London in 43rd.

Clicking on the Citations Indicator we find that the Ecole has a score of 99.7 and Birkbeck 100.

So Birkbeck has the highest research impact for engineering and technology in the world and ENS Paris the second highest?

Thursday, September 23, 2010

Missing the Point

One of the most depressing things about university rankings is that whenever rankers make an error or use inappropriate methodologies to push a university up the table, administrators produce detailed justifications of the ascent, usually avoiding the real reason. Here the head of Alexandria University explains why her university is in the top 200 without mentioning the real reason, namely the deficiencies of the citations indicator.
From the Economist



But I suspect that today's league tables say as much about
the motives behind those who compile them (and, indeed, those who laud their
findings) as they do about the true global standing of the institutions
concerned. Britain is poised to slash its public services, and the axe hangs
over its universities just as surely as it does over almost all other area of
public life (only the National Health Service and the overseas aid budget have
won reprieves). Other countries with rickety public finances are nevertheless
splurging on universities, including America, Canada, France and Germany. Even
Australia and China, which avoided recession, have big plans.

In such circumstances, it makes sense for British universities to present themselves as a national treasure whose crown is slipping for want of investment. Universities
UK, which represents vice-chancellors, issued a statement from Steve Smith, its
president, saying, "The tables may show that the UK remains the second-strongest
university system in the world, but the most unmistakable conclusion is that
this position is genuinely under threat. The higher education sector is one of
the UK's international success stories, but it faces unprecedented competition.
Our competitors are investing significant sums in their universities, just when
the UK is contemplating massive cuts in its expenditure on universities and
science."

He may be right, but the evidence he uses to support his
conclusion is far from objective.


And a comment on the article


deanquill wrote: Sep 20th 2010 7:31 GMT. It's notable that Israel, which had two universities just outside The Times' top 100 last year, had none in the list at all this year. The universities apparently failed to return statistical data and were excluded; they say they never received the request from The Times' new survey organizers. Incompetence is more likely than conspiracy but it doesn't say much about the list's accuracy.

Good point, but the magazine in question is not The Times and the data was collected by Thomson Reuters.

Wednesday, September 22, 2010

Selected Comments from Times Higher Education


Mike Reddin 17 September, 2010
World university rankings take national ranking systems from the ridiculous to the bizarre. Two of the most glaring are made more so by these latest meta analyses.
Number One: R&D funding is scored not by its quality or contribution to learning or understanding but by the amount of money spent on that research; it ranks expensive research higher than cheap research; it ranks a study of 'many things' better than the study of a 'few things'; it ranks higher the extensive and expensive pharmacological trial than the paper written in tranquility over the weekend. I repeat, it does not score 'contribution to knowledge'.

Number Two. Something deceptively similar happens in the ranking of citations. We rank according to number alone - not 'worth' - not whether the paper merited writing in the first place, not whether we are the better for or the worse without it, not whether it adds to or detracts from the sum of human knowledge. Write epic or trash .... as long as it is cited, you score. Let me offer utter rubbish - the more of you that denounce me the better; as long as you cite my name and my home institution.

Which brings me full circle: the 'rankings conceit' equates research / knowledge / learning / thinking / understanding with institutions - in this case, universities and universities alone. Our ranking of student 'outcomes' (our successes/failure as individuals on many scales) wildly presumes that they flow from 'inputs' (universities). Do universities *cause* these outcomes - do they add value to those they have admitted? Think on't. Mike Reddin http://www.publicgoods.co.uk/



jorge Sanchez 18 September, 2010
this is ridiculous~ LSE was placed 67 in the previous year and THE decided to end relations with QS because of this issue. now since THE is no longer teaming up with QS, how could you possibly explain this anomaly by placing LSE ranked 86 in the table????


Mark 18 September, 2010
where is the "chinese university of Hong Kong in the table??? it is no longer in the top 200 best universities....

last year was in the top 50 now is off the table??? is this a serious ranking?????


Of course it's silly 18 September, 2010
Just look at the proposition that teaching is better if you have a higher proportion of doctoral students to undergraduate students.

This is just plainly silly, as 10 seconds thinking about the reputation of teaching in the US will tell you: liberal arts colleges offer extraordinary teaching in the absence of PhD programmes.



Matthew H. Kramer 18 September, 2010
Though some tiers of these rankings are sensible, there are some bizarre anomalies. Mirabile dictu, the University of Texas doesn't appear at all; the University of Virginia is ridiculously low at 72; NYU is absurdly low at 60; the University of Hong Kong is preposterously overrated at 21. Moreover, as has been remarked in some of the previous comments -- and as is evident from a glance at the rankings -- the criteria hugely favor technical institutes. The rank of MIT at 3 is credible, because MIT is outstanding across the board. However, Cal Tech doesn't belong at 2, and Imperial (which has no programs at all in the humanities and social sciences) certainly doesn't belong at 9. Imperial and especially Cal Tech are outstanding in what they do, but neither of them is even close to outstanding across the gamut of subjects that are covered by any full-blown university. I hope that some of these anomalies will be eliminated through further adjustments in the criteria. The exclusion of Texas is itself sufficiently outlandish to warrant some major modifications in those criteria.



Matthew H. Kramer 18 September, 2010
Weird too is the wholesale exclusion of Israeli universities. Hebrew University, Tel Aviv University, and Technion belong among the top 200 in any credible ranking of the world's universities.


Neil Fazel 19 September, 2010
No Sharif, no U. Texas, no Technion. Another ranking to be ignored.


OZ academic 20 September, 2010
While the criteria seem to be OK, although they might be debated, how to carry out the statistical analyses and how to collect the data are the issues for the validity of the poll. The omission of Chinese University of Hong Kong, in the inclusion of the Hong Kong Baptist University and Hong Kong Polytechnic University in the world's top 200 universities, seems to be very "mysterious" to me. As I understand the Chinese University of Hong Kong is more or less of a similar standard in teaching and research in comparison to the Hong Kong University and the Hong Kong University of Science and Technology, but they have some slight edges over the Hong Kong Baptist University and the Hong Kong Polytechnic University. I wonder if there are mix-ups in the data collection processes. If this is true, then there are disputes in this poll not only in the criteria of assessment but also in the accuracy in data collections and analyses.
Texas Opted Out

From the Texas Tribune

"University officials said UT's [University of Texas] absence is not due to an epic fall — they simply declined to participate.

Kristi Fisher, director of UT’s Office of Information Management and Analysis, said they opted out for two reasons. First, budget cuts have caused resource constraints, and projects must be chosen carefully. Also, the survey was using new methodology for the first time, and there was talk it might be suspect. “The last thing we wanted to do was spend a lot of resources to participate in a survey that might have flawed methodology behind it,” Fisher said. "
What Happens When You Set the THE Rankings Citations Indicator to Not Important Continued

This is a selection of the universities that go up when citations is set as not important. The number of places gained is on the right.

Tokyo University 10
Korean Advanced Institute of Science and Technology 38
Osaka 70
Warwick 78
Kyushu 100
Sung Kyun Kwan 100
Texas A & M 103
Sao Paulo 107
Surrey 123
Shanghai Jiao Tong 158

And a selection of those that fall when citations is set to not important, with the number of places lost on the right.

Sussex 83
University College Dublin 99
UC Santa Cruz 102
Tasmania 106
Royal Holloway 119
Pompeu Fabra 142
Bilkent 154
Kent State 160
Hong Kong Baptist 164
Alexandria 234
The THE World University Rankings With Citations Set to Not Important

The THE rankings iPhone app has the excellent feature of allowing users to adjust the weightings of the five indicator groups. This is the top 200 when the citations (research impact) indicator is set to 'not important'. The number in brackets on the right is the position in the official ranking.


  1. Harvard (1)
  2. Caltech (2)
  3. MIT (3)
  4. Stanford (4)
  5. Princeton (5)
  6. Imperial College London (9)
  7. Cambridge (6)
  8. Oxford (6)
  9. Yale (10)
  10. UC Berkeley (8)
  11. UC Los Angeles (11)
  12. Johns Hopkins (13)
  13. Swiss Federal Institute of Technology Zurich (15)
  14. University of Michigan (15)
  15. Chicago (12)
  16. Tokyo (26)
  17. Cornell (14)
  18. Toronto (17)
  19. University College London (22)
  20. Columbia (18)
  21. University of Pennsylvania (19)
  22. University of Illinois-Urbana (33)
  23. McGill (35)
  24. Carnegie Mellon (20)
  25. Hong Kong (21)
  26. Georgia Institute of Technology (27)
  27. Kyoto (57)
  28. British Columbia (30)
  29. University of Washington (23)
  30. National University of Singapore (34)
  31. Duke (24)
  32. Peking (37)
  33. University of North Carolina (30)
  34. Karolinska Institute (34)
  35. Tsinghua University, Beijing (58)
  36. Northwestern University (25)
  37. Pohang University of Science and Technology (28)
  38. UC San Diego (32)
  39. Melbourne (36)
  40. UC Santa Barbara (29)
  41. Korean Advanced Institute of Science and Technology (79)
  42. UC Davis (54)
  43. University of Massachusetts (56)
  44. Washington University St Louis (38)
  45. Edinburgh (40)
  46. Australian National University (43)
  47. Minnesota (52)
  48. Purdue (106)
  49. Vanderbilt (51)
  50. LSE (86)
  51. Ecole Polytechnique (39)
  52. Case Western Reserve (65)
  53. Wisconsin (43)
  54. Ohio State (66)
  55. Delft University of Technology (151)
  56. Sydney (71)
  57. Brown (55)
  58. EPF Lausanne (48)
  59. Tokyo Institute of Technology (112)
  60. Osaka (130)
  61. Catholic University of Leuven (119)
  62. University of Virginia (72)
  63. Tohoku (132)
  64. Ecole Normale Superieure Paris (64)
  65. Tufts (53)
  66. University of Munich (61)
  67. Manchester (87)
  68. Hong Kong University of Science and Technology (41)
  69. Emory (61)
  70. Gottingen (43)
  71. Seoul National University (109)
  72. Pittsburgh (54)
  73. Rutgers (105)
  74. New York University (60)
  75. Yeshiva (68)
  76. University of Southern California (73)
  77. Alberta (127)
  78. Uppsala (147)
  79. UC Irvine (49)
  80. University of Science and Technology China (49)
  81. Queensland (81)
  82. Ghent (124)
  83. Zurich (90)
  84. King’s College London (77)
  85. Eindhoven University of Technology (114)
  86. Ruprecht Karl University of Heidelberg (83)
  87. National Chiao Tung University (181)
  88. Rice (47)
  89. Lund (89)
  90. University of Utah (83)
  91. Royal Institute of Technology Sweden (193)
  92. Bristol (68)
  93. McMaster (93)
  94. Boston (59)
  95. Rensselaer Polytechnic Institute (104)
  96. University Of Colorado (67)
  97. Montreal (138)
  98. University of Iowa (132)
  99. National Taiwan University (115)
  100. Leiden (124)
  101. Notre Dame (63)
  102. University of Arizona (95)
  103. George Washington (103)
  104. Texas A & M (207)
  105. Georgetown (164)
  106. Lomonosov Moscow State (237)
  107. National Tsing Hua University (107)
  108. Geneva (118)
  109. Birmingham (145)
  110. Southampton (90)
  111. Wageningen (114)
  112. Medical College of Georgia (158)
  113. Technical University of Munich (101)
  114. New South Wales (152)
  115. Illinois-Chicago (197)
  116. Michigan State (122)
  117. Trinity College Dublin (76)
  118. Tokyo Medical and Dental (217)
  119. Nanyang Technological (174)
  120. Technical University of Denmark (122)
  121. Sheffield (137)
  122. York (81)
  123. St Andrews (103)
  124. Nanjing (120)
  125. Lausanne (136)
  126. Glasgow (128)
  127. VU Amsterdam (139)
  128. Twente (185)
  129. Utrecht (143)
  130. Sung Kyun Kwan (230)
  131. Stony Brook (78)
  132. Wake Forest (90)
  133. Helsinki (102)
  134. Basel (95)
  135. Freiburg (132)
  136. Adelaide (73)
  137. Nagoya (206)
  138. Ruhr University Bochum
  139. Sao Paulo (232)
  140. Free University of Berlin (212)
  141. Maryland College Park (98)
  142. Warwick (220)
  143. Technion (221)
  144. Iowa State (156)
  145. Chalmers University of Technology (223)
  146. Dartmouth (99)
  147. RWTH Aachen (182)
  148. Kansas (232)
  149. Swedish University of Agricultural Sciences (199)
  150. Groningen (170)
  151. State University of Campinas (248)
  152. Nottingham (174)
  153. Leeds (168)
  154. Penn State (109)
  155. Maastricht (209)
  156. Zhejiang (197)
  157. Humboldt (178)
  158. Vienna (195)
  159. Hong Kong Polytechnic (149)
  160. Queen Mary London (120)
  161. Aarhus (167)
  162. Sussex (79)
  163. University of Georgia (246)
  164. National Sun Yat-Sen (163)
  165. William and Mary (75)
  166. Kiel (210)
  167. Lancaster (214)
  168. Indiana University (156)
  169. Newcastle, UK (152)
  170. UC Santa Cruz (68)
  171. Aberdeen (149)
  172. Durham
  173. University College Dublin
  174. Liverpool (165)
  175. Dalhousie (193)
  176. University of Delaware (159)
  177. UC Riverside (117)
  178. University of Amsterdam (165)
  179. Surrey (302)
  180. Konstanz (186)
  181. University of South Carolina (214)
  182. Wurzburg (168)
  183. Cape Town (107)
  184. Tokushima (317)
  185. Reading (210)
  186. Stockholm (129)
  187. University of Waterloo, Canada (267)
  188. Washington State University (264)
  189. Copenhagen (177)
  190. Hokkaido (293)
  191. Hawaii (105)
  192. Yonsei (190)
  193. Leicester (216)
  194. Kyushu (294)
  195. Bergen (135)
  196. Shanghai Jiao Tong (258)
  197. Pierre and Marie Curie (140)
  198. ENS De Lyon (100)
  199. Erasmus (159)
  200. Tromso (227)
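The recalculation behind the app presumably amounts to dropping the citations weight and renormalising over the remaining indicators. The sketch below is my own illustration, not THE's actual code, and the indicator scores in the example are made up; the official 2010-11 weights are real. It shows why a university with a weak citations score jumps sharply once citations no longer count.

```python
# Official 2010-11 THE indicator weights (percent).
OFFICIAL_WEIGHTS = {
    "teaching": 30.0,
    "research": 30.0,
    "citations": 32.5,
    "international": 5.0,
    "industry": 2.5,
}

def overall(scores, weights):
    """Weighted average of indicator scores, normalised to the weight total."""
    total = sum(weights.values())
    return sum(scores[k] * w for k, w in weights.items()) / total

def without_citations(scores):
    """Recompute the overall score with citations set to 'not important',
    i.e. dropped and the remaining weights renormalised."""
    weights = {k: w for k, w in OFFICIAL_WEIGHTS.items() if k != "citations"}
    return overall(scores, weights)

# A hypothetical university that is solid everywhere except citations.
example = {"teaching": 80, "research": 75, "citations": 30,
           "international": 60, "industry": 50}

print(round(overall(example, OFFICIAL_WEIGHTS), 1))  # 60.5 with citations
print(round(without_citations(example), 1))          # 75.2 without
```

With citations carrying 32.5 per cent of the weight, a single anomalous indicator can move a university dozens of places, which is exactly what the lists above show.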


Comments

Apart from spam about Viagra, where to meet beautiful Russian women and so on, and one racist diatribe, I have until now published every comment sent to this post. However, I think it is time to indicate a change in policy. I will publish any comment provided it does not contain negative general comments about the character of a specific individual. Thus I will allow critical comments about citation patterns or the scientific validity of specific research, but not, for example, a statement that someone is a "fraud", unless that person has been convicted of fraud in court.

Tuesday, September 21, 2010

From The Students' Room 3

Click here.

From AfghanistanBananistan

I am intrigued by the LSE's ranking.

TBH, it does not really matter where a university ranks in relation to other global universities; it just concerns me when UK universities have odd rankings amongst their own. No one would complain if LSE came 4th or 5th in the UK and 86th in the world, or Warwick came 8th in the UK and 240th in the world.

The RAE claims that LSE has the highest proportion of world-leading research in the UK, and is in the top 3 in every type of analysis of the results (often tied with Oxford). The RAE may not be perfect, but for LSE to suddenly drop in the THES ranking (especially on citations) seems really shocking, and perhaps wrong.

After all, someone from THES wrote a few months back that LSE ranking 66th was 'clearly a big mistake'. This was when THES clearly stated that citations hindered social science and arts institutions. You mean to tell me that LSE ranked 66th (80th in this year's QS ranking) when citations were weighted against the social sciences, yet LSE drops to 86th when citations are not weighted to disadvantage social science institutions?

I guess my main point is that in the QS ranking LSE has ranked in the top 5 in the world every year for social sciences, with citations the key element of the ranking. How can LSE have the 4th best citation count in the social sciences, yet, when citations are normalised into an overall ranking, score so low on citations in the THES ranking? I can understand Warwick, for it may rank 44th for humanities but has to take into account all its own faculties, which may drag it down -- yet LSE only has social sciences.

I find it hard to conceptualise that Sussex and York have more pervasive research throughout ALL their departments than LSE does in social science alone. Bear in mind that LSE, as a specialist institution, has one of the biggest (probably the biggest) social science faculties in the world, with more dedicated social science researchers than anywhere else. So why is its research not cited more? What are the 50-odd fellows of the British Academy doing, and should they not have been appointed if their research is not influential?

In short, does THES not think that something is up with their methodology, just as they admitted last year? Just because some say there is evidence of the world being flat does not make it true. I find it hard to believe that the university that created many fields in the social sciences and has won nearly a quarter of all Nobel Prizes in economics can rank so badly on its citations score. After all, can the fellow social science researchers who voted LSE the 6th best in the world for influence in IR, and the only university to compete with the US economics departments in all economics world rankings, be so wrong?

The same logic applies to Warwick vs. Dundee and Lancaster.

If a ranking of the world's best ever football players had Maradona only at number 20 and Pele at 15, wouldn't the ranking lose credibility? You could rank them purely by influence measured by the number of goals scored, and Pele and Maradona would not rank that well, but we know them as the best players because we have seen them with our own eyes.

P.S. I don't even go to LSE.
From The Students' Room 2

Click here

Dear Phil Baty,

I sent you an email to inquire about the ranking performance of Taiwan's universities.
I am still waiting for your reply.
Please share your opinion.
Thanks!

best regards,

yuching
From the Students' Room

There are some interesting comments about the latest THE rankings at their "The Students' Room"

Here is one from Martin

I appreciate that measuring the research impact of an institution is difficult. Unfortunately, the THE seems to have got it quite badly wrong this time. The best evidence for this is not that Warwick doesn't make it into the top 200 (even though this hurts; yes I am faculty at Warwick), but the fact that the University of Alexandria makes it number 4, based on the output of one single person.


Some suggestions follow:


1. Do not count self-citations. Even better, it is common sense that a citation by someone further from your field should count more than a citation by your former PhD student. Of course, these things are difficult to figure out in an automated way. But one could for example use the collaboration distance (coauthors have distance one, coauthors of coauthors have distance two, etc) to weight citations (with a cap at 4, say).

2. Quality of research is paramount. As already pointed out, it is easy to get many citations for mediocre research if there are sufficiently many other mediocre researchers working in the same area. This is vastly more common than you may think, for the simple reason that it is easier to perform mediocre research than world-class research. Furthermore, you get more recognition as counted by basic citation count, so why not do it?

One way of taking this into account is to give higher weight to citations coming from articles published in highly respected journals. (Similarly, when measuring "research output", higher weight should be given to articles published in journals with high reputation.)

However, please *DO NOT* use the impact factor as a measure of the quality of a journal, as it can be (and is!) very easily manipulated, as the story of the journal "Chaos, Solitons and Fractals" shows. Instead, the only reliable way of assessing the quality of a journal within a given field is to ask researchers in that field to provide their own rankings. Yes, this seems subjective, but unfortunately that's all you are ever going to get, and I can assure you that you will get a very consistent picture within each area. The fact that the "Annals of Mathematics" is the most respected journal in mathematics simply cannot be measured in terms of impact factor.

3. Count current citations to older work. If someone's article turns out to spawn an entire new field of research five years later, it will not show up at all in the current metric. This makes simply no sense. Of course, this doesn't happen all that often, but the reason why top institutions have a reputation is precisely because of those instances in which it happens. Furthermore, there are areas of research (like mathematics) in which the "lifespan" of a good article is measured in decades, which goes way beyond the two to five years that you use as a rule. Counting current citations to older articles would be one small but absolutely essential step to correct this.

4. Measure the total impact of the institution in a field, and not its "average" impact. The only way I can see that the output of one single person can count so much is that this person somehow has an abnormally high weight, probably because there is very little other research output from the U. of Alexandria. If this suspicion is correct (I hope that I am wrong on this one), then universities are effectively penalised for having large (and influential!) departments, and should instead strive to have only a very few but quite prolific researchers on their payroll.

There is probably more, but I am getting hungry now ;-) I very much hope that you will take these comments to heart. Best wishes,

Martin
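Martin's first suggestion is concrete enough to sketch. The following is a hypothetical illustration of my own, not anything THE or Thomson Reuters actually computes: each citation is weighted by the coauthorship distance between cited and citing author (capped at 4, as he proposes), so that self-citations contribute nothing and citations from distant or unconnected authors count most.

```python
from collections import deque

def collaboration_distance(coauthors, a, b, cap=4):
    """Breadth-first search over the coauthorship graph from a to b.
    Distance 0 means the same author; unconnected pairs, or pairs
    further apart than `cap`, are truncated to `cap`."""
    if a == b:
        return 0
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, dist = queue.popleft()
        if dist >= cap:
            continue
        for nxt in coauthors.get(node, ()):
            if nxt == b:
                return dist + 1
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return cap  # no path found within the cap

def weighted_citation_count(coauthors, cited, citers):
    """Sum of distance weights over all citing authors: a self-citation
    adds zero, a citation from an unconnected author adds the full cap."""
    return sum(collaboration_distance(coauthors, cited, c) for c in citers)

# Toy coauthorship graph: A has written with B, B with C; D is unconnected.
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}

# A cited by itself (0), a coauthor (1), a coauthor's coauthor (2),
# and a stranger (4): weighted count is 7 instead of a raw count of 4.
print(weighted_citation_count(graph, "A", ["A", "B", "C", "D"]))
```

Whatever its merits, a scheme like this would at least make the kind of dense self- and coauthor-citation circle alleged in the Alexandria case almost worthless.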
Back to Citations

I was hoping to get away from the citations indicator in the THE rankings for a while but there are several comments here and elsewhere that need to be discussed.

First, I had assumed that the very high citation scores for Alexandria, and to a lesser extent other universities, were the result of a large number of citations that, while perhaps excessive, were from reputable scholars. Many of the citations to Alexandria University papers are to and from papers by Mohamed El Naschie. I had assumed that his CV was evidence that he was a distinguished and esteemed scientist. Looking closely at his CV, there seem to be a number of points that require clarification. There are references to academic appointments for a single calendar year, not the academic year as one would expect. There is a reference to a professorship at "DAMTP, Cambridge", but no college is mentioned. Also, there seems to be a period when El Naschie was a professor simultaneously at the Free University of Brussels, DAMTP Cambridge and the University of Surrey.

I hope that these points can be clarified. The TR citations indicator would still be a problem even if it were being skewed by heavy citation of groundbreaking research, but it would be more of a problem if there were any doubts, whether or not justified, about the excellence of that research.

Monday, September 20, 2010

Alexandria Gets the News

The news about being among the top 200 universities in the world has finally reached Alexandria. The reaction is very predictable:


"“I believe we are well deserving of being on the list,” professor of economics Amr Hussein told Bikya Masr. “We have worked hard to improve our system of education and it is showing that we are succeeding in doing so.”...

Hend Hanafi, President of Alexandria University, told local media that she is proud of the ranking and hopes the university will continue to make efforts to consistently improve the quality of education at the university."

Will somebody please send the President an account of what happened to Universiti Malaya and its Vice-Chancellor after THE and QS put them in the top 100 in 2004.
Omissions

Several writers have noticed the absence of any Israeli universities from the top 200. In the QS rankings this year there were three. Those who have downloaded the iPad app will have noticed that there are only two in the top 400. So what happened to Tel Aviv and the Hebrew University of Jerusalem?

There is an article about this in Haaretz.
"Haaretz has learned that most Israeli universities were not on the list because they failed to respond to repeated requests for information, including on faculty and students, which is necessary for the listing.

TAU and the Hebrew University say that they never received such a request from THE. According to THE, only the Technion and Bar-Ilan University responded with information, but they were ranked 221 and 354 respectively.

As for the other universities, the editor of the ranking, Phil Baty, told Haaretz that although it is upsetting for Israel, he hoped that the Israeli universities would recognize the amount of serious work invested in creating the ranking and the degree to which the methodology was transparent, and would participate in the initiative, like other universities have done. He also expressed certainty that next year "they will be included."

Didn't do their homework

THE says that more than 6,000 universities participated in the ranking and most provided the necessary information.

A spokesperson for the Hebrew University responded that contrary to the claim of the survey's editors, "following an examination we did not find any such request [for information]. When we asked for the correspondence to the university on the subject, they could not provide it. The university is saddened by the fact that the editors of the ranking did not carry out their work responsibly, and thus harmed the university." "
There are other surprising omissions, such as all the Indian Institutes of Technology, the University at Buffalo (SUNY), and the Catholic University of Louvain (the French one -- the Dutch one is there at 120).