Times Higher Education and Thomson Reuters have said that they wish to engage and that they will be happy to debate their new rankings methodology. So far we have not seen much sign of a debate, although I will admit that more may have been said at the recent seminars in London and Spain than got into print. In particular, they have been rather reticent about defending the citations indicator, which gives the whole ranking a very distinctive cast and which is likely to drag down what could have been a promising development in ranking methodology.
First, let me comment on the few attempts to defend this indicator, which accounts for nearly a third of the total weighting and for more in some of the subject rankings. It has been pointed out that David Willetts, British Minister for Universities and Science, has congratulated THE on its new methodology.
“I congratulate THE for reviewing the methodology to produce this new picture of the best in higher education worldwide. It should prompt all of us who care about our universities to see how we can improve the range and quality of the data on offer. Prospective students — in all countries — should have good information to hand when deciding which course to study, and where. With the world to choose from, it is in the interests of universities themselves to publish figures on graduate destinations as well as details of degree programmes.”

Willetts has praised THE for reviewing its methodology. So have many of us, but that is not quite the same as endorsing what has emerged from that review.
Steve Smith, President of Universities UK and Vice-Chancellor of Exeter University, is explicit in supporting the new rankings, especially the citations component.
But, as we shall see in a moment, there are serious issues with the robustness of citations as a measure of research impact and, if used inappropriately, they can become indistinguishable from a subjective measure of reputation.
The President of the University of Toronto makes a similar point and praises the new rankings’ reduced emphasis on subjective reputational surveys and refers to the citations (knowledge transfer?) indicator.
It might be argued that this indicator is noteworthy for revealing that some universities possess hitherto unsuspected centres of research excellence. An article by Phil Baty in THE of the 16th of September refers to the most conspicuous case, a remarkably high score for citations by Alexandria University, which according to the THE rankings has had a greater research impact than any university in the world except Caltech, MIT and Princeton. Baty suggests that there is some substance to Alexandria University’s extraordinary score. He refers to Ahmed Zewail, a Nobel prize winner who left Alexandria with a master’s degree some four decades ago. Then he mentions some frequently cited papers by a single author in one journal.
The author in question is Mohamed El Naschie, who writes on mathematical physics and the journals – there are two that should be given the credit for Alexandria’s performance, not one – are Chaos, Solitons and Fractals and the International Journal of Nonlinear Sciences and Numerical Simulation. The first is published by Elsevier and was until recently edited by El Naschie. It has published a large number of papers by El Naschie and these have been cited many times by himself and by some other writers in CSF and IJNSNS.
The second journal is edited by Ji-Huan He of Donghua University in Shanghai, China with El Naschie as co-editor and is published by the Israeli publishing company, Freund Publishing House Ltd of Tel Aviv.
An amusing digression. In the instructions for authors in the journal the title is given as International Journal of Nonlinear Sciences and Numerical Stimulation. This could perhaps be described as a Freundian slip.
Although El Naschie has written a large number of papers and these have been cited many times, his publication and citation record is far from unique. He is not, for example, found in the ISI list of highly cited researchers. His publications and citations were perhaps necessary to push Alexandria into THE’s top 200 universities but they were not enough by themselves. This required a number of flaws in TR’s methodology.
First, TR assigned a citation impact score comparing the actual citations of a paper with a benchmark based on the expected number of citations for that subject in that year. Mathematics is a field where citations are relatively infrequent and usually occur a few years after publication. Since El Naschie published in a field in which citations are relatively scarce, and published quite recently, this boosted the impact score of his papers. The reason for using this approach is clear and sensible: to overcome the distorting effects of varying citation practices in different disciplines when comparing individual researchers or departments. But there are problems if this method is used to compare whole universities. A great deal depends on when the cited and citing articles are published and in which subject they were classified by TR.
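To make the mechanics concrete, here is a minimal sketch of field-and-year normalisation as I understand it. The benchmark values and the sample papers are invented for illustration; TR's actual reference values, subject scheme and aggregation are not public.

```python
# Hypothetical world-average citations per paper, by (field, year) cell.
# These numbers are illustrative only.
EXPECTED_CITATIONS = {
    ("mathematics", 2007): 1.8,        # low-citation field, recent year
    ("molecular biology", 2007): 12.5, # high-citation field
}

def normalised_impact(papers):
    """papers: list of (field, year, citations). Return the average of
    actual citations divided by the expected count for that field and year."""
    ratios = [c / EXPECTED_CITATIONS[(field, year)]
              for field, year, c in papers]
    return sum(ratios) / len(ratios)

# A recent, heavily cited mathematics paper scores far above world average,
# because the benchmark in that cell is small.
print(normalised_impact([("mathematics", 2007, 20)]))        # ~11x world average
print(normalised_impact([("molecular biology", 2007, 20)]))  # ~1.6x world average
```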
A question for TR. How are articles classified? Is it possible to influence the category in which they are placed by the use of key words or the wording of the title?
Next, note that TR were measuring average citation impact. A consequence of this is that the publication of large numbers of papers that are cited less frequently than the high fliers could drag down the score. This explains an apparent oddity of the citation scores in the 2010 THE rankings. El Naschie listed nine universities as his affiliation in varying combinations between 2004 and 2008, yet it was only Alexandria that managed to leave the Ivy League and Oxbridge standing in the research impact dust. Recently, El Naschie’s list of affiliations has consisted of Alexandria, Cairo, Frankfurt University and Shanghai Jiao Tong University.
What happened was quite simply that all the others were producing so many papers that El Naschie’s made little or no difference. For once, it would be quite correct if El Naschie announced that he could not have done it without the support of his colleagues. Alexandria University owes its success not only to El Naschie and his citers but also to all those researchers who refrained from submitting articles to ISI–indexed journals or conference proceedings.
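A toy calculation makes the dilution effect clear. The figures are my own inventions, not TR's data; the point is only that an average-based indicator lets a handful of highly cited papers dominate when total output is small.

```python
def average_impact(normalised_scores):
    """Mean of per-paper normalised citation scores (1.0 = world average)."""
    return sum(normalised_scores) / len(normalised_scores)

hot_papers = [10.0] * 20                         # a few papers far above world average
small_university = hot_papers + [0.5] * 80       # little other indexed output
large_university = hot_papers + [0.8] * 10_000   # thousands of ordinary papers

print(average_impact(small_university))  # ~2.4: looks world-beating
print(average_impact(large_university))  # ~0.8: the same hot papers barely register
```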
TR have some explaining to do here. If an author lists more than one affiliation, are they all counted? Or are fractions awarded for each paper? Is there any limit on the number of affiliations that an author may have? I think that it is two, but would welcome clarification.
As for the claim that Alexandria is strong in research, a quick look at the Scimago rankings is enough to dispose of that. It is ranked 1,047th for total publications over a decade in the 2010 rankings, which admittedly include many non-university organizations. Also, one must ask how much of El Naschie’s writing was actually done in Alexandria, seeing that he had eight other affiliations between 2004 and 2008.
It has to be said that even if El Naschie is, as has been claimed in comments on Phil’s THE article and elsewhere, one of the most original thinkers of our time, it is strange that THE and TR should use a method that totally undermines their claim that the new methodology is based on evidence rather than reputation. By giving any sort of credence to the Alexandria score, THE are asking us to believe that Alexandria is strong in research because precisely one writer is highly reputed by himself and a few others. Incidentally, will TR tell us what score Alexandria got in the research reputation survey?
I am not qualified to comment on the scientific merits of El Naschie’s work. At the moment it appears, judging from the comments in various physics blogs, that among physicists and mathematicians there are more detractors than supporters. There are also few documented signs of conventional academic merit in recent years, such as permanent full-time appointments or research grants. None of his papers between 2004 and 2008 in ISI-indexed journals, for example, apparently received external funding. His affiliations, where documented, turn out to be honorary, advisory or visiting. To be fair, readers might wish to visit El Naschie’s site. I will also publish any comments of a non-libellous nature that support or dispute the scientific merits of his writings.
Incidentally, it is unlikely that Alexandria’s score of 19.3 for internationalisation was faked. TR use a logarithm. If there were zero international staff and students a university would get a score of 1 and a score of 19.3 actually represents a small percentage. On the other hand, I do wonder whether Alexandria counted those students in the branch campuses in Lebanon, Sudan and Chad.
Finally, TR did not take the very simple and obvious step of not counting individual self-citations. Had they done so, they would have saved everybody, including themselves, a lot of trouble. It would have been even better if they had excluded intra-institutional and intra-journal citation. See here for the role of citations among the editorial board of IJNSNS in creating an extraordinarily high Journal Impact Factor.
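For what it is worth, here is a rough sketch of that "obvious step": discard a citation whenever the citing and cited papers share an author. The records below are invented, and real bibliometric cleaning (name variants, common surnames and so on) is much messier.

```python
def count_citations(papers, exclude_self=True):
    """papers: list of dicts with 'authors' (list of names) and 'cited_by'
    (a list of author lists, one per citing paper). Count citations,
    optionally skipping any citation that shares an author with the cited paper."""
    total = 0
    for paper in papers:
        authors = set(paper["authors"])
        for citing_authors in paper["cited_by"]:
            if exclude_self and authors & set(citing_authors):
                continue  # at least one common author: an individual self-citation
            total += 1
    return total

papers = [{"authors": ["M.S. El Naschie"],
           "cited_by": [["M.S. El Naschie"], ["J.-H. He"], ["A. N. Other"]]}]
print(count_citations(papers, exclude_self=False))  # 3
print(count_citations(papers, exclude_self=True))   # 2
```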
THE and TR have done everyone a great service by highlighting the corrosive effect of self citation on the citations tracking industry. It has become apparent that there are enormous variations in the prevalence of self citation in its various forms and that these have a strong influence on the citation impact score.
Professor Dirk Van Damme is reported to have said at the London seminar that the world’s elite universities were facing a challenge from universities in the bottom half of the top 200. If this were the case then THE could perhaps claim that their innovative methodology had uncovered reserves of talent ignored by previous rankings. But what exactly was the nature of the challenge? It seems that it was the efficiency with which the challengers turned research income into citations. And how did they do that?
I have taken the simple step of dividing the score for citations by the score for the research indicator (which includes research income) and then sorting the resulting values. The top ten are Alexandria, Hong Kong Baptist University, Barcelona, Bilkent, William and Mary, ENS de Lyon, Royal Holloway, Pompeu Fabra, University College Dublin, the University of Adelaide.
Seriously, these are a threat to the world’s elite?
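For anyone who wants to repeat the exercise, the calculation is trivial; the scores below are placeholders rather than the actual 2010 THE figures.

```python
# Divide each university's citations score by its research score and sort,
# highest ratio first. Values here are illustrative stand-ins.
scores = {
    "Alexandria":      {"citations": 99.8, "research": 10.0},
    "Example Univ. A": {"citations": 60.0, "research": 55.0},
    "Example Univ. B": {"citations": 85.0, "research": 90.0},
}

ratios = {name: s["citations"] / s["research"] for name, s in scores.items()}
for name, ratio in sorted(ratios.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {ratio:.2f}")
```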
The high scores for citations relative to research were the result of a large number of citations or a small number of total publications or both. It is of interest to note that in some cases the number of citations was the result of assiduous self-citation.
This section of the post contained comments about comparative rates of self citation among various universities. The method used was not correct and I am recalculating.
As noted already, using the THE iPad app to change the importance attached to various indicators can produce very different results. This is a list of universities that rise more than a hundred places when the citations indicator is set to ‘not important’. They have suffered perhaps because of a lack of super-cited papers, perhaps also because they just produced too many papers.
Loughborough
Kyushu
Sung Kyun Kwan
Texas A and M
Surrey
Shanghai Jiao Tong University
Delft University of Technology
National Chiao Tung University (Taiwan)
Royal Institute of Technology Sweden
Tokushima
Hokkaido
Here is a list of universities that fall more than 100 places when the citations indicator is set to ‘not important’. They have benefitted from a few highly cited papers or low publication counts or a combination of the two.
Boston College
University of California Santa Cruz
Royal Holloway, University of London
Pompeu Fabra
Bilkent
Kent State University
Hong Kong Baptist University
Alexandria
Barcelona
Victoria University Wellington
Tokyo Metropolitan University
University of Warsaw
There are many others that rise or fall seventy, eighty or ninety places when citations are taken out of the equation. This is not a case of a few anomalies. The whole indicator is one big anomaly.
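For the curious, here is a hedged reconstruction of that reweighting exercise: give the citations indicator zero weight, recompute the overall scores and see how far each institution moves. The weights follow the published headline scheme (32.5% for citations), but the two sample score profiles are invented.

```python
DEFAULT_WEIGHTS = {"teaching": 0.30, "research": 0.30, "citations": 0.325,
                   "industry_income": 0.025, "international_mix": 0.05}

def ranking(universities, weights):
    """Weighted sum of indicator scores, then rank positions (1 = best)."""
    totals = {name: sum(weights[k] * v for k, v in scores.items())
              for name, scores in universities.items()}
    ordered = sorted(totals, key=totals.get, reverse=True)
    return {name: position + 1 for position, name in enumerate(ordered)}

def rank_changes(universities):
    """Rank change when citations are set to 'not important' (weight zero).
    Negative means the university falls once citations stop counting."""
    without_citations = dict(DEFAULT_WEIGHTS, citations=0.0)
    before = ranking(universities, DEFAULT_WEIGHTS)
    after = ranking(universities, without_citations)
    return {name: before[name] - after[name] for name in universities}

sample = {
    "Citation outlier": {"teaching": 45, "research": 40, "citations": 100,
                         "industry_income": 50, "international_mix": 50},
    "Broad performer":  {"teaching": 60, "research": 65, "citations": 20,
                         "industry_income": 60, "international_mix": 60},
}
print(rank_changes(sample))  # the outlier drops once citations are switched off
```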
Earlier, Jonathan Adams said, in a column that has attracted one comment:
"Disciplinary diversity is an important factor, as is international diversity. How would you show the emerging excellence of a really good university in a less well known country such as Indonesia? This is where we would be most controversial, and most at risk, in using the logic of field-normalisation to add a small weighting in favour of relatively good institutions in countries with small research communities. Some may feel that we got that one only partially right."
The rankings do not include universities in Indonesia, really good or otherwise. The problem is with good, mediocre and not very good universities in the US, UK, Spain, Turkey, Egypt, New Zealand, Poland and elsewhere. It is a huge weighting, not a small one; the universities concerned range from relatively good to relatively bad; in one case the research community seems to consist of one person; and many are convinced that TR got that one totally wrong.
Indonesia may be less well known to TR but it is very well known to itself and neighbouring countries.
I will publish any comments by anyone who wishes to defend the citation indicator of the new rankings. Here are some questions they might wish to consider.
Was it a good idea to give such a heavy weighting to research impact: 32.5% in the overall rankings and 37.5% in at least two of the subject rankings? Is it possible that commercial considerations, citations data being a lucrative business for TR, had something to do with it?
Are citations such a robust indicator? Is there not enough evidence now to suggest that manipulation of citations, including self-citation, intra-institutional citation and intra-journal citation, is so pervasive that the robustness of this measure is very slight?
Since there are several ways to measure research impact, would it not have been a good idea to have used several methods? After all, Leiden University has several different ways of assessing impact. Why use only one?
Why set the threshold for inclusion so low, at 50 papers per year?
12 comments:
Hi Richard, great post. Very thorough. Your "El Naschie’s list of affiliations" link attempts to connect to my blog, I think, but it's broken. Relevant posts on my blog include Introduction to Mohamed El Naschie and The many titles of Dr. M.S. El Naschie.
Thank you for your assessment of the THE/TR methodology, which makes many useful points.
> To be fair, readers might wish to visit El Naschie’s site. I will also publish any comments of a non-libellous nature that support or dispute the scientific merits of his writings.
The following comments are not libelous, but instead are easily documented facts:
1) El Naschie's work has absolutely zero impact in either mathematics or physics. The tiny number of researchers who recognize his name know it only from the self-publication / self-citation scandal at Chaos, Solitons, and Fractals before he was forced to resign as editor.
2) It is not common practice for editors of journals to self-publish as many as 50 articles per year, including many with significantly overlapping content.
3) El Naschie's articles contain many naive errors in arithmetic, as well as so many typographical errors that many do not appear to have been proofread.
(Recall again these are easily documented facts, hence do not constitute libel, and need not be removed.)
Readers might also wish to visit
http://elnaschiewatch.blogspot.com/
which appears higher in Google searches than the El Naschie site you link to.
Also, your link to El Naschie's RationalWiki page is broken because you appended a period.
You say a lot about El Naschie's affiliations. It should never be assumed that El Naschie is affiliated with an institution merely because he says so. The institution must be asked. For example, he has been kicked off the arXiv for arrogating affiliation with Cambridge University.
Richard, I am relieved to see you naming names. I was worried about you.
"He is not, for example, found in the ISI list of highly cited researchers."
This is not entirely true, e.g., see:
http://sciencewatch.com/dr/rs/10jan-rs/
(El Naschie was a "rising star" in physics from June 2009 to August 2009)
Richard, by the way, about the four affiliations El Naschie gives, which you linked to in my blog post Party in Shanghai for El Naschie's 65th birthday:
Alexandria, Cairo, Frankfurt and Shanghai.
I'm pretty sure none of those places ever paid him a salary. Any affiliations that exist at all are purely honorary and based on the desire to flatter him, as he is a rich potential benefactor. On El Naschie's website he claims to have been made a full professor, but it's not true.
There is a long story about his alleged association with Frankfurt which you can read about. See Fortasse pecunia olet interdum and then this update. In a nutshell, he's a member of a private, non-university friends group that exists to solicit funds for the physics department. He uses that to claim affiliation with the U.
El Naschie himself has just written about the controversy: El Naschie on Alexandria University's THE rank.
Hot off the press. El Naschie has just written a newspaper column criticizing "global companies [who] win millions by classifying universities."
thanks amigo! great post!
really an eye opener for me.
- Robson
It seems that you are an expert in this field; your remarks are very interesting, thank you.
- Daniel
Very good stuff.