Tuesday, July 12, 2011

This WUR had such promise

The new Times Higher Education World University Rankings of 2010 promised much: new indicators based on income, a reformed survey that included questions on postgraduate teaching, and a reduced weighting for international students.

But the actual rankings that came out in September were less than impressive. Dividing the year's intake of undergraduate students by the total number of academic faculty looked rather odd. Counting the ratio of doctoral students to undergraduates, while omitting master's programmes, was an invitation to herd marginal students into substandard doctoral programmes.

The biggest problem, though, was the insistence on giving a high weighting, somewhat higher than originally proposed, to citations. Nearly a third of the total weighting was assigned to the average citations per paper, normalized by field and year. The collection of citation statistics is the bread and butter of Thomson Reuters (TR), THE's data collector, and one of their key products is the InCites system, which apparently was the basis for their procedure in the 2010 ranking exercise. InCites compares the citation records of academics with international benchmark scores for each year and field. Of course, those who want to find out exactly where they stand need to know what the benchmark scores are, and those cannot easily be calculated without Thomson Reuters.
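To make the mechanics concrete, here is a minimal sketch of field- and year-normalized citation impact in the spirit of that approach. The baseline figures and field labels are invented for illustration; the real benchmarks are proprietary to Thomson Reuters.

```python
# Minimal sketch of field- and year-normalized citation impact.
# The baseline values below are invented for illustration; the real
# InCites benchmarks are proprietary to Thomson Reuters.

# Hypothetical world-average citations per paper, by (field, year)
BASELINES = {
    ("applied mathematics", 2009): 0.4,
    ("molecular biology", 2009): 5.0,
}

def normalized_impact(papers):
    """Mean ratio of actual to expected citations for a list of
    (field, year, citations) tuples."""
    ratios = [cites / BASELINES[(field, year)]
              for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# Ten citations in a sparsely cited field score 25 times the world
# average; the same ten in a heavily cited field score only twice it.
print(normalized_impact([("applied mathematics", 2009, 10)]))  # 25.0
print(normalized_impact([("molecular biology", 2009, 10)]))    # 2.0
```

The same raw citation count can thus translate into wildly different scores depending on the field label attached to the paper.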

Over the last two or three decades the number of citations received by papers, along with the amount of money attracted from funding agencies, has become an essential sign of scholarly merit. Things have now reached the point where, in many universities, research is simply invisible unless it has been funded by an external agency and then published in a journal noted for being cited frequently by writers who contribute to journals that are frequently cited. The boom in citations has begun to resemble classical share and housing bubbles as citations acquire an inflated value increasingly detached from any objective reality.

It has become clear that citations can be manipulated as much as, perhaps more than, any other indicator used by international rankings. Writers can cite themselves, they can cite co-authors, and they can cite those who cite them. Journal editors and reviewers can make suggestions to submitters about whom to cite. And so on.

Nobody, however, realized quite how fragile citations could become until the unplanned intersection of THE's indicator with a bit of self-citation and mutual citation by two peripheral scientific figures raised questions about the whole business.

One of these two was Mohamed El Naschie, who comes from a wealthy Egyptian family. He studied in Germany and took a PhD in engineering at University College London. He then taught in Saudi Arabia while writing several papers that appear to have been of an acceptable, though not very remarkable, academic standard.

But this was not enough. In 1993 he started a new journal dealing with applied mathematics and theoretical physics called Chaos, Solitons and Fractals (CSF), published by the leading academic publisher Elsevier. El Naschie's journal published many papers written by himself. He has, to his credit, avoided exploiting junior researchers or insinuating himself into research projects to which he has contributed little. Most of his papers do not appear to be research but rather theoretical speculations, many of which concern the disparity between the mathematics that describes the universe and that which describes subatomic space, and suggestions for reconciling the two.

Over the years El Naschie has listed a number of universities as affiliations, the University of Alexandria being among the most recent. It was not clear, however, what he did at or for the university, and it was only recently, after the publication of the 2010 THE World University Rankings, that any official connection was documented.

El Naschie does not appear to be highly regarded by physicists and mathematicians, as noted earlier in this blog, and he has been criticized severely in the physics and mathematics blogosphere. He has, it is true, received some very vocal support, but he is not really helped by the extreme enthusiasm and uniformity of style of his admirers. Here is a fairly typical example, from the comments in Times Higher Education:
“As for Mohamed El Naschie, he is one of the most original thinkers of our time. He mastered science, philosophy, literature and art like very few people. Although he is an engineer, he is self taught in almost everything, including politics. Now I can understand that a man with his charisma and vast knowledge must be the object of envy but what is written here goes beyond that. My comment here will be only about what I found out regarding a major breakthrough in quantum mechanics. This breakthrough was brought about by the work of Prof. Dr. Mohamed El Naschie”
Later, Ji-Huan He, a professor at Donghua University in China and an editor at El Naschie's journal, started a similar publication, the International Journal of Nonlinear Sciences and Numerical Simulation (IJNSNS), whose editorial board included El Naschie. This journal was published by the respectable and unpretentious Israeli company Freund of Tel Aviv. Ji-Huan He's journal has published 29 of his own papers and 19 by El Naschie. The two journals have contained articles that cite and are cited by articles in the other. Since they deal with similar topics some degree of cross-citation is to be expected, but here it seems unusually extensive.

Let us look at how El Naschie worked. An example is his paper 'The theory of Cantorian spacetime and high energy particle physics (an informal review)', published in Chaos, Solitons and Fractals, 41/5, 2635-2646, September 2009.

There are 58 citations in the bibliography. El Naschie cites himself 24 times: 20 citations to papers in Chaos, Solitons and Fractals and 4 to papers in IJNSNS. Ji-Huan He is cited twice, along with four other authors from CSF. The paper itself has been cited 11 times, ten of them in issues of CSF published later that year.

Articles in mathematics and theoretical physics do not get cited very much. Scholars in those fields prefer to spend time thinking about an interesting paper before settling down to comment. Hardly any papers receive even a single citation in the year of publication. Here we have ten for one paper, which might easily be 100 times the average for that discipline and that year.

The object of this exercise had nothing to do with the THE rankings. What it did do was push El Naschie's journal into the top ranks of scientific journals as measured by the Journal Impact Factor, that is, the number of citations per paper within a two-year period. It also meant that for a brief period El Naschie was listed by Thomson Reuters' Science Watch as a rising star of research.
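For readers unfamiliar with the measure: the impact factor for a given year is, roughly, the citations received that year by items the journal published in the two preceding years, divided by the number of those items. A sketch with invented figures:

```python
# Rough sketch of the Journal Impact Factor: citations received in a
# given year to items published in the two preceding years, divided by
# the number of those items. The figures are invented for illustration.

def impact_factor(cites_to_prev_two_years, items_in_prev_two_years):
    return cites_to_prev_two_years / items_in_prev_two_years

# A journal with 400 papers over two years attracting 2,000 prompt
# citations gets an impact factor of 5.0. A burst of intra-journal
# self-citation inflates the numerator directly.
print(impact_factor(2000, 400))  # 5.0
```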

Eventually, Elsevier appointed a new editorial board at CSF that did not include El Naschie. The journal did, however, continue to refer to him as the founding editor. Since then the number of citations has declined sharply.

Meanwhile, Ji-Huan He was also accumulating a large number of citations, many of them from conference proceedings that he had organized. He was launched into the exalted ranks of the ISI Highly Cited Researchers, and his journal topped the citation charts in mathematics. Unfortunately, early this year Freund sold off its journals to the reputable German publisher De Gruyter, which appointed a new editorial board that included neither him nor El Naschie.

El Naschie, He and a few others have been closely scrutinized by Jason Rush, a mathematician formerly of the University of Washington. Rush was apparently infuriated by El Naschie's unsubstantiated claims to have held senior positions at a variety of universities including Cambridge, Frankfurt, Surrey and Cornell. Since 2009 he has closely, perhaps a little obsessively, chronicled the activities of El Naschie and those associated with him on his blog, El Naschie Watch, which has unearthed most of what is known about El Naschie and He.

Meanwhile, Thomson Reuters were preparing their analysis of citations for the THE rankings. They used the InCites system and compared the number of citations with benchmark scores representing the average for year and field.
This meant that a high score on this criterion did not necessarily represent a large number of citations. It could simply represent more citations than normal over a short period in fields where citation was infrequent and, perhaps more significantly since we are dealing with averages, a small total number of publications. Thus Alexandria, with only a few publications but listed as the affiliation of an author cited much more frequently than usual in theoretical physics and applied mathematics, did spectacularly well.
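A toy calculation, with invented numbers, shows how this works. Assume a field where the benchmark is 0.5 citations per paper:

```python
# Invented illustration of how a small publication count plus one
# heavily cited author can dominate a normalized average.

BENCHMARK = 0.5  # hypothetical expected citations per paper in the field

def mean_normalized_score(citation_counts):
    return sum(c / BENCHMARK for c in citation_counts) / len(citation_counts)

# Fifty papers, forty-nine of them uncited and one cited 100 times...
small_output = [100] + [0] * 49
# ...outscore a thousand papers each cited once (twice the benchmark).
large_output = [1] * 1000

print(mean_normalized_score(small_output))  # 4.0
print(mean_normalized_score(large_output))  # 2.0
```

With a publication threshold of only 50 papers, a single author's citation burst is enough to move an entire university.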


This is rather like declaring Norfolk (very flat, according to Noël Coward) the most mountainous county in England because of a few hillocks that are nonetheless relatively much higher than the surrounding plains.

Thomson Reuters would have done themselves a lot of good had they taken any of several sensible courses: using more than one indicator of research impact, such as total citations, citations per faculty, the h-index or references in social media; allocating a smaller weighting to the indicator; imposing a reasonable threshold number of publications instead of just 50; excluding self-citations and intra-journal citations; or devising a formula to detect mutual citation.
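Of those alternatives, the h-index is particularly resistant to a single outlier. A minimal implementation, with illustrative inputs:

```python
# The h-index: the largest h such that at least h papers have at
# least h citations each. One runaway paper barely moves it, which is
# one reason it resists the kind of gaming described above.

def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([100, 0, 0, 0]))      # 1 -- the outlier counts only once
print(h_index([9, 7, 6, 3, 2, 1]))  # 3 -- rewards sustained citation
```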

So, in September THE published its rankings with the University of Alexandria in the top 200 overall and in fourth place for research impact, ahead of Oxford, Cambridge and most of the Ivy League. Not bad for a university that had not even been counted by HEEACT, QS or the Shanghai rankings, and that in 2010 had lagged behind two other institutions in Alexandria itself in Webometrics.

When the rankings were published, THE pointed out that Alexandria had once had a famous library and that a former student had gone on to the USA and eventually won a Nobel Prize decades later. Still, they did concede that the success of Alexandria was mainly due to one "controversial" author.

Anyone with access to the Web of Science could determine in a minute precisely who the controversial author was. For a while it was unclear exactly how a few dozen papers and a few hundred citations could put Alexandria among the world's elite. Some observers wasted time wondering whether Thomson Reuters had been counting papers from a community college in Virginia or Minnesota, a branch of Louisiana State University, or federal government offices in the Greater Washington area. Eventually it became clear that El Naschie could not, whatever he himself asserted, have done it by himself: he needed the help of the very distinctive features of Thomson Reuters' methodology.

There were other oddities in the 2010 rankings. Some might have accepted a high placing for Bilkent University in Turkey, which was well known for its academic English programmes. It also had one much-cited article whose apparent impact was increased because it was classified as multidisciplinary, usually a little-cited category, so that it scored well above the world benchmark. But when regional patterns were analyzed, the rankings began to look rather strange, especially on the research impact indicator. In Australia, the Middle East, Hong Kong and Taiwan, the order of universities looked rather different from what local experts expected. Hong Kong Baptist University the third best in the SAR? Pohang University of Science and Technology so much better than Yonsei or KAIST? Adelaide the fourth best Australian university?

In the UK or the US these placings might seem plausible, or at least not worth bothering about. But in the Middle East the idea of Alexandria as the top university even in Egypt is a joke, and the places awarded to the others look very dubious.

THE and Thomson Reuters tried to shrug off the complaints by saying that there were just a few outliers, which they were prepared to debate, and that anyone who criticized them had a vested interest in the old, discredited THE-QS rankings. They dropped hints that the citations indicator would be reviewed, but so far nothing specific has emerged.

A few days ago, however, Phil Baty of THE seemed to imply that there was nothing wrong with the citations indicator:
Normalised data allow fairer comparisons, and that is why Times Higher Education will employ it for more indicators in its 2011-12 rankings, says Phil Baty.
One of the most important features of the Times Higher Education World University Rankings is that all our research citations data are normalised to take account of the dramatic variations in citation habits between different academic fields.
Treating citations data in an “absolute manner”, as some university rankings do, was condemned earlier this year as a “mortal sin” by one of the world’s leading experts in bibliometrics, Anthony van Raan of the Centre for Science and Technology Studies at Leiden University. In its rankings, Times Higher Education gives most weight to the “research influence” indicator – for our 2010-11 exercise, this drew on 25 million citations from 5 million articles published over five years. The importance of normalising these data has been highlighted by our rankings data supplier, Thomson Reuters: in the field of molecular biology and genetics, there were more than 1.6 million citations for the 145,939 papers published between 2005 and 2009; in mathematics, however, there were just 211,268 citations for a similar number of papers (140,219) published in the same period.
To ignore this would be to give a large and unfair advantage to institutions that happen to have more provision in molecular biology, say, than in maths. It is for this crucial reason that Times Higher Education’s World University Rankings examine a university’s citations in each field against the global average for that subject.

Yes, but when we are assessing hundreds of universities in very narrowly defined fields, we start running into quite small samples that can be affected by deliberate manipulation or by random fluctuations.
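A quick simulation, with invented parameters, illustrates the point about random fluctuation: the normalized average for an institution with a handful of papers in a narrow field swings far more widely than that of one with thousands.

```python
# Invented simulation of the small-sample problem: citation counts are
# heavily skewed, so the mean over a few papers fluctuates wildly.

import random

random.seed(42)
BENCHMARK = 0.5  # hypothetical expected citations per paper

def simulated_score(n_papers):
    # Draw citation counts from a skewed (Pareto-like) distribution.
    cites = [random.paretovariate(2) - 1 for _ in range(n_papers)]
    return sum(c / BENCHMARK for c in cites) / n_papers

for n in (30, 3000):
    scores = [simulated_score(n) for _ in range(1000)]
    print(f"{n:>4} papers: scores range from "
          f"{min(scores):.2f} to {max(scores):.2f}")
```

The narrower the field and the smaller the institution's output, the more a ranking built on such averages rewards luck, or manipulation, rather than sustained performance.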

Another point is that if there are many more journals, papers, citations and grants in oncology or genetic engineering than in the spatialization of gender performativity or the influence of Semitic syntax on Old Irish, then perhaps society is telling us something about what it values, and that should not be dismissed so easily.

So it could be that we are going to get the University of Alexandria in the top 200 again, perhaps joined by Donghua University.

At the risk of being repetitive, there are a few simple things that Times Higher and TR could do to make the citations indicator more credible, and there are more ways than one of measuring research excellence. Possibly they are thinking about them, but so far there is no sign of it.

The credibility of last year's rankings has declined further with the decisions of the judge presiding over the libel case brought by El Naschie against Nature (see here for commentary). Until now it could be claimed that El Naschie was a well-known scientist, by virtue of the large number of citations he had received, or at least an interesting and controversial maverick.

El Naschie is pursuing a case against Nature for publishing an article that suggested his writings were not of a high quality and that those published in his journal did not appear to be properly peer reviewed.

The judge has recently ruled that El Naschie cannot proceed with a claim for specific damages, since he has not brought any evidence for one. He can only go ahead with a claim for general damages for loss of reputation and hurt feelings. Even here, it looks like tough going: El Naschie seems unwilling or unable to find expert witnesses to testify to the scientific merits of his papers.

"The Claimant is somewhat dismissive of the relevance of expert evidence in this case, largely on the basis that his field of special scientific knowledge is so narrow and fluid that it is difficult for him to conceive of anyone qualifying as having sufficient "expert" knowledge of the field. Nevertheless, permission has been obtained to introduce such evidence and it is not right that the Defendants should be hindered in their preparations."

He also seems to have problems locating records that would demonstrate that his many articles published in Chaos, Solitons and Fractals were adequately reviewed:
  1. The first subject concerns the issue of peer-review of those papers authored by the Claimant and published in CSF. It appears that there were 58 articles published in 2008. The Claimant should identify the referees for each article because their qualifications, and the regularity with which they reviewed such articles, are issues upon which the Defendants' experts will need to comment. Furthermore, it will be necessary for the Defendants' counsel to cross-examine such reviewers as are being called by the Claimant as to why alleged faults or defects in those articles survived the relevant reviews.

  2. Secondly, further information is sought as to the place or places where CSF was administered between 2006 and 2008. This is relevant, first, to the issue of whether the Claimant has complied with his disclosure obligations. The Defendants' advisers are not in a position to judge whether a proportionate search has been carried out unless they are properly informed as to how many addresses and/or locations were involved. Secondly, the Defendants' proposed expert witnesses will need to know exactly how the CSF journal was run. This information should be provided.
It would therefore seem to be getting more and more difficult for anyone to argue that TR's methodology has uncovered a pocket of excellence in Alexandria.

Unfortunately, it is beginning to look as though THE will not only use much the same method as last time but will apply normalisation to other indicators as well.
But what about the other performance indicators used to compare institutions? Our rankings examine the amount of research income a university attracts and the number of PhDs it awards. For 2011-12, they will also look at the number of papers a university has published that are co-authored by an international colleague.
Don’t subject factors come into play here, too? Shouldn’t these also be normalised? We think so. So I am pleased to confirm that for the 2011-12 World University Rankings, Times Higher Education will introduce subject normalisation to a range of other ranking indicators.
This is proving very challenging. It makes huge additional demands on the data analysts at Thomson Reuters and, of course, on the institutions themselves, which have had to provide more and richer data for the rankings project. But we are committed to constantly improving and refining our methodology, and these latest steps to normalise more indicators evidence our desire to provide the most comprehensive and rigorous tables we can.
What this might mean is that universities spending modest amounts of money in fields where little money is usually spent would get a huge score. So what would happen if an eccentric millionaire left millions to establish a lavishly funded research chair in continental philosophy at Middlesex University? There are no doubt precautions that Thomson Reuters could take, but will they? The El Naschie business does not inspire much confidence that they will.

The reception of the 2010 THE WUR suggests that many in the academic world have doubts about the wisdom of using normalised citation data without considering the potential for gaming or statistical anomalies. But the problem may run deeper and involve citations as such. QS, THE's rival and former partner, has produced a series of subject rankings based on data from 2010. The overall results for each subject are based on varying combinations of the scores for academic opinion, employer opinion and citations per paper (not per faculty, as in the general rankings).

The results are interesting. Looking at citations per paper alone, we see that Boston College and Munich are jointly first in sociology. Rutgers is third for politics and international studies. MIT is third for philosophy (presumably Chomsky and co.). Stellenbosch is first for geography and area studies. Padua is first for linguistics. Tokyo Metropolitan University is second for biological sciences, and Arizona State University first.


Pockets of excellence or statistical anomalies? These results may not be quite as incredible as Alexandria in the THE rankings but they are not a very good advertisement for the validity of citations as a measure of research excellence.

It appears that THE have not yet made up their minds. There is still time to produce a believable and rigorous ranking system. But whatever happens, it is unlikely that citations, normalized or unnormalized, will continue to be the unquestioned gold standard of academic and scientific research.


    2 comments:

    1. Anonymous, 1:49 PM

      This is a useful overview, and it is indeed unfortunate that THE is unwilling to correct a demonstrably flawed methodology. Some comments:

      > CSF), published by the leading academic publishers, Elsevier

      Strictly speaking it was via Maxwell's Pergamon Press, with slightly different standards and contracts from the parent organization.

      > Articles in mathematics and theoretical physics do not get cited very much. ... Hardly any papers get even a single citation in the same year.

      This is certainly not true of theoretical physics, where according to SLAC-SPIRES many articles receive significant numbers of citations in their first year. It may be closer to correct in applied mathematics, where according to Arnold the citation numbers are much lower. Not coincidentally, it was in applied math, not physics, that Alexandria rated so anomalously highly.

      On the other hand, as you've correctly noted, El Naschie's and He's articles would not be recognized as either physics or mathematics by actual researchers in those fields.

    2. A very enjoyable read, Richard! A couple of details to round out the story. You say

      "When the rankings were published THE pointed out that Alexandria had once had a famous library and that a former student had gone on to the USA to eventually win a Nobel prize decades later."

      referring to the chemist Ahmed Zewail. El Naschie is becoming ever more unhinged: not content to lose lawsuits against Nature (and Die Zeit, which you didn't mention), he is suing Zewail for allegedly stealing his ideas about time and using them in a public lecture. He will lose this one too. It's absurd.

      Regarding El Naschie's claimed academic affiliations, you mentioned Cambridge, Frankfurt, Surrey and Cornell. He also claims affiliation with the Solvay Institute in Belgium, the home of the late Nobel laureate Ilya Prigogine, whom El Naschie calls "my teacher" with gross exaggeration. I believe that his only real academic appointment is a recent emeritus professorship at Alexandria University, very unwisely given to him by AU President Hend Hanafi, a Hosni Mubarak/Egyptian National Democratic Party loyalist like El Naschie. The AU physics department is embarrassed by it. To the best of my belief all other claimed affiliations are either completely fictitious, or as in the case of some Chinese universities that he claims, merely honorary.
