Sunday, September 19, 2010

Perverse Incentives

Until we get a clear statement from Thomson Reuters, we have to assume that the citations indicator in the recent THE rankings was constructed by counting citations to articles published in the period 2004 - 2008, dividing these by the expected number of citations, and then dividing again by the total number of articles.
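
If that reading is right, the whole indicator comes down to one line of arithmetic. Here is a minimal sketch, assuming the construction speculated above; the function name and the construction itself are my guesses, not anything published by Thomson Reuters:

    # Speculated THE citations indicator -- an assumption, not a published formula.
    def speculated_score(citations, expected_citations, papers):
        # Normalise actual citations by the expected number for the field and
        # period, then divide by the university's total paper count.
        return (citations / expected_citations) / papers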

It seems, then, that universities could improve their score on this indicator by getting cited more often or by reducing the number of papers published in ISI-indexed journals. Doing both could bring remarkable results.

This seems to be what has happened in the case of Alexandria University, which, according to the new THE ranking, is fourth best in the world for research impact.

The university has accumulated a large number of citations to papers published by Mohamed El Naschie, mainly in two journals: Chaos, Solitons and Fractals, published by Elsevier, and the International Journal of Nonlinear Science and Numerical Simulation, published by the Israeli company Freund. El Naschie was editor of the first until recently and is co-editor of the second. Many of the citations are by El Naschie himself.

I am unable to judge the merits of El Naschie's work. I assume that, since he has been a Professor at Cambridge, Cornell and the University of Surrey and publishes in journals produced by two very reputable companies, his papers are of very high quality.

It is not enough, however, simply to get lots of citations. The actual-citations/expected-citations ratio will -- if this is what happened -- be divided by the total number of papers, and this is where the problem lies. If a university has very few papers in ISI-indexed journals in the relevant period, it will end up with a very good score; publish a lot of papers and your score goes way down. This probably explains why Warwick ranks 285th for research impact and LSE 193rd: there were just too many people writing papers that were above average but not way above average.
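
A toy comparison, using the speculated formula sketched above with invented numbers, shows how hard the final division punishes volume:

    # Invented numbers, for illustration only.
    def speculated_score(citations, expected_citations, papers):
        return (citations / expected_citations) / papers

    # A large producer: 5,000 papers, all cited at twice the expected rate.
    print(speculated_score(citations=200000, expected_citations=100000, papers=5000))  # 0.0004

    # A small producer: 50 papers, a few of them extraordinarily cited.
    print(speculated_score(citations=4000, expected_citations=500, papers=50))  # 0.16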

An article by David Glenn in the Chronicle of Higher Education discusses the perverse incentives of the new rankings. Here we have another: if a university simply stopped publishing for a year, its score on this indicator would go up, since it would still be accumulating citations to articles published in previous years.
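
Continuing the same speculative sketch with invented figures: freeze the paper count for a year while citations to the back catalogue keep arriving, and the score rises without anyone writing a word:

    # Invented figures; the same speculated formula as above.
    def speculated_score(citations, expected_citations, papers):
        return (citations / expected_citations) / papers

    # This year: 1,000 papers in the window have drawn 3,000 citations so far.
    print(speculated_score(citations=3000, expected_citations=2000, papers=1000))  # 0.0015

    # A year later, after a publishing pause: no new papers, but the old
    # ones have picked up 600 further citations.
    print(speculated_score(citations=3600, expected_citations=2000, papers=1000))  # 0.0018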

5 comments:

  1. Anonymous 10:24 PM

    Richard,
    As you can see from the following paragraph (http://www.timeshighereducation.co.uk/world-university-rankings/2010-2011/analysis-methodology.html), Thomson has normalised citations against each of their 251 subject categories (it's extremely difficult to get this data directly from WOS). They have great experience in this kind of analysis. To get an idea, check their in-cites website http://sciencewatch.com/about/met/thresholds/#tab3, where they have citation thresholds for the last 10 years against broad fields.

    Paragraph mentioned above:
    "Citation impact: it's all relative
    Citations are widely recognised as a strong indicator of the significance and relevance — that is, the impact — of a piece of research.
    However, citation data must be used with care as citation rates can vary between subjects and time periods.
    For example, papers in the life sciences tend to be cited more frequently than those published in the social sciences.
    The rankings this year use normalised citation impact, where the citations to each paper are compared with the average number of citations received by all papers published in the same field and year. So a paper with a relative citation impact of 2.0 is cited twice as frequently as the average for similar papers.
    The data were extracted from the Thomson Reuters resource known as Web of Science, the largest and most comprehensive database of research citations available.
    Its authoritative and multidisciplinary content covers more than 11,600 of the highest-impact journals worldwide. The benchmarking exercise is carried out on an exact level across 251 subject areas for each year in the period 2004 to 2008.
    For institutions that produce few papers, the relative citation impact may be significantly influenced by one or two highly cited papers and therefore it does not accurately reflect their typical performance. However, institutions publishing fewer than 50 papers a year have been excluded from the rankings.
    There are occasions where a groundbreaking academic paper is so influential as to drive the citation counts to extreme levels — receiving thousands of citations. An institution that contributes to one of these papers will receive a significant and noticeable boost to its citation impact, and this reflects such institutions' contribution to globally significant research projects."

    Dahman

  2. Hi. I am the one who posted to Scribd the El Naschie CV you linked to. The CV is fraud and nonsense, like everything else about El Naschie. You said:

    "I assume that since he has been a Professor at Cambridge, Cornell and the University of Surrey and publishes in journals produced by two very reputable companies that his papers are of a very high quality."

    You are badly, badly wrong. My blog, El Naschie Watch, is dedicated to debunking the crackpot charlatan El Naschie.

    Read this: Introduction to Mohamed El Naschie

    and also this, which references your website: Caltech, MIT, Princeton, Alexandria University

  3. You also write:

    "The university has accumulated a large number of citations to papers published by Mohamed El Naschie, mainly in two journals, Chaos, Solitons and Fractals, published by Elsevier, and the International Journal of Nonlinear Mathematics and Numerical Simulation, published by the Israeli company Freund. El Naschie was editor of the first until recently and is co-editor of the second. Many of the citations are by himself."

    Those two journals and their Editors-in-Chief are at the center of a citation scamming scandal. See this post: Christoph Drösser on impact-factor manipulation

  4. Anonymous 7:40 PM

    The ease of gaming the citation record has been considered in SIAM News,
    www.ima.umn.edu/~arnold/siam-columns/integrity-under-attack.pdf, including this excerpt:

    "El Naschie’s papers in CSF make 4992 citations, about 2000 of which are to papers published in CSF, largely his own. In 2007, of the 65 journals in the Thomson Reuters category “Mathematics, Interdisciplinary Applications,” CSF was ranked number 2.

    Another journal whose high impact factor raises eyebrows is the International Journal of Nonlinear Science and Numerical Simulation (IJNSNS), founded in 2000 and published by Freund Publishing House. For the past three years, IJNSNS has had the highest impact factor in the category “Mathematics, Applied.” There are a variety of connections between IJNSNS and CSF. For example, Ji-Huan He, the founder and editor-in-chief of IJNSNS, is an editor of CSF, and El Naschie is one of the two co-editors of IJNSNS; both publish copiously, not only in their own journals but also in each other's, and they cite each other frequently.

    Let me describe another element that contributes to IJNSNS's high impact factor. The Institute of Physics (IOP) publishes Journal of Physics: Conference Series (JPCS). Conference organizers pay to have proceedings of their conferences published in JPCS, and, in the words of IOP, “JPCS asks Conference Organisers to handle the peer review of all papers.” Neither the brochure nor the website for JPCS lists an editorial board, nor does either describe any process for judging the quality of the conferences. Nonetheless, Thomson Reuters counts citations from JPCS in calculating impact factors. One of the 49 volumes of JPCS in 2008 was the proceedings of a conference organized by IJNSNS editor-in-chief He at his home campus, Shanghai Donghua University. This one volume contained 221 papers, with 366 references to papers in IJNSNS and 353 references to He. To give you an idea of the effect of this, had IJNSNS not received a single citation in 2008 beyond the ones in this conference proceedings, it would still have been assigned a larger impact factor than any SIAM journal except for SIAM Review."

  5. Anonymous 2:10 AM

    typos "two many people" writing, may be "too many people".

    El Naschie is controversial. His CV says Professor, DAMTP, Cambridge, UK. Not sure whether this was or was not at the University of Cambridge.
