Sunday, September 19, 2010

Another Post on Citations (And I hope the last for a while)

I think it is now clear that the strange results for the citations indicator in the THE rankings are not the product of a systematic error or of several discrete errors. Rather, they result from problems with the inclusion of journals in the ISI indexes, a possible failure to exclude self-citations, problems with the classification of papers by subject and, most seriously, the interaction of a few highly cited papers with a small total number of papers. Taken together, these problems undermine the credibility and validity of the indicator and do little to promote confidence in the rankings as a whole.
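To see why that last problem matters, here is a minimal sketch in Python, with invented numbers, of how a single highly cited paper can dominate a citations-per-paper average when an institution's total output is small. The figures and names are illustrative assumptions, not data from the THE rankings.

```python
# Minimal sketch (invented numbers): one highly cited paper inflates a
# per-paper citation average far more at a small institution than at a large one.

def cites_per_paper(citations):
    """Average citations per paper, given a list of per-paper citation counts."""
    return sum(citations) / len(citations)

small_output = [1000] + [2] * 49    # 50 papers, one citation magnet
large_output = [1000] + [2] * 4999  # 5,000 papers, same citation magnet

print(cites_per_paper(small_output))  # 21.96 -- the outlier dominates
print(cites_per_paper(large_output))  # ~2.2  -- the outlier is diluted
```

The same arithmetic carries over to field-normalized versions of the indicator: dividing by a small paper count leaves the outlier's weight largely intact.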

2 comments:

  1. Anonymous (6:25 AM)

    This much-publicised Brand New THE Ranking is still wrong and not credible... Will governments rely on it when budgeting their education policies? Scary...

  2. Webometrics Ranking Editor (11:47 AM)

    The solution to the citation problem is pretty simple, and it was provided by the CWTS team in its Leiden Ranking. They called it the "brute force" approach: it is obtained by multiplying the "crown indicator" by the total number of papers (see the sketch below).

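For concreteness, here is a minimal sketch in Python of the calculation described in the comment above: a crown-style indicator (field-normalized citations per paper) multiplied by the total number of papers, which makes the measure size-dependent. The function names and toy numbers are my own illustrative assumptions, not the CWTS implementation.

```python
# Sketch of a "brute force" size-dependent indicator: the crown indicator
# (CPP/FCSm, i.e. mean citations per paper over the field-expected mean)
# multiplied by the number of papers. Names and numbers are illustrative.

def crown_indicator(citations, expected):
    """Mean citations per paper divided by the field-expected mean (CPP/FCSm)."""
    cpp = sum(citations) / len(citations)
    fcsm = sum(expected) / len(expected)
    return cpp / fcsm

def brute_force(citations, expected):
    """Crown indicator scaled by output size, rewarding volume as well as impact."""
    return crown_indicator(citations, expected) * len(citations)

# Toy numbers: four papers, one highly cited, in a field averaging 8 cites per paper.
paper_cites = [120, 3, 1, 0]
field_means = [8.0, 8.0, 8.0, 8.0]
print(crown_indicator(paper_cites, field_means))  # 3.875 (size-independent)
print(brute_force(paper_cites, field_means))      # 15.5  (size-dependent)
```

Multiplying by output size rewards volume as well as average impact, so a handful of highly cited papers at a small institution can no longer dominate the score on its own.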