Wednesday, September 22, 2010

Selected Comments from Times Higher Education


Mike Reddin 17 September, 2010
World university rankings take national ranking systems from the ridiculous to the bizarre. Two of the most glaring problems are made more so by these latest meta-analyses.
Number One: R&D funding is scored not by its quality or contribution to learning or understanding but by the amount of money spent on that research; it ranks expensive research higher than cheap research; it ranks a study of 'many things' better than the study of a 'few things'; it ranks higher the extensive and expensive pharmacological trial than the paper written in tranquility over the weekend. I repeat, it does not score 'contribution to knowledge'.

Number Two. Something deceptively similar happens in the ranking of citations. We rank according to number alone - not 'worth' - not whether the paper merited writing in the first place, not whether we are the better for or the worse without it, not whether it adds to or detracts from the sum of human knowledge. Write epic or trash .... as long as it is cited, you score. Let me offer utter rubbish - the more of you that denounce me the better; as long as you cite my name and my home institution.

Which brings me full circle: the 'rankings conceit' equates research / knowledge / learning / thinking / understanding with institutions - in this case, universities and universities alone. Our ranking of student 'outcomes' (our successes/failure as individuals on many scales) wildly presumes that they flow from 'inputs' (universities). Do universities *cause* these outcomes - do they add value to those they have admitted? Think on't. Mike Reddin http://www.publicgoods.co.uk/



jorge Sanchez 18 September, 2010
This is ridiculous. LSE was placed 67th in the previous year, and THE decided to end relations with QS because of this issue. Now that THE is no longer teaming up with QS, how can you possibly explain the anomaly of placing LSE at 86th in the table?


Mark 18 September, 2010
where is the "chinese university of Hong Kong in the table??? it is no longer in the top 200 best universities....

last year was in the top 50 now is off the table??? is this a serious ranking?????


Of course it's silly 18 September, 2010
Just look at the proposition that teaching is better if you have a higher proportion of doctoral students to undergraduate students.

This is just plainly silly, as 10 seconds thinking about the reputation of teaching in the US will tell you: liberal arts colleges offer extraordinary teaching in the absence of PhD programmes.



Matthew H. Kramer 18 September, 2010
Though some tiers of these rankings are sensible, there are some bizarre anomalies. Mirabile dictu, the University of Texas doesn't appear at all; the University of Virginia is ridiculously low at 72; NYU is absurdly low at 60; the University of Hong Kong is preposterously overrated at 21. Moreover, as has been remarked in some of the previous comments -- and as is evident from a glance at the rankings -- the criteria hugely favor technical institutes. The rank of MIT at 3 is credible, because MIT is outstanding across the board. However, Cal Tech doesn't belong at 2, and Imperial (which has no programs at all in the humanities and social sciences) certainly doesn't belong at 9. Imperial and especially Cal Tech are outstanding in what they do, but neither of them is even close to outstanding across the gamut of subjects that are covered by any full-blown university. I hope that some of these anomalies will be eliminated through further adjustments in the criteria. The exclusion of Texas is itself sufficiently outlandish to warrant some major modifications in those criteria.



Matthew H. Kramer 18 September, 2010
Weird too is the wholesale exclusion of Israeli universities. Hebrew University, Tel Aviv University, and Technion belong among the top 200 in any credible ranking of the world's universities.


Neil Fazel 19 September, 2010
No Sharif, no U. Texas, no Technion. Another ranking to be ignored.


OZ academic 20 September, 2010
While the criteria seem to be OK, although they might be debated, how the statistical analyses are carried out and how the data are collected are the issues that determine the validity of the poll. The omission of the Chinese University of Hong Kong, and the inclusion of the Hong Kong Baptist University and the Hong Kong Polytechnic University in the world's top 200 universities, seems very "mysterious" to me. As I understand it, the Chinese University of Hong Kong is of a more or less similar standard in teaching and research to the University of Hong Kong and the Hong Kong University of Science and Technology, but has a slight edge over the Hong Kong Baptist University and the Hong Kong Polytechnic University. I wonder if there were mix-ups in the data collection process. If so, then this poll is open to dispute not only on its criteria of assessment but also on the accuracy of its data collection and analysis.
Texas Opted Out

From the Texas Tribune

"University officials said UT's [University of Texas] absence is not due to an epic fall — they simply declined to participate.

Kristi Fisher, director of UT’s Office of Information Management and Analysis, said they opted out for two reasons. First, budget cuts have caused resource constraints, and projects must be chosen carefully. Also, the survey was using new methodology for the first time, and there was talk it might be suspect. “The last thing we wanted to do was spend a lot of resources to participate in a survey that might have flawed methodology behind it,” Fisher said. "
What Happens When You Set the THE Rankings Citations Indicator to Not Important Continued

This is a selection of the universities that go up when citations is set as not important. The number of places each rose is on the right.

Tokyo University 10
Korean Advanced Institute of Science and Technology 38
Osaka 70
Warwick 78
Kyushu 100
Sung Kyun Kwan 100
Texas A & M 103
Sao Paulo 107
Surrey 123
Shanghai Jiao Tong 158

And a selection of those that fell when citations was set to not important, again with the number of places on the right.

Sussex 83
University College Dublin 99
UC Santa Cruz 102
Tasmania 106
Royal Holloway 119
Pompeu Fabra 142
Bilkent 154
Kent State 160
Hong Kong Baptist 164
Alexandria 234
The THE World University Rankings With Citations Set to Not Important

The THE rankings iPhone app has the excellent feature of allowing users to adjust the weightings of the five indicator groups. This is the top 200 when the citations (research impact) indicator is set to 'not important'. The number in brackets on the right is the position in the official ranking.


  1. Harvard (1)
  2. Caltech (2)
  3. MIT (3)
  4. Stanford (4)
  5. Princeton (5)
  6. Imperial College London (9)
  7. Cambridge (6)
  8. Oxford (6)
  9. Yale (10)
  10. UC Berkeley (8)
  11. UC Los Angeles (11)
  12. Johns Hopkins (13)
  13. Swiss Federal Institute of Technology Zurich (15)
  14. University of Michigan (15)
  15. Chicago (12)
  16. Tokyo (26)
  17. Cornell (14)
  18. Toronto (17)
  19. University College London (22)
  20. Columbia (18)
  21. University of Pennsylvania (19)
  22. University of Illinois-Urbana (33)
  23. McGill (35)
  24. Carnegie Mellon (20)
  25. Hong Kong (21)
  26. Georgia Institute of Technology (27)
  27. Kyoto (57)
  28. British Columbia (30)
  29. University of Washington (23)
  30. National University of Singapore (34)
  31. Duke (24)
  32. Peking (37)
  33. University of North Carolina (30)
  34. Karolinska Institute (34)
  35. Tsinghua University, Beijing (58)
  36. Northwestern University (25)
  37. Pohang University of Science and Technology (28)
  38. UC San Diego (32)
  39. Melbourne (36)
  40. UC Santa Barbara (29)
  41. Korean Advanced Institute of Science and Technology (79)
  42. UC Davis (54)
  43. University of Massachusetts (56)
  44. Washington University St Louis (38)
  45. Edinburgh (40)
  46. Australian National University (43)
  47. Minnesota (52)
  48. Purdue (106)
  49. Vanderbilt (51)
  50. LSE (86)
  51. Ecole Polytechnique (39)
  52. Case Western Reserve (65)
  53. Wisconsin (43)
  54. Ohio State (66)
  55. Delft University of Technology (151)
  56. Sydney (71)
  57. Brown (55)
  58. EPF Lausanne (48)
  59. Tokyo Institute of Technology (112)
  60. Osaka (130)
  61. Catholic University of Leuven (119)
  62. University of Virginia (72)
  63. Tohoku (132)
  64. Ecole Normale Superieure Paris (64)
  65. Tufts (53)
  66. University of Munich (61)
  67. Manchester (87)
  68. Hong Kong University of Science and Technology (41)
  69. Emory (61)
  70. Gottingen (43)
  71. Seoul National University (109)
  72. Pittsburgh (54)
  73. Rutgers (105)
  74. New York University (60)
  75. Yeshiva (68)
  76. University of Southern California (73)
  77. Alberta (127)
  78. Uppsala (147)
  79. UC Irvine (49)
  80. University of Science and Technology of China (49)
  81. Queensland (81)
  82. Ghent (124)
  83. Zurich (90)
  84. King’s College London (77)
  85. Eindhoven University of Technology (114)
  86. Ruprecht Karl University of Heidelberg (83)
  87. National Chiao Tung University (181)
  88. Rice (47)
  89. Lund (89)
  90. University of Utah (83)
  91. Royal Institute of Technology Sweden (193)
  92. Bristol (68)
  93. McMaster (93)
  94. Boston (59)
  95. Rensselaer Polytechnic Institute (104)
  96. University Of Colorado (67)
  97. Montreal (138)
  98. University of Iowa (132)
  99. National Taiwan University (115)
  100. Leiden (124)
  101. Notre Dame (63)
  102. University of Arizona (95)
  103. George Washington (103)
  104. Texas A & M (207)
  105. Georgetown (164)
  106. Lomonosov Moscow State (237)
  107. National Tsing Hua University (107)
  108. Geneva (118)
  109. Birmingham (145)
  110. Southampton (90)
  111. Wageningen (114)
  112. Medical College of Georgia (158)
  113. Technical University of Munich (101)
  114. New South Wales (152)
  115. Illinois-Chicago (197)
  116. Michigan State (122)
  117. Trinity College Dublin (76)
  118. Tokyo Medical and Dental (217)
  119. Nanyang Technological (174)
  120. Technical University of Denmark (122)
  121. Sheffield (137)
  122. York (81)
  123. St Andrews (103)
  124. Nanjing (120)
  125. Lausanne (136)
  126. Glasgow (128)
  127. VU Amsterdam (13()
  128. Twente (185)
  129. Utrecht (143)
  130. Sung Kyun Kwan (230)
  131. Stony Brook (78)
  132. Wake Forest (90)
  133. Helsinki (102)
  134. Basel (95)
  135. Freiburg (132)
  136. Adelaide (73)
  137. Nagoya (206)
  138. Ruhr University Bochum
  139. Sao Paulo (232)
  140. Free University of Berlin (212)
  141. Maryland College Park (98)
  142. Warwick (220)
  143. Technion (221)
  144. Iowa State (156)
  145. Chalmers University of Technology (223)
  146. Dartmouth (99)
  147. RWTH Aachen (182)
  148. Kansas (232)
  149. Swedish University of Agricultural Sciences (199)
  150. Groningen (170)
  151. State University of Campinas (248)
  152. Nottingham (174)
  153. Leeds (168)
  154. Penn State (109)
  155. Maastricht (209)
  156. Zhejiang (197)
  157. Humboldt (178)
  158. Vienna (195)
  159. Hong Kong Polytechnic (149)
  160. Queen Mary London (120)
  161. Aarhus (167)
  162. Sussex (79)
  163. University of Georgia (246)
  164. National Sun Yat-Sen (163)
  165. William and Mary (75)
  166. Kiel (210)
  167. Lancaster (214)
  168. Indiana University (156)
  169. Newcastle, UK (152)
  170. UC Santa Cruz (68)
  171. Aberdeen (149)
  172. Durham
  173. University College Dublin
  174. Liverpool (165)
  175. Dalhousie (193)
  176. University of Delaware (159)
  177. UC Riverside (117)
  178. University of Amsterdam (165)
  179. Surrey (302)
  180. Konstanz (186)
  181. University of South Carolina (214)
  182. Wurzburg (168)
  183. Cape Town (107)
  184. Tokushima (317)
  185. Reading (210)
  186. Stockholm (129)
  187. University of Waterloo, Canada (267)
  188. Washington State University (264)
  189. Copenhagen (177)
  190. Hokkaido (293)
  191. Hawaii (105)
  192. Yonsei (190)
  193. Leicester (216)
  194. Kyushu (294)
  195. Bergen (135)
  196. Shanghai Jiao Tong (258)
  197. Pierre and Marie Curie (140)
  198. ENS De Lyon (100)
  199. Erasmus (159)
  200. Tromso (227)


Comments

Apart from spam about Viagra and where to meet beautiful Russian women and so on and one racist diatribe I have until now published every comment sent to this post. However, I think it is time to indicate a change in policy. I will publish any comment providing it does not contain negative general comments about the character of a specific individual. Thus I will allow critical comments about citation patterns or the scientific validity of specific research but not, for example, a statement that someone is a "fraud", unless convicted of fraud in court.

Tuesday, September 21, 2010

From The Students' Room 3

Click here.

From AfghanistanBananistan

I am intrigued by the LSE's ranking.

TBH, it does not really matter where a university ranks in the world in relation to other global universities, it just concerns me when UK universities have odd rankings amongst their own. No one would complain if LSE came 4/5th in the UK and 86th in the world, or Warwick came 8th in UK and 240th in the world.

The RAE claims that LSE has the highest proportion of world leading research in the UK, and is in the top 3 in every type of analysis of the results (often tied with Oxford). The RAE may not be perfect, but for LSE to suddenly drop in the THES ranking (esp citations) seems really shocking, and perhaps wrong.

After all, someone from THES wrote a few months back that LSE ranking 66th was 'clearly a big mistake'. This was when THES clearly stated that citations hindered social science and arts institutions. You mean to tell me that LSE ranks 66th (80th in this year's QS ranking) when citations are weighted against social sciences, yet LSE drops to 86th when citations are not weighted to disadvantage social science institutions?

I guess my main point is that in the QS ranking LSE has ranked in the top 5 in the world every year for social sciences, with citations the key element of the ranking. How come LSE can have the 4th best citation count in the social sciences, yet when taken into account in an overall ranking, where citations are normalised, it has such a low citation count in the THES ranking? I understand Warwick, for it may rank 44th for humanities, but it has to take into account all its own faculties, which may drag it down - yet LSE only has social sciences.

I find it hard to conceptualise that Sussex and York universities have more pervasive research throughout ALL their departments than just LSE for social science. Bear in mind that LSE, as a specialist institution, is one of the biggest (probably the biggest) social science departments in the world, with more dedicated social science researchers than anywhere else. Therefore, how come its research is not cited more? What are the 50-odd fellows of the British Academy doing, and should they not have been appointed if their research is not influential?

In short, does THES not think that something is up with its methodology, just as it admitted last year? Just because some said there was evidence of the world being flat does not make it true. I find it hard to believe that the university that created many fields in the social sciences and has won nearly a quarter of all Nobel Prizes in economics can rank so badly on its citations score. After all, can the fellow social science researchers who voted LSE the 6th best in the world for influence in IR be so wrong, when it is the only university to compete with the US economics departments in all economics world rankings?

The same logic applies to Warwick vs. Dundee and Lancaster.

If a ranking of the world's best ever football players only had Maradona at number 20 and Pele at 15, wouldn't the ranking lose credibility? You could rank them purely by influence measured by the number of goals scored. Pele and Maradona would not rank that well, but we regard them as the best players because we have seen them with our own eyes.

P.S. I don't even go to LSE.
From The Students' Room 2

Click here

Dear Phil Baty,

I sent you an email to inquire about the ranking performance of Taiwan's universities.
I am waiting for your reply.
Please share your opinion.
Thanks!

best regards,

yuching
From the Students' Room

There are some interesting comments about the latest THE rankings at their "The Students' Room"

Here is one from Martin

I appreciate that measuring the research impact of an institution is difficult. Unfortunately, the THE seems to have got it quite badly wrong this time. The best evidence for this is not that Warwick doesn't make it into the top 200 (even though this hurts; yes I am faculty at Warwick), but the fact that the University of Alexandria makes it number 4, based on the output of one single person.


Some suggestions follow:


1. Do not count self-citations. Even better, it is common sense that a citation by someone further from your field should count more than a citation by your former PhD student. Of course, these things are difficult to figure out in an automated way. But one could for example use the collaboration distance (coauthors have distance one, coauthors of coauthors have distance two, etc) to weight citations (with a cap at 4, say).

2. Quality of research is paramount. As already pointed out, it is easy to get many citations for mediocre research if there are sufficiently many other mediocre researchers working in the same area. Again, this is vastly more common than you may think, for the simple reason that it is easier to perform mediocre research than world-class research. Furthermore, you get more recognition as counted by basic citation count, so why not do it?

One way of taking this into account is to give higher weight to citations coming from articles published in highly respected journals. (Similarly, when measuring "research output", higher weight should be given to articles published in journals with high reputation.)

However, please *DO NOT* use the impact factor as a measure of the quality of a journal, as it can be (and is!) very easily manipulated, as the story of the journal "Chaos, Solitons and Fractals" shows. Instead, the only reliable way of assessing the quality of a journal within a given field is to ask researchers in that field to provide their own rankings. Yes, this seems subjective, but unfortunately that's all you are ever going to get, and I can assure you that you will get a very consistent picture within each area. The fact that the "Annals of Mathematics" is the most respected journal in mathematics simply cannot be measured in terms of impact factor.

3. Count current citations to older work. If someone's article turns out to spawn an entire new field of research five years later, it will not show up at all in the current metric. This makes simply no sense. Of course, this doesn't happen all that often, but the reason why top institutions have a reputation is precisely because of those instances in which it happens. Furthermore, there are areas of research (like mathematics) in which the "lifespan" of a good article is measured in decades, which goes way beyond the two to five years that you use as a rule. Counting current citations to older articles would be one small but absolutely essential step to correct this.

4. Measure the total impact of the institution in a field, and not its "average" impact. The only way I could see that the output of one single person can count so much is that this person somehow has an abnormally high weight, probably due to the fact that there is very little research output from the U. of Alexandria. If this suspicion is indeed correct (I hope that I am wrong on this one), then this would effectively mean that universities are penalised by having large (and influential!) departments and should rather strive to have only very few but quite prolific researchers on their payroll.

There is probably more, but I am getting hungry now ;-) I very much hope that you will take these comments to heart. Best wishes,

Martin
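
Martin's first suggestion can, at least in principle, be made operational. Here is a minimal Python sketch of collaboration-distance weighting, purely for illustration (the co-authorship graph and the author names are invented): a breadth-first search over a co-authorship graph gives the distance between cited and citing author, capped at 4, so that self-citations contribute nothing and citations from more distant researchers count for more.

```python
from collections import deque

def collaboration_distance(coauthor_graph, source, target, cap=4):
    """Breadth-first search over a co-authorship graph: co-authors are at
    distance 1, co-authors of co-authors at distance 2, and so on.
    Unconnected (or very distant) authors get the cap."""
    if source == target:
        return 0
    seen = {source}
    queue = deque([(source, 0)])
    while queue:
        author, dist = queue.popleft()
        if dist >= cap:
            break
        for co in coauthor_graph.get(author, ()):
            if co == target:
                return dist + 1
            if co not in seen:
                seen.add(co)
                queue.append((co, dist + 1))
    return cap

def weighted_citations(cited_author, citing_authors, coauthor_graph, cap=4):
    """Weight each citation by collaboration distance, so self-citations
    (distance 0) add nothing and distant citers count for more."""
    return sum(collaboration_distance(coauthor_graph, cited_author, c, cap)
               for c in citing_authors)

# Invented example: B co-authors with A, C co-authors with B only, D is unconnected.
graph = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}
print(weighted_citations("A", ["A", "B", "C", "D"], graph))  # 0 + 1 + 2 + 4 = 7
```

Whether anything like this could be computed at the scale of a global ranking is another question, but the idea itself is not hard to specify.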
Back to Citations

I was hoping to get away from the citations indicator in the THE rankings for a while but there are several comments here and elsewhere that need to be discussed.

First, I had assumed that the very high citation scores for Alexandria, and to a lesser extent other universities, were the result of a large number of citations that, while perhaps excessive, were from reputable scholars. Many of the citations to Alexandria University papers are to and from papers by Mohamed El Naschie. I assumed that his CV was evidence that he was a distinguished and esteemed scientist. Looking closely at his CV, however, there seem to be a number of points that require clarification. There are references to academic appointments for a single calendar year, not the academic year as one would expect. There is a reference to a Professorship at "DAMTP, Cambridge" but neither the university nor a college is mentioned. Also, there seems to be a period when El Naschie was a professor simultaneously at the Free University of Brussels, DAMTP Cambridge and the University of Surrey.

I hope that these points can be clarified. The TR citations indicator would still be a problem even if it were being skewed by heavy citation of groundbreaking research, but it would be more of a problem if there were any doubts, whether or not justified, about the excellence of that research.

Monday, September 20, 2010

Alexandria Gets the News

The news about being among the top 200 universities in the world has finally reached Alexandria. The reaction is very predictable:


"“I believe we are well deserving of being on the list,” professor of economics
Amr Hussein told Bikya Masr. “We have worked hard to improve our system of
education and it is showing that we are succeeding in doing so.”...

Hend Hanafi, President of Alexandria University, told local media that she is proud
of the ranking and hopes the university will continue to make efforts to
consistently improve the quality of education at the university."

Will somebody please send the President an account of what happened to Universiti Malaya and its Vice-Chancellor after THE and QS put them in the top 100 in 2004.
Omissions

Several writers have noticed the absence of any Israeli universities from the top 200 universities. In the QS rankings this year there were three. Those who have downloaded the iPad app will have noticed that there are only two in the top 400. So what happened to Tel Aviv and the Hebrew University of Jerusalem?

There is an article about this in Haaretz.
"Haaretz has learned that most Israeli universities were not on the list because they failed to respond to repeated requests for information, including on faculty and students, which is necessary for the listing.

TAU and the Hebrew University say that they never received such a request from THE. According to THE, only the Technion and Bar-Ilan University responded with information, but they were ranked 221 and 354 respectively.

As for the other universities, the editor of the ranking, Phil Baty, told Haaretz that although it is upsetting for Israel, he hoped that the Israeli universities would recognize the amount of serious work invested in creating the ranking and the degree to which the methodology was transparent, and would participate in the initiative, like other universities have done. He also expressed certainty that next year "they will be included."

Didn't do their homework

THE says that more than 6,000 universities participated in the ranking and most provided the necessary information.

A spokesperson for the Hebrew University responded that contrary to the claim of the survey's editors, "following an examination we did not find any such request [for information]. When we asked for the correspondence to the university on the subject, they could not provide it. The university is saddened by the fact that the editors of the ranking did not carry out their work responsibly, and thus harmed the university." "
There are other surprising omissions, such as all the Indian Institutes of Technology, the University at Buffalo (SUNY) and the Catholic University of Louvain (the French one -- the Dutch one is there at 120).

Sunday, September 19, 2010

Another Post on Citations (And I hope the last for a while)

I think that it is clear now that the strange results for the citations indicator in the THE rankings are not the result of a systematic error or several discrete errors. Rather, they result from problems with the inclusion of journals in the ISI indexes, possibly a failure to exclude self-citations, problems with the classification of papers by subject and, most seriously, the interaction of a few highly cited papers with a small number of papers overall. Taken together, they undermine the credibility and validity of the indicator and do little to promote confidence in the rankings as a whole.
More on the THE Citations Indicator

See this comment on a previous post:


As you can see from the following paragraph (http://www.timeshighereducation.co.uk/world-university-rankings/2010-2011/analysis-methodology.html), Thomson has normalised citations against each of their 251 subject categories (it's extremely difficult to get this data directly from WOS). They have great experience in this kind of analysis. To get an idea, check their in-cites website http://sciencewatch.com/about/met/thresholds/#tab3 where they have citation thresholds for the last 10 years against broad fields.

Paragraph mentioned above:
"Citation impact: it's all relative
Citations are widely recognised as a strong indicator of the significance and relevance — that is, the impact — of a piece of research.
However, citation data must be used with care as citation rates can vary between subjects and time periods.
For example, papers in the life sciences tend to be cited more frequently than those published in the social sciences.
The rankings this year use normalised citation impact, where the citations to each paper are compared with the average number of citations received by all papers published in the same field and year. So a paper with a relative citation impact of 2.0 is cited twice as frequently as the average for similar papers.
The data were extracted from the Thomson Reuters resource known as Web of Science, the largest and most comprehensive database of research citations available.
Its authoritative and multidisciplinary content covers more than 11,600 of the highest-impact journals worldwide. The benchmarking exercise is carried out on an exact level across 251 subject areas for each year in the period 2004 to 2008.
For institutions that produce few papers, the relative citation impact may be significantly influenced by one or two highly cited papers and therefore it does not accurately reflect their typical performance. However, institutions publishing fewer than 50 papers a year have been excluded from the rankings.
There are occasions where a groundbreaking academic paper is so influential as to drive the citation counts to extreme levels — receiving thousands of citations. An institution that contributes to one of these papers will receive a significant and noticeable boost to its citation impact, and this reflects such institutions' contribution to globally significant research projects."


The quotation is from the bottom of the methodology page. It is easy to miss since it is separate from the general discussion of the citations indicator.

I will comment on Simon Pratt's claim that "An institution that contributes to one of these papers will receive a significant and noticeable boost to its citation impact, and this reflects such institutions' contribution to globally significant research projects."

First, were self-citations included in the analysis?

Second, do institutions receive the same credit for contributing to a research project by providing one out of twenty co-authors as they would for contributing all of them?

Third, since citation scores vary from one subject field to another, a paper will get a higher impact score if it is classified under a subject that typically receives few citations than under one in which citations are plentiful.

Fourth, the obvious problem that undermines the entire indicator is that the impact scores are divided by the total number of papers. A groundbreaking paper with thousands of citations would make little difference to Harvard. Change the affiliation to a small college somewhere and it would stand out (providing the college could reach 50 papers a year).

This explains something rather odd about the data for Alexandria University. Mohamed El Naschie has published many papers with several different affiliations. Yet the many citations to these papers produced a dramatic effect only for Alexandria. This, it seems, was because Alexandria's total number of papers was so low that his papers dominated its score.
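
To make this arithmetic concrete, here is a toy Python sketch. The formula is only my working assumption about what was done, and the numbers are invented: each paper's citations are divided by the expected citations for its field and year, the ratios are averaged over all of an institution's papers, and the same exceptional paper is then added to a large producer and to a small one.

```python
def assumed_citation_impact(citations, expected):
    """Working assumption only: each paper's citations are divided by the expected
    citations for its field/year, and the per-paper ratios are then averaged."""
    ratios = [c / e for c, e in zip(citations, expected)]
    return sum(ratios) / len(ratios)

# Invented figures: a large producer with 20,000 papers and a small one with 60,
# each publishing average papers (ratio 1.0), then each gaining one exceptional
# paper with 1,000 citations against an expected 10 (ratio 100).
big_base   = assumed_citation_impact([10] * 20000, [10] * 20000)           # 1.0
big_plus   = assumed_citation_impact([10] * 20000 + [1000], [10] * 20001)  # ~1.005
small_base = assumed_citation_impact([10] * 60, [10] * 60)                 # 1.0
small_plus = assumed_citation_impact([10] * 60 + [1000], [10] * 61)        # ~2.6

print(big_base, big_plus)      # the exceptional paper is diluted by 20,000 others
print(small_base, small_plus)  # the same paper dominates a small output
```

Under this assumption the exceptional paper barely moves the large institution but more than doubles the score of the small one, which is the pattern that seems to have benefited Alexandria.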
Highlights from the Research Impact (Citations) Indicator


One very good thing to emerge from the current round of rankings is the iPhone/iPad apps from THE and QS. The THE app is especially helpful since it contains the scores for the various indicators for each of 400 universities. It is therefore possible to construct a ranking for research impact as measured by citations, which gets nearly one third of the weighting.

Some highlights

1st Caltech
4th Alexandria University
9th Harvard
10th UC Santa Barbara
13th Hong Kong Baptist University
20th Bilkent
23rd Oxford
27th Royal Holloway
31st Johns Hopkins
41st University of Adelaide
45th Imperial College
65th Australian National University
84th Kent State
110th McGill
143rd Tokyo Metropolitan University
164th Tokyo University
285th Warwick
302nd Delft University of Technology
368th South Australia
Perverse Incentives

Until we get a clear statement from Thomson Reuters we have to assume that the citations indicator in the recent THE rankings was constructed by counting citations to articles published in the period 2004 - 2008, dividing these by the expected number of citations and then dividing again by the total number of articles.

It seems then that universities could improve their score on this indicator by getting cited more often or by reducing the number of papers published in ISI indexed journals. Doing both could bring remarkable results.

This seems to be what has happened in the case of Alexandria University, which, according to the new THE ranking, is fourth best in the world for research impact.

The university has accumulated a large number of citations to papers published by Mohamed El Naschie, mainly in two journals: Chaos, Solitons and Fractals, published by Elsevier, and the International Journal of Nonlinear Sciences and Numerical Simulation, published by the Israeli company Freund. El Naschie was editor of the first until recently and is co-editor of the second. Many of the citations are self-citations.

I am unable to judge the merits of El Naschie's work. I assume that, since he has been a Professor at Cambridge, Cornell and the University of Surrey and publishes in journals produced by two very reputable companies, his papers are of a very high quality.

It is not enough, however, simply to get lots of citations. The actual-citations/expected-citations number will -- if this is what happened -- be divided by the total number of papers. And this is where there is a problem. If a university has very few papers in ISI journals in the relevant period, it will end up with a very good score. With a lot of papers, the score goes way down. This probably explains why Warwick ranks 285th for research impact and LSE 193rd: there were just too many people writing papers that were above average but not way above average.

An article by David Glenn in the Chronicle of Higher Education talks about the perverse incentives of the new rankings. Here, we have another. If a university simply stopped publishing for a year its score on this indicator would go up since it would still be accumulating citations for articles published in previous years.

Saturday, September 18, 2010

More on the Citations Indicator in the THE Rankings
I am copying the whole of a comment to the previous post since it might be the key to the strange results of the citations indicator.

Perhaps someone from Thomson Reuters can confirm that this is the method they were using.
Pablo said...

My bet is that TR uses the Leiden "Crown indicator" since this is what is embodied in their product InCites.

To cut it short, each paper is linked to a subdiscipline, a type of publication (letter, review, ...) and a year of publication. With this data for the whole world, it is easy to calculate the expected number of citations for a paper of a given type, in a given discipline, in a given year.
For a set of papers (e.g. all the papers of Alexandria university), the indicator is calculated as Sum(received citations)/Sum(expected citations).

This number can become very high if you have a small number of papers or if you look only at recent papers (if, on average, you expect 0.1 citations for a recent paper in math, a single citation will give you a score of 10 for this paper!)

Note that Leiden has recently decided to change its favorite indicator to a mean(citations received/citations expected), which gives less weight to a few highly cited papers in a set. But it seems that TR has not yet implemented this new indicator.

Note also that, in order to avoid the excessive weight given to a few papers in a small set, Leiden publishes its own ranking of universities with thresholds on the total number of papers published.
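
For concreteness, here is a small Python sketch of the two aggregation schemes Pablo describes, with invented figures; neither formula is confirmed as the one Thomson Reuters actually used. It includes his example of a very recent mathematics paper with an expected 0.1 citations, where a single citation already gives a per-paper score of 10.

```python
def crown_indicator(citations, expected):
    """Older Leiden 'crown' indicator: total citations received divided by total expected."""
    return sum(citations) / sum(expected)

def mean_ratio_indicator(citations, expected):
    """Newer Leiden variant: the per-paper received/expected ratios are averaged."""
    ratios = [c / e for c, e in zip(citations, expected)]
    return sum(ratios) / len(ratios)

# Invented set: nine papers cited exactly as expected, plus one very recent
# mathematics paper expected to have only 0.1 citations but already cited once,
# which on its own scores 1 / 0.1 = 10.
received = [1.0] * 9 + [1.0]
expected = [1.0] * 9 + [0.1]

print(crown_indicator(received, expected))       # 10 / 9.1  = approx. 1.10
print(mean_ratio_indicator(received, expected))  # 19 / 10   = 1.90
```

The two aggregations can diverge noticeably on a small set of papers, which is presumably why Leiden applies thresholds on the total number of papers in its own rankings.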

Friday, September 17, 2010

The Citations Indicator in the THE World University Rankings

I am sure that many people waited for the new Times Higher Education rankings in the hope that they would be a significant improvement over the old THE-QS rankings.

In some ways there have been improvements and if one single indicator had been left out the new rankings could have been considered a qualified success.

However there is a problem and it is a big problem. This is the citations indicator, which consists of the number of citations to articles published between 2004 and 2008 in ISI indexed journals divided by the number of articles. It is therefore a measure of the average quality of articles since we assume that the more citations a paper receives the better it is.

Giving nearly a third of the total weighting to research impact is questionable. Giving nearly a third to just one of several possible indicators of research impact is dangerous. Apart from anything else, it means that any errors or methodological flaws might undermine the entire ranking.

THE have been at pains to suggest that one of the flaws in the old rankings was that the failure to take account of different citation patterns in different disciplines meant that universities with strengths in disciplines such as medicine, where citation is frequent, did much better than those that are strong in disciplines such as philosophy, where citation is less common. We were told that the new data would be normalized by disciplinary group, so that a university with a small number of citations in the arts and humanities could still do well if that number was relatively high compared with the number of citations for the highest scorer in that disciplinary cluster.

I think we can assume that this means that in each of the six disciplinary groups, the number of citations per paper was calculated for each university. Then the mean for all universities in the group was calculated. Then Z-scores were calculated, that is, the number of standard deviations from the mean, and these were rescaled so that the top-scoring university received a score of 100. Finally, the score for the whole indicator was found by calculating the mean score across the six disciplinary groups.

The crucial point here is the rather obvious one that no university can get more than 100 for any disciplinary group. If it were otherwise, then Harvard, MIT and Caltech would be getting scores well in excess of 100.
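
To make the assumed procedure concrete, here is a rough Python sketch of that reading of the methodology. It is only my reconstruction with invented figures; the exact rescaling Thomson Reuters uses has not been published in this detail.

```python
from statistics import mean, pstdev

def group_scores(cpp_by_university):
    """One disciplinary group: citations per paper -> Z-scores -> rescaled so the
    top scorer in the group gets exactly 100. The rescaling (mean to 50, top to
    100) is an assumption, not TR's published procedure."""
    values = list(cpp_by_university.values())
    mu, sigma = mean(values), pstdev(values)
    z = {u: (v - mu) / sigma for u, v in cpp_by_university.items()}
    top = max(z.values())
    return {u: 50 + 50 * zu / top for u, zu in z.items()}

def citations_indicator(groups):
    """Average the per-group scores across disciplinary groups. Because each
    group is capped at 100, the overall score cannot exceed 100, and a score
    near 100 requires being at or near the top of every group."""
    per_group = [group_scores(g) for g in groups]
    return {u: mean(g[u] for g in per_group) for u in groups[0]}

# Invented citations-per-paper figures for just two groups, to keep it short.
physical = {"Cornell": 16.3, "Alexandria": 6.5, "Other": 4.0}
social   = {"Cornell": 8.0,  "Alexandria": 4.2, "Other": 3.0}
print(citations_indicator([physical, social]))  # Cornell 100, Alexandria well below
```

Under this reading, a stellar performance in one group cannot, on its own, produce an overall score close to 100.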

So, let us look at some of the highest scores for citations per paper. First, the University of Alexandria, which is not listed in the ARWU top 500, is not ranked by QS, and is ranked 5,882nd in the world by Webometrics.

The new rankings put Alexandria in 4th place in the world for citations per paper. This meant that with the high weighting given to the citations indicator the university achieved a very respectable overall place of 147th.

How did this happen? For a start I would like to compare Alexandria with Cornell, an Ivy League university with a score of 88.1 for citations, well below Alexandria’s.

I have used data from the Web of Science to analyse citation patterns according to the disciplinary groups indicated by Thomson Reuters. These scores may not be exactly those calculated by TR since I have made some instantaneous decisions about allocating subjects to different groups and TR may well have done it differently. I doubt though that it would make any real difference if I put biomedical engineering in clinical and health subjects and TR put it in engineering and technology or life sciences. Still I would welcome it if Thomson Reuters could show how their classification of disciplines into various groups produced the score that they have published.

So where did Alexandria’s high score come from? It was not because Alexandria does well in the arts and humanities. Alexandria had an average of 0.44 citations per paper and Cornell 0.85.

It was not because Alexandria is brilliant in the social sciences. It had 4.21 citations per paper and Cornell 7.98.

Was it medicine and allied disciplines? No. Alexandria had 4.97 and Cornell 11.53.

Life sciences? No. Alexandria had 5.30 and Cornell 13.49.

Physical Sciences? No. Alexandria had 6.54 and Cornell 16.31.

Engineering, technology and computer science? No. Alexandria had 6.03 and Cornell 9.59.

In every single disciplinary group Cornell is well ahead of Alexandria. Possibly, TR did something differently. Maybe they counted citations to papers in conference proceedings but that would only affect papers published in 2008 and after. At the moment, I cannot think of anything that would substantially affect the relative scores.

Some further investigation showed that while Alexandria’s citation record is less than stellar in all respects there is precisely one discipline, or subdiscipline or even subsubdiscipline, where it does very well. Looking at the disciplines one by one, I found that there is one where Alexandria does seem to have an advantage, namely mathematical physics. Here it has 11.52 citations per paper well ahead of Cornell with 6.36.

Phil Baty in THE states:

“Alexandria University is Egypt's only representative in the global top 200, in joint 147th place. Its position, rubbing shoulders with the world's elite, is down to an exceptional score of 99.8 in the "research-influence" category, which is virtually on a par with Harvard University.

Alexandria, which counts Ahmed H. Zewail, winner of the 1999 Nobel Prize for Chemistry, among its alumni, clearly produces some strong research. But it is a cluster of highly cited papers in theoretical physics and mathematics - and more controversially, the high output from one scholar in one journal - that gives it such a high score.

Mr Pratt said: "The citation rates for papers in these fields may not appear exceptional when looking at unmodified citation counts; however, they are as high as 40 times the benchmark for similar papers. "The effect of this is particularly strong given the relatively low number of papers the university publishes overall."

This is not very convincing. Does Alexandria produce strong research? Overall, no. It is ranked 1,014th in the world for total papers over a ten-year period by SCImago.

Let us assume, however, that Alexandria’s citations per paper were such that it was the top scorer not just in mathematical or interdisciplinary physics, but also in physics in general and in the physical sciences, including maths (which, as we have seen, it was not anyway).

Even if the much cited papers in mathematical physics did give a maximum score of 100 for the physical sciences and maths group, how could that compensate for the low scores that the university should be getting for the other five groups? To attain a score of 99.8 Alexandria would have to be near the top for each of the six disciplinary groups. This is clearly not the case. I would therefore like to ask someone from Thomson Reuters to explain how they got from the citation and paper counts in the ISI database to an overall score.

Similarly we find that Bilkent University in Turkey had a score for citations of 91.7, quite a bit ahead of Cornell.

The number of citations per paper in each disciplinary group is as follows:

Arts and Humanities: Bilkent 0.44, Cornell 0.85
Social Sciences: Bilkent 2.92, Cornell 7.98
Medicine etc: Bilkent 9.42 Cornell 11.53
Life Sciences: Bilkent 5.44 Cornell 13.49
Physical Sciences: Bilkent 8.75 Cornell 16.31
Engineering and Computer Science: Bilkent 6.15 Cornell 9.59

Again, it is difficult to see how Bilkent could have surpassed Cornell. I did notice that one single paper in Science had received over 600 citations. Would that be enough to give Bilkent such a high score?

It has occurred to me that, since this paper was listed under “multidisciplinary sciences”, its citations may have been counted more than once. Again, it would be a good idea for TR to explain step by step exactly what they did.

Now for Hong Kong Baptist University. It is surprising that this university should be in the top 200 since in all other rankings it has lagged well behind other Hong Kong universities. Indeed it lags behind on the other indicators in this ranking.

The number of citations per paper in the various disciplinary groups is as follows:

Arts and Humanities: HKBU 0.34, Cornell 0.85
Social Sciences: HKBU 4.50 Cornell 7.98
Medicine etc: 7.82 Cornell 11.53
Life Sciences: 10.11 Cornell 13.49
Physical Sciences: HKBU 10.78 Cornell 16.31
Engineering and Computer Science: HKBU 8.61 Cornell 9.59

Again, there seems to be a small group of prolific, highly accomplished and reputable researchers, especially in chemistry and engineering, who have boosted HKBU’s citations. But again, how could this affect the overall score? Isn’t this precisely what normalization by discipline was supposed to prevent?

There are other universities with suspiciously high scores for this indicator. Also, one wonders whether, among the universities that did not make it into the top 200, there were some that were unfairly penalized. Now that the iPhone app is being downloaded across the world, this may soon become apparent.

Once again I would ask TR to go step by step through the process of calculating these scores and to assure us that they are not the result of an error or series of errors. If they can do this I would be very happy to start discussing more fundamental questions about these rankings.
From the Chronicle of Higher Education

An article by Aishah Laby contains this news that I have not heard anywhere else:

Quacquarelli Symonds has continued to produce those rankings, now called the QS World University Rankings, and is partnering with U.S. News and World Report for their publication in the United States.

The relationship between the former collaborators has deteriorated into barely veiled animosity. QS has accused Times Higher Education of unfairly disparaging the tables they once published together. This week the company threatened legal action against the magazine over what Simona Bizzozero, a QS spokeswoman, described as "factually inaccurate" and misleading statements by representatives of Times Higher Education. She said THE's role in the collaboration was limited to publishing the rankings based on a methodology that QS had developed. "What they're producing now is a brand-new exercise. A totally brand-new exercise, with absolutely no links whatsoever to what QS produced and is producing," she said. "So when they refer to their old methodology, that is not correct."

Phil Baty, editor of the rankings for Times Higher Education, declined to respond to QS's complaints: "We are now looking forward, not looking backward."

I didn't know that the animosity was veiled, even barely.

There are some comments from Ellen Hazelkorn

"Really, nothing has changed," said Ellen Hazelkorn, executive director of the Higher Education Policy Research Unit at the Dublin Institute of Technology, whose book "Rankings and the Battle for Worldclass Excellence: The Reshaping of Higher Education" is due to be published in March.

Despite Times Higher Education's assurances that the new tables represent a much more rigorous and reliable guide than the previous rankings, the indicators on which the new rankings are based are as problematic in their own way, she believes. The heavily weighted measure of teaching, which she described as subjective and based on reputation, introduces a new element of unreliability.

Gauging research impact through a subjective, reputation-based measure is troublesome enough, and "the reputational aspect is even more problematic once you extend it to teaching," she said.

Ms. Hazelkorn is also troubled by the role Thomson Reuters is playing through its Global Institutional Profiles Project, to which institutions provide the data used in the tables. She dislikes the fact that institutions are going to great effort and expense to compile data that the company could then sell in various ways.

"This is the monetarization of university data, like Bloomberg
made money out of financial data," she said.


Powered by Thomson Reuters

Thanks to Kris Olds at GlobalHigherEd for noticing the above in the THE World University Rankings banner.

A quotation from his article.

Thomson Reuters is a private global information services firm, and a highly respected one at that. Apart from ‘deep pockets’, they have knowledgeable staff, and a not insignificant number of them. For example, on 14 September Phil Baty, of Times Higher Education, sent out this fact via their Twitter feed:

2 days to #THEWUR. Fact: Thomson Reuters involved more than 100 staff members in its global profiles project, which fuels the rankings

The incorporation of Thomson Reuters into the rankings games by Times Higher Education was a strategically smart move for this media company for it arguably (a) enhances their capacity (in principle) to improve ranking methodology and implementation, and (b) improves the respect the ranking exercise is likely to get in many quarters. Thomson Reuters is, thus, an analytical-cum-legitimacy vehicle of sorts.

Thursday, September 16, 2010

Comment on the THE rankings

From The Age (Australia)

Les Field, the deputy vice-chancellor (research) at the University of NSW, said the new Times methodology had produced some curious results, such as Hong Kong Baptist University ranking close behind Harvard on citations.

''There are some anomalies which to my mind don't pass the reasonableness test,'' he said.

To which one might add Alexandria University, UC Santa Cruz, UC Santa Barbara, Pohang University of Science and Technology, Bilkent University, William & Mary, Royal Holloway, the University of Barcelona and the University of Adelaide.
Alexandria University

According to the THE rankings Alexandria University in Egypt (no. 147 overall) is the fourth university in the world for research impact, surpassed only by Caltech, MIT and Princeton.

Alexandria is not ranked by Shanghai Jiao Tong University or HEEACT. It is way down the SCImago rankings. Webometrics puts it in 5,882nd place and 7,253rd for the "Scholar" indicator.

That is not the only strange result for this indicator, which looks as though it will spoil the rankings as a whole.

More on Alexandria and some other universities in a few hours.
The Good News

There are some worthwhile improvements in the new THE World University Rankings.

First, the weighting given to the subjective opinion survey has been reduced, although probably not by enough. Very sensibly, the survey asked respondents to evaluate teaching as well as research.

The task ahead for THE now is to refine the sample of respondents and the questions they are invited to answer. It would make sense to exclude those with a non-university affiliation from answering questions about teaching. Similarly, there ought to be some way of eliciting the views of university teachers who do not do research, perhaps by some sort of rigorously validated sign-up system. Something like this might also be developed to discover the views of students, at least graduate students.

The weighting given to international students has been reduced from five to two per cent.

There is a substantial weighting for a mixed bag of teaching indicators, including the survey. Some of these are questionable, though, such as the ratio of doctoral to undergraduate students.

For most indicators, the present rankings represent a degree of progress.

The problem with these rankings is the Citations Indicator, which has produced results that, to say the least, are bizarre.
First the Bad News about the THE Rankings

There is something seriously wrong with the citations indicator data. I am doing some checking right now.
Highlights of the THE Rankings

The top ten are:
1. Harvard
2. Caltech
3. MIT
4. Stanford
5. Princeton
6. Cambridge
6. Oxford
8. UC Berkeley
9. Imperial College
10. Yale

The best Asian university is the University of Hong Kong. Sao Paulo is best in South America and Melbourne in Australia. Cape Town is top in Africa, followed by the University of Alexandria, which is ranked 149th, a rather surprising result.

Wednesday, September 15, 2010

The new THE World University Rankings are out. Discussion follows in a few hours.