Thursday, December 23, 2010
Top US Colleges by Salary
PayScale has published its ranking of US colleges by mid-career median salary. The top five are:
1. Harvey Mudd College
2. Princeton
3. Dartmouth
4. Harvard
5. Caltech
The top schools in various categories are:
Engineering: Harvey Mudd
Ivy League: Princeton
Liberal Arts: Harvey Mudd
Party Colleges: Union College, NY
Private Research Universities: Princeton
State Universities: Colorado School of Mines
Saturday, December 04, 2010
Can 25 Million Citations be Wrong?
Perhaps not, but a few hundred might be.
University World News has an article by Phil Baty, deputy editor of Times Higher Education, discussing the recent THE World University Rankings. He is mainly concerned with the teaching component of the rankings, which I hope to discuss in a little while. However, there are some remarks about the citation component that are worth commenting on. He says:
"We look at research in a number of different ways, examining research reputation, income and research volume (through publication in leading academic journals indexed by Thomson Reuters). But we give the highest weighting to an indicator of 'research influence', measured by the number of times a university's published research is cited by academics around the globe.First, normalisation of data means that the number of citations is compared to a benchmark derived from the world average number of citations for a subject area. A large number of citations might mean that an article has been warmly received. It might equally well mean that the article was in a field where articles are typically cited a lot. Comparing simple numbers of citations in a field like literary studies to those in medical research would be unfair to the former since citations there are relatively scarce. So part of the reason for Alexandria University's remarkable success in the recent THE WUR was not just the number of citations of the papers of Mohamed El Naschie but also that he was publishing in a field with a low frequency of citations. Had he published papers on medicine nobody would have noticed.
We looked at more than 25 million citations over a five-year period from more than five million articles. All the data were normalised to reflect variations in citation volume between different subject areas.
This indicator has proved controversial, as it has shaken up the established order, giving high scores to some smaller institutions with clear pockets of research excellence, often at the expense of the larger research-intensive universities.
We make no apology for recognising quality over quantity, but we concede that our decision to openly include in the tables the two or three extreme statistical outliers, in the interests of transparency, has given some fuel for criticism, and has given us some food for thought for next year's table."
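To see what this normalisation does, here is a minimal sketch in Python. The subject areas, benchmark averages and citation counts are all invented for illustration; they are not Thomson Reuters figures.

```python
# A minimal sketch of field normalisation, assuming a simple "citations
# divided by the world average for the subject area" model. All numbers
# below are invented for illustration.

world_average = {
    "clinical medicine": 12.0,   # hypothetical mean citations per paper
    "literary studies": 0.8,
}

papers = [
    ("clinical medicine", 40),   # (field, citations received)
    ("literary studies", 5),
]

# Each paper is scored against its own field's benchmark, so 5 citations
# in literary studies (6.25) outweigh 40 in clinical medicine (3.33).
for field, cites in papers:
    print(field, round(cites / world_average[field], 2))
```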
Second, it is also very likely -- although I cannot recall seeing direct confirmation -- that Thomson Reuters were benchmarking by year, so that a university would score more for a citation to a recently published article than for one to an article published four years ago.
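If benchmarking does work this way, the effect is easy to demonstrate. The sketch below extends the same idea to benchmarks for papers of the same field and age; again, every number is invented.

```python
# A hypothetical sketch of benchmarking by year as well as by field: the
# benchmark is the average citation count for papers of the same field
# AND the same age, so citations arriving within months of publication
# are measured against a very small baseline.

benchmark = {
    ("nonlinear science", 0): 0.3,   # expected citations for a new paper
    ("nonlinear science", 4): 6.0,   # expected citations after four years
}

def normalised_impact(field, age_years, citations):
    """Citations relative to the field-and-year benchmark."""
    return citations / benchmark[(field, age_years)]

# Ten citations to a brand-new paper count roughly twenty times as much
# as ten citations to a four-year-old paper in the same field.
print(normalised_impact("nonlinear science", 0, 10))   # 33.33...
print(normalised_impact("nonlinear science", 4, 10))   # 1.66...
```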
In the case of Alexandria and the other universities that scored unexpectedly well, we are not talking about millions of citations. We are talking about dozens of papers and hundreds of citations, the effect of which has been enormously magnified because the papers were in a low-citation field and were cited within months of publication.
Remember also that we are talking about averages. This means that, other things being equal, a university will get a higher score the smaller the number of papers it publishes in ISI-indexed journals. Alexandria did not do so well just because El Naschie published a lot and was cited a lot. It also did well because overall it published few articles. Had Alexandria's researchers published more, its score would have been correspondingly lower.
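The arithmetic is simple enough to show directly. In the hypothetical sketch below, a few star papers keep the average high only as long as the rest of the output stays small; all figures are invented.

```python
# A toy illustration of the averaging effect: a handful of heavily cited
# papers dominates the mean when the university's total output is small,
# and is diluted when the output is large.

star_scores = [60.0, 45.0, 50.0]   # normalised scores of the star papers

def university_score(other_papers):
    # assume every other paper scores 1.0, i.e. exactly the world average
    total = sum(star_scores) + 1.0 * other_papers
    return total / (len(star_scores) + other_papers)

print(round(university_score(100), 2))    # few other papers   -> 2.48
print(round(university_score(1000), 2))   # many other papers  -> 1.15
```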
Perhaps El Naschie constitutes a clear pocket of excellence, although that is not entirely uncontroversial. But he is a pocket of excellence that only became visible because of the modest achievement of the rest of the university. Conversely, there are probably many modestly cited researchers in Europe and the USA who might welcome a move to a university in Asia or Latin America, where a few papers and citations in a low-citation discipline would blossom into just such a pocket.
Is Alexandria one of just two or three anomalies? There are in fact many more, perhaps not quite so obvious, as can be seen by comparing the scores for research impact with other rankings of research output and impact, such as HEEACT and SCImago, or with the scores for research in the THE rankings themselves. It would also be interesting if THE released the results of the academic reputational survey.
Consider what would happen if we had two universities that were generally similar, with the same income, staff-student ratio and so on, except that one had published two or three times as many ISI-indexed articles as the other. Both had a few researchers who had been cited more frequently than is usual for their discipline. Under the current system, the more prolific university would get a much lower score. Can this really be considered a preference for quality over quantity? Only if we think that publishing in ISI-indexed journals adds to quantity but does not indicate quality.
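Putting invented numbers on this thought experiment makes the point plain; the star totals and article counts below are purely illustrative.

```python
# The two-university thought experiment in numbers: each university has
# the same few star papers, but one publishes three times as many
# ordinary ISI-indexed articles, every one assumed to score the world
# average of 1.0. All figures are invented.

STAR_TOTAL = 150.0   # combined normalised score of five star papers
STARS = 5

def impact(ordinary_articles):
    return (STAR_TOTAL + ordinary_articles) / (STARS + ordinary_articles)

print(round(impact(900), 2))   # the more prolific university -> 1.16
print(round(impact(300), 2))   # the less prolific university -> 1.48
```

On these figures the university that publishes three times as much scores 1.16 against its twin's 1.48, even though nothing about its star papers has changed.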
I hope that 'food for thought' means a radical revision of the citations indicator.
A minimal list of changes would include adding more markers of research impact; removing self-citations, citations from within the same university and citations from within the same journal; and combining the scores for the various disciplinary fields.
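As a rough illustration of what such filtering might involve, here is a hypothetical sketch; the citation record layout and field names are my own invention, not the actual Thomson Reuters data structure.

```python
# A minimal sketch of cleaning a raw citation list: discard self-citations
# and intra-university or intra-journal citations before counting.

from dataclasses import dataclass

@dataclass
class Citation:
    citing_authors: frozenset
    cited_authors: frozenset
    citing_university: str
    cited_university: str
    citing_journal: str
    cited_journal: str

def countable(c):
    """Keep a citation only if it passes all three exclusion tests."""
    if c.citing_authors & c.cited_authors:
        return False                                  # author self-citation
    if c.citing_university == c.cited_university:
        return False                                  # same university
    if c.citing_journal == c.cited_journal:
        return False                                  # same journal
    return True

def cleaned_count(citations):
    return sum(1 for c in citations if countable(c))

# Example: an author citing her own earlier paper is excluded.
example = Citation(frozenset({"A"}), frozenset({"A"}), "U1", "U2", "J1", "J2")
print(countable(example))   # False
```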
If this can be done, then the THE rankings may become what was promised.
Friday, December 03, 2010
Is there a Future for Citations?
The blog simplification administrative has some caustic comments on the role of self-citation and reciprocal citation in the remarkable performance of Alexandria University in the 2010 THE rankings.
The title, 'Bibliometry -- already broken', is perhaps unduly pessimistic, but THE and Thomson Reuters are going to have to move quickly if they are to rescue their rankings. An obvious remedy would be to remove self-citations and intra-university and intra-journal citations from the count.