QS Engineering Rankings
QS have started to publish detailed subject rankings based on citations per paper over five years and on their surveys of academics and employers. The first of these covers engineering. There are five subfields: Computer Science and Information Systems; Chemical Engineering; Civil and Structural Engineering; Electrical and Electronic Engineering; and Mechanical, Aeronautical and Manufacturing Engineering.
For Civil and Structural Engineering the weighting is 50% for the academic survey, 30% for the employer survey and 20% for citations per paper. For the other subfields it is 40%, 30% and 30% respectively.
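To make the arithmetic concrete, here is a minimal sketch of how such a weighted composite works. The indicator scores below are invented for illustration; only the weights come from the published methodology.

    # Composite subject score as a weighted sum of the three indicators.
    # The 90/80/60 scores are hypothetical, not QS data.

    def composite(academic, employer, citations, weights):
        return (weights["academic"] * academic
                + weights["employer"] * employer
                + weights["citations"] * citations)

    civil_weights = {"academic": 0.50, "employer": 0.30, "citations": 0.20}
    other_weights = {"academic": 0.40, "employer": 0.30, "citations": 0.30}

    # A hypothetical university scoring 90, 80 and 60 on the three indicators:
    print(composite(90, 80, 60, civil_weights))   # 81.0
    print(composite(90, 80, 60, other_weights))   # 78.0

Note how the same three scores produce different composites: the civil and structural weighting leans more heavily on the academic survey, the others more on citations.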
MIT, not surprisingly, is top in all five of the engineering fields that are ranked. In general, the upper levels of these rankings seem reasonable. However, a closer look at the details, especially in the bottom half (places 100-200), raises some questions.
One basic problem is that as QS make finer distinctions, they have to rely on smaller sets of data. There were 285 respondents to the academic survey for chemical engineering and 394 for civil and structural engineering; the employer survey drew 836 respondents for computer science. Each respondent to the academic survey was allowed to nominate up to 40 universities, but the number actually nominated was usually much lower. Around the 151-200 band the number of responses must have been very low indeed. Similarly, the number of papers counted varied considerably by field, from 43,222 in civil and structural engineering to 514,95 in electrical and electronic engineering. We should therefore be rather sceptical about these rankings.
It is also noticeable that there is a reasonably high correlation between the scores for the academic survey and the employer survey: .682 for electrical engineering, .695 for chemical engineering, .695 for civil engineering and .722 for computer science.
But there is no correlation at all between the citations per paper indicator and the surveys. For electrical engineering it is .064 between citations and the academic survey and -.004 between citations and the employer survey. The pattern is the same for the other subfields: none of the correlations is statistically significant.
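For anyone who wants to check figures like these against the published tables, here is a minimal sketch of the Pearson calculation. The score lists are invented for illustration, not QS data:

    import math

    def pearson(xs, ys):
        # Pearson product-moment correlation between two score lists.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical indicator scores for five universities:
    academic  = [100, 95, 93, 90, 88]
    citations = [70, 85, 60, 90, 65]
    print(round(pearson(academic, citations), 3))  # -0.037 for these made-up numbers

A value near zero, as in this made-up example, means that knowing a university's survey score tells you essentially nothing about its citations score.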
Looking at the top universities on the two surveys, we see the same familiar names in each of the subfields: MIT, Stanford, Cambridge, Berkeley, Oxford, Harvard, Imperial College London, Melbourne, Caltech.
But looking at the top scorers for citations per paper, we find a much more varied and unfamiliar array of institutions: New York University, Wageningen, Dartmouth College, Notre Dame, Aalborg, Athens, Lund, Uppsala, Drexel, Tufts, IIT Roorkee, University of Washington, Rice, University of Massachusetts.
The agreement between employers and academics about the quality of engineering programs, even though the two surveys refer to different aspects (research and graduate employability), suggests that the surveys are moderately accurate, at least for the top hundred or so.
However, the lack of any correlation at all between the citations indicator and the surveys needs explaining. It could be that citations have identified up-and-coming superstars. Perhaps the number of papers in the various subfields is so low that the indicator does not mean very much. Or perhaps citations have been so manipulated in recent years (see the case of Alexandria University) that they are no longer a robust indicator of quality.