The London Daily Telegraph has published an article on 'How to Read the Different University Rankings', which discusses two international rankings, QS and THE, along with their spin-offs and some national rankings. I assume it was intended for British students who might like a different perspective on UK university quality or who might be thinking of venturing abroad.
The article is not very satisfactory: it covers only the QS and THE rankings, and it does so uncritically.
So here is a brief survey of some global rankings for prospective students.
International university rankings fall into three groups: Internet-based rankings such as Webometrics and 4icu; research-based rankings such as the Academic Ranking of World Universities (ARWU), produced by the Shanghai Ranking Consultancy; and "holistic" rankings such as the Quacquarelli Symonds (QS) and Times Higher Education (THE) world rankings, which claim to measure reputation, teaching quality, internationalisation and other factors and combine these indicators into a single index.
The first thing you need to know is where an institution stands in the global university hierarchy. Most rankings cover only a fraction of the world's higher education institutions. Webometrics, however, rates the presence, impact, openness and research quality (measured by Google Scholar Citations) of nearly 24,000 universities. Only 5,483 of these receive a score for research quality, so a university ranked below that cut-off is doing little that resembles research at all.
While the Webometrics rankings are quite volatile at the top, the distinction between a university in 200th place and one in 2,000th is significant, and even more so between 1,000th and 10,000th.
Webometrics can also help determine whether an institution is in fact a university in any sense of the word. If it is not listed there, the chances are that it is not really a university. I recently heard of someone who claimed degrees from Gordon University in Florida. A search of Webometrics revealed nothing, so it is very likely that this is not a reputable institution.
Here 4icu is less helpful, since it ranks only 11,606 institutions and does not provide information about specific indicators.
Looking at the research rankings, the oldest, and the one most respected by academics, is the ARWU. It has six criteria, all of them related to research. The methodology is straightforward and perhaps unsophisticated by today's standards: the humanities are excluded, it has nothing to say about teaching, and it favours large, old and rich universities. It also privileges medical schools, as shown by last year's world rankings, which put the University of California at San Francisco, a medical school, in eighteenth place in the world.
On the other hand, it is stable and sensible, and it is by default the ranking of choice for research students and faculty.
ARWU also publishes subject rankings, which are worth checking. Note, however, that ARWU has proved vulnerable to the tactics of King Abdulaziz University (KAU) in Jeddah, which hands out contracts to over 200 adjunct faculty members who list KAU as a secondary affiliation. This has put the university in the world's top ten in the ARWU mathematics ranking.
There are a number of other global research rankings that can be consulted, since they produce results that may differ from Shanghai's. These include the National Taiwan University Rankings, the Best Global Universities published by US News, the University Ranking by Academic Performance produced by Middle East Technical University, and the CWTS Leiden Ranking. These rankings, especially Leiden's, reach a high level of technical sophistication and should be consulted by anyone thinking of postgraduate study anywhere. The Leiden Ranking is particularly helpful in providing stability intervals, which show how much each score might plausibly vary rather than presenting it as a single precise number.
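As a rough illustration of what a stability interval means, here is a minimal Python sketch that bootstraps the mean citation score from invented per-publication citation counts. The data are made up and the method is only a simplified approximation of the kind of resampling CWTS uses; it shows the idea, not the Leiden methodology itself.

```python
import random

# Invented per-publication citation counts for one university
# (illustrative only -- not Leiden data).
citations = [0, 1, 3, 5, 2, 8, 0, 12, 4, 7, 1, 6, 3, 9, 2]

def bootstrap_interval(values, n_resamples=10000, alpha=0.05):
    """Approximate a stability interval for the mean citation score
    by resampling the publication list with replacement."""
    means = []
    for _ in range(n_resamples):
        sample = [random.choice(values) for _ in values]
        means.append(sum(sample) / len(sample))
    means.sort()
    low = means[int(alpha / 2 * n_resamples)]
    high = means[int((1 - alpha / 2) * n_resamples) - 1]
    return low, high

low, high = bootstrap_interval(citations)
print(f"mean citation score: {sum(citations) / len(citations):.2f}")
print(f"95% stability interval: [{low:.2f}, {high:.2f}]")
```

A wide interval is a warning that small changes in the underlying publications could move the university up or down the table considerably.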
Since 2004 a number of rankings have appeared that attempt to go beyond simply measuring research output and quality. These are problematic in many ways: it is very difficult to find data that is comparable across international borders, their methodologies can change from year to year, and they rely on reputation surveys, which can be biased and unreliable. At the moment the two best known international rankings of this type are the World University Rankings published by QS and THE.
It must be pointed out that even these are still top heavy with research indicators. The QS rankings give a 40% weighting to an opinion survey of research quality and another 20% to citations. Even the faculty-student ratio indicator is sometimes a measure of research rather than teaching, since it can be improved by adding research-only staff to the faculty. The THE rankings allot 30% to five research indicators, another 30% to citations, 2.5% to international research collaboration and 2.5% to research income from industry.
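To see how such weightings become a single headline number, here is a minimal Python sketch of a weighted composite index. The indicator scores are invented, and while the 40% survey and 20% citations weights are the ones mentioned above, the remaining weights are taken from QS's published methodology of this period and should be treated as an assumption for illustration.

```python
# Invented indicator scores (0-100) for a single university, combined
# with QS-style weightings; only the 40% survey and 20% citations
# figures appear in the text above, the rest are assumed here.
scores = {
    "academic_reputation":    85.0,  # the 40% opinion survey
    "employer_reputation":    60.0,
    "faculty_student_ratio":  55.0,
    "citations_per_faculty":  70.0,  # the 20% citations indicator
    "international_faculty":  40.0,
    "international_students": 45.0,
}
weights = {
    "academic_reputation":    0.40,
    "employer_reputation":    0.10,
    "faculty_student_ratio":  0.20,
    "citations_per_faculty":  0.20,
    "international_faculty":  0.05,
    "international_students": 0.05,
}

# The headline figure is just a weighted sum -- a single number that
# hides how unevenly the indicators contribute to it.
composite = sum(scores[k] * weights[k] for k in weights)
print(f"composite score: {composite:.1f}")
```

Notice that well over half of this composite comes from the two research-heavy indicators, which is the point being made above.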
You should bear in mind that these rankings have been accused, not without justification, of national bias. A paper by Christopher Claassen of the University of Glasgow has found that the QS and THE rankings are seriously biased towards UK universities.
The metrics that do attempt to measure teaching quality in the THE and QS rankings are not very helpful. Both use faculty-student ratio data, but this is a very imprecise proxy for teaching resources. The THE rankings include five indicators in their super-indicator "Teaching: the Learning Environment", two of which count doctoral students or doctoral degrees, which says little about undergraduate instruction.
It is also a good idea to check the scores for the individual criteria that are combined to make up the composite score. If a university has a disproportionately high score for an indicator with a high weighting, like QS's academic opinion survey (40%) or THE's citations indicator (30%), then alarm bells should start ringing.
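That check can be made mechanical. Below is a small self-contained sketch, again with invented scores and a simplified three-indicator weighting, that flags any indicator supplying more than half of the composite score.

```python
# Invented scores and a simplified three-indicator weighting,
# illustrating the "alarm bell" check described above.
scores = {"opinion_survey": 95.0, "citations": 30.0, "staff_ratio": 25.0}
weights = {"opinion_survey": 0.40, "citations": 0.30, "staff_ratio": 0.30}

composite = sum(scores[k] * weights[k] for k in weights)
for name in weights:
    share = scores[name] * weights[name] / composite
    if share > 0.5:  # one indicator supplying over half the total
        print(f"alarm bell: {name} provides {share:.0%} of the composite")
print(f"composite: {composite:.1f}")
```

Here a strong survey result alone supplies roughly 70% of the composite, which is exactly the pattern that should prompt a closer look at the underlying indicators.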
In some ways the new Round University Ranking from Russia is an improvement. It uses data from Thomson Reuters, as THE did until last year. It does nearly everything that THE and QS do, and a few more things besides. Altogether there are 20 indicators, although some of these, such as the three reputation indicators, are so similar that they are effectively redundant.
Recently a consortium of European organisations and universities created U-Multirank, which takes a different approach. It is basically an online evaluation tool that allows users to choose how they wish to compare and sort universities. Unfortunately, its coverage is rather uneven: data about teaching and learning is limited for Asia, North America and the UK, although good for Western Europe.
International rankings are generally not helpful in providing information about graduate employability, although QS do include a reputation survey of employers, and the Jeddah-based Center for World University Rankings counts alumni who become CEOs of major companies.
The main global rankers also publish regional and specialist spin-offs: Latin American, Asian and European rankings, rankings of young universities, and subject rankings. These should be treated with scepticism, since they depend on a relatively small number of data points and can consequently be unreliable.
To summarise, these are things to remember when using global rankings:
- first check with Webometrics to find out the approximate standing of a university in the world
- for prospective postgraduate students in science and medicine, the Shanghai rankings should be consulted and confirmed by looking at other research rankings
- potential undergraduate students should look at more than one "holistic" ranking and should always check the scores for specific indicators
- be very sceptical of universities that get a good score for only one or two indicators or that do well in only one ranking
- always look at scores for subjects if available but remember that they may be based on limited and unreliable data
- for teaching related information the best source is probably U-Multirank but this is available only for some universities.