Monday, November 27, 2017
Rankings Uproar in Hong Kong
There is a controversy brewing in Hong Kong about the submission of data to the QS World University Rankings. It seems that the City University of Hong Kong (CityU) has submitted a smaller figure for its total number of students than that published by the SAR's University Grants Committee (UGC), presumably to boost its score for the faculty-student ratio, which accounts for 20% of the total score in the QS rankings. The complaints apparently began with two other local universities and were reported in the Chinese-language Apple Daily.
There is nothing new about this sort of thing. Back in 2006 I commented on the difference between the number of students at "Beijing University" on the university website and that declared by QS. Ong Kian Ming has noted discrepancies between the number of students at Malaysian universities reported on their websites and that published by QS, and there have been questions about the number of international students at Singapore universities.
The first thing that strikes an outside observer about the affair is that the complaint seems to be just about QS and does not mention the THE rankings, although exactly the same number of students, 9,240, appears on both the QS and THE pages. The original article in Chinese apparently makes no mention of THE.
This suggests that there might be a bit of politics going on here. THE seems to have a good relationship with some of the leading universities in Hong Kong such as the University of Hong Kong (UHK) and the Hong Kong University of Science and Technology (HKUST). In 2015 THE held a prestigious summit at HKUST where it announced, after "feedback from the region", that it was introducing methodological changes that would dethrone the University of Tokyo from the number one spot in the Asian rankings and send it down to seventh place behind HKUST and UHK. It looks as though whoever is complaining about CityU is averting their eyes from THE.
There is certainly a noticeable difference between the number of students submitted to QS and THE by CityU and that published by the UGC. This is not, however, necessarily nefarious. There are many ways in which a university could massage or trim data while remaining compliant with the rankers' guidelines: using a specific definition of full-time equivalent (FTE); omitting or including branch campuses, research centres and affiliated institutions; counting students at the beginning or the end of the semester; counting or not counting exchange students or those in certificate, diploma, transitional or preparatory programmes. It is also not impossible that the government data is less than 100% accurate.
Other Hong Kong universities have also submitted student data that differs from that available at the UGC site but to a lesser extent.
The UGC's data refers to 13,725 full-time equivalent students in 2014-15. It is possible that City University has found legitimate ways of whittling down this number. If nothing else, it could claim that it had to use data from earlier years because of uncertainty about the validity of current figures.
The real problem is that some universities may have learned that success in the rankings is sometimes as much a matter of careful reading of statistics and guidelines as of improved teaching or research.
Another thing that has so far gone unnoticed is that CityU has also been reducing the number of faculty. The UGC reports a total of 2,380 full-time equivalent faculty while QS reports 1,349. If the university had simply used the raw UGC figures it would have a faculty-student ratio of 5.77 students per faculty member; the QS figure is 6.85. So by modifying the UGC data, if that is where the university started, CityU actually got a worse result on this indicator. It would, however, have done a bit better on the citations per faculty indicator.
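For anyone who wants to check the arithmetic, here is a minimal sketch using the figures quoted above (the UGC's 2014-15 FTE data versus the numbers on CityU's QS page).

```python
# CityU's students-per-faculty ratio under the two sets of figures quoted above.
ugc_students, ugc_faculty = 13_725, 2_380   # UGC 2014-15 FTE data
qs_students, qs_faculty = 9_240, 1_349      # figures on the QS page

print(round(ugc_students / ugc_faculty, 2))  # 5.77 students per faculty member
print(round(qs_students / qs_faculty, 2))    # 6.85 -- a worse score on this indicator
```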
This leads on to what the Hong Kong universities did with their faculty numbers.
For the University of Hong Kong the UGC reports a total of 5,093 FTE staff but the QS site has 3,012. THE does not give a figure for the number of faculty, but it is possible to calculate one from the number of students and the number of students per staff member, both of which are provided (see the sketch after these figures). The current THE profile of UHK shows 18,364 students and 18 students per staff member, which gives us roughly 1,020 staff.
For HKUST: UGC 2,398 staff, QS 1,150, THE (calculated) 442.
For the Chinese University of Hong Kong (CUHK): UGC 5,070, QS 2,208, THE 1,044.
For the Hong Kong Polytechnic University (PolyU): UGC 3,356, QS 2,447, THE 809.
For CityU: UGC 2,380, QS 1,349, THE 825.
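The back-calculation for the THE figures works the same way for each university. Here is a minimal sketch for UHK, using only the numbers on its current THE profile as quoted above.

```python
# Implied THE staff count, derived from THE's own published profile figures.
the_students = 18_364        # students listed on the THE profile for UHK
students_per_staff = 18.0    # THE's students-per-staff figure for UHK

implied_staff = the_students / students_per_staff
print(round(implied_staff))  # roughly 1,020 FTE staff
```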
The UGC also provides the number of faculty wholly funded by the UGC and this number is always much lower than the total faculty. The QS faculty numbers are generally quite similar to these although I do not know if there was a decision to exclude non-funded faculty. The calculated THE faculty numbers are much lower than those provided by the UGC and lower than the QS numbers.
I suspect that what is going on is that the leading Hong Kong universities have adopted the strategy of aiming for the THE rankings, where their income, resources and international connections can yield maximum advantage. They presumably know that the weighting of the staff/student indicator, where it is better to have more faculty, is only 4.5%, while the indicators where fewer total staff are better (international faculty, research income, research productivity, industry income, doctorates awarded, institutional income) have a combined weighting of 25.25%.
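The 25.25% figure can be checked with a quick sum. The individual weights below are my reading of THE's published methodology at the time; the paragraph above only quotes the 4.5% and 25.25% totals.

```python
# Indicators scaled by the number of staff, where fewer total staff helps.
# Per-indicator weights are taken from THE's published methodology as I read it.
staff_scaled_weights = {
    "international faculty": 2.5,
    "research income": 6.0,
    "research productivity": 6.0,
    "industry income": 2.5,
    "doctorates awarded": 6.0,
    "institutional income": 2.25,
}
print(sum(staff_scaled_weights.values()))  # 25.25, versus 4.5 for staff/student
```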
CityU, in contrast, has focussed on the QS rankings and looked for ways of reducing the number of students submitted.
It is possible that HKUST and UHK could justify the data they submitted to the rankers while CityU might not. It does, however, seem rather strange and unfair that City University's student data has come under such intense scrutiny while the faculty data of the other universities has so far gone unquestioned.
Ranking organisations should heed the suggestion of the International Ranking Expert Group (IREG) that indicators should measure outcomes rather than inputs such as staff, facilities or income. They should also think about how much they rely on data submitted by institutions. That may have been workable when they were ranking 200 or 300 places, mainly in North America and Western Europe, but now that they are approaching 1,000 universities, some of them very decentralised, data collection is becoming more complicated and difficult.
QS used to talk about its "validation hierarchy", with central agencies such as HESA and NCES at the top, followed by direct contact with institutions, then websites, and ending with "smart" averages. Perhaps this could be revived, but with institutional data further down the hierarchy. The lesson of the latest arguments in Hong Kong and elsewhere is that data submitted by universities can often be problematic and unreliable.
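To make the idea concrete, here is a rough sketch of what such a hierarchy might look like in code. The source names and the get_enrolment function are purely illustrative assumptions on my part, not anything QS has published.

```python
# Illustrative only: take the first available figure from a priority-ordered
# list of sources, with institutionally submitted data near the bottom.
SOURCES = [
    "national agency (e.g. HESA, NCES, UGC)",
    "audited public reports",
    "institutional submission",
    "website",
    "smart average",
]

def get_enrolment(university, data):
    """Return (value, source) from the highest-priority source that has data."""
    for source in SOURCES:
        value = data.get(university, {}).get(source)
        if value is not None:
            return value, source
    return None, None

# Example: the agency figure wins even though the university submitted a lower one.
data = {"CityU": {"national agency (e.g. HESA, NCES, UGC)": 13_725,
                  "institutional submission": 9_240}}
print(get_enrolment("CityU", data))
```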