Sunday, August 13, 2017

The Need for a Self Citation Index

In view of the remarkable performance of Veltech University in the THE Asian Rankings, rankers, administrators and publishers need to think seriously about the impact of self-citation, and perhaps also intra-institutional citation, on rankings. Here is the abstract of an article by Justin W Flatt, Alessandro Blasimme, and Effy Vayena.

Improving the Measurement of Scientific Success by Reporting a Self-Citation Index

Abstract:
Who among the many researchers is most likely to usher in a new era of scientific breakthroughs? This question is of critical importance to universities, funding agencies, as well as scientists who must compete under great pressure for limited amounts of research money. Citations are the current primary means of evaluating one’s scientific productivity and impact, and while often helpful, there is growing concern over the use of excessive self-citations to help build sustainable careers in science. Incorporating superfluous self-citations in one’s writings requires little effort, receives virtually no penalty, and can boost, albeit artificially, scholarly impact and visibility, which are both necessary for moving up the academic ladder. Such behavior is likely to increase, given the recent explosive rise in popularity of web-based citation analysis tools (Web of Science, Google Scholar, Scopus, and Altmetric) that rank research performance. Here, we argue for new metrics centered on transparency to help curb this form of self-promotion that, if left unchecked, can have a negative impact on the scientific workforce, the way that we publish new knowledge, and ultimately the course of scientific advance.
Keywords: publication ethics; citation ethics; self-citation; h-index; self-citation index; bibliometrics; scientific assessment; scientific success
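The transparency metric the authors propose is an s-index computed in the same way as the h-index but counting only self-citations. A minimal sketch of that calculation (the function name and the sample counts below are illustrative, not taken from the paper):

```python
def h_like_index(citation_counts):
    """Largest s such that s papers each have at least s citations.

    Applied to per-paper self-citation counts, this gives an
    h-index-style self-citation index (s-index).
    """
    ranked = sorted(citation_counts, reverse=True)
    return sum(1 for rank, count in enumerate(ranked, start=1) if count >= rank)

# Hypothetical per-paper self-citation counts for one author
self_citations = [12, 9, 7, 4, 4, 2, 1, 0]
print(h_like_index(self_citations))  # 4: four papers each self-cited at least 4 times
```

Reported alongside the ordinary h-index, such a number would make heavy self-promotion visible at a glance rather than hidden inside an aggregate citation count.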


Saturday, August 12, 2017

The public sector: a good place for those with bad school grades

From the Economist ranking of British universities, which is based on the difference between expected and actual graduate earnings.

That, as Basil Fawlty said in a somewhat different context, explains a lot.

"Many of the universities at the top of our rankings convert bad grades into good jobs. At Newman, a former teacher-training college on the outskirts of Birmingham, classes are small (the staff:student ratio is 16:1), students are few (around 3,000) and all have to do a work placement as part of their degree. (Newman became a university only in 2013, though it previously had the power to award degrees.)

Part of Newman’s excellent performance can be explained because more than half its students take education-related degrees, meaning many will work in the public sector. That is a good place for those with bad school grades. Indeed, in courses like education or nursing there is no correlation between earnings and the school grades a university expects." 

Friday, August 11, 2017

Malaysia and the Rankings Yet Again

Malaysia has had a complicated relationship with global university rankings. There was a fleeting moment of glory in 2004 when Universiti Malaya, the national flagship, leaped into the top 100 of the THES-QS world rankings. Sadly, it turned out that this was the result of an error by the rankers, who had counted ethnic minority Malaysians as international faculty and students. Since then the country's leading universities have gone up and down, usually because of methodological changes rather than any merit or fault of their own.

Recently though, Malaysia seems to have adopted sensible, if not always popular, policies and made steady advances in the Shanghai rankings. There are now three universities in the top 500, UM, Universiti Sains Malaysia (USM) and Universiti Kebangsaan Malaysia (UKM). UM has been rising since 2011 although it fell a bit last year because of the loss of a single highly cited researcher listed in the Thomson Reuters database.

The Shanghai rankings rely on public records and focus on research in the sciences. For a broader-based ranking with a consistent methodology and teaching metrics we can take a look at the Round University Rankings, where UM is 268th overall. Across the 20 metrics included in these rankings, UM's scores range from very good for number of faculty and for reputation (except outside the region) to poor for doctoral degrees and normalised citations.

The story told by these rankings is that Malaysia is making steady progress in providing resources and facilities, attracting international students and staff, and producing a substantial amount of research in the natural sciences. But going beyond that is going to be very difficult. Citation counts indicate that Malaysian research gets little attention from the rest of the world. The Shanghai rankings report that UM has zero scores for highly cited researchers and papers in Nature and Science.

In this year's QS world rankings, UM reached 114th place overall and there are now hopes that it will soon reach the top 100. But it should be noted that UM's profile is very skewed with a score of 65.7 for academic reputation and 24.3 for citations per faculty. Going higher without an improvement in research quality will be very challenging since the reputation curve becomes very steep at this level, with dozens of survey responses needed just to go up a few points.

It might be better if Malaysia focused more on the Shanghai rankings, the Round University Rankings and the US News Best Global Universities. Progress in these rankings is often slow and gradual but their results are usually fairly consistent and reliable.