Sunday, March 30, 2014

The Nature Publication Index

Nature has long been regarded as the best, or one of the two best, scientific journals in the world. Papers published there and in Science account for 20 percent of the weighting in Shanghai Jiao Tong University's Academic Ranking of World Universities, the same weight given to Nobel and Fields awards or to publications in the whole of the Science Citation and Social Science Citation Indexes.

Sceptics may wonder whether Nature has seen better years and is perhaps sliding from the pinnacle of scientific publishing. It has had some embarrassing moments in recent decades, including the publication of a 1974 paper that gave credence to the alleged psychic abilities of Uri Geller, the report of a study by Jacques Benveniste and others that purported to show that water has a memory, the questionable "hockey stick" article on global warming in 1998 and seven retracted papers on superconductivity by Jan Hendrik Schön.

But it still seems that Nature is highly regarded by the global scientific community, and the recently published Nature Publication Index is a reasonable guide to current trends in scientific research. The index counts the number of publications in Nature in 2013.

The USA remains on top, with Harvard first, MIT second and Stanford third, although China continues to make rapid progress. For many parts of the world, such as Latin America, Southern Europe and Africa, scientific achievement is extremely limited. Looking at the Asia-Pacific rankings, much of the region, including Indonesia, Bangladesh and the Philippines, is almost a scientific desert.

Sunday, March 23, 2014

At Last! A Really Useful Ranking

Wunderground lists the top 25 snowiest universities in the US.

The top five are:

1.  Syracuse University
2.  Northern Arizona University (that's interesting)
3.  University at Buffalo, SUNY
4.  Montana State University
5.  University of Colorado Boulder

Tuesday, March 04, 2014

Reactions to the QS Subject Rankings

It looks as though the QS subject rankings are a big hit. Here is just a sample of headlines and quotations from around the world.

World Ranking Recognises Agricultural Excellence at Lincoln [New Zealand]

CEU [Central European University, Hungary] Programs Rank Among the World's Top 100

Boston-Area Schools Rank Top in the World in These 5 Fields

"Cardiff has been ranked as one of the top universities in the world in a number of different subjects, according to a recent international league table."

NTU [National Taiwan University] leads local universities making QS rankings list

Swansea University continues to excel in QS world subject rankings

Penn State Programs Rank Well in 2014 QS World Rankings by Subject

Anna Varsity [India] Enters Top 250 in QS World Univ Rankings

Moscow State University among 200 best in the world

New Ranking Says Harvard And MIT Are The Best American Universities For 80% of Academic Subjects

QS: The University of Porto ranked among the best in the world

4 Indian Institutions in 2014 World Ranking

"The Institute of Education [London] has been ranked as the world's leading university for Education in the 2014 QS World University Rankings."

Nine UvA [University of Amsterdam] subject areas listed in QS World University Rankings top 50

"The University of Newcastle's [Australia] Civil and Structural Engineering discipline has surged in the QS World University Rankings by Subject list"


Sunday, March 02, 2014

The QS Subject Rankings: Reposting

QS have come out with their 2014 University Rankings by Subject, three months earlier than last year. Maybe this is to get ahead of Times Higher Education, whose latest Reputation Rankings will be published next week.

The methodology of these rankings has not changed since last year, so I am reposting my article, which was first published in the Philippine Daily Inquirer on 27 May 2013 and then reposted here on 29 May.

The QS University Rankings by Subject: Warning 

It is time for the Philippines to think about constructing its own objective and transparent ranking or rating systems for its colleges and universities that would learn from the mistakes of the international rankers.

The ranking of universities is getting to be big business these days. There are quite a few rankings appearing from Scimago, Webometrics, University Ranking of Academic Performance (from Turkey), the Taiwan Rankings, plus many national rankings.

No doubt there will be more to come.

In addition, the big three of the ranking world—Quacquarelli Symonds (QS), Times Higher Education and Shanghai Jiao Tong University’s Academic Ranking of World Universities—are now producing a whole range of supplementary products, regional rankings, new university rankings, reputation rankings and subject rankings.

There is nothing wrong, in principle, with ranking universities. Indeed, it might be in some ways a necessity. The problem is that there are very serious problems with the rankings produced by QS, even though they seem to be better known in Southeast Asia than any of the others.
This is especially true of the subject rankings.

No new data

The QS subject rankings, which have just been released, do not contain new data. They are mostly based on data collected for last year’s World University Rankings—in some cases extracted from the rankings and, in others, recombined or recalculated.

There are four indicators used in these rankings. They are weighted differently for the different subjects and, in two subjects, only two of the indicators are used.

The four indicators are:

1.  A survey of academics, or of people who claim to be or used to be academics, drawn from a variety of sources. This is the same indicator used in the world rankings. Respondents were asked to name the best universities for research.
2.  A survey of employers, which seems to comprise anyone who chooses to describe himself or herself as an employer or a recruiter.
3.  The number of citations per paper. This is a change from the world rankings, where the calculation was citations per faculty member.
4.  The h-index. This is easier to illustrate than to define. If a university publishes one paper and that paper is cited once, it gets an index of one. If it publishes two or more papers and at least two of them are cited at least twice each, the index is two, and so on. The h-index thus combines the quantity of research with its quality, as measured by influence on other researchers.
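The h-index rule described above can be sketched in a few lines of Python. This is purely an illustration of the definition (the largest h such that h papers have at least h citations each), not any code QS itself uses:

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

# One paper cited once -> index of one; two papers cited twice each -> index of two.
print(h_index([1]))            # 1
print(h_index([2, 2]))         # 2
print(h_index([10, 5, 3, 1]))  # 3
```

Note that a university with a handful of very highly cited papers and one with many moderately cited papers can end up with the same index, which is exactly the blend of quantity and influence described above.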

Out of these four indicators, three are about research and one is about the employability of a university’s graduates.

These rankings are not at all suitable for use by students wondering where they should go to study, whether at undergraduate or graduate level.

The only part that could be of any use is the employer review and that has a weight ranging from 40 percent for accounting and politics to 10 percent for arts and social science subjects, like history and sociology.

But even if the rankings are to be used just to evaluate the quantity or quality of research, they are frankly of little use. They are dominated by the survey of academic opinion, which is not of professional quality.

There are several ways in which people can take part in the survey. They can be nominated by a university, they can sign up themselves, they can be recommended by a previous respondent or they can be asked because they have subscribed to an academic journal or an online database.

Apart from checking that they have a valid academic e-mail address, it is not clear whether QS makes any attempt to check whether the survey respondents are really qualified to make any judgements about research.

Not plausible

The result is that the academic survey and also the employer survey have produced results that do not appear plausible.

In recent years, there have been some odd results from QS surveys. My personal favorite is the New York University Tisch School of the Arts, which set up a branch in Singapore in 2007 and graduated its first batch of students from a three-year film course in 2010. In the QS Asian University Rankings of that year, the Singapore branch got zero for the other criteria (presumably the school did not submit data), but it was ranked 149th in Asia for academic reputation and 114th for employer reputation.

Not bad for a school that had yet to produce any graduates when the survey was taken early in the year.

In all of the subject rankings this year, the two surveys account for at least half of the total weighting and, in two cases, Languages and English, all of it.

Consequently, while some of the results for some subjects may be quite reasonable for the world top 50 or the top 100, after that they are sometimes downright bizarre.

The problem is that although QS has a lot of respondents worldwide, at the subject level there can be very few. In pharmacy, for example, there are only 672 respondents for the academic survey and, in materials science, only 146 for the employer survey. Since the leading global players will get a large share of the responses, universities further down the list will be getting only a handful of responses each. The result is that the order of universities in any subject in a single country like the Philippines can be decided by just one or two responses to the surveys.

Another problem is that, after a few obvious choices like Harvard, MIT (Massachusetts Institute of Technology) and Tokyo, most respondents probably rely on a university's general reputation, and that can lead to all sorts of distortions.

Many of the subject rankings at the country level are quite strange. Sometimes they even include universities that do not offer courses in that subject. We have already seen that there are universities in the Philippines that are ranked for subjects that they do not teach.

Somebody might say that maybe they are doing research in a subject while teaching in a department with a different name, such as an economic historian teaching in the economics department but publishing in history journals and getting picked up by the academic survey for history.

Maybe, but it would not be a good idea for someone who wants to study history to apply to that particular university.

Another example is from Saudi Arabia, where King Fahd University of Petroleum and Minerals was apparently top for history, even though it does not have a history department or indeed anything where you might expect to find a historian. There are several universities in Saudi Arabia that may not teach history very well but at least they do actually teach it.

These subject rankings may have a modest utility for students who can pick or choose among top global universities and need some idea whether they should study engineering at SUNY (State University of New York) Buffalo (New York) or Leicester (United Kingdom) or linguistics at Birmingham or Michigan.

But they are of very little use for anyone else.