Monday, June 23, 2014

It Was the God Particle!

The truth about Panjab University's (PU) rise in the Times Higher Education World University Rankings -- and in no other ranking -- is revealed in the Times of India.

Shimona Kanwar notes:

"The paper on the discovery of Higgs boson particle better known as the God particle, which earned the Nobel Prize in Physics last year, has come as blessing for Panjab University (PU). PU earned an overall score of 40.2, most of which has been contribution of citations from the university's publications. The paper on the God particle had 10,000 citations, which helped immensely give the numero uno status to PU in the country.
The Times Higher Education Asia University ranking-2014 had four parameters - teaching, international outlook, industry income, research and citations. Out of the 30% score on citations, 84.7 was the top score, which gave the university an edge over the other 21 participating universities. This included Jawaharlal Nehru University, Delhi University, Aligarh Muslim University and IIT Kharagpur among others. Though the CERN project which was associated with the discovery of the God particle involved participation from Delhi University as well, a huge number of PhD students in the project from PU apparently contributed in this rank. "We had build parts of a detector, contributed for the hardware, software and physics analysis in the Compact Muon Solenoid (CMS) stage of the God particle discovery," said Prof Manjit Kaur of PU, who was part of the project.
Panjab University had 12-15 PhD students and five faculty members from the department of physics who worked in collaboration for the prestigious project."

A couple of things are missing though. Delhi University (DU) also joined the project but did not even get into the top 100 of the Asian rankings. How come? It wasn't those doctoral students. It was probably (we can't be certain without seeing the scores for all the indicators) because, although PU had fewer citations than DU over the relevant period, it also had significantly fewer papers to divide them by.

The trick to getting on in the THE rankings is not just to get lots of citations in the right field and the right year and the right country but also to make sure the total number of papers doesn't get too high.
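To see the arithmetic, here is a toy calculation in Python with invented numbers (THE do not publish the underlying counts, so both universities here are hypothetical): a single massively cited paper is worth far more to a small producer than to a large one.

    # Invented numbers for illustration only -- not actual THE or TR data.
    def citations_per_paper(citations, papers):
        return citations / papers

    # Both hypothetical universities share credit for one 10,000-citation paper.
    small_producer = citations_per_paper(10_000 + 2_000, 1_500)    # 8.0
    large_producer = citations_per_paper(10_000 + 20_000, 15_000)  # 2.0
    print(small_producer, large_producer)

The bonanza paper leaves the small producer with four times the citations-per-paper average of the large one, purely because of the smaller denominator.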

And, as I noted yesterday, if TR, THE's data collectors, do what they have done for the Highly Cited Researchers database and stop counting physics publications with more than 30 affiliations, then PU will almost certainly fall out of the rankings altogether.

Thursday, June 19, 2014

The New Highly Cited Researchers List

Citations have become a standard feature of global university rankings, although they are measured in very different ways. Since 2003 the Shanghai Academic Ranking of World Universities has used the list of highly cited researchers published by Thomson Reuters (TR), who have now prepared a new list of about 3,500 names to supplement the old one, which has 7,000-plus.

The new list got off to a bad start in 2013 because the preliminary version was based on a faulty procedure and because of problems with the assigning of papers to fields or subfields. This led to ARWU having to repeat the 2012 scores for their highly cited researchers indicator in their 2013 rankings.

The list contains a number of researchers who appear more than once. Just looking at the Harvard researchers for a few minutes, I have noticed that David M Sabatini, primary affiliation MIT, with a secondary affiliation at the Broad Institute of MIT and Harvard, is listed for Biology and Biochemistry and also for Molecular Biology and Genetics.

Eric S Lander, primary affiliation with the Broad Institute of MIT and Harvard and secondary affiliations with MIT and Harvard, is listed three times: for Biology and Biochemistry, Clinical Medicine, and Molecular Biology and Genetics.

Frank B Hu, primary affiliation with Harvard and secondary affiliation with King Abdulaziz University, Saudi Arabia, is listed under Agricultural Sciences, Clinical Medicine and Molecular Biology and Genetics.

This no doubt reflects the reality of scientific research, in which a single researcher might well excel in two or more closely related fields, but if ARWU are just going to count the number of researchers in the new list there will be distortions if some are counted more than once.

The new list refers to achievements over the period 2002-12. Unlike the old list, which just counted the number of citations, the new one is based on normalisation by field -- 21 in this case -- and by year of publication. In other words, it is not the raw number of citations that matters but the number in relation to the world average for the field and year of publication.

TR acknowledge that there is a problem resulting from the growing number of massively cited, multi-authored papers and reviews, especially in the subfields of Particle and High-Energy Physics. To deal with this issue they have excluded from the analysis papers in Physics with more than thirty institutional addresses.
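As a rough sketch of what such a calculation might look like, here is a toy version in Python. The data model, the world baselines and the papers are all invented, and the real TR methodology is certainly more elaborate, but the exclusion rule and the field-and-year normalisation work along these lines.

    from statistics import mean

    # Invented world baselines: average citations for a paper in a given
    # field and publication year. TR would derive these from the full database.
    WORLD_AVERAGE = {("Physics", 2010): 12.0, ("Clinical Medicine", 2011): 9.0}

    def normalised_impact(papers, max_physics_addresses=30):
        # Mean of citations relative to the field/year world average,
        # excluding Physics papers with too many institutional addresses.
        scores = []
        for field, year, citations, addresses in papers:
            if field == "Physics" and addresses > max_physics_addresses:
                continue  # the exclusion described above
            scores.append(citations / WORLD_AVERAGE[(field, year)])
        return mean(scores) if scores else 0.0

    papers = [
        ("Physics", 2010, 1500, 179),        # LHC-style mega-paper: excluded
        ("Physics", 2010, 6, 2),             # scores 0.5
        ("Clinical Medicine", 2011, 18, 1),  # scores 2.0
    ]
    print(normalised_impact(papers))  # 1.25; without the exclusion it would be about 42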

I do not know if TR are planning on doing this for their data for the Times Higher Education World University Rankings. If they are, places like Panjab University are in for a nasty shock.

Another noticeable thing about the new lists is the large number of secondary affiliations. In many cases the joint affiliations seem quite legitimate. For example, there are many researchers in subjects such as Biology and Biochemistry with affiliations to an Ivy League school and a nearby hospital or research institute. On the other hand, King Abdulaziz University in Jeddah is listed as a secondary affiliation 150 times. Whether Thomson Reuters or ARWU will be able to determine that these represent a genuine association is questionable.

The publication of the new lists is further evidence that citations can be used to measure very different things. It would be unwise for any ranking organisation to use only one citations-based indicator or only one database.





Wednesday, June 18, 2014

The Circulation of Data

Times Higher Education has a piece about the most highly educated cities in the world. First, of course, is London, followed by Paris, Los Angeles, San Francisco (presumably including Berkeley) and Stockholm. The data come from a report by PricewaterhouseCoopers, the international professional services firm, which includes information about the percentage of the population with degrees and the ranking of universities in each city by (surprise!) Times Higher Education.

Boston is not in the top ten because it was not evaluated by PricewaterhouseCoopers.

Note that the rankings indicator is based only on universities that actually take part in the THE rankings. So London's score would not be affected by places like London Metropolitan University or the University of East London.

Looking at the PricewaterhouseCoopers report, the most important indicator might be the PISA scores, which suggest that the future belongs not to London or Paris but to Shanghai.

Friday, June 06, 2014

Are they having fun in Shanghai?

Harvey Mudd College is a very expensive, highly ranked private school with a strong emphasis on the teaching of engineering and technology. The US News and World Report 2013 rankings have it in 12th place among national liberal arts colleges, second among engineering schools whose highest degree is the master's, and fourth for computer engineering. Even so, it seems that some feel it is a failure because it is not getting enough women to take courses in key disciplines such as computer science.

The new president, Maria Klawe, is taking care of that.

The new introductory class for Computer Science at Harvey Mudd is designed for those who did not go to computer camp in high school and is supposed to be interesting. Students edit Darth Vader's voice, and on one test the answer to every question is 42 (guess what the media would say if that happened in an underachieving inner-city high school). If you are not amused by the joke about 42 you should forget about going to Harvey Mudd.

The course used to be about programming and was dominated by "geeky know-it-alls" who have now been told to mind their manners and shut up. Programming in Java has been replaced by Python.

"It was so much fun; it was so much fun" said one student.

Also, all female first-year students attend a conference on women in computing.

And so, at Harvey Mudd, 40% of computer science majors are now women. Bridgette Eichelberger switched from engineering to computer science because the fun of engineering was nothing compared to the happiness of computer science.

Meanwhile over at Berkeley, the introductory computer science course is now called the "Beauty and Joy of Computing".

Someday universities in Shanghai, Seoul and Taipei may start turning their faculties of science and engineering into places where the daughters of the 1%, or maybe the 5%, can find fun and happiness and from which repellent geeks and nerds have been cleansed. Until that happens, the universities and corporations of the US have cause to be very, very afraid.






Thursday, June 05, 2014

The World's Top Global Thinkers

And finally the results are out. The world's leading thinker, according to a poll conducted by Prospect magazine, is the economist Amartya Sen, followed by Raghuram Rajan, Governor of the Reserve Bank of India, and the novelist Arundhati Roy.

Sen received degrees from the University of Calcutta and Cambridge and has taught at Jadavpur University, LSE, Oxford, Cambridge and Harvard. Rajan has degrees from IIT Delhi, the Indian Institute of Management Ahmedabad and MIT. Roy studied architecture at the School of Planning and Architecture in Delhi.

The careers of Sen and Rajan illustrate a typical feature of Indian higher education: some excellent undergraduate teaching, but somehow the outstanding students end up leaving India.

Prospect notes that the poll received "intense media interest in India", so it would be premature to conclude that the country has become the new global Athens.

The top non-Asian thinker is Pope Francis.

Personally, I am disappointed that the historian Perry Anderson only got 28th place. I am also surprised that feminist and queer theorist Judith Butler, whose brilliant satire -- Hamas as part of the global left and so on -- is under-appreciated, was only 21st.

Tuesday, June 03, 2014

Two Articles in New York Times

Sunday's New York Times had two articles on international rankings. The first, by D. D. Guttenplan, is 'Re-Evaluating the College Rankings Game' and includes interviews with Angela Yung Chi Hou of the Higher Education Evaluation and Accreditation Council of Taiwan, Ellen Hazelkorn and myself.

The second, by Aisha Labi, is about the recently published U-Multirank rankings, which are sponsored by the European Union.

Monday, June 02, 2014

What should India do about the rankings?

India seems to be suffering from ranking fever. This is a serious problem that periodically sweeps across countries, with the national media echoing statements from university heads and bureaucrats about being in the top one or two hundred of something in the next few years, or passionate claims that rankings do not reflect the needs of local society or that the uniquely transformative features of this or that institution -- how dare they ignore our sensitivity training or sustainability programs! -- are not recognised by the rankers.

There is now a lot of debate about which is the best university in India and also about why Indian institutions, especially the sometimes lauded Indian Institutes of Technology (IITs), have a modest impact on the international rankings.

So what do the various rankings say about the quality of Indian universities (counting the IITs and other institutes)? Starting with Webometrics, which measures the Internet presence of universities, first place in India goes to IIT Bombay, 517th in the world, followed by IIT Madras, the Indian Institute of Science (IISc) in Bangalore, IIT Kanpur and the University of Delhi.

Moving on to research-based rankings, only one Indian university is ranked in Shanghai Jiao Tong University's Academic Ranking of World Universities (ARWU) top 500, and that is the IISc, in the 301-400 band.

The Scimago Institutional Rankings 2013 default list, ordered by number of publications, also puts IISc in first place in India, followed by IIT Kharagpur, IIT Delhi, the University of Delhi, and IIT Madras.

The Leiden Ranking has the IISc in first place for number of publications although IIT Roorkee is first for publications in high quality journals.

Looking at the research-only rankings, then, the best bet for top place would be the IISc, which is ranked first in India by ARWU, by Scimago and, for number of publications, by the Leiden Ranking, although for quality of research the IITs at Roorkee, Delhi and Guwahati perform comparatively well.

Moving on to rankings that attempt to assess factors other than research, we find that in the most recent QS World and Asian University Rankings first place in India goes to IIT Delhi with IIT Bombay second and IIT Kanpur third.

Last year's Times Higher Education world rankings produced an unusual result. Panjab University (PU) was ranked in the 226-250 band, well ahead of the IITs at Delhi, Kanpur, Kharagpur and Roorkee in the 350-400 band. Panjab University's feat was entirely due to its massive score for citations, 84.7 compared to IIT Delhi's 38.5, a score that was in stark contrast to its very poor 14 for research.

The main reason for PU's whopping score for citations appears to be that a few of its physicists are involved in the Large Hadron Collider project, which involves more than 2,000 physicists at more than 150 research centers in 37 countries and consequently produces a huge number of citations. PU gets the credit for all of those citations even though its contribution to the cited papers is extremely small.

This only works because the overall number of papers produced is low. Hundreds or even thousands of citations are of little incremental value if they are spread out over thousands or tens of thousands of papers.

It would be unwise for other Indian universities to emulate PU's approach to getting into the THE rankings. For one thing, they would have to keep total publications low. For another, they might find that their highly cited researchers become a tempting target for universities in the US or Australia. And the trick does not work for any other ranking.

It is noticeable that the IISc is not included in the QS or THE rankings, presumably by its own choice.

Should India's universities try to improve their ranking performance? Perhaps, but it would be better if they focused on improving their research performance, admissions policies, administration and selection processes. And here there is a much bigger problem for India: the utterly dismal performance of the country's school system.

In 2009, students from Tamil Nadu and Himachal Pradesh, which do better than the Indian average on social and economic development measures, took the PISA test. They finished just ahead of bottom-ranked Kyrgyzstan.

Riaz Haq writes:

"In Tamil Nadu, only 17% of students were estimated to possess proficiency in reading that is at or above the baseline needed to be effective and productive in life. In Himachal Pradesh, this level is 11%. “This compares to 81% of students performing at or above the baseline level in reading in the OECD countries, on an average,” said the study. 
The average Indian child taking part in PISA2009+ is 40 to 50 points behind the worst students in the economic superstars. Even the best performers in Tamil Nadu and Himachal Pradesh - the top 5 percent who India will need in science and technology to compete globally - were almost 100 points behind the average child in Singapore and 83 points behind the average Korean - and a staggering 250 points behind the best in the best.
The average child in HP & TN is right at the level of the worst OECD or American students (only 1.5 or 7.5 points ahead). Contrary to President Obama's oft-expressed concerns about American students' ability to compete with their Indian counterparts, the average 15-year-old Indian placed in an American school would be among the weakest students in the classroom, says Lant Pritchett of Harvard University. Even the best TN/HP students are 24 points behind the average American 15 year old."

If this does not change, there is very little that anyone can do to improve the status of India's universities, apart from importing large numbers of Finnish, Chinese or Korean students, teachers and researchers.

Thursday, May 22, 2014

The Twitter Rankings

Coldlime, an Internet marketing company, has produced rankings of British and American universities according to Twitter activity.

The five most influential universities in the UK are:

1.   Plymouth
2.   Glasgow
3.   Birmingham
4.   East Anglia
5.   Swansea

The top five in the US are:

1.   Texas at Austin
2.   Wisconsin at Madison
3.   Indiana at Bloomington
4.   Harvard
5.   Florida

Update on Alexandria University

I recently wrote that Alexandria University might be making a comeback. There is some more evidence that this might be the case. The Leiden Ranking now includes 750 universities and Alexandria is there. For most subjects and settings it does not do particularly well, but there do seem to be some widely cited publications in Earth and Environmental Sciences plus, of course, a high score for citations in Maths, Computer Science and Engineering. There are not enough papers in Cognitive Sciences or Social Sciences for it to be ranked in those fields.

I received a comment on the previous Alexandria post from "Nadia" that appeared to praise Dr El Naschie and also made apparently defamatory remarks about a British judge and at least two Egyptian public figures. Since this blog has a policy of not publishing possibly libellous comments, it was not approved. However, if "Nadia" should wish to send a comment that is comprehensible and does not contain personal attacks, it will be published. It would help if she or he could verify their identity.

Sunday, May 11, 2014

Global Elite Threatened by New Rivals?

The Centre for Science and Technology Studies (CWTS) at Leiden University has produced its annual Leiden Ranking. I hope to write something more general in a couple of days but for the moment here is an elaboration on a brilliant headline by Julie Hare in The Australian.

UTS kicks Harvard's butt in ranking

Well, sort of. CWTS provide data for nine different indicators and seven different subject groups, and there are settings for size and fractional counting. In addition, you can vary the number of papers required for inclusion.

Try clicking on Medical Sciences and the impact indicator PP(top 10%) (the proportion of publications among the top 10% most frequently cited), keeping the default settings for size-independence and fractional counting and making sure the threshold stays at 100 papers. First is Rockefeller University, which had 305 papers in Medical Sciences over the four-year period, of which a third were in the top 10%. University of Technology Sydney (UTS) had 115 papers, of which 22% were in the top 10%: that is about 25.3 papers over four years, or 6.325 (remember, this is fractional counting) papers a year.

Poor Harvard is fifth, with 11,958 papers in Medical Sciences, of which 2,523.138 are in the top tenth. But that is only 21.1%.

Change the threshold to 500 papers and UTS disappears. Stop using fractional counting and it goes down to 20th. Uncheck "calculate size-independent indicators" and it sinks to 438th.
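The swings are easy to reproduce with a toy version of the indicator. The paper counts and fractional weights below are invented, and the real CWTS calculation also normalises by field and year, but the logic of the settings is the same.

    def pp_top10(papers, size_independent=True, fractional=True, threshold=100):
        # papers: list of (fractional_weight, in_top_10_percent) pairs
        total = sum((frac if fractional else 1.0) for frac, _ in papers)
        if total < threshold:
            return None  # drops out of the ranking below the publication threshold
        top = sum((frac if fractional else 1.0) for frac, is_top in papers if is_top)
        return top / total if size_independent else top

    # A small producer with a high hit rate versus a huge one with a slightly lower rate.
    uts_like = [(0.5, True)] * 50 + [(0.5, False)] * 180          # ~115 papers, ~21.7% in top 10%
    harvard_like = [(0.8, True)] * 3200 + [(0.8, False)] * 11750  # ~11,960 papers, ~21.4%

    for uni in (uts_like, harvard_like):
        print(pp_top10(uni), pp_top10(uni, threshold=500), pp_top10(uni, size_independent=False))

Flip the settings and the ordering flips with them, which is exactly what happens on the CWTS site.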

There is nothing wrong with presenting data in this much detail: in fact it can be very valuable. There are very probably a few very able researchers at UTS and it is helpful that they have been uncovered. But it would not be a good idea if research grants were to flow to UTS or students to apply there because of a handful of excellent papers a year in one subject.

So, if you manipulate the settings and parameters, there is a good chance that all sorts of institutions will display a pocket of excellence or two. For citations in Cognitive Sciences, the University of Puerto Rico (102 papers) is ahead of University College London. For citations in Earth and Environmental Sciences, Universiti Sains Malaysia is ahead of Rice. For high-quality papers in Maths, Computer Science and Engineering, Universiti Malaya beats Yale, and for citations Universiti Kebangsaan Malaysia is fourth in the world, although it disappears if the threshold is moved to 500 papers.

Again, this could be useful information but it would be dangerous to assume that these universities now present a major threat to Oxbridge and the Ivy League.

And yes, Alexandria University is there, in 211th place for citations in Maths, Computer Science and Engineering (check size-independent calculations, check fractional counting, set the threshold at 100 papers).


Bibliography 1

The European Journal of Education has a special free issue on university rankings. See HERE.

Friday, May 02, 2014

Under-50 Universities: Comparing THE and QS

Times Higher Education has just released its list of new universities, those that are less than 50 years old.
The top ten are:
1.   Pohang University of Science and Technology (Postech)
2.   EPF Lausanne
3.   Korea Advanced Institute of Science and Technology (KAIST)
4.   Hong Kong University of Science and Technology
5.   Nanyang Technological University, Singapore
6.   Maastricht University
7.   University of California Irvine
8.   Universite Paris-Sud 11
9.   Universite Pierre et Marie Curie
10. Lancaster University

And here are the top ten from QS's top new universities:

1.   Hong Kong University of Science and Technology
2.   Nanyang Technological University
3.   KAIST
4.   City University of Hong Kong
5.   Pohang University of Science and Technology
6.   Maastricht
7.   University of California Irvine
8.   Hong Kong Polytechnic University
9.   Autonomous University of Barcelona
10. Antwerp University

In some respects the two lists are quite similar. KAIST is in 3rd place, Maastricht in 6th and Irvine in 7th in both lists. Both have two Korean institutions in the top five.

However, there are some noticeable differences, showing the effect of methodology and weighting. There are three Hong Kong universities in QS's top ten but only one in THE's, probably reflecting the greater weight given to internationalisation and reputation in the former. City University of Hong Kong is 4th in the QS rankings and 17th in THE's. Hong Kong Polytechnic University gets 8th and 30th place respectively.

Meanwhile, the two French universities in the THE top ten, helped by very substantial scores for citations, are not ranked at all by QS, although they had more than enough points in the world rankings. This could be because of different interpretations of when full university status was achieved.

Wednesday, April 30, 2014

Will Alexandria University Make a Comeback?

In 2010 the first edition of the new model Times Higher Education World University Rankings -- powered by Thomson Reuters -- caused amusement and consternation by placing Alexandria University in fourth place in the world for research impact and in the top 200 overall.

This extraordinary achievement was entirely the result of the writings of one man, Dr Mohamed El Naschie, "of" several universities, one of which was Alexandria University. By citing himself and being cited by others for papers published in a field in which there are normally few citations, especially in the first two years after publication, El Naschie pushed the university to a huge score for citation impact.

Anyone interested in the El Naschie story can consult the blog El Naschie Watch. An appraisal of his work's scientific merit can be found in Mrs Justice Sharp's 2012 legal judgement.

In 2011 TR tweaked the citations indicator a bit and managed to get Alexandria's citations score down to 61.4, which was still massively disproportionate to its score for research and its overall score. Then in 2012 it disappeared from the top 400 altogether.

Still, the university did not give in. Like an ageing boxer trying for ever more obscure titles, Alexandria showed up in 93rd place in the 2013 THE BRICS and Emerging Economies rankings with a still creditable 31.5 for citations. That score of course represented citations of El Naschie's papers in the years up to 2009, after which he stopped publishing in Web of Science journals. One would expect the score to dwindle further as the number of his countable papers diminishes year by year.

It seemed that Alexandria was destined to fade away into the legions of the world's unranked universities. After his month of wonders in September 2009, when he published eight papers in a single issue of Chaos, Solitons and Fractals, El Naschie published nothing in indexed journals in 2010, 2011 or 2012.

But in July 2013 El Naschie had a paper in the Russian journal Gravitation and Cosmology. Eleven of the 31 cited references were to his own works. That could be a useful boost for Alexandria. However, the paper so far remains uncited.

El Naschie gave Alexandria University as his affiliation and reprint address although the email address appears to be a relic of his days as editor of Chaos Solitons and Fractals.

Will there be more indexed papers from El Naschie? Will Alexandria return to the world rankings?





Sunday, April 27, 2014

The Continued Oppression of Women at Oxford

As the advance of women through academia continues there are a few stubborn pockets of non-compliance. It appears that one is Oxford, where in some schools men are more likely to get first class degrees than women. This has excited much comment among educational experts who are generally unconcerned about the poor or declining performance of men in most subjects in most British universities.

Back in 1993 a study by McNabb, Pal and Sloane in Economica found that men were more likely than women to get first class degrees at English and Welsh universities. They were also more likely to get third class, pass or "other" degrees and less likely to get upper seconds but that did not seem to cause much concern.

A recent report by the Higher Education Funding Council for England has discovered that women have caught up with men as far as firsts are concerned while men are still behind with regard to upper seconds and continue to get more third class and other poor degrees.

But there is still work to do. There remain some subjects in some places that have defied global and national trends.

One of these is Oxford, where a third of male students got firsts last year compared with a quarter of women. Men were ahead in 26 out of 38 subjects (and presumably behind or equal in 12, although nobody seems very bothered about that). The gap was particularly large in Chemistry, English and History.

What is the reason for the relatively poor performance of Oxford women in English, Chemistry and History (the relatively poor performance of men in other fields obviously requires no explanation)? A female English student says it has something to do with the confidence engendered by "a certain type of all-male public [i.e. private] school". That assumes that it is students from all-male public schools, and not state school nerds, who are getting all those firsts.

Deborah Cameron, Professor of Language and Communication at Oxford University, whose career has obviously failed to reach its full potential because of male bias, claims that it is because borderline first/upper-second men are pushed by their tutors in a way that women are not. Is there any real evidence for this?

None of this is new. There was a similar report in 2013. Men were ahead in Politics, Philosophy and Economics, incubator of future politicians, and Modern Languages but behind in Jurisprudence and Classics.

There will no doubt be soul-searching, reports, workshops and committees, and in the end the imbalance will be rectified, probably by supplementing written exams with coursework and assignments and shifting the border between firsts and upper seconds a bit.

I suspect though that it would be more helpful to read Julian Tan in the Huffington Post who writes that he got a first at Oxford by not travelling during spring breaks, saying no to nights out, revising instead of going to the college ball, not sleeping much, not spending much and worrying and complaining too much.

Tan notes that he was in the top four per cent for his subject (he said fourth percentile, but that wouldn't get him a first anywhere), so he could probably have had a few trips or nights out before slipping into 2(i) territory. I suspect, though, that he may have located the secret of the surviving pockets of male supremacy: the bizarre medical condition that causes some, mainly male, students or employees to find writing code, sitting in archives, reading about how to put out fires or fiddling around with SPSS files more interesting than social relationships, sharing interactive moments or exploring one's emotions.


Saturday, April 19, 2014

Should New Zealand Worry about the Rankings?

The Ministry of Education in New Zealand has just published a report by Warren Smart on the performance of the country's universities in the three best known international rankings. The report, which is unusually detailed and insightful, suggests that the eight universities -- Auckland, Otago, Canterbury, Victoria University of Wellington, Massey, Waikato, Auckland University of Technology and Lincoln -- have a mixed record with regard to the Shanghai rankings and the Times Higher Education (THE)-Thomson Reuters World University Rankings. Some are falling, some are stable and some are rising.

But things are a bit different when it comes to the QS World University Rankings. There the report finds a steady and general decline both overall and on nearly all of the component indicators. According to the New Zealand Herald this means that New Zealand is losing the race against Asia.

However, looking at the indicators one by one it is difficult to see any consistent and pervasive decline, whether absolute or relative.

Academic Survey

It is true that scores for the academic survey fell between 2007 and 2013, but one reason for this could be that the percentage of responses from New Zealand fell dramatically, from 4.1% in 2007 to 1.2% in 2013 (see University Ranking Watch, 20th February). This probably reflects the shift from a survey based on the subscription lists of World Scientific, a Singapore-based academic publishing company, to one with several sources, including a sign-up facility.

Employer survey

In 2011 QS reported that there had been an enthusiastic response to the employer opinion survey from Latin America and found it necessary to cap the scores of several universities where there had been a disproportionate response. One consequence was that the overall mean for this indicator rose dramatically, so that universities received much lower scores in that year for the same number of responses. QS seem to have rectified the situation, so that scores for New Zealand universities -- and many others -- recovered to some extent in 2012 and 2013.

Citations per faculty and faculty student ratio

From 2007 to 2010 or 2011 scores fell for the citations per faculty indicator but have risen since then. The report notes that "the recent improvement in the citations per faculty score by New Zealand universities had not been matched by an increase in their academic reputations score, despite the academic reputation survey being focused on perceptions of research performance."

This apparent contradiction might be explained by the declining number of survey respondents from New Zealand noted above. Also, we should not forget the number on the bottom: a fall in the recorded number of faculty could have the same effect as an increase in citations. It is interesting that while the score for faculty student ratio for five universities -- Auckland, Canterbury, Otago, Victoria University of Wellington and Waikato -- went down from 2010 to 2012, the score for citations per faculty went up. Both changes could result from a decline in the number of faculty submitted by universities or recorded by QS. In only one case, Massey, did both scores rise. There was insufficient data for the other two universities.

International Faculty and International Students

The scores for international faculty have always been high and are likely to remain so. The scores for international students have been slipping but this indicator counts for only 5% of the total weighting.

New Zealand universities might benefit from looking at the process of submission of data to QS. Have they submitted lists of potential survey respondents? Are they aware of the definitions of faculty, students, international and so on? That might be more productive than worrying about a deep malaise in the tertiary sector.

And perhaps New Zealand salt producers could send out free packets every time the media have anxiety attacks about the rankings.




Thursday, April 17, 2014

The Scimago Ibero-America Ranking

In February the SCImago Research Group published its annual Ibero-American Institutions Ranking. This is not a league table but a research tool. The default order is by number of publications in the Scopus database over the period 2008-2012. The top five are:

1.  Universidade de Sao Paulo

2.  Universidade de Lisboa

3.  Universidad Nacional Autonoma de Mexico

4.  Universidade Estadual Paulista Julio de Mesquita Filho

5.  Universitat de Barcelona

Friday, April 11, 2014

Why are Britain's Universities Still Failing Male Students?

I doubt that you will see a headline like that in the mainstream media.

A report from the Higher Education Funding Council for England (Hefce) has shown that students who classify themselves as White do better than Black or Asian students who got the same grades at A level. Mixed-race students are in between. The difference persists even when universities and subjects are analysed separately.

Aaron Kiely in the Guardian says that this "suggests that higher education institutions are somehow failing black students, which should be a national embarrassment."

He then goes on to recount a study by the National Union of Students (NUS) that indicated that Black students suffered institutional barriers that eroded their self-esteem and confidence and that seven per cent said that the university environment was racist. 

A similar conclusion was drawn by Richard Adams also in the Guardian. He quoted Rachel Wenstone of the NUS as saying that it was "a national shame that black students and students from low participation backgrounds are appearing to do worse in degree outcomes than other students even when they get the same grades at A level."

It is interesting that the Hefce report also found that female students were more likely to get a 2 (i) than male students with the same grades, although there was no difference with regard to first class degrees. Men were also more likely to fail to complete their studies.

So is anyone worrying about why men are doing less well at university?


Thursday, April 10, 2014

The Parochial World of Global Thinkers

The magazine Prospect has just published its list of fifty candidates for the title of global thinker. It is rather different from last year's. Number one in 2013, Richard Dawkins, biologist and atheist spokesman, is out. Jonathan Derbyshire, Managing Editor of Prospect, says in an interview with the magazine's Digital Editor that this is because Dawkins has been saying the same thing for several years. Presumably Prospect only noticed this year.

The list is top-heavy with philosophers and economists, and with Americans and Europeans. There is one candidate from China, one from Africa, one from Brazil and none from Russia. There is one husband and wife. A large number are graduates of Harvard or have taught there, and quite a few are from Yale, MIT, Berkeley, Cambridge and Oxford. One wonders if the selectors made some of their choices by going through the contents pages of New Left Review; so far I have counted six contributors.

There are also no Muslims. Was Prospect worried about a repetition of that unfortunate affair in 2008?

All in all, apart from Pope Francis, this does not look like a global list. Unless, that is, thinking has largely retreated to the humanities and social science faculties of California, New England and Oxbridge.








Tuesday, April 01, 2014

Comparing the THE and QS Reputation Rankings

This year's Times Higher Education (THE) Reputation Rankings were a bit boring, at least at the top, and that is just what they should be.

The top ten are almost the same as last year's. Harvard is still first and MIT is second. Tokyo has dropped out of the top ten to 11th place and has been replaced by Caltech. Stanford is up three places and is now third. Cambridge and Oxford are both down one place. Further down, there is some churning, but it is difficult to see any clear and consistent trends, although the media have done their best to find stories: UK universities falling or sliding or slipping, no Indian or Irish or African universities in the top 100.

These rankings may be more interesting for who is not there than for who is. There are some notable absentees from the top 100. Last year Tokyo Metropolitan University was, according to THE and data providers Thomson Reuters (TR), first in the world, along with MIT, for research impact. Yet it fails to appear in the top 100 of a reputation survey in which research has a two-thirds weighting. Rice University, joint first in the world for research impact with Moscow State Engineering Physics Institute in 2012, is also absent. How is this possible? Am I missing something?

In general, the THE-TR reputation survey, the data collection for which was contracted out to the pollsters Ipsos Mori CT, appears to be quite rigorous and reliable. Survey forms were sent out to a clearly defined group: researchers with papers in the ISI indexes. THE claim that this means that their respondents must be active producers of academic research. That is stretching it a bit. Getting your name on an article published in a reputable journal might mean a high degree of academic competence, or it could just mean having some sort of influence over the research process. I have heard a report about an Asian university where researchers were urged to put their heads of department on the list of co-authors. Still, on balance it seems that the respondents to the THE survey are mostly from a stable group, namely those who have made some sort of contribution to a research paper of sufficient merit to be included in an academic journal.

TR also appear to have used a systematic approach in sending out the survey forms. When the first survey was being prepared in 2010, they announced that the forms would be emailed according to the number of researchers recorded by UNESCO in 2007. It is not clear if this procedure has been followed strictly over the last four years. Oceania, presumably Australia and New Zealand, appears to have a very large number of responses this year, 10%, although TR reported in 2010 that UNESCO found only 2.1% of the world's researchers in that region.

The number of responses received appears reasonably large, although it has declined recently. In 2013 TR collected 10,536 responses, considerably fewer than the 16,639 collected in 2012. Again, it is not clear what happened.

The distribution of responses across subject areas has also changed somewhat. Since 2012 the proportion from the social sciences has gone from 19% to 22%, as has that from engineering and technology, while the proportion from the life sciences has gone from 16% to 22%.

QS do not publish a separate reputation ranking, but it is possible to filter their ranking scores to find out how universities performed on the academic survey.

The QS approach is less systematic. They started out using the subscription lists of World Scientific, a Singapore-based academic publishing company with links to Imperial College London. Then they added respondents from Mardev, a publisher of academic mailing lists, to beef up the number of names in the humanities. Since then the balance has shifted, with more names coming from Mardev and some topping up from World Scientific. QS have also added a sign-up facility through which people can apply to receive survey forms; this was suspended in April 2013 but has recently been revived. They have also asked universities to submit lists of potential respondents and asked respondents to suggest further names. The exact number of responses coming from each of these sources is not known.

Over the last few years QS have made their survey rather more rigorous. First, respondents were not allowed to vote for the universities where they were currently employed. Then they were restricted to one response per computer, and universities were forbidden to solicit votes or instruct staff whom to vote for or against. Then universities were told not to promote any form of participation in the surveys.

In addition to methodological changes, the proportion of responses from different countries has changed significantly since 2007, with a large increase from Latin America, especially Brazil and Mexico, from the USA and from the larger European countries, and a fall in responses from India, China and the Asia-Pacific region. All of this means that it is very difficult to figure out whether the rise or fall of a university reflects a change in methodology, a change in the distribution of responses, or a genuine shift in international reputation.

Comparing the THE-TR and QS surveys, there is some overlap at the top. The top five are the same in both, although in a different order: Harvard, MIT, Stanford, Oxford and Cambridge.

After that, we find that the QS academic survey favours universities in Asia-Pacific and Latin America. Tokyo is seventh according to QS but THE-TR have it in 11th place. Peking is 19th for QS and 41st for THE-TR. Sao Paulo is 51st on the QS indicator but in the 81-90 band in the THE-TR rankings. The National Autonomous University of Mexico (UNAM) is not even in THE-TR's top 100, but QS put it 48th.

On the other hand, Caltech, Moscow State University, Seoul National University and Middle East Technical University do much better with THE-TR than with QS.

I suspect that the QS survey is tapping a younger, less experienced pool of respondents from less regarded universities and from countries with high aspirations but, so far, limited achievements.






Sunday, March 30, 2014

The Nature Publishing Index

Nature has long been regarded as the best, or one of the two best, scientific journals in the world. Papers published there and in Science account for 20% of the weighting in Shanghai Jiao Tong University's Academic Ranking of World Universities, the same as Nobel and Fields awards or publications in the whole of the Science Citation and Social Science Citation Indexes.

Sceptics may wonder whether Nature has seen better years and is perhaps sliding away from the pinnacle of scientific publishing. It has had some embarrassing moments in recent decades, including the publication of a 1978 paper that gave credence to the alleged abilities of the psychic Uri Geller, the report of a study by Jacques Benveniste and others that purported to show that water has a memory, the questionable "hockey stick" article on global warming in 1998, and seven retracted papers on superconductivity by Jan Hendrik Schon.

But it still seems that Nature is highly regarded by the global scientific community and that the recently published Nature Publishing Index is a reasonable guide to current trends in scientific research. The index counts the number of publications in Nature journals in 2013.

The USA remains on top, with Harvard first, MIT second and Stanford third, although China continues to make rapid progress. For many parts of the world -- Latin America, Southern Europe, Africa -- scientific achievement is extremely limited. Looking at the Asia-Pacific rankings, much of the region, including Indonesia, Bangladesh and the Philippines, is almost a scientific desert.




Sunday, March 23, 2014

At Last! A Really Useful Ranking

Wunderground lists the top 25 snowiest universities in the US.

The top five are:

1.  Syracuse University
2.  Northern Arizona University (that's interesting)
3.  The University at Buffalo: SUNY
4.  Montana State University
5.   University of Colorado Boulder

Tuesday, March 04, 2014

Reactions to the QS Subject Rankings

It looks as though the QS subject rankings are a big hit. Here is just a sample of headlines and quotations from around the world.

World Ranking Recognises Agricultural Excellence at Lincoln [New Zealand]

CEU [Central European University, Hungary] Programs Rank Among the World's Top 100

Boston-Area Schools Rank Top in the World in These 5 Fields

"Cardiff has been ranked as one of the top universities in the world in a number of different subjects, according to a recent international league table."

NTU [National Taiwan University] leads local universities making QS rankings list

Swansea University continues to excel in QS world subject rankings

Penn State Programs Rank Well in 2014 QS World Rankings by Subject

Anna Varsity [India] Enters Top 250 in QS World Univ Rankings

Moscow State University among 200 best in the world

New Ranking Says Harvard And MIT Are The Best American Universities For 80% of Academic Subjects

QS: The University of Porto ranked among the best in the world

4 Indian Institutions in 2014 World Ranking

"The Institute of Education [London] has been ranked as the world's leading university for Education in the 2014 QS World University Rankings."

Nine UvA [University of Amsterdam] subject areas listed in QS World University Rankings top 50

"The University of Newcastle's [Australia] Civil and Structural Engineering discipline has surged in the QS World University Rankings by Subject list"




Sunday, March 02, 2014

The QS Subject Rankings: Reposting

QS have come out with their 2014 University Rankings by Subject, three months earlier than last year. Maybe this is to get ahead of Times Higher Education, whose latest Reputation Rankings will be published next week.

The methodology of these rankings has not changed since last year, so I am just reposting my article, which was first published in the Philippine Daily Inquirer on 27th May 2013 and reposted here on 29th May 2013.



The QS University Rankings by Subject: Warning 

It is time for the Philippines to think about constructing its own objective and transparent ranking or rating systems for its colleges and universities that would learn from the mistakes of the international rankers.

The ranking of universities is getting to be big business these days. There are quite a few rankings appearing from Scimago, Webometrics, University Ranking of Academic Performance (from Turkey), the Taiwan Rankings, plus many national rankings.

No doubt there will be more to come.

In addition, the big three of the ranking world—Quacquarelli Symonds (QS), Times Higher Education and Shanghai Jiao Tong University’s Academic Ranking of World Universities—are now producing a whole range of supplementary products, regional rankings, new university rankings, reputation rankings and subject rankings.

There is nothing wrong, in principle, with ranking universities. Indeed, it might be in some ways a necessity. The problem is that there are very serious problems with the rankings produced by QS, even though they seem to be better known in Southeast Asia than any of the others.
This is especially true of the subject rankings.

No new data

The QS subject rankings, which have just been released, do not contain new data. They are mostly based on data collected for last year’s World University Rankings—in some cases extracted from the rankings and, in others, recombined or recalculated.

There are four indicators used in these rankings. They are weighted differently for the different subjects and, in two subjects, only two of the indicators are used.

The four indicators are:
A survey of academics or people who claim to be academics or used to be academics, taken from a variety of sources. This is the same indicator used in the world rankings. Respondents were asked to name the best universities for research.
A survey of employers, which seems to comprise anyone who chooses to describe himself or herself as an employer or a recruiter.
The number of citations per paper. This is a change from the world rankings, where the calculation was citations per faculty.
H-index. This is easier to illustrate than to define. If a university publishes one paper and that paper is cited once, it gets an index of one. If it publishes two or more papers and at least two of them are cited at least twice each, the index is two, and so on. This is a way of combining the quantity of research with its quality, as measured by influence on other researchers (a minimal computation is sketched just after this list).
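Here is the sketch promised above: a minimal h-index function in Python. It takes a plain list of citation counts, one per paper; exactly how QS aggregate the counts at university level is not published, so treat this as an illustration of the definition rather than their method.

    def h_index(citation_counts):
        # Largest h such that at least h papers have at least h citations each.
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for i, c in enumerate(counts, start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    print(h_index([1]))            # 1: one paper cited once
    print(h_index([2, 2]))         # 2: two papers cited at least twice each
    print(h_index([10, 5, 3, 1]))  # 3: three papers with at least three citations each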

Out of these four indicators, three are about research and one is about the employability of a university’s graduates.

These rankings are not at all suitable for use by students wondering where they should go to study, whether at undergraduate or graduate level.

The only part that could be of any use is the employer review and that has a weight ranging from 40 percent for accounting and politics to 10 percent for arts and social science subjects, like history and sociology.

But even if the rankings are to be used just to evaluate the quantity or quality of research, they are frankly of little use. They are dominated by the survey of academic opinion, which is not of professional quality.

There are several ways in which people can take part in the survey. They can be nominated by a university, they can sign up themselves, they can be recommended by a previous respondent or they can be asked because they have subscribed to an academic journal or an online database.

Apart from checking that they have a valid academic e-mail address, it is not clear whether QS makes any attempt to check whether the survey respondents are really qualified to make any judgements about research.

Not plausible

The result is that the academic survey and also the employer survey have produced results that do not appear plausible.

In recent years, there have been some odd results from QS surveys. My personal favorite is the New York University Tisch School of the Arts, which set up a branch in Singapore in 2007 and graduated its first batch of students from a three-year Film course in 2010. In the QS Asian University Rankings of that year, the Singapore branch got zero for the other criteria (presumably the school did not submit data) but it was ranked 149th in Asia for academic reputation and 114th for employer reputation.

Not bad for a school that had yet to produce any graduates when the survey was taken early in the year.

In all of the subject rankings this year, the two surveys account for at least half of the total weighting and, in two cases, Languages and English, all of it.

Consequently, while some of the results for some subjects may be quite reasonable for the world top 50 or the top 100, after that they are sometimes downright bizarre.

The problem is that although QS has a lot of respondents worldwide, at the subject level there can be very few. In pharmacy, for example, there are only 672 respondents for the academic survey, and in materials science only 146 for the employer survey. Since the leading global players will get a large share of the responses, universities further down the list will be rated on just a handful of responses. The result is that the order of universities in any subject in a single country like the Philippines can be decided by just one or two responses to the surveys.

Another problem is that, after a few obvious choices like Harvard, MIT (Massachusetts Institute of Technology) and Tokyo, most respondents probably rely on a university's general reputation, and that can lead to all sorts of distortions.

Many of the subject rankings at the country level are quite strange. Sometimes they even include universities that do not offer courses in that subject. We have already seen that there are universities in the Philippines that are ranked for subjects that they do not teach.

Somebody might say that maybe they are doing research in a subject while teaching in a department with a different name, such as an economic historian teaching in the economics department but publishing in history journals and getting picked up by the academic survey for history.

Maybe, but it would not be a good idea for someone who wants to study history to apply to that particular university.

Another example is from Saudi Arabia, where King Fahd University of Petroleum and Minerals was apparently top for history, even though it does not have a history department or indeed anything where you might expect to find a historian. There are several universities in Saudi Arabia that may not teach history very well but at least they do actually teach it.

These subject rankings may have a modest utility for students who can pick or choose among top global universities and need some idea whether they should study engineering at SUNY (State University of New York) Buffalo (New York) or Leicester (United Kingdom) or linguistics at Birmingham or Michigan.

But they are of very little use for anyone else.

Thursday, February 20, 2014

Changing Responses to the QS Academic Survey

QS have published an interactive map showing the percentage distribution of the 62,084 responses to their academic survey in 2013. These are shown in tabular form below; in brackets is the percentage of the 3,069 responses in 2007. The symbol -- means that the percentage response was below 0.5 in 2007 and was not indicated by QS. There is no longer a link to the 2007 data, but the numbers were recorded in a post on this blog on the 4th of December 2007.

The proportion of respondents from the USA rose substantially between 2007 and 2013. There were also increases for European countries such as the UK, Italy, Germany, France, Spain, Hungary, Russia, Netherlands and Portugal although there were declines for some smaller countries like Belgium, Denmark, Sweden and Switzerland.

The percentage of respondents from Japan and Taiwan rose, but there were significant falls for India, China, Malaysia, Hong Kong, New Zealand, Australia, Singapore, Indonesia and the Philippines.

The most notable change is the growing number of responses from Latin America including Brazil, Mexico, Chile, Argentina and Colombia.


US   17.4   (10.0)
UK   6.5   (5.6)
Brazil   6.3  (1.1)
Italy  4.7    (3.3)
Germany   3.8 (3.0)
Canada   3.4 (4.0)
Australia   3.2  (3.5)
France   2.9    (2.4)
Japan   2.9    (1.9)
Spain   2.7    (2.3)
Mexico   2.6  (0.8)
Hungary   2.0   --
Russia 1.7   (0.7)
India 1.7   (3.5)
Chile  1.7     --
Ireland   1.6    (1.5)
Malaysia  1.5   (3.2)
Belgium 1.4  (2.6)
Hong Kong 1.4  (1.9)
Taiwan 1.3  (0.7)
Netherlands 1.2   (0.6)
New Zealand 1.2  (4.1)
Singapore 1.2  (2.5)
China 1.1   (1.6)
Portugal 1.1  (0.9)
Colombia 1.1   --
Argentina  1.0  (0.7)
South Africa 1.0   (0.7)
Denmark  0.9  (1.2)
Sweden  0.9  (1.7)
Kazakhstan  0.9   --
Israel 0.8   --
Switzerland  0.8  (1.5)
Austria 0.8  (1.3)
Romania 0.8  --
Turkey 0.7  (1.1)
Pakistan 0.7  --
Norway  0.6   --
Poland 0.6   (0.8)
Thailand 0.6   (0.6)
Finland 0.8   (0.5)
Greece 0.7  (0.7)
Ukraine 0.5   --
Indonesia   0.5  (1.2)
Czech 0.5   --
Peru 0.4   --
Slovenia 0.4   --
Saudi Arabia 0.4   --
Lithuania 0.4   --
Uruguay  0.3   --
Philippines 0.3   (1.8)
Bulgaria 0.3   --
UAE  0.3   --
Egypt 0.3   --
Paraguay  0.2   --
Jordan 0.2   --
Nigeria   0.2   --
Latvia 0.2   --
Venezuela  0.2   --
Estonia 0.2   --
Ecuador  0.2   --
Slovakia  0.2   --
Iraq 0.2   --
Jamaica 0.1   --
Azerbaijan 0.1   --
Iran 0.1  (0.7)
Palestine 0.1   --
Cyprus 0.1   --
Kuwait 0.1   --
Bahrain 0.1   --
Vietnam 0.1   --
Algeria 0.1   --
Puerto Rico 0.1   --
Costa Rica 0.1   --
Brunei 0.1   --
Panama 0.1   --
Sri Lanka 0.1   --
Oman  0.1   --
Iceland 0.1   --
Qatar 0.1   --
Bangladesh 0.1   --