Wednesday, April 30, 2014

Will Alexandria University Make a Comeback?

In 2010 the first edition of the new model Times Higher Education World University Rankings -- powered by Thomson Reuters -- caused amusement and consternation by placing Alexandria University in fourth place for research impact and in the world's top 200 overall.

This extraordinary achievement was entirely the result of the writings of one man, Dr Mohamed El Naschie "of" several universities, one of which was Alexandria University. By citing himself and being cited by others for papers published in a field in which there are normally few citations, especially in the first two years after publication, El Naschie pushed the university into a huge score for citation impact.

Anyone interested in the El Naschie story can consult the blog El Naschie Watch. An appraisal of his work's scientific merit can be found in the 2012 legal judgement of Mrs Justice Sharp.

In 2011 TR tweaked the citations indicator a bit and managed to get Alexandria's citations score down to 61.4, which was still massively disproportionate to its score for research and its overall score. Then in 2012 it disappeared from the top 400 altogether.

Still the university did not give in. Like an ageing boxer trying for ever more obscure titles, Alexandria showed up in 93rd place in the 2013 THE BRICS and Emerging Economies rankings with a still creditable 31.5 for citations. That score of course represented citations of El Naschie's papers in the years up to 2009, after which he stopped publishing in Web of Science journals. One would expect the score to dwindle further as the number of his countable papers diminished year by year.

It seemed that Alexandria was destined to fade away into the legions of the unranked universities of the world. After his month of wonders in September 2009 when he published eight papers in a single issue of Chaos, Solitons and Fractals, El Naschie published nothing in indexed journals in 2010, 2011 or 2012.

But in July 2013 El Naschie had a paper in the Russian journal Gravitation and Cosmology. Eleven out of 31 cited references were to his own works. That could be a useful boost for Alexandria. However, the paper so far remains uncited.

El Naschie gave Alexandria University as his affiliation and reprint address, although the email address appears to be a relic of his days as editor of Chaos, Solitons and Fractals.

Will there be more indexed papers from El Naschie? Will Alexandria return to the world rankings?

Sunday, April 27, 2014

The Continued Oppression of Women at Oxford

As the advance of women through academia continues there are a few stubborn pockets of non-compliance. It appears that one is Oxford, where in some schools men are more likely to get first class degrees than women. This has excited much comment among educational experts who are generally unconcerned about the poor or declining performance of men in most subjects in most British universities.

Back in 1993 a study by McNabb, Pal and Sloane in Economica found that men were more likely than women to get first class degrees at English and Welsh universities. They were also more likely to get third class, pass or "other" degrees, and less likely to get upper seconds, but that did not seem to cause much concern.

A recent report by the Higher Education Funding Council for England has discovered that women have caught up with men as far as firsts are concerned while men are still behind with regard to upper seconds and continue to get more third class and other poor degrees.

But there is still work to do. There remain some subjects in some places that have defied global and national trends.

One of these is Oxford where a third of male students got firsts last year compared with a quarter of women. Men were ahead in 26 out of 38 subjects (and presumably behind or equal in 12 although nobody seems very bothered about that). The gap was particularly large in Chemistry, English and History.

What is the reason for the relatively poor performance of Oxford women in English, Chemistry and History (the relatively poor performance of men in other fields obviously requires no explanation)? A female English student says it has something to do with the confidence engendered by "a certain type of all-male public [i.e. private] school". That assumes that it is students from all-male public schools, and not state school nerds, who are getting all those firsts.

Deborah Cameron, Professor of Language and Communication at Oxford University whose career has obviously failed to reach its full potential because of male bias, claims that it is because borderline first/upper second men are pushed by their tutors in a way that women are not. Is there any real evidence for this?

None of this is new. There was a similar report in 2013. Men were ahead in Politics, Philosophy and Economics, incubator of future politicians, and Modern Languages but behind in Jurisprudence and Classics.

There will no doubt be soul searching, reports, workshops and committees and in the end the imbalance will be rectified, probably by supplementing written exams with coursework and assignments and shifting the borders between first and upper seconds a bit.

I suspect though that it would be more helpful to read Julian Tan in the Huffington Post who writes that he got a first at Oxford by not travelling during spring breaks, saying no to nights out, revising instead of going to the college ball, not sleeping much, not spending much and worrying and complaining too much.

Tan notes that he was in the top four per cent for his subject (he said fourth percentile but that wouldn't get him a first anywhere) so he could probably have had a few trips or nights out before slipping into 2(i) territory. I suspect though that he may have located the secret of the surviving pockets of male supremacy, which is the bizarre medical condition that causes some, mainly male, students or employees to find writing code, sitting in archives, reading about how to put out fires or fiddling around with SPSS files more interesting than social relationships, sharing interactive moments or exploring one's emotions.

Sunday, April 20, 2014

Should New Zealand Worry about the Rankings?

The Ministry of Education in New Zealand has just published a report by Warren Smart on the performance of the country's universities in the three best known international rankings. The report, which is unusually detailed and insightful, suggests that the eight universities -- Auckland, Otago, Canterbury, Victoria University of Wellington, Massey, Waikato, Auckland University of Technology and Lincoln -- have a mixed record with regard to the Shanghai rankings and the Times Higher Education (THE) -- Thomson Reuters World University Rankings. Some are falling, some are stable and some are rising.

But things are a bit different when it comes to the QS World University Rankings. There the report finds a steady and general decline both overall and on nearly all of the component indicators. According to the New Zealand Herald this means that New Zealand is losing the race against Asia.

However, looking at the indicators one by one it is difficult to see any consistent and pervasive decline, whether absolute or relative.

Academic Survey

It is true that scores for the academic survey fell between 2007 and 2013 but one reason for this could be that the percentage of responses from New Zealand fell dramatically from 4.1% in 2007 to 1.2% in 2013 (see University Ranking Watch 20th February). This probably reflects the shift from a survey based on the subscription lists of World Scientific, a Singapore-based academic publishing company, to one with several sources, including a sign-up facility.

Employer Survey

In 2011 QS reported that there had been an enthusiastic response to the employer opinion survey from Latin America and it was found necessary to cap the scores of several universities where there had been a disproportionate response. One consequence of this was that the overall mean for this indicator rose dramatically so that universities received much lower scores in that year for the same number of responses. QS seems to have rectified the situation so that scores for New Zealand universities -- and many others -- recovered to some extent in 2012 and 2013.
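The mechanics of a rising mean are worth a toy calculation. QS does not publish its exact normalisation, so the sketch below simply assumes an indicator score scaled against the cohort mean and capped at 100; the scaling rule and all the numbers are invented:

```python
# Hypothetical scaling (the real QS normalisation is not public):
# a university's score is its raw response count scaled against
# the cohort mean, capped at 100.

def indicator_score(raw, cohort):
    mean = sum(cohort) / len(cohort)
    return min(100.0, 100.0 * raw / (2 * mean))  # invented rule

# A university with 50 employer responses in a quiet year...
before = indicator_score(50, [10, 20, 50, 30])            # cohort mean 27.5
# ...gets a much lower score for the same 50 responses once a
# surge of responses elsewhere inflates the cohort mean.
after = indicator_score(50, [10, 20, 50, 30, 200, 150])   # cohort mean ~76.7

assert after < before
```

On these invented figures the same 50 responses score roughly 91 before the surge and roughly 33 after it, which is the kind of effect described above: nothing changed at the university, only the mean it was measured against.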

Citations per Faculty and Faculty-Student Ratio

From 2007 to 2010 or 2011 scores fell for the citations per faculty indicator but have risen since then. The report notes that "the recent improvement in the citations per faculty score by New Zealand universities had not been matched by an increase in their academic reputations score, despite the academic reputation survey being focused on perceptions of research performance."

This apparent contradiction might be reconciled by the declining number of survey respondents from New Zealand noted above. Also, we should not forget the number on the bottom. A fall in the recorded number of faculty could have the same result as an increase in citations. It is interesting that while the score for faculty student ratio for five universities -- Auckland, Canterbury, Otago, Victoria University of Wellington and Waikato -- went down from 2010 to 2012, the score for citations per faculty went up. Both changes could result from a decline in the number of faculty submitted by universities or recorded by QS. In only one case, Massey, did both scores rise. There was insufficient data for the other two universities.
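The denominator effect can be spelled out: citations per faculty has faculty on the bottom, while the faculty-student indicator has faculty on top, so a fall in reported faculty alone pushes the two scores in opposite directions, exactly the pattern above. A minimal sketch, with all figures invented:

```python
# Hypothetical numbers: citations and student count stay flat,
# but the faculty count reported to the ranker falls.
citations, students = 20_000, 30_000
faculty_2010, faculty_2012 = 2_000, 1_600  # fewer faculty recorded

cpf_2010 = citations / faculty_2010   # 10.0 citations per faculty
cpf_2012 = citations / faculty_2012   # 12.5 -- this score rises

fsr_2010 = faculty_2010 / students    # ~0.067 faculty per student
fsr_2012 = faculty_2012 / students    # ~0.053 -- this score falls

assert cpf_2012 > cpf_2010 and fsr_2012 < fsr_2010
```

Nothing about the university's research changed in this sketch; shrinking the reported faculty figure by a fifth is enough to move the two indicators in opposite directions at once.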

International Faculty and International Students

The scores for international faculty have always been high and are likely to remain so. The scores for international students have been slipping but this indicator counts for only 5% of the total weighting.

New Zealand universities might benefit from looking at the process of submission of data to QS. Have they submitted lists of potential survey respondents? Are they aware of the definitions of faculty, students, international and so on? That might be more productive than worrying about a deep malaise in the tertiary sector.

And perhaps New Zealand salt producers could send out free packets every time the media have anxiety attacks about the rankings.

Thursday, April 17, 2014

The Scimago Ibero-America Ranking

In February the SCImago Research Group published its annual Ibero-American Institutions Ranking. This is not a league table but a research tool. The default order is according to the number of publications in the Scopus database over the period 2008-2012. The top five are:

1.  Universidade de Sao Paulo

2.  Universidade de Lisboa

3.  Universidad Nacional Autonoma de Mexico

4.  Universidade Estadual Paulista Julio de Mesquita Filho

5.  Universitat de Barcelona

Friday, April 11, 2014

Why are Britain's Universities Still Failing Male Students?

I doubt that you will see a headline like that in the mainstream media.

A report from the Higher Education Funding Council for England (Hefce) has shown that students who classify themselves as White do better than Black or Asian students who get the same grades at A levels. Mixed-race students are in between. The difference persists even when universities and subjects are analysed separately. 

Aaron Kiely in the Guardian says that this "suggests that higher education institutions are somehow failing black students, which should be a national embarrassment."

He then goes on to recount a study by the National Union of Students (NUS) that indicated that Black students suffered institutional barriers that eroded their self-esteem and confidence and that seven per cent said that the university environment was racist. 

A similar conclusion was drawn by Richard Adams also in the Guardian. He quoted Rachel Wenstone of the NUS as saying that it was "a national shame that black students and students from low participation backgrounds are appearing to do worse in degree outcomes than other students even when they get the same grades at A level."

It is interesting that the Hefce report also found that female students were more likely to get a 2(i) than male students with the same grades, although there was no difference with regard to first class degrees. Men were also more likely to fail to complete their studies.

So is anyone worrying about why men are doing less well at university?


Thursday, April 10, 2014

The Parochial World of Global Thinkers

The magazine Prospect has just published its list of fifty candidates for the title of global thinker. It is rather different from last year. Number one in 2013, Richard Dawkins, biologist and atheist spokesman, is out. Jonathon Derbyshire, Managing Editor of Prospect, in an interview with the Digital Editor of Prospect says that is because Dawkins has been saying the same thing for several years. Presumably Prospect only noticed this year.

The list is top heavy with philosophers and economists and Americans and Europeans. There is one candidate from China, one from Africa, one from Brazil and none from Russia. There is one husband and wife. A large number are graduates of Harvard or have taught there and quite a few are from Yale, MIT, Berkeley, Cambridge and Oxford. One wonders if the selectors made some of their choices by going through the contents pages of New Left Review. So far I have counted six contributors.

There are also no Muslims. Was Prospect worried about a repetition of that unfortunate affair in 2008?

All in all, apart from Pope Francis, this does not look like a global list. Unless, that is, thinking has largely retreated to the humanities and social science faculties of California, New England and Oxbridge.

Tuesday, April 01, 2014

Comparing the THE and QS Reputation Rankings

This year's Times Higher Education (THE) Reputation Rankings were a bit boring, at least at the top, and that is just what they should be.

The top ten are almost the same as last year. Harvard is still first and MIT is second. Tokyo has dropped out of the top ten to 11th place and has been replaced by Caltech. Stanford is up three places and is now third. Cambridge and Oxford are both down one place. Further down, there is some churning but it is difficult to see any clear and consistent trends, although the media have done their best to find stories, UK universities falling or sliding or slipping, no Indian or Irish or African universities in the top 100.

These rankings may be more interesting for who is not there than for who is. There are some notable absentees from the top 100. Last year Tokyo Metropolitan University was, according to THE and data providers Thomson Reuters (TR), first in the world, along with MIT, for research impact. Yet it fails to appear in the top 100 in a reputation survey in which research has a two thirds weighting. Rice University, joint first in the world for research impact with Moscow State Engineering Physics Institute in 2012, is also absent. How is this possible? Am I missing something?

In general, the THE-TR reputation survey, the data collection for which was contracted out to the pollsters Ipsos Mori CT, appears to be quite rigorous and reliable. Survey forms were sent out to a clearly defined group, researchers with papers in the ISI indexes. THE claim that this means that their respondents must therefore be active producers of academic research. That is stretching it a bit. Getting your name on an article published in a reputable journal might mean a high degree of academic competence or it could just mean having some sort of influence over the research process. I have heard a report about an Asian university where researchers were urged to put their heads of department on the list of co-authors. Still, on balance it seems that the respondents to the THE survey are mostly from a stable group, namely those who have usually made some sort of contribution to a research paper of sufficient merit to be included in an academic journal.

TR also appear to have used a systematic approach in sending out the survey forms. When the first survey was being prepared in 2010 they announced that the forms would be emailed according to the number of researchers recorded by UNESCO in 2007. It is not clear if this procedure has been followed strictly over the last four years. Oceania, presumably Australia and New Zealand, appears to have a very large number of responses this year, 10%, although TR reported in 2010 that UNESCO found only 2.1% of the world's researchers in that region.

The number of responses received appears reasonably large although it has declined recently. In 2013 TR collected 10,536 responses, considerably fewer than in 2012, when it was 16,639. Again, it is not clear what happened.

The number of responses from the various subject areas has changed somewhat. Since 2012 the proportion from the social sciences has risen from 19% to 22%, as has that from engineering and technology, while the share from the life sciences has gone from 16% to 22%.

QS do not publish reputation surveys but it is possible to filter their ranking scores to find out how universities performed on their academic survey.

The QS approach is less systematic. They started out using the subscription lists of World Scientific, a Singapore-based academic publishing company with links to Imperial College London. Then they added respondents from Mardev, a publisher of academic lists, to beef up the number of names in the humanities. Since then the balance has shifted, with more names coming from Mardev and some topping up from World Scientific. QS have also added a sign-up facility where people are allowed to apply to receive survey forms. That was suspended in April 2013 but has recently been revived. They have also asked universities to submit lists of potential respondents and respondents to suggest further names. The exact number of responses coming from all these different sources is not known.

Over the last few years QS have made their survey rather more rigorous. First, respondents were not allowed to vote for the universities where they were currently employed. They were restricted to one response per computer and universities were not allowed to solicit votes or instruct staff who to vote for or who not to vote for. Then they were told not to promote any form of participation in the surveys.

In addition to methodological changes, the proportion of responses from different countries has changed significantly since 2007, with a large increase from Latin America, especially Brazil and Mexico, the USA and larger European countries, and a fall in those from India, China and the Asia-Pacific region. All of this means that it is very difficult to figure out whether the rise or fall of a university reflects a change in methodology or distribution of responses or a genuine shift in international reputation.

Comparing the THE-TR and QS surveys there is some overlap at the top. The top five are the same in both although in a different order: Harvard, MIT, Stanford, Oxford and Cambridge.

After that, we find that the QS academic survey favours universities in Asia-Pacific and Latin America. Tokyo is seventh according to QS but THE-TR have it in 11th place. Peking is 19th for QS and 41st for THE-TR. Sao Paulo is 51st in the QS indicator but is in the 81-90 band in the THE-TR rankings. The Autonomous National University of Mexico (UNAM) is not even in THE-TR's top 100 but QS put it 48th.

On the other hand Caltech, Moscow State University, Seoul National University and Middle East Technical University do much better with THE-TR than with QS.

I suspect that the QS survey is tapping a younger, less experienced pool of respondents from less regarded universities and from countries with high aspirations but so far limited achievements.